NASA Technical Reports Server (NTRS)
Ramapriyan, H. K. (Rama); Peng, Ge; Moroni, David; Shie, Chung-Lin
2016-01-01
Quality of products is always of concern to users, regardless of the type of product. The focus of this paper is on the quality of Earth science data products. There are four different aspects of quality: scientific, product, stewardship, and service. All these aspects taken together constitute Information Quality. With increasing requirements for ensuring and improving information quality, there has been considerable work related to information quality during the last several years. Given this rich background of prior work, the Information Quality Cluster (IQC), established within the Federation of Earth Science Information Partners (ESIP), has been active with membership from multiple organizations. Its objectives and activities, aimed at ensuring and improving information quality for Earth science data and products, are discussed briefly.
How to run a successful Journal
Jawaid, Shaukat Ali; Jawaid, Masood
2017-01-01
Publishing and successfully running a good-quality, peer-reviewed biomedical scientific journal is not an easy task. The prerequisites include a competent, experienced editor supported by a team. Long-term sustainability of a journal depends on good-quality manuscripts, an active editorial board, good-quality reviewers, a workable business model to ensure financial support, increased visibility (which in turn drives increased submissions), indexation in important databases, online availability, and an easy-to-use website. This manuscript outlines the logistical and technical issues that need to be resolved before starting a new journal and in ensuring the sustainability of a good-quality, peer-reviewed journal. PMID:29492089
77 FR 21158 - VA Directive 0005 on Scientific Integrity: Availability for Review and Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-09
... the Director, Office of Science and Technology Policy's Memorandum of December 17, 2010, on scientific integrity. It addresses how VA ensures quality science in its methods, review, policy application, and...: Background The Presidential Memorandum on Scientific Integrity and the Office of Science and Technology...
Enhancing Scientific Foundations to Ensure Reproducibility: A New Paradigm.
Hsieh, Terry; Vaickus, Max H; Remick, Daniel G
2018-01-01
Progress in science is dependent on a strong foundation of reliable results. The publish or perish paradigm in research, coupled with an increase in retracted articles from the peer-reviewed literature, is beginning to erode the trust of both the scientific community and the public. The NIH is combating errors by requiring investigators to follow new guidelines addressing scientific premise, experimental design, biological variables, and authentication of reagents. Herein, we discuss how implementation of NIH guidelines will help investigators proactively address pitfalls of experimental design and methods. Careful consideration of the variables contributing to reproducibility helps ensure robust results. The NIH, investigators, and journals must collaborate to ensure that quality science is funded, explored, and published. Copyright © 2018 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
Electronic publishing: opportunities and challenges for clinical linguistics and phonetics.
Powell, Thomas W; Müller, Nicole; Ball, Martin J
2003-01-01
This paper discusses the contributions of informatics technology to the field of clinical linguistics and phonetics. The electronic publication of research reports and books has facilitated both the dissemination and the retrieval of scientific information. Electronic archives of speech and language corpora, too, stimulate research efforts. Although technology provides many opportunities, there remain significant challenges. Establishment and maintenance of scientific archives is largely dependent upon volunteer efforts, and there are few standards to ensure long-term access. Coordinated efforts and peer review are necessary to ensure utility and quality.
Charter for the ARM Climate Research Facility Science Board
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, W
The objective of the ARM Science Board is to promote the Nation’s scientific enterprise by ensuring that the best quality science is conducted at the DOE’s User Facility known as the ARM Climate Research Facility. The goal of the User Facility is to serve scientific researchers by providing unique data and tools to facilitate scientific applications for improving understanding and prediction of climate science.
The quality assurance liaison: Combined technical and quality assurance support
NASA Astrophysics Data System (ADS)
Bolivar, S. L.; Day, J. L.
1993-03-01
The role of the quality assurance liaison, the responsibilities of this position, and the evolutionary changes in duties over the last six years are described. The role of the quality assurance liaison has had a very positive impact on the Los Alamos Yucca Mountain Site Characterization (YW) quality assurance program. Having both technical and quality assurance expertise, the quality assurance liaisons are able to facilitate communications with scientists on quality assurance issues and requirements, thereby generating greater productivity in scientific investigations. The quality assurance liaisons help ensure that the scientific community knows and implements existing requirements, is aware of new or changing regulations, and is able to conduct scientific work within Project requirements. The influence of the role of the quality assurance liaison can be measured by an overall improvement in attitude of the staff regarding quality assurance requirements and improved job performance, as well as a decrease in deficiencies identified during both internal and external audits and surveillances. This has resulted in a more effective implementation of quality assurance requirements.
Steps in Moving Evidence-Based Health Informatics from Theory to Practice.
Rigby, Michael; Magrabi, Farah; Scott, Philip; Doupi, Persephone; Hypponen, Hannele; Ammenwerth, Elske
2016-10-01
To demonstrate and promote the importance of applying a scientific process to health IT design and implementation, and of basing this on research principles and techniques. A review by international experts linked to the IMIA Working Group on Technology Assessment and Quality Development. Four approaches are presented, linking to the creation of national professional expectations, adherence to research-based standards, quality assurance approaches to ensure safety, and scientific measurement of impact. Solely marketing- and aspiration-based approaches to health informatics applications are no longer ethical or acceptable when scientifically grounded evidence-based approaches are available and in use.
Sound data management as a foundation for natural resources management and science
Burley, Thomas E.
2012-01-01
Effective decision making is closely related to the quality and completeness of available data and information. Data management helps to ensure data quality in any discipline and supports decision making. Managing data as a long-term scientific asset helps to ensure that data will be usable beyond the original intended application. Emerging issues in water-resources management and climate variability require the ability to analyze change in the conditions of natural resources over time. The availability of quality, well-managed, and documented data from the past and present helps support this requirement.
MODELLING QUALITY ASSURANCE PLAN FOR THE LAKE MICHIGAN MASS BALANCE PROJECT
With the ever increasing complexity and costs of ecosystem protection and remediation, the USEPA is placing more emphasis on ensuring the quality and credibility of scientific tools, such as models, that are used to help guide decision-makers who are faced with difficult manageme...
Ferreira, Catarina; Bastille-Rousseau, Guillaume; Bennett, Amanda M; Ellington, E Hance; Terwissen, Christine; Austin, Cayla; Borlestean, Adrian; Boudreau, Melanie R; Chan, Kevin; Forsythe, Adrian; Hossie, Thomas J; Landolt, Kristen; Longhi, Jessica; Otis, Josée-Anne; Peers, Michael J L; Rae, Jason; Seguin, Jacob; Watt, Cristen; Wehtje, Morgan; Murray, Dennis L
2016-08-01
Peer review is pivotal to science and academia, as it represents a widely accepted strategy for ensuring quality control in scientific research. Yet, the peer-review system is poorly adapted to recent changes in the discipline and current societal needs. We provide historical context for the cultural lag that governs peer review that has eventually led to the system's current structural weaknesses (voluntary review, unstandardized review criteria, decentralized process). We argue that some current attempts to upgrade or otherwise modify the peer-review system are merely sticking-plaster solutions to these fundamental flaws, and therefore are unlikely to resolve them in the long term. We claim that for peer review to be relevant, effective, and contemporary with today's publishing demands across scientific disciplines, its main components need to be redesigned. We propose directional changes that are likely to improve the quality, rigour, and timeliness of peer review, and thereby ensure that this critical process serves the community it was created for. © 2015 Cambridge Philosophical Society.
75 FR 43190 - Statement of Organization, Functions, and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-23
... human and animal health; (5) ensures scientific quality and ethical and regulatory compliance of center... investigations on the biology, ecology, and control of arthropod vectors of viral, rickettsial, and bacterial...
Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.
McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M
2012-07-01
There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.
Ten questions you should consider before submitting an article to a scientific journal.
Falcó-Pegueroles, A; Rodríguez-Martín, D
Investigating involves not only knowing research methods and designs; it also involves knowing the strategies for disseminating and publishing results in scientific journals. An investigation is considered complete when it is published and disclosed to the scientific community. The publication of a manuscript is not simple, since it involves examination through a rigorous editorial evaluation process to ensure the scientific quality of the proposal. The objective of this article is to communicate to potential authors the main errors or deficiencies that typically and routinely explain the decision by the referees of scientific journals not to accept a scientific article. Based on the experience of the authors as referees of national and international journals in the field of nursing and health sciences, we have identified a total of 10 types or groups, which cover formulation errors, inconsistencies between different parts of the text, lack of structuring, imprecise language, information gaps, and the detection of relevant inaccuracies. The identification and analysis of these issues enables their prevention, and is of great use to future researchers in the dissemination of the results of their work to the scientific community. In short, the best publishing strategy is one that ensures the scientific quality of the work and spares no effort in avoiding the errors or deficiencies that referees routinely detect in the articles they evaluate. Copyright © 2017 Sociedad Española de Enfermería Intensiva y Unidades Coronarias (SEEIUC). Published by Elsevier España, S.L.U. All rights reserved.
Web quality control for lectures: Supercourse and Amazon.com.
Linkov, Faina; LaPorte, Ronald; Lovalekar, Mita; Dodani, Sunita
2005-12-01
Peer review has been the cornerstone of quality control for biomedical journals for the past 300 years. With the emergence of the Internet, new models of quality control and peer review are emerging. However, such models are poorly investigated. We would argue that the popular quality-control system used by Amazon.com offers a way to ensure continuous quality improvement in the area of research communications on the Internet. Such a system provides an interesting alternative to the traditional peer-review approaches used in biomedical journals and challenges the traditional paradigms of scientific publishing. This idea is being explored in the context of Supercourse, a library of 2,350 prevention lectures shared for free by faculty members from over 150 countries. Supercourse is successfully utilizing quality-control approaches similar to the Amazon.com model. Clearly, the existing approaches and emerging alternatives for quality control in scientific communications need to be assessed scientifically. The rapid explosion of Internet technologies could be leveraged to produce better, more cost-effective systems for quality control in biomedical publications and across all sciences.
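The Amazon.com-style quality signal discussed above boils down to aggregating user ratings. A minimal sketch follows, using a Bayesian-style prior so that lectures with few ratings are not over- or under-ranked; the prior values and function name are illustrative assumptions, not part of the actual Supercourse system.

```python
# Shrink a lecture's mean rating toward a prior when few ratings exist,
# so items with one 5-star vote don't outrank well-reviewed items.
def ranked_score(ratings, prior_mean=3.0, prior_weight=5):
    n = len(ratings)
    return (prior_mean * prior_weight + sum(ratings)) / (prior_weight + n)
```

With no ratings the score is simply the prior mean; as ratings accumulate, the observed mean dominates.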
Guidance for Identifying, Selecting and Evaluating Open Literature Studies
This guidance for Office of Pesticide Program staff will assist in their evaluation of open literature studies of pesticides. It also describes how we identify, select, and ensure that data we use in risk assessments is of sufficient scientific quality.
U.S. EPA. 2000. Science Policy Council Handbook: Peer Review
The goal of the Peer Review Policy and this Handbook is to enhance the quality and credibility of Agency decisions by ensuring that the scientific and technical work products underlying these decisions receive appropriate levels of peer review by independe
The quest to standardize hemodialysis care.
Hegbrant, Jörgen; Gentile, Giorgio; Strippoli, Giovanni F M
2011-01-01
A large global dialysis provider's core activities include providing dialysis care with excellent quality, ensuring a low variability across the clinic network and ensuring strong focus on patient safety. In this article, we summarize the pertinent components of the quality assurance and safety program of the Diaverum Renal Services Group. Concerning medical performance, the key components of a successful quality program are setting treatment targets; implementing evidence-based guidelines and clinical protocols; consistently, regularly, prospectively and accurately collecting data from all clinics in the network; processing collected data to provide feedback to clinics in a timely manner, incorporating information on interclinic and intercountry variations; and revising targets, guidelines and clinical protocols based on sound scientific data. The key activities for ensuring patient safety include a standardized approach to education, i.e. a uniform education program including control of theoretical knowledge and clinical competencies; implementation of clinical policies and procedures in the organization in order to reduce variability and potential defects in clinic practice; and auditing of clinical practice on a regular basis. By applying a standardized and systematic continuous quality improvement approach throughout the entire organization, it has been possible for Diaverum to progressively improve medical performance and ensure patient safety. Copyright © 2011 S. Karger AG, Basel.
77 FR 27070 - Statement of Organization, Functions, and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
...) provides the scientific leadership and management to ensure the quality of science within OSTLTS. Office of... activities related to STLT health agencies; (4) provides leadership in the development and implementation of evidence-based approaches for agency and system management, evolution, and transformation; (5) identifies...
Knobel, LeRoy L.; Tucker, Betty J.; Rousseau, Joseph P.
2008-01-01
Water-quality activities conducted by the staff of the U.S. Geological Survey (USGS) Idaho National Laboratory (INL) Project Office coincide with the USGS mission of appraising the quantity and quality of the Nation's water resources. The activities are conducted in cooperation with the U.S. Department of Energy's (DOE) Idaho Operations Office. Results of the water-quality investigations are presented in various USGS publications or in refereed scientific journals. The results of the studies are highly regarded, and they are used with confidence by researchers, regulatory and managerial agencies, and interested civic groups. In its broadest sense, quality assurance refers to doing the job right the first time. It includes the functions of planning for products, review and acceptance of the products, and an audit designed to evaluate the system that produces the products. Quality control and quality assurance differ in that quality control ensures that things are done correctly given the 'state-of-the-art' technology, and quality assurance ensures that quality control is maintained within specified limits.
A new dataset validation system for the Planetary Science Archive
NASA Astrophysics Data System (ADS)
Manaud, N.; Zender, J.; Heather, D.; Martinez, S.
2007-08-01
The Planetary Science Archive is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets formatted in conformance with the Planetary Data System (PDS). The PI teams are responsible for analyzing and calibrating the instrument data as well as for the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of its content is missing. An independent review board recently recommended that the completeness of the archive, as well as the consistency of the delivered data, be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall allow anomalies to be tracked and the completeness of datasets to be controlled. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We defined dataset validation as the verification and assessment process that checks dataset content against pre-defined top-level criteria, which represent the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results, as well as those interfacing with the PSA database.
The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.
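The criterion-driven validation described above can be sketched as a set of pluggable checks that each return a list of anomalies for the validation report. The specific criteria, file names, and dataset layout below are hypothetical illustrations; the abstract does not specify the tool's internals.

```python
# Each criterion inspects a dataset description (a plain dict here) and
# returns human-readable anomaly strings; validate() aggregates them.

def check_required_files(dataset):
    """A PDS-style dataset is assumed to require a few top-level files."""
    required = {"VOLDESC.CAT", "AAREADME.TXT", "INDEX/INDEX.TAB"}
    missing = required - set(dataset["files"])
    return [f"missing file: {name}" for name in sorted(missing)]

def check_completeness(dataset):
    """Compare products listed in the index against products delivered."""
    indexed = set(dataset["index_products"])
    delivered = set(dataset["delivered_products"])
    return [f"indexed but not delivered: {p}" for p in sorted(indexed - delivered)]

CRITERIA = [check_required_files, check_completeness]

def validate(dataset):
    """Run every criterion and collect anomalies for the validation report."""
    anomalies = []
    for criterion in CRITERIA:
        anomalies.extend(criterion(dataset))
    return anomalies
```

New criteria can be added to the list without touching the driver, which matches the tool's stated goal of letting users define and implement various types of validation criteria.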
NASA Technical Reports Server (NTRS)
Eppler, Dean B.
2013-01-01
The scientific success of any future human lunar exploration mission will be strongly dependent on design of both the systems and operations practices that underpin crew operations on the lunar surface. Inept surface mission preparation and design will either ensure poor science return, or will make achieving quality science operation unacceptably difficult for the crew and the mission operations and science teams. In particular, ensuring a robust system for managing real-time science information flow during surface operations, and ensuring the crews receive extensive field training in geological sciences, are as critical to mission success as reliable spacecraft and a competent operations team.
The NOAA Scientific Computing System Data Assembly Center
NASA Astrophysics Data System (ADS)
Suchdeve, K. L.; Smith, S. R.; Van Waes, M.
2016-02-01
The Scientific Computing System (SCS) Data Assembly Center (DAC) was established in 2014 by the Office of Marine and Aviation Operations (OMAO) to evaluate the quality of full-resolution (sampling on the order of once per second) data collected by SCS onboard NOAA-operated research vessels. The SCS data are nominally transferred from the vessel to the National Centers for Environmental Information (NCEI) soon after the completion of each cruise and are complemented with detailed cruise metadata from OMAO. The authors will describe tools developed by the SCS DAC to monitor the timeliness of SCS data delivery to NCEI and the completeness of the SCS packages received by NCEI (ensuring the package contains data for all enabled sensors on a given cruise). Feedback to OMAO and NCEI regarding timeliness and data completeness will be outlined, along with challenges encountered by the DAC as it works to develop automated quality assessment of the SCS data packages. Data collected by SCS on NOAA vessels represent a significant investment by the American taxpayer. The mission of the SCS DAC is to ensure that archived SCS data at NCEI are a complete record of the observations made on NOAA research cruises. Archival of complete SCS datasets at NCEI ensures these data are preserved for future generations of scientists, policy makers, and the public.
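The two monitoring checks described above (completeness against enabled sensors, and timeliness of delivery) can be sketched as follows. The field names, metadata layout, and the 30-day delivery window are assumptions for illustration; the actual DAC tooling is not specified in the abstract.

```python
from datetime import date, timedelta

def completeness_report(cruise_metadata, package_sensors):
    """Sensors enabled in the OMAO cruise metadata but absent from the
    SCS package delivered to NCEI."""
    enabled = set(cruise_metadata["enabled_sensors"])
    return sorted(enabled - set(package_sensors))

def is_timely(cruise_end, received, max_days=30):
    """Flag deliveries arriving more than max_days after the cruise ended."""
    return (received - cruise_end) <= timedelta(days=max_days)
```

An empty completeness report and a True timeliness flag would indicate a package that needs no feedback to OMAO.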
Scientific Data Stewardship in the 21st Century
NASA Astrophysics Data System (ADS)
Mabie, J. J.; Redmon, R.; Bullett, T.; Kihn, E. A.; Conkright, R.; Manley, J.; Horan, K.
2008-12-01
The Ionosonde Program at the National Geophysical Data Center (NGDC) serves as a case study for how to approach data stewardship in the 21st century. As the number and sophistication of scientific instruments increase, along with the volumes and complexity of data that need to be preserved for future generations, the old approach of simply storing data in a library, physical or electronic, is no longer sufficient to ensure the long-term preservation of our important environmental data. To ensure the data can be accessed, understood, and used by future generations, data stewards must be familiar with the observation process before the data reach the archive and with the scientific applications the data may be called upon to serve. This familiarity is best obtained by active participation. In the NGDC Ionosonde Program team, we strive to maintain activity and expertise in ionosonde field operations and scientific data analysis in addition to our core mission of preservation and distribution of data and metadata. We believe this approach produces superior data quality, proper documentation, and evaluation tools for data customers as part of the archive process. We are presenting the Ionosonde Program as an example of modern scientific data stewardship.
Progress in Brewing Science and Beer Production.
Bamforth, C W
2017-06-07
The brewing of beer is an ancient biotechnology, the unit processes of which have not changed in hundreds of years. Equally, scientific study within the brewing industry not only has ensured that modern beer making is highly controlled, leading to highly consistent, high-quality, healthful beverages, but also has informed many other fermentation-based industries.
[Ethics and biomedical research].
Goussard, Christophe
2007-01-01
Ethics in biomedical research took off from the 1947 Nuremberg Code to its own right in the wake of the Declaration of Helsinki in 1964. Since then, (inter)national regulations and guidelines providing a framework for clinical studies and protection for study participants have been drafted and implemented, while ethics committees and drug evaluation agencies have sprung up throughout the world. These two developments were crucial in bringing about the protection of rights and safety of the participants and harmonization of the conduct of biomedical research. Ethics committees and drug evaluation agencies deliver ethical and scientific assessments on the quality and safety of the projects submitted to them and issue respectively approvals and authorizations to carry out clinical trials, while ensuring that they comply with regulatory requirements, ethical principles, and scientific guidelines. The advent of biomedical ethics, together with the responsible commitment of clinical investigators and of the pharmaceutical industry, has guaranteed respect for the patient, for whom and with whom research is conducted. Just as importantly, it has also ensured that patients reap the benefit of what is the primary objective of biomedical research: greater life expectancy, well-being, and quality of life.
Redrawing the frontiers in the age of post-publication review
Galbraith, David W.
2015-01-01
Publication forms the core structure supporting the development and transmission of scientific knowledge. For this reason, it is essential that the highest standards of quality control be maintained, in particular to ensure that the information being transmitted allows reproducible replication of the described experiments, and that the interpretation of the results is sound. Quality control has traditionally involved editorial decisions based on anonymous pre-publication peer review. Post-publication review of individual articles took the lesser role since it did not feed directly back to the original literature. Rapid advances in computer and communications technologies over the last thirty years have revolutionized scientific publication, and the role and scope of post-publication review has greatly expanded. This perspective examines the ways in which pre- and post-publication peer review influence the scientific literature, and in particular how they might best be redrawn to deal with the twin problems of scientific non-reproducibility and fraud increasingly encountered at the frontiers of science. PMID:26097488
Conducting remote bioanalytical data monitoring and review based on scientific quality objectives.
He, Ling
2011-07-01
For bioanalytical laboratories that follow GLP regulations and generate data for new drug filings, ensuring the quality standards set by regulatory guidance is a fundamental expectation. Numerous guidelines and White Papers have been published by regulatory agencies, professional working groups, and field experts in the past two decades, and have significantly improved the standards of good practice for bioanalysis. From a sponsor's perspective, continuous quality monitoring of the data generated by CRO laboratories, identifying adverse trends, and taking corrective and preventative actions against issues encountered are critical aspects of effective bioanalytical outsourcing management. This is especially important for clinical bioanalysis, where one validated assay is applied to analyzing a large number of samples of diverse demographics and disease states. This perspective article presents thoughts on remote data monitoring and its merits for scientific quality oversight, and introduces a novel Bioanalytical Data Review software, custom-developed and platform-neutral, for conducting remote data monitoring on raw or processed LC-MS/MS data from CROs. Flexible, adaptive, and user-customizable queries are applied for conducting project-, batch- and sample-level data review based on scientific quality performance factors commonly assessed for good bioanalytical practice.
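As an illustration of the kind of batch-level query such a review tool might run, the sketch below applies a simplified version of the widely used "4-6-x" QC acceptance rule (at least two-thirds of QC samples, and at least half at each concentration level, must fall within a tolerance of nominal). The data layout, the 15% tolerance, and the function names are assumptions; they are not taken from the software described in the abstract.

```python
def within_tolerance(measured, nominal, tol=0.15):
    """True if a QC measurement is within +/- tol of its nominal value."""
    return abs(measured - nominal) <= tol * nominal

def batch_accepted(qc_samples, tol=0.15):
    """qc_samples: list of (level_name, nominal, measured) tuples."""
    passes = [within_tolerance(m, n, tol) for _, n, m in qc_samples]
    if sum(passes) < 2 * len(passes) / 3:   # overall two-thirds rule
        return False
    levels = {lvl for lvl, _, _ in qc_samples}
    for lvl in levels:                      # at least half pass per level
        lvl_pass = [within_tolerance(m, n, tol)
                    for l, n, m in qc_samples if l == lvl]
        if sum(lvl_pass) < len(lvl_pass) / 2:
            return False
    return True
```

In a remote-monitoring setting, a query like this would run against processed LC-MS/MS batch tables and flag failing batches for sponsor review.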
Quality assurance and ergonomics in the mammography department.
Reynolds, April
2014-01-01
Quality assurance (QA) in mammography is a system of checks that helps ensure the proper functioning of imaging equipment and processes. Ergonomics is a scientific approach to arranging the work environment to reduce the risk of work-related injuries while increasing staff productivity and job satisfaction. This article reviews both QA and ergonomics in mammography and explains how they work together to create a safe and healthy environment for radiologic technologists and their patients. QA and quality control requirements in mammography are discussed, along with ergonomic best practices in the mammography setting.
Bartholomay, Roy C.; Maimer, Neil V.; Wehnke, Amy J.
2014-01-01
Water-quality activities and water-level measurements by the personnel of the U.S. Geological Survey (USGS) Idaho National Laboratory (INL) Project Office coincide with the USGS mission of appraising the quantity and quality of the Nation’s water resources. The activities are carried out in cooperation with the U.S. Department of Energy (DOE) Idaho Operations Office. Results of the water-quality and hydraulic head investigations are presented in various USGS publications or in refereed scientific journals and the data are stored in the National Water Information System (NWIS) database. The results of the studies are used by researchers, regulatory and managerial agencies, and interested civic groups. In the broadest sense, quality assurance refers to doing the job right the first time. It includes the functions of planning for products, review and acceptance of the products, and an audit designed to evaluate the system that produces the products. Quality control and quality assurance differ in that quality control ensures that things are done correctly given the “state-of-the-art” technology, and quality assurance ensures that quality control is maintained within specified limits.
NASA Astrophysics Data System (ADS)
Gold, A. U.; Ledley, T. S.; McCaffrey, M. S.; Buhr, S. M.; Manduca, C. A.; Niepold, F.; Fox, S.; Howell, C. D.; Lynds, S. E.
2010-12-01
The topic of climate change permeates all aspects of our society: the news, household debates, scientific conferences, etc. To provide students with accurate information about climate science and energy awareness, educators require scientifically and pedagogically robust teaching materials. To address this need, the NSF-funded Climate Literacy & Energy Awareness Network (CLEAN) Pathway has assembled a new peer-reviewed digital collection as part of the National Science Digital Library (NSDL) featuring teaching materials centered on climate and energy science for grades 6 through 16. The scope and framework of the collection are defined by the Essential Principles of Climate Science (CCSP 2009) and a set of energy awareness principles developed in the project. The collection provides trustworthy teaching materials on these socially relevant topics and prepares students to become responsible decision-makers. While a peer-review process is desirable for curriculum developers as well as collection builders to ensure quality, its implementation is non-trivial. We have designed a rigorous and transparent peer-review process for the CLEAN collection, and our experiences provide general guidelines that can be used to judge the quality of digital teaching materials across disciplines. Our multi-stage review process ensures that only resources with teaching goals relevant to developing climate literacy and energy awareness are considered. Each relevant resource is reviewed by two individuals to assess the i) scientific accuracy, ii) pedagogic effectiveness, and iii) usability/technical quality. A science review by an expert ensures the scientific quality and accuracy. Resources that pass all review steps are forwarded to a review panel of educators and scientists who make a final decision regarding inclusion of the materials in the CLEAN collection.
Results from the first panel review show that about 20% (~100) of the resources that were initially considered for inclusion passed final review. Reviewer comments are recorded as annotations to enhance the resources in the collection and to help educators implement them in their curricula. CLEAN launched the first collection of digital educational resources about climate science and energy awareness in November 2010. The final CLEAN collection will include ≥500 resources and will also provide alignment with the Benchmarks for Science Literacy and the NAAEE Excellence in Environmental Education Guidelines for Learning through the interactive NSDL strandmaps. We will present the first user feedback on this new collection.
Tracking and Establishing Provenance of Earth Science Datasets: A NASA-Based Example
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram K.; Goldstein, Justin C.; Hua, Hook; Wolfe, Robert E.
2016-01-01
Information quality is of paramount importance to science. Accurate, scientifically vetted and statistically meaningful and, ideally, reproducible information engenders scientific trust and research opportunities. Not surprisingly, federal bodies (e.g., NASA, NOAA, USGS) have very strictly affirmed the importance of information quality in their product requirements. So-called Highly Influential Scientific Assessments (HISA) such as The Third US National Climate Assessment (NCA3) published in 2014 undergo a very rigorous review process to ensure transparency and credibility. To support the transparency of such reports, the U.S. Global Change Research Program (USGCRP) has developed the Global Change Information System (GCIS). A recent activity was performed to trace the provenance as completely as possible for all NCA3 figures that were predominantly based on NASA data. This poster presents the mechanics of that project and the lessons learned from that activity.
Disaster Victim Identification: quality management from an odontology perspective.
Lake, A W; James, H; Berketa, J W
2012-06-01
The desired outcome of the victim identification component of a mass fatality event is correct identification of deceased persons in a timely manner allowing legal and social closure for relatives of the victims. Quality Management across all aspects of the Disaster Victim Identification (DVI) structure facilitates this process. Quality Management in forensic odontology is the understanding and implementation of a methodology that ensures collection, collation and preservation of the maximum amount of available dental data and the appropriate interpretation of that data to achieve outcomes to a standard expected by the DVI instructing authority, impacted parties and the forensic odontology specialist community. Managerial pre-event planning responsibility, via an odontology coordinator, includes setting a chain of command, developing and reviewing standard operating procedures (SOP), ensuring use of current scientific methodologies and staff training. During a DVI managerial responsibility includes tailoring SOP to the specific situation, ensuring member accreditation, encouraging inter-disciplinary cooperation and ensuring security of odontology data and work site. Individual responsibilities include the ability to work within a team, accept peer review, and share individual members' skill sets to achieve the best outcome. These responsibilities also include adherence to chain of command and the SOP, maintenance of currency of knowledge and recognition of professional boundaries of expertise. This article highlights issues of Quality Management pertaining particularly to forensic odontology but can also be extrapolated to all DVI actions.
Griffin, G; Clark, J MacArthur; Zurlo, J; Ritskes-Hoitinga, M
2014-04-01
The principles of humane experimental technique, first described by Russell and Burch in 1959, focus on minimising suffering to animals used for scientific purposes. Internationally, as these principles became embedded in the various systems of oversight for the use of animals in science, attention focused on how to minimise pain, distress and lasting harm to animals while maximising the benefits to be obtained from the work. Suffering can arise from the experimental procedures, but it can also arise from the manner in which the animals are housed and cared for. Increased attention is therefore being paid to the entire lifetime experience of an animal, in order to afford it as good a quality of life as possible. Russell and Burch were also concerned that animals should not be used if alternatives to such use were available, and that animals were not wasted through poor-quality science. This concept is being revisited through new efforts to ensure that experiments are well designed and properly reported in the literature, that all results--positive, negative or neutral--are made available to ensure a complete research record, and that animal models are properly evaluated through periodic systematic reviews. These efforts should ensure that animal use is truly reduced as far as possible and that the benefits derived through the use of animals truly outweigh the harms.
Study on Providing Professors with Efficient Service Based on Time Management Strategy
ERIC Educational Resources Information Center
Li, Chunlin; Liu, Mengchao; Wang, Yining
2016-01-01
Time management is the study of using time scientifically by deploying skills, techniques and means, and maximizing the value of time to help individuals or organizations efficiently complete tasks and achieve goals. University professors as a body are an important force in teaching and research. In order to ensure high-quality teaching, productive research,…
Crystallographic publishing in the electronic age
NASA Astrophysics Data System (ADS)
Strickland, P. R.; McMahon, B.
2008-01-01
The journal publishing activities of the IUCr over the past 60 years are described, together with the new technological, economic and cultural challenges faced by the journals. Particular emphasis is placed on the role of innovative publishing technologies in ensuring the quality of the published information and in providing effective access to the data underpinning the scientific results.
Saying One Thing and Doing Another: The Paradox of Best Practices and Sex Education
ERIC Educational Resources Information Center
Oster, Maryjo M.
2008-01-01
The No Child Left Behind Act of 2001 (NCLB) specifies that all educational programs or curricula be supported by "scientifically based research" in order to ensure better quality control. However, in the arena of sex education, the federal government allocates millions of dollars in grants for schools and organizations to implement…
Promoting and evaluating scientific rigour in qualitative research.
Baillie, Lesley
2015-07-15
This article explores perspectives on qualitative research and the variety of views concerning rigour in the research process. Evaluating and ensuring the quality of research are essential considerations for practitioners who are appraising evidence to inform their practice or research. Several criteria and principles for evaluating quality in qualitative research are presented, recognising that their application in practice is influenced by the qualitative methodology used. The article examines a range of techniques that a qualitative researcher can use to promote rigour and apply it to practice.
2010-04-01
There is no 'A' in CD&E, neither for Analysis nor for Anarchy – Ensuring...analytical support as quality assurance. For managers of CD&E, it is necessary to be able to state that scarce resources are being used to develop the
Improvement of Productivity in TIG Welding Plant by Equipment Design in Orbit
NASA Astrophysics Data System (ADS)
Gnanavel, C.; Saravanan, R.; Chandrasekaran, M.; Jayakanth, J. J.
2017-03-01
Measurement and improvement are indispensable tasks at all levels of management. Some examples: at the operator level, measuring operating parameters to ensure OEE (Overall Equipment Effectiveness) and measuring Q-component performance to ensure quality; at the supervisory level, measuring operators' performance to ensure labour utility; at the managerial level, production and productivity measurements; and at the top level, capital and capacity utilization. An often-accepted statement is "Improvement is impossible without measurement". Measurement is often referred to as observation. The case study was conducted at a Government Boiler factory in India. A scientific approach was followed for identifying non-value-added activities. Newly designed, personalised equipment was installed, achieving a productivity improvement of 85% in a day. The new equipment can serve 360° around its axis; hence it simplified loading and unloading procedures, reduced their times, and ensured effective use of space and time.
Nedza, Susan M
2009-12-01
As the government attempts to address the high cost of health care in the United States, the issues being confronted include variations in the quality of care administered and the inconsistent application of scientifically proven treatments. To improve quality, methods of measurement and reporting with rewards or, eventually, penalties based on performance, must be developed. To date, well-intentioned national policy initiatives, such as value-based purchasing, have focused primarily on the measurement of discrete events and on attempts to construct incentives. While important, the current approach alone cannot improve quality, ensure equitability, decrease variability, and optimize value. Additional thought-leadership is required, both theoretical and applied. Academic medical centers' (AMCs') scholarly and practical participation is needed. Although quality cannot be sustainably improved without measurement, the existing measures alone do not ensure quality. There is not enough evidence to support strong measure development and, further, not enough insight regarding whether the existing measures have their intended effect of enhancing health care delivery that results in quality outcomes for patients. Perhaps the only way that the United States health care system will achieve a standard of quality care is through the strong embrace, effective engagement, intellectual insights, educational contributions, and practical applications in AMCs. Quality will never be achieved through public policies or national initiatives alone but instead through the commitment of the academic community to forward the science of performance measurement and to ensure that measurement leads to better health outcomes for our nation.
The impact of the EU general data protection regulation on scientific research.
Chassang, Gauthier
2017-01-01
The use of personal data is critical to ensure quality and reliability in scientific research. The new Regulation [European Union (EU)] 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data [general data protection regulation (GDPR)], repealing Directive 95/46/EC, strengthens and harmonises the rules for protecting individuals' privacy rights and freedoms within and, under certain conditions, outside the EU territory. This new and historic legal milestone both prolongs and updates the EU acquis of the previous Data Protection Directive 95/46/EC. The GDPR fixes both general rules applying to any kind of personal data processing and specific rules applying to the processing of special categories of personal data, such as health data, in the context of scientific research, including clinical and translational research areas. This article aims to provide an overview of the new rules to consider where scientific projects include the processing of personal health data, genetic data or biometric data and other kinds of sensitive information whose use is strictly regulated by the GDPR, in order to give researchers the key facts they need to adapt their practices and ensure compliance with EU law, which will be enforced from May 2018.
The impact of the EU general data protection regulation on scientific research
Chassang, Gauthier
2017-01-01
The use of personal data is critical to ensure quality and reliability in scientific research. The new Regulation [European Union (EU)] 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data [general data protection regulation (GDPR)], repealing Directive 95/46/EC, strengthens and harmonises the rules for protecting individuals' privacy rights and freedoms within and, under certain conditions, outside the EU territory. This new and historic legal milestone both prolongs and updates the EU acquis of the previous Data Protection Directive 95/46/EC. The GDPR fixes both general rules applying to any kind of personal data processing and specific rules applying to the processing of special categories of personal data, such as health data, in the context of scientific research, including clinical and translational research areas. This article aims to provide an overview of the new rules to consider where scientific projects include the processing of personal health data, genetic data or biometric data and other kinds of sensitive information whose use is strictly regulated by the GDPR, in order to give researchers the key facts they need to adapt their practices and ensure compliance with EU law, which will be enforced from May 2018. PMID:28144283
ESGF and WDCC: The Double Structure of the Digital Data Storage at DKRZ
NASA Astrophysics Data System (ADS)
Toussaint, F.; Höck, H.
2016-12-01
For several years now, digital repositories in climate science have faced new challenges: international projects are global collaborations, and data storage has in parallel moved to federated, distributed storage systems like ESGF. For long-term archival (LTA) storage, on the other hand, communities, funders, and data users make stronger demands on data and metadata quality to facilitate data use and reuse. At DKRZ, this situation has led to a twofold data dissemination system, which influences the administration, workflows, and sustainability of the data. The ESGF system is focused on the needs of users as partners in global projects. It includes replication tools, detailed global project standards, and efficient search for the data to download. In contrast, DKRZ's classical CERA LTA storage aims at long-term data holding and data curation as well as data reuse, requiring high metadata quality standards. In addition, for LTA data a Digital Object Identifier publication service for the direct integration of research data into scientific publications has been implemented. The editorial process at DKRZ-LTA ensures the quality of metadata and research data. A DOI and a citation code are provided and afterwards registered under DataCite's (datacite.org) regulations. Across the overall data life cycle, continuous reliability of data and metadata quality is essential to allow for data handling at the petabyte level, long-term usability of the data, and adequate publication of the results. These considerations lead to the question "What is quality?" - with respect to the data, the repository itself, the publisher, and the user. Global consensus is needed for these assessments, as the phases of the end-to-end workflow mesh with one another: for data and metadata, checks need to go hand in hand with the processes of production and storage. The results can be judged following a Quality Maturity Matrix (QMM), and repositories can be certified according to their trustworthiness.
For the publication of any scientific conclusions, the scientific community, funders, media, and policy makers ask about the publisher's impact in terms of reader credit, circulation, and presentation quality. The paper describes the data life cycle, with emphasis on the different levels of quality assessment which at DKRZ ensure the data and metadata quality.
ERIC Educational Resources Information Center
Lorenzo, G.; Santagueda, M.
2016-01-01
The current system of evaluating and comparing scientific production by impact factor has been criticised from different perspectives in recent years, and it has been argued that publishing in high-impact journals does not necessarily imply that works are of high quality. Many of these works are never cited or, in the best of cases, only a very small…
Astronomical Instrumentation Systems Quality Management Planning: AISQMP
NASA Astrophysics Data System (ADS)
Goldbaum, Jesse
2017-06-01
The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.
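The AISQMP steps described above (track objective quality measures, control for unwanted variation) can be sketched with a simple statistical control check. This is a hedged illustration, not the paper's method: the choice of FWHM as the tracked measure, the sample values, and the 3-sigma limit are all assumptions for the example.

```python
# Hypothetical sketch of tracking an objective quality measure (e.g. the
# FWHM of stellar images from an imaging system) and flagging unwanted
# variation using 3-sigma control limits derived from past measurements.
import statistics

def out_of_control(history, new_value, k=3.0):
    """Flag new_value if it falls outside mean +/- k*stdev of history."""
    mean = statistics.fmean(history)
    sigma = statistics.stdev(history)
    return abs(new_value - mean) > k * sigma

# Illustrative nightly FWHM measurements in arcseconds.
fwhm_history = [2.1, 2.0, 2.2, 2.1, 1.9, 2.0, 2.1, 2.2]

print(out_of_control(fwhm_history, 3.4))   # True: likely focus or seeing problem
print(out_of_control(fwhm_history, 2.05))  # False: within normal variation
```

Once such a check is in place, the remaining AISQMP effort is the continual one the abstract names: reducing the spread of `fwhm_history` itself toward the instrument's theoretical optimum.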
Astronomical Instrumentation Systems Quality Management Planning: AISQMP (Abstract)
NASA Astrophysics Data System (ADS)
Goldbaum, J.
2017-12-01
(Abstract only) The capability of small aperture astronomical instrumentation systems (AIS) to make meaningful scientific contributions has never been better. The purpose of AIS quality management planning (AISQMP) is to ensure the quality of these contributions such that they are both valid and reliable. The first step involved with AISQMP is to specify objective quality measures not just for the AIS final product, but also for the instrumentation used in its production. The next step is to set up a process to track these measures and control for any unwanted variation. The final step is continual effort applied to reducing variation and obtaining measured values near optimal theoretical performance. This paper provides an overview of AISQMP while focusing on objective quality measures applied to astronomical imaging systems.
Attitudes About Regulation Among Direct-to-Consumer Genetic Testing Customers
Green, Robert C.; Kaufman, David
2013-01-01
Introduction: The first regulatory rulings by the U.S. Food and Drug Administration on direct-to-consumer (DTC) genetic testing services are expected soon. As the process of regulating these and other genetic tests moves ahead, it is important to understand the preferences of DTC genetic testing customers about the regulation of these products. Methods: An online survey of customers of three DTC genetic testing companies was conducted 2–8 months after they had received their results. Participants were asked about the importance of regulating the companies selling DTC genetic tests. Results: Most of the 1,046 respondents responded that it would be important to have a nongovernmental (84%) or governmental agency (73%) monitor DTC companies' claims to ensure consistency with scientific evidence. However, 66% also felt that it was important that DTC tests be available without governmental oversight. Nearly all customers favored a policy to ensure that insurers and law enforcement officials could not access their information. Discussion: Although many DTC customers want access to genetic testing services without restrictions imposed by government regulation, most also favor an organization operating alongside DTC companies to ensure that the claims made by the companies are consistent with sound scientific evidence. This seeming contradiction may indicate that DTC customers want to ensure that they have unfettered access to high-quality information. Additionally, policies to help ensure the privacy of data would be welcomed by customers, despite relatively high confidence in the companies. PMID:23560882
Attitudes about regulation among direct-to-consumer genetic testing customers.
Bollinger, Juli Murphy; Green, Robert C; Kaufman, David
2013-05-01
The first regulatory rulings by the U.S. Food and Drug Administration on direct-to-consumer (DTC) genetic testing services are expected soon. As the process of regulating these and other genetic tests moves ahead, it is important to understand the preferences of DTC genetic testing customers about the regulation of these products. An online survey of customers of three DTC genetic testing companies was conducted 2-8 months after they had received their results. Participants were asked about the importance of regulating the companies selling DTC genetic tests. Most of the 1,046 respondents responded that it would be important to have a nongovernmental (84%) or governmental agency (73%) monitor DTC companies' claims to ensure consistency with scientific evidence. However, 66% also felt that it was important that DTC tests be available without governmental oversight. Nearly all customers favored a policy to ensure that insurers and law enforcement officials could not access their information. Although many DTC customers want access to genetic testing services without restrictions imposed by government regulation, most also favor an organization operating alongside DTC companies to ensure that the claims made by the companies are consistent with sound scientific evidence. This seeming contradiction may indicate that DTC customers want to ensure that they have unfettered access to high-quality information. Additionally, policies to help ensure the privacy of data would be welcomed by customers, despite relatively high confidence in the companies.
2015-01-01
The mission of the Water Resources Discipline of the U.S. Geological Survey (USGS) is to provide the information and understanding needed for wise management of the Nation's water resources. Inherent in this mission is the responsibility to collect data that accurately describe the physical, chemical, and biological attributes of water systems. These data are used for environmental and resource assessments by the USGS, other government agencies and scientific organizations, and the general public. Reliable and quality-assured data are essential to the credibility and impartiality of the water-resources appraisals carried out by the USGS. The development and use of a National Field Manual is necessary to achieve consistency in the scientific methods and procedures used, to document those methods and procedures, and to maintain technical expertise. USGS field personnel use this manual to ensure that the data collected are of the quality required to fulfill our mission.
Ten tips for authors of scientific articles.
Hong, Sung-Tae
2014-08-01
Writing a good quality scientific article takes experience and skill. I propose 'Ten Tips' that may help to improve the quality of manuscripts for scholarly journals. It is advisable to draft the first version of the manuscript and revise it repeatedly for consistency and accuracy of the writing. During drafting and revising, the following tips can be considered: 1) focus on design to have proper content, conclusions, points compliant with the scope of the target journal, an appropriate list of authors and contributors, and relevant references from widely visible sources; 2) format the manuscript in accordance with the instructions to authors of the target journal; 3) ensure consistency and logical flow of ideas and scientific facts; 4) provide scientific confidence; 5) make your story interesting for your readers; 6) write short, simple and attractive sentences; 7) bear in mind that properly composed and reflective titles increase the chances of attracting more readers; 8) do not forget that well-structured and readable abstracts improve the citability of your publications; 9) when revising, adhere to the rule of 'First and Last' - open your text with a topic paragraph and close it with a resolution paragraph; 10) use connecting words to link sentences within a paragraph by repeating relevant keywords.
Ten Tips for Authors of Scientific Articles
2014-01-01
Writing a good quality scientific article takes experience and skill. I propose 'Ten Tips' that may help to improve the quality of manuscripts for scholarly journals. It is advisable to draft the first version of the manuscript and revise it repeatedly for consistency and accuracy of the writing. During drafting and revising, the following tips can be considered: 1) focus on design to have proper content, conclusions, points compliant with the scope of the target journal, an appropriate list of authors and contributors, and relevant references from widely visible sources; 2) format the manuscript in accordance with the instructions to authors of the target journal; 3) ensure consistency and logical flow of ideas and scientific facts; 4) provide scientific confidence; 5) make your story interesting for your readers; 6) write short, simple and attractive sentences; 7) bear in mind that properly composed and reflective titles increase the chances of attracting more readers; 8) do not forget that well-structured and readable abstracts improve the citability of your publications; 9) when revising, adhere to the rule of 'First and Last' - open your text with a topic paragraph and close it with a resolution paragraph; 10) use connecting words to link sentences within a paragraph by repeating relevant keywords. PMID:25120310
Four simple recommendations to encourage best practices in research software
Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965
Four simple recommendations to encourage best practices in research software.
Jiménez, Rafael C; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.
Facilitating Stewardship of scientific data through standards based workflows
NASA Astrophysics Data System (ADS)
Bastrakova, I.; Kemp, C.; Potter, A. K.
2013-12-01
There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results: firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define scientific concepts and the relationships between them. All three types of standards have to be utilised by the practising scientist so that those who ultimately steward the data can ensure that it is preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in the scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observation or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data, enabling scientists to link disparate data to scientific concepts. The first two steps provide a necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies.
Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between scientific data acquisition and analysis requirements and effective, interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to geophysical data. By ensuring that geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow where the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.
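The discovery step in the workflow above can be illustrated with a minimal, schematic metadata record. This is a sketch only: the element names are simplified stand-ins for the real ISO 19115 schema (which uses `gmd:`/`gco:` namespaces), and the dataset title and bounding box are hypothetical.

```python
import xml.etree.ElementTree as ET

def build_discovery_record(title, abstract, bbox):
    """Build a simplified discovery-metadata record.

    Element names here are illustrative stand-ins, not the full
    ISO 19115 schema; bbox is (west, east, south, north) in degrees.
    """
    record = ET.Element("MD_Metadata")
    ET.SubElement(record, "title").text = title
    ET.SubElement(record, "abstract").text = abstract
    extent = ET.SubElement(record, "geographicExtent")
    for name, value in zip(("west", "east", "south", "north"), bbox):
        ET.SubElement(extent, name).text = str(value)
    return ET.tostring(record, encoding="unicode")

# Hypothetical marine survey dataset
xml_text = build_discovery_record(
    "Marine survey GA-0312",
    "Multibeam bathymetry, Australian EEZ",
    (110.0, 155.0, -45.0, -10.0),
)
```

A production workflow would validate such records against the full ISO schema before publishing them to a catalogue.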
Selecting clinical quality indicators for laboratory medicine.
Barth, Julian H
2012-05-01
Quality in laboratory medicine is often described as doing the right test at the right time for the right person. Laboratory processes currently operate under the oversight of an accreditation body, which gives confidence that the process is good. However, there are aspects of quality that are not measured by these processes. These are largely focused on ensuring that the most clinically appropriate test is performed and interpreted correctly. Clinical quality indicators were selected through a two-phase process. Firstly, a series of focus groups of clinical scientists was held with the aim of developing a list of quality indicators. These were subsequently ranked in order by an expert panel of primary and secondary care physicians. The 10 top indicators included the communication of critical results, comprehensive education for all users and adequate quality assurance for point-of-care testing. Laboratories should ensure their tests are used to national standards, that they have clinical utility, are calibrated to national standards and have long-term stability for chronic disease management. Laboratories should have error logs and demonstrate evidence of measures introduced to reduce the chances of similar future errors. Laboratories should make a formal scientific evaluation of analytical quality. This paper describes the process of selection of quality indicators for laboratory medicine that have been validated sequentially by deliverers and users of the service. They now need to be converted into measurable variables related to outcome and validated in practice.
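The two-phase selection described above ends with an expert panel ranking the candidate indicators. The paper does not specify the panel's scoring rule; a Borda count is one conventional way to aggregate several ordered lists into a single ranking, sketched here with hypothetical indicator names.

```python
def borda_rank(rankings):
    """Aggregate several expert rankings with a Borda count.

    rankings: list of lists, each ordered best-first. An item in
    position i of a list of length n scores n - i points; items are
    returned sorted by total score, highest first.
    """
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for position, item in enumerate(ranking):
            scores[item] = scores.get(item, 0) + (n - position)
    return sorted(scores, key=scores.get, reverse=True)

# Two hypothetical panel members ranking three candidate indicators
consensus = borda_rank([
    ["critical results", "POCT QA", "user education"],
    ["critical results", "user education", "POCT QA"],
])
```

The Borda count is only one possible rule; a real panel might instead use pairwise comparison or a Delphi-style consensus round.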
Kroes, Burt H
2014-12-02
In the European Union a complex regulatory framework is in place for the regulation of (traditional) herbal medicinal products. It is based on the principle that a marketing authorisation granted by the competent authorities is required for placing medicinal products on the market. The requirements and procedures for acquiring such a marketing authorisation are laid down in regulations, directives and scientific guidelines. This paper gives an overview of the quality requirements for (traditional) herbal medicinal products that are contained in European pharmaceutical legislation. The pharmaceutical quality of a medicinal product is the basis for ensuring safe and effective medicines. The basic principles governing the assurance of the quality of medicinal products in the European Union are primarily defined in the amended Directive 2001/83/EC and Directive 2003/63/EC. Quality requirements for herbal medicinal products are also laid down in scientific guidelines. Scientific guidelines provide a basis for practical harmonisation of how the competent authorities of EU Member States interpret and apply the detailed requirements for the demonstration of quality laid down in regulations and directives. Detailed quality requirements for herbal medicinal products on the European market are contained in European Union (EU) pharmaceutical legislation. They include a system of manufacturing authorisations which ensures that all herbal medicinal products on the European market are manufactured or imported only by authorised manufacturers, whose activities are regularly inspected by the competent authorities. Additionally, only active substances that have been manufactured in accordance with the GMP for starting materials, as adopted by the Community, are allowed as starting materials. The European regulatory framework encompasses specific requirements for herbal medicinal products. These requirements are independent of the legal status.
Thus, the same quality standards equally apply to herbal products based on clinical evidence and traditional herbal medicinal products. The basic principle is that the quality of herbal medicinal products is intrinsically associated with the quality standard of the herbal substances and/or herbal preparations. Furthermore, the herbal substance or herbal preparation in its entirety is regarded as the active substance. Consequently, a mere determination of the content of marker(s) or constituents with known therapeutic activity is not sufficient for the quality control of herbal medicinal products. Specific quality requirements include thorough product characterisation, adherence to the Good Agricultural and Collection Practices, good manufacturing practices and validated manufacturing process, e.g., raw material testing, in-process testing, fingerprint characterisation etc. Quality control of herbal medicinal products is primarily intended to define the quality of the herbal substance/preparation and herbal medicinal product rather than to establish full characterisation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Slessarev, Yuri Vassilyevich; Moisseyev, Vassily Borisovich; Vostroknutov, Evgeniy Vladimirovich
2015-01-01
This article describes pedagogical conditions for ensuring students' readiness for scientific research, on the basis of scientific literature and the experience of Penza State Technological University students. Introduction of the suggested conditions favors the process of training highly skilled experts who are ready to generate new ideas in fields…
Development of a surgical educational research program-fundamental principles and challenges.
Ahmed, Kamran; Ibrahim, Amel; Anderson, Oliver; Patel, Vanash M; Zacharakis, Emmanouil; Darzi, Ara; Paraskeva, Paraskevas; Athanasiou, Thanos
2011-05-15
Surgical educational research is the scientific investigation of any aspect of surgical learning, teaching, training, and assessment. The research into development and validation of educational tools is vital to optimize patient care. This can be accomplished by establishing high quality educational research programs within academic surgical departments. This article aims to identify the components involved in educational research and describes the challenges as well as solutions to establishing a high quality surgical educational research program. A variety of sources including journal articles, books, and online literature were reviewed in order to determine the pathways involved in conducting educational research and establishing a research program. It is vital to ensure that educational research is acceptable, innovative, robust in design, funded correctly, and disseminated successfully. Challenges faced by the current surgical research programs include structural organization, academic support, credibility, time, funding, relevance, and growth. The solutions to these challenges have been discussed. To ensure research in surgical education is of high quality and yields credible results, strong leadership in the organization of an educational research program is necessary. Copyright © 2011 Elsevier Inc. All rights reserved.
Science in Emergency Response at CDC: Structure and Functions.
Iskander, John; Rose, Dale A; Ghiya, Neelam D
2017-09-01
Recent high-profile activations of the US Centers for Disease Control and Prevention (CDC) Emergency Operations Center (EOC) include responses to the West African Ebola and Zika virus epidemics. Within the EOC, emergency responses are organized according to the Incident Management System, which provides a standardized structure and chain of command, regardless of whether the EOC activation occurs in response to an outbreak, natural disaster, or other type of public health emergency. By embedding key scientific roles, such as the associate director for science, and functions within a Scientific Response Section, the current CDC emergency response structure ensures that both urgent and important science issues receive needed attention. Key functions during emergency responses include internal coordination of scientific work, data management, information dissemination, and scientific publication. We describe a case example involving the ongoing Zika virus response that demonstrates how the scientific response structure can be used to rapidly produce high-quality science needed to answer urgent public health questions and guide policy. Within the context of emergency response, longer-term priorities at CDC include both streamlining administrative requirements and funding mechanisms for scientific research.
Tracking and Establishing Provenance of Earth Science Datasets: A NASA-based Example
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram K.; Goldstein, Justin C.; Hua, Hook; Wolfe, Robert E.
2016-01-01
Information quality is of paramount importance to science. Accurate, scientifically vetted, statistically meaningful and, ideally, reproducible information engenders scientific trust and research opportunities. Therefore, so-called Highly Influential Scientific Assessments (HISA) such as the U.S. Third National Climate Assessment (NCA3) undergo a very rigorous process to ensure transparency and credibility. As an activity to support the transparency of such reports, the U.S. Global Change Research Program has developed the Global Change Information System (GCIS). Specifically related to the transparency of NCA3, a recent activity was carried out to trace the provenance as completely as possible for all figures in the NCA3 report that predominantly used NASA data. This paper discusses lessons learned from this activity to trace the provenance of NASA figures in a major HISA-class report.
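Tracing a figure back through its source datasets, as described above, amounts to walking a derivation graph. The sketch below uses hypothetical identifiers and plain dictionaries; GCIS itself records provenance in a structured information system rather than in ad hoc code like this.

```python
# Hypothetical derivation graph: artifact -> list of immediate parents.
derived_from = {
    "nca3-figure-2.1": ["gridded-temperature-product"],
    "gridded-temperature-product": ["modis-l2-swaths", "station-records"],
    "modis-l2-swaths": [],
    "station-records": [],
}

def trace_provenance(artifact, graph):
    """Depth-first walk returning every upstream source of an artifact."""
    sources = []
    for parent in graph.get(artifact, []):
        sources.append(parent)
        sources.extend(trace_provenance(parent, graph))
    return sources
```

Calling `trace_provenance("nca3-figure-2.1", derived_from)` yields the figure's full upstream lineage, which is essentially what the NCA3 tracing exercise reconstructed by hand for each NASA figure.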
Towards an effective data peer review
NASA Astrophysics Data System (ADS)
Düsterhus, André; Hense, Andreas
2014-05-01
Peer review is an established procedure to ensure the quality of scientific publications and is currently used as a prerequisite for acceptance of papers in the scientific community. In recent years the publication of raw data and its metadata has received increased attention, which led to the idea of bringing it to the same standards that journals apply to traditional publications. One missing element to achieve this is a comparable peer review scheme. This contribution introduces the idea of a quality evaluation process, which is designed to analyse the technical quality as well as the content of a dataset. It is based on quality tests, whose results are evaluated with the help of the knowledge of an expert. The results of the tests and the expert knowledge are evaluated probabilistically and are statistically combined. As a result, the quality of a dataset is estimated with a single value. This approach allows the reviewer to quickly identify the potential weaknesses of a dataset and generate a transparent and comprehensible report. To demonstrate the scheme, an application to a large meteorological dataset will be shown. Furthermore, the potentials and risks of such a scheme will be introduced, and practical implications of its possible introduction at data centres investigated. In particular, the effects of reducing the estimate of the quality of a dataset to a single number will be critically discussed.
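The combination step described above — probabilistic test results weighted by expert knowledge and reduced to a single value — can be sketched as follows. The weighted geometric mean used here is one illustrative choice, not necessarily the statistical combination the authors propose, and the test names are hypothetical.

```python
import math

def dataset_quality(test_results, expert_weights):
    """Combine per-test pass probabilities into one quality estimate.

    test_results:   {test_name: probability the dataset passes the test}
    expert_weights: {test_name: expert-assigned importance weight}
    Returns a single value in (0, 1], computed as a weighted
    geometric mean of the per-test probabilities.
    """
    total_w = sum(expert_weights.values())
    log_score = sum(
        expert_weights[name] * math.log(p)
        for name, p in test_results.items()
    )
    return math.exp(log_score / total_w)

score = dataset_quality(
    {"completeness": 0.9, "plausible_range": 0.8},
    {"completeness": 1.0, "plausible_range": 1.0},
)
```

A geometric mean has the property the abstract hints at: one badly failing test drags the single number down sharply, which is also why collapsing to one value deserves the critical discussion the authors promise.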
Consideration and Checkboxes: Incorporating Ethics and Science into the 3Rs
Landi, Margaret S; Shriver, Adam J; Mueller, Anne
2015-01-01
Members of the research community aim to both produce high-quality research and ensure that harm is minimized in animals. The primary means of ensuring these goals are both met is the 3Rs framework of replacement, reduction, and refinement. However, some approaches to the 3Rs may result in a ‘check box mentality’ in which IACUC members, researchers, administrators, and caretakers check off a list of tasks to evaluate a protocol. We provide reasons for thinking that the 3Rs approach could be enhanced with more explicit discussion of the ethical assumptions used to arrive at an approved research protocol during IACUC review. Here we suggest that the notion of moral considerability, and all of the related issues it gives rise to, should be incorporated into IACUC discussions of 3Rs deliberations during protocol review to ensure that animal wellbeing is enhanced within the constraints of scientific investigation. PMID:25836970
Nolte, Kurt B; Stewart, Douglas M; O'Hair, Kevin C; Gannon, William L; Briggs, Michael S; Barron, A Marie; Pointer, Judy; Larson, Richard S
2008-10-01
The authors developed a novel continuous quality improvement (CQI) process for academic biomedical research compliance administration. A challenge in developing a quality improvement program in a nonbusiness environment is that the terminology and processes are often foreign. Rather than training staff in an existing quality improvement process, the authors opted to develop a novel process based on the scientific method--a paradigm familiar to all team members. The CQI process included our research compliance units. Unit leaders identified problems in compliance administration where a resolution would have a positive impact and which could be resolved or improved with current resources. They then generated testable hypotheses about a change to standard practice expected to improve the problem, and they developed methods and metrics to assess the impact of the change. The CQI process was managed in a "peer review" environment. The program included processes to reduce the incidence of infections in animal colonies, decrease research protocol-approval times, improve compliance and protection of animal and human research subjects, and improve research protocol quality. This novel CQI approach is well suited to the needs and the unique processes of research compliance administration. Using the scientific method as the improvement paradigm fostered acceptance of the project by unit leaders and facilitated the development of specific improvement projects. These quality initiatives will allow us to improve support for investigators while ensuring that compliance standards continue to be met. We believe that our CQI process can readily be used in other academically based offices of research.
[Managing a health research institute: towards research excellence through continuous improvement].
Olmedo, Carmen; Buño, Ismael; Plá, Rosa; Lomba, Irene; Bardinet, Thierry; Bañares, Rafael
2015-01-01
Health research institutes are a strategic commitment considered the ideal environment to develop excellence in translational research. Achieving quality research requires not only a powerful scientific and research structure but also the quality and integrity of management systems that support it. The essential instruments in our institution were solid strategic planning integrated into and consistent with the system of quality management, systematic evaluation through periodic indicators, measurement of key user satisfaction and internal audits, and implementation of an innovative information management tool. The implemented management tools have provided a strategic thrust to our institute while ensuring a level of quality and efficiency in the development and management of research that allows progress towards excellence in biomedical research. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.
Wilfley, Denise E.; Staiano, Amanda E.; Altman, Myra; Lindros, Jeanne; Lima, Angela; Hassink, Sandra G.; Dietz, William H.; Cook, Stephen
2017-01-01
Objectives To improve systems of care to advance implementation of the U.S. Preventive Services Task Force recommendations for childhood obesity treatment (i.e. clinicians offer/refer children with obesity to intensive, multicomponent behavioral interventions of >25 hours over 6–12 months to improve weight status) and to expand payment for these services. Methods In July 2015, forty-three cross-sector stakeholders attended a conference supported by the Agency for Healthcare Research and Quality, American Academy of Pediatrics Institute for Healthy Childhood Weight, and The Obesity Society. Plenary sessions presenting scientific evidence and clinical and payment practices were interspersed with breakout sessions to identify consensus recommendations. Results Consensus recommendations for childhood obesity treatment included: family-based multicomponent behavioral therapy; integrated care model; and multi-disciplinary care team. The use of evidence-based protocols, a well-trained healthcare team, medical oversight, and treatment at or above the minimum dose (e.g. >25 hours) are critical components to ensure effective delivery of high-quality care and to achieve clinically meaningful weight loss. Approaches to secure reimbursement for evidence-based obesity treatment within payment models were recommended. Conclusion Continued cross-sector collaboration is crucial to ensure a unified approach to increase payment and access for childhood obesity treatment and to scale-up training to ensure quality of care. PMID:27925451
Scientific Framework for Stormwater Monitoring by the Washington State Department of Transportation
Sheibley, R.W.; Kelly, V.J.; Wagner, R.J.
2009-01-01
The Washington State Department of Transportation municipal stormwater monitoring program, in operation for about 8 years, never has received an external, objective assessment. In addition, the Washington State Department of Transportation would like to identify the standard operating procedures and quality assurance protocols that must be adopted so that their monitoring program will meet the requirements of the new National Pollutant Discharge Elimination System municipal stormwater permit. As a result, in March 2009, the Washington State Department of Transportation asked the U.S. Geological Survey to assess their pre-2009 municipal stormwater monitoring program. This report presents guidelines developed for the Washington State Department of Transportation to meet new permit requirements and regional/national stormwater monitoring standards to ensure that adequate processes and procedures are identified to collect high-quality, scientifically defensible municipal stormwater monitoring data. These include: (1) development of coherent vision and cooperation among all elements of the program; (2) a comprehensive approach for site selection; (3) an effective quality assurance program for field, laboratory, and data management; and (4) an adequate database and data management system.
Balance in scientific impact assessment: the EGU Awards Committee experience
NASA Astrophysics Data System (ADS)
Montanari, Alberto
2016-04-01
Evaluation of scientific impact is becoming an essential step all over the world for assigning academic positions, funding and recognition. Impact is generally assessed by means of objective bibliometric indicators, which are frequently integrated with a subjective evaluation by one or more individuals. An essential requirement of impact assessment is to ensure balance across several potential discriminating factors, including gender, ethnicity, culture, scientific field and many others. Scientific associations need to ensure balance in every step of their activity, in particular when electing their representatives, evaluating scientific contributions, reviewing papers and assigning awards. While ensuring balance is a strict necessity, how to reach that target is still a matter of lively debate. In fact, the context of science is very different from the general context of society, and the need for scientific associations to maintain confidentiality in their evaluation procedures makes the application of transparent procedures more complicated. This talk aims to present the experience and the efforts of the European Geosciences Union to ensure balance, with a particular focus on gender balance. Data and statistics will be presented in an attempt to provide constructive indications for reaching the target of giving equal opportunities to researchers across genders, continents and ethnic groups. Science is a unifying discipline, and balance will be vital to ensure that humans and our planet co-evolve sustainably.
Quality Management in Astronomical Software and Data Systems
NASA Astrophysics Data System (ADS)
Radziwill, N. M.
2007-10-01
As the demand for more sophisticated facilities increases, the complexity of the technical and organizational challenges faced by operational space- and ground-based telescopes also increases. In many organizations, funding tends not to be proportional to this trend, and steps must be taken to cultivate a lean environment in both development and operations to consistently do more with less. To facilitate this transition, an organization must be aware of how it can meet quality-related goals, such as reducing variation, improving productivity of people and systems, streamlining processes, ensuring compliance with requirements (scientific, organizational, project, or regulatory), and increasing user satisfaction. Several organizations are already on this path. Quality-based techniques for the efficient, effective development of new telescope facilities and maintenance of existing facilities are described.
[Mental health: an identification of new directions walking in Archie Cochrane footsteps].
Tansella, Michele
2006-10-01
New borders and promising new directions in scientific fields are often difficult to identify and define. This paper attempts to do so in relation to recent developments and new research evidence in mental health, recognizing that this exercise may be biased by many factors, including the author's own perspective, professional background and research training, and the ever-present dialectic between the analysis of the past and the attraction of the future. A good starting point is the ground-breaking work by Sir Archibald Cochrane. He recommended adopting a rigorous and continuous evaluation of clinical practice and protocols, promoting well-designed clinical research and the use of scientific methods. This evidence-based approach should also be used in mental health. High-quality research, continuous education and good clinical practice, incorporating the results of scientific experiments and observations, represent the approach that ensures an improvement in care provision and patient satisfaction. Currently, mental health care is still too "opinion oriented", due to the overemphasis placed on personal experience and traditional approaches by many psychiatrists. In this paper some of the most promising recent results in psychosocial research, psychopharmacological studies and genetics, as well as in neuroimaging studies, are briefly summarised. From a pragmatic point of view, it is possible to achieve a significant improvement in the quality of mental health care if the following procedure is followed: firstly, to start from solid evidence; secondly, to promote the wide implementation of evidence-based research into everyday practice; thirdly, to ensure that administrators and policy makers incorporate the available scientific evidence in planning and evaluating services and mental health systems of care.
The integration between research, education and practice remains the hardest border to cross, yet the achievement of it holds the greatest promise for better mental health care in the future.
Making USGS Science Data more Open, Accessible, and Usable: Leveraging ScienceBase for Success
NASA Astrophysics Data System (ADS)
Chang, M.; Ignizio, D.; Langseth, M. L.; Norkin, T.
2016-12-01
In 2013, the White House released initiatives requiring federally funded research to be made publicly available and machine readable. In response, the U.S. Geological Survey (USGS) has been developing a unified approach to make USGS data available and open. This effort has involved the establishment of internal policies and the release of a Public Access Plan, which outlines a strategy for the USGS to move forward into the modern era in scientific data management. Originally designed as a catalog and collaborative data management platform, ScienceBase (www.sciencebase.gov) is being leveraged to serve as a robust data hosting solution for USGS researchers to make scientific data accessible. With the goal of maintaining persistent access to formal data products and developing a management approach to facilitate stable data citation, the ScienceBase Data Release Team was established to ensure the quality, consistency, and meaningful organization of USGS data through standardized workflows and best practices. These practices include the creation and maintenance of persistent identifiers for data, improving the use of open data formats, establishing permissions for read/write access, validating the quality of standards compliant metadata, verifying that data have been reviewed and approved prior to release, and connecting to external search catalogs such as the USGS Science Data Catalog (data.usgs.gov) and data.gov. The ScienceBase team is actively building features to support this effort by automating steps to streamline the process, building metrics to track site visits and downloads, and connecting published digital resources in line with USGS and Federal policy. By utilizing ScienceBase to achieve stewardship quality and employing a dedicated team to help USGS scientists improve the quality of their data, the USGS is helping to meet today's data quality management challenges and ensure that reliable USGS data are available to and reusable for the public.
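The release checks described above (required metadata, persistent identifiers, approval before release) can be pictured as a simple validation pass. Field names and rules here are illustrative assumptions, not the actual ScienceBase workflow, which validates standards-compliant metadata and mints DOIs through internal tooling.

```python
# Hypothetical minimum checklist for a data-release record.
REQUIRED_FIELDS = ("title", "abstract", "doi", "approved_by", "license")

def ready_for_release(record):
    """Return the list of checks a candidate data release still fails.

    record: dict of metadata fields. An empty list means the record
    passes this (deliberately simplified) checklist.
    """
    problems = [f"missing: {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    # All DOIs begin with the "10." directory indicator.
    if record.get("doi") and not record["doi"].startswith("10."):
        problems.append("doi is not a valid DOI")
    return problems
```

A dedicated release team, as described in the abstract, effectively runs a much richer version of this pass — including full metadata-standard validation and human review — before a dataset becomes citable.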
Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E
2018-04-21
Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.
Evidence-based approach for continuous improvement of occupational health.
Manzoli, Lamberto; Sotgiu, Giovanni; Magnavita, Nicola; Durando, Paolo
2015-01-01
It was recognized early on that an Evidence-Based Medicine (EBM) approach could be applied to Public Health (PH), including the area of Occupational Health (OH). The aim of Evidence-Based Occupational Health (EBOH) is to ensure safety, health, and well-being in the workplace. Currently, high-quality research is necessary in order to provide arguments and scientific evidence upon which effective, efficient, and sustainable preventive measures and policies are to be developed in the workplace in Western countries. Occupational physicians need to integrate available scientific evidence and existing recommendations with a framework of national employment laws and regulations. This paper addresses the state of the art of scientific evidence available in the field (i.e., efficacy of interventions, usefulness of education and training of workers, and need of a multidisciplinary strategy integrated within the national PH programs) and the main critical issues for their implementation. Promoting good health is a fundamental part of the smart, inclusive growth objectives of Europe 2020 - Europe's growth strategy: keeping people healthy and active for longer has a positive impact on productivity and competitiveness. It appears clear that health quality and safety in the workplace play a key role for smart, sustainable, and inclusive growth in Western countries.
Ensuring right to organic food in public health system.
Pashkov, Vitalii; Batyhina, Olena; Leiba, Liudmyla
2018-01-01
Introduction: Human health directly depends on safety and quality of food. In turn, quality and safety of food directly depend on its production conditions and methods. There are two main food production methods: traditional and organic. Organic food production is considered safer and more beneficial for human health. Aim: to determine whether the organic food production method affects human health. Materials and methods: international acts, data of international organizations and conclusions of scientists have been examined and used in the study. The article also summarizes information from scientific journals and monographs from a medical and legal point of view with scientific methods. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. The problems of effects of food production methods and conditions on human health have been analyzed within the framework of the system approach. Conclusions: Food production methods and conditions ultimately affect the state and level of human health. The organic method of production activity has a positive effect on human health.
Sustainable management for the eastern Mediterranean coast of Turkey.
Berberoglu, Süha
2003-03-01
The objective of this article is to propose a program for the integrated coastal zone management that is required to stimulate and guide sustainable development of the Mediterranean coastal zone of Turkey. Improved data collection, quality control, analysis, and data management will provide a firm basis for future scientific understanding of the East Mediterranean coast of Turkey and will support long-term management. Various innovative procedures were proposed for a promising ecosystem-based approach to manage coastal wetlands in the Mediterranean: remote data acquisition with new technologies; environmental quality monitoring program that will provide a baseline for monitoring; linking a Geographic Information System (GIS) with natural resource management decision routines in the context of operational wetlands, fisheries, tourism management system; environmental sensitivity analysis to ensure that permitted developments are environmentally sustainable; and use of natural species to restore the wetlands and coastal dunes and sustain the system processes. The proposed management scheme will benefit the scientific community in the Mediterranean and the management/planning community in Eastern Turkey.
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools, mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and Java (BEAST), from the broader area of evolutionary biology that are routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal and science policy and, more importantly, funding issues that need to be addressed to improve software engineering quality and to ensure support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
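The kind of static assessment the abstract above describes can be illustrated with a crude, self-contained metric. The sketch below is purely hypothetical (real assessments use dedicated static-analysis tools); the regex proxy and the `clamp` snippet are invented for illustration only:

```python
import re

# A crude cyclomatic-complexity proxy for C/C++ source: count branching
# constructs and add 1. This is illustration only; it ignores strings,
# block comments, and the preprocessor, which real tools handle properly.
BRANCH_PATTERN = re.compile(r"\bif\b|\bfor\b|\bwhile\b|\bcase\b|&&|\|\|")

def complexity_proxy(source: str) -> int:
    """Return 1 + number of decision points found in the source text."""
    # Strip // line comments so commented-out code is not counted.
    code = "\n".join(line.split("//")[0] for line in source.splitlines())
    return 1 + len(BRANCH_PATTERN.findall(code))

snippet = """
int clamp(int x, int lo, int hi) {
    if (x < lo) return lo;      // below range
    if (x > hi) return hi;      // above range
    return x;
}
"""
print(complexity_proxy(snippet))  # two if-branches -> 3
```

Metrics like this, aggregated per function, are the sort of evidence such code-quality surveys report alongside lines of code and feature counts.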
Bonertz, A; Roberts, G; Slater, J E; Bridgewater, J; Rabin, R L; Hoefnagel, M; Timon, M; Pini, C; Pfaar, O; Sheikh, A; Ryan, D; Akdis, C; Goldstein, J; Poulsen, L K; van Ree, R; Rhyner, C; Barber, D; Palomares, O; Pawankar, R; Hamerlijnk, D; Klimek, L; Agache, I; Angier, E; Casale, T; Fernandez-Rivas, M; Halken, S; Jutel, M; Lau, S; Pajno, G; Sturm, G; Varga, E M; Gerth van Wijk, R; Bonini, S; Muraro, A; Vieths, S
2018-04-01
Adequate quality is essential for any medicinal product to be eligible for marketing. Quality includes verification of the identity, content and purity of a medicinal product in combination with a specified production process and its control. Allergen products derived from natural sources require particular considerations to ensure adequate quality. Here, we describe key aspects of the documentation on manufacturing and quality aspects for allergen immunotherapy products in the European Union and the United States. In some key parts, requirements in these areas are harmonized, while other fields are regulated separately between the two regions. Essential differences are found in the use of Reference Preparations and in the requirement to apply standardized assays for potency determination. As the types of products available differ between regions, regulatory guidance for such products may also be available in one specific region only, such as for allergoids in the European Union. Region-specific issues and priorities are a result of this. As allergen products derived from natural sources are inherently variable in their qualitative and quantitative composition, these products present special challenges in balancing this variability against the need to ensure batch-to-batch consistency. Advancements in scientific knowledge on specific allergens and their role in allergic disease will consequently be reflected in future regulatory guidelines. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.
Scientific integrity memorandum
NASA Astrophysics Data System (ADS)
Showstack, Randy
2009-03-01
U.S. President Barack Obama signed a presidential memorandum on 9 March to help restore scientific integrity in government decision making. The memorandum directs the White House Office of Science and Technology Policy to develop a strategy within 120 days that ensures that "the selection of scientists and technology professionals for science and technology positions in the executive branch is based on those individuals' scientific and technological knowledge, credentials, and experience; agencies make available to the public the scientific or technological findings or conclusions considered or relied upon in policy decisions; agencies use scientific and technological information that has been subject to well-established scientific processes such as peer review; and agencies have appropriate rules and procedures to ensure the integrity of the scientific process within the agency, including whistleblower protection."
Emergent Imaging and Geospatial Technologies for Soil Investigations
NASA Technical Reports Server (NTRS)
DeGloria, Stephen D.; Beaudette, Dylan E.; Irons, James R.; Libohova, Zamir; O'Neill, Peggy E.; Owens, Phillip R.; Schoeneberger, Philip J.; West, Larry T.; Wysocki, Douglas A.
2014-01-01
Soil survey investigations and inventories form the scientific basis for a wide spectrum of agronomic and environmental management programs. Soil data and information help formulate resource conservation policies of federal, state, and local governments that seek to sustain our agricultural production system while enhancing environmental quality on both public and private lands. The dual challenges of increasing agricultural production and ensuring environmental integrity require electronically available soil inventory data with both spatial and attribute quality. Meeting this societal need in part depends on development and evaluation of new methods for updating and maintaining soil inventories for sophisticated applications, and implementing an effective framework to conceptualize and communicate tacit knowledge from soil scientists to numerous stakeholders.
NASA Astrophysics Data System (ADS)
Brasseur, Pierre
2015-04-01
The MyOcean projects, supported by the European Commission, were developed during the 2008-2015 period to build an operational service providing ocean physical state and ecosystem information to intermediate and downstream users in the areas of marine safety, marine resources, marine and coastal environment, and weather, climate and seasonal forecasting. The "core" information provided to users is obtained through the combination of satellite and in situ observations, eddy-resolving modelling of the global ocean and regional European seas, biogeochemistry, ecosystem and sea-ice modelling, and data assimilation for global to basin-scale circulation. A comprehensive R&D plan was established in 2010 to ensure the collection and provision of information of the best possible quality for daily estimates of the ocean state (real time), its short-term evolution, and its history over the past (reanalyses). A service validation methodology was further developed to ensure proper scientific evaluation and routine monitoring of the accuracy of MyOcean products. In this presentation, we will give an overview of the main scientific advances achieved in MyOcean using the NEMO modelling platform, ensemble-based assimilation schemes, coupled circulation-ecosystem and sea-ice assimilative models, and probabilistic methodologies for ensemble validation. We will further highlight the key areas that will require additional innovation effort to support the evolution of the Marine Copernicus service.
Marchewka, J; Watanabe, T T N; Ferrante, V; Estevez, I
2013-06-01
In modern rearing systems, turkey producers often face economic losses due to increased aggression, feather pecking, cannibalism, leg disorders, or injuries among birds, which are also significant welfare issues. The main underlying causes appear to relate to rapid growth, flock size, density, poor environmental complexity, or lighting, which may be deficient in providing the birds with an adequate physical or social environment. To date, there is little information regarding the effect of these factors on turkey welfare. This knowledge is, however, essential to ensure the welfare of turkeys and to improve their quality of life, but may also be beneficial to industry, allowing better bird performance, improved carcass quality, and reduced mortality and condemnations. This paper reviews the available scientific literature on the behavior of turkeys as influenced by the physical and social environment that may be relevant to advances toward turkey production systems that take welfare into consideration. We address the effects that factors such as density, group size, space availability, maturation, lighting, feeding, and transport may have on parameters relevant to ensuring the welfare of turkeys. The available scientific studies were based on experimental environments and identified individual factors corresponding to particular welfare problems. Most of the studies aimed at finding optimal rearing conditions that avoid or mitigate the most severe welfare issues. This paper discusses the importance of these factors for the development of production environments that would be better suited from a welfare and economic point of view.
Brown, Marty Skemp; Maurer, Martha A
2014-01-01
Abstract Objective To determine whether national drug control laws ensure that opioid drugs are available for medical and scientific purposes, as intended by the 1972 Protocol amendment to the 1961 Single Convention on Narcotic Drugs. Methods The authors examined whether the text of a convenience sample of drug laws from 15 countries (i) acknowledged that opioid drugs are indispensable for the relief of pain and suffering; (ii) recognized that government was responsible for ensuring the adequate provision of such drugs for medical and scientific purposes; (iii) designated an administrative body for implementing international drug control conventions; and (iv) acknowledged a government's intention to implement international conventions, including the Single Convention. Findings Most national laws were found not to contain measures that ensured adequate provision of opioid drugs for medical and scientific purposes. Moreover, the model legislation provided by the United Nations Office on Drugs and Crime did not establish an obligation on national governments to ensure the availability of these drugs for medical use. Conclusion To achieve consistency with the Single Convention, as well as with associated resolutions and recommendations of international bodies, national drug control laws and model policies should be updated to include measures that ensure drug availability, balancing the restrictions imposed by the existing drug control measures needed to prevent the diversion and nonmedical use of such drugs. PMID:24623904
Long-Term Ecological Monitoring Field Sampling Plan for 2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
T. Haney
2007-07-31
This field sampling plan describes the field investigations planned for the Long-Term Ecological Monitoring Project at the Idaho National Laboratory Site in 2007. This plan and the Quality Assurance Project Plan for Waste Area Groups 1, 2, 3, 4, 5, 6, 7, 10, and Removal Actions constitute the sampling and analysis plan supporting long-term ecological monitoring sampling in 2007. The data collected under this plan will become part of the long-term ecological monitoring data set that is being collected annually. The data will be used to determine the requirements for subsequent long-term ecological monitoring. This plan guides the 2007 investigations, including sampling, quality assurance, quality control, analytical procedures, and data management. As such, this plan will help to ensure that the resulting monitoring data will be scientifically valid, defensible, and of known and acceptable quality.
Building and Using Digital Repository Certifications across Science
NASA Astrophysics Data System (ADS)
McIntosh, L.
2017-12-01
When scientific recommendations are made based upon research, the quality and integrity of the data should be rigorous enough to verify the claims, and the data should reside in a trusted location. Reproducibility is key to ensuring the transparency and verifiability of research, and it hinges not only on the availability of the documentation, analyses, and data, but also on the ongoing accessibility and viability of the files and documents, enhanced through a process of curation. The Research Data Alliance (RDA) is an international, community-driven, action-oriented, virtual organization committed to enabling the open sharing of data by building social and technical bridges. Within the RDA, multiple groups are working on consensus-building around the certification of digital repositories across scientific domains. For this section of the panel, we will discuss the work to date on repository certification from this RDA perspective.
National Institutes of Health addresses the science of diversity
Valantine, Hannah A.; Collins, Francis S.
2015-01-01
The US biomedical research workforce does not currently mirror the nation’s population demographically, despite numerous attempts to increase diversity. This imbalance is limiting the promise of our biomedical enterprise for building knowledge and improving the nation’s health. Beyond ensuring fairness in scientific workforce representation, recruiting and retaining a diverse set of minds and approaches is vital to harnessing the complete intellectual capital of the nation. The complexity inherent in diversifying the research workforce underscores the need for a rigorous scientific approach, consistent with the ways we address the challenges of science discovery and translation to human health. Herein, we identify four cross-cutting diversity challenges ripe for scientific exploration and opportunity: research evidence for diversity’s impact on the quality and outputs of science; evidence-based approaches to recruitment and training; individual and institutional barriers to workforce diversity; and a national strategy for eliminating barriers to career transition, with scientifically based approaches for scaling and dissemination. Evidence-based data for each of these challenges should provide an integrated, stepwise approach to programs that enhance diversity rapidly within the biomedical research workforce. PMID:26392553
National Institutes of Health addresses the science of diversity.
Valantine, Hannah A; Collins, Francis S
2015-10-06
The US biomedical research workforce does not currently mirror the nation's population demographically, despite numerous attempts to increase diversity. This imbalance is limiting the promise of our biomedical enterprise for building knowledge and improving the nation's health. Beyond ensuring fairness in scientific workforce representation, recruiting and retaining a diverse set of minds and approaches is vital to harnessing the complete intellectual capital of the nation. The complexity inherent in diversifying the research workforce underscores the need for a rigorous scientific approach, consistent with the ways we address the challenges of science discovery and translation to human health. Herein, we identify four cross-cutting diversity challenges ripe for scientific exploration and opportunity: research evidence for diversity's impact on the quality and outputs of science; evidence-based approaches to recruitment and training; individual and institutional barriers to workforce diversity; and a national strategy for eliminating barriers to career transition, with scientifically based approaches for scaling and dissemination. Evidence-based data for each of these challenges should provide an integrated, stepwise approach to programs that enhance diversity rapidly within the biomedical research workforce.
Prescriptive scientific narratives for communicating usable science.
Downs, Julie S
2014-09-16
In this paper I describe how a narrative approach to science communication may help audiences to more fully understand how science is relevant to their own lives and behaviors. The use of prescriptive scientific narrative can help to overcome challenges specific to scientific concepts, especially the need to reconsider long-held beliefs in the face of new empirical findings. Narrative can captivate the audience, driving anticipation for plot resolution, thus becoming a self-motivating vehicle for information delivery. This quality gives narrative considerable power to explain complex phenomena and causal processes, and to create and reinforce memory traces for better recall and application over time. Because of the inherent properties of narrative communication, their creators have a special responsibility to ensure even-handedness in selection and presentation of the scientific evidence. The recent transformation in communication and information technology has brought about new platforms for delivering content, particularly through interactivity, which can use structured self-tailoring to help individuals most efficiently get exactly the content that they need. As with all educational efforts, prescriptive scientific narratives must be evaluated systematically to determine whether they have the desired effects in improving understanding and changing behavior.
Prescriptive scientific narratives for communicating usable science
Downs, Julie S.
2014-01-01
In this paper I describe how a narrative approach to science communication may help audiences to more fully understand how science is relevant to their own lives and behaviors. The use of prescriptive scientific narrative can help to overcome challenges specific to scientific concepts, especially the need to reconsider long-held beliefs in the face of new empirical findings. Narrative can captivate the audience, driving anticipation for plot resolution, thus becoming a self-motivating vehicle for information delivery. This quality gives narrative considerable power to explain complex phenomena and causal processes, and to create and reinforce memory traces for better recall and application over time. Because of the inherent properties of narrative communication, their creators have a special responsibility to ensure even-handedness in selection and presentation of the scientific evidence. The recent transformation in communication and information technology has brought about new platforms for delivering content, particularly through interactivity, which can use structured self-tailoring to help individuals most efficiently get exactly the content that they need. As with all educational efforts, prescriptive scientific narratives must be evaluated systematically to determine whether they have the desired effects in improving understanding and changing behavior. PMID:25225369
Peer Review in Scientific Publications: Benefits, Critiques, & A Survival Guide
Kelly, Jacalyn; Sadeghieh, Tara
2014-01-01
Peer review has been defined as a process of subjecting an author's scholarly work, research or ideas to the scrutiny of others who are experts in the same field. It functions to encourage authors to meet the accepted high standards of their discipline and to control the dissemination of research data to ensure that unwarranted claims, unacceptable interpretations or personal views are not published without prior expert review. Despite its widespread use by most journals, the peer review process has also been widely criticised for the slowness of the process of publishing new findings and for perceived bias by the editors and/or reviewers. Within the scientific community, peer review has become an essential component of the academic writing process. It helps ensure that papers published in scientific journals answer meaningful research questions and draw accurate conclusions based on professionally executed experimentation. Submission of low-quality manuscripts has become increasingly prevalent, and peer review acts as a filter to prevent this work from reaching the scientific community. The major advantage of a peer review process is that peer-reviewed articles provide a trusted form of scientific communication. Since scientific knowledge is cumulative and builds on itself, this trust is particularly important. Despite the positive impacts of peer review, critics argue that the peer review process stifles innovation in experimentation and acts as a poor screen against plagiarism. Despite its shortcomings, no foolproof system has yet been developed to take the place of peer review; however, researchers have been looking into electronic means of improving the peer review process. Unfortunately, the recent explosion in online-only/electronic journals has led to mass publication of a large number of scientific articles with little or no peer review. This poses a significant risk to advances in scientific knowledge and its future potential.
The current article summarizes the peer review process, highlights the pros and cons associated with different types of peer review, and describes new methods for improving peer review. PMID:27683470
American College of Surgeons remains committed to patient safety.
Russell, Thomas R; Jones, R Scott
2006-11-01
Since 1913, the American College of Surgeons has addressed patient safety as a top priority, and the authors are pleased to contribute this article offering the College's perspective on this critical subject. More specifically, this piece reviews the College's perennial efforts to ensure surgeons' and hospitals' access to scientifically verifiable standards, the availability of effective quality improvement tools, and a better understanding of errors in care. Additionally, the authors examine the cultural changes required within surgery and provide an overview of the College's recent initiatives in research, accreditation, and education.
NASA Technical Reports Server (NTRS)
1976-01-01
Trade studies were conducted to ensure the overall feasibility of the focal plane camera in a radial module. The primary variable in the trade studies was the location of the pickoff mirror, on-axis versus off-axis. The two alternatives were: (1) the standard (electromagnetic-focus) SECO submodule, and (2) the MOD 15 permanent-magnet-focus SECO submodule. The technical areas of concern were the packaging-affected parameters of thermal dissipation, focal plane obscuration, and image quality.
The State of Software for Evolutionary Biology
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-01-01
Abstract With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools, mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and Java (BEAST), from the broader area of evolutionary biology that are routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal and science policy and, more importantly, funding issues that need to be addressed to improve software engineering quality and to ensure support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
Evidence of the Association Between Psychology and Tissue and Organ Transplantation in Brazil.
Silva, J D A; Ariente, L C; Roza, B A; Mucci, S
2016-09-01
The addition of psychologists to organ transplant teams is still new in Brazil. To support the effective performance of these professionals, knowledge of the scientific production and the development of research in the area is fundamental. In this sense, this study aims to survey the Brazilian scientific research that has investigated the psychologic aspects involved in tissue and organ transplantation. A narrative literature review was performed using the "Transplante AND Psicologia" descriptors in the Biblioteca Virtual em Saúde and the CAPES Journal Portal. Fifty-three articles were found, of which 22 met the inclusion criteria: publications dating from 2000 to 2014, with the main topic of interest being quality of life, followed by organ donation. The instruments used most frequently were interviews developed by the researchers and the SF-36 Quality of Life Questionnaire. Recent Brazilian studies on the association between psychology and transplantation are still scarce, possibly because of the recent addition of psychologists to transplantation teams. Therefore, it is suggested that more scientific research be conducted in the area and that the objects of study be more varied, to ensure that psychologists are adequately prepared to meet the specific demands of the organ and tissue transplantation process. Copyright © 2016 Elsevier Inc. All rights reserved.
Regulatory Science in Professional Education.
Akiyama, Hiroshi
2017-01-01
In the field of pharmaceutical sciences, the subject of regulatory science (RS) includes pharmaceuticals, food, and living environments. For pharmaceuticals, the balance between efficacy and safety is the key consideration for public acceptance, and in that balance more importance is given to efficacy in curing disease. For food, however, safety is the most important consideration for public acceptance because food should be essentially free of risk. To ensure food safety, first, any hazard, that is, an agent in food or a condition of food with the potential to cause adverse health effects, should be identified and characterized. The risk that it will affect public health is then scientifically analyzed. This process is called risk assessment. Second, risk management should be conducted to reduce any risk with the potential to affect public health found in a risk assessment. Furthermore, risk communication, the interactive exchange of information and opinions concerning risk and risk management among risk assessors, risk managers, consumers, and other interested parties, should be conducted. Food safety is thus ensured based on risk analysis consisting of the three components of risk assessment, risk management, and risk communication. RS in the field of food safety supports risk analysis through, for example, scientific research and the development of test methods to evaluate food quality, efficacy, and safety. RS is also applied in the field of living environments because the safety of environmental chemical substances is ensured based on risk analysis, similar to that conducted for food.
Impact of scientific and technological advances.
Dragan, I F; Dalessandri, D; Johnson, L A; Tucker, A; Walmsley, A D
2018-03-01
Advancements in research and technology are transforming our world. The dental profession is changing too, in the light of scientific discoveries that are advancing biological technology, from new biomaterials to unravelling the genetic make-up of the human being. As health professionals, we embrace a model of continuous quality improvement and lifelong learning. Our pedagogical approach to incorporating the plethora of scientific and technological advancements calls for us to shift our paradigm from an emphasis on skill acquisition to knowledge application. The 2017 ADEE/ADEA workshop provided a forum to explore and discuss strategies to ensure that faculty, students and, ultimately, patients are best positioned to exploit the opportunities that arise from integrating new technological advances and research outcomes. Participants discussed methods of incorporating the impact of new technologies and research findings into the education of our dental students. This report serves as a signpost of the way forward and of how to promote the incorporation of research and technology advances and lifelong learning into the dental education curriculum. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Baumeister, Joseph
2009-01-01
NASA has established six Themes for Exploration: 1) USE THE MOON: Reduce risks and cost and increase productivity of future missions by testing technologies, systems, and operations in a planetary environment other than the Earth. 2) PURSUE SCIENTIFIC KNOWLEDGE: Engage in scientific investigations of the Moon (solar system processes), on the Moon (use the unique environment), and from the Moon (to study other celestial phenomena). 3) EXTEND PERMANENT HUMAN PRESENCE: Develop the capabilities and infrastructure required to expand the number of people, the duration, the self-sufficiency, and the degree of non-governmental activity. 4) EXPAND EARTH'S ECONOMIC SPHERE: Create new markets based on lunar activity that will return economic, technological, and quality-of-life benefits. 5) ENHANCE GLOBAL SECURITY: Provide a challenging, shared, and peaceful global vision that unites nations in pursuit of common objectives. 6) ENGAGE, INSPIRE: Excite the public about space, encourage students to pursue careers in high-technology fields, and ensure that individuals enter the workforce with the scientific and technical knowledge necessary to sustain exploration.
Cross-cultural perspectives of scientific misconduct.
Momen, Hooman; Gollogly, Laragh
2007-09-01
The increasing globalization of scientific research lends urgency to the need for international agreement on the concepts of scientific misconduct. Universal spiritual and moral principles on which ethical standards are generally based indicate that it is possible to reach international agreement on the ethical principles underlying good scientific practice. Concordance on an operational definition of scientific misconduct that would allow independent observers to agree on which behaviour constitutes misconduct is more problematic. Defining scientific misconduct so that it is universally recognized and universally sanctioned means addressing the broader question of ensuring that research is not only well designed, addressing a real need for better evidence, but also ethically conducted in different cultures. An instrument is needed to ensure that uneven ethical standards do not create unnecessary obstacles to research, particularly in developing countries.
Defending the scientific integrity of conservation-policy processes.
Carroll, Carlos; Hartl, Brett; Goldman, Gretchen T; Rohlf, Daniel J; Treves, Adrian; Kerr, Jeremy T; Ritchie, Euan G; Kingsford, Richard T; Gibbs, Katherine E; Maron, Martine; Watson, James E M
2017-10-01
Government agencies faced with politically controversial decisions often discount or ignore scientific information, whether from agency staff or nongovernmental scientists. Recent developments in scientific integrity (the ability to perform, use, communicate, and publish science free from censorship or political interference) in Canada, Australia, and the United States demonstrate a similar trajectory. A perceived increase in scientific-integrity abuses provokes concerted pressure by the scientific community, leading to efforts to improve scientific-integrity protections under a new administration. However, protections are often inconsistently applied and are at risk of reversal under administrations publicly hostile to evidence-based policy. We compared recent challenges to scientific integrity to determine what aspects of scientific input into conservation policy are most at risk of political distortion and what can be done to strengthen safeguards against such abuses. To ensure the integrity of outbound communications from government scientists to the public, we suggest governments strengthen scientific integrity policies, include scientists' right to speak freely in collective-bargaining agreements, guarantee public access to scientific information, and strengthen agency culture supporting scientific integrity. To ensure the transparency and integrity with which information from nongovernmental scientists (e.g., submitted comments or formal policy reviews) informs the policy process, we suggest governments broaden the scope of independent reviews, ensure greater diversity of expert input and transparency regarding conflicts of interest, require a substantive response to input from agencies, and engage proactively with scientific societies. For their part, scientists and scientific societies have a responsibility to engage with the public to affirm that science is a crucial resource for developing evidence-based policy and regulations in the public interest. 
© 2017 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
NASA Astrophysics Data System (ADS)
Escartin, Terenz R.; Nano, Tomi F.; Cunningham, Ian A.
2016-03-01
The detective quantum efficiency (DQE), expressed as a function of spatial frequency, describes the ability of an x-ray detector to produce high signal-to-noise ratio (SNR) images. While regulatory and scientific communities have used the DQE as a primary metric for optimizing detector design, the DQE is rarely used by end users to verify that high system performance is maintained. Of concern is that image quality varies across different systems for the same exposures, with no current measures available to describe system performance. We therefore conducted an initial DQE measurement survey of clinical x-ray systems using a DQE-testing instrument to identify their range of performance. Following laboratory validation, experiments revealed that the DQE of five different systems under the same exposure level (8.0 μGy) ranged from 0.36 to 0.75 at low spatial frequencies, and from 0.02 to 0.4 at high spatial frequencies (3.5 cycles/mm). Furthermore, the DQE dropped substantially with decreasing detector exposure, by up to a factor of 1.5 at the lowest spatial frequency and a factor of 10 at 3.5 cycles/mm, due to detector readout noise. It is concluded that DQE specifications in purchasing decisions, combined with periodic DQE testing, are important factors in ensuring patients receive the health benefits of high-quality images at low x-ray exposures.
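The DQE figures quoted above follow the standard textbook relation DQE(f) = MTF(f)^2 / (q · NNPS(f)), where MTF is the modulation transfer function, NNPS the normalized noise power spectrum, and q the incident photon fluence. A minimal sketch of that computation, with purely illustrative detector values (none taken from the survey):

```python
import numpy as np

def dqe(mtf, nnps, q):
    """Detective quantum efficiency per spatial frequency.

    Standard relation: DQE(f) = MTF(f)**2 / (q * NNPS(f)),
    with q the incident photon fluence.  Illustrative only.
    """
    mtf = np.asarray(mtf, dtype=float)
    nnps = np.asarray(nnps, dtype=float)
    return mtf ** 2 / (q * nnps)

# Hypothetical detector response at 0.5, 1.5 and 3.5 cycles/mm
mtf = [0.90, 0.60, 0.20]
nnps = [2.0e-5, 1.5e-5, 1.0e-5]   # mm^2 (illustrative)
q = 5.0e4                          # photons/mm^2 (illustrative)

print([round(v, 3) for v in dqe(mtf, nnps, q)])  # → [0.81, 0.48, 0.08]
```

The invented values land in the same range the survey reports (high DQE at low spatial frequency, falling sharply toward 3.5 cycles/mm).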
George M. Low Trophy: NASA's quality and excellence award
NASA Technical Reports Server (NTRS)
1991-01-01
NASA's major goal is the preservation of America's position as a leader in the aerospace industry. To maintain that status, it is crucial that the products and services NASA depends upon from contractors, subcontractors, and suppliers meet the highest quality standards to ensure the space program's success. The George M. Low Trophy: NASA's Quality and Excellence Award is the result of NASA's desire to encourage continuous improvement and Total Quality Management (TQM) in the aerospace industry and is awarded to members of NASA's contractor community that have demonstrated sustained excellence, customer orientation, and outstanding achievements in a TQM environment. The purpose in presenting this award is to increase public awareness of the importance of quality and productivity to the nation's aerospace industry and the nation's leadership position overall; to encourage domestic business to continuously pursue efforts that enhance quality and increase productivity, which will strengthen the nation's competitiveness in the international arena; and to provide a forum for sharing the successful techniques and strategies used by applicants with other American organizations. Awards to Rockwell International and Marotta Scientific Controls, Inc. are announced and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amerio, S.; Behari, S.; Boyd, J.
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
2007-01-01
The U.S. Geological Survey (USGS) enhances and protects the quality of life in the United States by advancing scientific knowledge to facilitate effective management of hydrologic, biologic, and geologic resources. Results of selected USGS research and monitoring projects in agricultural landscapes are presented in this Fact Sheet. Significant environmental and social issues associated with agricultural production include changes in the hydrologic cycle; introduction of toxic chemicals, nutrients, and pathogens; reduction and alteration of wildlife habitats; and invasive species. Understanding the environmental consequences of agricultural production is critical to minimizing unintended impacts. The preservation and enhancement of our natural resources can be achieved by measuring the success of improved management practices and by adjusting conservation policies as needed to ensure long-term protection.
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Amerio, S.; Behari, S.; Boyd, J.; Brochmann, M.; Culbertson, R.; Diesburg, M.; Freeman, J.; Garren, L.; Greenlee, H.; Herner, K.; Illingworth, R.; Jayatilaka, B.; Jonckheere, A.; Li, Q.; Naymola, S.; Oleynik, G.; Sakumoto, W.; Varnes, E.; Vellidis, C.; Watts, G.; White, S.
2017-04-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
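The "automated validation" step described above can be pictured as re-running reference workloads under a migrated software release and comparing summary physics quantities against archived reference values. A hedged sketch of such a check; the quantity names, values, and tolerance below are hypothetical illustrations, not the experiments' actual validation suite:

```python
def validate_release(reference, candidate, rel_tol=1e-6):
    """Compare summary quantities produced by a migrated software
    release against archived reference values.

    Returns the names of quantities that are missing or disagree
    beyond the relative tolerance; an empty list means the release
    reproduces the reference within tolerance.
    """
    failures = []
    for name, ref in reference.items():
        new = candidate.get(name)
        if new is None or abs(new - ref) > rel_tol * abs(ref):
            failures.append(name)
    return failures

# Hypothetical archived reference vs. output of a migrated release
reference = {"n_events": 125000.0, "mean_pt_gev": 24.173}
candidate = {"n_events": 125000.0, "mean_pt_gev": 24.173}
print(validate_release(reference, candidate))  # → []
```

A migration that changed any tracked quantity beyond tolerance would surface it by name, flagging the release before it replaces the validated one.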
Twenty years of meta-analyses in orthopaedic surgery: has quality kept up with quantity?
Dijkman, Bernadette G; Abouali, Jihad A K; Kooistra, Bauke W; Conter, Henry J; Poolman, Rudolf W; Kulkarni, Abhaya V; Tornetta, Paul; Bhandari, Mohit
2010-01-01
As the number of studies in the literature increases, orthopaedic surgeons depend heavily on meta-analyses as their primary source of scientific evidence. The objectives of this review were to assess the scientific quality and number of published meta-analyses on orthopaedics-related topics over time. We conducted, in duplicate and independently, a systematic review of published meta-analyses in orthopaedics in the years 2005 and 2008 and compared them with a previous systematic review of meta-analyses from 1969 to 1999. A search of electronic databases (MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews) was performed to identify meta-analyses published in 2005 and 2008. We searched bibliographies and contacted content experts to identify additional relevant studies. Two investigators independently assessed the quality of the studies, using the Oxman and Guyatt index, and abstracted relevant data. We included forty-five and forty-four meta-analyses from 2005 and 2008, respectively. While the number of meta-analyses increased fivefold from 1999 to 2008, the mean quality score did not change significantly over time (p = 0.067). In the later years, a significantly lower proportion of meta-analyses had methodological flaws (56% in 2005 and 68% in 2008) compared with meta-analyses published prior to 2000 (88%) (p = 0.006). In 2005 and 2008, respectively, 18% and 30% of the meta-analyses had major to extensive flaws in their methodology. Studies from 2008 with positive conclusions used and described appropriate criteria for the validity assessment less often than did those with negative results. The use of random-effects and fixed-effects models as pooling methods became more popular toward 2008. Although the methodological quality of orthopaedic meta-analyses has increased in the past twenty years, a substantial proportion continues to show major to extensive flaws.
As the number of published meta-analyses is increasing, a routine checklist for scientific quality should be used in the peer-review process to ensure methodological standards for publication.
Desai, Sapan S; Shortell, Cynthia K
2011-09-01
Competition of interest may exist at all levels in the medical publication process. Ensuring the integrity of scientific scholarship involves protecting editorial independence, promoting the use of scientific arbitration boards, promoting transparency throughout all stages of publication, and protecting the relationship between the publisher and its editors through an effective legal framework. It is incumbent upon the publisher, editors, authors, and readers to ensure that the highest standards of scientific scholarship are upheld. Doing so will help reduce fraud and misrepresentation in medical research and increase the trustworthiness of landmark findings in science. Copyright © 2011 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Moreland, Joe A.
1991-01-01
As the Nation's principal earth-science information agency, the U.S. Geological Survey has developed a worldwide reputation for collecting accurate data and producing factual, impartial interpretive reports. To ensure continued confidence in the products, the Water Resources Division of the U.S. Geological Survey has implemented a policy that all scientific work will be performed in accordance with a centrally managed quality-assurance program. The formal policy for quality assurance within the Montana District was established and documented in USGS Open-File Report 91-194. This report has been revised to reflect changes in personnel and organizational structure that have occurred since 1991. Quality assurance is formalized by describing organization and operational responsibilities, the quality-assurance policy, and the quality-assurance responsibilities for performing District functions. The District conducts its work through offices in Helena, Billings, Kalispell, and Fort Peck. Data-collection programs and interpretive studies are conducted by three operating sections and four support units. Discipline specialists provide technical advice and assistance. Management advisors provide guidance on various personnel issues and support functions.
Quality-assurance plan for water-resources activities of the U.S. Geological Survey in Montana--1995
Moreland, Joe A.
1995-01-01
As the Nation's principal earth-science information agency, the U.S. Geological Survey has developed a worldwide reputation for collecting accurate data and producing factual, impartial interpretive reports. To ensure continued confidence in the products, the Water Resources Division of the U.S. Geological Survey has implemented a policy that all scientific work will be performed in accordance with a centrally managed quality-assurance program. The formal policy for quality assurance within the Montana District was established and documented in USGS Open-File Report 91-194. This report has been revised to reflect changes in personnel and organizational structure that have occurred since 1991. Quality assurance is formalized by describing organization and operational responsibilities, the quality-assurance policy, and the quality-assurance responsibilities for performing District functions. The District conducts its work through offices in Helena, Billings, Kalispell, and Fort Peck. Data-collection programs and interpretive studies are conducted by three operating sections and four support units. Discipline specialists provide technical advice and assistance. Management advisors provide guidance on various personnel issues and support functions.
Treadwell, Henrie M.
2009-01-01
Prisoners, ex-offenders, and the communities they belong to constitute a distinct and highly vulnerable population, and research must be sensitive to their priorities. In light of recent suggestions that scientific experimentation involving prisoners be reconsidered, community-based participatory research can be a valuable tool for determining the immediate concerns of prisoners, such as the receipt of high-quality and dignified health care inside and outside prisons. In building research agendas, more must be done to ensure the participation of communities affected by the resulting policies. PMID:19141599
75 FR 53325 - Proposed Scientific Integrity Policy of the Department of the Interior
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-31
... September 20, 2010. ADDRESSES: Send comments to: [email protected]ios.doi.gov . FOR FURTHER INFORMATION... scientific products, or on documents compiled and translated from scientific products, to ensure that agency... involving inventorying, monitoring, experimentation, study, research, modeling, and scientific assessment...
Fraser, David; Duncan, Ian J H; Edwards, Sandra A; Grandin, Temple; Gregory, Neville G; Guyonnet, Vincent; Hemsworth, Paul H; Huertas, Stella M; Huzzey, Juliana M; Mellor, David J; Mench, Joy A; Spinka, Marek; Whay, H Rebecca
2013-10-01
In 2012, the World Organisation for Animal Health adopted 10 'General Principles for the Welfare of Animals in Livestock Production Systems' to guide the development of animal welfare standards. The General Principles draw on half a century of scientific research relevant to animal welfare: (1) how genetic selection affects animal health, behaviour and temperament; (2) how the environment influences injuries and the transmission of diseases and parasites; (3) how the environment affects resting, movement and the performance of natural behaviour; (4) the management of groups to minimize conflict and allow positive social contact; (5) the effects of air quality, temperature and humidity on animal health and comfort; (6) ensuring access to feed and water suited to the animals' needs and adaptations; (7) prevention and control of diseases and parasites, with humane euthanasia if treatment is not feasible or recovery is unlikely; (8) prevention and management of pain; (9) creation of positive human-animal relationships; and (10) ensuring adequate skill and knowledge among animal handlers. Research directed at animal welfare, drawing on animal behaviour, stress physiology, veterinary epidemiology and other fields, complements more established fields of animal and veterinary science and helps to create a more comprehensive scientific basis for animal care and management. Copyright © 2013 Elsevier Ltd. All rights reserved.
Evidence-based health care: its place within clinical governance.
McSherry, R; Haddock, J
This article explores the principles of evidence-based practice and its role in achieving quality improvements within the clinical governance framework advocated by the recent White Papers 'The New NHS: Modern, Dependable' (Department of Health (DoH), 1997) and 'A First Class Service: Quality in the New NHS' (DoH, 1998a). Within these White Papers there is an emphasis on improving quality of care, treatment and services through employing the principles of clinical governance. A major feature of clinical governance is guaranteeing quality to the public and the NHS, and ensuring that clinical, managerial and educational practice is based on scientific evidence. This article also examines what evidence-based practice is and what processes are required to promote effective healthcare interventions. The authors also look at how clinical governance relates to other methods/systems involved in clinical effectiveness. Finally, the importance for nurses and other healthcare professionals of familiarizing themselves with the development of critical appraisal skills, and their implications for developing evidence-based practice, is emphasized.
Holmes, Christina; McDonald, Fiona; Jones, Mavis; Ozdemir, Vural; Graham, Janice E
2010-06-01
Standardization is critical to scientists and regulators to ensure the quality and interoperability of research processes, as well as the safety and efficacy of the attendant research products. This is perhaps most evident in the case of "omics science," which is enabled by a host of diverse high-throughput technologies such as genomics, proteomics, and metabolomics. But standards are of interest to (and shaped by) others far beyond the immediate realm of individual scientists, laboratories, scientific consortia, or governments that develop, apply, and regulate them. Indeed, scientific standards have consequences for the social, ethical, and legal environment in which innovative technologies are regulated, and thereby command the attention of policy makers and citizens. This article argues that standardization of omics science is both technical and social. A critical synthesis of the social science literature indicates that: (1) standardization requires a degree of flexibility to be practical at the level of scientific practice in disparate sites; (2) the manner in which standards are created, and by whom, will impact their perceived legitimacy and therefore their potential to be used; and (3) the process of standardization itself is important to establishing the legitimacy of an area of scientific research.
Schroeder, R.L.
2006-01-01
It is widely accepted that plans for restoration projects should contain specific, measurable, and science-based objectives to guide restoration efforts. The United States Fish and Wildlife Service (USFWS) is in the process of developing Comprehensive Conservation Plans (CCPs) for more than 500 units in the National Wildlife Refuge System (NWRS). These plans contain objectives for biological and ecosystem restoration efforts on the refuges. Based on USFWS policy, a system was developed to evaluate the scientific quality of such objectives based on three critical factors: (1) Is the objective specific, measurable, achievable, results-oriented, and time-fixed? (2) What is the extent of the rationale that explains the assumptions, logic, and reasoning for the objective? (3) How well was available science used in the development of the objective? The evaluation system scores each factor on a scale of 1 (poor) to 4 (excellent) according to detailed criteria. The biological and restoration objectives from CCPs published as of September 2004 (60 total) were evaluated. The overall average score for all biological and restoration objectives was 1.73. Average scores for each factor were: Factor 1: 1.97; Factor 2: 1.86; Factor 3: 1.38. The overall scores increased from 1997 to 2004. Future restoration efforts may benefit by using this evaluation system during the process of plan development, to ensure that biological and restoration objectives are of the highest scientific quality possible prior to the implementation of restoration plans, and to allow for improved monitoring and adaptive management.
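The arithmetic behind the reported summary — three factors scored 1 (poor) to 4 (excellent) per objective, averaged per factor and then overall — can be sketched as follows. The demo scores are invented for illustration, not the paper's data:

```python
def summarize(objectives):
    """objectives: list of (f1, f2, f3) tuples, each factor scored
    1 (poor) to 4 (excellent).  Returns the per-factor means and
    the overall mean, mirroring the paper's summary statistics."""
    n = len(objectives)
    factor_means = [sum(obj[i] for obj in objectives) / n for i in range(3)]
    overall = sum(factor_means) / 3
    return factor_means, overall

# Hypothetical scores for three objectives (not the evaluated CCPs)
demo = [(2, 2, 1), (3, 2, 2), (1, 1, 1)]
means, overall = summarize(demo)
print([round(m, 2) for m in means], round(overall, 2))
```

With 60 real objectives this same averaging yields the reported factor means (1.97, 1.86, 1.38) and overall score of 1.73.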
1991-12-01
The Guidelines for Good Epidemiology Practices (GEPs) for Occupational and Environmental Epidemiologic Research address the conduct of studies generally undertaken to answer questions about human health in relationship to the work place or the environment. The GEPs propose minimum practices and procedures that should be considered to help ensure the quality and integrity of data used in epidemiologic research and to provide adequate documentation of the research methods. The GEPs address the process of conducting individual epidemiologic studies and do not prescribe specific research methods. The Guidelines for Good Epidemiology Practices propose minimum practices and procedures in the following areas: I. Organization and Personnel II. Facilities, Resource Commitment, and Contractors III. Protocol IV. Review and Approval V. Study Conduct VI. Communication VII. Archiving VIII. Quality Assurance Although the Guidelines for Good Epidemiology Practices will not guarantee good epidemiology, they do provide a useful framework for ensuring that all research issues are adequately addressed. This framework is proposed as a first step in improving epidemiologic research practices through adherence to sound scientific research principles. Appendices provide an overview of standard operating procedures, a glossary of terms used in the Guidelines, and suggested references on occupational epidemiology methods.
[Pay for performance in rehabilitation after stroke - results of a pilot project 2001-2008].
Gerdes, N; Funke, U-N; Schüwer, U; Kunze, H; Walle, E; Kleinfeld, A; Reiland, M; Jäckel, W H
2009-08-01
The project aimed at developing and testing a new payment system which provides financial incentives for rehabilitation centers to achieve the best outcomes possible for their patients but does not create additional costs for the insurance funds. The system is conceived as a "quality competition" organized by the centers among themselves with a scientific institute acting as a "referee". Centers with outcomes above average receive a bonus financed by a corresponding malus from the centers below average. In a stepwise process which started in 2001 and was continually accompanied by a scientific institute, we developed the methodological and organizational prerequisites for the new payment system and tested them in two multicentric studies with large case numbers (n=1,058 and n=700, respectively). As a first step, a new assessment instrument (SINGER) was developed and validated in order to measure the outcomes in a reliable, valid, and change-sensitive way. In the second phase, we developed a regression analytic model which predicted the central outcome variable with >84% variance explained. With this model, the different case-mix in the participating centers can be controlled, so that comparisons of outcomes across centers can take place under fair conditions. In the recently completed third phase, we introduced an internet-based programme SINGER-online into which the centers can enter all relevant data. This programme ensures a high quality of all data and makes comparisons of outcomes across all centers possible at any chosen time. The programme contains a special module accessible to the medical services of the health insurance only, which allows sample checks of the data entered by the clinics and helps to ensure that all centers keep to the principles of a fair competition for better quality for their patients. After successful testing of these elements, a functioning model of pay-for-performance in rehabilitation after stroke is now available. 
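The benchmarking logic described above — a regression model predicts each patient's outcome from case-mix variables, and a center's quality signal is how far its observed outcomes sit above or below prediction — can be sketched as below. Center names, scores, and the bonus/malus split at the cross-center average are hypothetical simplifications of the project's actual procedure:

```python
from statistics import mean

def center_rankings(records):
    """records: (center, observed, predicted) outcome scores, with
    predictions from a case-mix regression model.  A center's quality
    signal is its mean residual (observed - predicted); centers above
    the cross-center average receive a bonus, those below a malus."""
    residuals = {}
    for center, observed, predicted in records:
        residuals.setdefault(center, []).append(observed - predicted)
    center_means = {c: mean(v) for c, v in residuals.items()}
    grand = mean(center_means.values())
    return {c: ("bonus" if m > grand else "malus")
            for c, m in center_means.items()}

# Hypothetical data: (center, observed score, model prediction)
data = [("A", 72, 68), ("A", 65, 66), ("B", 60, 63), ("B", 58, 59)]
print(center_rankings(data))  # → {'A': 'bonus', 'B': 'malus'}
```

Comparing residuals rather than raw outcomes is what makes the competition fair across centers with different patient populations.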
(c) Georg Thieme Verlag KG Stuttgart, New York.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, J.; Herner, K.; Jayatilaka, B.
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. Furthermore, these efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
Data preservation at the Fermilab Tevatron
Amerio, S.; Behari, S.; Boyd, J.; ...
2017-01-22
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Data preservation at the Fermilab Tevatron
Boyd, J.; Herner, K.; Jayatilaka, B.; ...
2015-12-23
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. Furthermore, these efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
Development of RFI Mitigation Techniques with Digital Beamforming
NASA Technical Reports Server (NTRS)
Bollian, Tobias; Rincon, Rafael; Fatoyinbo, Temilola; Osmanoglu, Batuhan
2016-01-01
Remote sensing radars with longer wavelengths penetrate deeper into the observed scene and are more suitable for the scientific observation of ice sheets or vegetation. Therefore, SAR systems are moving to lower frequencies such as L- or P-band. However, because the frequency spectrum is a limited resource, the occupied frequency band has to be shared with existing users, who can have a serious impact on imaging quality. Radio frequency interference (RFI) that arrives at the antenna together with the SAR backscatter causes a drop in the signal-to-noise ratio. Despite the high processing gain of the SAR signal, artifacts can appear in the image if the RFI is strong enough. This can corrupt the acquired data and make it unsuitable for scientific purposes. Hence, the investigation of methods for RFI mitigation is critical to the performance of radar missions and to ensuring that they fulfill their primary task.
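The abstract concerns digital-beamforming methods; as a simpler, generic illustration of RFI mitigation, a common first-pass technique is frequency-domain spectral excision: transform the received record, blank bins whose magnitude is anomalously strong (narrowband interference), and transform back. A sketch under that assumption, with a synthetic signal (unit-variance noise plus a strong tone standing in for RFI):

```python
import numpy as np

def excise_rfi(signal, threshold_sigma=4.0):
    """Frequency-domain spectral excision: blank FFT bins whose
    magnitude exceeds a robust (median/MAD-based) threshold, then
    inverse-transform.  A generic first-pass RFI filter, not the
    beamforming approach studied in the paper."""
    spec = np.fft.rfft(signal)
    mag = np.abs(spec)
    med = np.median(mag)
    mad = np.median(np.abs(mag - med)) + 1e-12    # robust spread
    spec[mag > med + threshold_sigma * 1.4826 * mad] = 0.0
    return np.fft.irfft(spec, n=len(signal))

# Synthetic record: broadband noise plus a strong narrowband tone
rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)
x = rng.standard_normal(n) + 10.0 * np.sin(2 * np.pi * 0.125 * t)
clean = excise_rfi(x)
print(clean.std() < x.std())  # the tone's power is removed
```

Excision trades a small loss of backscatter energy in the blanked bins for the removal of interference that would otherwise dominate the image.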
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Boyd, J.; Herner, K.; Jayatilaka, B.; Roser, R.; Sakumoto, W.
2015-12-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. These efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification establishes whether the software solves the mathematical model correctly. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
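The essence of such a verification benchmark — solve a PDE numerically, compare against a closed-form analytical solution, and assert the error is within tolerance — can be shown with a toy example. This is a generic 1-D heat-equation check, not a PFLOTRAN benchmark; the grid, time step, and tolerance are illustrative choices:

```python
import math

def analytical(x, t, alpha):
    # Decaying fundamental mode of u_t = alpha * u_xx on [0, 1]
    # with u(0, t) = u(1, t) = 0 and u(x, 0) = sin(pi x)
    return math.sin(math.pi * x) * math.exp(-(math.pi ** 2) * alpha * t)

def simulate(nx, nt, t_end, alpha):
    # Explicit finite-difference solve of the same problem
    dx = 1.0 / (nx - 1)
    dt = t_end / nt
    r = alpha * dt / dx ** 2          # stability requires r <= 0.5
    assert r <= 0.5, "time step too large for explicit scheme"
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(nt):
        u = [0.0] + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                     for i in range(1, nx - 1)] + [0.0]
    return u

# Verification: numerical vs. analytical solution at t_end
nx, nt, t_end, alpha = 41, 200, 0.05, 1.0
u_num = simulate(nx, nt, t_end, alpha)
dx = 1.0 / (nx - 1)
err = max(abs(u_num[i] - analytical(i * dx, t_end, alpha)) for i in range(nx))
print(f"max |numerical - analytical| = {err:.2e}")
assert err < 1e-2   # benchmark passes within this tolerance
```

A real QA suite runs many such benchmarks automatically and documents, for each one, the conceptual and mathematical description alongside the accuracy assessment, per the four elements listed above.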
NASA Astrophysics Data System (ADS)
Squibb, Gael F.
1984-10-01
The operation teams for the Infrared Astronomical Satellite (IRAS) included scientists from the IRAS International Science Team. The scientific decisions on an hour-to-hour basis, as well as the long-term strategic decisions, were made by science team members. The IRAS scientists were involved in the analysis of the instrument performance, the analysis of the quality of the data, the decision to reacquire data that was contaminated by radiation effects, the strategy for acquiring the survey data, and the process for using the telescope for additional observations, as well as the processing decisions required to ensure the publication of the final scientific products by end of flight operations plus one year. Early in the project, two science team members were selected to be responsible for the scientific operational decisions. One, located at the operations control center in England, was responsible for the scientific aspects of the satellite operations; the other, located at the scientific processing center in Pasadena, was responsible for the scientific aspects of the processing. These science team members were then responsible for approving the design and test of the tools to support their responsibilities and then, after launch, for using these tools in making their decisions. The ability of the project to generate the final science data products one year after the end of flight operations is due in a large measure to the active participation of the science team members in the operations. This paper presents a summary of the operational experiences gained from this scientific involvement.
Schmidt, Wiebke; Raymond, David; Parish, David; Ashton, Ian G C; Miller, Peter I; Campos, Carlos J A; Shutler, Jamie D
2018-01-01
The need to ensure future food security and issues of varying estuarine water quality are driving the expansion of aquaculture into near-shore coastal waters. It is prudent to fully evaluate new or proposed aquaculture sites prior to any substantial financial investment in infrastructure and staffing. Measurements of water temperature, salinity and dissolved oxygen can be used to gain insight into the physical, chemical and biological water quality conditions within a farm site, towards identifying its suitability for farming, both for the stock species of interest and for assessing the potential risk from harmful or toxic algae. The latter can cause closure of shellfish harvesting. Unfortunately, commercial scientific monitoring systems can be cost-prohibitive for small organisations and companies to purchase and operate. Here we describe the design, construction and deployment of a low-cost (<£5000) monitoring buoy suitable for use within a near-shore aquaculture farm or bathing waters. The mooring includes a suite of sensors designed for supporting and understanding variations in near-shore physical, chemical and biological water quality. The system has been designed so that it can be operated and maintained by non-scientific staff, whilst still providing good-quality scientific data. Data collected from two deployments totalling 14 months, one in a coastal bay location and another in an estuary, have illustrated the robust design and provided insight into the suitability of these sites for aquaculture and the potential occurrence of a toxin-producing alga (Dinophysis spp.). The instruments maintained good accuracy during the deployments when compared with independent in situ measurements (e.g. RMSE 0.13-0.16 °C, bias 0.03-0.08 °C), enabling stratification and biological features to be identified, and confirming that the waters were suitable for mussel (Mytilus spp.) and lobster (Homarus gammarus) aquaculture, whilst both sites showed conditions agreeable for Dinophysis spp.
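The accuracy statistics quoted above (RMSE and bias against independent in situ measurements) are straightforward to compute; the sketch below is a generic illustration, and the buoy and reference temperatures are hypothetical values, not data from the study.

```python
import math

def rmse_and_bias(sensor, reference):
    """Root-mean-square error and mean bias of paired sensor vs. reference readings."""
    diffs = [s - r for s, r in zip(sensor, reference)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return rmse, bias

# Hypothetical buoy temperatures vs. an independent reference instrument (deg C)
buoy = [12.51, 13.02, 13.48, 14.11]
ref  = [12.40, 12.95, 13.35, 14.00]
rmse, bias = rmse_and_bias(buoy, ref)
```

A positive bias, as reported for the deployments (0.03-0.08 °C), indicates the sensor reads slightly warm relative to the reference, while the RMSE bounds the typical size of the disagreement.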
Quality-assurance plan for water-resources activities of the U.S. Geological Survey in Idaho
Packard, F.A.
1996-01-01
To ensure continued confidence in its products, the Water Resources Division of the U.S. Geological Survey implemented a policy that all its scientific work be performed in accordance with a centrally managed quality-assurance program. This report establishes and documents a formal policy for current (1995) quality assurance within the Idaho District of the U.S. Geological Survey. Quality assurance is formalized by describing district organization and operational responsibilities, documenting the district quality-assurance policies, and describing district functions. The district conducts its work through offices in Boise, Idaho Falls, Twin Falls, Sandpoint, and at the Idaho National Engineering Laboratory. Data-collection programs and interpretive studies are conducted by two operating units, and operational and technical assistance is provided by three support units: (1) Administrative Services advisors provide guidance on various personnel issues and budget functions, (2) computer and reports advisors provide guidance in their fields, and (3) discipline specialists provide technical advice and assistance to the district and to chiefs of various projects. The district's quality-assurance plan is based on an overall policy that provides a framework for defining the precision and accuracy of collected data. The plan is supported by a series of quality-assurance policy statements that describe responsibilities for specific operations in the district's program. The operations are program planning; project planning; project implementation; review and remediation; data collection; equipment calibration and maintenance; data processing and storage; data analysis, synthesis, and interpretation; report preparation and processing; and training. Activities of the district are systematically conducted under a hierarchy of supervision and management that is designed to ensure conformance with Water Resources Division goals for quality assurance.
The district quality-assurance plan does not describe detailed technical activities that are commonly termed "quality-control procedures." Instead, it focuses on current policies, operations, and responsibilities that are implemented at the management level. Contents of the plan will be reviewed annually and updated as programs and operations change.
Peng, Jing; Tang, Juming; Barrett, Diane M; Sablani, Shyam S; Anderson, Nathan; Powers, Joseph R
2017-09-22
Increasing consumer desire for high quality ready-to-eat foods makes thermal pasteurization important to both food producers and researchers. To be in compliance with the Food Safety Modernization Act (FSMA), food companies seek regulatory and scientific guidelines to ensure that their products are safe. Clearly understanding the regulations for chilled or frozen foods is of fundamental importance to the design of thermal pasteurization processes for vegetables that meet food safety requirements. This article provides an overview of the current regulations and guidelines for pasteurization in the U.S. and in Europe for control of bacterial pathogens. Poorly understood viral pathogens, in terms of their survival in thermal treatments, are an increasing concern for both food safety regulators and scientists. New data on heat resistance of viruses in different foods are summarized. Food quality attributes are sensitive to thermal degradation. A review of thermal kinetics of inactivation of quality-related enzymes in vegetables and the effects of thermal pasteurization on vegetable quality is presented. The review also discusses shelf-life of thermally pasteurized vegetables.
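The thermal inactivation kinetics reviewed in such work are commonly described by the first-order (log-linear) model with D- and z-values; a minimal sketch follows, using hypothetical parameter values rather than any reported in the article.

```python
def d_at_temperature(d_ref, t_ref_c, temp_c, z_c):
    """Bigelow model: D-value (min) at temp_c, given the D-value at reference
    temperature t_ref_c and the z-value (deg C rise that reduces D tenfold)."""
    return d_ref * 10 ** ((t_ref_c - temp_c) / z_c)

def log_reductions(t_min, d_value_min):
    """Decimal (log10) reductions achieved by holding for t_min minutes."""
    return t_min / d_value_min

# Hypothetical quality-related enzyme: D at 90 deg C = 5 min, z = 10 deg C
d95 = d_at_temperature(5.0, 90.0, 95.0, 10.0)  # D-value at 95 deg C
n_log = log_reductions(10.0, d95)              # reductions after 10 min at 95 deg C
```

The same arithmetic underlies pasteurization target setting: a process time is chosen so that the cumulative log reductions meet the required performance criterion for the pathogen or enzyme of concern.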
Rouiller, Yolande; Solacroup, Thomas; Deparis, Véronique; Barbafieri, Marco; Gleixner, Ralf; Broly, Hervé; Eon-Duval, Alex
2012-06-01
The production bioreactor step of an Fc-Fusion protein manufacturing cell culture process was characterized following Quality by Design principles. Using scientific knowledge derived from the literature and process knowledge gathered during development studies and manufacturing to support clinical trials, potential critical and key process parameters with a possible impact on product quality and process performance, respectively, were determined during a risk assessment exercise. The identified process parameters were evaluated using a design of experiment approach. The regression models generated from the data allowed characterizing the impact of the identified process parameters on quality attributes. The main parameters having an impact on product titer were pH and dissolved oxygen, while those having the highest impact on process- and product-related impurities and variants were pH and culture duration. The models derived from characterization studies were used to define the cell culture process design space. The design space limits were set in such a way as to ensure that the drug substance material would consistently have the desired quality. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Leibovici, D. G.; Pourabdollah, A.; Jackson, M.
2011-12-01
Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Managing tools, dealing with qualitative and quantitative metadata measures of the quality associated with a workflow, are therefore required for the modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) first through the provision of a metadata profile for the quality of processes, and (ii) through providing a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]).
An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with the visualization pointing out the need to improve the workflow with better data or better processes on the workflow graph itself.
References
[1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5.
[2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France.
[3] OGC (2011) www.opengeospatial.org
[4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025.
[5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria.
[6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK.
Quality-control materials in the USDA National Food and Nutrient Analysis Program (NFNAP).
Phillips, Katherine M; Patterson, Kristine Y; Rasor, Amy S; Exler, Jacob; Haytowitz, David B; Holden, Joanne M; Pehrsson, Pamela R
2006-03-01
The US Department of Agriculture (USDA) Nutrient Data Laboratory (NDL) develops and maintains the USDA National Nutrient Databank System (NDBS). Data are released from the NDBS for scientific and public use through the USDA National Nutrient Database for Standard Reference (SR) (http://www.ars.usda.gov/ba/bhnrc/ndl). In 1997 the NDL initiated the National Food and Nutrient Analysis Program (NFNAP) to update and expand its food-composition data. The program included: 1) nationwide probability-based sampling of foods; 2) central processing and archiving of food samples; 3) analysis of food components at commercial, government, and university laboratories; 4) incorporation of new analytical data into the NDBS; and 5) dissemination of these data to the scientific community. A key feature and strength of the NFNAP was a rigorous quality-control program that enabled independent verification of the accuracy and precision of analytical results. Custom-made food-control composites and/or commercially available certified reference materials were sent to the laboratories, blinded, with the samples. Data for these materials were essential for ongoing monitoring of analytical work, for identifying and resolving suspected analytical problems, and for ensuring the accuracy and precision of results for the NFNAP food samples.
Providing Goal-Based Autonomy for Commanding a Spacecraft
NASA Technical Reports Server (NTRS)
Rabideau, Gregg; Chien, Steve; Liu, Ning
2008-01-01
A computer program for use aboard a scientific-exploration spacecraft autonomously selects among goals specified in high-level requests and generates corresponding sequences of low-level commands, understandable by spacecraft systems. (As used here, 'goals' signifies specific scientific observations.) From a dynamic, onboard set of goals that could oversubscribe spacecraft resources, the program selects a non-oversubscribing subset that maximizes a quality metric. In an early version of the program, the requested goals are assumed to have fixed starting times and durations. Goals can conflict by exceeding a limit on either the number of separate goals or the number of overlapping goals making demands on the same resource. The quality metric used in this version is chosen to ensure that a goal will never be replaced by another having lower priority. At any time, goals can be added or removed, or their priorities can be changed, and the 'best' goal will be selected. Once a goal has been selected, the program implements a robust, flexible approach to generation of low-level commands: Rather than generate rigid sequences with fixed starting times, the program specifies flexible sequences that can be altered to accommodate run time variations.
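As a rough illustration (not the actual flight software), the selection behavior described above can be sketched as a greedy pass over goals in priority order, admitting a goal only if it does not oversubscribe the shared resource; the priority ordering guarantees that a selected goal is never displaced by a lower-priority one. All names and the overlap limit below are assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    name: str
    start: int      # fixed start time (early-version assumption from the abstract)
    duration: int
    priority: int   # higher value wins

def overlaps(a, b):
    """True if the two goals' time intervals intersect."""
    return a.start < b.start + b.duration and b.start < a.start + a.duration

def select_goals(requests, max_overlapping=1):
    """Greedy, priority-ordered selection: keep a goal only while the number of
    overlapping selected goals on the resource stays under the allowed limit."""
    selected = []
    for g in sorted(requests, key=lambda g: -g.priority):
        clashes = sum(1 for s in selected if overlaps(g, s))
        if clashes < max_overlapping:
            selected.append(g)
    return selected

requests = [Goal("A", 0, 5, 3), Goal("B", 3, 4, 2), Goal("C", 6, 2, 1)]
chosen = select_goals(requests)  # "B" conflicts with higher-priority "A"
```

Because goals are considered in descending priority, re-running the selection after goals are added, removed, or re-prioritized always yields the current "best" non-oversubscribing subset, mirroring the dynamic behavior the abstract describes.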
NASA Astrophysics Data System (ADS)
Chen, R. S.; Levy, M. A.; de Sherbinin, A. M.; Fischer, A.
2015-12-01
The Sustainable Development Goals (SDGs) represent an unprecedented international commitment to a shared future encompassing sustainable management of the planet and significant improvement in the human condition around the world. The scientific community has both an ethical responsibility and substantial self-interest—as residents of this planet—to help the world community to better understand the complex, interlinked behavior of human and environmental systems and to elucidate pathways to achieve long-term sustainability. Critical to making progress towards the SDGs is the open availability of timely, reliable, usable, and well integrated data and indicators relevant to all SDGs and associated targets. Such data and indicators will not only be valuable in monitoring and evaluation of progress, but also in developing policies and making decisions on environmental and societal issues affecting sustainability from local to global scales. The open availability of such data and indicators can help motivate performance, promote accountability, and facilitate cooperation. A range of scientific, technical, organizational, political, and resource challenges need to be addressed in developing a coherent SDG monitoring and indicator framework. For example, assembling and integrating diverse data on consistent spatial and temporal scales across the relevant natural, social, health, and engineering sciences pose both scientific and technical difficulties, and may require new ways to interlink and organize existing cyberinfrastructure, reconcile different data policy regimes, and fund integration efforts. New information technologies promise more timely and efficient ways of collecting many types of data, but may also raise privacy, control, and equity issues. Scientific review processes to ensure data quality need to be coordinated with the types of quality control and review employed by national statistical agencies for trusted economic and social statistics. 
Although large investments are already being made in some observing systems such as satellite-based remote sensing, additional resources are needed to fill key gaps, make data useful for decision making, and build capacity in developing countries. Broad engagement by the scientific community is urgently needed.
A quality-refinement process for medical imaging applications.
Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I
2009-01-01
To introduce and evaluate a process for the refinement of software quality that is suitable for research groups. To avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes: it focuses on quality criteria that are important at the given stage of the software life cycle, and it emphasizes tools that automate aspects of the process. To evaluate the additional effort that comes with the process, it was applied, by way of example, to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the use of automated process tools lead to a lightweight quality-refinement process, suitable for scientific research groups, that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.
NASA Astrophysics Data System (ADS)
Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.
2012-12-01
As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. 
Working with partner institutions, such as the Carbon Dioxide Information Analysis Center (CDIAC), Biological and Chemical Oceanography Data management Office (BCO-DMO), and federal labs, NODC is exploring the challenges of coordinated data flow and quality control for diverse ocean acidification data sets. These data sets include data from coastal and ocean monitoring, laboratory and field experiments, model output, and remotely sensed data. NODC already has in place automated data extraction protocols for archiving oceanographic data from BCO-DMO and CDIAC. We present a vision for how these disparate data streams can be more fully utilized when brought together using data standards. Like the Multiple-Listing Service in the real estate market, the OADS project is dedicated to developing a repository of ocean acidification data from all sources, and to serving them to the ocean acidification community using a user-friendly interface in a timely manner. For further information please contact NODC.Ocean.Acidification@noaa.gov.
Implementing scientific evidence to improve the quality of Child Protection
Cowley, Laura; Tempest, Vanessa; Maguire, Sabine; Mann, Mala; Naughton, Aideen; Wain, Laura; Kemp, Alison
2013-01-01
In contrast to other areas of medical practice, there was a lack of a clear, concise and accessible synthesis of the scientific literature to aid the recognition and investigation of suspected child abuse, and no national training program or evidence-based guidelines for clinicians. The project's aim was to identify the current scientific evidence for the recognition and investigation of suspected child abuse and neglect and to disseminate and introduce this into clinical practice. Since 2003, a comprehensive program of systematic reviews of all aspects of physical abuse, emotional abuse, and neglect of children has been developed. Methodology was devised based on NHS Centre for Reviews and Dissemination standards, and reviewers were trained. Dissemination was via peer-reviewed publications, a series of leaflets highlighting key points in a question-and-answer format, and a website. To date, 21 systematic reviews have been completed, generating 28 peer-reviewed publications and six leaflets around each theme (e.g., fractures, bruising). More than 250,000 leaflets have been distributed to date. Our website generates more than 10,000 hits monthly. It hosts primary reviews that are updated annually, links to all included studies, publications, and detailed methodology. The reviews have directly informed five national clinical guidelines and the first evidence-based training in child maltreatment. Child abuse is every health practitioner's responsibility, and it is vital that the decisions made are evidence based, as is expected in all other fields of medicine. Although challenging, this project demonstrates that it is possible to conduct high-quality systematic reviews in this field. For the first time, a clear, concise synthesis of up-to-date scientific evidence is available to all practitioners in a range of accessible formats. This has underpinned high-quality national guidance and training programs.
It ensures all professionals have the appropriate knowledge base in this difficult and challenging field. PMID:26734183
Integrating Data and Networks: Human Factors
NASA Astrophysics Data System (ADS)
Chen, R. S.
2012-12-01
The development of technical linkages and interoperability between scientific networks is a necessary but not sufficient step towards integrated use and application of networked data and information for scientific and societal benefit. A range of "human factors" must also be addressed to ensure the long-term integration, sustainability, and utility of both the interoperable networks themselves and the scientific data and information to which they provide access. These human factors encompass the behavior of both individual humans and human institutions, and include system governance, a common framework for intellectual property rights and data sharing, consensus on terminology, metadata, and quality control processes, agreement on key system metrics and milestones, the compatibility of "business models" in the short and long term, harmonization of incentives for cooperation, and minimization of disincentives. Experience with several national and international initiatives and research programs such as the International Polar Year, the Group on Earth Observations, the NASA Earth Observing Data and Information System, the U.S. National Spatial Data Infrastructure, the Global Earthquake Model, and the United Nations Spatial Data Infrastructure provide a range of lessons regarding these human factors. Ongoing changes in science, technology, institutions, relationships, and even culture are creating both opportunities and challenges for expanded interoperability of scientific networks and significant improvement in data integration to advance science and the use of scientific data and information to achieve benefits for society as a whole.
McDonald, Fiona; Jones, Mavis; Ozdemir, Vural; Graham, Janice E.
2010-01-01
Standardization is critical to scientists and regulators to ensure the quality and interoperability of research processes, as well as the safety and efficacy of the attendant research products. This is perhaps most evident in the case of “omics science,” which is enabled by a host of diverse high-throughput technologies such as genomics, proteomics, and metabolomics. But standards are of interest to (and shaped by) others far beyond the immediate realm of individual scientists, laboratories, scientific consortia, or governments that develop, apply, and regulate them. Indeed, scientific standards have consequences for the social, ethical, and legal environment in which innovative technologies are regulated, and thereby command the attention of policy makers and citizens. This article argues that standardization of omics science is both technical and social. A critical synthesis of the social science literature indicates that: (1) standardization requires a degree of flexibility to be practical at the level of scientific practice in disparate sites; (2) the manner in which standards are created, and by whom, will impact their perceived legitimacy and therefore their potential to be used; and (3) the process of standardization itself is important to establishing the legitimacy of an area of scientific research. PMID:20455752
Weng, Naidong
2012-11-01
In the pharmaceutical industry, bioanalysis is very dynamic and is probably one of the few fields of research covering the entire drug discovery, development and post-marketing process. Important decisions on drug safety can partially rely on bioanalytical data, which therefore can be subject to regulatory scrutiny. Bioanalytical scientists have historically contributed significant numbers of scientific manuscripts in many peer-reviewed analytical journals. All of these journals provide some high-level instructions, but they also leave sufficient flexibility for reviewers to perform independent critique and offer recommendations for each submitted manuscript. Reviewers play a pivotal role in the process of bioanalytical publication to ensure the publication of high-quality manuscripts in a timely fashion. Their efforts usually lead to improved manuscripts. However, it has to be a joint effort among authors, reviewers and editors to promote scientifically sound and ethically fair bioanalytical publications. Most of the submitted manuscripts were well written with only minor or moderate revisions required for further improvement. Nevertheless, there were small numbers of submitted manuscripts that did not meet the requirements for publications because of scientific or ethical deficiencies, which are discussed in this Letter to the Editor. Copyright © 2012 John Wiley & Sons, Ltd.
Due Diligence Processes for Public Acquisition of Mining-Impacted Landscapes
NASA Astrophysics Data System (ADS)
Martin, E.; Monohan, C.; Keeble-Toll, A. K.
2016-12-01
The acquisition of public land is critical for achieving conservation and habitat goals in rural regions projected to experience continuously high rates of population growth. To ensure that public funds are used responsibly in the purchase of conservation easements, appropriate due diligence processes must be established that limit landowner liability post-acquisition. Traditional methods of characterizing contamination in regions where legacy mining activities were prevalent may not draw on current scientific knowledge and understanding of contaminant fate, transport and bioavailability, and are therefore likely to produce type II errors. Agency-prescribed assessment methods used under CERCLA in many cases fail to detect contamination that presents liability issues, because they do not require the water quality sampling that would reveal the offsite transport potential of contaminants posing human health risks, including mercury. Historical analysis can be used to inform judgmental sampling to identify hotspots and contaminants of concern. Land acquisition projects at two historic mine sites in Nevada County, California, the Champion Mine Complex and the Black Swan Preserve, have established the necessity of re-thinking due diligence processes for mining-impacted landscapes. These pilot projects demonstrate that pre-acquisition assessment in the Gold Country must include judgmental sampling and evaluation of contaminant transport. Best practices using current scientific knowledge must be codified by agencies, consultants, and NGOs in order to ensure responsible use of public funds and to safeguard public health.
Data Quality Parameters and Web Services Facilitate User Access to Research-Ready Seismic Data
NASA Astrophysics Data System (ADS)
Trabant, C. M.; Templeton, M. E.; Van Fossen, M.; Weertman, B.; Ahern, T. K.; Casey, R. E.; Keyson, L.; Sharer, G.
2016-12-01
IRIS Data Services has the mission of providing efficient access to a wide variety of seismic and related geoscience data to the user community. With our vast archive of freely available data, we recognize that there is a constant challenge to provide data to scientists and students that are of a consistently useful level of quality. To address this issue, we began by undertaking a comprehensive survey of the data and generating metrics measurements that provide estimates of data quality. These measurements can inform the scientist of the level of suitability of a given set of data for their scientific investigation. They also serve as a quality assurance check for network operators, who can act on this information to improve their current recording or mitigate issues with already recorded data and metadata. Following this effort, IRIS Data Services is moving forward to focus on providing tools for the scientist that make it easier to access data of a quality and characteristic that suits their investigation. Data that fulfill this criterion are termed "research-ready". In addition to filtering data by type, geographic location, proximity to events, and specific time ranges, we will offer the ability to filter data based on specific quality assessments. These include signal-to-noise ratio measurements, data continuity, timing quality, absence of channel cross-talk, and potentially many other factors. Our goal is to ensure that the user receives only the data that meets their specifications and will not require extensive review and culling after delivery. We will present the latest developments of the MUSTANG automated data quality system and introduce the Research-Ready Data Sets (RRDS) service. Together these two technologies serve as a data quality assurance ecosystem that will provide benefit to the scientific community by aiding efforts to readily find appropriate and suitable data for use in any number of objectives.
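Conceptually, the "research-ready" filtering described above reduces to thresholding per-channel quality measurements before delivery; the sketch below illustrates the idea, with metric names, channel identifiers, and thresholds that are illustrative placeholders rather than actual MUSTANG measurement names.

```python
def research_ready(channels, min_snr=10.0, min_continuity=0.95):
    """Keep only channels whose quality metrics meet the requested thresholds.
    Metric names here are hypothetical, not actual MUSTANG measurements."""
    return [c for c in channels
            if c["snr"] >= min_snr and c["continuity"] >= min_continuity]

channels = [
    {"id": "NET.STA1..BHZ", "snr": 22.5, "continuity": 0.99},
    {"id": "NET.STA2..BHZ", "snr": 4.1,  "continuity": 0.99},
    {"id": "NET.STA3..BHZ", "snr": 15.0, "continuity": 0.80},
]
ready = research_ready(channels)  # only STA1 passes both thresholds
```

In a real request the same thresholds would be combined with the usual selection criteria (data type, geographic location, event proximity, time range), so the user receives only data that already meet their quality specification.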
Policy for Robust Space-based Earth Science, Technology and Applications
NASA Technical Reports Server (NTRS)
Brown, Molly E.; Escobar, Vanessa M.; Macauley, Molly; Aschbacher, Josef; Milagro-Perez, Maria Pilar; Doorn, Bradley; Friedl, Lawrence
2012-01-01
Over the past six decades, satellite remote sensing technology has contributed to transforming the use of Earth science, not only to advance science but also to improve quality of life. With satellite missions launched almost every year, new types of Earth science data are being incorporated into science, models and decision-making systems in a broad array of organizations. A challenge for space agencies has been ensuring that satellite missions serve both the scientific community and the applied community of decision makers without the missions becoming unfocused and overly expensive. By understanding and considering the needs of the environmental data and applied research user community early in the mission-design process, agencies can ensure that satellites meet the needs of multiple constituencies. This paper describes the mission development process at the European Space Agency and the National Aeronautics and Space Administration and compares and contrasts the successes of and challenges faced by these agencies in balancing science and applications within their missions.
30 CFR 280.2 - What is the purpose of this part?
Code of Federal Regulations, 2011 CFR
2011-07-01
...: (a) Allow you to conduct prospecting activities or scientific research activities on the OCS in...) Ensure that you carry out prospecting activities or scientific research activities in a safe and...
EDP Sciences and A&A: partnering to provide services to support the scientific community
NASA Astrophysics Data System (ADS)
Henri, Agnes
2015-08-01
Scholarly publishing is no longer about simply producing and packaging articles and sending them out to subscribers. To be successful, as well as being global and digital, publishers and their journals need to be fully engaged with their stakeholders (authors, readers, funders, libraries, etc.) and constantly developing new products and services to support their needs in the ever-changing environment in which we work. Astronomy & Astrophysics (A&A) is a high-quality, major international journal that belongs to the astronomical communities of a consortium of European and South American countries and is sponsored by ESO. EDP Sciences is a non-profit publisher belonging to several learned societies and is appointed by ESO to publish the journal. Over the last decade, as well as publishing the results of worldwide astronomical and astrophysical research, A&A and EDP Sciences have worked in partnership to develop a wide range of services for the authors and readers of A&A:
- A specialist language editing service: to provide a clear and excellent level of English, ensuring full understanding of the high-quality science.
- A flexible and progressive Open Access policy, including Gold and Green options and strong links with arXiv.
- Enriched articles: authors are able to enhance their articles using a wide range of rich media such as 3D models, videos and animations.
- Multiple publishing formats: allowing readers to browse articles on multiple devices, including eReaders and Kindles.
- "Scientific Writing for Young Astronomers": in 2008 EDP Sciences and A&A set up the Scientific Writing for Young Astronomers (SWYA) School with the objective of teaching early PhD students how to write correct and efficient scientific papers for different media (journals, proceedings, thesis manuscripts, etc.).
Agency for toxic substances and disease registry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The congressional mandates under which the Agency for Toxic Substances and Disease Registry (ATSDR) operates are generally broad in scope, but very specific in intent. They concern the health effects of human exposure to hazardous substances in the environment. This report recounts the accomplishments in meeting specific mandates and indicates plans and directions for work to meet others. The report is organized by program area and covers the federal fiscal year 1987 (October 1, 1986 through September 30, 1987). Two items of importance were undertaken in FY 1987 by senior management at ATSDR that are not directly reportable by individual program area: first, the priorities of the agency's programs were reordered, and second, the formation of an ATSDR Board of Scientific Counselors was initiated. The reordering of priorities reflects the agency's having met certain mandates (such as completion of the first 25 toxicological profiles) and takes cognizance of other congressionally mandated deadlines (such as performing health assessments for all National Priorities List Superfund sites). The agency is establishing a Board of Scientific Counselors to provide advice and guidance on ATSDR's programs to ensure scientific quality, timeliness, utility, and dissemination of results. Specifically, the board will advise on the adequacy of science in ATSDR-supported research, emerging problems that require scientific investigation, accuracy and currency of science in ATSDR reports, and program areas to be emphasized and/or deemphasized. The Agency for Toxic Substances and Disease Registry continued, in FY 1987, to meet its mission of preventing or mitigating adverse human health effects and diminished quality of life resulting from exposure to hazardous substances in the environment. 156 refs.
LANL Multiyear Strategy Performance Improvement (MYSPI), Fiscal Years 2017–2021
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leasure, Craig Scott
2016-05-03
Los Alamos National Laboratory (LANL) protects the nation and the world using innovative science, technology, and engineering through an integrated approach that harnesses the strength of our people, capabilities, and operations. The Laboratory’s Strategic Plan and Purpose statement provide the framework for scientific excellence and operational excellence now and in the future. Our Strategic Plan and Purpose help position Los Alamos for continuing mission success that ensures the safety, security, and effectiveness of the nation’s deterrent; protects the nation from nuclear and emerging threats through our larger global security missions; provides energy security to the nation; and ensures that the nation’s scientific reputation and capabilities remain robust enough to assure our allies and deter our adversaries. Moreover, we use these principles and guidance to ensure that Los Alamos is successful in attracting, recruiting, and retaining the next generation of world-class talent, while creating an efficient, environmentally responsible workplace that provides our employees with access to modern scientific tools and resources. Using this guidance and its underlying principles, we are continuing to restore credibility and operational effectiveness to the Laboratory, deliver mission success and continuing scientific excellence, and protect our employees and the nation’s secrets.
LANL Multiyear Strategy Performance Improvement (MYSPI), Fiscal Years 2018-2022
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leasure, Craig Scott
Los Alamos National Laboratory (LANL) protects the nation and the world using innovative science, technology, and engineering through an integrated approach that harnesses the strength of our people, capabilities, and operations. The Laboratory’s Strategic Plan and Purpose statement provide the framework for scientific excellence and operational excellence now and in the future. Our Strategic Plan and Purpose help position Los Alamos for continuing mission success that ensures the safety, security, and effectiveness of the nation’s deterrent; protects the nation from nuclear and emerging threats through our larger global security missions; provides energy security to the nation; and ensures that the nation’s scientific reputation and capabilities remain robust enough to assure our allies and deter our adversaries. Moreover, we use these principles and guidance to ensure that Los Alamos is successful in attracting, recruiting, and retaining the next generation of excellent talent, while creating an efficient, environmentally responsible workplace that provides our employees with access to modern scientific tools and resources. Using this guidance and its underlying principles, we are continuing to restore credibility and operational effectiveness to the Laboratory, deliver mission success and continuing scientific excellence, and protect our employees and the nation’s secrets.
Fisher, Adam C; Lee, Sau L; Harris, Daniel P; Buhse, Lucinda; Kozlowski, Steven; Yu, Lawrence; Kopcha, Michael; Woodcock, Janet
2016-12-30
Failures surrounding pharmaceutical quality, particularly with respect to product manufacturing issues and facility remediation, account for the majority of drug shortages and product recalls in the United States. Major scientific advancements pressure established regulatory paradigms, especially in the areas of biosimilars, precision medicine, combination products, emerging manufacturing technologies, and the use of real-world data. Pharmaceutical manufacturing is increasingly globalized, prompting the need for more efficient surveillance systems for monitoring product quality. Furthermore, increasing scrutiny and accelerated approval pathways provide a driving force to be even more efficient with limited regulatory resources. To address these regulatory challenges, the Office of Pharmaceutical Quality (OPQ) in the Center for Drug Evaluation and Research (CDER) at the U.S. Food and Drug Administration (FDA) harbors a rigorous science and research program in core areas that support drug quality review, inspection, surveillance, standards, and policy development. Science and research are the foundation of risk-based quality assessment of new drugs, generic drugs, over-the-counter drugs, and biotechnology products including biosimilars. This is an overview of the science and research activities in OPQ that support the mission of ensuring that safe, effective, and high-quality drugs are available to the American public. Published by Elsevier B.V.
New challenges in assuring vaccine quality.
Dellepiane, N.; Griffiths, E.; Milstien, J. B.
2000-01-01
In the past, quality control of vaccines depended on use of a variety of testing methods to ensure that the products were safe and potent. These methods were developed for vaccines whose safety and efficacy were based on several years' worth of data. However, as vaccine production technologies have developed, so have the testing technologies. Tests are now able to detect potential hazards with a sensitivity not possible a few years ago, and an increasing array of physicochemical methods allows a much better characterization of the product. In addition to sophisticated tests, vaccine regulation entails a number of other procedures to ensure safety. These include characterization of starting materials by supplier audits, cell banking, seed lot systems, compliance with the principles of good manufacturing practices, independent release of vaccines on a lot-by-lot basis by national regulatory authorities, and enhanced pre- and post-marketing surveillance for possible adverse events following immunization. These procedures help assure vaccine efficacy and safety, and some examples are given in this article. However, some contaminants of vaccines that can be detected by newer assays raise theoretical safety concerns, but their presence may be less hazardous than not giving the vaccines. Thus, risk-benefit decisions must be well informed and based on scientific evidence. PMID:10743279
NASA Astrophysics Data System (ADS)
Ivanov, Nikolay; Safe Aldeen, Ahmed
2018-03-01
Recently, more and more attention in the scientific literature has been drawn to improving the sustainability of organizations. The growth in the volume of high-rise construction in Russia makes the task of assessing and ensuring the sustainability of the organizations and enterprises leading this type of construction very relevant. The article considers an approach to assessing the sustainability of an organization's activities in the context of the functioning of its quality management system (QMS). It puts forward the hypothesis that the assessment of the sustainability of an organization that has a real and efficiently functioning quality management system can be based on the results of assessing the effectiveness of the QMS. The article describes in detail the sequence of actions needed to form a list of criteria for assessing the effectiveness of the QMS and the sustainability of the organization, and to evaluate both characteristics on the basis of these criteria. For a clear interpretation of the results obtained, the authors use so-called petal diagrams and suggest an original approach to their creation and analysis. Based on the results of the study, the authors conclude that in order to assess the sustainability of enterprises and organizations, analysis of the dynamics of changes in the basic sustainability factors is mandatory.
Sociometric approaches for managing military units and predicting of behavior of military personnel
NASA Astrophysics Data System (ADS)
Kudro, Nataliya M.; Puzikova, Svetlana M.
2017-09-01
In the Republic of Kazakhstan, military service is attractive primarily to those who have no opportunity to acquire high-quality vocational or higher education or a decent income in their speciality, or who have not yet identified themselves professionally and socially. Ensuring the ability of military units to execute their service duties is therefore a serious problem, given ever-increasing requirements for the professional competences of military personnel and the increased intellectualization of military service, while the quality of the "human material" often does not correspond to the required standards. This problem is still being worked on in national and foreign science and has no final solutions accepted by the scientific community. This article offers specialists in military administration one possible tool for forecasting how successfully military units will execute professional tasks, based on the results of sociometric studies and algorithms for constructing Bayesian networks. Using these tools, a military leader will be able to evaluate the effectiveness of his managerial activity, correct mechanisms of individual and mentoring work with regard to individual servicemen, and eliminate risks of failing to fulfill professional tasks on time or failing to ensure the combat readiness of the entrusted military team.
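The prediction idea can be illustrated with a toy conditional probability table in the spirit of a Bayesian network linking sociometric measurements to task success. Everything below is invented for illustration: the variables, thresholds, and probabilities are not taken from the study.

```python
# Hypothetical sketch: a tiny conditional probability table mapping a unit's
# sociometric cohesion and average competence to P(task success).
# All probabilities and thresholds are invented for the example.

P_SUCCESS = {
    # (cohesion_high, competence_high): P(task success)
    (True, True): 0.95,
    (True, False): 0.70,
    (False, True): 0.60,
    (False, False): 0.30,
}

def predict_success(cohesion_score, competence_score,
                    cohesion_threshold=0.5, competence_threshold=0.5):
    """Discretize continuous sociometric scores and look up P(success)."""
    key = (cohesion_score >= cohesion_threshold,
           competence_score >= competence_threshold)
    return P_SUCCESS[key]
```

A real Bayesian network would learn these conditional probabilities from sociometric survey data and chain many such tables together; the sketch shows only the lookup step a commander's tool would perform.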
Zimmer, LO; Nolen, TL; Pramanpol, S; Wallace, D; Walker, ME; Pappas, P; Chetchotisakd, P
2010-01-01
Background International clinical trials can provide scientific and logistic benefits in spite of the many challenges. Determining whether a country, especially a developing country, is an appropriate location for the research should include in-country consultation and partnering to assess its social value for the population; that treatments are relevant for the population under study; and that the research infrastructure and ethical oversight are adequate. Collaboration increases the likelihood of study success and helps ensure that benefits accrue to recruited populations and their community. Purpose This paper describes our experiences on a bi-national study and may provide guidance for those planning to engage in future collaborations. Methods A Thai and United States team collaborated to develop and implement a Phase II clinical trial for HIV-associated cryptococcal meningitis to assess safety and tolerability of combination therapy versus standard treatment. Clinical and cultural differences, regulatory hurdles and operational issues were addressed before and during the study to ensure a successful collaboration between the 2 groups. Results The international multicenter study allowed for more rapid enrollment, reduced costs to complete the study, sharing of the benefits of research, greater generalizability of results and capacity building in Thailand; quality metrics in Thailand were equivalent to or better than those in the U.S. Conclusions Conducting successful clinical trials internationally requires early and ongoing collaboration to ensure the study meets sites’ requirements and expectations, conforms to varying national regulations, adheres to data quality standards and is responsive to the health needs of studied populations. PMID:19897055
Multi-station basis for Polar Cap (PC) indices: ensuring credibility and operational reliability
NASA Astrophysics Data System (ADS)
Stauning, Peter
2018-02-01
The Polar Cap (PC) indices, PCN (North) and PCS (South) are based on polar geomagnetic observations from Qaanaaq (Thule) and Vostok, respectively, processed to measure the transpolar plasma convection that may seriously affect space weather conditions. To establish reliable space weather forecasts based on PC indices, and also to ensure credibility of their use for scientific analyses of solar wind-magnetosphere interactions, additional sources of data for the PC indices are investigated. In the search for alternative index sources, objective quality criteria are established here to be used for the selection among potential candidates. These criteria are applied to existing PC index series to establish a quality scale. In the Canadian region, the data from Resolute Bay magnetometer are shown to provide alternative PCN indices of adequate quality. In Antarctica, the data from Concordia Dome-C observatory are shown to provide basis for alternative PCS indices. In examples to document the usefulness of these alternative index sources it is shown that PCN indices in a real-time version based on magnetometer data from Resolute Bay could have given 6 h of early warning, of which the last 2 h were "red alert", up to the onset of the strong substorm event on 13 March 1989 that caused power outage in Quebec. The alternative PCS indices based on data from Dome-C have helped to disclose that presently available Vostok-based PCS index values are corrupted throughout most of 2011.
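As a minimal illustration of how a real-time PC-index warning such as the one described for the March 1989 event might be issued, the sketch below classifies index values into alert levels. The threshold values are assumptions made for the example, not the operational values used in the study.

```python
# Illustrative alert classifier for a real-time PC index feed.
# The thresholds (2.0 for "warning", 10.0 for "red alert") are assumptions.

def alert_level(pc_index, warn=2.0, red=10.0):
    """Classify a single PC index value into an alert level."""
    if pc_index >= red:
        return "red alert"
    if pc_index >= warn:
        return "warning"
    return "quiet"

def earliest_warning(series):
    """Return the position of the first sample at or above the warning level."""
    for i, value in enumerate(series):
        if alert_level(value) != "quiet":
            return i
    return None
```

Running `earliest_warning` over a time-ordered index series gives the lead time a forecaster would have had before an event onset, which is the kind of hindcast check the abstract describes for the Quebec power-outage substorm.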
Scientific Research & Subsistence: Protocols to Ensure Co-Existence
NASA Astrophysics Data System (ADS)
Nachman, C.; Holman, A.; DeMaster, D.
2017-12-01
Commercial, industrial, and research interests in the Arctic are expanding rapidly. The possibilities are numerous and exciting, giving rise to the need for guidelines to ensure that the activities of different waterway users do not conflict. Of particular concern is the potential for adverse impacts to U.S. Arctic coastal communities that rely on living marine resources for nutritional and cultural health, through subsistence hunts from small craft, ice edges, and shore. Recent events raised concerns over research surveys potentially interfering with subsistence hunts in the Bering, Chukchi, and Beaufort Seas. These incidents led to calls by Alaska Native communities to restrict science activities, with a mixed response from the scientific community (i.e., some sympathetic, some defensive). With a common goal of mitigating this potential interaction, Federal agencies made a commitment in the National Strategy for the Arctic Region to coordinate and consult with Alaska Natives and also to pursue responsible Arctic stewardship, with understanding through scientific research and traditional knowledge. The effort to create a "Standard of Care" for research surveys incorporates years of experience by subsistence hunters working to mitigate the impacts of other anthropogenic activities in the region, as well as best practices by many in the research community. The protocols are designed to ensure that potential conflicts between the scientific research community and subsistence hunters are avoided and to encourage mutual assistance and collaboration between researchers and hunters. The guidelines focus on enhancing communication between researchers and subsistence hunters before, during, and after research occurs. The best management practices outlined in the Standard of Care assist those overseeing and funding scientific research in making decisions about how best to accomplish the goals of the research while ensuring protection of the Alaska subsistence lifestyle.
These protocols could also be used in a larger context to address concerns over increased vessel traffic from other activities. We will outline the importance of establishing the guidelines, describe the general process, and highlight examples of positive interactions with Alaska Native hunters during scientific research operations using this protocol.
NASA Astrophysics Data System (ADS)
Arvidson, R.; Bell, J. F., III; Kaplan, D.; Marshall, J.; Mishkin, A.; Saunders, S.; Smith, P.; Squyres, S.
1999-03-01
The Science Operations Working Group, Mars 2001 Mission, has developed coordinated plans for scientific observations that treat the instruments as an integrated payload. This approach ensures maximum return of scientific information.
Scientific Review in Cancer Clinical Trials
Scientific review ensures that studies are based on sound science, which contributes to the safety of clinical trial participants. Learn about the role of Institutional Review Boards (IRBs), Data and Safety Monitoring Boards (DSMBs), and government agencies.
Code of Federal Regulations, 2010 CFR
2010-01-01
... monument-related scientific exploration and research, tourism, and recreational and economic activities and... and enforcement necessary to ensure that scientific exploration and research, tourism, and...
Masic, Izet; Mujanovic, Olivera Batic; Racic, Maja; Gavran, Larisa; Stanetic, Kosana; Hodzic, Merzika; Cojic, Milena; Cvejanov-Kezunovic, Ljiljana; Stepanovic, Aleksandar; Stavrikj, Katarina; Jatic, Zaim; Obrdalj, Edita Cerny; Zalihic, Amra; Tusek-Bunc, Ksenija
2017-03-01
Education means learning, teaching, or the process of acquiring skills or modifying behavior through various exercises. Traditionally, medical education meant the oral, practical, and more passive transfer of knowledge and skills from educators to students and health professionals. Today, all involved agree on the importance of focusing on educational quality, particularly in professions providing services that people require. The higher education system shoulders critical responsibilities in the economic, social, cultural, and educational development and growth of communities. In countries in transition it is in charge of educating the professional workforce in every field, and if the education is of optimal quality, it is capable of carrying out these responsibilities. This is why strategies must be found to raise the quality of education, especially at the university level. By increasing courses and establishing universities and higher education centers, countries around the world have generated more opportunities for learning, especially using modern information technologies. In evaluating the quality of different educational services, one of the most important measures should be developing programs to promote quality; moreover, given the shortage of resources, evaluating service quality enables management to allocate limited financial resources to realizing the whole educational process. Advances in medicine in recent decades correlate significantly with advances in new models and concepts of medical education supported by information technologies. Modern information technologies have enabled faster, more reliable, and more comprehensive data collection.
These technologies have also started to create a large amount of irrelevant information, which is a limiting factor and a real, growing gap between medical knowledge on the one hand and the ability of students and physicians to follow its growth on the other. Furthermore, in our environment the term technology is generally reserved for its technical component. Here it means not only the purchase of computers and related equipment, but also technological foresight and technological progress, defined as a specific combination of fundamental scientific, research, and development work that gives a concrete result. The quality of the teaching-learning process at universities in the former Yugoslav countries and abroad depends mainly on infrastructure, which includes optimal teaching space, personnel, and equipment in accordance with the standards and norms at the cantonal or entity level required to adequately implement the educational curriculum for students from the first to the sixth year under the Bologna studying concept. For all of this, adequate funding must be ensured. Technologies (medical and information, including communications) have a special role and value in ensuring the quality of medical education at universities and their organizational units (faculties). The "Splitska inicijativa" project, which started 6 years ago as a simple effort to exchange experiences in applying a new model of education based on the Bologna studying concept and other types of undergraduate and postgraduate education, proved a good way to improve the theory and practice of family medicine as an academic and scientific discipline. The scope of this year's scientific meeting, held in Sarajevo on 24th and 25th March 2017, was quality assessment of theoretical and practical education, and also evaluation of knowledge by student exams (a-y).
Implementing a Data Quality Strategy to Simplify Access to Data
NASA Astrophysics Data System (ADS)
Druken, K. A.; Trenham, C. E.; Evans, B. J. K.; Richards, C. J.; Wang, J.; Wyborn, L. A.
2016-12-01
To ensure seamless programmatic access for data analysis (including machine learning), standardization of both data and services is vital. At the Australian National Computational Infrastructure (NCI) we have developed a Data Quality Strategy (DQS) that currently provides processes for: (1) the consistency of data structures in the underlying High Performance Data (HPD) platform; (2) quality control through compliance with recognized community standards; and (3) data quality assurance through demonstrated functionality across common platforms, tools and services. NCI hosts one of Australia's largest repositories (10+ PBytes) of research data collections spanning datasets from climate, coasts, oceans and geophysics through to astronomy, bioinformatics and the social sciences. A key challenge is the application of community-agreed data standards to the broad set of Earth systems and environmental data that are being used. Within these disciplines, data span a wide range of gridded, ungridded (i.e., line surveys, point clouds), and raster image types, as well as diverse coordinate reference projections and resolutions. By implementing our DQS we have seen progressive improvement in the quality of the datasets across the different subject domains, and through this, the ease by which the users can programmatically access the data, either in situ or via web services. As part of its quality control procedures, NCI has developed a compliance checker based upon existing domain standards. The DQS also includes extensive Functionality Testing, which includes readability by commonly used libraries (e.g., netCDF, HDF, GDAL); accessibility by data servers (e.g., THREDDS, Hyrax, GeoServer); validation against scientific analysis and programming platforms (e.g., Python, Matlab, QGIS); and visualization tools (e.g., ParaView, NASA Web World Wind). 
These tests ensure smooth interoperability between products and services as well as exposing unforeseen requirements and dependencies. The results provide an important component of quality control within the DQS as well as clarifying the requirement for any extensions to the relevant standards that help support the uptake of data by broader international communities.
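A compliance checker of the kind described can be sketched as a scan of a dataset's global metadata for required attributes. The attribute set below loosely follows CF/ACDD-style global attributes, but the exact list and the `Conventions` rule are assumptions for illustration, not NCI's actual checker rules.

```python
# Minimal metadata compliance sketch: report missing required global
# attributes and an unexpected Conventions value. The required set is an
# illustrative assumption, loosely modeled on CF/ACDD global attributes.

REQUIRED_GLOBAL_ATTRS = {"title", "institution", "source", "history", "Conventions"}

def check_compliance(global_attrs):
    """Return a list of human-readable problems; an empty list means compliant."""
    problems = []
    for name in sorted(REQUIRED_GLOBAL_ATTRS - set(global_attrs)):
        problems.append(f"missing global attribute: {name}")
    conventions = global_attrs.get("Conventions", "")
    if conventions and not conventions.startswith("CF-"):
        problems.append(f"unrecognized Conventions value: {conventions!r}")
    return problems
```

In practice the attributes would be read from the file itself (e.g., via a netCDF library) and checked against the full community standard; the sketch shows only the rule-evaluation step.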
Trigeminal neuralgia--a coherent cross-specialty management program.
Heinskou, Tone; Maarbjerg, Stine; Rochat, Per; Wolfram, Frauke; Jensen, Rigmor Højland; Bendtsen, Lars
2015-01-01
Optimal management of patients with classical trigeminal neuralgia (TN) requires specific treatment programs and close collaboration between medical, radiological and surgical specialties. Organization of such treatment programs has never been described before. With this paper we aim to describe the implementation and feasibility of an accelerated cross-speciality management program, to describe the collaboration between the involved specialties and to report the patient flow during the first 2 years after implementation. Finally, we aim to stimulate discussions about optimal management of TN. Based on collaboration between neurologists, neuroradiologists and neurosurgeons a standardized program for TN was implemented in May 2012 at the Danish Headache Center (DHC). First out-patient visit and subsequent 3.0 Tesla MRI scan was booked in an accelerated manner. The MRI scan was performed according to a special TN protocol developed for this program. Patients initially referred to neurosurgery were re-directed to DHC for pre-surgical evaluation of diagnosis and optimization of medical treatment. Follow-up was 2 years with fixed visits where medical treatment and indication for neurosurgery was continuously evaluated. Scientific data was collected in a structured and prospective manner. From May 2012 to April 2014, 130 patients entered the accelerated program. Waiting time for the first out-patient visit was 42 days. Ninety-four percent of the patients had a MRI performed according to the special protocol after a mean of 37 days. Within 2 years follow-up 35% of the patients were referred to neurosurgery after a median time of 65 days. Five scientific papers describing demographics, clinical characteristics and neuroanatomical abnormalities were published. 
The described cross-speciality management program proved to be feasible and to have acceptable waiting times for referral and highly specialized work-up of TN patients in a public tertiary referral centre for headache and facial pain. Early high quality MRI ensured correct diagnosis and that the neurosurgeons had a standardized basis before decision-making on impending surgery. The program ensured that referral of the subgroup of patients in need for surgery was standardized, ensured continuous evaluation of the need for adjustments in pharmacological management and formed the basis for scientific research.
Practices in NASA's EOSDIS to Promote Open Data and Research Integrity
NASA Astrophysics Data System (ADS)
Behnke, J.; Ramapriyan, H.
2017-12-01
The purpose of this paper is to highlight the key practices adopted by NASA in its Earth Observing System Data and Information System (EOSDIS) to promote and facilitate open data and research integrity. EOSDIS is the system that manages most of NASA's Earth science data from various sources - satellites, aircraft, field campaigns and some research projects. Since its inception in 1990 as a part of the Earth Observing System (EOS) Program, EOSDIS has been following NASA's free and open data and information policy, whereby data are shared with all users on a non-discriminatory basis and are provided at no cost. To ensure that the data are discoverable and accessible to the user community, NASA follows an evolutionary development approach, whereby the latest technologies that can be practically adopted are infused into EOSDIS. This results in continuous improvements in system capabilities such that technologies that users are accustomed to in other environments are brought to bear in their access to NASA's Earth observation data. Mechanisms have existed for ensuring that the data products offered by EOSDIS are vetted by the community before they are released. Information about data products such as Algorithm Theoretical Basis Documents and quality assessments are openly available with the products. The EOSDIS Distributed Active Archive Centers (DAACs) work with the science teams responsible for product generation to assist with proper use of metadata. The DAACs have knowledgeable staff to answer users' questions and have access to scientific experts as needed. Citation of data products in scientific papers is facilitated by the assignment of Digital Object Identifiers (DOIs) - at present, over 50% of data products in EOSDIS have been assigned DOIs. NASA gathers and publishes citation metrics for the datasets offered by the DAACs. 
Through its Software and Services Citations Working Group, NASA is currently investigating broadening DOI assignments to promote greater provenance traceability. NASA has developed Preservation Content Specifications for Earth science data to ensure that provenance and context are captured and preserved for the future and is applying them to data and information from its missions. All these actions promote the availability of information that supports integrity in scientific research.
Comparative Analysis of Sustainable Approaches and Systems for Scientific Data Stewardship
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2012-12-01
Sustainable data systems are critical components of the cyberinfrastructure needed to provide long-term stewardship of scientific data, including Earth science data, throughout their entire life cycle. A variety of approaches may help ensure the sustainability of such systems, but these approaches must be able to survive the demands of competing priorities and decreasing budgets. Analyzing and comparing alternative approaches can identify viable aspects of each approach and inform decisions for developing, managing, and supporting the cyberinfrastructure needed to facilitate discovery, access, and analysis of data by future communities of users. A typology of sustainability approaches is proposed, and example use cases are offered for comparing the approaches over time. These examples demonstrate the potential strengths and weaknesses of each approach under various conditions and with regard to different objectives, e.g., open vs. limited access. By applying the results of these analyses to their particular circumstances, systems stakeholders can assess their options for a sustainable systems approach along with other metrics and identify alternative strategies to ensure the sustainability of the scientific data and information for which they are responsible. In addition, comparing sustainability approaches should inform the design of new systems and the improvement of existing systems to meet the needs for long-term stewardship of scientific data, and support education and workforce development efforts needed to ensure that the appropriate scientific and technical skills are available to operate and further develop sustainable cyberinfrastructure.
Quality standards of the European Pharmacopoeia.
Bouin, Anne-Sophie; Wierer, Michael
2014-12-02
The European Pharmacopoeia (Ph. Eur.) provides a legal and scientific reference for the quality control of medicines. It is legally binding in the 38 signatory parties of the Convention on the elaboration of a European Pharmacopoeia (37 member states and the European Union). The requirements for a specific herbal drug are prescribed in the corresponding individual monograph and the relevant general monographs. Criteria for pesticides and heavy metals, for example, are defined in the general monograph on Herbal drugs. The Ph. Eur. also provides general methods, including methods for the determination of aflatoxin B1 and ochratoxin A. Screening methods for aristolochic acids are applied to herbal drugs that may be subject to adulteration or substitution with plant material containing aristolochic acids. The Ph. Eur. collaborates with the European Medicines Agency (EMA) in many areas to ensure that their respective work programmes and approaches remain aligned. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Quality assurance for the query and distribution systems of the RCSB Protein Data Bank
Bluhm, Wolfgang F.; Beran, Bojan; Bi, Chunxiao; Dimitropoulos, Dimitris; Prlić, Andreas; Quinn, Gregory B.; Rose, Peter W.; Shah, Chaitali; Young, Jasmine; Yukich, Benjamin; Berman, Helen M.; Bourne, Philip E.
2011-01-01
The RCSB Protein Data Bank (RCSB PDB, www.pdb.org) is a key online resource for structural biology and related scientific disciplines. The website is used on average by 165,000 unique visitors per month, and more than 2000 other websites link to it. The amount and complexity of PDB data, as well as the expectations on its usage, are growing rapidly. Therefore, ensuring the reliability and robustness of the RCSB PDB query and distribution systems is crucially important and increasingly challenging. This article describes quality assurance for the RCSB PDB website at several distinct levels, including: (i) hardware redundancy and failover, (ii) testing protocols for weekly database updates, (iii) testing and release procedures for major software updates and (iv) miscellaneous monitoring and troubleshooting tools and practices. As such, it provides suggestions for how other websites might be operated. Database URL: www.pdb.org PMID:21382834
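The weekly-update testing in (ii) can be caricatured with a minimal sanity check: a growing archive such as the PDB should rarely shrink, so a large week-over-week drop in entry count suggests a broken update. This is a hedged sketch, not the RCSB PDB's actual test suite; the function name and the 1% tolerance are assumptions:

```python
def update_is_sane(old_count: int, new_count: int,
                   max_drop_fraction: float = 0.01) -> bool:
    """Flag a weekly update whose entry count shrinks beyond a small tolerance.

    A small drop can be legitimate (obsoleted entries); a large one
    usually means the update pipeline failed part-way through.
    """
    if old_count <= 0:
        raise ValueError("old_count must be positive")
    if new_count >= old_count:
        return True  # archive grew or held steady: the expected case
    return (old_count - new_count) / old_count <= max_drop_fraction
```

In practice such a check would run after the staging database is rebuilt and before the public site is switched over to it.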
Martínez-Flores, Francisco; Sandoval-Zamora, Hugo; Machuca-Rodriguez, Catalina; Barrera-López, Araceli; García-Cavazos, Ricardo; Madinaveitia-Villanueva, Juan Antonio
2016-01-01
Tissue storage is a medical process whose regulation and harmonisation are still under way in the scientific world. International standards require ensuring the safety and efficacy of human allografts such as skin and other tissues. The activities of skin and tissue banks currently involve recovery, processing, storage and distribution, which benefit directly from the technological and scientific advances of current biomedical sciences. A description is presented of the operational model of the Skin and Tissue Bank at INR as a successful case of procurement, recovery and preservation of skin and tissues for therapeutic use, with high safety and biological quality. The essential and standard guidelines are presented as keystones for a tissue recovery program based on scientific evidence and set within an ethical and legal framework, and a model is proposed for a complete overview of tissue and organ donation programs in Mexico. Finally, it concludes with essential proposals for improving the efficacy of organ and tissue transplantation programs. Copyright © 2015 Academia Mexicana de Cirugía A.C. Published by Masson Doyma México S.A. All rights reserved.
[TOPICS-MDS: a versatile resource for generating scientific and social knowledge for elderly care].
van den Brink, Danielle; Lutomski, Jennifer E; Qin, Li; den Elzen, Wendy P J; Kempen, Gertrudis I J M; Krabbe, Paul F M; Steyerberg, Ewout W; Muntinga, Maaike; Moll van Charante, Eric P; Bleijenberg, Nienke; Olde Rikkert, Marcel G M; Melis, René J F
2015-04-01
Developed as part of the National Care for the Elderly Programme (NPO), TOPICS-MDS is a uniform, national database on the health and wellbeing of the older persons and caregivers who participated in NPO-funded projects. The TOPICS-MDS Consortium has gained extensive experience in constructing a standardized questionnaire to collect relevant health care data on quality of life, health services utilization, and informal care use. A proactive approach has been undertaken not only to ensure the standardization and validation of instruments, but also to establish the infrastructure for external data requests. Efforts have been made to promote scientifically and socially responsible use of TOPICS-MDS; data have been available for secondary use since early 2014. Through this data sharing initiative, researchers can explore health issues in a broader framework than may have been possible within individual NPO projects; this broader framework is highly relevant for influencing health policy. In this article, we provide an overview of the development and on-going progress of TOPICS-MDS. We further describe how information derived from TOPICS-MDS can be applied to facilitate future scientific innovations and public health initiatives to improve care for frail older persons and their caregivers.
Promissory accounts of personalisation in the commercialisation of genomic knowledge.
Arribas-Ayllon, Michael; Sarangi, Srikant; Clarke, Angus
2011-01-01
As part of personalised medicine emerging from the human genomics revolution, many websites now offer direct-to-consumer genetic testing. Here, we examine three personal genomics companies (Navigenics, deCODEme and 23andMe), each of which represents contrasting registers of 'personalisation'. We identify three distinctive registers in these websites: a paternalistic (medical) register, a translational (scientific) register and a democratic (consumerist) register. We explore in detail the rhetorical and discourse devices employed in these websites to assess how personalised healthcare is promised to the public. Promising information that will empower prevention of common complex diseases and ensure better quality of life is conflated with promising greater access to personal information. The presence and absence of scientific legitimacy is related to concerns about accuracy and validity on the one side, and fears of paternalism and elitism on the other. Nevertheless, a common strategy uniting these different styles of personalisation is consumer empowerment. Finally, we consider the tension between the drive of translational medicine to make human genomic research practically relevant, and the intrinsic uncertainties of scientific research, and show how, in the commercial domain, future risks are transformed into discourses of promise by concealing these uncertainties.
Overview of engineering activities at the SMA
NASA Astrophysics Data System (ADS)
Christensen, R. D.; Kubo, D. Y.; Rao, Ramprasad
2008-07-01
The Submillimeter Array (SMA) consists of eight 6-meter telescopes on the summit of Mauna Kea. The array has been designed to operate from the summit of Mauna Kea and from three remote facilities: Hilo, Hawaii; Cambridge, Massachusetts; and Taipei, Taiwan. The SMA provides high-resolution scientific observations in most of the major atmospheric windows from 180 to 700 GHz. Each telescope can house up to eight receivers in a single cryostat and can operate with one or two receiver bands simultaneously. With the array being a fully operational observatory, the demand for science time is extremely high. As a result, specific time frames have been set aside during both the day and night for engineering activities. This ensures that the proper amount of time can be spent on maintaining existing equipment or upgrading the system to provide high quality scientific output during nighttime observations. This paper describes the methods employed at the SMA to optimize engineering development of the telescopes and systems such that the time available for scientific observations is not compromised. It also examines some of the tools used to monitor the SMA during engineering and science observations, both at the site and at the remote facilities.
Strategy for earth explorers in global earth sciences
NASA Technical Reports Server (NTRS)
1988-01-01
The goal of the current NASA Earth System Science initiative is to obtain a comprehensive scientific understanding of the Earth as an integrated, dynamic system. The centerpiece of the Earth System Science initiative will be a set of instruments carried on polar orbiting platforms under the Earth Observing System program. An Earth Explorer program can open new vistas in the earth sciences, encourage innovation, and solve critical scientific problems. Specific missions must be rigorously shaped by the demands and opportunities of high quality science and must complement the Earth Observing System and the Mission to Planet Earth. The committee believes that the proposed Earth Explorer program provides a substantial opportunity for progress in the earth sciences, both through independent missions and through missions designed to complement the large scale platforms and international research programs that represent important national commitments. The strategy presented is intended to help ensure the success of the Earth Explorer program as a vital stimulant to the study of the planet.
The Power of Online Community and Citizen Science
NASA Astrophysics Data System (ADS)
Cook, J.; Nuccitelli, D. A.; Winkler, B.; Cowtan, K.; Brimelow, J.
2012-12-01
The Internet offers innovative and creative means of disseminating content. But where the Internet comes into its own is in the non-linear power of community. Not only can communicators interact directly with their audience, more importantly, the audience can network with each other. This enables publishers to build communities rallied around common topics of interest. Online communities lead to exciting opportunities such as citizen science where communities crowd-source the collection or analysis of data. Skeptical Science is a case study in the development of a volunteer community that produces regular content developed within an internal review system that ensures a high level of accuracy and quality. The community also engages with the peer-reviewed literature, submitting responses to peer-reviewed papers, collecting meta-data used in other scientific research and conducting the largest ever survey of climate papers. Thus this online community both contributes to the outreach effort of climate communication and also seeks to add to the body of scientific knowledge.
Iyioha, Ireh
2011-01-01
This paper examines the (in)compatibility between the diagnostic and therapeutic theories of complementary and alternative medicine (CAM) and a science-based regulatory framework. Specifically, the paper investigates the nexus between statutory legitimacy and scientific validation of health systems, with an examination of its impact on the development of complementary and alternative therapies. The paper evaluates competing theories for validating CAM, ranging from the RCT methodology to anthropological perspectives, and contends that while the RCT method might be beneficial in the regulation of many CAM therapies, dogmatic adherence to this paradigm as the exclusive method for legitimizing CAM will be adverse to the independent development of many CAM therapies whose philosophies and mechanisms of action are not scientifically interpretable. Drawing on history and research evidence to support this argument, the paper argues for a regulatory model that accommodates different evidential paradigms in support of a pluralistic healthcare system that balances the imperative of quality assurance with the need to ensure access. PMID:20953428
Focus Group in Community Mental Health Research: Need for Adaption.
Zupančič, Vesna; Pahor, Majda; Kogovšek, Tina
2018-04-27
The article presents an analysis of the use of focus groups in researching community mental health users, starting with the reasons for using them, their implementation in mental health service users' research, and the adaptations of focus group use when researching the experiences of users. Based on personal research experience and a review of scientific publications in the Google Scholar, Web of Science, ProQuest, EBSCOhost, and Scopus databases, 20 articles published between 2010 and 2016 were selected for targeted content analysis. A checklist for reporting on the use of focus groups with community mental health service users was developed, aiming to improve comparability, verifiability and validity. Adaptations of the implementation of focus groups in relation to participants' characteristics were suggested. Focus groups are not only useful as a scientific research technique, but also for ensuring service users' participation in decision-making in community mental health and for evaluating the quality of the mental health system and services.
Evidence based policy making in the European Union: the role of the scientific community.
Majcen, Špela
2017-03-01
In the times when the acquis of the European Union (EU) has developed so far as to reach a high level of technical complexity, in particular in certain policy fields such as environmental legislation, it is important to look at what kind of information and data policy decisions are based on. This position paper looks at the extent to which evidence-based decision-making process is being considered in the EU institutions when it comes to adopting legislation in the field of environment at the EU level. The paper calls for closer collaboration between scientists and decision-makers in view of ensuring that correct data is understood and taken into consideration when drafting, amending, negotiating and adopting new legal texts at all levels of the EU decision-making process. It concludes that better awareness of the need for such collaboration among the decision-makers as well as the scientific community would benefit the process and quality of the final outcomes (legislation).
Equality of opportunities in geosciences: The EGU Awards Committee experience
NASA Astrophysics Data System (ADS)
Karatekin, Özgür
2017-04-01
Scientists are evaluated on the basis of creativity and productivity, and their scientific excellence is rewarded by scientific associations. Providing equal opportunities and ensuring balance is a strict necessity when recognizing scientific excellence. The processes and procedures that lead to the recognition of excellence have to be transparent and free of gender biases. However, establishing clear and transparent evaluation criteria and performance metrics that provide equal opportunities to researchers across genders, continents and ethnic groups can be challenging, since the definition of scientific excellence is elusive. This talk presents the experience and efforts of the European Geosciences Union to ensure balance, with a particular focus on gender balance. Data and statistics will be presented in an attempt to provide constructive indications for reaching the target of equal opportunities for researchers across genders, continents and ethnic groups.
Potency testing of veterinary vaccines: the way from in vivo to in vitro.
Romberg, Judith; Lang, Stefan; Balks, Elisabeth; Kamphuis, Elisabeth; Duchow, Karin; Loos, Daniela; Rau, Henriette; Motitschke, Andreas; Jungbäck, Carmen
2012-01-01
Current quality control of inactivated animal vaccines still focuses on the potency of final products in a batch-wise manner. Animal welfare concerns as well as scientific considerations have led to the '3Rs-concept' that comprises the refinement of animal procedures, the reduction of animal numbers, and the replacement of animal models. Although the 3Rs-concept has been widely accepted as a fundamental principle, the number of approved alternatives for in vivo tests is still limited. To promote further progress, the international scientific workshop 'Potency Testing of Veterinary Vaccines: The Way from in vivo to in vitro' was held at the Paul-Ehrlich-Institut in Langen, Germany, on 01-03 December 2010. More than 130 participants from industry, academia and regulatory authorities discussed the current state of the 3Rs-concept, examples of its successful implementation as well as still existing hurdles. Special emphasis was laid on the 'consistency approach' that aims to ensure relevant quality attributes of vaccine batches by in vitro analyses during production rather than by in vivo potency tests on the final product. This report provides an overview of the insights gained, including the recommendations produced at the end of the workshop. Copyright © 2011. Published by Elsevier Ltd.. All rights reserved.
The Mauna Kea Weather Center: Custom Atmospheric Forecasting Support for Mauna Kea
NASA Astrophysics Data System (ADS)
Businger, Steven
2011-03-01
The success of operations at Mauna Kea Observatories is strongly influenced by weather conditions. The Mauna Kea Weather Center, an interdisciplinary research program, was established in 1999 to develop and provide custom weather support for Mauna Kea Observatories. The operational forecasting goals of the program are to facilitate the best possible use of favorable atmospheric conditions for scientific benefit and to ensure operational safety. During persistent clear periods, astronomical observing quality varies substantially due to changes in the vertical profiles of temperature, wind, moisture, and turbulence. Cloud and storm systems occasionally cause adverse or even hazardous conditions. A dedicated, daily, real-time mesoscale numerical modeling effort provides crucial forecast guidance in both cases. Several key atmospheric variables are forecast with sufficient skill to be of operational and scientific benefit to the telescopes on Mauna Kea. Summit temperature forecasts allow mirrors to be set to the ambient temperature to reduce image distortion. Precipitable water forecasts allow infrared observations to be prioritized according to atmospheric opacity. Forecasts of adverse and hazardous conditions protect the safety of personnel and allow for scheduling of maintenance when observing is impaired by cloud. The research component of the project continues to improve the accuracy and content of the forecasts. In particular, case studies have resulted in operational forecasts of astronomical observing quality, or seeing.
36 CFR § 1260.38 - How does the NDC ensure the quality of declassification reviews?
Code of Federal Regulations, 2013 CFR
2013-07-01
36 CFR § 1260.38, within Title 36 (Parks, Forests, and Public Property), under the regulations on declassification of national security information concerning the National Declassification Center (NDC), addresses how the NDC ensures the quality of declassification reviews.
Scientific Software - the role of best practices and recommendations
NASA Astrophysics Data System (ADS)
Fritzsch, Bernadette; Bernstein, Erik; Castell, Wolfgang zu; Diesmann, Markus; Haas, Holger; Hammitzsch, Martin; Konrad, Uwe; Lähnemann, David; McHardy, Alice; Pampel, Heinz; Scheliga, Kaja; Schreiber, Andreas; Steglich, Dirk
2017-04-01
In Geosciences - like in most other communities - scientific work strongly depends on software. For big data analysis, existing (closed or open source) program packages are often mixed with newly developed codes. Different versions of software components and varying configurations can influence the result of data analysis. This often makes reproducibility of results and reuse of codes very difficult. Policies for publication and documentation of used and newly developed software, along with best practices, can help tackle this problem. Within the Helmholtz Association a Task Group "Access to and Re-use of scientific software" was implemented by the Open Science Working Group in 2016. The aim of the Task Group is to foster the discussion about scientific software in the Open Science context and to formulate recommendations for the production and publication of scientific software, ensuring open access to it. As a first step, a workshop gathered interested scientists from institutions across Germany. The workshop brought together various existing initiatives from different scientific communities to analyse current problems, share established best practices and come up with possible solutions. The subjects in the working groups covered a broad range of themes, including technical infrastructures, standards and quality assurance, citation of software and reproducibility. Initial recommendations are presented and discussed in the talk. They are the foundation for further discussions in the Helmholtz Association and the Priority Initiative "Digital Information" of the Alliance of Science Organisations in Germany. The talk aims to inform about the activities and to link with other initiatives on the national or international level.
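One concrete best practice for the version-sensitivity problem described above is to record an environment fingerprint alongside every analysis run, so that a result can later be traced back to the exact software stack that produced it. A minimal sketch, assuming a simple JSON record with a short digest for quick comparison (the field names are illustrative, not a community standard):

```python
import hashlib
import json
import platform
import sys

def environment_fingerprint(package_versions: dict) -> dict:
    """Capture interpreter, platform, and package versions for an analysis run.

    A short digest of the record lets two runs be compared at a glance:
    identical digests imply identical recorded environments.
    """
    record = {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "packages": dict(sorted(package_versions.items())),
    }
    # Canonical serialization (sorted keys) so the digest is deterministic.
    blob = json.dumps(record, sort_keys=True).encode("utf-8")
    record["digest"] = hashlib.sha256(blob).hexdigest()[:12]
    return record
```

Stored next to each published result, such a record makes it possible to at least detect, if not fully reproduce, the software configuration behind a figure or dataset.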
Maxim, Laura; van der Sluijs, Jeroen P.
2014-01-01
In regulatory toxicology, quality assessment of in vivo studies is a critical step in assessing chemical risks. It is crucial for protecting public health that studies considered suitable for regulating chemicals are robust. Current procedures for conducting quality assessments in safety agencies are not structured, clear or consistent. This leaves room for criticism about lack of transparency, subjective influence and the potential for insufficient protection provided by the resulting safety standards. We propose a tool called “Qualichem in vivo” that is designed to systematically and transparently assess the quality of in vivo studies used in chemical health risk assessment. We demonstrate its use here with 12 experts, using two controversial studies on Bisphenol A (BPA) that played an important role in BPA regulation in Europe. The results obtained with Qualichem contradict the quality assessments conducted by expert committees in safety agencies for both of these studies. Furthermore, they show that reliance on standardized guidelines to ensure scientific quality is only partially justified. Qualichem allows experts with different disciplinary backgrounds and professional experiences to express their individual and sometimes divergent views, an improvement over the current way of dealing with minority opinions. It provides a transparent framework for expressing an aggregated, multi-expert level of confidence in a study, and allows a simple graphical representation of how well the study integrates the best available scientific knowledge. Qualichem can be used to compare assessments of the same study by different health agencies, increasing transparency and trust in the work of expert committees. In addition, it may be used in the systematic evaluation of in vivo studies submitted by industry in the dossiers required for compliance with the REACH Regulation.
Qualichem provides a balanced, common framework for assessing the quality of studies that may or may not be following standardized guidelines. PMID:24489958
Xiong, Xi; He, Ya-Nan; Feng, Bi; Pan, Yuan; Zhang, Hai-Zhu; Ke, Xiu-Mei; Zhang, Yi; Yang, Ming; Han, Li; Zhang, Ding-Kun
2018-05-10
Nowadays, breast disorders increasingly affect women's health. In China, Xiaojin Pills are commonly used in the treatment of breast diseases. Doctors have concluded that the combined use of Xiaojin Pills with conventional therapy can significantly improve efficacy with fewer side effects. However, the prescription of Xiaojin Pills is complex, and current quality control methods cannot fully ensure their quality. On the basis of its mechanism, our study combined chemical evaluation and biological evaluation to identify the anti-inflammatory markers of Xiaojin Pills. In this manuscript, 13 compounds in Xiaojin Pills were quantified. At the same time, the cyclooxygenase-2 inhibition rates of different Xiaojin Pills were measured and the possible markers were screened by spectrum-effect relationship. Further, the anti-inflammatory activities of the markers were verified and the protein interaction network was analyzed, identifying protocatechuate, beta-boswellic acid and levistilide A as the anti-inflammatory quality markers of Xiaojin Pills. We hope our study can provide a scientific theoretical basis for accurate quality control of Xiaojin Pills, reasonable suggestions for pharmaceutical companies, and new ideas for the quality control of other medicines.
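Spectrum-effect screening of the kind described, relating each compound's measured content across samples to the corresponding cyclooxygenase-2 inhibition rate, is at heart a correlation analysis. A minimal sketch using Pearson's r; this is an illustration of the general technique, not the authors' actual pipeline:

```python
import math

def pearson_r(contents, inhibition_rates):
    """Pearson correlation between one compound's content and the
    measured inhibition rate across a set of samples."""
    n = len(contents)
    if n != len(inhibition_rates) or n < 2:
        raise ValueError("need two equal-length series with n >= 2")
    mx = sum(contents) / n
    my = sum(inhibition_rates) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(contents, inhibition_rates))
    sx = math.sqrt(sum((x - mx) ** 2 for x in contents))
    sy = math.sqrt(sum((y - my) ** 2 for y in inhibition_rates))
    return cov / (sx * sy)
```

Compounds whose content correlates strongly (positively) with the bioactivity across batches become candidate quality markers, which are then verified experimentally as the abstract describes.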
NASA Astrophysics Data System (ADS)
Nielsen, S. Suzanne
Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. 
Such official methods are critical in the analysis of foods, to ensure that they meet the legal requirements established by governmental agencies. Government regulations and international standards most relevant to the analysis of foods are mentioned here but covered in more detail in Chap. 2, and nutrition labeling regulations in the USA are covered in Chap. 3. Internet addresses for many of the organizations and government agencies discussed are given at the end of this chapter.
Mayer, S
2006-11-01
To determine whether authors of scientific publications in molecular biology declare patents and other potential financial interests. Survey of a 6-month sample of papers related to molecular biology in Nature. The esp@cenet worldwide patent search engine was used to search for patents applied for by the authors of scientific papers in Nature that were related to molecular biology and genetics, between January and June 2005. Of the 79 papers considered, four had declared that certain authors had competing financial interests. Seven papers in which no financial interests were declared had authors with patent applications that were based on the research in the paper or were closely related to it. Another paper had two authors with connections to biotechnology companies that were not disclosed. Two thirds of the papers in which authors had patent applications or company affiliations that might be considered to be competing financial interests did not disclose them. Failure to disclose such information may have negative implications on the perception of science in society and on its quality if the possible bias is hidden. Journals should make greater efforts to ensure full disclosure, and scientific institutions should consider failure to disclose financial interests as an example of scientific malpractice. Establishing a register of interests for scientists is one way to increase transparency and openness.
Mayer, S
2006-01-01
Objectives To determine whether authors of scientific publications in molecular biology declare patents and other potential financial interests. Design Survey of a 6‐month sample of papers related to molecular biology in Nature. Methods The esp@cenet worldwide patent search engine was used to search for patents applied for by the authors of scientific papers in Nature that were related to molecular biology and genetics, between January and June 2005. Results Of the 79 papers considered, four had declared that certain authors had competing financial interests. Seven papers in which no financial interests were declared had authors with patent applications that were based on the research in the paper or were closely related to it. Another paper had two authors with connections to biotechnology companies that were not disclosed. Conclusion Two thirds of the papers in which authors had patent applications or company affiliations that might be considered to be competing financial interests did not disclose them. Failure to disclose such information may have negative implications on the perception of science in society and on its quality if the possible bias is hidden. Journals should make greater efforts to ensure full disclosure, and scientific institutions should consider failure to disclose financial interests as an example of scientific malpractice. Establishing a register of interests for scientists is one way to increase transparency and openness. PMID:17074824
Knowledge which Cannot be Used is Useless.
ERIC Educational Resources Information Center
Cox, Ken
1987-01-01
The medical school is responsible for ensuring that its graduating doctors can apply scientific principles to health problems. That responsibility requires (1) selection of content; (2) presentation of content in appropriate ways; and (3) examination of students. Discussed are procedures for ensuring desired outcomes. (Author/RH)
Study partners should be required in preclinical Alzheimer's disease trials.
Grill, Joshua D; Karlawish, Jason
2017-12-06
In an effort to intervene earlier in Alzheimer's disease (AD), clinical trials are testing promising candidate therapies in preclinical disease. Preclinical AD trial participants are cognitively normal, functionally independent, and autonomous decision-makers. Yet, like AD dementia trials, preclinical trials require dual enrollment of a participant and a knowledgeable informant, or study partner. The requirement of dyadic enrollment is a barrier to recruitment and may present unique ethical challenges. Despite these limitations, the requirement should continue. Study partners may be essential to ensure participant safety and wellbeing, including overcoming distress related to biomarker disclosure and minimizing risk for catastrophic reactions and suicide. The requirement may maximize participant retention and ensure data integrity, including that study partners are the source of data that will ultimately instruct whether a new treatment has a clinical benefit and meaningful impact on the population health burden associated with AD. Finally, study partners are needed to ensure the scientific and clinical value of trials. Preclinical AD will represent a new model of care, in which persons with no symptoms are informed of probable cognitive decline and eventual dementia. The rationale for early diagnosis in symptomatic AD is equally applicable in preclinical AD: to minimize risk, maximize quality of life, and ensure optimal planning and communication. Family members and other sources of support will likely be essential to the goals of this new model of care for preclinical AD patients, and trials must instruct this clinical practice.
Science Education as Public and Social Wealth: The Notion of Citizenship from a European Perspective
ERIC Educational Resources Information Center
Siatras, Anastasios; Koumaras, Panagiotis
2013-01-01
In this paper, (a) we present a framework for developing a science content (i.e., science concepts, scientific methods, scientific mindset, and problem-solving strategies for socio-scientific issues) used to design the new Cypriot science curriculum aiming at ensuring a democratic and human society, (b) we use the previous framework to explore the…
ERIC Educational Resources Information Center
Zhang, M.
2013-01-01
The abundant scientific resources on the Web provide great opportunities for students to expand their science learning, yet easy access to information does not ensure learning. Prior research has found that middle school students tend to read Web-based scientific resources in a shallow, superficial manner. A software tool was designed to support…
Data quality can make or break a research infrastructure
NASA Astrophysics Data System (ADS)
Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.
2017-12-01
Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until a first occurrence of the issue. As a result, many data quality control activities within RIs require a manual, human-in-the-loop element, making quality control an expensive activity. Our motivating example is the FLUXNET2015 dataset, a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some with over 20 years of data. About 90% of the human effort to create the dataset was spent on data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality-related checks, we have been drawing on experience with techniques used in software development, which shares a few common constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like scientific datasets, can be large, complex, and full of errors; and both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples.
Terrestrial and aquatic ecosystems research relies heavily on many types of observational data. As the volume of data collection increases, ensuring data quality is becoming an unwieldy challenge for RIs. Business-as-usual approaches to data quality do not work with larger data volumes. We believe RIs can benefit greatly from adapting and imitating this body of theory and practice from software quality in data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
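The idea of translating recurring data problems into automated tests, as in software regression testing, can be sketched as a minimal check registry. This is a hypothetical illustration: the check names, thresholds, and record layout are invented for the sketch and are not part of the FLUXNET2015 tooling.

```python
# Hypothetical sketch: once a data issue has been seen, encode it as a
# named, reusable check that runs automatically on every new submission,
# the way a software bug becomes a regression test.

def check_flux_range(values, lo=-100.0, hi=100.0):
    """Flag indices whose values fall outside a plausible range (illustrative bounds)."""
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]

def check_no_long_gaps(timestamps, max_gap=3600):
    """Flag positions where consecutive timestamps (seconds) differ by more than max_gap."""
    return [i for i in range(1, len(timestamps))
            if timestamps[i] - timestamps[i - 1] > max_gap]

# A registry of checks over a data record; new issues add new entries.
CHECKS = {
    "flux_range": lambda rec: check_flux_range(rec["flux"]),
    "time_gaps": lambda rec: check_no_long_gaps(rec["time"]),
}

def run_checks(record):
    """Run every registered check; a non-empty hit list flags the record for human review."""
    return {name: hits for name, check in CHECKS.items() if (hits := check(record))}
```

A clean record yields an empty report, so only records with hits need the expensive human-in-the-loop step.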
The International Planetary Data Alliance
NASA Astrophysics Data System (ADS)
Stein, T.; Arviset, C.; Crichton, D. J.
2017-12-01
The International Planetary Data Alliance (IPDA) is an association of partners with the aim of improving the quality of planetary science data and services to the end users of space based instrumentation. The specific mission of the IPDA is to facilitate global access to, and exchange of, high quality scientific data products managed across international boundaries. Ensuring proper capture, accessibility and availability of the data is the task of the individual member space agencies. The IPDA was formed in 2006 with the purpose of adopting standards and developing collaborations across agencies to ensure data is captured in common formats. Member agencies include: Armenian Astronomical Society, China National Space Agency (CNSA), European Space Agency (ESA), German Aerospace Center (DLR), Indian Space Research Organization (ISRO), Italian Space Agency (ASI), Japanese Aerospace Exploration Agency (JAXA), National Aeronautics and Space Administration (NASA), National Centre for Space Studies (CNES), Space Research Institute (IKI), UAE Space Agency, and UK Space Agency. The IPDA Steering Committee oversees the execution of projects and coordinates international collaboration. The IPDA conducts a number of focused projects to enable interoperability, construction of compatible archives, and the operation of the IPDA as a whole. These projects have helped to establish the IPDA and to move the collaboration forward. A key project that is currently underway is the implementation of the PDS4 data standard. Given the international focus, it has been critical that the PDS and the IPDA collaborate on its development. Also, other projects have been conducted successfully, including developing the IPDA architecture and corresponding requirements, developing shared registries for data and tools across international boundaries, and common templates for supporting agreements for archiving and sharing data for international missions.
Several projects demonstrating interoperability across systems have been applied to specific missions and data sets. IPDA membership is open to space agencies and scientific research institutes. Representatives who are interested in joining the IPDA should contact the author or use the contact form on the web page http://www.planetarydata.org.
[Validation and verification of microbiology methods].
Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción
2015-01-01
Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained, regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool to provide these guarantees. Activities related to the verification and validation of analytical methods have become very important, as techniques and analytical equipment are continuously developed, updated, and made more complex, and as professionals seek to ensure the quality of processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The paper stresses the importance of promoting the use of reference strains as controls in microbiology and the use of standard controls, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in SEIMC microbiology procedure number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
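As a rough illustration of the accuracy and precision parameters the abstract mentions, the following Python sketch uses two common textbook definitions: trueness as relative bias of the mean against a reference value, and precision as the coefficient of variation of replicates. These are illustrative choices, not necessarily the exact formulas prescribed in SEIMC procedure 48.

```python
import statistics

def relative_bias(measurements, reference):
    """Accuracy/trueness: relative deviation of the replicate mean from a reference value."""
    mean = statistics.mean(measurements)
    return (mean - reference) / reference

def coefficient_of_variation(measurements):
    """Precision: sample standard deviation expressed relative to the mean (CV)."""
    return statistics.stdev(measurements) / statistics.mean(measurements)
```

For example, replicate counts of a reference strain such as [98, 102, 100, 100] against a reference value of 100 give zero bias and a CV of about 1.6%, which a laboratory would then compare against its acceptance criteria.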
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Scientific... for Healthcare Research and Quality (AHRQ), HHS. ACTION: Request for scientific information submissions. SUMMARY: The Agency for Healthcare Research and Quality (AHRQ) is seeking scientific information...
General introduction for the “National Field Manual for the Collection of Water-Quality Data”
2018-02-28
Background: As part of its mission, the U.S. Geological Survey (USGS) collects data to assess the quality of our Nation's water resources. A high degree of reliability and standardization of these data is paramount to fulfilling this mission. Documentation of nationally accepted methods used by USGS personnel serves to maintain consistency and technical quality in data-collection activities. "The National Field Manual for the Collection of Water-Quality Data" (NFM) provides documented guidelines and protocols for USGS field personnel who collect water-quality data. The NFM provides detailed, comprehensive, and citable procedures for monitoring the quality of surface water and groundwater. Topics in the NFM include (1) methods and protocols for sampling water resources, (2) methods for processing samples for analysis of water quality, (3) methods for measuring field parameters, and (4) specialized procedures, such as sampling water for low levels of mercury and organic wastewater chemicals, measuring biological indicators, and sampling bottom sediment for chemistry. Personnel who collect water-quality data for national USGS programs and projects, including projects supported by USGS cooperative programs, are mandated to use protocols provided in the NFM per USGS Office of Water Quality Technical Memorandum 2002.13. Formal training, for example, as provided in the USGS class, "Field Water-Quality Methods for Groundwater and Surface Water," and field apprenticeships supplement the guidance provided in the NFM and ensure that the data collected are high quality, accurate, and scientifically defensible.
[Tasks and duties of veterinary reference laboratories for food borne zoonoses].
Ellerbroek, Lüppo; Alter, T; Johne, R; Nöckler, K; Beutin, L; Helmuth, R
2009-02-01
Reference laboratories are of central importance for consumer protection. Field expertise and high scientific competence are basic requirements for the nomination of a national reference laboratory. To ensure a common approach in the analysis of zoonotic hazards, standards have been developed by the reference laboratories together with national official laboratories on the basis of Art. 33 of Regulation (EC) No. 882/2004. Reference laboratories act as arbiters in cases of ambiguous or disputed results. New methods for detection of zoonotic agents are developed and validated to provide tools for analysis, e.g., in legal cases, if results from different parties are disputed. Besides these tasks, national reference laboratories offer capacity building and advanced training courses and oversee ring trials to ensure consistent quality of analyses in official laboratories. All reference laboratories work according to the ISO standard 17025, which lays down strict laboratory quality rules, and in cooperation with the respective Community Reference Laboratories (CRL). From the group of veterinary reference laboratories for food-borne zoonoses, the national reference laboratories are responsible for Listeria monocytogenes, for Campylobacter, for the surveillance and control of viral and bacterial contamination of bivalve molluscs, for E. coli, for the performance of analysis and tests on zoonoses (Salmonella), and, from the group of parasitological zoonotic agents, the national reference laboratory for Trichinella.
Ensuring the Quality of Stem Cell-Derived In Vitro Models for Toxicity Testing.
Stacey, Glyn N; Coecke, Sandra; Bal-Price, Anna; Healy, Lyn; Jennings, Paul; Wilmes, Anja; Pinset, Christian; Ingelman-Sundberg, Magnus; Louisse, Jochem; Haupt, Simone; Kidd, Darren; Robitski, Andrea; Jahnke, Heinz-Georg; Lemaitre, Gilles; Myatt, Glenn
Quality control of cell cultures used in new in vitro toxicology assays is crucial to the provision of reliable, reproducible and accurate toxicity data on new drugs or constituents of new consumer products. This chapter explores the key scientific and ethical criteria that must be addressed at the earliest stages of developing toxicology assays based on human pluripotent stem cell (hPSC) lines. It also identifies key considerations for such assays to be acceptable for regulatory, laboratory safety and commercial purposes. Also addressed is the development of hPSC-based assays for the tissue and cell types of greatest interest in drug toxicology. The chapter draws on a range of expert opinion within the European Commission/Cosmetics Europe-funded alternative testing cluster SEURAT-1 and consensus from international groups delivering this guidance, such as the International Stem Cell Banking Initiative. Accordingly, the chapter summarizes the most up-to-date best practices in the use and quality control of hPSC lines in the development of in vitro toxicity assays, from leading experts in the field.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-26
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Scientific...: Agency for Healthcare Research and Quality (AHRQ), HHS. ACTION: Request for scientific information submissions. SUMMARY: The Agency for Healthcare Research and Quality (AHRQ) is seeking scientific information...
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; de Sherbinin, A. M.
2017-12-01
Growing recognition of the importance of sharing scientific data more widely and openly has refocused attention on the state of data repositories, including both discipline- or topic-oriented data centers and institutional repositories. Data creators often have several alternatives for depositing and disseminating their natural, social, health, or engineering science data. In selecting a repository for their data, data creators and other stakeholders such as their funding agencies may wish to consider the user community or communities served, the type and quality of data products already offered, and the degree of data stewardship and associated services provided. Some data repositories serve general communities, e.g., those in their host institution or region, whereas others tailor their services to particular scientific disciplines or topical areas. Some repositories are selective when acquiring data and conduct extensive curation and reviews to ensure that data products meet quality standards. Many repositories have secured credentials and established a track record for providing trustworthy, high quality data and services. The NASA Socioeconomic Data and Applications Center (SEDAC) serves users interested in human-environment interactions, including researchers, students, and applied users from diverse sectors. SEDAC is selective when choosing data for dissemination, conducting several reviews of data products and services prior to release. SEDAC works with data producers to continually improve the quality of its open data products and services. As a Distributed Active Archive Center (DAAC) of the NASA Earth Observing System Data and Information System, SEDAC is committed to improving the accessibility, interoperability, and usability of its data in conjunction with data available from other DAACs, as well as other relevant data sources. SEDAC is certified as a Regular Member of the International Council for Science World Data System (ICSU-WDS).
Past Accomplishments and Future Challenges
ERIC Educational Resources Information Center
Danielson, Louis; Doolittle, Jennifer; Bradley, Renee
2005-01-01
Three broad issues continue to dramatically impact the education of children with specific learning disabilities (SLD): (1) the development and implementation of scientifically defensible methods of identification; (2) the development and implementation of scientific interventions to ensure that children with SLD have access to and make progress…
Nanomedicine: Problem Solving to Treat Cancer
ERIC Educational Resources Information Center
Hemling, Melissa A.; Sammel, Lauren M.; Zenner, Greta; Payne, Amy C.; Crone, Wendy C.
2006-01-01
Many traditional classroom science and technology activities often ask students to complete prepackaged labs that ensure that everyone arrives at the same "scientifically accurate" solution or theory, which ignores the important problem-solving and creative aspects of scientific research and technological design. Students rarely have the…
Waller, P; Cassell, J A; Saunders, M H; Stevens, R
2017-03-01
In order to promote understanding of UK governance and assurance relating to electronic health records research, we present and discuss the role of the Independent Scientific Advisory Committee (ISAC) for MHRA database research in evaluating protocols proposing the use of the Clinical Practice Research Datalink. We describe the development of the Committee's activities between 2006 and 2015, alongside growth in data linkage and wider national electronic health records programmes, including the application and assessment processes, and our approach to undertaking this work. Our model can provide independence, challenge and support to data providers such as the Clinical Practice Research Datalink, whose database has been used for well over 1,000 medical research projects. ISAC's role in scientific oversight ensures that feasible and scientifically acceptable plans are in place, while its combination of lay and professional membership addresses governance issues, protecting the integrity of the database and ensuring that public confidence is maintained.
Letting the daylight in: Reviewing the reviewers and other ways to maximize transparency in science
Wicherts, Jelte M.; Kievit, Rogier A.; Bakker, Marjan; Borsboom, Denny
2012-01-01
With the emergence of online publishing, opportunities to maximize transparency of scientific research have grown considerably. However, these possibilities are still only marginally used. We argue for the implementation of (1) peer-reviewed peer review, (2) transparent editorial hierarchies, and (3) online data publication. First, peer-reviewed peer review entails a community-wide review system in which reviews are published online and rated by peers. This ensures accountability of reviewers, thereby increasing the academic quality of reviews. Second, reviewers who write many highly regarded reviews may move to higher editorial positions. Third, online publication of data ensures the possibility of independent verification of inferential claims in published papers. This counters statistical errors and overly positive reporting of statistical results. We illustrate the benefits of these strategies by discussing an example in which the classical publication system has gone awry, namely controversial IQ research. We argue that this case would likely have been avoided using more transparent publication practices. We argue that the proposed system leads to better reviews, meritocratic editorial hierarchies, and a higher degree of replicability of statistical analyses. PMID:22536180
Medical Physics Education at the University of Novi Sad - Serbia
NASA Astrophysics Data System (ADS)
Stanković, Slobodanka; Vesković, Miroslav; Klisurić, Olivera; Spasić, Vesna
2007-04-01
An overview of the new educational program and training in Medical Physics at the University of Novi Sad is presented; medical physics education from undergraduate to doctoral study has been established there over the last decade. The necessity of basic and additional education and hospital training for medical physicists has become evident in clinical practice, where physicists and physicians collaborate closely to ensure high quality of patient care. Learning objectives: to incorporate the latest scientific and professional findings in the fields of medical physics, medical diagnostics, therapy and instruments; to accommodate students' pursuit of individual fields by offering elective courses from different areas of current medical practice; and to reflect the multidisciplinary spirit of the studies, since teaching is performed by experts from diverse fields.
Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael
2016-01-01
Scientific reproducibility is key to scientific progress, as it allows the research community to build on validated results, protects patients from potentially harmful trial drugs derived from incorrect results, and reduces wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility" for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to the new guidelines. Provenance metadata describes the history or origin of data, and it has long been used in computer science to capture metadata for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework, together with a provenance ontology, to support scientific reproducibility by formally modeling a core set of data elements representing the details of a research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both (a) data provenance and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets, with 50,000 studies from 36,000 participants. The provenance ontology reuses concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.
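The distinction between data provenance and process provenance can be sketched with a few PROV-O triples in plain Python (no RDF library). The property IRIs (wasGeneratedBy, wasDerivedFrom, wasAssociatedWith, used) are genuine W3C PROV-O terms, but the study entities and the example namespace are hypothetical and are not taken from the ProvCaRe ontology or NSRR.

```python
# Minimal sketch: provenance of a derived sleep-study variable recorded as
# (subject, predicate, object) triples using PROV-O property IRIs.
PROV = "http://www.w3.org/ns/prov#"
EX = "http://example.org/provcare-demo#"   # hypothetical namespace

def triple(s, p, o):
    """Build a fully qualified triple from local names."""
    return (EX + s, PROV + p, EX + o)

provenance = {
    # Data provenance: where the derived study variable came from.
    triple("variable_AHI", "wasGeneratedBy", "polysomnography_scoring"),
    triple("variable_AHI", "wasDerivedFrom", "raw_sleep_recording"),
    # Process provenance: who performed the activity and what protocol it used.
    triple("polysomnography_scoring", "wasAssociatedWith", "sleep_technician"),
    triple("polysomnography_scoring", "used", "scoring_protocol_v2"),
}

def generated_by(entity):
    """Query: which activities are recorded as generating the given entity?"""
    return {o for s, p, o in provenance
            if s == EX + entity and p == PROV + "wasGeneratedBy"}
```

In a real system these triples would live in an RDF store and be queried with SPARQL; the set-and-function form here only illustrates the modeling idea.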
15 CFR 1180.1 - Purpose and scope.
Code of Federal Regulations, 2014 CFR
2014-01-01
... TECHNICAL INFORMATION SERVICE, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL.... (a) The purpose of this regulation is to facilitate public access to the vast amount of scientific... regulation provides a variety of methods for federal agencies to adopt to ensure the timely transfer to the...
15 CFR 1180.1 - Purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...
15 CFR 1180.1 - Purpose and scope.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...
15 CFR 1180.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...
15 CFR 1180.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...
ERIC Educational Resources Information Center
Schrementi, Laurel N.
2011-01-01
Scientific exploration can happen anywhere in a classroom full of eager learners. By dedicating time for reflection and planning, teachers can feel empowered to make small changes to classroom spaces to increase their students' scientific experiences. At the museum, teachers believe that by ensuring that the environment is richly stocked with a…
Science Support: The Building Blocks of Active Data Curation
NASA Astrophysics Data System (ADS)
Guillory, A.
2013-12-01
While the scientific method is built on reproducibility and transparency, and results are published in peer-reviewed literature, we have entered the digital age of very large datasets (now of the order of petabytes and soon exabytes) that cannot be published in the traditional way. To preserve reproducibility and transparency, active curation is necessary to keep and protect the information in the long term, and 'science support' activities provide the building blocks for active data curation. With the explosive growth of data in all fields in recent years, there is a pressing need for data centres to provide adequate services to ensure long-term preservation and digital curation of project data outputs, however complex those may be. Science support provides advice and support to science projects on data and information management, from file formats through to general data management awareness. Another purpose of science support is to raise awareness in the science community of data and metadata standards and best practice, engendering a culture in which data outputs are seen as valued assets. At the heart of science support is the Data Management Plan (DMP), which sets out a coherent approach to data issues pertaining to the data-generating project. It provides an agreed record of the data management needs and issues within the project. The DMP is agreed upon with project investigators to ensure that a high-quality, documented data archive is created. It includes conditions of use and deposit to clearly express the ownership, responsibilities and rights associated with the data. Project-specific needs are also identified for data processing, visualization tools and data sharing services.
As part of the National Centre for Atmospheric Science (NCAS) and National Centre for Earth Observation (NCEO), the Centre for Environmental Data Archival (CEDA) fulfills this science support role, facilitating atmospheric and Earth observation data-generating projects to ensure successful management of the data and accompanying information for reuse and repurposing. Specific examples at CEDA include science support provided to FAAM (Facility for Airborne Atmospheric Measurements) aircraft campaigns and large-scale modelling projects such as UPSCALE, the largest ever PRACE (Partnership for Advanced Computing in Europe) computational project, dependent on CEDA to provide the high-performance storage, transfer capability and data analysis environment on the 'super-data-cluster' JASMIN. The impact of science support on scientific research is conspicuous: better-documented datasets with a growing collection of metadata associated with the archived data, easier data sharing through the use of standards for formats and metadata, and data citation. These establish high-quality data management, ensuring long-term preservation and enabling reuse by peer scientists, which ultimately leads to faster-paced progress in science.
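The elements a Data Management Plan records (conditions of use, ownership and responsibilities, project-specific processing and sharing needs) could be captured and sanity-checked like this. The field names are invented for the sketch and are not a CEDA or NCAS schema.

```python
import json

# Hypothetical DMP record: the kinds of fields a data centre and a project
# might agree on. Field names are illustrative only.
dmp = {
    "project": "example-aircraft-campaign",
    "data_products": [
        {"name": "core-instrument-timeseries", "format": "NetCDF",
         "metadata_standard": "CF conventions"},
    ],
    "conditions_of_use": {"licence": "open", "citation_required": True},
    "responsibilities": {"producer": "project PI", "archive": "data centre"},
    "services": ["visualisation", "data sharing"],
}

def missing_fields(plan, required=("project", "data_products", "conditions_of_use")):
    """A simple completeness check an archive might run before accepting a DMP."""
    return [f for f in required if f not in plan]
```

Keeping the plan in a machine-readable form lets the archive validate it automatically and serialize it (e.g., as JSON) alongside the deposited data.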
Sheldon, Elizabeth; Vo, Kim Chi; McIntire, Ramsey A; Aghajanova, Lusine; Zelenko, Zara; Irwin, Juan C; Giudice, Linda C
2011-05-01
To develop a standard operating procedure (SOP) for collection, transport, storage of human endometrial tissue and blood samples, subject and specimen annotation, and establishing sample priorities. The SOP synthesizes sound scientific procedures, the literature on ischemia research, sample collection and gene expression profiling, good laboratory practices, and the authors' experience of workflow and sample quality. The National Institutes of Health, University of California, San Francisco, Human Endometrial Tissue and DNA Bank. Women undergoing endometrial biopsy or hysterectomy for nonmalignant indications. Collecting, processing, storing, distributing endometrial tissue and blood samples under approved institutional review board protocols and written informed consent from participating subjects. Standard operating procedure. The SOP addresses rigorous and consistent subject annotation, specimen processing and characterization, strict regulatory compliance, and a reference for researchers to track collection and storage times that may influence their research. The comprehensive and systematic approach to the procurement of human blood and endometrial tissue in this SOP ensures the high quality, reliability, and scientific usefulness of biospecimens made available to investigators by the National Institutes of Health, University of California, San Francisco, Human Endometrial Tissue and DNA Bank. The detail and perspective in this SOP also provides a blueprint for implementation of similar collection programs at other institutions. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Saunders, Gabrielle H; Biswas, Kousick; Serpi, Tracey; McGovern, Stephanie; Groer, Shirley; Stock, Eileen M; Magruder, Kathryn M; Storzbach, Daniel; Skelton, Kelly; Abrams, Thad; McCranie, Mark; Richerson, Joan; Dorn, Patricia A; Huang, Grant D; Fallon, Michael T
2017-11-01
Posttraumatic stress disorder (PTSD) is a leading cause of impairments in quality of life and functioning among Veterans. Service dogs have been promoted as an effective adjunctive intervention for PTSD; however, published research is limited, and design and implementation flaws in published studies limit valid conclusions. This paper describes the rationale for the study design, a detailed methodological description, and implementation challenges of a multisite randomized clinical trial examining the impact of service dogs on the functioning and quality of life of Veterans with PTSD. Trial design considerations prioritized participant and intervention (dog) safety, selection of an intervention comparison group that would optimize enrollment in all treatment arms, pragmatic methods to ensure healthy, well-trained dogs, and the selection of outcomes for achieving scientific and clinical validity in a Veteran PTSD population. Since there is no blueprint for conducting a randomized clinical trial of this size and scope examining the impact of dogs on PTSD, it is our primary intent that the successful completion of this trial will set a benchmark for future trial design and scientific rigor, as well as guide researchers aiming to better understand the role that dogs can have in the management of Veterans experiencing mental health conditions such as PTSD. Published by Elsevier Inc.
Networking seismological data exchange in Europe
NASA Astrophysics Data System (ADS)
Sleeman, Reinoud; van Eck, Torild; van den Hazel, Gert-Jan; Trani, Luca; Spinuso, Alessandro
2010-05-01
The mission of the ORFEUS Data Centre (ODC) is to collect and archive high-quality seismic broadband waveform data from European-Mediterranean organizations and to provide open access to these data for monitoring and research purposes by the scientific community. The core activity of the ODC is to run an automatic, sustainable system to achieve this mission. Our four key operations are: data exchange protocols, quality control procedures, data management and data services. All these activities at the ODC benefit from developments within the EC Infrastructure (I3) project NERIES (Network of Research Infrastructure for European Seismology). For data acquisition the ODC uses different standard, real-time data exchange protocols (e.g. Antelope, SeedLink, Scream) to ensure very high data availability from stations in the Virtual European Broadband Seismic Network (VEBSN), which currently consists of about 500 broadband stations. Within the data services, a number of tools (e.g. Wilber II, NetDC, BreqFast, AutoDRM and web forms) are in place to serve the scientific community. These are currently being complemented by web services and an integrated portal. The data management part relies on a simple flat file structure and a MySQL data management system, on which both ArcLink and the Generic Data Interface (GDI) operate. In this presentation we give an overview of the different aspects of data acquisition, services and management at the ODC.
The GCOS Reference Upper-Air Network (GRUAN)
NASA Astrophysics Data System (ADS)
Vömel, H.; Berger, F. H.; Immler, F. J.; Seidel, D.; Thorne, P.
2009-04-01
While the global upper-air observing network has provided useful observations for operational weather forecasting for decades, its measurements lack the accuracy and long-term continuity needed for understanding climate change. Consequently, the scientific community faces uncertainty on such key issues as the trends of temperature in the upper troposphere and stratosphere or the variability and trends of stratospheric water vapour. To address these shortcomings, and to ensure that future climate records will be more useful than the records to date, the Global Climate Observing System (GCOS) program initiated the GCOS Reference Upper Air Network (GRUAN). GRUAN will be a network of about 30-40 observatories with a representative sampling of geographic regions and surface types. These stations will provide upper-air reference observations of the essential climate variables, i.e. temperature, geopotential, humidity, wind, radiation and cloud properties using specialized radiosondes and complementary remote sensing profiling instrumentation. Long-term stability, quality assurance / quality control, and a detailed assessment of measurement uncertainties will be the key aspects of GRUAN observations. The network will not be globally complete but will serve to constrain and adjust data from more spatially comprehensive global observing systems including satellites and the current radiosonde networks. This paper outlines the scientific rationale for GRUAN, its role in the Global Earth Observation System of Systems, network requirements and likely instrumentation, management structure, current status and future plans.
NASA Technical Reports Server (NTRS)
Tibbitts, T. W. (Principal Investigator)
1986-01-01
This report includes procedures for ensuring the quality of the environment provided for plant growth in controlled environment facilities. Biologists and engineers may use these procedures for ensuring quality control during experiments or for ensuring quality control in the design of plant growth facilities. Environmental monitoring prior to and during experiments is included in these procedures. Specific recommendations cover control, acquisition, and calibration for sensor types for the separate parameters of radiation (light), temperature, humidity, carbon dioxide, and air movement.
No Such Thing as "Good Vibrations" in Science
ERIC Educational Resources Information Center
Lancaster, Franklin D.
2011-01-01
A facilities manager must ensure that a building runs as smoothly and successfully as possible. For college, university, and school managers dealing with laboratories and other spaces for scientific study and research, this means making sure that nothing disrupts experiments and other scientific endeavors. Such disruptions can wreak havoc,…
Chemical annotation of small and peptide-like molecules at the Protein Data Bank
Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.
2013-01-01
Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661
Sexual counseling and cardiovascular disease: practical approaches
Steinke, Elaine E; Jaarsma, Tiny
2015-01-01
Patients with cardiovascular disease and their partners expect health care providers to provide sexual counseling to assist them in maintaining sexual quality of life. Evidence suggests however, that there is a gap in integrating evidence into practice and that relatively few cardiac patients receive sexual counseling. This can result in negative psychological, physical, and quality of life outcomes for couples who may needlessly decide sexual activity is too risky and cease all sexual activity. Two scientific statements now exist that provide ample guidance to health care providers in discussing this important topic. Using a team approach that includes physicians, nurses, physical therapists, rehabilitation staff, and others is important to ensure that sexual counseling occurs throughout recovery. In addition, several trials using interventional approaches for sexual counseling provide insight into successful approaches for sexual counseling in practice. This article provides practical strategies and evidence-based approaches for assessment and sexual counseling for all cardiac patients and their partners, and specific counseling for those with ischemic conditions, heart failure, and implanted devices. PMID:25219908
Technical support for Life Sciences communities on a production grid infrastructure.
Michel, Franck; Montagnat, Johan; Glatard, Tristan
2012-01-01
Production operation of large distributed computing infrastructures (DCI) still requires a lot of human intervention to reach acceptable quality of service. This may be achievable for scientific communities with solid IT support, but it remains a show-stopper for others. Some application execution environments are used to hide runtime technical issues from end users. But they mostly aim at fault-tolerance rather than incident resolution, and their operation still requires substantial manpower. A longer-term support activity is thus needed to ensure sustained quality of service for Virtual Organisations (VO). This paper describes how the biomed VO has addressed this challenge by setting up a technical support team. Its organisation, tooling, daily tasks, and procedures are described. Results are shown in terms of resource usage by end users, amount of reported incidents, and developed software tools. Based on our experience, we suggest ways to measure the impact of the technical support, perspectives to decrease its human cost and make it more community-specific.
China's Air Quality and Respiratory Disease Mortality Based on the Spatial Panel Model.
Cao, Qilong; Liang, Ying; Niu, Xueting
2017-09-18
Background: Air pollution has become an important factor restricting China's economic development and has brought a series of social problems, including the impact of air pollution on the health of residents, which is a topical issue in China. Methods: Taking this spatial imbalance into account, the paper applies a spatial panel data model, with PM2.5 pollution and respiratory disease mortality in 31 Chinese provinces from 2004 to 2008 as the main variables, to study the large-scale spatial effect and impact of air quality on respiratory disease mortality. Results: There is a spatial correlation between the mortality of respiratory diseases across Chinese provinces. This spatial correlation can be explained by the spatial effect of PM2.5 pollution when other variables are controlled for. Conclusions: Compared with the traditional non-spatial model, the spatial model better describes the spatial relationship between the variables, ensures the conclusions are scientific, and can measure the spatial effect between variables.
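The spatial correlation the abstract refers to is conventionally quantified with a global spatial autocorrelation statistic such as Moran's I before a spatial panel model is fitted. As a hypothetical illustration only (not the authors' code or data; the toy weight matrix and values below are invented), a minimal NumPy sketch:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I: spatial autocorrelation of values x under a
    spatial weight matrix W (n x n, zero diagonal, 1 = neighbours)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()          # deviations from the mean
    s0 = W.sum()              # sum of all spatial weights
    return (n / s0) * (z @ W @ z) / (z @ z)

# Toy example: 4 regions on a line; adjacent regions are neighbours.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

high_low = np.array([10.0, 9.0, 2.0, 1.0])  # spatially clustered values
print(round(morans_i(high_low, W), 3))       # → 0.395 (positive autocorrelation)
```

Positive I (as here) indicates that similar values cluster in neighbouring regions, which is the pattern that motivates using a spatial rather than a non-spatial panel model.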
Synthetic Biology: Applications in the Food Sector.
Tyagi, Ashish; Kumar, Ashwani; Aparna, S V; Mallappa, Rashmi H; Grover, Sunita; Batish, Virender Kumar
2016-08-17
Synthetic biology, also termed "genomic alchemy", represents a powerful area of science based on the convergence of the biological sciences with systems engineering. It has been fittingly described as "moving from reading the genetic code to writing it", as it focuses on building, modeling, designing and fabricating novel biological systems using customized gene components that result in artificially created genetic circuitry. The scientifically compelling idea of the technological manipulation of life has been advocated for a long time. Realization of this idea has gained momentum with the development of high-speed automation and the falling cost of gene sequencing and synthesis following the completion of the Human Genome Project. Synthetic biology will certainly be instrumental in shaping the development of areas ranging from biomedicine, biopharmaceuticals and chemical production to food and dairy quality monitoring, packaging and storage of food and dairy products, bioremediation and bioenergy production. However, the potential dangers of using synthetic life forms have to be acknowledged, and the adoption of policies by the scientific community to ensure safe practice while making important advancements in the ever-expanding field of synthetic biology must be fully supported and implemented.
NASA Technical Reports Server (NTRS)
Griffin, Ashley
2017-01-01
The Joint Polar Satellite System (JPSS) Program Office is the supporting organization for the Suomi National Polar-orbiting Partnership (S-NPP) and JPSS-1 satellites. S-NPP carries the following sensors: VIIRS, CrIS, ATMS, OMPS, and CERES, instruments that ultimately produce over 25 data products covering the Earth's weather, oceans, and atmosphere. A team of scientists and engineers from all over the United States documents, monitors, and fixes errors in operational software code or documentation through the algorithm change process (ACP) to ensure the success of the S-NPP and JPSS-1 missions by maintaining the quality and accuracy of the data products the scientific community relies on. This poster will outline the program's ACP, identify the various users and scientific applications of our operational data products, and highlight changes that have been made to the ACP to accommodate operating system upgrades to the JPSS program's Interface Data Processing Segment (IDPS), so that the program is ready for the transition to the 2017 JPSS-1 satellite mission and beyond.
Assessing the Privacy Risks of Data Sharing in Genomics
Heeney, C.; Hawkins, N.; de Vries, J.; Boddington, P.; Kaye, J.
2010-01-01
The protection of identity of participants in medical research has traditionally been guaranteed by the maintenance of the confidentiality of health information through mechanisms such as only releasing data in an aggregated form or after identifying variables have been removed. This protection of privacy is regarded as a fundamental principle of research ethics, through which the support of research participants and the public is maintained. Whilst this traditional model was adopted for genetics and genomics research, and was generally considered broadly fit for purpose, we argue that this approach is increasingly untenable in genomics. Privacy risk assessments need to have regard to the whole data environment, not merely the quality of the dataset to be released in isolation. As sources of data proliferate, issues of privacy protection are increasingly problematic in relation to the release of genomic data. However, we conclude that, by paying careful attention to potential pitfalls, scientific funders and researchers can take an important part in attempts to safeguard the public and ensure the continuation of potentially important scientific research. PMID:20339285
Scientific Advances Shaping the Future Roles of Oncology Nurses.
Wujcik, Debra
2016-05-01
To discuss the recent scientific advances that influence current oncology care and explore the implications of these advances for the future of oncology nursing. Current nursing, medical, and basic science literature; ClinicalTrials.gov. The future of oncology care will be influenced by an aging population and an increasing number of patients diagnosed with cancer. Advancements in molecular sequencing will lead to more clinical trials, targeted therapies, and treatment decisions based on the genetic makeup of both the patient and the tumor. Nurses must stay current with an ever-changing array of targeted therapies and developing science. Nurses will influence cancer care quality, value, cost, and patient satisfaction. It is critical for oncology nurses and nursing organizations to engage with all oncology care stakeholders in identifying the future needs of oncology patients and the environment in which care will be delivered. Nurses themselves must identify the roles that will be needed to ensure a workforce that is adequate in number and well trained to meet the future challenges of care delivery. Copyright © 2016 Elsevier Inc. All rights reserved.
Access to the scientific literature
NASA Astrophysics Data System (ADS)
Albarède, Francis
The Public Library of Science Open Letter (http://www.publiclibraryofscience.org) is a very generous initiative, but, like most similar initiatives since the advent of electronic publishing, it misses the critical aspects of electronic publishing. Ten years ago, a Publisher would be in charge of running a system called a "scientific journal." In such a system, the presence of an Editor and peer Reviewers secures the strength of the science and the rigor of the writing; the Publisher guarantees the professional quality of printing, efficient dissemination, and long-term archiving. Publishing used to be in everyone's best interest, or nearly everyone's. The Publisher, because he/she is financially motivated, ensures widespread dissemination of the journal amongst libraries and individual subscribers. The interest of the Author is that the system guarantees a broad potential readership. The interest of the Reader is that a line is drawn between professionally edited literature, presumably of better quality, and gray literature or home publishing, so that he/she does not waste time going through 'low yield' ungraded information. The Publisher could be a private company, an academic institution, or a scholarly society. My experience is that, when page charges and subscription rates are compounded, journals published by scholarly societies are not necessarily cheaper. The difference between these cases is not the cost of running an office with rents, wages, printing, postage, advertisement, and archiving, but that a private Publisher pays shareholders. Shareholders have the bad habit of minding their own business and, therefore, they may interfere negatively with scientific publishing. Nevertheless, while the stranglehold imposed by private Publishers on our libraries over the last 10 years through increasing subscription rates may in part be due to shareholders' greed, this is true only in part. The increases are also a consequence of the booming number of pages being printed.
Status of NGS CORS Network and Its Contribution to the GGOS Infrastructure
NASA Astrophysics Data System (ADS)
Choi, K. K.; Haw, D.; Sun, L.
2017-12-01
Recent advancements in satellite geodesy techniques can now contribute to the global frame realization needed to improve worldwide accuracies. These techniques rely on coordinates computed using continuously observed GPS data and corresponding satellite orbits. The GPS-based reference system continues to depend on the physical stability of a ground-based network of points as the primary foundation for these observations. NOAA's National Geodetic Survey (NGS) has been operating Continuously Operating Reference Stations (CORS) to provide direct access to the National Spatial Reference System (NSRS). By virtue of its scientific reputation and leadership in national and international geospatial issues, NGS has decided to increase its participation in the maintenance of the U.S. component of the global GPS tracking network in order to realize a long-term stable national terrestrial reference frame. NGS can do so by leveraging its national leadership role coupled with its scientific expertise, designating and upgrading a subset of the current tracking network for this purpose. This subset of stations must meet the highest operational standards to serve dual functions: providing the U.S. contribution to the international frame and the link to the national datum. These stations deserve special attention to ensure that the highest possible levels of quality and stability are maintained. To meet this need, NGS is working with international scientific groups to add and designate these reference stations based on scientific merit, such as colocation with other geodetic techniques, geographic coverage, and monumentation stability.
Open Data and Open Science for better Research in the Geo and Space Domain
NASA Astrophysics Data System (ADS)
Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.
2015-12-01
Main open data principles were worked out in the run-up to, and finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland, in June 2013. Important principles also apply to science data, such as Open Data by Default; Quality and Quantity; Useable by All; Releasing Data for Improved Governance; and Releasing Data for Innovation. There is also an explicit relationship to such high-value areas as Earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-established principles but stands for many initiatives and projects around better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility of scientific communication, and the use of social media to facilitate scientific collaboration. Some projects concentrate on the open sharing of free and open-source software, and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and an open peer-review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here.
The semantic Web based approach for the mashup focuses on the design and implementation of a common but still distributed data catalog based on semantic interoperability, including transparent access to data in relational databases. References:
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdf
http://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf
Reimagining the Pipeline: Advancing STEM Diversity, Persistence, and Success
Allen-Ramdial, Stacy-Ann A.; Campbell, Andrew G.
2014-01-01
Achieving trainee diversity in science, technology, engineering, and mathematics is rapidly becoming a challenge faced by many nations. Success in this area ensures the availability of a workforce capable of engaging in scientific practices that will promote increased production capacity and creativity and will preserve global scientific competitiveness. The near-term vision of achieving this goal is within reach and will capitalize on the growing numbers of underrepresented minority groups in the population. Although many nations have had remarkable histories as leaders in science and technology, few have simultaneously struggled with the challenge of meeting the educational and training needs of underrepresented groups. In this article, we share strategies for building the agency of the scientific community to achieve greater diversity by highlighting four key action areas: (1) aligning institutional culture and climate; (2) building interinstitutional partnerships; (3) building and sustaining critical mass; and (4) ensuring, rewarding, and maximizing faculty involvement. PMID:25561747
The clinical research triad: how can we ensure quality in out-sourced clinical trials?
Strause, L G; Vogel, J R
1999-01-01
The importance of quality within clinical trials cannot be overstated. Built on the foundation of patient care, where quality may simply be understood and expected, the business of conducting clinical trials must evolve to instill quality and ensure that quality is maintained. How that is accomplished within the drug development process is complicated by the relationships among the players: the sponsor, the contractor, and the investigative site. This article discusses the dynamics of the drug development triad from the perspective of the authors. Who are the players, and what is quality from each of their perspectives? Communication among all parties is essential to ensure that quality is maintained. Unfortunately, even with optimal communication, if expectations and goals are not clearly defined, the results may be unsatisfactory. The vision and values of each player contribute to the success of the relationship and the quality of the service.
Position of the American Dietetic Association: functional foods.
Hasler, Clare M; Brown, Amy C
2009-04-01
All foods are functional at some physiological level, but it is the position of the American Dietetic Association (ADA) that functional foods, which include whole foods and fortified, enriched, or enhanced foods, have a potentially beneficial effect on health when consumed as part of a varied diet on a regular basis, at effective levels. ADA supports research to further define the health benefits and risks of individual functional foods and their physiologically active components. Health claims on food products, including functional foods, should be based on the significant scientific agreement standard of evidence, and ADA supports label claims based on such strong scientific substantiation. Food and nutrition professionals will continue to work with the food industry, allied health professionals, the government, the scientific community, and the media to ensure that the public has accurate information regarding functional foods, and thus should continue to educate themselves on this emerging area of food and nutrition science. Knowledge of the role of physiologically active food components, from plant, animal, and microbial food sources, has changed the role of diet in health. Functional foods have evolved as food and nutrition science has advanced beyond the treatment of deficiency syndromes to reduction of disease risk and health promotion. This position paper reviews the definition of functional foods, their regulation, and the scientific evidence supporting this evolving area of food and nutrition. Foods can no longer be evaluated in terms of macronutrient and micronutrient content alone. Analyzing the content of other physiologically active components and evaluating their role in health promotion will be necessary. The availability of health-promoting functional foods in the US diet has the potential to help ensure a healthier population.
However, each functional food should be evaluated on the basis of scientific evidence to ensure appropriate integration into a varied diet.
NASA Astrophysics Data System (ADS)
Tahmooresnejad, Leila
Nanotechnology is considered to be the most promising high technology of this century. Worldwide investment in this technology has rapidly increased in the past two decades, and it will likely drive future economic growth. Research in this new science-based technology requires significant public funding to facilitate knowledge production, reduce related uncertainties and risks, and ensure the success of nanotechnology development. Given its potential in a wide range of domains, governments and policymakers have sought to efficiently allocate funding to maximize economic benefits. It is therefore essential to further our understanding of how public funding influences research performance. The main purpose of this thesis is to analyze the impact of public funding on nanotechnology development, with a special focus on scientific and technological research outputs. The research objectives are twofold: we first seek to examine this funding influence, and second to explore the impact of collaboration and related scientific and innovative networks on nanotechnology development. Afterwards, our goal is to compare the impact of funding and of nanotechnology collaborative networks between Canada and the US on scientific and technological research outputs. This research deals with the prominent outputs of academic research, publications and patents, and characterizes collaborative networks using the co-publication and co-invention links between scientists and inventors. This thesis addresses the following research questions: How does increased public funding to nanotechnology scientists enhance nanotechnology-related publications and patents in terms of (a) number and (b) quality? Are researchers who hold a more influential network position in co-publication/co-invention networks more productive and more cited? Is the influence of public funding on nanotechnology research different in Canada compared with the US?
To answer these questions, information about nanotechnology articles, patents and funding was extracted from various databases in Canada and in the US, and was used to build the scientific and innovation networks and to analyze the influence of funding through econometric analyses. Regarding the first research question, our results show that public funding generally increases the number and quality of these outputs. However, this positive impact is more significant in the US, and funding is less likely to influence nanotechnology patents in Canada. Regarding the analysis of industry funding in Quebec, private funds are less likely to increase the quality of publications. Concerning our second research question, results show that scientific and technological outputs are correlated with the position of researchers in collaborative networks. Nanotechnology research outputs, particularly in Canada, show greater returns to network collaboration in terms of publications and patents. Finally, although the impacts are somewhat different between Canada and the US, this research suggests that both funding and collaborative networks play an important role in boosting the quantity and quality of academic research.
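The "network position" measure used in this kind of study can be illustrated with degree centrality over a co-publication network. A minimal pure-Python sketch (the author lists and the choice of degree centrality are illustrative; the thesis does not specify which centrality measure it uses):

```python
from itertools import combinations
from collections import defaultdict

def copub_degree_centrality(papers):
    """Build a co-publication network from author lists and return
    each author's normalized degree centrality: the share of all
    other authors they have co-published with."""
    neighbors = defaultdict(set)
    authors = set()
    for author_list in papers:
        authors.update(author_list)
        # Every pair of co-authors on a paper gets an edge
        for a, b in combinations(set(author_list), 2):
            neighbors[a].add(b)
            neighbors[b].add(a)
    n = len(authors)
    return {a: len(neighbors[a]) / (n - 1) for a in authors}

# Illustrative author lists for three hypothetical papers
papers = [["A", "B"], ["B", "C"], ["B", "D"]]
centrality = copub_degree_centrality(papers)
# Author B, who has co-published with every other author, is the most central
```

A researcher's centrality score can then enter an econometric model as a regressor alongside funding amounts.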
Towards a more complete SOCCR: Establishing a Coastal Carbon Data Network
NASA Astrophysics Data System (ADS)
Pidgeon, E.; Howard, J.; Tang, J.; Kroeger, K. D.; Windham-Myers, L.
2015-12-01
The 2007 State of the Carbon Cycle Report (SOCCR) was highly influential in ensuring components of the carbon cycle were accounted for in national policy and related management. However, while SOCCR detailed the significance of North American coastal wetlands, it was not until recently that leading governments began to fully recognize these ecosystems for their carbon sequestration and storage capacity, and hence the significant role coastal ecosystems can play in GHG emission reduction strategies, offset mechanisms, coastal management strategies and climate mitigation policy. The new attention on coastal carbon systems has exposed limitations in terms of data availability and data quality, as well as insufficient knowledge of coastal carbon distributions, characteristics and coastal carbon cycle processes. In addition to restricting scientific progress, the lack of comprehensive, comparable, and quality-controlled coastal carbon data is hindering progress towards carbon-based conservation and coastal management. To directly address these limitations, we are developing a Global Science and Data Network for Coastal "Blue" Carbon, with support from the Carbon Cycle Interagency Working Group. Goals include: • Improving basic and applied science on carbon and GHG cycling in vegetated coastal ecosystems; • Supporting a coastal carbon and associated GHG data archive for use by the science community, coastal and climate practitioners and other data users; • Building the capacity of coastal carbon stakeholders globally to collect and interpret high quality coastal carbon science and data; • Providing a forum and mechanism to promote exchange and collaboration between scientists and coastal carbon data users globally; and • Conducting outreach activities to ensure the best available data are globally accessible and that science is responsive to the needs of coastal managers and policy-makers.
Hygienic support of the ISS air quality (main achievements and prospects)
NASA Astrophysics Data System (ADS)
Moukhamedieva, Lana; Tsarkov, Dmitriy; Pakhomova, Anna
Hygienic preventive measures during pre-flight processing of manned spacecraft, selection of polymeric materials, sanitary-hygienic evaluation of cargo and scientific hardware to be used on the ISS, and life support systems allow air quality to be maintained within the limits of regulatory requirements. However, a gradual increase in total air contamination by harmful chemicals is observed as the service life of the ISS grows longer. It is caused by an overall rise in the quantity of polymeric materials used on the station, by additional contamination brought by cargo spacecraft and modules docking to the ISS, and by the cargo itself. At the same time, the range of contaminants typical of off-gassing from polymeric materials that use modern stabilizers, plasticizers, flame retardants and other additives grows wider. In resolving the matter of ISS service life extension, the main question for hygienic research is to determine the real safe operating life of the polymeric materials used in the structures and hardware of the station, including: research on polymer degradation (ageing) and its effect on the intensity and toxicity of off-gassing; and introduction of polymers with minimal off-gassing of volatile organic compounds under conditions of space flight and thermal-oxidative degradation. In order to ensure human safety during long-term flight it is important to develop: real-time air quality monitoring systems, including on-line analysis of highly toxic contaminants evolving during thermal-oxidative degradation of polymer materials and during blowouts of toxic contaminants; and hygienic standards for contaminant levels for extended flight durations of up to 3 years. It is essential to develop an automated control system for on-line monitoring of toxicological status, and to develop hygienic and engineering measures for its management, in order to ensure crew member safety during off-nominal situations.
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Norvig, Peter (Technical Monitor)
2000-01-01
NASA's ScienceDesk Project at the Ames Research Center is responsible for scientific knowledge management, which includes ensuring the capture, preservation, and traceability of scientific knowledge. Other responsibilities include: 1) maintaining uniform information access, achieved through intelligent indexing and visualization; 2) supporting both asynchronous and synchronous science teamwork; and 3) monitoring and controlling semi-autonomous remote experimentation.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
...-1113-0000-C5] Endangered and Threatened Wildlife and Plants; 90-Day Finding on a Petition To Delist or... petition presents substantial scientific or commercial information indicating that the petitioned actions.... To ensure that these status reviews are comprehensive, we are requesting scientific and commercial...
Development of a Structured Undergraduate Research Experience: Framework and Implications
ERIC Educational Resources Information Center
Brown, Anne M.; Lewis, Stephanie N.; Bevan, David R.
2016-01-01
Participating in undergraduate research can be a pivotal experience for students in life science disciplines. Development of critical thinking skills, in addition to conveying scientific ideas in oral and written formats, is essential to ensuring that students develop a greater understanding of basic scientific knowledge and the research process.…
Ensuring the reliability of stable isotope ratio data--beyond the principle of identical treatment.
Carter, J F; Fry, B
2013-03-01
Inter-laboratory comparability is crucial to facilitate the globalisation of scientific networks and the development of international databases to support scientific and criminal investigations. This article considers what lessons can be learned from a series of inter-laboratory comparison exercises organised by the Forensic Isotope Ratio Mass Spectrometry (FIRMS) network in terms of reference materials (RMs), the management of data quality, and technical limitations. The results showed that within-laboratory precision (repeatability) was generally good but between-laboratory accuracy (reproducibility) called for improvements. This review considers how stable isotope laboratories can establish a system of quality control (QC) and quality assurance (QA), emphasising issues of repeatability and reproducibility. For results to be comparable between laboratories, measurements must be traceable to the international δ-scales and, because isotope ratio measurements are reported relative to standards, a key aspect is the correct selection, calibration, and use of international and in-house RMs. The authors identify four principles which promote good laboratory practice: the principle of identical treatment, by which samples and RMs are processed in an identical manner, and which incorporates three further principles; the principle of identical correction, by which necessary corrections are identified and evenly applied; the principle of identical scaling, by which data are shifted and stretched to the international δ-scales; and the principle of error detection, by which QC and QA results are monitored and acted upon. To achieve both good repeatability and good reproducibility it is essential to obtain RMs with internationally agreed δ-values. These RMs will act as the basis for QC and can be used to calibrate further in-house QC RMs tailored to the activities of specific laboratories.
In-house QA standards must also be developed to ensure that QC-based calibrations and corrections lead to accurate results for samples. The δ-values assigned to RMs must be recorded and reported with all data. Reference materials must be used to determine what corrections are necessary for measured data. Each analytical sequence of samples must include both QC and QA materials which are subject to identical treatment during measurement and data processing. Results for these materials must be plotted, monitored, and acted upon. Periodically international RMs should be analysed as an in-house proficiency test to demonstrate results are accurate.
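The "shifting and stretching" of measured data onto the international δ-scales (the principle of identical scaling) is, in its simplest form, a two-point linear normalization anchored by two reference materials. A minimal sketch; the RM values below are illustrative numbers, not laboratory constants:

```python
def normalize_delta(measured, rm1_measured, rm1_true, rm2_measured, rm2_true):
    """Two-point linear normalization of a measured delta value
    onto an international delta-scale, anchored by two reference
    materials (RMs) with internationally agreed values."""
    # Stretch factor: ratio of the accepted RM spacing to the measured spacing
    slope = (rm2_true - rm1_true) / (rm2_measured - rm1_measured)
    # Shift so that RM1 maps exactly onto its accepted value
    return rm1_true + slope * (measured - rm1_measured)

# Illustrative example: two RMs measured at -29.8 and -11.9 per mil
# against accepted values of -30.0 and -12.0 per mil
sample = normalize_delta(-25.0, -29.8, -30.0, -11.9, -12.0)
```

By construction, both RMs map exactly onto their accepted values, and every sample measured in the same sequence is corrected by the same shift and stretch, which is what makes results from different laboratories comparable.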
Ensuring Quality Nursing Home Care
Ensuring Quality Nursing Home Care Before you choose a nursing home Expert information from Healthcare Professionals Who Specialize in the Care ... Nearly 1.6 million older Americans live in nursing homes in the United States. The move to ...
Ensuring Credibility of NASA's Earth Science Data (Invited)
NASA Astrophysics Data System (ADS)
Maiden, M. E.; Ramapriyan, H. K.; Mitchell, A. E.; Berrick, S. W.; Walter, J.; Murphy, K. J.
2013-12-01
The summary description of the Fall 2013 AGU session on 'Data Curation, Credibility, Preservation Implementation, and Data Rescue to Enable Multi-Source Science' identifies four attributes needed to ensure credibility in Earth science data records. NASA's Earth Science Data Systems Program has been working on all four of these attributes: transparency, completeness, permanence, and ease of access and use, by focusing on them and improving our practices over many years. Regarding transparency and openness, NASA has been at the forefront of free and open sharing of data and associated information for Earth observations. US data policy requires such openness, while allowing the marginal cost of distributing government data and information to be recouped; making the data available free of such charges greatly increases their usage in scientific studies, and the resultant analyses hasten our collective understanding of the Earth system. NASA's currently available Earth observations comprise primarily those obtained from satellite-borne instruments, suborbital campaigns, and field investigations. These data are complex and must be accompanied by rich metadata and documentation to be understandable. To enable completeness, NASA utilizes standards for data format, metadata content, and required documentation for any data that are ingested into our distributed Earth Observing System Data and Information System, or EOSDIS. NASA is moving to a new metadata paradigm, primarily to enable a fuller description of data quality and fit-for-purpose attributes. This paradigm offers structured approaches for storing quality measures in metadata that include elements such as Positional Accuracy, Lineage and Cloud Cover. NASA exercises validation processes for the Earth Science Data Systems Program to ensure users of EOSDIS have a predictable level of confidence in the data, as well as to assess data viability for usage and application.
The Earth Science Data Systems Program has been improving its data management practices for over twenty years to assure permanence of data utility through reliable preservation of bits, readability, understandability, usability and reproducibility of results. While NASA has focused on the Earth System Science research community as the primary data user community, broad interest in the data by environmental managers, public policymakers and citizen scientists, driven by climate change and its effects on people everywhere (e.g. sea level rise), has led the Program to respond with new tools and ways to improve ease of access and use of the data. NASA's standard Earth observation data will soon be buttressed with the long tail of federally-funded research data created or analyzed by grantees, in response to John Holdren's OSTP Memorandum to federal departments and agencies entitled 'Increasing Access to the Results of Federally-Funded Scientific Research'. We fully expect that NASA's Earth Science Data Systems Program will be able to work with our grantees to comply early, and to flexibly raise the openness of this source of scientific data to a best practice for NASA and the grantees.
Improved quality monitoring of multi-center acupuncture clinical trials in China
2009-01-01
Background In 2007, the Chinese Science Division of the State Administration of Traditional Chinese Medicine (TCM) convened a special conference to discuss quality control for TCM clinical research. Control and assurance standards were established to guarantee the quality of clinical research. This paper provides practical guidelines for implementing strict and reproducible quality control for acupuncture randomized controlled trials (RCTs). Methods A standard quality control program (QCP) was established to monitor the quality of acupuncture trials. Case report forms were designed; qualified investigators, study personnel and data management personnel were trained. Monitors, who were directly appointed by the project leader, completed the quality control programs. They guaranteed data accuracy and prevented or detected protocol violations. Clinical centers and clinicians were audited, the randomization system of the centers was inspected, and the treatment processes were audited as well. In addition, the case report forms were reviewed for completeness and internal consistency, the eligibility and validity of the patients in the study was verified, and data was monitored for compliance and accuracy. Results and discussion The monitors complete their reports and submit them to quality assurance and the sponsors. Recommendations and suggestions are made for improving performance. By holding regular meetings to discuss improvements in monitoring standards, the monitors can improve quality and efficiency. Conclusions Supplementing and improving the existing guidelines for quality monitoring will ensure that large multi-centre acupuncture clinical trials will be considered as valid and scientifically stringent as pharmaceutical clinical trials. It will also develop academic excellence and further promote the international recognition of acupuncture. PMID:20035630
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Scientific... Endobronchial Obstruction Due to Advanced Lung Tumors AGENCY: Agency for Healthcare Research and Quality (AHRQ... Research and Quality (AHRQ) is seeking scientific information submissions from manufacturers of...
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing their negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continues to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, and the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence.
We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
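A common concrete form of the automated testing mentioned above is gold-file regression testing: re-running benchmark problems and comparing results against stored reference values within a tolerance. The sketch below is illustrative and is not PFLOTRAN's actual test harness; the variable names and tolerances are assumptions:

```python
def check_regression(new_output, gold_output, rel_tol=1e-6):
    """Compare a simulator's output values against stored 'gold'
    reference values, allowing a small relative tolerance so that
    benign floating-point differences do not fail the suite.
    Returns a list of human-readable failure descriptions."""
    failures = []
    for key, gold in gold_output.items():
        new = new_output.get(key)
        if new is None:
            failures.append(f"{key}: missing from new output")
        elif abs(new - gold) > rel_tol * max(abs(gold), 1.0):
            failures.append(f"{key}: {new} vs gold {gold}")
    return failures

# Illustrative use: cell pressures from a hypothetical benchmark problem
gold = {"pressure_cell_1": 101325.0, "pressure_cell_2": 99876.5}
new = {"pressure_cell_1": 101325.0000001, "pressure_cell_2": 99876.5}
assert check_regression(new, gold) == []
```

Running such checks automatically on every commit is what lets a growing developer team change numerical methods while ensuring legacy problems still produce the expected answers.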
Cheminformatics Research at the Unilever Centre for Molecular Science Informatics Cambridge.
Fuchs, Julian E; Bender, Andreas; Glen, Robert C
2015-09-01
The Centre for Molecular Informatics, formerly Unilever Centre for Molecular Science Informatics (UCMSI), at the University of Cambridge is a world-leading driving force in the field of cheminformatics. Since its opening in 2000 more than 300 scientific articles have fundamentally changed the field of molecular informatics. The Centre has been a key player in promoting open chemical data and semantic access. Though mainly focussing on basic research, close collaborations with industrial partners ensured real world feedback and access to high quality molecular data. A variety of tools and standard protocols have been developed and are ubiquitous in the daily practice of cheminformatics. Here, we present a retrospective of cheminformatics research performed at the UCMSI, thereby highlighting historical and recent trends in the field as well as indicating future directions.
Cheminformatics Research at the Unilever Centre for Molecular Science Informatics Cambridge
Fuchs, Julian E; Bender, Andreas; Glen, Robert C
2015-01-01
The Centre for Molecular Informatics, formerly Unilever Centre for Molecular Science Informatics (UCMSI), at the University of Cambridge is a world-leading driving force in the field of cheminformatics. Since its opening in 2000 more than 300 scientific articles have fundamentally changed the field of molecular informatics. The Centre has been a key player in promoting open chemical data and semantic access. Though mainly focussing on basic research, close collaborations with industrial partners ensured real world feedback and access to high quality molecular data. A variety of tools and standard protocols have been developed and are ubiquitous in the daily practice of cheminformatics. Here, we present a retrospective of cheminformatics research performed at the UCMSI, thereby highlighting historical and recent trends in the field as well as indicating future directions. PMID:26435758
Performance of a quality assurance program for assessing dental health in methamphetamine users.
Dye, Bruce A; Harrell, Lauren; Murphy, Debra A; Belin, Thomas; Shetty, Vivek
2015-07-05
Systematic characterization of the dental consequences of methamphetamine (MA) abuse presupposes a rigorous quality assurance (QA) program to ensure the credibility of the data collected and the scientific integrity and validity of the clinical study. In this report we describe and evaluate the performance of a quality assurance program implemented in a large cross-sectional study of the dental consequences of MA use. A large community sample of MA users was recruited over a 30-month period during 2011-13 and received comprehensive oral examinations and psychosocial assessments by site examiners based at two large community health centers in Los Angeles. National Health and Nutrition Examination Survey (NHANES) protocols for oral health assessments were utilized to characterize dental disease. Using NHANES oral health quality assurance guidelines, examiner reliability statistics such as Cohen's Kappa coefficients and intraclass correlation coefficients were calculated to assess the magnitude of agreement between the site examiners and a reference examiner to ensure conformance and comparability with NHANES practices. Approximately 9% (n = 49) of the enrolled 574 MA users received a repeat dental caries and periodontal examination conducted by the reference examiner. There was high concordance between the reference examiner and the site examiners for identification of untreated dental disease (Kappa statistic values: 0.57-0.75, percent agreement 83-88%). For identification of untreated caries on at least 5 surfaces of anterior teeth, the Kappas ranged from 0.77 to 0.87, and percent agreement from 94 to 97%. The intraclass correlation coefficients (ICCs) ranged from 0.87 to 0.89 for attachment loss across all periodontal sites assessed, and the ICCs ranged from 0.79 to 0.81 for pocket depth. For overall gingival recession, the ICCs ranged from 0.88 to 0.91.
When Kappa was calculated based on the CDC/AAP case definitions for severe periodontitis, inter-examiner reliability for site examiners was low (Kappa 0.27-0.67). Overall, the quality assurance program confirmed procedural adherence and the quality of the data collected on the distribution of dental caries and periodontal disease in MA users. Examiner concordance was higher for dental caries but lower for specific periodontal assessments.
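The Cohen's Kappa statistic used above measures agreement between two examiners beyond what chance alone would produce. A minimal sketch with made-up ratings (the tooth-level data are illustrative, not from the study):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two parallel lists of categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement: probability both raters independently pick each category
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative ratings (1 = untreated caries present, 0 = absent) for ten teeth
site_examiner = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
reference = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
kappa = cohens_kappa(site_examiner, reference)
```

Here the two examiners agree on 9 of 10 teeth (90%), but because chance agreement is 50% for these marginals, kappa is 0.8, which is why kappa values run lower than raw percent agreement in the figures reported above.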
Wu, R Ryanne; Kinsinger, Linda S; Provenzale, Dawn; King, Heather A; Akerly, Patricia; Barnes, Lottie K; Datta, Santanu K; Grubber, Janet M; Katich, Nicholas; McNeil, Rebecca B; Monte, Robert; Sperber, Nina R; Atkins, David; Jackson, George L
2014-12-01
Collaboration between policy, research, and clinical partners is crucial to achieving proven quality care. The Veterans Health Administration has expended great effort towards fostering such collaborations. Through this, we have learned that an ideal collaboration involves partnership from the very beginning of a new clinical program, so that the program is designed in a way that ensures quality and validity and puts into place the infrastructure necessary for a reliable evaluation. This paper gives an example of one such project, the Lung Cancer Screening Demonstration Project (LCSDP). We outline the ways that clinical, policy, and research partners collaborated in design, planning, and implementation in order to create a sustainable model that could be rigorously evaluated for efficacy and fidelity. We describe the use of the Donabedian quality matrix to determine the necessary characteristics of a quality program, and the importance of linking engineering, information technology, and clinical paradigms to connect the development of an on-the-ground clinical program with the evaluation goal of a learning healthcare organization. While the LCSDP is the example given here, these partnerships and suggestions are salient to any healthcare organization seeking to implement new, scientifically proven care in a useful and reliable way.
Kwak, Jin Il; Nam, Sun-Hwa; An, Youn-Joo
2018-02-01
Since the Korean Ministry of the Environment established the Master Plan for Water Environment (2006-2015), the need to revise the water quality standards (WQSs) has driven government projects to expand the standards for the protection of human health and aquatic ecosystems. This study aims to provide a historical overview of how these WQSs were established, amended, and expanded over the past 10 years in Korea. Major projects related to national monitoring in rivers and the amendment of WQSs are reviewed intensively, including projects on the categorization of hazardous chemicals potentially discharged into surface water, the chemical ranking and scoring methodology for surface water (CRAFT, Chemical RAnking of surFace water polluTants), whole effluent toxicity (WET) management systems, the 4th, 5th, and 6th revisions of the water quality standards for the protection of human health, and efforts toward developing the 7th revision. In this review, we summarize the past and current status, as well as future perspectives, of Korean surface WQSs. This research provides information that aids our understanding of how surface WQSs have been expanded, and how scientific approaches to ensure water quality have been applied at each step of the process in Korea.
EPA Office of Water (OW): Impaired Waters with TMDLs NHDPlus Indexed Dataset
The Total Maximum Daily Load (TMDL) Tracking System contains information on waters that are Not Supporting their designated uses. These waters are listed by the state as impaired under Section 303(d) of the Clean Water Act. The status of TMDLs are also tracked. TMDLs are pollution control measures that reduce the discharge of pollutants into impaired waters. A TMDL or Total Maximum Daily Load is a calculation of the maximum amount of a pollutant that a waterbody can receive and still meet water quality standards, and an allocation of that amount to the pollutant's sources. What is a total maximum daily load (TMDL)? Water quality standards are set by States, Territories, and Tribes. They identify the uses for each waterbody, for example, drinking water supply, contact recreation (swimming), and aquatic life support (fishing), and the scientific criteria to support that use. A TMDL is the sum of the allowable loads of a single pollutant from all contributing point and nonpoint sources. The calculation must include a margin of safety to ensure that the waterbody can be used for the purposes the state has designated. The calculation must also account for seasonal variation in water quality. The Clean Water Act, section 303, establishes the water quality standards and TMDL programs.
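The load calculation described above reduces to summing the wasteload allocations for point sources, the load allocations for nonpoint sources, and an explicit margin of safety. A minimal sketch with illustrative numbers (the allocations below are invented for a hypothetical waterbody, not EPA data):

```python
def tmdl_total(point_sources, nonpoint_sources, margin_of_safety):
    """TMDL = sum of wasteload allocations (point sources)
    + sum of load allocations (nonpoint sources)
    + an explicit margin of safety,
    all expressed in the same units (e.g. kg of pollutant per day)."""
    wla = sum(point_sources)  # allocations to permitted dischargers
    la = sum(nonpoint_sources)  # allocations to runoff and other diffuse sources
    return wla + la + margin_of_safety

# Illustrative allocations in kg/day for a hypothetical waterbody
tmdl = tmdl_total(
    point_sources=[12.0, 8.0],
    nonpoint_sources=[15.0, 5.0],
    margin_of_safety=4.0,
)
# 12 + 8 + 15 + 5 + 4 = 44 kg/day
```

In practice the TMDL is fixed by the water quality standard for the designated use, and the allocations (plus the margin of safety, and any seasonal adjustment) must be chosen so their sum does not exceed it.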
Impact of periodontal disease on quality of life: a systematic review.
Ferreira, M C; Dias-Pereira, A C; Branco-de-Almeida, L S; Martins, C C; Paiva, S M
2017-08-01
The diagnosis of periodontal disease is commonly based on objective evaluations of the patient's medical/dental history as well as clinical and radiographic examinations. However, periodontal disease should also be evaluated subjectively through measures that quantify its impact on oral health-related quality of life. The aim of this study was to evaluate the impact of periodontal disease on quality of life among adolescents, adults and older adults. A systematic search of the literature was performed for scientific articles published up to July 2015 using electronic databases and a manual search. Two independent reviewers performed the selection of the studies, extracted the data and assessed the methodological quality. Thirty-four cross-sectional studies involving any age group, except children, and the use of questionnaires for the assessment of the impact of periodontal disease on quality of life were included. Twenty-five studies demonstrated that periodontal disease was associated with a negative impact on quality of life, with severe periodontitis exerting the most significant impact by compromising aspects related to function and esthetics. Unlike periodontitis, gingivitis was associated with pain as well as difficulties performing oral hygiene and wearing dentures. Gingivitis was also negatively correlated with comfort. The results indicate that periodontal disease may exert an impact on quality of life of individuals, with greater severity of the disease related to greater impact. Longitudinal studies with representative samples are needed to ensure validity of the findings. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Embedding Scientific Integrity and Ethics into the Scientific Process and Research Data Lifecycle
NASA Astrophysics Data System (ADS)
Gundersen, L. C.
2016-12-01
Predicting climate change, developing resources sustainably, and mitigating natural hazard risk are complex interdisciplinary challenges in the geosciences that require the integration of data and knowledge from disparate disciplines and scales. This kind of interdisciplinary science can only thrive if scientific communities work together and adhere to common standards of scientific integrity, ethics, data management, curation, and sharing. Science and data without integrity and ethics can erode the very fabric of the scientific enterprise and potentially harm society and the planet. Inaccurate risk analyses of natural hazards can lead to poor choices in construction, insurance, and emergency response. Incorrect assessment of mineral resources can bankrupt a company, destroy a local economy, and contaminate an ecosystem. This paper presents key ethics and integrity questions paired with the major components of the research data life cycle. The questions can be used by the researcher during the scientific process to help ensure the integrity and ethics of their research and adherence to sound data management practice. Questions include considerations for open, collaborative science, which is fundamentally changing the responsibility of scientists regarding data sharing and reproducibility. The publication of primary data, methods, models, software, and workflows must become a norm of science. There are also questions that prompt the scientist to think about the benefit of their work to society; ensuring equity, respect, and fairness in working with others; and always striving for honesty, excellence, and transparency.
Doi, Ryoichi; Pitiwut, Supachai
2014-01-01
The concept of crop yield maximization has been widely supported. In practice, however, yield maximization does not necessarily lead to maximum socioeconomic welfare. Optimization is therefore necessary to ensure quality of life of farmers and other stakeholders. In Thailand, a rice farmers' network has adopted a promising agricultural system aimed at the optimization of rice farming. Various feasible techniques were flexibly combined. The new system offers technical strengths and minimizes certain difficulties with which the rice farmers once struggled. It has resulted in fairly good yields of up to 8.75 t ha−1 or yield increases of up to 57% (from 4.38 to 6.88 t ha−1). Under the optimization paradigm, the farmers have established diversified sustainable relationships with the paddy fields in terms of ecosystem management through their own self-motivated scientific observations. The system has resulted in good health conditions for the farmers and villagers, financial security, availability of extra time, and additional opportunities and freedom and hence in the improvement of their overall quality of life. The underlying technical and social mechanisms are discussed herein. PMID:25089294
Strategies for ensuring quality data from Indian investigational sites
Hajos, Antal K.; Kamble, Sujal K.
2011-01-01
The topic of ensuring quality and compliance is and must be a top priority in the conduct of clinical trials, as warranted by regulatory guidelines as well as the inherent responsibility of the professionals conducting such research. Fast-growing emerging clinical geographies such as India demand special attention due to rapid growth and associated factors that may put study quality at risk. In this paper, we used the basic principle of PDCA (Plan, Do, Check, and Adjust) to structure the processes of a clinical trial from protocol to final analysis in order to highlight the interactive nature of involved people and processes required to ensure quality of data and site functioning. PMID:21731855
Report: EPA Needs to Fulfill Its Designated Responsibilities to Ensure Effective BioWatch Program
Report #2005-P-00012, March 23, 2005. EPA did not provide adequate oversight of the sampling operations to ensure quality assurance guidance was adhered to, potentially affecting the quality of the samples taken.
Yu, Yuanshan; Xiao, Gengsheng; Xu, Yujuan; Wu, Jijun; Wen, Jing
2014-05-01
This study investigated the effects of dimethyl dicarbonate (DMDC) on the fermentation of litchi juice by Lactobacillus casei as an alternative to heat treatment, which may have undesirable effects on the juice. Quality attributes and product stability of heat- and DMDC-treated litchi juice fermented by L. casei were compared. It was found that residual indigenous microorganisms in both the heat- and DMDC-treated litchi juice could not grow into dominant bacteria during subsequent fermentation by L. casei. Compared with fermented heat-treated litchi juice, fermented DMDC-treated litchi juice showed better color, flavor, and overall acceptance, and also retained more total phenolics and antioxidant capacity. The viable counts of L. casei in both the heat- and DMDC-treated litchi juice remained above 8.0 log CFU/mL after 4 wk of storage at 4 °C. Some quality attributes of both the fermented heat- and DMDC-treated litchi juices, including pH, total phenolics, ascorbic acid, and antioxidant capacity, tended to decrease slowly during storage at 4 °C, but overall acceptance scores showed no reduction after 4 wk of storage. On the whole, DMDC treatment could be an ideal alternative to heat treatment for ensuring the microbial safety and consistent sensory and nutritional quality of fermented litchi juice prior to fermentation. Pasteurization is often recommended prior to the fermentation of fruit juice by probiotics, as it rapidly inactivates and inhibits spoilage and pathogenic bacteria and ensures fermented products of consistent sensory and nutritional quality. Dimethyl dicarbonate (DMDC) is a powerful antimicrobial agent approved by the FDA for use as a microbial control agent in juice beverages. This study provides a scientific basis for the application of DMDC prior to the fermentation of litchi juice. © 2014 Institute of Food Technologists®
Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric
2014-03-01
Pharmaceutical companies are progressively adopting the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control, on-line, the crystallization step of an API (active pharmaceutical ingredient) manufacturing process, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for residual methanol. Because the variability of the sampling method and of the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove fitness for purpose. Implementing this in-process control (IPC) method at the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step, ensuring a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and elimination of difficult sampling and tedious off-line analyses. © 2013 Published by Elsevier B.V.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-18
...-0011] [MO 92210-0-0008-B2] Endangered and Threatened Wildlife and Plants; 90-Day Finding on a Petition... that the petition presents substantial scientific or commercial information indicating that listing... ensure that this status review is comprehensive, we are requesting scientific and commercial data and...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-12
...-0012; MO 92210-0-0008] Endangered and Threatened Wildlife and Plants; 90-Day Finding on a Petition To... habitat. Based on our review, we find that the petition presents substantial scientific or commercial... warranted. To ensure that this status review is comprehensive, we are requesting scientific and commercial...
Slow off the Mark: Elementary School Teachers and the Crisis in STEM Education
ERIC Educational Resources Information Center
Epstein, Diana; Miller, Raegen T.
2011-01-01
Prospective teachers can typically obtain a license to teach elementary school without taking a rigorous college-level STEM class such as calculus, statistics, or chemistry, and without demonstrating a solid grasp of mathematics knowledge, scientific knowledge, or the nature of scientific inquiry. This is not a recipe for ensuring students have…
Metabolomic Profiling in Perinatal Asphyxia: A Promising New Field
Denihan, Niamh M.; Boylan, Geraldine B.; Murray, Deirdre M.
2015-01-01
Metabolomics, the latest “omic” technology, is defined as the comprehensive study of all low molecular weight biochemicals, “metabolites” present in an organism. As a systems biology approach, metabolomics has huge potential to progress our understanding of perinatal asphyxia and neonatal hypoxic-ischaemic encephalopathy, by uniquely detecting rapid biochemical pathway alterations in response to the hypoxic environment. The study of metabolomic biomarkers in the immediate neonatal period is not a trivial task and requires a number of specific considerations, unique to this disease and population. Recruiting a clearly defined cohort requires standardised multicentre recruitment with broad inclusion criteria and the participation of a range of multidisciplinary staff. Minimally invasive biospecimen collection is a priority for biomarker discovery. Umbilical cord blood presents an ideal medium as large volumes can be easily extracted and stored and the sample is not confounded by postnatal disease progression. Pristine biobanking and phenotyping are essential to ensure the validity of metabolomic findings. This paper provides an overview of the current state of the art in the field of metabolomics in perinatal asphyxia and neonatal hypoxic-ischaemic encephalopathy. We detail the considerations required to ensure high quality sampling and analysis, to support scientific progression in this important field. PMID:25802843
Knöss, Werner; Chinou, Ioanna
2012-08-01
The European legislation on medicinal products also addresses the medicinal use of products originating from plants. The objective of the legislation is to ensure the future existence of such products and to consider their particular characteristics when assessing quality, efficacy, and safety. Two categories are defined: i) herbal medicinal products can be granted a marketing authorisation; and ii) traditional herbal medicinal products can be granted a registration based on their longstanding use, provided they comply with a set of provisions ensuring their safe use. The Committee on Herbal Medicinal Products (HMPC) was established at the European Medicines Agency (EMA) to provide monographs and list entries on herbal substances and preparations thereof. To date, approximately 100 monographs have been published, which define a current scientific and regulatory standard for the efficacy and safety of herbal substances and herbal preparations used in medicinal products. This harmonised European standard will facilitate the availability and adequate use of traditional herbal medicinal products and herbal medicinal products within the European Union. Consistent labelling should also enable patients and health care professionals to differentiate medicinal products from other product categories such as cosmetics, food supplements, and medical devices. Georg Thieme Verlag KG Stuttgart · New York.
Rohrlich, P S; Kerautret, K; Bancillon, N; Vauzelle, K; Bertrand-Letort, M; Ruiz, M; Schmitt, S; Samy, J-P; Guiraud, M; Yakoub-Agha, I
2013-08-01
To set up the minimal conditions necessary to ensure that the social and familial network of the patient is preserved during the hospital stay for allogeneic hematopoietic stem cell transplantation. A national survey was conducted to increase knowledge about the conditions of hospital stay of adult and pediatric stem cell recipients. A multidisciplinary panel of health workers, including doctors and nurses, then met to establish recommendations for optimally maintaining social and familial relationships during the HSCT procedure. Practices and policies are very heterogeneous among the transplant centers. No consensus has been established and the literature data are scarce. The panel has thus established a list of recommendations to maintain optimal relationships between the patient and his relatives and friends during the hospital stay. The objective is to ensure better acceptance of the isolation conditions and to facilitate the return home after day 30 (D30). Since there is no established scientific basis for drastic isolation conditions, a multidisciplinary approach conducted in partnership with patient associations should allow the procedures to be softened without impairing the quality of care. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Khan, Ikhlas A; Smillie, Troy
2012-09-28
Natural products have provided a basis for health care and medicine to humankind since the beginning of civilization. According to the World Health Organization (WHO), approximately 80% of the world population still relies on herbal medicines for health-related benefits. In the United States, over 42% of the population claimed to have used botanical dietary supplements to either augment their current diet or to "treat" or "prevent" a particular health-related issue. This has led to the development of a burgeoning industry in the U.S. ($4.8 billion per year in 2008) to supply dietary supplements to the consumer. However, many commercial botanical products are poorly defined scientifically, and the consumer must take it on faith that the supplement they are ingesting is an accurate representation of what is listed on the label, and that it contains the purportedly "active" constituents they seek. Many dietary supplement manufacturers, academic research groups, and governmental organizations are progressively attempting to construct a better scientific understanding of natural products, herbals, and botanical dietary supplements that have co-evolved with Western-style pharmaceutical medicines. However, a deficiency of knowledge is still evident, and this issue needs to be addressed in order to achieve a significant level of safety, efficacy, and quality for commercial natural products. The authors contend that a "quality by design" approach for botanical dietary supplements should be implemented in order to ensure the safety and integrity of these products. Initiating this approach with the authentication of the starting plant material is an essential first step, and in this review several techniques that can aid in this endeavor are outlined.
NASA Astrophysics Data System (ADS)
Conley, Alan H.; Midgley, Desmond C.
1988-07-01
A resourceful holistic water management strategy has been developed for ensuring equitable provision of adequate quantities of water of satisfactory quality at acceptable risk and affordable cost to a wide international range of competing user groups subject to adverse physical and hydrological factors and under rapidly changing social conditions. Scarce resource allocation strategies, based on scientific studies and supported by modern data processing facilities, focus primarily on supply, demand and quality. Supply management implies creation of the best combination of affordable elements of infrastructure for bulk water supplies from available runoff, groundwater, re-use, imports and unconventional sources, sized to meet determinable requirements with appropriate degrees of assurance, coupled with continuous optimization of system operation. Demand management seeks optimum allocation of available supplies to towns, power generation, industry, mining, agriculture, forestry, recreation and ecology, according to priority criteria determined from scientific, economic and socioeconomic studies. Quality management strategies relate to the control of salination, eutrophication and pollution from both diffuse and point sources. As the combined demands of complex First and Third World societies and economies on the available resources rise, increasing attention has to be paid to finding practical compromises to facilitate handling of conflict between legitimate users having widely divergent interests, aspirations and levels of sophistication. For optimum joint utilization, the central regulating authority is striving to forge a consultative partnership within which to promote, among the widest possible spectrum of users, enlightened understanding of the opportunities and limitations in handling complex international, social, political, legal, economic and financial issues associated with water development. 
These cannot readily be resolved by the methods of traditional hydrological sciences alone.
Bamidis, P D; Lithari, C; Konstantinidis, S T
2010-01-01
With the number of scientific papers published in journals, conference proceedings, and the international literature ever increasing, authors and reviewers not only benefit from an abundance of information but are also continuously confronted with the risks associated with erroneously copying another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is used to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages. PMID:21487489
Bamidis, P D; Lithari, C; Konstantinidis, S T
2010-12-01
With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copy of another's material. In parallel, Information Communication Technology (ICT) tools provide to researchers novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-bystep demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces.
Klein, Marguerite A.; Nahin, Richard L.; Messina, Mark J.; Rader, Jeanne I.; Thompson, Lilian U.; Badger, Thomas M.; Dwyer, Johanna T.; Kim, Young S.; Pontzer, Carol H.; Starke-Reed, Pamela E.; Weaver, Connie M.
2010-01-01
The NIH sponsored a scientific workshop, “Soy Protein/Isoflavone Research: Challenges in Designing and Evaluating Intervention Studies,” July 28–29, 2009. The workshop goal was to provide guidance for the next generation of soy protein/isoflavone human research. Session topics included population exposure to soy; the variability of the human response to soy; product composition; methods, tools, and resources available to estimate exposure and protocol adherence; and analytical methods to assess soy in foods and supplements and analytes in biologic fluids and other tissues. The intent of the workshop was to address the quality of soy studies, not the efficacy or safety of soy. Prior NIH workshops and an evidence-based review questioned the quality of data from human soy studies. If clinical studies are pursued, investigators need to ensure that the experimental designs are optimal and the studies properly executed. The workshop participants identified methodological issues that may confound study results and interpretation. Scientifically sound and useful options for dealing with these issues were discussed. The resulting guidance is presented in this document with a brief rationale. The guidance is specific to soy clinical research and does not address nonsoy-related factors that should also be considered in designing and reporting clinical studies. This guidance may be used by investigators, journal editors, study sponsors, and protocol reviewers for a variety of purposes, including designing and implementing trials, reporting results, and interpreting published epidemiological and clinical studies. PMID:20392880
Balancing water scarcity and quality for sustainable irrigated agriculture
NASA Astrophysics Data System (ADS)
Assouline, Shmuel; Russo, David; Silber, Avner; Or, Dani
2015-05-01
The challenge of meeting the projected doubling of global demand for food by 2050 is monumental. It is further exacerbated by the limited prospects for land expansion and rapidly dwindling water resources. A promising strategy for increasing crop yields per unit land requires the expansion of irrigated agriculture and the harnessing of water sources previously considered "marginal" (saline, treated effluent, and desalinated water). Such an expansion, however, must carefully consider potential long-term risks to soil hydroecological functioning. This study provides a critical analysis of the use of marginal water and of management approaches to map out potential risks. Long-term application of treated effluent (TE) for irrigation has shown adverse impacts on soil transport properties, and it introduces certain health risks due to the persistent exposure of soil biota to anthropogenic compounds (e.g., promoting antibiotic resistance). The availability of desalinated water (DS) for irrigation expands management options and improves yields while reducing irrigation amounts and salt loading into the soil. Quantitative models are used to delineate trends associated with the long-term use of TE and DS, considering agricultural, hydrological, and environmental aspects. The primary challenges to the sustainability of agroecosystems lie with the hazards of saline and sodic conditions and the unintended consequences for soil hydroecological functioning. Multidisciplinary approaches that combine new scientific know-how with legislative, economic, and societal tools are required to ensure safe and sustainable use of water resources of different qualities. The new scientific know-how should provide quantitative models for integrating key biophysical processes with ecological interactions at appropriate spatial and temporal scales.
ERIC Educational Resources Information Center
Data Quality Campaign, 2014
2014-01-01
High school feedback reports let school and district leaders know where their students go after graduation and how well they are prepared for college and beyond. This roadmap discusses the seven key focus areas the Data Quality Campaign (DQC) recommends states work on to ensure quality implementation of high school feedback reports.
Enabling ICH Q10 Implementation--Part 1. Striving for Excellence by Embracing ICH Q8 and ICH Q9.
Calnan, Nuala; O'Donnell, Kevin; Greene, Anne
2013-01-01
This article is the first in a series that will focus on understanding the implementation essentials necessary to deliver operational excellence through a pharmaceutical quality system (PQS) based on International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) Q10. The authors examine why, despite the fact that the ICH Q10 guideline has been with us since 2008, the transformation of the traditional quality management systems (QMS) in use within the pharmaceutical industry is a work in progress for only a few forward-thinking organisations. Unfortunately, this transformation remains a mere aspiration for the majority of organisations. We explore the apparent lack of progress by the pharmaceutical sector in adopting Six Sigma and related quality management techniques to ensure the availability of high-quality medicines worldwide. The authors propose that the desired progress can be delivered through two key shifts in current practices: by embodying the principles of operational excellence in every aspect of our business, and by learning how to unlock the scientific and tacit knowledge within our organisations. It has been ten years since The Wall Street Journal revealed the pharmaceutical industry's "little secret," describing the perceived level of manufacturing expertise in the industry as lagging far behind that of potato-chip and laundry-soap makers. Would you consider the quality and manufacturing strategies in place today in your organisation to be more efficient and scientifically based than those of 2003? If so, what evidence exists for you to draw any conclusion regarding enhanced performance? Do your current practices drive innovation and facilitate continual improvement, and if so, how?
Ultimately, can you confidently affirm that patient-related risks associated with the product(s) manufactured by your organisation have been reduced due to the quality assurance program now applied within your organisation? This article asks you to question if you have truly embraced Q8(R2), Q9, and Q10, and in doing so can you demonstrate that you have made the necessary changes that would warrant reduced regulatory oversight?
Plagiarism in scientific publishing.
Masic, Izet
2012-12-01
Scientific publishing is the ultimate product of a scientist's work. The number of publications and their citations are measures of a scientist's success, while unpublished research is invisible to the scientific community and, as such, nonexistent. Researchers rely on the work of their predecessors, and the extent to which one scientist's work is used as a source by other authors is the verification of its contribution to the growth of human knowledge. An author who has published an article in a scientific journal cannot publish the same article in another journal with only a few minor adjustments, or without quoting the parts of the first article that are reused. Copyright infringement occurs when the author of a new article, with or without mentioning the original author, uses substantial portions of previously published articles, including tables and figures. Scientific institutions and universities should, in accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practice (GLP), have a center for monitoring, securing, promoting, and developing the quality of research. Establishing rules of good scientific practice, and complying with them, are obligations of every research institution, university, and individual researcher, regardless of the area of science investigated. In this way, internal quality control ensures that a research institution such as a university assumes responsibility for creating an environment that promotes standards of excellence, intellectual honesty, and legality. Although truth should be the aim of scientific research, it is not the guiding principle for all scientists. The best way to reach the truth and to avoid methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined, plagiarism is basically intended to deceive the reader about one's own scientific contribution. 
There is no general regulation of the control of scientific research and of the intellectual honesty of researchers that would be applicable in all situations and in all research institutions. A special form of plagiarism is self-plagiarism. Scientists need to take this form of plagiarism into consideration, though for now there is an attitude that one's own words can be reused without talk of plagiarism. If authors cite their own previously published material, it should be put in quotes and the source in which it was published cited. Science should not be exempt from disclosing and sanctioning plagiarism. In the fight against intellectual dishonesty, ethics education in science has a significant place. A general understanding of ethics in scientific research, at all its stages, must be acquired during undergraduate study and continually reinforced. The ethical aspect of the publishing industry is also important, especially in small and developing economies, because the publisher has an educational role in the development of the scientific community. In this paper the author describes his experiences in discovering plagiarism as Editor-in-Chief of three indexed medical journals, with presentations of several examples of plagiarism recorded in countries in Southeastern Europe.
PLAGIARISM IN SCIENTIFIC PUBLISHING
Masic, Izet
2012-01-01
Scientific publishing is the ultimate product of scientist work. Number of publications and their quoting are measures of scientist success while unpublished researches are invisible to the scientific community, and as such nonexistent. Researchers in their work rely on their predecessors, while the extent of use of one scientist work, as a source for the work of other authors is the verification of its contributions to the growth of human knowledge. If the author has published an article in a scientific journal it cannot publish the article in any other journal h with a few minor adjustments or without quoting parts of the first article, which are used in another article. Copyright infringement occurs when the author of a new article with or without the mentioning the author used substantial portions of previously published articles, including tables and figures. Scientific institutions and universities should,in accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practices (GLP) have a center for monitoring,security, promotion and development of quality research. Establish rules and compliance to rules of good scientific practice are the obligations of each research institutions,universities and every individual-researchers,regardless of which area of science is investigated. In this way, internal quality control ensures that a research institution such as a university, assume responsibility for creating an environment that promotes standards of excellence, intellectual honesty and legality. Although the truth should be the aim of scientific research, it is not guiding fact for all scientists. The best way to reach the truth in its study and to avoid the methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined plagiarism is basically intended to deceive the reader’s own scientific contribution. 
There is no general regulation of the control of scientific research and the intellectual honesty of researchers that would be applicable in all situations and in all research institutions. A special form of plagiarism is self-plagiarism. Scientists need to take this form of plagiarism into consideration, though opinions still differ on how much of one's own previously published text may be reused before it constitutes plagiarism. If authors repeat statements from their own previously published work, these should be placed in quotation marks and the source in which they were published should be cited. Science should not be exempt from disclosing and sanctioning plagiarism. Ethics education has a significant place in the fight against intellectual dishonesty in science. A general understanding of ethics in scientific research, at all its stages, should be acquired during undergraduate study and continually reinforced thereafter. The ethical aspect of the publishing industry is also important, especially in small and developing economies, because the publisher has an educational role in the development of the scientific community. In this paper, the author describes his experience in detecting plagiarism as Editor-in-Chief of three indexed medical journals, presenting several examples of plagiarism recorded in countries of Southeastern Europe. PMID:23378684
USGS Laboratory Review Program Ensures Analytical Quality
Erdmann, David E.
1995-01-01
The USGS operates a review program for laboratories that analyze samples for USGS environmental investigations. This program has been effective in providing QA feedback to laboratories while ensuring that analytical data are consistent, of satisfactory quality, and meet the data objectives of the investigation.
Designing a solution to enable agency-academic scientific collaboration for disasters
Mease, Lindley A.; Gibbs-Plessl, Theodora; Erickson, Ashley; Ludwig, Kristin A.; Reddy, Christopher M.; Lubchenco, Jane
2017-01-01
As large-scale environmental disasters become increasingly frequent and more severe globally, people and organizations that prepare for and respond to these crises need efficient and effective ways to integrate sound science into their decision making. Experience has shown that integrating nongovernmental scientific expertise into disaster decision making can improve the quality of the response, and is most effective if the integration occurs before, during, and after a crisis, not just during a crisis. However, collaboration between academic, government, and industry scientists, decision makers, and responders is frequently difficult because of cultural differences, misaligned incentives, time pressures, and legal constraints. Our study addressed this challenge by using the Deep Change Method, a design methodology developed by Stanford ChangeLabs, which combines human-centered design, systems analysis, and behavioral psychology. We investigated underlying needs and motivations of government agency staff and academic scientists, mapped the root causes underlying the relationship failures between these two communities based on their experiences, and identified leverage points for shifting deeply rooted perceptions that impede collaboration. We found that building trust and creating mutual value between multiple stakeholders before crises occur is likely to increase the effectiveness of problem solving. We propose a solution, the Science Action Network, which is designed to address barriers to scientific collaboration by providing new mechanisms to build and improve trust and communication between government administrators and scientists, industry representatives, and academic scientists. The Science Action Network has the potential to ensure cross-disaster preparedness and science-based decision making through novel partnerships and scientific coordination.
[Food industry funding and epidemiologic research in public health nutrition].
Navarrete-Muñoz, Eva María; Tardón, Adonina; Romaguera, Dora; Martínez-González, Miguel Ángel; Vioque, Jesús
The interests of the food industry in funding nutrition and health research are not limited to promoting scientific advances. Recently, several systematic reviews conducted on the effect of sugar-sweetened beverages on health outcomes have shown some biased conclusions in studies that acknowledge industry sponsorship. In this context, the Nutrition Working Group of the Spanish Epidemiology Society presented a scientific session entitled Food industry and epidemiologic research at its annual meeting. In a round table, four experts in nutrition research presented their points of view about whether the food industry should fund nutrition-related research and the related potential conflicts of interest of the food industry. All the experts agreed not only on defending independence in nutritional epidemiology regarding the design, interpretation and conclusions of their studies but also on the crucial need for guaranteed scientific rigor, scientific quality of the results and measures to protect studies against potential biases related to the conflicts of interest of funding by the food industry. Drs Pérez-Farinós and Romaguera believe that the most effective way to prevent conflicts of interest would be not to allow the food industry to fund nutrition research; Drs Marcos and Martínez-González suggested the need to establish mechanisms and strategies to prevent the potential influences of the food industry in selecting researchers or institutional sponsorship and in the analysis and results of the studies, to ensure maximum independence for researchers, as well as their professional ethics. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Guilyardi, E.
2003-04-01
The European Union's PRISM infrastructure project (PRogram for Integrated earth System Modelling) aims at designing a flexible environment to easily assemble and run Earth System Models (http://prism.enes.org). Europe's widely distributed modelling expertise is both a strength and a challenge. Recognizing this, the PRISM project aims at developing an efficient shared modelling software infrastructure for climate scientists, providing them with an opportunity for greater focus on scientific issues, including the necessary scientific diversity (models and approaches). The proposed PRISM system includes 1) the use - or definition - and promotion of scientific and technical standards to increase component modularity, 2) an end-to-end software environment (coupler, user interface, diagnostics) to launch, monitor and analyze complex Earth System Models built around the existing and future community models, 3) testing and quality standards to ensure HPC performance on a variety of platforms and 4) community wide inputs and requirements capture in all stages of system specifications and design through user/developers meetings, workshops and thematic schools. This science driven project, led by 22 institutes* and started December 1st 2001, benefits from a unique gathering of scientific and technical expertise. More than 30 models (both global and regional) have expressed interest to be part of the PRISM system and 6 types of components have been identified: atmosphere, atmosphere chemistry, land surface, ocean, sea ice and ocean biochemistry. Progress and overall architecture design will be presented. * MPI-Met (Coordinator), KNMI (co-coordinator), MPI-M&D, Met Office, University of Reading, IPSL, Meteo-France, CERFACS, DMI, SMHI, NERSC, ETH Zurich, INGV, MPI-BGC, PIK, ECMWF, UCL-ASTR, NEC, FECIT, SGI, SUN, CCRLE
Recommendations for cervical cancer prevention in Latin America and the Caribbean.
Muñoz, Nubia; Franco, Eduardo L; Herrero, Rolando; Andrus, Jon Kim; de Quadros, Ciro; Goldie, Sue J; Bosch, F Xavier
2008-08-19
Cervical cancer control in the Latin America and Caribbean (LAC) region has been, and remains, a priority and a major public health challenge. It also provides the opportunity for the advancement of research into novel cervical cancer preventative tools, including the use of prophylactic human papillomavirus (HPV) vaccines, HPV-based screening options and low technology visual inspection methods. The challenges for prevention are compounded because cervical cancer cases continue to cluster in low socio-economic and rural populations, thus requiring strong political and social commitments to ensure effective implementation in the region. Although cytology-based screening activities exist in the majority of LAC countries, these have been largely based on opportunistic screening services. Evaluation of the impact of screening is often focused on assessing coverage of the population with Pap smears. However, regardless of the chosen technology, a screening program requires a complex set of activities that must also be of high quality, such as ensuring access of underserved populations to the program, maintaining routine quality controls of the screening procedures and organizing the proper follow-up of women with abnormal screening results. The cost of the HPV vaccine and of the delivery infrastructure required is currently a significant obstacle for widespread introduction that will require collaborative resolve between public health organizations, governments and vaccine manufacturers. It is important to ensure that HPV vaccines are made available to the wider public, not only to those who can afford them. This monograph and the associated regional reports have carefully identified and discussed the many challenges and opportunities to be considered for policy decisions, in particular the complex interplay between vaccination strategies and subsequent screening requirements.
An advanced cost-benefit analysis, using models calibrated to specific countries in the region, presents the range of options and relative costs thus providing evidence-based scientific guidance to governments and providers in the context of a significant and systematic international review effort.
Monk, Johanna M; Rowley, Kevin G; Anderson, Ian Ps
2009-11-20
Priority setting is about making decisions. Key issues faced during priority setting processes include identifying who makes these decisions, who sets the criteria, and who benefits. The paper reviews the literature and history around priority setting in research, particularly in Aboriginal health research. We explore these issues through a case study of the Cooperative Research Centre for Aboriginal Health (CRCAH)'s experience in setting and meeting priorities. Historically, researchers have made decisions about what research gets done. Pressures of growing competition for research funds and an increased public interest in research have led to demands that appropriate consultation with stakeholders is conducted and that research is of benefit to the wider society. Within Australian Aboriginal communities, these demands extend to Aboriginal control of research to ensure that Aboriginal priorities are met. In response to these demands, research priorities are usually agreed in consultation with stakeholders at an institutional level and researchers are asked to develop relevant proposals at a project level. The CRCAH's experience in funding rounds was that scientific merit was given more weight than stakeholders' priorities and did not necessarily result in research that met these priorities. After reviewing these processes in 2004, the CRCAH identified a new facilitated development approach. In this revised approach, the setting of institutional priorities is integrated with the development of projects in a way that ensures the research reflects stakeholder priorities. This process puts emphasis on identifying projects that reflect priorities prior to developing the quality of the research, rather than assessing the relevance to priorities and quality concurrently. Part of the CRCAH approach is the employment of Program Managers who ensure that stakeholder priorities are met in the development of research projects.
This has enabled researchers and stakeholders to come together to collaboratively develop priority-driven research. Involvement by both groups in project development has been found to be essential in making decisions that will lead to robust and useful research.
Processed foods: contributions to nutrition.
Weaver, Connie M; Dwyer, Johanna; Fulgoni, Victor L; King, Janet C; Leveille, Gilbert A; MacDonald, Ruth S; Ordovas, Jose; Schnakenberg, David
2014-06-01
Both fresh and processed foods make up vital parts of the food supply. Processed food contributes to both food security (ensuring that sufficient food is available) and nutrition security (ensuring that food quality meets human nutrient needs). This ASN scientific statement focuses on one aspect of processed foods: their nutritional impacts. Specifically, this scientific statement 1) provides an introduction to how processed foods contribute to the health of populations, 2) analyzes the contribution of processed foods to "nutrients to encourage" and "constituents to limit" in the American diet as recommended by the Dietary Guidelines for Americans, 3) identifies the responsibilities of various stakeholders in improving the American diet, and 4) reviews emerging technologies and the research needed for a better understanding of the role of processed foods in a healthy diet. Analyses of the NHANES 2003-2008 show that processed foods provide both nutrients to encourage and constituents to limit as specified in the 2010 Dietary Guidelines for Americans. Of the nutrients to encourage, processed foods contributed 55% of dietary fiber, 48% of calcium, 43% of potassium, 34% of vitamin D, 64% of iron, 65% of folate, and 46% of vitamin B-12. Of the constituents to limit, processed foods contributed 57% of energy, 52% of saturated fat, 75% of added sugars, and 57% of sodium. Diets are more likely to meet food guidance recommendations if nutrient-dense foods, either processed or not, are selected. Nutrition and food science professionals, the food industry, and other stakeholders can help to improve the diets of Americans by providing a nutritious food supply that is safe, enjoyable, affordable, and sustainable by communicating effectively and accurately with each other and by working together to improve the overall knowledge of consumers. © 2014 American Society for Nutrition.
AGU Celebrates 83 Geophysicists at 2013 Honors Tribute
NASA Astrophysics Data System (ADS)
Paredes, Beth
2014-02-01
The 2013 AGU Honors Tribute, celebrated on Wednesday, 11 December 2013, honored 83 AGU geophysicists for their passion for scientific excellence and outstanding achievements in advancing and communicating science to ensure a better future for humanity. The work conducted by this distinguished group of scientists, leaders, educators, and communicators truly embodies AGU's vision to "advance and communicate science and its power to ensure a sustainable future."
NASA Astrophysics Data System (ADS)
Rezinskikh, V. F.; Grin', E. A.
2013-01-01
The problem of ensuring safe and reliable operation of ageing heat-generating and mechanical equipment at thermal power stations is discussed. It is pointed out that the set of relevant regulatory documents serves as the basis for establishing an efficient equipment diagnostic system. In this connection, updating the existing regulatory documents and imparting the required status to them is one of the top-priority tasks. Carrying out goal-oriented scientific research is a necessary condition for solving this problem, as well as the other questions considered in the paper that are important for ensuring reliable performance of equipment operating over long periods. In recent years, the amount of such work has dropped dramatically, although the need for it is steadily growing. Unbiased assessment of the technical state of equipment that has been in operation for a long period is an important aspect of ensuring reliable and safe operation of thermal power stations. Here, along with the quality of diagnostic activities, monitoring of the technical state on the basis of an analysis of statistical field data and the results of operational checks plays an important role. The need to concentrate efforts in the mentioned problem areas is pointed out, and it is indicated that successful implementation of the outlined measures requires proper organization and efficient operation of a system for managing safety in the electric power industry.
2013-01-01
Background Integrated into the work in health systems strengthening (HSS) is a growing focus on the importance of ensuring quality of the services delivered and systems which support them. Understanding how to define and measure quality in the different key World Health Organization building blocks is critical to providing the information needed to address gaps and identify models for replication. Description of approaches We describe the approaches to defining and improving quality across the five country programs funded through the Doris Duke Charitable Foundation African Health Initiative. While each program has independently developed and implemented country-specific approaches to strengthening health systems, they all included quality of services and systems as a core principle. We describe the differences and similarities across the programs in defining and improving quality as an embedded process essential for HSS to achieve the goal of improved population health. The programs measured quality across most or all of the six WHO building blocks, with specific areas of overlap in improving quality falling into four main categories: 1) defining and measuring quality; 2) ensuring data quality, and building capacity for data use for decision making and response to quality measurements; 3) strengthened supportive supervision and/or mentoring; and 4) operational research to understand the factors associated with observed variation in quality. Conclusions Learning the value and challenges of these approaches to measuring and improving quality across the key components of HSS as the projects continue their work will help inform similar efforts both now and in the future to ensure quality across the critical components of a health system and the impact on population health. PMID:23819662
ERIC Educational Resources Information Center
Gelman, Rochel; Brenneman, Kimberly; Macdonald, Gay; Roman, Moises
2009-01-01
To ensure they're meeting state early learning guidelines for science, preschool educators need fun, age-appropriate, and research-based ways to teach young children about scientific concepts. The basis for the PBS KIDS show "Sid the Science Kid," this teaching resource helps children ages 3-5 investigate their everyday world and develop the…
ERIC Educational Resources Information Center
Sands, Ashley Elizabeth
2017-01-01
Ground-based astronomy sky surveys are massive, decades-long investments in scientific data collection. Stakeholders expect these datasets to retain scientific value well beyond the lifetime of the sky survey. However, the necessary investments in knowledge infrastructures for managing sky survey data are not yet in place to ensure the long-term…
[The 1, 2, 3 of laboratory animal experimentation].
Romero-Fernandez, Wilber; Batista-Castro, Zenia; De Lucca, Marisel; Ruano, Ana; García-Barceló, María; Rivera-Cervantes, Marta; García-Rodríguez, Julio; Sánchez-Mateos, Soledad
2016-06-01
The slow scientific development in Latin America in recent decades has delayed the incorporation of laboratory animal experimentation; however, this situation has started to change. Today, extraordinary scientific progress is evident, which has promoted the introduction and increased use of laboratory animals as an important tool for the advancement of biomedical sciences. In the aftermath of this boom, the need to provide the scientific community with training and guidance in all aspects related to animal experimentation has arisen. It is the responsibility of each country to regulate this practice, for both bioethical and legal reasons, to ensure consideration of the animals' rights and welfare. The following manuscript is the result of papers presented at the International Workshop on Laboratory Animal Testing held at the Technical University of Ambato, Ecuador; it contains information regarding the current state of affairs in laboratory animal testing and emphasizes critical aspects such as main species used, ethical and legal principles, and experimental and alternative designs for animal use. These works aim to ensure good practices that should define scientific work. This document will be relevant to both researchers who aim to newly incorporate animal testing into their research and those who seek to update their knowledge.
Using Data Maturity Metrics to Help Insure Scientific Integrity
NASA Astrophysics Data System (ADS)
Bates, J. J.
2016-12-01
In the past few years, the full and open sharing of data and other artifacts of research in the geophysical sciences has become mandatory. These mandates include commitments from scientific societies, calls from international organizations, and, in the United States, Executive Orders and legislation for open data. Unfortunately, these calls for open data have had, at best, mixed results. Audits by publishers indicate that most authors who do not comply are unaware of the policies or simply ignore them. A recent high-profile publication on global warming resulted in repeated demands for `all data' from Congress and a prolonged back and forth over what `all data' meant. The emerging field of Data Management Maturity is making substantial progress on quantifying the elements needed to fully document geophysical data sets and make them open and accessible. The proposed metrics have been culled from best practices for observational data sets, and these metrics have been applied to a number of geophysical disciplines. A case study applying this metric to climate change indicators will be presented. It is recommended that U.S. agencies and scientific societies formally adopt data maturity metrics to help ensure a consistent approach to open data and scientific integrity.
Report of the DOD-University Forum Working Group on Engineering and Science Education.
1983-07-01
high priority to strengthening our national base of scientific and technical personnel. That included immediate emphasis on training people in the... [Table-of-contents fragments recovered from the report: DOD Requirements for Civilian Engineering and Scientific Personnel; DOD Requirements for Military Engineering and Scientific Personnel; The Problem is Quality; The Quality of Engineering and Scientific Personnel in...]
Preclinical Development of Cell-Based Products: a European Regulatory Science Perspective.
McBlane, James W; Phul, Parvinder; Sharpe, Michaela
2018-06-25
This article describes preclinical development of cell-based medicinal products for European markets and discusses European regulatory mechanisms open to developers to aid successful product development. Cell-based medicinal products are diverse, including cells that are autologous or allogeneic, genetically modified or not, or expanded ex vivo, and applied systemically or to an anatomical site different from that of their origin; comments applicable to one product may not be applicable to others, so bespoke development is needed for all elements - quality, preclinical and clinical. After establishing how the product is produced, the developer must demonstrate the product's potential for therapeutic efficacy and then its safety. This includes understanding biodistribution, persistence and toxicity, including the potential for malignant transformation. These elements need to be considered in the context of the intended clinical development. This article describes regulatory mechanisms available to developers to support product development that aim to resolve scientific issues prior to marketing authorization application, to enable patients to have faster access to the product than would otherwise be the case. Developers are encouraged to be aware of both the scientific issues and regulatory mechanisms to ensure patients can be supplied with these products.
Aligning the 3Rs with new paradigms in the safety assessment of chemicals.
Burden, Natalie; Mahony, Catherine; Müller, Boris P; Terry, Claire; Westmoreland, Carl; Kimber, Ian
2015-04-01
There are currently several factors driving a move away from the reliance on in vivo toxicity testing for the purposes of chemical safety assessment. Progress has started to be made in the development and validation of non-animal methods. However, recent advances in the biosciences provide exciting opportunities to accelerate this process and to ensure that the alternative paradigms for hazard identification and risk assessment deliver lasting 3Rs benefits, whilst improving the quality and relevance of safety assessment. The NC3Rs, a UK-based scientific organisation which supports the development and application of novel 3Rs techniques and approaches, held a workshop recently which brought together over 20 international experts in the field of chemical safety assessment. The aim of this workshop was to review the current scientific, technical and regulatory landscapes, and to identify key opportunities towards reaching these goals. Here, we consider areas where further strategic investment will need to be focused if significant impact on 3Rs is to be matched with improved safety science, and why the timing is right for the field to work together towards an environment where we no longer rely on whole animal data for the accurate safety assessment of chemicals.
ISBT 128 Standard for Coding Medical Products of Human Origin
Ashford, Paul; Delgado, Matthew
2017-01-01
Background ISBT 128 is an international standard for the terminology, coding, labeling, and identification of medical products of human origin (MPHO). Full implementation of ISBT 128 improves traceability, transparency, vigilance and surveillance, and interoperability. Methods ICCBBA maintains the ISBT 128 standard through the activities of a network of expert volunteers, including representatives from professional scientific societies, governments and users, to standardize and maintain MPHO identification. These individuals are organized into Technical Advisory Groups and work within a structured framework as part of a quality-controlled standards development process. Results The extensive involvement of international scientific and professional societies in the development of the standard has ensured that ISBT 128 has gained widespread recognition. The user community has developed confidence in the ability of the standard to adapt to new developments in their fields of interest. The standard is fully compatible with Single European Code requirements for tissues and cells and is utilized by many European tissue establishments. ISBT 128's flexibility and robustness has allowed for expansions into subject areas such as cellular therapy, regenerative medicine, and tissue banking. Conclusion ISBT 128 is the internationally recognized standard for coding MPHO and has gained widespread use globally throughout the past two decades. PMID:29344013
Jassam, Nuthar; Yundt-Pacheco, John; Jansen, Rob; Thomas, Annette; Barth, Julian H
2013-08-01
The implementation of national and international guidelines is beginning to standardise clinical practice. However, since many guidelines have decision limits based on laboratory tests, there is an urgent need to ensure that different laboratories obtain the same analytical result on any sample. A scientifically based quality control process will be a prerequisite for providing the level of analytical performance that supports evidence-based guidelines and the movement of patients across boundaries while maintaining standardised outcomes. We discuss the findings of a pilot study performed to assess UK clinical laboratories' readiness to work to higher-grade quality specifications, such as those based on biological variation. Internal quality control (IQC) data for HbA1c, glucose, creatinine, cholesterol and high density lipoprotein (HDL)-cholesterol were collected from UK laboratories participating in the Bio-Rad Unity QC programme. The median coefficient of variation (CV%) of the participating laboratories was evaluated against the CV% based on biological variation. Except for creatinine, the other four analytes showed variable degrees of compliance with the biological variation-based quality specifications. More than 75% of the laboratories met the biological variation-based quality specifications for glucose, cholesterol and HDL-cholesterol. Slightly over 50% of the laboratories met the analytical goal for HbA1c. Only one analyte (cholesterol) had a performance achieving the higher quality specifications consistent with 5σ. Our data from IQC do not consistently demonstrate that the results from clinical laboratories meet evidence-based quality specifications. Therefore, we propose that a graded scale of quality specifications may be needed at this stage.
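The comparison described in this abstract can be sketched in a few lines using two standard laboratory quality control formulas: the desirable imprecision goal (0.5 × within-subject biological CV%) and the sigma metric, σ = (TEa − |bias|) / CV. All numeric values below are illustrative assumptions for a hypothetical laboratory, not data from the study.

```python
# Sketch: check a laboratory's observed internal quality control (IQC) CV%
# against a quality specification derived from biological variation, and
# compute the sigma metric. Numbers are illustrative, not study data.

def desirable_cv(cv_within_subject: float) -> float:
    """Desirable imprecision goal: 0.5 x within-subject biological CV%."""
    return 0.5 * cv_within_subject

def sigma_metric(tea: float, bias: float, cv: float) -> float:
    """Sigma metric = (allowable total error TEa - |bias|) / CV%."""
    return (tea - abs(bias)) / cv

# Hypothetical cholesterol figures: within-subject biological CV ~5.4%,
# allowable total error TEa ~9%, observed laboratory bias 1%, CV 1.5%.
goal = desirable_cv(5.4)      # desirable imprecision goal in CV%
lab_cv = 1.5                  # observed IQC CV%
meets_spec = lab_cv <= goal   # does the lab meet the biological-variation goal?
sigma = sigma_metric(tea=9.0, bias=1.0, cv=lab_cv)
```

Under these assumed figures, the hypothetical laboratory meets the desirable imprecision goal and performs above 5σ, mirroring the study's observation that cholesterol was the only analyte reaching that level.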
Optimizing staffing, quality, and cost in home healthcare nursing: theory synthesis.
Park, Claire Su-Yeon
2017-08-01
To propose a new theory pinpointing the optimal nurse staffing threshold delivering the maximum quality of care relative to attendant costs in home health care. Little knowledge exists on the theoretical foundation addressing the inter-relationship among quality of care, nurse staffing, and cost. Theory synthesis. Cochrane Library, PubMed, CINAHL, EBSCOhost Web and Web of Science (25 February - 26 April 2013; 20 January - 22 March 2015). Most of the existing theories/models lacked the detail necessary to explain the relationship among quality of care, nurse staffing and cost. Two notable exceptions are: 'Production Function for Staffing and Quality in Nursing Homes,' which describes an S-shaped trajectory between quality of care and nurse staffing and 'Thirty-day Survival Isoquant and Estimated Costs According to the Nurse Staff Mix,' which depicts a positive quadric relationship between nurse staffing and cost according to quality of care. A synthesis of these theories led to an innovative multi-dimensional econometric theory helping to determine the maximum quality of care for patients while simultaneously delivering nurse staffing in the most cost-effective way. The theory-driven threshold, navigated by Mathematical Programming based on the Duality Theorem in Mathematical Economics, will help nurse executives defend sufficient nurse staffing with scientific justification to ensure optimal patient care; help stakeholders set an evidence-based reasonable economical goal; and facilitate patient-centred decision-making in choosing the institution which delivers the best quality of care. A new theory to determine the optimum nurse staffing maximizing quality of care relative to cost was proposed. © 2017 The Author. Journal of Advanced Nursing © John Wiley & Sons Ltd.
Farnbach, Sara; Eades, Anne-Maree; Fernando, Jamie K; Gwynn, Josephine D; Glozier, Nick; Hackett, Maree L
2017-10-11
Objectives and importance of the study: Primary health care research focused on Aboriginal and Torres Strait Islander (Indigenous) people is needed to ensure that key frontline services provide evidence based and culturally appropriate care. We systematically reviewed the published primary health care literature to identify research designs, processes and outcomes, and assess the scientific quality of research focused on social and emotional wellbeing. This will inform future research to improve evidence based, culturally appropriate primary health care. Systematic review in accordance with PRISMA and MOOSE guidelines. Four databases and one Indigenous-specific project website were searched for qualitative, quantitative and mixed-method published research. Studies that were conducted in primary health care services and focused on the social and emotional wellbeing of Indigenous people were included. Scientific quality was assessed using risk-of-bias assessment tools that were modified to meet our aims. We assessed community acceptance by identifying the involvement of community governance structures and representation during research development, conduct and reporting. Data were extracted using standard forms developed for this review. We included 32 articles, which reported on 25 studies. Qualitative and mixed methods were used in 18 studies. Twelve articles were judged as high or unclear risk of bias, four as moderate and five as low risk of bias. Another four studies were not able to be assessed as they did not align with the risk-of-bias tools. Of the five articles judged as low risk of bias, two also had high community acceptance and both of these were qualitative. One used a phenomenological approach and the other combined participatory action research with a social-ecological perspective and incorporated 'two-way learning' principles. Of the 16 studies where a primary outcome was identified, eight aimed to identify perceptions or experiences. 
The remaining studies assessed resources, or evaluated services, interventions, programs or policies. We were unable to identify primary outcomes in eight studies. Conducting Indigenous-focused primary health care research that is scientifically robust, culturally appropriate and produces community-level outcomes is challenging. We suggest that research teams use participatory, culturally sensitive approaches and collaborate closely to plan and implement high-quality research that incorporates local perspectives. Research should result in beneficial outcomes for the communities involved.
Pollinator Protection Strategic Plan
Developed by EPA, this plan ensures that pesticide risk assessments and risk management decisions use the best available information and scientific methods, and that pollinator protection is fully evaluated when making registration decisions.
Strengthening Transparency in Regulatory Science
Where available and appropriate, EPA will use peer-reviewed information, standardized test methods, consistent data evaluation procedures, and good laboratory practices to ensure transparent, understandable, and reproducible scientific assessments.
15 CFR 995.25 - Quality management system.
Code of Federal Regulations, 2012 CFR
2012-01-01
... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...
15 CFR 995.25 - Quality management system.
Code of Federal Regulations, 2014 CFR
2014-01-01
... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...
15 CFR 995.25 - Quality management system.
Code of Federal Regulations, 2011 CFR
2011-01-01
... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...
15 CFR 995.25 - Quality management system.
Code of Federal Regulations, 2013 CFR
2013-01-01
... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...
Analysis on Key Points of Construction and Management of Municipal Landscape Engineering
NASA Astrophysics Data System (ADS)
Liang, Mingxia; Fei, Cheng
2018-02-01
At present, China is making great efforts to promote the construction of ecological civilization and to advance ecological protection and environmental construction, which is of important practical significance for maintaining the country's ecological balance and environmental quality. As people's awareness of environmental protection gradually improves and the level of urbanization continues to rise, higher requirements are being placed on urban greening. In the process of urban landscape construction, the rational planning of urban landscaping draws on knowledge from many disciplines. Greening work should fully consider China's system of urban development and construction, and landscaping projects should be designed on the basis of urban development plans and long-term planning. In addition, the traditional layout of the urban area and its physical geography must also be considered in order to enhance the objectivity and scientific soundness of the urban landscape. Therefore, effective management of municipal landscape engineering is of great practical significance for ensuring the quality of landscaping.
China’s Air Quality and Respiratory Disease Mortality Based on the Spatial Panel Model
Cao, Qilong; Liang, Ying; Niu, Xueting
2017-01-01
Background: Air pollution has become an important factor restricting China’s economic development and has brought with it a series of social problems, including the impact of air pollution on residents’ health, which is a topical issue in China. Methods: Taking this spatial imbalance into account, the paper is based on a spatial panel data model. PM2.5 and respiratory disease mortality in 31 Chinese provinces from 2004 to 2008 are taken as the main variables to study the spatial effect and impact of air quality on respiratory disease mortality on a large scale. Results: It was found that there is a spatial correlation between the mortality of respiratory diseases across Chinese provinces. This spatial correlation can be explained by the spatial effect of PM2.5 pollution after controlling for other variables. Conclusions: Compared with the traditional non-spatial model, the spatial model better describes the spatial relationship between variables, ensuring that the conclusions are scientifically sound and that the spatial effects between variables can be measured. PMID:28927016
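The spatial correlation this abstract describes is typically quantified with a global statistic such as Moran's I, computed over a spatial weights matrix that encodes which regions neighbour each other. A minimal pure-Python sketch follows; the four-region weights matrix and the values are hypothetical illustrations, not the study's provincial data:

```python
def morans_i(values, weights):
    """Global Moran's I for observations x_i under spatial weights w_ij.

    weights: n x n nested list; w[i][j] > 0 if regions i and j are
    neighbours, 0 otherwise (the diagonal should be 0).
    Positive I indicates that similar values cluster in space.
    """
    n = len(values)
    mean = sum(values) / n
    dev = [x - mean for x in values]                 # deviations from the mean
    s0 = sum(sum(row) for row in weights)            # total weight
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Hypothetical example: four regions along a line, adjacent pairs are neighbours.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
x = [1.0, 2.0, 3.0, 4.0]  # smoothly increasing values cluster spatially
print(morans_i(x, w))      # → 0.3333333333333333 (positive autocorrelation)
```

A full spatial panel model additionally regresses the outcome on covariates plus a spatially lagged term, but the weights-matrix machinery is the same.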
NASA's Postdoctoral Fellowship Programs
NASA Astrophysics Data System (ADS)
Beichman, Charles A.; Gelino, D. M.; Allen, R. J.; Prestwich, A. H.
2013-01-01
The three named fellowships --- the Einstein, Hubble and Sagan programs --- are among the most prestigious postdoctoral positions in astronomy. Their policies are closely coordinated to ensure the highest scientific quality, the broadest possible access for a diverse community of recent PhD graduates, and flexibility in completing the 3-year appointments in light of individual personal circumstances. We will discuss practical details related to "family-friendly" best practices such as no-cost extensions and the ability to change host institutions in response to "two-body problems." We note, however, that the terms of the NASA fellowships are such that fellows become employees of their host institutions, which set specific policies on issues such as parental leave. We look forward to participating in the discussion at this special session and to conveying to NASA any suggestions for improving the fellowship program.
Young, Jasmine Y; Westbrook, John D; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R; Berrisford, John M; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter M S; Hudson, Brian P; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R; Shao, Chenghua; Swaminathan, G Jawahar; Tan, Lihua; Ulrich, Eldon L; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A; Quesada, Martha; Kleywegt, Gerard J; Berman, Helen M; Markley, John L; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K
2017-03-07
OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the PDB archive, has been developed as a global collaboration by the worldwide PDB (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. Published by Elsevier Ltd.
Diel cycling of trace elements in streams draining mineralized areas: a review
Gammons, Christopher H.; Nimick, David A.; Parker, Stephen R.
2015-01-01
Many trace elements exhibit persistent diel, or 24-h, concentration cycles in streams draining mineralized areas. These cycles can be caused by various physical and biogeochemical mechanisms including streamflow variation, photosynthesis and respiration, as well as reactions involving photochemistry, adsorption and desorption, mineral precipitation and dissolution, and plant assimilation. Iron is the primary trace element that exhibits diel cycling in acidic streams. In contrast, many cationic and anionic trace elements exhibit diel cycling in near-neutral and alkaline streams. Maximum reported changes in concentration for these diel cycles have been as much as a factor of 10 (988% change in Zn concentration over a 24-h period). Thus, monitoring and scientific studies must account for diel trace-element cycling to ensure that water-quality data collected in streams appropriately represent the conditions intended to be studied.
A User's Guide to the Encyclopedia of DNA Elements (ENCODE)
2011-01-01
The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome. PMID:21526222
Position of the American Dietetic Association: Functional foods.
Hasler, Clare M; Bloch, Abby S; Thomson, Cynthia A; Enrione, Evelyn; Manning, Carolyn
2004-05-01
It is the position of the American Dietetic Association that functional foods, including whole foods and fortified, enriched, or enhanced foods, have a potentially beneficial effect on health when consumed as part of a varied diet on a regular basis, at effective levels. The Association supports research to define further the health benefits and risks of individual functional foods and their physiologically active components. Dietetics professionals will continue to work with the food industry, the government, the scientific community, and the media to ensure that the public has accurate information regarding this emerging area of food and nutrition science. Knowledge of the role of physiologically active food components, from both phytochemicals and zoochemicals, has changed the role of diet in health. Functional foods have evolved as food and nutrition science has advanced beyond the treatment of deficiency syndromes to reduction of disease risk. This position reviews the definition of functional foods, their regulation, and the scientific evidence supporting this emerging area of food and nutrition. Foods can no longer be evaluated only in terms of macronutrient and micronutrient content. Analyzing the content of other physiologically active components and evaluating their role in health promotion will be necessary. The availability of health-promoting functional foods in the US diet has the potential to help ensure a healthier population. However, each functional food should be evaluated on the basis of scientific evidence to ensure appropriate integration into a varied diet.
77 FR 22324 - Scientific Information Request on Treatment of Tinnitus
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-13
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Scientific Information Request on Treatment of Tinnitus AGENCY: Agency for Healthcare Research and Quality (AHRQ), HHS. ACTION: Request for scientific information submissions. SUMMARY: The Agency for Healthcare Research and...
NASA Technical Reports Server (NTRS)
Macatangay, Ariel
2009-01-01
Crew: approximately 53% of the metabolic load; ammonia is a product of protein metabolism, and limiting its production by external regulation is NOT possible. Payloads: a potential source, chiefly scientific experiments; a thorough safety review ensures sufficient levels of containment.
Ensuring Data Quality in Extension Research and Evaluation Studies
ERIC Educational Resources Information Center
Radhakrishna, Rama; Tobin, Daniel; Brennan, Mark; Thomson, Joan
2012-01-01
This article presents a checklist as a guide for Extension professionals to use in research and evaluation studies they carry out. A total of 40 statements grouped under eight data quality components--relevance, objectivity, validity, reliability, integrity, generalizability, completeness, and utility--are identified to ensure that research…
76 FR 68373 - Proposed Revision to Vintage Date Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-04
... foreign, while still ensuring that consumers are provided with adequate information as to the identity and... labels provide the consumer with adequate information as to the identity and quality of the product. The... mandate to ensure that consumers have adequate information about the quality and identity of the product...
Reference materials for cellular therapeutics.
Bravery, Christopher A; French, Anna
2014-09-01
The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Moll, F H; Rathert, P; Fangerau, H
2013-12-01
Quality criteria are necessary for the evaluation and rating of the scientific collections of medical associations, and for their further development, in order to make the case for them within the scientific community and with other relevant actors.
78 FR 40147 - Scientific Information Request on Vitamin D and Calcium
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Scientific Information Request on Vitamin D and Calcium AGENCY: Agency for Healthcare Research and Quality (AHRQ), HHS. ACTION: Request for Scientific Information Submissions. SUMMARY: The Agency for Healthcare Research and...
78 FR 42952 - Scientific Information Request on Vitamin D and Calcium
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Scientific Information Request on Vitamin D and Calcium AGENCY: Agency for Healthcare Research and Quality (AHRQ), HHS. ACTION: Request for scientific information submissions. SUMMARY: The Agency for Healthcare Research and...
Laboratory Experimental Design for a Glycomic Study.
Ugrina, Ivo; Campbell, Harry; Vučković, Frano
2017-01-01
Proper attention to study design beforehand, careful conduct of procedures during, and appropriate inference from results after scientific experiments are important in all scientific studies to ensure that valid, and sometimes definitive, conclusions can be drawn. The design of experiments, also called experimental design, addresses the challenge of structuring and conducting experiments so as to answer the questions of interest as clearly and efficiently as possible.
ERIC Educational Resources Information Center
Bjørkvold, Tuva; Blikstad-Balas, Marte
2018-01-01
All scientists depend on both reading and writing to do their scientific work. It is of paramount importance to ensure that students have a relevant repertoire of practices they can employ when facing scientific content inside and outside the school context. The present study reports on students in seventh grade acting as researchers. Over an…
76 FR 50453 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-15
..., tourism, and recreational and economic activities and opportunities in the Commonwealth of the Northern... enforcement necessary to ensure that scientific exploration and research, tourism, and recreational and...
How to Apply for Protection Time Graphic
We will review insect repellent products that voluntarily apply to use the repellency awareness graphic to ensure that their scientific data meet current testing protocols and standard evaluation processes.
Thermal integration of Spacelab experiments
NASA Technical Reports Server (NTRS)
Patterson, W. C.; Hopson, G. D.
1978-01-01
The method of thermally integrating the experiments for Spacelab is discussed. The scientific payload consists of a combination of European and United States sponsored experiments located in the module as well as on a single Spacelab pallet. The thermal integration must accommodate the individual experiment requirements while ensuring that the total payload is within the Spacelab Environmental Control System (ECS) resource capability. An integrated thermal/ECS analysis of the module and pallet is performed in concert with the mission timeline to ensure that the agreed-upon experiment requirements are accommodated and that the total payload is within the Spacelab ECS resources.
Data stewardship - a fundamental part of the scientific method (Invited)
NASA Astrophysics Data System (ADS)
Foster, C.; Ross, J.; Wyborn, L. A.
2013-12-01
This paper emphasises the importance of data stewardship as a fundamental part of the scientific method, and the need to effect cultural change to ensure engagement by earth scientists. It is differentiated from the science of data stewardship per se. Earth System science generates vast quantities of data, and in the past, data analysis has been constrained by compute power, such that sub-sampling of data often provided the only way to reach an outcome. This is analogous to Kahneman's System 1 heuristic, with its simplistic and often erroneous outcomes. The development of HPC has liberated earth sciences such that the complexity and heterogeneity of natural systems can be utilised in modelling at any scale, global, or regional, or local; for example, movement of crustal fluids. Paradoxically, now that compute power is available, it is the stewardship of the data that is presenting the main challenges. There is a wide spectrum of issues: from effectively handling and accessing acquired data volumes [e.g. satellite feeds per day/hour]; through agreed taxonomy to effect machine to machine analyses; to idiosyncratic approaches by individual scientists. Except for the latter, most agree that data stewardship is essential. Indeed it is an essential part of the science workflow. As science struggles to engage and inform on issues of community importance, such as shale gas and fraccing, all parties must have equal access to data used for decision making; without that, there will be no social licence to operate or indeed access to additional science funding (Heidorn, 2008). The stewardship of scientific data is an essential part of the science process; but often it is regarded, wrongly, as entirely in the domain of data custodians or stewards. 
Geoscience Australia has developed a set of six principles that apply to all science activities within the agency: relevance to Government; collaborative science; quality science; transparent science; communicated science; and sustained science capability. Every principle includes data stewardship: this is to effect cultural change at both collective and individual levels to ensure that our science outcomes and technical advice are effective for the Government and community.
CCCT - Patient Advocate Steering Committee
The Patient Advocate Steering Committee (PASC) works to ensure advocates involved with the Scientific Steering Committees (SSCs) are completely integrated in the development, implementation, and monitoring of clinical trials within those groups.
... level based on scientific research evidence. Adequate Intake (AI): This level is established when there is not ... that is thought to ensure enough nutrition. Infants (AI) 0 to 6 months old: 0.18 grams ...
Systems and processes that ensure high quality care.
Bassett, Sally; Westmore, Kathryn
2012-10-01
This is the second in a series of articles examining the components of good corporate governance. It considers how the structures and processes for quality governance can affect an organisation's ability to be assured about the quality of care. Complex information systems and procedures can lead to poor quality care, but sound structures and processes alone are insufficient to ensure good governance, and behavioural factors play a significant part in making sure that staff are enabled to provide good quality care. The next article in this series looks at how the information reporting of an organisation can affect its governance.
Pottegård, Anton; Haastrup, Maija Bruun; Stage, Tore Bjerregaard; Hansen, Morten Rix; Larsen, Kasper Søltoft; Meegaard, Peter Martin; Meegaard, Line Haugaard Vrdlovec; Horneberg, Henrik; Gils, Charlotte; Dideriksen, Dorthe; Aagaard, Lise; Almarsdottir, Anna Birna; Hallas, Jesper; Damkier, Per
2014-12-16
To describe the development of acronym use across five major medical specialties and to evaluate the technical and aesthetic quality of the acronyms. Acronyms obtained through a literature search of Pubmed.gov followed by a standardised assessment of acronym quality (BEAUTY and CHEATING criteria). Randomised controlled trials within psychiatry, rheumatology, pulmonary medicine, endocrinology, and cardiology published between 2000 and 2012. Prevalence proportion of acronyms and composite quality score for acronyms over time. 14,965 publications were identified, of which 18.3% (n=2737) contained an acronym in the title. Acronym use was more common among cardiological studies than among the other four medical specialties (40% v 8-15% in 2012, P<0.001). Except for within cardiology, the prevalence of acronyms increased over time, with the average prevalence proportion among the remaining four specialties increasing from 4.0% to 12.4% from 2000 to 2012 (P<0.001). The median combined acronym quality score decreased significantly over the study period (P<0.001), from a median 9.25 in 2000 to 5.50 in 2012. From 2000 to 2012 the prevalence of acronyms in trial reports increased, coinciding with a substantial decrease in the technical and aesthetic quality of the acronyms. Strict enforcement of current guidelines on acronym construction by journal editors is necessary to ensure the proper use of acronyms in the future. © Pottegård et al 2014.
Ruchholtz, S; Kühne, C A; Siebert, H
2007-04-01
The quality of care in Germany for seriously injured patients varies greatly among individual hospitals due to geographic variations among States and differences in resource allocation and treatment concepts. To assure and enhance treatment quality, it seems sensible to establish a structured, quality-assured network of clinics which participate in the management of seriously injured patients according to different specified assignments. The conditions necessary for this type of network on a regional scale, and for the clinics charged with the care of the seriously injured, were summarized in the White Paper entitled "Management of the Seriously Injured--Recommendations for the Structure and Organization of Facilities in Germany for the Treatment of Seriously Injured Persons." The goal of this action is to ensure that every seriously injured person in Germany receives the best possible round-the-clock care in adherence to standardized quality criteria. This requires specialized expertise and the willingness of all involved parties--care providers, cost bearers, and hospital owners--to cooperate in further improving existing treatment concepts. As a logical consequence of many years of experience and scientific knowledge, the German Association of Trauma Surgery has developed a concept for establishing a regional trauma network of clinics, adapted to local conditions, for the management of seriously injured patients. The participating facilities assume different responsibilities in the network depending on their equipment and structure. This article describes the individual steps toward establishing and organizing such a network.
ENHANCING SCIENTIFIC COLLABORATION THROUGH QUALITY ASSURANCE
The basic features of the Quality Assurance Program have been in existence since the early 1980's, but this poster will highlight some topics that have emerged more recently, in particular the Agency's laboratory competency policy, the information quality guidelines, and scientif...
ERIC Educational Resources Information Center
Lee, John
2002-01-01
The problem of ensuring quality in mass education systems is as old as the systems themselves. Responses to this problem reflect the political and cultural organisation of different nation states. In the USA the problem has to be dealt with at a local level. The federal government is very restricted in powers in the field of education and social…
Quality assessment of the visits of pharmaceutical company representatives to hospital pharmacists.
Fonzo-Christe, Caroline; Herrmann, François; Bonnabry, Pascal
2005-11-19
To evaluate whether the quality of pharmaceutical company representatives' (PCRs) visits to hospital pharmacists can be improved by written communication of the results of an evaluation of their visits. Pilot study with prospective evaluation of overall visit quality, strength of request for adding drugs to the hospital formulary, and the scientific quality of product presentations, using a standardized form. Two one-year study periods (59 vs. 61 visits) separated by the intervention (global results of the first period sent to each drug company). No difference was observed between the two periods in overall visit quality (VAS 0 = null, 10 = excellent: mean 4.7 (2.1 SD) vs. 5.2 (2.1)) or in strength of request for adding drugs to the hospital formulary (VAS 0 = null, 10 = extreme: 7.0 (2.6) vs. 7.2 (2.7)). The clarity and scientific value of product presentations and the scientific value of responses were better during the second study period, a sign of quality improvement. This study suggests that systematic quality evaluation of PCR visits and communication of the results to drug companies may improve the scientific quality of product presentations.
Poor reporting of scientific leadership information in clinical trial registers.
Sekeres, Melanie; Gold, Jennifer L; Chan, An-Wen; Lexchin, Joel; Moher, David; Van Laethem, Marleen L P; Maskalyk, James; Ferris, Lorraine; Taback, Nathan; Rochon, Paula A
2008-02-20
In September 2004, the International Committee of Medical Journal Editors (ICMJE) issued a Statement requiring that all clinical trials be registered at inception in a public register in order to be considered for publication. The World Health Organization (WHO) and ICMJE have identified 20 items that should be provided before a trial is considered registered, including contact information. Identifying those scientifically responsible for trial conduct increases accountability. The objective is to examine the proportion of registered clinical trials providing valid scientific leadership information. We reviewed clinical trial entries listing Canadian investigators in the two largest international and public trial registers, the International Standard Randomized Controlled Trial Number (ISRCTN) register, and ClinicalTrials.gov. The main outcome measures were the proportion of clinical trials reporting valid contact information for the trials' Principal Investigator (PI)/Co-ordinating Investigator/Study Chair/Site PI, and trial e-mail contact address, stratified by funding source, recruiting status, and register. A total of 1388 entries (142 from ISRCTN and 1246 from ClinicalTrials.gov) comprised our sample. We found non-compliance with mandatory registration requirements regarding scientific leadership and trial contact information. Non-industry and partial industry funded trials were significantly more likely to identify the individual responsible for scientific leadership (OR = 259, 95% CI: 95-701) and to provide a contact e-mail address (OR = 9.6, 95% CI: 6.6-14) than were solely industry funded trials. Despite the requirements set by WHO and ICMJE, data on scientific leadership and contact e-mail addresses are frequently omitted from clinical trials registered in the two leading public clinical trial registers. 
To promote accountability and transparency in clinical trials research, public clinical trial registers should adequately monitor trial registration to ensure completion of the mandatory contact information fields identifying scientific leadership.
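The odds ratios reported in this abstract come from standard 2x2-table arithmetic; the point estimate and a Wald-type confidence interval can be sketched as follows. The counts below are hypothetical illustrations, not the study's actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI from a 2x2 table:
        a = exposed with outcome,    b = exposed without outcome,
        c = unexposed with outcome,  d = unexposed without outcome.
    Uses the Wald interval on log(OR); assumes all cells are non-zero.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: non-industry trials naming a responsible investigator
# (90 of 100) vs. industry-funded trials doing so (30 of 100).
or_, lo, hi = odds_ratio_ci(90, 10, 30, 70)
print(or_, lo, hi)  # OR = 21.0, with an approximate 95% CI around (9.6, 45.8)
```

The very wide intervals in the abstract (e.g. 95% CI: 95-701) reflect the same log-scale arithmetic applied to highly unbalanced tables.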
21 CFR 212.20 - What activities must I perform to ensure drug quality?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 4 2010-04-01 2010-04-01 false What activities must I perform to ensure drug quality? 212.20 Section 212.20 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR POSITRON EMISSION TOMOGRAPHY...
21 CFR 212.20 - What activities must I perform to ensure drug quality?
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false What activities must I perform to ensure drug quality? 212.20 Section 212.20 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR POSITRON EMISSION TOMOGRAPHY...
Resources for Ensuring Quality School-to-Work Opportunities for Young Women. Draft.
ERIC Educational Resources Information Center
Wider Opportunities for Women, Inc., Washington, DC.
This annotated bibliography lists 49 resources for ensuring high quality school-to-work opportunities for young women. These resources are grouped into 10 categories: print material for middle and high school girls; videos for middle and high school girls; administrator/school guides; curriculum guides/resources for teachers; resources for…
36 CFR 1260.38 - How does the NDC ensure the quality of declassification reviews?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false How does the NDC ensure the quality of declassification reviews? 1260.38 Section 1260.38 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION DECLASSIFICATION DECLASSIFICATION OF NATIONAL SECURITY INFORMATION The...
36 CFR 1260.38 - How does the NDC ensure the quality of declassification reviews?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false How does the NDC ensure the quality of declassification reviews? 1260.38 Section 1260.38 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION DECLASSIFICATION DECLASSIFICATION OF NATIONAL SECURITY INFORMATION The...
Making the Grade: How Boards Can Ensure Academic Quality. Second Edition
ERIC Educational Resources Information Center
Ewell, Peter
2012-01-01
"Making the Grade: How Boards Can Ensure Academic Quality"--popularly referred to as "The Little Yellow Book" by boards, faculty, provosts, and assessment specialists--provides clear guidance for the board's role in the most important "business" of academe: educating students. As public calls for greater accountability skyrocket, this new edition…
Galipeau, James; Moher, David; Campbell, Craig; Hendry, Paul; Cameron, D William; Palepu, Anita; Hébert, Paul C
2015-03-01
To investigate whether training in writing for scholarly publication, journal editing, or manuscript peer review effectively improves educational outcomes related to the quality of health research reporting. We searched MEDLINE, Embase, ERIC, PsycINFO, and the Cochrane Library for comparative studies of formalized, a priori-developed training programs in writing for scholarly publication, journal editing, or manuscript peer review. Comparators included the following: (1) before and after administration of a training program, (2) between two or more training programs, or (3) between a training program and any other (or no) intervention(s). Outcomes included any measure of effectiveness of training. Eighteen reports of 17 studies were included. Twelve studies focused on writing for publication, five on peer review, and none fit our criteria for journal editing. Included studies were generally small and inconclusive regarding the effects of training of authors, peer reviewers, and editors on educational outcomes related to improving the quality of health research. Studies were also of questionable validity and susceptible to misinterpretation because of their risk of bias. This review highlights the gaps in our knowledge of how to enhance and ensure the scientific quality of research output for authors, peer reviewers, and journal editors.
NASA's Productivity Improvement and Quality Enhancement Initiatives
NASA Technical Reports Server (NTRS)
1984-01-01
The National Aeronautics and Space Administration celebrated its 25th Anniversary in 1983 at the Air and Space Museum in Washington, DC, with President Reagan in attendance. We look back on the accomplishments of these twenty-five years with pride in our missions and our people. NASA captured the world's imagination during the days of the Apollo mission. So much so, that we now talk about the Apollo era. In the 1970s, we moved into the Space Transportation business and in the 1990s, we look forward to having a manned Space Station. Each succeeding mission has presented its own challenge in terms of technology and resources. This is especially true today, when we are being asked to do more with less. To ensure that NASA continues to be a productive and quality conscious agency, one of our highest Agency goals is leadership in the development and application of practices which contribute to high productivity and quality. Our people are our greatest competitive strength, and this country has a solid scientific and engineering foundation. Traditionally we have spent more money on research and development than Japan and Europe combined, and we are the source of most of this century's significant innovations. We should build on this solid base and use it more effectively.
Han, Haihong; Li, Ning; Li, Yepeng; Fu, Ping; Yu, Dongmin; Li, Zhigang; Du, Chunming; Guo, Yunchang
2015-01-01
To test the aerobic plate count examination capability of microbiology laboratories, to ensure the accuracy and comparability of quantitative bacteria examination results, and to improve the quality of monitoring. Four aerobic plate count piece samples of different concentrations were prepared, denoted I, II, III and IV. After homogeneity and stability tests, the samples were delivered to monitoring institutions. The results for the I, II and III samples were logarithmically transformed and evaluated with the Z-score method using the robust average and standard deviation. The results for the IV samples were evaluated as "satisfactory" when reported as < 10 CFU/piece or as "not satisfactory" otherwise. The Pearson χ2 test was used to analyze the ratio results. A total of 309 monitoring institutions, 99.04% of the total number, reported their results. 271 institutions reported a satisfactory result, for a satisfactory rate of 87.70%. There was no statistical difference in the satisfactory rates for the I, II and III samples, which were 81.52%, 88.30% and 91.40% respectively. The satisfactory rate for the IV samples was 93.33%. There was no statistical difference in satisfactory rates between provincial and municipal CDCs. The quality control program provided scientific evidence that the aerobic plate count capability of the laboratories meets the requirements of the monitoring tasks.
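The robust Z-score evaluation described above can be sketched as follows. This is an illustrative implementation only: the scaled-MAD robust estimators and the |Z| ≤ 2 cut-off are conventional proficiency-testing choices, not details reported by the study.

```python
import math
from statistics import median

def robust_stats(values):
    """Robust location/scale: the median and scaled MAD (1.4826 * MAD),
    a common stand-in for the robust average and standard deviation."""
    m = median(values)
    mad = median(abs(v - m) for v in values)
    return m, 1.4826 * mad

def z_scores(counts):
    """Z-scores of log10-transformed plate counts against robust stats."""
    logs = [math.log10(c) for c in counts]
    center, scale = robust_stats(logs)
    return [(x - center) / scale for x in logs]

def grade(z):
    """Conventional proficiency-testing cut-offs (|Z| <= 2 satisfactory)."""
    if abs(z) <= 2:
        return "satisfactory"
    if abs(z) < 3:
        return "questionable"
    return "unsatisfactory"
```

With six laboratories reporting counts of a nominally identical sample, an outlying laboratory (e.g. reporting 10000 CFU where the others report ~100) receives a large |Z| and is graded unsatisfactory.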
Quality Assurance Program for Molecular Medicine Laboratories
Hajia, M; Safadel, N; Samiee, S Mirab; Dahim, P; Anjarani, S; Nafisi, N; Sohrabi, A; Rafiee, M; Sabzavi, F; Entekhabi, B
2013-01-01
Background: Molecular diagnostic methods have played, and continue to play, a critical role in clinical laboratories in recent years. Standardization is therefore an evolutionary process that needs to be upgraded with increasing scientific knowledge and improvement of instruments and techniques. The aim of this study was to design a quality assurance program in order to provide similar conditions for all medical laboratories engaged in molecular testing. Methods: We designed a plan covering all four elements: required space conditions, equipment, training, and basic guidelines. The necessary guidelines were prepared and confirmed by the specific committee launched at the Health Reference Laboratory. Results: Several workshops were also held for medical laboratory directors and staff, quality control managers of molecular companies, and directors and nominees from universities. Accreditation of equipment and molecular materials proceeded in parallel with the rest of the program. We are now moving to accredit medical laboratories and to evaluate the success of the program. Conclusion: Accreditation of medical laboratories will succeed if its basic elements are provided in advance. Professional practice guidelines, training, and accreditation of molecular materials and equipment ensured that laboratories are aware of best practices, proper interpretation, limitations of techniques, and technical issues. Active external auditing can now improve applied laboratory conditions toward the defined standard level. PMID:23865028
Hoyt, David B; Schneidman, Diane S
2014-01-01
Throughout its 100-year history of working to ensure that surgical patients receive safe, high-quality, cost-effective care, the American College of Surgeons has adhered to four key principles: (1) Set the standards to identify and set the highest clinical standards based on the collection of outcomes data and other scientific evidence that can be customized to each patient's condition so that surgeons can offer the right care, at the right time, in the right setting. (2) Build the right infrastructure to provide the highest quality care with surgical facilities having in place appropriate and adequate staffing levels, a reasonable mix of specialists, and the right equipment. Checklists and health information technology, such as the electronic health record, are components of this infrastructure. (3) Collect robust data so that surgical decisions are based on clinical data drawn from medical charts that track patients after discharge from the hospital. Data should be risk-adjusted and collected in nationally benchmarked registries to allow institutions to compare their care with other providers. (4) Verify processes and infrastructure by having an external authority periodically affirm that the right systems are in place at health care institutions, that outcomes are being measured and benchmarked, and that hospitals and providers are proactively responding to these findings.
Chantler, Tracey; Cheah, Phaik Yeong; Miiro, George; Hantrakum, Viriya; Nanvubya, Annet; Ayuo, Elizabeth; Kivaya, Esther; Kidola, Jeremiah; Kaleebu, Pontiano; Parker, Michael; Njuguna, Patricia; Ashley, Elizabeth; Guerin, Philippe J; Lang, Trudie
2014-01-01
Objectives To evaluate and determine the value of monitoring models developed by the Mahidol Oxford Tropical Research Unit and the East African Consortium for Clinical Research, consider how this can be measured and explore monitors’ and investigators’ experiences of and views about the nature, purpose and practice of monitoring. Research design A case study approach was used within the context of participatory action research because one of the aims was to guide and improve practice. 34 interviews, five focus groups and observations of monitoring practice were conducted. Setting and participants Fieldwork occurred in the places where the monitoring models are coordinated and applied in Thailand, Cambodia, Uganda and Kenya. Participants included those coordinating the monitoring schemes, monitors, senior investigators and research staff. Analysis Transcribed textual data from field notes, interviews and focus groups was imported into a qualitative data software program (NVIVO V. 10) and analysed inductively and thematically by a qualitative researcher. The initial coding framework was reviewed internally and two main categories emerged from the subsequent interrogation of the data. Results The categories that were identified related to the conceptual framing and nature of monitoring, and the practice of monitoring, including relational factors. Particular emphasis was given to the value of a scientific and cooperative style of monitoring as a means of enhancing data quality, trust and transparency. In terms of practice the primary purpose of monitoring was defined as improving the conduct of health research and increasing the capacity of researchers and trial sites. Conclusions The models studied utilise internal and network wide expertise to improve the ethics and quality of clinical research. They demonstrate how monitoring can be a scientific and constructive exercise rather than a threatening process. 
The value of cooperative relations needs to be given more emphasis in monitoring activities, which seek to ensure that research protects human rights and produces reliable data. PMID:24534257
Quality improvement in the use of medications through a drug use evaluation service.
Stevenson, J G; Bakst, C M; Zaran, F K; Rybak, M J; Smolarek, R T; Alexander, M R
1992-10-01
Continuous quality improvement methods have the potential to improve processes that cross several disciplines. The medication system is one in which coordination of activities between physicians, pharmacists, and nurses is essential for optimal therapy to occur. DUE services can play an important role in helping to ensure that patients receive high-quality pharmaceutical care. It is necessary for pharmacy managers to review the structure, goals, and outcomes of their DUE programs to ensure that they are consistent with a philosophy of continuous improvement in the quality of drug therapy.
Communicating the Results and Activities of the U.S. Climate Change Science Program
NASA Astrophysics Data System (ADS)
Chatterjee, K.; Parker, K.
2004-12-01
The Climate Change Science Program (CCSP) has a responsibility for credible and effective communications on issues related to climate variability and climate change science. As an essential part of its mission and responsibilities, the CCSP aims to enhance the quality of public discussion by stressing openness and transparency in its scientific research processes and results, and ensuring the widespread availability of credible, science-based information. The CCSP and individual federal agencies generate substantial amounts of authoritative scientific information on climate variability and change. Research findings are generally well reported in the scientific literature, but relevant aspects of these findings need to be reported in formats suitable for use by diverse audiences whose understanding and familiarity with climate change science issues vary. To further its commitment to the effective communication of climate change science information, the CCSP has established the Communications Interagency Working Group, which has produced an implementation plan for Climate Change communication, aimed at achieving the following goals: * Disseminate the results of CCSP activities credibly and effectively * Make CCSP science findings and products easily available to a diverse set of audiences. In addition to CCSP efforts, the individual federal agencies that comprise CCSP disseminate science-based climate information through their agency networks. The agencies of the CCSP are the Departments of Agriculture, Commerce, Defense, Energy, Health and Human Services, Interior, State, and Transportation and the U.S. EPA, NASA, NSF, Smithsonian Institution, and USAID.
NASA Astrophysics Data System (ADS)
Noelle, A.; Hartmann, G. K.; Martin-Torres, F. J.
2010-05-01
The science-softCon "UV/Vis+ Spectra Data Base" is a non-profit project established in August 2000 and operated in accordance with the "Open Access" definitions and regulations of the CSPR Assessment Panel on Scientific Data and Information (International Council for Science, 2004, http://www.science-softcon.de/spectra/cspr.pdf, ICSU Report of the CSPR Assessment Panel on Data and Information; ISBN 0-930357-60-4). The on-line database currently contains about 5600 spectra (from low to very high resolution, at different temperatures and pressures) and datasheets (metadata) for about 850 substances. Additional spectra and datasheets will be added continuously. In addition, more than 250 links to freely available on-line original publications are provided. The interdisciplinary nature of this photochemistry database fosters interaction between different research areas, making it an excellent tool for scientists working in fields such as atmospheric chemistry, astrophysics, agriculture, analytical chemistry, environmental chemistry, medicine, and remote sensing. To ensure the high quality standard of the fast-growing UV/Vis+ Spectra Data Base, an international "Scientific Advisory Group" (SAG) was established in 2004. Because maintenance of the database is so important, the support of the scientific community is crucial. We would therefore like to encourage all scientists to support this data compilation project through the provision of new or missing spectral data and information.
Seagrass-Watch: Engaging Torres Strait Islanders in marine habitat monitoring
NASA Astrophysics Data System (ADS)
Mellors, Jane E.; McKenzie, Len J.; Coles, Robert G.
2008-09-01
Involvement in scientifically structured habitat monitoring is a relatively new concept to the peoples of Torres Strait. The approach we used was to focus on awareness, and to build the capacity of groups to participate using Seagrass-Watch as the vehicle to provide education and training in monitoring marine ecosystems. The project successfully delivered quality scientifically rigorous baseline information on the seasonality of seagrasses in the Torres Strait, a first for this region. Eight seagrass species were identified across the monitoring sites. Seagrass cover varied within and between years. Preliminary evidence indicated that drivers for seagrass variability were climate related. Generally, seagrass abundance increased during the north-west monsoon (Kuki), possibly a consequence of elevated nutrients, lower tidal exposure times, less wind, and higher air temperatures. Low seagrass abundance coincided with the presence of greater winds and longer periods of exposure at low tides during the south-east trade wind season (Sager). No seasonal patterns were apparent when frequency of disturbance from high sedimentation and human impacts was high. Seagrass-Watch has been incorporated into the Thursday Island High School's Marine Studies Unit ensuring continuity of monitoring. The students, teachers, and other interested individuals involved in Seagrass-Watch have mastered the necessary scientific procedures to monitor seagrass meadows, and developed skills in coordinating a monitoring program and skills in mentoring younger students. This has increased the participants' self-esteem and confidence, and given them an insight into how they may participate in the future management of their sea country.
NASA Astrophysics Data System (ADS)
Ek, M. B.; Xia, Y.; Ford, T.; Wu, Y.; Quiring, S. M.
2015-12-01
The North American Soil Moisture Database (NASMD) was initiated in 2011 to provide support for developing climate forecasting tools, calibrating land surface models and validating satellite-derived soil moisture algorithms. The NASMD has collected data from over 30 soil moisture observation networks providing millions of in situ soil moisture observations in all 50 states as well as Canada and Mexico. It is recognized that the quality of measured soil moisture in NASMD is highly variable due to the diversity of climatological conditions, land cover, soil texture, and topographies of the stations and differences in measurement devices (e.g., sensors) and installation. It is also recognized that error, inaccuracy and imprecision in the data set can have significant impacts on practical operations and scientific studies. Therefore, developing an appropriate quality control procedure is essential to ensure the data is of the best quality. In this study, an automated quality control approach is developed using the North American Land Data Assimilation System phase 2 (NLDAS-2) Noah soil porosity, soil temperature, and fraction of liquid and total soil moisture to flag erroneous and/or spurious measurements. Overall results show that this approach is able to flag unreasonable values when the soil is partially frozen. A validation example using NLDAS-2 multiple model soil moisture products at the 20 cm soil layer showed that the quality control procedure had a significant positive impact in Alabama, North Carolina, and West Texas. It had a greater impact in colder regions, particularly during spring and autumn. Over 433 NASMD stations have been quality controlled using the methodology proposed in this study, and the algorithm will be implemented to control data quality from the other ~1,200 NASMD stations in the near future.
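A minimal sketch of the kind of flagging step described above, in which an observation is compared against model-derived porosity and soil temperature. The specific checks, thresholds, and function names here are illustrative assumptions, not the NASMD/NLDAS-2 algorithm itself.

```python
def qc_flag(obs_sm, porosity, soil_temp_c):
    """Flag a soil moisture observation (m3/m3) against model-derived
    porosity and soil temperature. Returns a list of flags, or ["pass"].
    Thresholds are illustrative."""
    flags = []
    if obs_sm < 0.0:
        flags.append("negative")          # physically impossible
    if obs_sm > porosity:
        flags.append("exceeds_porosity")  # more water than pore space
    if soil_temp_c <= 0.0:
        flags.append("frozen_soil")       # liquid fraction unreliable
    return flags or ["pass"]
```

Applied station by station, such checks can flag spurious values when, for instance, the soil is partially frozen, as the abstract describes.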
Houston, Lauren; Probst, Yasmine; Martin, Allison
2018-05-18
Data audits within clinical settings are extensively used as a major strategy to identify errors, monitor study operations and ensure high-quality data. However, clinical trial guidelines are non-specific in regards to recommended frequency, timing and nature of data audits. The absence of a well-defined data quality definition and method to measure error undermines the reliability of data quality assessment. This review aimed to assess the variability of source data verification (SDV) auditing methods to monitor data quality in a clinical research setting. The scientific databases MEDLINE, Scopus and Science Direct were searched for English language publications, with no date limits applied. Studies were considered if they included data from a clinical trial or clinical research setting and measured and/or reported data quality using a SDV auditing method. In total 15 publications were included. The nature and extent of SDV audit methods in the articles varied widely, depending upon the complexity of the source document, type of study, variables measured (primary or secondary), data audit proportion (3-100%) and collection frequency (6-24 months). Methods for coding, classifying and calculating error were also inconsistent. Transcription errors and inexperienced personnel were the main source of reported error. Repeated SDV audits using the same dataset demonstrated ∼40% improvement in data accuracy and completeness over time. No description was given in regards to what determines poor data quality in clinical trials. A wide range of SDV auditing methods are reported in the published literature though no uniform SDV auditing method could be determined for "best practice" in clinical trials. Published audit methodology articles are warranted for the development of a standardised SDV auditing method to monitor data quality in clinical research settings.
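As a rough illustration of the quantities discussed above, a per-field discrepancy rate and a systematic audit sample might be computed as follows. The review found no standard method, so this is only one plausible form; the coding of errors and the sampling scheme are assumptions.

```python
def sdv_error_rate(fields_audited, fields_discrepant):
    """Simple per-field discrepancy rate from a source data verification
    audit: discrepant fields / fields checked."""
    if fields_audited == 0:
        raise ValueError("no fields audited")
    return fields_discrepant / fields_audited

def audit_sample(records, proportion):
    """Select every k-th record for SDV given an audit proportion
    (e.g. 0.10 for a 10% audit); systematic sampling is one option."""
    step = max(1, round(1 / proportion))
    return records[::step]
```

For example, 5 discrepant fields out of 200 verified gives a 2.5% error rate; a 10% proportion over 100 records selects every tenth record.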
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab(TradeMark) (MATrix LABoratory) is a numerical computation and simulation tool used by thousands of scientists and engineers in many countries. MatLab performs purely numerical calculations and can be used as a glorified calculator or as an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionality is available within the MatLab environment through the "symbolic" toolbox, a feature similar to computer algebra programs such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. MatLab in its interpreted programming language form (command interface) is similar to well-known programming languages such as C/C++, and supports data structures and cell arrays for defining classes in object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods to ensure that the resulting solutions are incorporated into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques for intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.
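The workflow sketched in the abstract (read a data file, organize it in tabular form, summarize, export for use by other programs) can be illustrated outside MatLab as well. A minimal Python sketch follows; the file layout and column names are assumptions for illustration, not part of the presentation described above.

```python
import csv
from statistics import mean

def summarize_csv(path, value_col):
    """Read a CSV data file into tabular rows and compute a simple
    summary of one numeric column -- an example of intermediate-level
    data processing."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    values = [float(r[value_col]) for r in rows]
    return {"n": len(values), "mean": mean(values),
            "min": min(values), "max": max(values)}

def export_for_excel(summary, out_path):
    """Write the summary back out as CSV so spreadsheet programs
    (e.g. Microsoft Excel) can import it."""
    with open(out_path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(summary.keys())
        w.writerow(summary.values())
```

The same read/organize/summarize/export pattern maps directly onto MatLab's file I/O and table functions.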
Data-driven Ontology Development: A Case Study at NASA's Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Hertz, J.; Huffer, E.; Kusterer, J.
2012-12-01
Well-founded ontologies are key to enabling transformative semantic technologies and accelerating scientific research. One example is semantically enabled search and discovery, making scientific data accessible and more understandable by accurately modeling a complex domain. The ontology creation process remains a challenge for many anxious to pursue semantic technologies. The key may be that the creation process -- whether formal, community-based, automated or semi-automated -- should encompass not only a foundational core and supplemental resources but also a focus on the purpose or mission the ontology is created to support. Are there tools or processes to de-mystify, assess or enhance the resulting ontology? We suggest that comparison and analysis of a domain-focused ontology can be made using text engineering tools for information extraction, tokenizers, named entity transducers and others. The results are analyzed to ensure the ontology reflects the core purpose of the domain's mission and that the ontology integrates and describes the supporting data in the language of the domain - how the science is analyzed and discussed among all users of the data. Commonalities and relationships among domain resources describing the Clouds and Earth's Radiant Energy (CERES) Bi-Directional Scan (BDS) datasets from NASA's Atmospheric Science Data Center are compared. The domain resources include: a formal ontology created for CERES; scientific works such as papers, conference proceedings and notes; information extracted from the datasets (i.e., header metadata); and BDS scientific documentation (Algorithm Theoretical Basis Documents, collection guides, data quality summaries and others). These resources are analyzed using the open source software General Architecture for Text Engineering, a mature framework for computational tasks involving human language.
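One very small illustration of comparing an ontology against the domain's own documents, in the spirit of the text-engineering analysis described above. GATE itself provides far richer machinery (tokenizers, named entity transducers); the label-coverage metric here is a hypothetical simplification.

```python
import re
from collections import Counter

def term_frequencies(text):
    """Tokenize a document and count candidate terms -- a toy stand-in
    for the information-extraction step."""
    tokens = re.findall(r"[a-z][a-z0-9_-]+", text.lower())
    return Counter(tokens)

def coverage(ontology_labels, document_text):
    """Fraction of ontology class labels that appear in the domain's
    own documents -- a crude check that the ontology speaks the
    language of the domain."""
    freqs = term_frequencies(document_text)
    labels = [label.lower() for label in ontology_labels]
    hits = sum(1 for label in labels if freqs[label] > 0)
    return hits / len(labels)
```

A low coverage score would suggest the ontology's vocabulary diverges from how the science is actually analyzed and discussed in the supporting documentation.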
Steinwachs, Donald; Allen, Jennifer Dacey; Barlow, William Eric; Duncan, R Paul; Egede, Leonard E; Friedman, Lawrence S; Keating, Nancy L; Kim, Paula; Lave, Judith R; LaVeist, Thomas A; Ness, Roberta B; Optican, Robert J; Virnig, Beth A
2010-02-04
To provide health care providers, patients, and the general public with a responsible assessment of currently available data on enhancing use and quality of colorectal cancer screening. A non-DHHS, nonadvocate 13-member panel representing the fields of cancer surveillance, health services research, community-based research, informed decision-making, access to care, health care policy, health communication, health economics, health disparities, epidemiology, statistics, thoracic radiology, internal medicine, gastroenterology, public health, end-of-life care, and a public representative. In addition, 20 experts from pertinent fields presented data to the panel and conference audience. Presentations by experts and a systematic review of the literature prepared by the RTI International-University of North Carolina Evidence-based Practice Center, through the Agency for Healthcare Research and Quality. Scientific evidence was given precedence over anecdotal experience. The panel drafted its statement based on scientific evidence presented in open forum and on published scientific literature. The draft statement was presented on the final day of the conference and circulated to the audience for comment. The panel released a revised statement later that day at http://consensus.nih.gov. This statement is an independent report of the panel and is not a policy statement of the NIH or the Federal Government. The panel found that despite substantial progress toward higher colorectal cancer screening rates nationally, screening rates fall short of desirable levels. Targeted initiatives to improve screening rates and reduce disparities in underscreened communities and population subgroups could further reduce colorectal cancer morbidity and mortality. This could be achieved by utilizing the full range of screening options and evidence-based interventions for increasing screening rates. 
With additional investments in quality monitoring, Americans could be assured that all screening achieves high rates of cancer prevention and early detection. To close the gap in screening, this report identifies the following priority areas for implementation and research to enhance the use and quality of colorectal cancer screening:
• Eliminate financial barriers to colorectal cancer screening and appropriate follow up.
• Widely implement interventions that have proven effective at increasing colorectal cancer screening, including patient reminder systems and one-on-one interactions with providers, educators, or navigators.
• Conduct research to assess the effectiveness of tailoring programs to match the characteristics and preferences of target population groups to increase colorectal cancer screening.
• Implement systems to ensure appropriate follow-up of positive colorectal cancer screening results.
• Develop systems to assure high quality of colorectal cancer screening programs.
• Conduct studies to determine the comparative effectiveness of the various colorectal cancer screening methods in usual practice settings.
Accelerating Research Impact in a Learning Health Care System
Elwy, A. Rani; Sales, Anne E.; Atkins, David
2017-01-01
Background: Since 1998, the Veterans Health Administration (VHA) Quality Enhancement Research Initiative (QUERI) has supported more rapid implementation of research into clinical practice. Objectives: With the passage of the Veterans Access, Choice and Accountability Act of 2014 (Choice Act), QUERI further evolved to support VHA’s transformation into a Learning Health Care System by aligning science with clinical priority goals based on a strategic planning process and alignment of funding priorities with updated VHA priority goals in response to the Choice Act. Design: QUERI updated its strategic goals in response to independent assessments mandated by the Choice Act that recommended VHA reduce variation in care by providing a clear path to implement best practices. Specifically, QUERI updated its application process to ensure its centers (Programs) focus on cross-cutting VHA priorities and specify roadmaps for implementation of research-informed practices across different settings. QUERI also increased funding for scientific evaluations of the Choice Act and other policies in response to Commission on Care recommendations. Results: QUERI’s national network of Programs deploys effective practices using implementation strategies across different settings. QUERI Choice Act evaluations informed the law’s further implementation, setting the stage for additional rigorous national evaluations of other VHA programs and policies including community provider networks. Conclusions: Grounded in implementation science and evidence-based policy, QUERI serves as an example of how to operationalize core components of a Learning Health Care System, notably through rigorous evaluation and scientific testing of implementation strategies to ultimately reduce variation in quality and improve overall population health. PMID:27997456
EPA works with states and other key stakeholders, through sound scientific research and regulation, to help ensure that natural gas extraction from shale formations (also called fracking or hydrofracking) does not harm public health and the environment.
End-user perspective of low-cost sensors for outdoor air pollution monitoring.
Rai, Aakash C; Kumar, Prashant; Pilla, Francesco; Skouloudis, Andreas N; Di Sabatino, Silvana; Ratti, Carlo; Yasar, Ansar; Rickerby, David
2017-12-31
Low-cost sensor technology can potentially revolutionise the area of air pollution monitoring by providing high-density spatiotemporal pollution data. Such data can be utilised for supplementing traditional pollution monitoring, improving exposure estimates, and raising community awareness about air pollution. However, data quality remains a major concern that hinders the widespread adoption of low-cost sensor technology. Unreliable data may mislead unsuspecting users and potentially lead to alarming consequences such as reporting acceptable air pollutant levels when they are above the limits deemed safe for human health. This article provides scientific guidance to the end-users for effectively deploying low-cost sensors for monitoring air pollution and people's exposure, while ensuring reasonable data quality. We review the performance characteristics of several low-cost particle and gas monitoring sensors and provide recommendations to end-users for making proper sensor selection by summarizing the capabilities and limitations of such sensors. The challenges, best practices, and future outlook for effectively deploying low-cost sensors, and maintaining data quality are also discussed. For data quality assurance, a two-stage sensor calibration process is recommended, which includes laboratory calibration under controlled conditions by the manufacturer supplemented with routine calibration checks performed by the end-user under final deployment conditions. For large sensor networks where routine calibration checks are impractical, statistical techniques for data quality assurance should be utilised. Further advancements and adoption of sophisticated mathematical and statistical techniques for sensor calibration, fault detection, and data quality assurance can indeed help to realise the promised benefits of a low-cost air pollution sensor network. Copyright © 2017 Elsevier B.V. All rights reserved.
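The two-stage calibration recommended above can be illustrated with a minimal sketch of the end-user stage: a linear correction (gain and offset) fitted by least squares against co-located reference measurements taken under deployment conditions. The variable names and sample values are illustrative, not from the paper.

```python
# Minimal sketch of an end-user calibration check for a low-cost sensor:
# fit a linear correction (gain and offset) against co-located reference
# readings collected under deployment conditions. Values are illustrative.

def fit_linear_calibration(raw, reference):
    """Least-squares fit: reference ~ gain * raw + offset."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    sxx = sum((x - mean_x) ** 2 for x in raw)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    gain = sxy / sxx
    offset = mean_y - gain * mean_x
    return gain, offset

def apply_calibration(raw, gain, offset):
    """Convert raw sensor readings into corrected concentration estimates."""
    return [gain * x + offset for x in raw]

# Hypothetical co-location data: raw sensor output vs reference PM2.5 (ug/m3)
raw = [10.0, 20.0, 30.0, 40.0]
ref = [12.0, 21.0, 30.0, 39.0]
gain, offset = fit_linear_calibration(raw, ref)
corrected = apply_calibration(raw, gain, offset)
```

In practice the fit residuals would also be inspected, and the correction re-checked routinely, since low-cost sensor response can drift with temperature, humidity, and age.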
Ha, Jongsik
2014-01-01
Objectives: South Korea's air quality standards are insufficient in terms of establishing a procedure for their management. The current system lacks a proper decision-making process, and prior evidence is not considered. The purpose of this study is to propose a measure for establishing atmospheric environmental standards in South Korea that takes into consideration the health of its residents. Methods: In this paper, the National Ambient Air Quality Standards (NAAQS) of the US was examined in order to suggest ways to establish air quality standards in South Korea that take health effects into account. Up-to-date research on the health effects of air pollution was then reviewed, and tools were proposed to utilize the key results. This was done in an effort to ensure the reliability of the standards with regard to public health. Results: This study showed that scientific research on the health effects of air pollution and the methodology used in the research have contributed significantly to establishing air quality standards. However, as the standards are legally binding, the procedure should take into account the effects on other sectors. Realistically speaking, it is impossible to establish standards that protect an entire population from air pollution. Instead, it is necessary to find a balance between what should be done and what can be done. Conclusions: Therefore, establishing air quality standards should be done as part of an evidence-based policy that identifies the health effects of air pollution and takes into consideration political, economic, and social contexts. PMID:25300297
Walsh, Stephen Joseph; Meador, Michael R.
1998-01-01
Fish community structure is characterized by the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) Program as part of a perennial, multidisciplinary approach to evaluating the physical, chemical, and biological conditions of the Nation's water resources. The objective of quality assurance and quality control of fish taxonomic data that are collected as part of the NAWQA Program is to establish uniform guidelines and protocols for the identification, processing, and archiving of fish specimens to ensure that accurate and reliable data are collected. Study unit biologists, collaborating with regional biologists and fish taxonomic specialists, prepare a pre-sampling study plan that includes a preliminary faunal list and identification of an ichthyological curation center for receiving preserved fish specimens. Problematic taxonomic issues and protected taxa also are identified in the study plan, and collecting permits are obtained in advance of sampling activities. Taxonomic specialists are selected to identify fish specimens in the field and to assist in determining what fish specimens should be sacrificed, fixed, and preserved for laboratory identification, independent taxonomic verification, and long-term storage in reference or voucher collections. Quantitative and qualitative sampling of fishes follows standard methods previously established for the NAWQA Program. Common ichthyological techniques are used to process samples in the field and prepare fish specimens to be returned to the laboratory or sent to an institutional repository. Taxonomic identifications are reported by using a standardized list of scientific names that provides nomenclatural consistency and uniformity across study units.
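One element of the protocol above, reporting identifications against a standardized list of scientific names, can be sketched as a simple verification step. This is an illustrative sketch, not NAWQA software; the species names are examples chosen for illustration.

```python
# Illustrative sketch (not the NAWQA program's software): verifying field
# identifications against a standardized list of scientific names, and
# flagging unrecognized names for taxonomic review before data are archived.

STANDARD_NAMES = {
    "Lepomis macrochirus",    # bluegill
    "Micropterus salmoides",  # largemouth bass
    "Cyprinella venusta",     # blacktail shiner
}

def check_identifications(reported_names):
    """Partition reported names into accepted names and names needing review."""
    accepted, needs_review = [], []
    for name in reported_names:
        (accepted if name in STANDARD_NAMES else needs_review).append(name)
    return accepted, needs_review

accepted, review = check_identifications(
    ["Lepomis macrochirus", "Lepomis macrochira"]  # second is a misspelling
)
```

Flagged names would then go to a taxonomic specialist, together with any preserved voucher specimens, rather than being silently corrected.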
Getting Data Right - and Righteous to Improve Hispanic or Latino Health.
Rodríguez-Lainz, Alfonso; McDonald, Mariana; Penman-Aguilar, Ana; Barrett, Drue H
2016-01-01
Hispanics or Latinos constitute the largest racial/ethnic minority in the United States. They are also a very diverse population. Latino/Hispanic health varies significantly for subgroups defined by national origin, race, primary language, and migration-related factors (place of birth, immigration status, years of residence in the United States). Most Hispanics speak Spanish at home, and one-third have limited English proficiency (LEP). There is growing awareness of the importance for population health monitoring programs of collecting those data elements (Hispanic subgroup, primary language, and migration-related factors) that better capture Hispanics' diversity, and of providing language assistance (translation of data collection forms, interpreters) to ensure meaningful inclusion of all Latinos/Hispanics in national health monitoring. There are strong ethical and scientific reasons for such expansion of data collection by public health entities. First, expanded data elements can help identify otherwise hidden health disparities in Hispanic subpopulations. This may promote a more just and equitable distribution of health resources to underserved populations. Second, language access is needed to ensure fair and legal treatment of LEP individuals in federally supported data collection activities. Finally, these strategies are likely to improve the quality and representativeness of data needed to monitor and address the health of all Latino/Hispanic populations in the United States.
Getting Data Right — and Righteous to Improve Hispanic or Latino Health
Rodríguez-Lainz, Alfonso; McDonald, Mariana; Penman-Aguilar, Ana; Barrett, Drue H.
2017-01-01
Hispanics or Latinos constitute the largest racial/ethnic minority in the United States. They are also a very diverse population. Latino/Hispanic health varies significantly for subgroups defined by national origin, race, primary language, and migration-related factors (place of birth, immigration status, years of residence in the United States). Most Hispanics speak Spanish at home, and one-third have limited English proficiency (LEP). There is growing awareness of the importance for population health monitoring programs of collecting those data elements (Hispanic subgroup, primary language, and migration-related factors) that better capture Hispanics' diversity, and of providing language assistance (translation of data collection forms, interpreters) to ensure meaningful inclusion of all Latinos/Hispanics in national health monitoring. There are strong ethical and scientific reasons for such expansion of data collection by public health entities. First, expanded data elements can help identify otherwise hidden health disparities in Hispanic subpopulations. This may promote a more just and equitable distribution of health resources to underserved populations. Second, language access is needed to ensure fair and legal treatment of LEP individuals in federally supported data collection activities. Finally, these strategies are likely to improve the quality and representativeness of data needed to monitor and address the health of all Latino/Hispanic populations in the United States. PMID:29416934
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
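The task-parallel pattern described above can be sketched in miniature. The CASCADE stack uses MPI; this sketch substitutes Python's standard `concurrent.futures` to show the same shape: independent per-file analysis tasks fanned out to workers and the results gathered afterwards. The file names and the trivial analysis routine are hypothetical placeholders.

```python
# Task-parallel fan-out/gather sketch in the spirit of the workflow above.
# CASCADE uses MPI for this; here Python's concurrent.futures stands in to
# show the pattern: each input file is an independent task, workers process
# tasks concurrently, and results are gathered for downstream publication.

from concurrent.futures import ThreadPoolExecutor

def analyze(path):
    # Placeholder for a real analysis routine (e.g., computing an
    # extreme-value statistic from one simulation output file).
    return path, len(path)

def run_task_parallel(paths, workers=2):
    """Map the analysis over all input files and gather results by path."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(analyze, paths))

results = run_task_parallel(["run01.nc", "run02.nc"])  # hypothetical files
```

For CPU-bound scientific analysis, process- or MPI-based workers would replace the thread pool, but the fan-out/gather structure is the same.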
NASA Astrophysics Data System (ADS)
Werkheiser, W. H.
2016-12-01
10 Years of Scientific Integrity Policy at the U.S. Geological Survey The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.
NASA Astrophysics Data System (ADS)
Werkheiser, W. H.
2017-12-01
10 Years of Scientific Integrity Policy at the U.S. Geological Survey The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-15
... ``criteria pollutants.'' The air quality criteria are to ``accurately reflect the latest scientific knowledge... criteria. The revised air quality criteria reflect advances in scientific knowledge on the effects of the... Related to the Review of the Secondary National Ambient Air Quality Standards for Oxides of Nitrogen and...
Developmental Scientist | Center for Cancer Research
PROGRAM DESCRIPTION Within the Leidos Biomedical Research Inc.’s Clinical Research Directorate, the Clinical Monitoring Research Program (CMRP) provides high-quality comprehensive and strategic operational support to the high-profile domestic and international clinical research initiatives of the National Cancer Institute (NCI), National Institute of Allergy and Infectious Diseases (NIAID), Clinical Center (CC), National Institute of Heart, Lung and Blood Institute (NHLBI), National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), National Center for Advancing Translational Sciences (NCATS), National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute of Mental Health (NIMH). Since its inception in 2001, CMRP’s ability to provide rapid responses, high-quality solutions, and to recruit and retain experts with a variety of backgrounds to meet the growing research portfolios of NCI, NIAID, CC, NHLBI, NIAMS, NCATS, NINDS, and NIMH has led to the considerable expansion of the program and its repertoire of support services. CMRP’s support services are strategically aligned with the program’s mission to provide comprehensive, dedicated support to assist National Institutes of Health researchers in providing the highest quality of clinical research in compliance with applicable regulations and guidelines, maintaining data integrity, and protecting human subjects. For the scientific advancement of clinical research, CMRP services include comprehensive clinical trials, regulatory, pharmacovigilance, protocol navigation and development, and programmatic and project management support for facilitating the conduct of 400+ Phase I, II, and III domestic and international trials on a yearly basis. 
These trials investigate the prevention, diagnosis, treatment of, and therapies for cancer, influenza, HIV, and other infectious diseases and viruses such as hepatitis C, tuberculosis, malaria, and Ebola virus; heart, lung, and blood diseases and conditions; parasitic infections; rheumatic and inflammatory diseases; and rare and neglected diseases. CMRP’s collaborative approach to clinical research and the expertise and dedication of staff to the continuation and success of the program’s mission has contributed to improving the overall standards of public health on a global scale. The Clinical Monitoring Research Program (CMRP) provides quality assurance and regulatory compliance support to the National Cancer Institute’s (NCI’s), Center for Cancer Research (CCR), Surgery Branch (SB). KEY ROLES/RESPONSIBILITIES - THIS POSITION IS CONTINGENT UPON FUNDING APPROVAL The Developmental Scientist will: Provides support and advisement for the development of T cell receptor gene therapy protocols. Establishes, implements and maintains standardized processes and assesses performance to make recommendations for improvement. Provides support and guidance to the cellular therapy or vector production facilities at the NIH Clinical Center engaged in the manufacture of patient-specific therapies. Manufactures cellular therapy products for human use. Develops and manufactures lentiviral and/or retroviral vectors. Prepares technical reports, abstracts, presentations and program correspondence concerning assigned projects through research and analysis of information relevant to government policy, regulations and other relevant data and monitors all assigned programs for compliance. Provides project management support with planning and development of project schedules and deliverables, tracking project milestones, managing timelines, preparing status reports and monitoring progress ensuring adherence to deadlines. 
Facilitates communication through all levels of staff by functioning as a liaison between internal departments, senior management, and the customer. Serves as a leader/mentor to administrative staff and prepares employee performance evaluations. Develops and implements procedures/programs to ensure effective and efficient business and operational processes. Identifies potential bottlenecks in upcoming development processes and works with team members and senior management for resolution. Analyzes and tracks initiatives and contracts. Coordinates and reviews daily operations and logistics, including purchasing and shipping of miscellaneous equipment, laboratory and office supplies to ensure compliance with appropriate government regulations. Coordinates the administrative, fiscal, contractual, and quality aspects of all projects. Ensures that internal budgets, schedules and performance requirements are met. Monitors workflow and timelines to ensure production operations are on schedule and adequate raw materials and supplies are available. Ensures all activities are in compliance with applicable federal regulations and guidelines and proper testing/validation activities have been scheduled and conducted. Regularly interacts with senior or executive management both internally and externally, on matters concerning several functional areas such as operations, quality control and quality assurance. Participates in planning facility or operations modifications, upgrades and renovations. Performs technical audits of outsourced contractors in conjunction with Quality Assurance and or Quality Control. Assists in the evaluation and selection of staff, planning and coordination of training, assigning of tasks and scheduling workloads and evaluating overall performance. This position is located in Bethesda, Maryland.
NASA Astrophysics Data System (ADS)
Stambaugh, Ronald
2012-04-01
I am very pleased to join the outstanding leadership team for the journal Nuclear Fusion as Scientific Editor. The journal's high position in the field of fusion energy research derives in no small measure from the efforts of the IAEA team in Vienna, the production and marketing of IOP Publishing, the Board of Editors led by its chairman Mitsuru Kikuchi, the Associate Editor for Inertial Confinement Max Tabak and the outgoing Scientific Editor, Paul Thomas. During Paul's five year tenure submissions have grown by over 40%. The usage of the electronic journal has grown year by year with about 300 000 full text downloads of Nuclear Fusion articles in 2011, an impressive figure due in part to the launch of the full 50 year archive. High quality has been maintained while times for peer review and publishing have been reduced and the journal achieved some of the highest impact factors ever (as high as 4.27). The journal has contributed greatly to building the international scientific basis for fusion. I was privileged to serve from 2003 to 2010 as chairman of the Coordinating Committee for the International Tokamak Physics Activity (ITPA) which published in Nuclear Fusion the first ITER Physics Basis (1999) and its later update (2007). The scientific basis that has been developed to date for fusion has led to the construction of major facilities to demonstrate the production of power-plant relevant levels of fusion reactions. We look forward to the journal continuing to play a key role in the international effort toward fusion energy as these exciting major facilities and the various approaches to fusion continue to be developed. It is clear that Nuclear Fusion maintains its position in the field because of the perceived high quality of the submissions, the refereeing and the editorial processes, and the availability and utility of the online journal. 
The creation of the Nuclear Fusion Prize, led by the Board of Editors chairman Mitsuru Kikuchi, for the most outstanding paper published in the journal each year has furthered the submission and recognition of papers of the highest quality. The accomplishments of the journal's team over the last five years will be a tough act to follow but I look forward to working with this competent and dedicated group to continue the journal's high standards and ensure that Nuclear Fusion remains the journal of choice for authors and readers alike.
Forming a Team to Ensure High-Quality Measurement in Education Studies. REL 2014-052
ERIC Educational Resources Information Center
Kisker, Ellen Eliason; Boller, Kimberly
2014-01-01
This brief provides tips for forming a team of staff and consultants with the needed expertise to make key measurement decisions that will ensure high-quality data for answering the study's research questions. The brief outlines the main responsibilities of measurement team members. It also describes typical measurement tasks and discusses…
ERIC Educational Resources Information Center
Naik, B. M.
2012-01-01
The paper presents in brief the need and importance of effective, imaginative and responsible governing boards in colleges and universities, so as to ensure educational quality. BOG should engage fruitfully with the principal and activities in college/ university. UGC, AICTE have now prescribed creation of effective boards for both government and…
Scientific misconduct, the pharmaceutical industry, and the tragedy of institutions.
Cohen-Kohler, Jillian Clare; Esmail, Laura C
2007-09-01
This paper examines how current legislative and regulatory models do not adequately govern the pharmaceutical industry towards ethical scientific conduct. In the context of a highly profit-driven industry, governments need to ensure that ethical and legal standards are not only in place for companies but also enforceable. We demonstrate with examples from both industrialized and developing countries how, without sufficient controls, there is a risk that corporate behaviour will transgress ethical boundaries. We submit that there is a critical need for urgent drug regulatory reform. There must be robust regulatory structures in place which enforce corporate governance mechanisms to ensure that pharmaceutical companies maintain ethical standards in drug research and development and the marketing of pharmaceuticals. What is also needed is for the pharmaceutical industry to adopt authentic "corporate social responsibility" policies, as current policies and practices are insufficient.
NASA Astrophysics Data System (ADS)
Showalter, L. M.; Gibeaut, J. C.
2015-12-01
As more journals and funding organizations require data to be made available, more and more scientists are being exposed to the world of data science, metadata development, and data standards so they can ensure future funding and publishing success. The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is the vehicle by which the Gulf of Mexico Research Initiative (GoMRI) is making all data collected in this program publicly available. This varied group of researchers all have different levels of experience with data management standards and protocols, thus GRIIDC has evolved to embrace the cooperative nature of our work and develop a number of tools and training materials to help ensure data managers and researchers in the GoMRI program are submitting high quality data and metadata that will be useful for years to come. GRIIDC began with a group of eight data managers, many of whom had only ever managed their own data but were then expected to manage the data of a large group of geographically distant researchers. As the program continued to evolve, these data managers worked with the GRIIDC team to help identify and develop much-needed resources for training and communication for themselves and the scientists they represented. This essential cooperation has developed a team of highly motivated scientists, computer programmers and data scientists who are working to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico Ecosystem and beyond.
AGU President's Message: Obama Administration's Commitment to Scientific Integrity
NASA Astrophysics Data System (ADS)
McPhaden, Michael J.
2011-01-01
In March 2009, President Barack Obama issued a memorandum on the subject of scientific integrity in which he stated emphatically, “Science and the scientific process must inform and guide decisions of my Administration on a wide range of issues, including improvement of public health, protection of the environment, increased efficiency in the use of energy and other resources, mitigation of the threat of climate change, and protection of national security.” The president charged John Holdren, director of the Office of Science and Technology Policy (OSTP), with developing specific recommendations “for ensuring the highest level of integrity in all aspects of the executive branch’s involvement with scientific and technological processes.” On Friday, 17 December, OSTP released federal department and agency guidelines for implementing the administration’s policies on scientific integrity.
AHEAD. Advate in HaEmophilia A outcome Database.
Oldenburg, J; Kurnik, K; Huth-Kühne, A; Zimmermann, R; Abraham, I; Klamroth, R
2010-11-01
The clinical picture of haemophilia A patients is often characterised by recurrent bleedings, in particular joint bleeds. Thus far, long-term data on the outcome of haemophilia A patients are scarce as regards the development of target joints, joint replacement, lost days from school or work due to bleedings, and the quality of life, as most previous studies were limited to the aspects of safety and efficacy. The Baxter-initiated AHEAD (Advate in HaEmophilia A outcome Database) study is a multi-centre, prospective, non-interventional observational study of haemophilia A patients. All patients with a residual FVIII activity of ≤5% who are being treated with ADVATE are eligible. There are no limitations in terms of patient age or treatment regimen. AHEAD is scientifically supported by a renowned interdisciplinary steering board and is intended to yield data on 500 patients in up to 30 haemophilia centres, collected during a period of four years. The large patient population has been chosen in order to ensure a valid database. The objective of the study is to record haemophilia-related arthropathies, which will be defined based on imaging techniques (e.g. MRI, X-ray, ultrasound) and the judgment of the attending physician. In addition, extensive data will be collected on joint replacement surgeries, pseudotumour development, bleeding-related pain, quality of life (age-related questionnaires: Haem-A-QoL, Haemo-QoL, SF10, SF12v2), risk factors (diabetes mellitus, arterial hypertension, nicotine abuse), blood group, gene mutation, physical activity, and on the efficacy and safety of Advate. The patient data will be entered into an electronic CRF system at the centres. Plausibility checks during data entry, regular monitoring visits, and the option of auditing all serve to ensure a high data quality for AHEAD. The first patient was enrolled in the study in early June 2010; recruitment is planned to continue until the end of 2011. 
The Ethics Committee of the University of Bonn has given its favorable opinion.
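The plausibility checks at data entry mentioned in the AHEAD record can be sketched as simple range validation on submitted eCRF fields. The field names and limits below are hypothetical illustrations, not taken from the AHEAD protocol.

```python
# Illustrative sketch of an eCRF plausibility check: each field has an
# allowed range, and out-of-range entries are queried before they enter
# the study database. Field names and limits are hypothetical, not taken
# from the AHEAD protocol.

PLAUSIBLE_RANGES = {
    "fviii_activity_pct": (0.0, 5.0),  # study enrols residual FVIII <= 5%
    "age_years": (0, 120),
    "bleeds_last_year": (0, 365),
}

def plausibility_check(record):
    """Return (field, value, low, high) tuples for implausible entries."""
    queries = []
    for field, value in record.items():
        low, high = PLAUSIBLE_RANGES[field]
        if not (low <= value <= high):
            queries.append((field, value, low, high))
    return queries

queries = plausibility_check(
    {"fviii_activity_pct": 2.0, "age_years": 47, "bleeds_last_year": 400}
)
```

Each returned tuple would become a data query back to the entering centre, complementing the monitoring visits and audits the record describes.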
Safer@home—Simulation and training: the study protocol of a qualitative action research design
Wiig, Siri; Guise, Veslemøy; Anderson, Janet; Storm, Marianne; Lunde Husebø, Anne Marie; Testad, Ingelin; Søyland, Elsa; Moltu, Kirsti L
2014-01-01
Introduction While it is predicted that telecare and other information and communication technology (ICT)-assisted services will have an increasingly important role in future healthcare services, their implementation in practice is complex. For implementation of telecare to be successful and ensure quality of care, sufficient training for staff (healthcare professionals) and service users (patients) is fundamental. Telecare training has been found to have positive effects on attitudes to, sustained use of, and outcomes associated with telecare. However, the potential contribution of training in the adoption, quality and safety of telecare services is an under-investigated research field. The overall aim of this study is to develop and evaluate simulation-based telecare training programmes to aid the use of videophone technology in elderly home care. Research-based training programmes will be designed for healthcare professionals, service users and next of kin, and the study will explore the impact of training on adoption, quality and safety of new telecare services. Methods and analysis The study has a qualitative action research design. The research will be undertaken in close collaboration with a multidisciplinary team consisting of researchers and managers and clinical representatives from healthcare services in two Norwegian municipalities, alongside experts in clinical education and simulation, as well as service user (patient) representatives. The qualitative methods used involve focus group interviews, semistructured interviews, observation and document analysis. To ensure trustworthiness in the data analysis, we will apply member checks and analyst triangulation, in addition to providing contextual and sample description to allow for evaluation of transferability of our results to other contexts and groups. Ethics and dissemination The study is approved by the Norwegian Social Science Data Services. 
The study is based on voluntary participation and informed written consent. Informants can withdraw at any point in time. The results will be disseminated at research conferences, peer review journals, one PhD thesis and through public presentations to people outside the scientific community. PMID:25079924
Ethical pharmaceutical promotion and communications worldwide: codes and regulations.
Francer, Jeffrey; Izquierdo, Jose Zamarriego; Music, Tamara; Narsai, Kirti; Nikidis, Chrisoula; Simmonds, Heather; Woods, Paul
2014-03-29
The international pharmaceutical industry has made significant efforts towards ensuring compliant and ethical communication and interaction with physicians and patients. This article presents the current status of the worldwide governance of communication practices by pharmaceutical companies, concentrating on prescription-only medicines. It analyzes legislative, regulatory, and code-based compliance control mechanisms and highlights significant developments, including the 2006 and 2012 revisions of the International Federation of Pharmaceutical Manufacturers and Associations (IFPMA) Code of Practice. Developments in international controls, largely built upon long-established rules relating to the quality of advertising material, have contributed to clarifying the scope of acceptable company interactions with healthcare professionals. This article aims to provide policy makers, particularly in developing countries, with an overview of the evolution of mechanisms governing the communication practices, such as the distribution of promotional or scientific material and interactions with healthcare stakeholders, relating to prescription-only medicines. PMID:24679064
Buchanan, Robert L; Oni, Ruth
2012-05-01
Microbiological testing for various indicator microorganisms is used extensively as a means of verifying the effectiveness of efforts to ensure the microbiological quality and safety of a wide variety of foods. However, for each use of an indicator organism the underlying scientific assumptions related to the behavior of the target microorganism, the characteristics of the food matrix, the details of the food manufacturing processes, environment, and distribution system, and the methodological basis for the assay must be evaluated to determine the validity, utility, and efficacy of potential microbiological indicator tests. The recent adoption by the Codex Alimentarius Commission of microbiological criteria for powdered infant formulae and related products provides an excellent example of an evidence-based approach for the establishment of consensus microbiological criteria. The present article reviews these criteria and those of various national governments in relation to emerging principles for the evidence-based establishment of effective indicator organisms.
Neuroscience in Nigeria: the past, the present and the future.
Balogun, Wasiu Gbolahan; Cobham, Ansa Emmanuel; Amin, Abdulbasit
2018-04-01
The science of the brain and nervous system cuts across almost all aspects of human life and is one of the fastest growing scientific fields worldwide. This necessitates the demand for pragmatic investment by all nations to ensure improved education and quality of research in Neurosciences. Although obvious efforts are being made in advancing the field in developed societies, there is limited data addressing the state of neuroscience in sub-Saharan Africa. Here, we review the state of neuroscience development in Nigeria, Africa's most populous country and its largest economy, critically evaluating the history, the current situation and future projections. This review specifically addresses trends in clinical and basic neuroscience research and education. We conclude by highlighting potentially helpful strategies that will catalyse development in neuroscience education and research in Nigeria, among which are an increase in research funding, provision of tools and equipment for training and research, and upgrading of the infrastructure at hand.
Good modeling practice guidelines for applying multimedia models in chemical assessments.
Buser, Andreas M; MacLeod, Matthew; Scheringer, Martin; Mackay, Don; Bonnell, Mark; Russell, Mark H; DePinto, Joseph V; Hungerbühler, Konrad
2012-10-01
Multimedia mass balance models of chemical fate in the environment have been used for over 3 decades in a regulatory context to assist decision making. As these models become more comprehensive, reliable, and accepted, there is a need to recognize and adopt principles of Good Modeling Practice (GMP) to ensure that multimedia models are applied with transparency and adherence to accepted scientific principles. We propose and discuss 6 principles of GMP for applying existing multimedia models in a decision-making context, namely 1) specification of the goals of the model assessment, 2) specification of the model used, 3) specification of the input data, 4) specification of the output data, 5) conduct of a sensitivity and possibly also uncertainty analysis, and finally 6) specification of the limitations and limits of applicability of the analysis. These principles are justified and discussed with a view to enhancing the transparency and quality of model-based assessments. Copyright © 2012 SETAC.
Young, Jasmine Y.; Westbrook, John D.; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J.; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R.; Berrisford, John M.; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter MS; Hudson, Brian P.; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L.; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M. Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R.; Shao, Chenghua; Swaminathan, G. Jawahar; Tan, Lihua; Ulrich, Eldon L.; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A.; Quesada, Martha; Kleywegt, Gerard J.; Berman, Helen M.; Markley, John L.; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K.
2017-01-01
OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the Protein Data Bank (PDB) archive, has been developed as a global collaboration by the Worldwide Protein Data Bank (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. PMID:28190782
Research coordinators' experiences with scientific misconduct and research integrity.
Habermann, Barbara; Broome, Marion; Pryor, Erica R; Ziner, Kim Wagler
2010-01-01
Most reports of scientific misconduct have been focused on principal investigators and other scientists (e.g., biostatisticians) involved in the research enterprise. However, by virtue of their position, research coordinators are often closest to the research field where much of misconduct occurs. The purpose of this study was to describe research coordinators' experiences with scientific misconduct in their clinical environment. The descriptive design was embedded in a larger cross-sectional national survey. A total of 266 respondents, predominately registered nurses, who answered "yes" to having firsthand knowledge of scientific misconduct in the past year, provided open-ended question responses. Content analysis was conducted by the research team, ensuring agreement of core categories and subcategories of misconduct. Research coordinators most commonly learned about misconduct via firsthand witness of the event, with the principal investigator being the person most commonly identified as the responsible party. Five major categories of misconduct were identified: protocol violations, consent violations, fabrication, falsification, and financial conflict of interest. In 70% of cases, the misconduct was reported. In most instances where misconduct was reported, some action was taken. However, in approximately 14% of cases, no action or investigation ensued; in 6.5% of cases, the coordinator was fired or he or she resigned. This study demonstrates the need to expand definitions of scientific misconduct beyond fabrication, falsification, and plagiarism to include other practices. The importance of the ethical climate in the institution in ensuring a safe environment to report and an environment where evidence is reviewed cannot be overlooked.
Science in the Era of Facebook and Twitter: Get Used to It
NASA Astrophysics Data System (ADS)
Falcke, H.
2018-02-01
Astrophysicist Heino Falcke reflects on the increased transparency of the scientific process with the rise of social media. He discusses the positives and negatives of having a spotlight shone on scientific results in the embryonic stage and, as a result, the rising number of false findings and claims that find their way into the public eye. What does this new age of communication mean for science? And how do scientists, science journalists and the public need to adapt to ensure a positive change in the way we conduct, communicate and trust science and scientific evidence?
Scientific Integrity in Washington: Politics Trumps Science?
NASA Astrophysics Data System (ADS)
Krauss, Lawrence
2005-04-01
Numerous documented examples exist in which the current administration has either censored or distorted the recommendations and/or the results of government scientific advisory panels and agencies, or has interfered with the makeup of scientific advisory panels for apparently political purposes. These instances seem more broad-ranging than under any recent administration, Republican or Democrat, and have continued despite various public outcries. I will describe several examples from the physical and biological sciences, and then discuss what we might do as a community to encourage the administration in its second term to work to ensure that politics does not trump science.
Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories
NASA Astrophysics Data System (ADS)
Lehnert, K.
2017-12-01
Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality, enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and designed to support the advancement of their science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as a reliable facility that curates and preserves data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other hand. Staying abreast of evolving repository standards to certify as a trustworthy repository and conducting a regular self-assessment and certification alone requires resources that compete with the demands for improving data holdings or usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value for the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate.
IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on the domain-specific data curation and value-added services.
NASA's Microgravity Science Research Program
NASA Technical Reports Server (NTRS)
1996-01-01
The ongoing challenge faced by NASA's Microgravity Science Research Program is to work with the scientific and engineering communities to secure the maximum return from our Nation's investments by: assuring that the best possible science emerges from the science community for microgravity investigations; ensuring the maximum scientific return from each investigation in the most timely and cost-effective manner; and enhancing the distribution of data and applications of results acquired through completed investigations to maximize their benefits.
On-line welding quality inspection system for steel pipe based on machine vision
NASA Astrophysics Data System (ADS)
Yang, Yang
2017-05-01
In recent years, high-frequency welding has been widely used in production because of its simplicity, reliability and high quality. A key problem at the present stage, and an important research area within welding technology, is how to effectively control weld penetration during welding so as to ensure full penetration, a uniform weld and, ultimately, welding quality. In this paper, based on a study of several welding inspection methods, an on-line welding quality inspection system for steel pipe based on machine vision is designed.
NASA Technical Reports Server (NTRS)
Witkin, S. A.
1976-01-01
A viable quality program for the urban mass transit industry, and a management approach to ensure compliance with the program, are outlined. Included are: (1) a set of guidelines for quality assurance to be imposed on transit authorities, and a management approach to ensure compliance with them; (2) a management approach to be used by the transit authorities (properties) for assuring compliance with the QA guidelines; and (3) quality assurance guidelines to be imposed by properties and UMTA for procurement of hardware and systems.
Science for the Public Good: Tackling scientific integrity in the federal government
NASA Astrophysics Data System (ADS)
Goldman, G. T.; Halpern, M.; Johnson, C.
2016-12-01
From hydraulic fracturing to climate change to seismic risk, government science and scientists are integral to public decision making in the geosciences. Following calls for increased scientific integrity across the government, policies have been put in place in recent years to promote transparency and the appropriate use of science in government decision making. But how effective have these initiatives been? With the development of scientific integrity policies, new transparency measures, and other efforts in recent years, are we seeing improvements in how federal agencies use science? And importantly, can these safeguards prevent potential future breaches of scientific integrity and the misuse of science for political gain? Review of recent progress and problems around government scientific integrity, including case studies, policy assessments, and surveys of federal scientists, can shed light on how far we have come and what areas still need improvement to ensure that government scientific integrity is preserved in the future.
NASA Astrophysics Data System (ADS)
Bernard, E. N.
2014-12-01
As the decade of mega-tsunamis has unfolded with new data, tsunami science has advanced at an unprecedented pace. Our responsibility to society should guide the use of these new scientific discoveries to better prepare society for the next tsunami. This presentation will focus on the impacts of the 2004 and 2011 tsunamis and the new societal expectations accompanying enhanced funding for tsunami research. A list of scientific products, including tsunami hazard maps, a tsunami energy scale, real-time tsunami flooding estimates, and real-time current velocities in harbors, will be presented to illustrate society's need for relevant, easy-to-understand tsunami information. Appropriate use of these tsunami scientific products will be presented to demonstrate greater tsunami resilience for tsunami-threatened coastlines. Finally, a scientific infrastructure is proposed to ensure that these products are both scientifically sound and represent today's best practices, to protect the scientific integrity of the products as well as the safety of coastal residents.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-20
... chemistry and toxicology data. In addition, EPA must ensure that adequate enforcement of the tolerance can... (pesticide and other agricultural chemical manufacturing) and 541600 (management, scientific, and technical...
Unconventional Oil and Natural Gas Development
EPA works with states and other key stakeholders, through sound scientific research and regulation; to help ensure that natural gas extraction from shale formations, also called fracking or hydrofracking, does not harm public health and the environment.
NASA Astrophysics Data System (ADS)
Carter, Frances D.
2011-12-01
Low participation and performance in science, technology, engineering, and mathematics (STEM) fields by U.S. citizens are widely recognized as major problems with substantial economic, political, and social ramifications. Studies of collegiate interventions designed to broaden participation in STEM fields suggest that participation in undergraduate research is a key program component that enhances such student outcomes as undergraduate GPA, graduation, persistence in a STEM major, and graduate school enrollment. However, little is known about the mechanisms that are responsible for these positive effects. The current study hypothesizes that undergraduate research participation increases scientific self-efficacy and scientific research proficiency. This hypothesis was tested using data obtained from a survey of minority students from several STEM intervention programs that offer undergraduate research opportunities. Students were surveyed both prior to and following the summer of 2010. Factor analysis was used to examine the factor structure of participants' responses on scientific self-efficacy and scientific research proficiency scales. Difference-in-difference analysis was then applied to the resulting factor score differences to estimate the relationship of summer research participation with scientific self-efficacy and scientific research proficiency. Factor analytic results replicate and further validate previous findings of a general scientific self-efficacy construct (Schultz, 2008). While the factor analytic results for the exploratory scientific research proficiency scale suggest that it was also a measurable construct, the factor structure was not generalizable over time. Potential reasons for the lack of generalizability of the scientific research proficiency scale are explored and recommendations for emerging scales are provided. Recent restructuring attempts within federal science agencies threaten the future of STEM intervention programs.
Causal estimates of the effect of undergraduate research participation on specific and measurable benefits can play an important role in ensuring the sustainability of STEM intervention programs. Obtaining such estimates requires additional studies that, inter alia, incorporate adequate sample sizes, valid measurement scales, and the ability to account for unobserved variables. Political strategies, such as compromise, can also play an important role in ensuring the sustainability of STEM intervention programs.
Evaluating Software Assurance Knowledge and Competency of Acquisition Professionals
2014-10-01
of ISO 12207-2008, both internationally and in the United States [7]. That standard documents a comprehensive set of activities and supporting... cyberattacks grows, organizations must ensure that their procurement agents acquire high-quality, secure software. ISO 12207 and the Software Assurance Competency...
Uncovering Predictors of Disagreement: Ensuring the Quality of Expert Ratings
ERIC Educational Resources Information Center
Hoth, Jessica; Schwarz, Björn; Kaiser, Gabriele; Busse, Andreas; König, Johannes; Blömeke, Sigrid
2016-01-01
Rating scales are a popular item format used in many types of assessments. Yet, defining which rating is correct often represents a challenge. Using expert ratings as benchmarks is one approach to ensuring the quality of a rating instrument. In this paper, such expert ratings are analyzed in detail taking a video-based test instrument of teachers'…
Eaton, K A; Innes, N; Balaji, S M; Pugh, C; Honkala, E; Lynch, C D
2017-02-01
This satellite symposium was the fifth in a series for editors, publishers, reviewers and all those with an interest in scientific publishing. It was held on Wednesday, 11 March 2015 at the IADR meeting in Boston, Massachusetts. The symposium attracted more than 210 attendees. The symposium placed an emphasis on strategies to ensure that papers are accepted by peer-reviewed journals. The speaker representing the Journal of Dental Research gave a history of peer review and explained how to access material to advise new authors. The speaker from India outlined the problems that occur when there is no culture for dental research and it is given a low priority in dental education. He outlined remedies. The speaker from SAGE Publications described the help that publishers and editors can provide authors. The final speaker suggested that in developing countries it was essential to create alliances with dental researchers in developed countries, and that local conferences to which external speakers were invited stimulated research in terms of both quantity and quality. A wide-ranging discussion then took place. Copyright © 2016 Elsevier Ltd. All rights reserved.
Wahls, Wayne P
2016-01-01
The ability of the United States to most efficiently make breakthroughs on the biology, diagnosis and treatment of human diseases requires that physicians and scientists in each state have equal access to federal research grants and grant dollars. However, despite legislative and administrative efforts to ensure equal access, the majority of funding for biomedical research is concentrated in a minority of states. To gain insight into the causes of such disparity, funding metrics were examined for all NIH research project grants (RPGs) from 2004 to 2013. State-by-state differences in per application success rates, per investigator funding rates, and average award size each contributed significantly to vast disparities (greater than 100-fold range) in per capita RPG funding to individual states. To the extent tested, there was no significant association overall between scientific productivity and per capita funding, suggesting that the unbalanced allocation of funding is unrelated to the quality of scientists in each state. These findings reveal key sources of bias in, and new insight into the accuracy of, the funding process. They also support evidence-based recommendations for how the NIH could better utilize the scientific talent and capacity that is present throughout the United States.
How do you know it is true? Integrity in research and publications: AOA critical issues.
Buckwalter, Joseph A; Tolo, Vernon T; O'Keefe, Regis J
2015-01-07
High-quality medical care is the result of clinical decisions based upon scientific principles garnered from basic, translational, and clinical research. Information regarding the natural history of diseases and their responses to various treatments is introduced into the medical literature through the approximately one million PubMed journal articles published each year. Pharmaceutical and device companies, universities, departments, and researchers all stand to gain from research publication. Basic and translational research is highly competitive. Success in obtaining research funding and career advancement requires scientific publication in the medical literature. Clinical research findings can lead to changes in the pattern of orthopaedic practice and have implications for the utilization of pharmaceuticals and orthopaedic devices. Research findings can be biased by ownership of patents and materials, funding sources, and consulting arrangements. The current high-stakes research environment has been characterized by an increase in plagiarism, falsification or manipulation of data, selected presentation of results, research bias, and inappropriate statistical analyses. It is the responsibility of the orthopaedic community to work collaboratively with industry, universities, departments, and medical researchers and educators to ensure the integrity of the content of the orthopaedic literature and to enable the incorporation of best practices in the care of orthopaedic patients. Copyright © 2015 by The Journal of Bone and Joint Surgery, Incorporated.
Running an open experiment: transparency and reproducibility in soil and ecosystem science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond-Lamberty, Benjamin; Smith, Ashly P.; Bailey, Vanessa L.
Researchers in soil and ecosystem science, and almost every other field, are being pushed--by funders, journals, governments, and their peers--to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent "open experiment", in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team's communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.
Running an open experiment: transparency and reproducibility in soil and ecosystem science
NASA Astrophysics Data System (ADS)
Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa
2016-08-01
Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.
Scientific writing and the quality of papers: towards a higher impact.
Cáceres, Ana Manhani; Gândara, Juliana Perina; Puglisi, Marina Leite
2011-12-01
Given the latent concern of scientists and editors about the quality of scientific writing, the aim of this paper was to present topics on the recommended structure of peer-reviewed papers. We described the key points of the common sections of original papers and proposed two additional materials that may be useful for scientific writing: a guide to help organize the main ideas of the paper; and a table with examples of undesirable and desirable structures in scientific writing.
[Scientific journals of medical students in Latin-America].
Cabrera-Samith, Ignacio; Oróstegui-Pinilla, Diana; Angulo-Bazán, Yolanda; Mayta-Tristán, Percy; Rodríguez-Morales, Alfonso J
2010-11-01
This article deals with the history and evolution of student scientific journals in Latin-America: their beginnings, how many still exist and what their future projection is. Relevant events show the growth of student scientific journals in Latin-America and how they are working together to improve their quality. This article is addressed not only to Latin American readers but also to readers worldwide. Latin American medical students are consistently working together to publish scientific research whose quality is constantly improving.
Kininmonth, Alice R; Jamil, Nafeesa; Almatrouk, Nasser; Evans, Charlotte E L
2017-12-27
To investigate the quality of nutrition articles in popular national daily newspapers in the UK and to identify important predictors of article quality. Newspapers are a primary source of nutrition information for the public. Newspaper articles were collected on 6 days of the week (excluding Sunday) for 6 weeks in summer 2014. Predictors included food type and health outcome, size of article, whether the journalist was named and day of the week. A validated quality assessment tool was used to assess each article, with a minimum possible score of -12 and a maximum score of 17. Newspapers were checked in duplicate for relevant articles. The association of each predictor on article quality score was analysed adjusting for remaining predictors. A logistic regression model was implemented with quality score as the binary outcome, categorised as poor (score less than zero) or satisfactory (score of zero or more). Over 6 weeks, 141 nutrition articles were included across the five newspapers. The median quality score was 2 (IQR -2 to 6), and 44 (31%) articles were poor quality. There was no substantial variation in quality of reporting between newspapers once other factors such as anonymous publishing, health outcome, aspect of diet covered and day of the week were taken into account. Particularly low-quality scores were obtained for anonymously published articles with no named journalist, articles that focused on obesity and articles that reported on high fat and processed foods. The general public are regularly exposed to poor quality information in newspapers about what to eat to promote health, particularly articles reporting on obesity. Journalists, researchers, university press officers and scientific journals need to work together more closely to ensure clear, consistent nutrition messages are communicated to the public in an engaging way. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved.
No commercial use is permitted unless otherwise expressly granted.
Quality assessment of nutrition coverage in the media: a 6-week survey of five popular UK newspapers
Kininmonth, Alice R; Jamil, Nafeesa; Almatrouk, Nasser
2017-01-01
Objectives To investigate the quality of nutrition articles in popular national daily newspapers in the UK and to identify important predictors of article quality. Setting Newspapers are a primary source of nutrition information for the public. Design Newspaper articles were collected on 6 days of the week (excluding Sunday) for 6 weeks in summer 2014. Predictors included food type and health outcome, size of article, whether the journalist was named and day of the week. Outcome measures A validated quality assessment tool was used to assess each article, with a minimum possible score of −12 and a maximum score of 17. Newspapers were checked in duplicate for relevant articles. The association of each predictor with article quality score was analysed, adjusting for the remaining predictors. A logistic regression model was implemented with quality score as the binary outcome, categorised as poor (score less than zero) or satisfactory (score of zero or more). Results Over 6 weeks, 141 nutrition articles were included across the five newspapers. The median quality score was 2 (IQR −2 to 6), and 44 (31%) articles were poor quality. There was no substantial variation in quality of reporting between newspapers once other factors such as anonymous publishing, health outcome, aspect of diet covered and day of the week were taken into account. Particularly low-quality scores were obtained for anonymously published articles with no named journalist, articles that focused on obesity and articles that reported on high fat and processed foods. Conclusions The general public are regularly exposed to poor quality information in newspapers about what to eat to promote health, particularly articles reporting on obesity. Journalists, researchers, university press officers and scientific journals need to work together more closely to ensure clear, consistent nutrition messages are communicated to the public in an engaging way. PMID:29284712
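The binary outcome described in this abstract (a quality score below zero classed as poor, zero or above as satisfactory) can be sketched as follows. This is a minimal illustration of the categorisation step only; the example scores are invented and are not the study's data or analysis code.

```python
import statistics

def categorize(score):
    """Binarise a quality score as described: poor (< 0) vs satisfactory (>= 0)."""
    return "poor" if score < 0 else "satisfactory"

# Illustrative scores only; the tool's valid range is -12 to 17.
scores = [-5, -2, 0, 2, 3, 6, 8]
labels = [categorize(s) for s in scores]

poor_count = labels.count("poor")         # articles scoring below zero
median_score = statistics.median(scores)  # summary statistic reported in the study
```

In the study, this binary label was then used as the outcome of a logistic regression against the predictors (food type, health outcome, article size, named journalist, day of week).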
NASA Astrophysics Data System (ADS)
Montgomery, J. L.; Minsker, B. S.; Schnoor, J.; Haas, C.; Bonner, J.; Driscoll, C.; Eschenbach, E.; Finholt, T.; Glass, J.; Harmon, T.; Johnson, J.; Krupnik, A.; Reible, D.; Sanderson, A.; Small, M.; van Briesen, J.
2006-05-01
With increasing population and urban development, societies grow more and more concerned over balancing the need to maintain adequate water supplies with that of ensuring the quality of surface and groundwater resources. For example, multiple stressors such as overfishing, runoff of nutrients from agricultural fields and confined animal feeding lots, and pathogens in urban stormwater can often overwhelm a single water body. Mitigating just one of these problems often depends on understanding how it relates to others and how stressors can vary in temporal and spatial scales. Researchers are now in a position to answer questions about multiscale, spatiotemporally distributed hydrologic and environmental phenomena through the use of remote and embedded networked sensing technologies. It is now possible for data streaming from sensor networks to be integrated by a rich cyberinfrastructure encompassing the innovative computing, visualization, and information archiving strategies needed to cope with the anticipated onslaught of data, and to turn that data around in the form of real-time water quantity and quality forecasting. Recognizing this potential, NSF awarded $2 million to a coalition of 12 institutions in July 2005 to establish the CLEANER Project Office (Collaborative Large-Scale Engineering Analysis Network for Environmental Research; http://cleaner.ncsa.uiuc.edu). 
Over the next two years the project office, in coordination with CUAHSI (Consortium of Universities for the Advancement of Hydrologic Science, Inc.; http://www.cuahsi.org), will develop a plan for a WATer and Environmental Research Systems Network (WATERS Network), which is envisioned to be a collaborative scientific exploration and engineering analysis network, using high performance tools and infrastructure, to transform our scientific understanding of how water quantity, quality, and related earth system processes are affected by natural and human-induced changes to the environment. This presentation will give an overview of the draft CLEANER program plans for the WATERS Network and next steps.
Guidelines and Suggestions for Balloon Gondola Design
NASA Technical Reports Server (NTRS)
Franco, Hugo
2016-01-01
The Columbia Scientific Balloon Facility is responsible for ensuring that science payloads meet the appropriate design requirements. The ultimate goal is to ensure that payloads stay within the allowable launch limits as well as survive the termination event. The purpose of this presentation is to provide some general guidelines for Gondola Design. These include rules and reasons on why CSBF has a certain preference and location for certain components within the gondola as well as other suggestions. Additionally, some recommendations are given on how to avoid common pitfalls.
Concept of JINR Corporate Information System
NASA Astrophysics Data System (ADS)
Filozova, I. A.; Bashashin, M. V.; Korenkov, V. V.; Kuniaev, S. V.; Musulmanbekov, G.; Semenov, R. N.; Shestakova, G. V.; Strizh, T. A.; Ustenko, P. V.; Zaikina, T. N.
2016-09-01
The article presents the concept of JINR Corporate Information System (JINR CIS). Special attention is given to the information support of scientific research - a Current Research Information System as part of the corporate information system. The objectives of such a system focus on ensuring effective implementation of research by using modern information technology, computing and automation, and on the creation, development and integration of digital resources within a common conceptual framework. The project assumes continuous system development and the introduction of new information technologies to keep the system technologically relevant.
[The basis of modern technologies in management of health care system].
Nemytin, Iu V
2014-12-01
For the development of national health care it is necessary to implement modern and effective methods and forms of governance. A clear transition to process management is needed, followed by the introduction of quality management of care. It is necessary to create a complete version of the three-level health care system based on integration into the system "Clinic - Hospital - Rehabilitation", which will ensure resource conservation throughout the industry. The most important task is purposeful, comprehensive management training for health care leaders who have the potential ability to manage. A leader must command all forms of management and apply them on a scientific basis. Standards and other tools of health management should be constantly improved. Standards should serve as a teaching tool and help to improve the quality and effectiveness of treatment processes, as should the transition to single-channel financing, the most advanced form of payment for medical assistance. This type of financing requires managers to adopt new management approaches and a knowledge of business economics. One of the breakthrough objectives is the creation of a new type of health care organization that, like a lead locomotive, pulls the rest of the industry forward.
Kreuter, M; Birring, S S; Wijsenbeek, M; Wapenaar, M; Oltmanns, U; Costabel, U; Bonella, F
2016-11-01
Background: Health status and quality of life are impaired in patients with interstitial lung disease (ILD). Until now, no valid and reliable questionnaire has existed in German to assess these parameters in ILD patients. The K-BILD questionnaire is a brief and valid tool to evaluate health status in ILD patients, but a validated German version has been lacking. Method: The linguistic validation of K-BILD was carried out in a multistage process in collaboration with the developer of the questionnaire and bilingual, professional translators. Review by the developers and back translations, as well as clinical assessment by ILD patients, ensured that the translated questionnaire reflected the intention of the original K-BILD. Results: A German version of K-BILD with 15 questions concerning health status was composed. The questions cover three domains: breathlessness and activities, psychological aspects, and chest symptoms. ILD patients reported no problems in understanding or difficulties in replying to the questions. Conclusion: The German version of the K-BILD questionnaire allows reliable measurement of health status in ILD patients for clinical and scientific use. © Georg Thieme Verlag KG Stuttgart · New York.
77 FR 76042 - Public Meeting of the Presidential Commission for the Study of Bioethical Issues
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... promote policies and practices that ensure scientific research, health care delivery, and technological innovation are conducted in a socially and ethically responsible manner. The main agenda item for the...
EPA Science Matters Newsletter: Volume 1, Number 3
2017-02-14
The term 'scientific integrity' is often used to describe an essential pillar of our work. It reflects our understanding that sound science is an irreplaceable necessity in ensuring the integrity of our actions and our decisions.
Ethics in medical information and advertising.
Serour, G I; Dickens, B M
2004-05-01
This article presents findings and recommendations of an international conference held in Cairo, Egypt in 2003 concerning issues of ethical practice in how information is provided to and by medical practitioners. Professional advertising to practitioners and the public is necessary, but should exclude misrepresentation of qualifications, resources, and authorship of research papers. Medical institutions are responsible for how staff members present themselves, and their institutions. Medical associations, both governmental licensing authorities and voluntary societies, have powers and responsibilities to monitor professional advertisement to defend the public interest against deception. Medical journals bear duties to ensure authenticity of authorship and integrity in published papers, and the scientific basis of commercial advertisers' claims. A mounting concern is authors' conflict of interest. Mass newsmedia must ensure accuracy and proportionality in reporting scientific developments, and product manufacturers must observe truth in advertising, particularly in Direct-to-Consumer advertising. Consumer protection by government agencies is a continuing responsibility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
In 1989, the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine initiated a major study to examine issues related to scientific responsibility and the conduct of research. This report thoughtfully examines the challenges posed in ensuring that the search for truth reflects adherence to ethical standards. In recent years we have learned that not all scientists adhere to this obligation. Issues of misconduct and integrity in science present complex questions. This report recommends specific actions that all scientists, their institutions, and their sponsors can take to preserve and strengthen the integrity of the research process and also to deal with allegations of misconduct. The recommendations provide a blueprint for encouraging and safeguarding the intellectual independence that is essential to doing the best science while also providing for fundamental accountability to those who sponsor and support scientific research.
Total Diet Studies as a Tool for Ensuring Food Safety
Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung
2015-01-01
With the diversification and internationalization of the food industry and the increased focus on health from a majority of consumers, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. A total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and analyze exposure trends to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: chemicals are analyzed in food in the form in which it will be consumed, and composite samples of multiple ingredients are analyzed together after processing, which is cost-effective. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881
Osakwe, Zainab Toteh; Larson, Elaine; Agrawal, Mansi; Shang, Jinjing
2017-01-01
Older adults' ability to self-manage illness depends on their ability to perform activities of daily living (ADL). Forty-five percent of those older than 65 years will have ongoing clinical needs after hospital discharge and require post-acute care (PAC) services in settings such as home health care (HHC) and skilled nursing facilities (SNF). The Improving Medicare Post-Acute Care Transformation (IMPACT) Act of 2014 requires PAC providers to begin collecting and reporting ADL data to build a coordinated approach to payment and standardize patient assessments and quality measurement. The aim of this integrative review was to compare the methods of assessing ADLs in HHC with those in SNF. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was used to ensure results were reported systematically. A scientific literature search without date restriction was conducted within the PubMed and Cumulative Index of Nursing and Allied Health Literature (CINAHL) databases. Two independent investigators assessed study quality using the quality appraisal instrument developed by Kmet and colleagues. Study quality ranged from 94.5% to 100%. Of the 18,749 articles identified by the search, eight met inclusion criteria, and four tools were identified that are used to assess ADLs in SNF and HHC. Although SNF and HHC collect similar ADL information, the range of content covered, item definitions, scoring, and psychometrics are not comparable across settings. PMID:28471793
A Dutch Nationwide Bariatric Quality Registry: DATO.
Poelemeijer, Youri Q M; Liem, Ronald S L; Nienhuijs, Simon W
2017-12-22
In the Netherlands, the number of bariatric procedures increased exponentially in the 1990s. To ensure and improve the quality of bariatric surgery, the nationwide Dutch Audit for Treatment of Obesity (DATO) was established in 2014. The audit was coordinated by the Dutch Institute for Clinical Auditing (DICA). This article provides a review of the aforementioned process of establishing a nationwide registry in the Netherlands. In collaboration with the DATO's scientific committee and other stakeholders, an annual list of several external quality indicators was formulated. This list consists of volume, process, and outcome indicators. In addition to the annual external indicators, the database permits individual hospitals to analyze their own data. The dashboard provides several standardized reports and detailed quality indicators, which are updated on a weekly basis. Since the start, all 18 Dutch bariatric centers have participated in the nationwide audit. A total of 21,941 cases were registered between 2015 and 2016. By 2016, the required variables were registered in 94.3% of all cases. In 2016, a severely complicated course was seen in 2.87% of cases and mortality in 0.05%. The first-year follow-up shows a total weight loss (TWL) of more than 20% in 86.1% of the registered cases. The DATO has rapidly become a mature registry. The well-organized structure of the national audit institution DICA and governmental funding were essential. However, most important were the bariatric teams themselves. The authors believe reporting the results from the registry has already contributed to more knowledge and acceptance by other health care providers.
Chandra X-ray Center Science Data Systems Regression Testing of CIAO
NASA Astrophysics Data System (ADS)
Lee, N. P.; Karovska, M.; Galle, E. C.; Bonaventura, N. R.
2011-07-01
The Chandra Interactive Analysis of Observations (CIAO) is a software system developed for the analysis of Chandra X-ray Observatory observations. An important component of a successful CIAO release is the repeated testing of the tools across various platforms to ensure consistent and scientifically valid results. We describe the procedures of the scientific regression testing of CIAO and the enhancements made to the testing system to increase the efficiency of run time and result validation.
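Regression testing of the kind this abstract describes, rerunning analysis tools on each platform and validating their numeric outputs against stored reference results, can be sketched as below. This is a generic, hypothetical illustration of tolerance-based baseline comparison, not CIAO's actual test harness, and the quantity names (`flux`, `exposure`) are invented.

```python
import math

def check_against_baseline(results, baseline, rel_tol=1e-6):
    """Compare each numeric tool output with its stored baseline value.

    Returns the list of quantities that are missing or drifted beyond the
    relative tolerance, so a release can be flagged when any platform
    produces divergent results.
    """
    failures = []
    for key, expected in baseline.items():
        actual = results.get(key)
        if actual is None or not math.isclose(actual, expected, rel_tol=rel_tol):
            failures.append(key)
    return failures

# Hypothetical baseline from a validated release, and one platform's new run.
baseline = {"flux": 1.2345, "exposure": 5000.0}
run = {"flux": 1.2345000001, "exposure": 5000.0}
```

Running `check_against_baseline(run, baseline)` on the sample data returns an empty list, meaning the new outputs agree with the reference within tolerance.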
2011-03-31
protocols conducted in Iraq. His office had been designated by the... (A research protocol is a formal document detailing the study methodology and the...) ...Human Research Protections Program plan requires scientific peer review to ensure that research is scientifically sound in its design and methods, and... of the approved research protocol and IRB minutes, revealed that there was no mention of "active rehabilitation and exercise" under the design
Scientific governance and the process for exposure scenario development in REACH.
Money, Chris D; Van Hemmen, Joop J; Vermeire, Theo G
2007-12-01
The primary process established by the European Commission to address the science needed to define key REACH concepts and to help rationally implement REACH's ambitions is enshrined in a series of activities known as the REACH Implementation Projects (RIPs). These are projects that aim to define the methodology that could be used, and present the basis for guidance on the actual principles and procedures that may be (are proposed to be) followed in the development of the required documentation that ensures the safe use of chemicals. In order to develop soundly based and equitable regulation, science governance based on established and accepted scientific principles must take a leading role. The extent to which such governance is embraced will be determined by many factors, but notably the process adopted to enable scientific discussion to take place. This article addresses the issues of science as they have impacted on the exemplification of the Exposure Scenario concept under REACH. The current RIP activities have created a non-adversarial process in which the key stakeholders are able to discuss the key REACH challenges. But the RIP activities will be finalised before REACH comes into force. A suitable mechanism should perhaps now be identified to ensure that this positive spirit of scientific discussion and collaboration can continue to benefit REACH and those that it serves well into the future.
A collection of micrographs: where science and art meet
Uskoković, Vuk
2013-01-01
Micrographs obtained using different instrumental techniques are presented with the purpose of demonstrating their artistic qualities. The quality of uniformity currently dominates the aesthetic assessment in scientific practice and is discussed in relation to the classical appreciation of the interplay between symmetry and asymmetry in arts. It is argued that scientific and artistic qualities have converged and inspired each other throughout millennia. With scientific discoveries and inventions enriching the world of communication, broadening the space for artistic creativity and making artistic products more accessible than ever, science inevitably influences artistic creativity. On the other hand, the importance of aesthetic principles in guiding scientific conduct has been appreciated by some of the most creative scientific minds. Science and arts can thus be considered as parallel rails of a single railroad track. Only when precisely coordinated is the passing of the train of human knowledge enabled. The presented micrographs, occupying the central part of this discourse, are displayed with the purpose of showing the rich aesthetic character of even the most ordinary scientific images. The inherent aesthetic nature of scientific imagery and the artistic nature of scientific conduct have thus been offered as the conclusion. PMID:24465169
Dupuytren Disease: Is There Enough Comprehensive Patient Information on the Internet?
Raptis, Dimitri A; Fertsch, Sonia; Guggenheim, Merlin
2017-01-01
Background Dupuytren disease is a chronic nonmalignant fibroproliferative disorder that causes finger contractures via proliferation of new tissue under the glabrous skin of the hand, resulting in multiple functional limitations for the patient. As many surgical therapy options exist, patients suffering from this condition actively search for information in their environment before consulting a health professional. Objective As little is known about the quality of Web-based patient information, the aim of this study was to conduct its systematic evaluation using a validated tool. Methods A total of 118 websites were included, and qualitative and quantitative assessment was performed using the modified Ensuring Quality Information for Patients (EQIP) tool. This standardized and reproducible tool consists of 36 items to assess available information in three categories: contents, identification, and structure data. Scientific data with restricted access, duplicates, and irrelevant websites were not included. Results Only 32 websites addressed more than 19 items, and the scores did not significantly differ among the website developers. The median number of items from the EQIP tool was 16, with the top websites addressing 28 out of 36 items. The quality of newly developed websites did not increase with passing time. Conclusions This study revealed several shortcomings in the quality of Web-based information available for patients suffering from Dupuytren disease. In a world of continuously growing and instantly available Web-based information, it reflects health providers' negligence over the last two decades that there are so few good-quality, informative, and educative websites that could be recommended to patients. PMID:28642214
Ensuring Quality in Early Childhood Education and Care: The Case of Turkey
ERIC Educational Resources Information Center
Gol-Guven, Mine
2018-01-01
With increasing numbers of women entering the workforce in Turkey, efforts have been made to provide services for children and their families. In 2016, 33.2% of 3- to 5-year olds in Turkey were attending preschool. This figure is lower than that of most OECD countries, but the important point is to increase the attendance rate by ensuring quality.…
Next level of board accountability in health care quality.
Pronovost, Peter J; Armstrong, C Michael; Demski, Renee; Peterson, Ronald R; Rothman, Paul B
2018-03-19
Purpose The purpose of this paper is to offer six principles that health system leaders can apply to establish a governance and management system for the quality of care and patient safety. Design/methodology/approach Leaders of a large academic health system set a goal of high reliability and formed a quality board committee in 2011 to oversee quality and patient safety everywhere care was delivered. Leaders of the health system and of every entity, including inpatient hospitals, home care companies, and ambulatory services, staff the committee. The committee works with the management of each entity to set and achieve quality goals. Through this work, the six principles emerged to address management structures and processes. Findings The principles are: ensure there is oversight for quality everywhere care is delivered under the health system; create a framework to organize and report the work; identify care areas where quality is ambiguous or underdeveloped (i.e. islands of quality) and work to ensure there is reporting and accountability for quality measures; create a consolidated quality statement similar to a financial statement; ensure the integrity of the data used to measure and report quality and safety performance; and transparently report performance and create an explicit accountability model. Originality/value This governance and management system for quality and safety functions similar to a finance system, with quality performance documented and reported, data integrity monitored, and accountability for performance from board to bedside. To the authors' knowledge, this is the first description of how a board has taken this type of systematic approach to oversee the quality of care.
Quality Risk Management: Putting GMP Controls First.
O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala
2012-01-01
This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing practice (GMP) controls during those QRM exercises. 
A GMP control can be considered to be any control that is put in place to assure product quality and regulatory compliance. This improved approach is also based on how the detectability of risks is assessed. This is important because when producing medicines, it is not always good practice to place a high reliance upon detection-type controls in the absence of an adequate level of assurance in the manufacturing process that leads to the finished medicine.
ERIC Educational Resources Information Center
US Senate, 2016
2016-01-01
This is the seventh in a series of hearings to inform this committee's reauthorization of the Higher Education Act. The focus of this hearing, teacher preparation, is profoundly important for all students, from the very youngest to adult students. Study after study shows that teacher quality is the decisive in-school factor in boosting student…
How Drug Control Policy and Practice Undermine Access to Controlled Medicines
Csete, Joanne; Wilson, Duncan; Fox, Edward; Wolfe, Daniel; Rasanathan, Jennifer J. K.
2017-01-01
Abstract Drug conventions serve as the cornerstone for domestic drug laws and impose a dual obligation upon states to prevent the misuse of controlled substances while ensuring their adequate availability for medical and scientific purposes. Despite the mandate that these obligations be enforced equally, the dominant paradigm enshrined in the drug conventions is an enforcement-heavy criminal justice response to controlled substances that prohibits and penalizes their misuse. Prioritizing restrictive control is to the detriment of ensuring adequate availability of and access to controlled medicines, thereby violating the rights of people who need them. This paper argues that the drug conventions’ prioritization of criminal justice measures—including efforts to prevent non-medical use of controlled substances—undermines access to medicines and infringes upon the right to health and the right to enjoy the benefits of scientific progress. While the effects of criminalization under drug policy limit the right to health in multiple ways, we draw on research and documented examples to highlight the impact of drug control and criminalization on access to medicines. The prioritization and protection of human rights—specifically the right to health and the right to enjoy the benefits of scientific progress—are critical to rebalancing drug policy. PMID:28630556
How Drug Control Policy and Practice Undermine Access to Controlled Medicines.
Burke-Shyne, Naomi; Csete, Joanne; Wilson, Duncan; Fox, Edward; Wolfe, Daniel; Rasanathan, Jennifer J K
2017-06-01
Drug conventions serve as the cornerstone for domestic drug laws and impose a dual obligation upon states to prevent the misuse of controlled substances while ensuring their adequate availability for medical and scientific purposes. Despite the mandate that these obligations be enforced equally, the dominant paradigm enshrined in the drug conventions is an enforcement-heavy criminal justice response to controlled substances that prohibits and penalizes their misuse. Prioritizing restrictive control is to the detriment of ensuring adequate availability of and access to controlled medicines, thereby violating the rights of people who need them. This paper argues that the drug conventions' prioritization of criminal justice measures-including efforts to prevent non-medical use of controlled substances-undermines access to medicines and infringes upon the right to health and the right to enjoy the benefits of scientific progress. While the effects of criminalization under drug policy limit the right to health in multiple ways, we draw on research and documented examples to highlight the impact of drug control and criminalization on access to medicines. The prioritization and protection of human rights-specifically the right to health and the right to enjoy the benefits of scientific progress-are critical to rebalancing drug policy.
Article retracted, but the message lives on.
Greitemeyer, Tobias
2014-04-01
The retraction of an original article aims to ensure that readers are alerted to the fact that the findings are not trustworthy. However, the present research suggests that individuals still believe in the findings of an article even though they were later told that the data were fabricated and that the article was retracted. Participants in a debriefing condition and a no-debriefing condition learned about the scientific finding of an empirical article, whereas participants in a control condition did not. Afterward, participants in the debriefing condition were told that the article had been retracted because of fabricated data. Results showed that participants in the debriefing condition were less likely to believe in the findings than participants in the no-debriefing condition but were more likely to believe in the findings than participants in the control condition, suggesting that individuals do adjust their beliefs in the perceived truth of a scientific finding after debriefing, but insufficiently. Mediational analyses revealed that the availability of generated causal arguments underlies belief perseverance. These results suggest that a retraction note of an empirical article in a scientific journal is not sufficient to ensure that readers of the original article no longer believe in the article's conclusions.
Research Coordinators' Experiences with Scientific Misconduct and Research Integrity
Habermann, Barbara; Broome, Marion; Pryor, Erica R.; Ziner, Kim Wagler
2010-01-01
Background Most reports of scientific misconduct have focused on principal investigators and other scientists (e.g., biostatisticians) involved in the research enterprise. However, by virtue of their position, research coordinators are often closest to the research field where much of the misconduct occurs. Objective To describe research coordinators' experiences with scientific misconduct in their clinical environment. Design The descriptive design was embedded in a larger, cross-sectional national survey. A total of 266 respondents, predominantly registered nurses, who answered yes to having first-hand knowledge of scientific misconduct in the past year provided open-ended question responses. Methods Content analysis was conducted by the research team, ensuring agreement on core categories and subcategories of misconduct. Findings Research coordinators most commonly learned about misconduct by witnessing the event first-hand, with the principal investigator being the person most commonly identified as the responsible party. Five major categories of misconduct were identified: protocol violations, consent violations, fabrication, falsification, and financial conflict of interest. In 70% of cases, the misconduct was reported. In the majority of instances where misconduct was reported, some action was taken. However, in approximately 14% of cases, no action or investigation ensued; in 6.5% of cases the coordinator was either fired or resigned. Conclusions The study demonstrates the need to expand definitions of scientific misconduct beyond fabrication, falsification, and plagiarism to include other practices. The importance of the institution's ethical climate in ensuring a safe environment in which to report, and in which evidence is reviewed, cannot be overlooked. PMID:20010045
Computer applications in scientific balloon quality control
NASA Astrophysics Data System (ADS)
Seely, Loren G.; Smith, Michael S.
Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. Inexpensive and powerful data-processing tools make it practical to analyze products for quality trends. The results of one such analysis are presented here in graphic form for use on the production floor. Software descriptions and sample outputs are presented, together with a summary of the overall and long-term effects of these methods on product quality.
Khan, Saeed R; Kona, Ravikanth; Faustino, Patrick J; Gupta, Abhay; Taylor, Jeb S; Porter, Donna A; Khan, Mansoor
2014-05-01
The Department of Defense (DoD)-United States Food and Drug Administration (FDA) shelf-life extension program (SLEP) was established in 1986 through an intra-agency agreement between the DoD and the FDA to extend the shelf life of products nearing expiry. During the early stages of development, special attention was paid to program operation, labeling requirements, and the cost benefits associated with this program. In addition to the substantial cost benefits, the program also provides the FDA's Center for Drug Evaluation and Research with significant scientific understanding and a unique pharmaceutical resource. As a result of this unique resource, numerous regulatory research opportunities to improve public health arise from this distinctive scientific database, which includes examples of products' shelf lives, their long-term stability issues, and the various physical and chemical tests used to identify such failures. The database also serves as a scientific resource for mechanistic understanding and identification of test failures, leading to the development of new formulations or more robust packaging. It has been recognized that SLEP is very important in maintaining both national security and public welfare by confirming that stockpiled pharmaceutical products meet quality standards after the "expiration date" assigned by the sponsor. SLEP research is an example of the regulatory science needed to best ensure product performance past the original shelf life. The objective of this article is to provide a brief history and background and, most importantly, the public health benefits of the SLEP. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
45 CFR 261.62 - What must a State do to verify the accuracy of its work participation information?
Code of Federal Regulations, 2012 CFR
2012-10-01
... ensure a consistent measurement of the work participation rates, including the quality assurance... work participation information? 261.62 Section 261.62 Public Welfare Regulations Relating to Public..., DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK How Do We Ensure the Accuracy of Work...
Organizing Principles of Mammalian Nonsense-Mediated mRNA Decay
Popp, Maximilian Wei-Lin; Maquat, Lynne E.
2014-01-01
Cells use messenger RNAs (mRNAs) to ensure the accurate dissemination of genetic information encoded by DNA. Given that mRNAs largely direct the synthesis of a critical effector of cellular phenotype, i.e., proteins, tight regulation of both the quality and quantity of mRNA is a prerequisite for effective cellular homeostasis. Here, we review nonsense-mediated mRNA decay (NMD), which is the best-characterized posttranscriptional quality control mechanism that cells have evolved in their cytoplasm to ensure transcriptome fidelity. We use protein quality control as a conceptual framework to organize what is known about NMD, highlighting overarching similarities between these two polymer quality control pathways, where the protein quality control and NMD pathways intersect, and how protein quality control can suggest new avenues for research into mRNA quality control. PMID:24274751
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driver, C.J.
1994-05-01
Criteria for determining the quality of river sediment are necessary to ensure that concentrations of contaminants in aquatic systems are within acceptable limits for the protection of aquatic and human life. Such criteria should facilitate decision-making about remediation, handling, and disposal of contaminants. Several approaches to the development of sediment quality criteria (SQC) have been described and include both descriptive and numerical methods. However, no single method measures all impacts at all times to all organisms (U.S. EPA 1992b). The U.S. EPA's interest is primarily in establishing chemically based, numerical SQC that are applicable nation-wide (Shea 1988). Of the approaches proposed for SQC development, only three are being considered for numerical SQC on a national level: an Equilibrium Partitioning Approach, a site-specific method using bioassays (the Apparent Effects Threshold Approach), and an approach similar to EPA's water quality criteria (Pavlou and Weston 1984). Although national (or even regional) criteria address a number of political, litigative, and engineering needs, some researchers feel that protection of benthic communities requires site-specific, biologically based criteria (Baudo et al. 1990). This is particularly true for areas where complex mixtures of contaminants are present in sediments. Other scientifically valid and accepted procedures for freshwater SQC include a background concentration approach, methods using field or spiked bioassays, a screening level concentration approach, the Apparent Effects Threshold Approach, the Sediment Quality Triad, the International Joint Commission Sediment Assessment Strategy, and the National Status and Trends Program Approach. The various sediment assessment approaches are evaluated for application to the Hanford Reach, and recommendations for Hanford Site sediment quality criteria are discussed.
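The Equilibrium Partitioning Approach mentioned above derives a sediment criterion from an existing water quality criterion and the chemical's organic-carbon partition coefficient. A minimal sketch of that calculation follows; the function name and the numeric inputs are illustrative assumptions, not regulatory values.

```python
def sediment_criterion_eqp(koc, foc, water_criterion):
    """Equilibrium Partitioning (EqP) estimate of a sediment quality
    criterion for a nonionic organic chemical:

        C_sed = Koc * foc * C_water

    koc             -- organic-carbon partition coefficient (L/kg)
    foc             -- fraction organic carbon in the sediment (dimensionless)
    water_criterion -- water quality criterion (ug/L)
    Returns the criterion in ug per kg of dry sediment.
    """
    kp = koc * foc                      # bulk sediment/water partition coefficient (L/kg)
    return kp * water_criterion

# Illustrative, non-regulatory inputs: Koc = 1e5 L/kg, 2% organic carbon,
# water quality criterion of 0.1 ug/L.
print(sediment_criterion_eqp(1e5, 0.02, 0.1))
```

Because the result scales linearly with the sediment's organic-carbon fraction, EqP-based criteria are usually reported normalized to organic carbon rather than to bulk dry weight.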
Hoffman, Steven J; Justicz, Victoria
2016-07-01
To develop and validate a method for automatically quantifying the scientific quality and sensationalism of individual news records. After retrieving 163,433 news records mentioning the Severe Acute Respiratory Syndrome (SARS) and H1N1 pandemics, a maximum entropy model for inductive machine learning was used to identify relationships among 500 randomly sampled news records that correlated with systematic human assessments of their scientific quality and sensationalism. These relationships were then computationally applied to automatically classify 10,000 additional randomly sampled news records. The model was validated by randomly sampling 200 records and comparing human assessments of them with the computer assessments. The computer model correctly assessed the relevance of 86% of news records, the quality of 65% of records, and the sensationalism of 73% of records, as compared with human assessments. Overall, the scientific quality of SARS and H1N1 news media coverage had potentially important shortcomings, but coverage was not unduly sensationalized. Coverage improved slightly between the two pandemics. Automated methods can evaluate news records faster, cheaper, and possibly better than humans. The specific procedure implemented in this study can, at the very least, identify subsets of news records that are far more likely to have particular scientific and discursive qualities. Copyright © 2016 Elsevier Inc. All rights reserved.
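A maximum entropy model over bag-of-words features is mathematically equivalent to logistic regression. The sketch below illustrates the general idea, training a tiny classifier on a toy corpus with hypothetical labels and applying it to unseen records; it is not the authors' pipeline, and the word lists and labels are invented for illustration.

```python
import math
from collections import Counter

def train_maxent(docs, labels, lr=0.5, epochs=200):
    """Train a binary maximum-entropy (logistic-regression) classifier
    over bag-of-words counts using plain gradient descent."""
    vocab = sorted({w for d in docs for w in d.split()})
    idx = {w: i for i, w in enumerate(vocab)}

    def feats(doc):
        v = [0.0] * len(vocab)
        for w, c in Counter(doc.split()).items():
            if w in idx:                 # ignore words unseen in training
                v[idx[w]] = float(c)
        return v

    X = [feats(d) for d in docs]
    w, b = [0.0] * len(vocab), 0.0
    for _ in range(epochs):
        for x, y in zip(X, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))  # clamped sigmoid
            g = p - y                    # gradient of the log-loss
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]

    def predict(doc):
        z = b + sum(wi * xi for wi, xi in zip(w, feats(doc)))
        return 1 if z > 0 else 0
    return predict

# Toy corpus with hypothetical labels: 1 = "high scientific quality",
# 0 = "sensationalized".
docs = ["study peer reviewed evidence sample",
        "shocking miracle cure panic outbreak",
        "trial data evidence reviewed",
        "panic shocking deadly miracle"]
labels = [1, 0, 1, 0]
classify = train_maxent(docs, labels)
print(classify("peer reviewed trial evidence"))  # 1
print(classify("shocking panic outbreak"))       # 0
```

The validation step described in the abstract corresponds to running such a trained classifier on a held-out sample and computing agreement with the human labels.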
Quality and Safety in Health Care, Part IV: Quality and Cancer Care.
Harolds, Jay A
2015-11-01
The 1999 Institute of Medicine report Ensuring Quality Cancer Care discussed the difference between the cancer care actually received in the United States and the care that patients should get, as well as some points to consider in delivering optimum care. In 2012, a follow-up review article in the journal Cancer entitled "Ensuring quality cancer care" indicated that there had been some progress in the interim but that more remained to be done. The 2013 Institute of Medicine report Delivering High-Quality Cancer Care: Charting a New Course for a System in Crisis indicated that there are continuing major problems with cancer care and advocated a national system of quality reporting and a major information technology system to capture and help assess the data.
7 CFR 3401.17 - Review criteria.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION.... Overall scientific and technical quality of proposal 10 2. Scientific and technical quality of the.... Feasibility of attaining objectives; adequacy of professional training and experience, facilities and...
Cohen, Jonathan B.; Erwin, R. Michael; French, John B.; Marion, Jeffrey L.; Meyers, J. Michael
2009-01-01
The U.S. Geological Survey's Patuxent Wildlife Research Center (PWRC) conducted a study for the National Park Service (NPS) Southeast Region, Atlanta, GA, and Cape Hatteras National Seashore (CAHA) in North Carolina to review, evaluate, and summarize the available scientific information for selected species of concern at CAHA (piping plovers, sea turtles, seabeach amaranth, American oystercatchers, and colonial waterbirds). This work consisted of reviewing the scientific literature and evaluating the results of studies that examined critical life history stages of each species, and focused on the scientific findings reported that are relevant to the management of these species and their habitats at CAHA. The chapters that follow provide the results of that review separately for each species and present scientifically based options for resource management at CAHA. Although no new original research or experimental work was conducted, this synthesis of the existing information was peer reviewed by over 15 experts with familiarity with these species. This report does not establish NPS management protocols but does highlight scientific information on the biology of these species to be considered by NPS managers who make resource management decisions at CAHA. To ensure that the best available information is considered when assessing each species of interest at CAHA, this review included published research as well as practical experience of scientists and wildlife managers who were consulted in 2005. PWRC scientists evaluated the literature, consulted wildlife managers, and produced an initial draft that was sent to experts for scientific review. Revisions based on those comments were incorporated into the document. The final draft of the document was reviewed by NPS personnel to ensure that the description of the recent status and management of these species at CAHA was accurately represented and that the report was consistent with our work agreement. 
The following section summarizes the biological information relevant to resource management for the species of concern at CAHA.
"Wild cannabis": A review of the traditional use and phytochemistry of Leonotis leonurus.
Nsuala, Baudry N; Enslin, Gill; Viljoen, Alvaro
2015-11-04
Leonotis leonurus, locally commonly known as "wilde dagga" (=wild cannabis), is traditionally used as a decoction, both topically and orally, in the treatment of a wide variety of conditions such as haemorrhoids, eczema, skin rashes, boils, itching, muscular cramps, headache, epilepsy, chest infections, constipation, spider and snake bites. The dried leaves and flowers are also smoked to relieve epilepsy. The leaves and flowers are reported to produce a mild euphoric effect when smoked and have been said to have a similar, although less potent, psychoactive effect to cannabis. The aim was to amalgamate the botanical aspects, ethnopharmacology, phytochemistry, biological activity, toxicity and commercial aspects of the scientific literature available on L. leonurus. An extensive review of the literature from 1900 to 2015 was carried out. Electronic databases including Scopus, SciFinder, Pubmed, Google Scholar and Google were used as data sources. All abstracts, full-text articles and books written in English were considered. The phytochemistry of particularly the non-volatile constituents of L. leonurus has been comprehensively investigated, owing to the interest generated by the wide variety of biological effects reported for this plant. More than 50 compounds have been isolated and characterised. L. leonurus contains mainly terpenoids, particularly labdane diterpenes; the major diterpene reported is marrubiin. Various other compounds have been reported by some authors to have been isolated from the plant, including, in the popular literature only, the mildly psychoactive alkaloid leonurine. Leonurine has, however, never been reported in any scientific analysis of the extracts of L. leonurus. Despite the publication of various papers on L. leonurus, there is still a need for definitive research on, and clarification of, other compounds, including alkaloids and essential oils from L. leonurus, as well as from other plant parts, such as the roots, which are extensively used in traditional medicine. The traditional use by smoking also requires further investigation as to how the chemistry and activity are affected by this form of administration. Research has proven the psychoactive effects of the crude extract of L. leonurus, but confirmation of the presence of psychoactive compounds, as well as their isolation and characterization, is still required. Deliberate adulteration of L. leonurus with synthetic cannabinoids has been reported recently, in an attempt to facilitate the marketing of these illegal substances, highlighting the necessity for refinement of appropriate quality control processes to ensure safety and quality. Much work is therefore still required on quality control to ensure the safety, quality and efficacy of the product supplied to patients, as this plant is widely used in South Africa as a traditional medicine. Commercially available plant sources provide a viable option for phytochemical research, particularly with regard to the appropriate validation of the plant material (taxonomy) in order to identify and delimit closely related species such as L. leonurus and L. nepetifolia, which are very similar in habit. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Refining dosing by oral gavage in the dog: A protocol to harmonise welfare
Hall, Laura E.; Robinson, Sally; Buchanan-Smith, Hannah M.
2015-01-01
Introduction The dog is a frequently-used, non-rodent species in the safety assessment of new chemical entities. We have a scientific and ethical obligation to ensure that the best quality of data is achieved from their use. Oral gavage is a technique frequently used to deliver a compound directly into the stomach. As with other animals, in the dog, gavage is aversive, and the frequency of its use is a cause for welfare concern, but little research has been published on the technique or on how to Refine it. A Welfare Assessment Framework (Hall, 2014) was previously developed for use with the laboratory-housed dog, and a contrasting pattern of behaviour, cardiovascular and affective measures was found in dogs with positive and negative welfare. Methods Using the framework, this study compared the effects of sham dosing (used to attempt to habituate dogs to dosing) and a Refined training protocol against a control, no-training group to determine the benefit of each technique to welfare and scientific output. Results Our findings show that sham dosing is ineffective as a habituation technique and 'primes' rather than desensitises dogs to dosing. Dogs in the control group showed few changes in parameters across the duration of the study, with some undesirable changes during dosing, while dogs in the Refined treatment group showed improvements in many parameters. Discussion It is recommended that, if no time is allocated for pre-study training, a no-sham-dosing protocol be used. However, brief training periods show a considerable benefit for welfare and for the quality of data to be obtained from the dogs' use. PMID:25575806
du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria
2009-02-01
In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction to the critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, a detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined. Basic knowledge of study design, the structuring of an article, the role of its different sections and of statistical presentations, as well as sources of error and limitations, is presented. The reader does not require extensive methodological knowledge. As far as is necessary for critical appraisal, differences between research areas such as epidemiology, clinical research, and basic research are outlined. Further useful references are presented. Basic methodological knowledge is required to select and interpret scientific articles correctly.
Air Quality Criteria for Ozone and Related Photochemical ...
In February 2006, EPA released the final document, Air Quality Criteria for Ozone and Other Photochemical Oxidants. Tropospheric or surface-level ozone (O3) is one of six major air pollutants regulated by National Ambient Air Quality Standards (NAAQS) under the U.S. Clean Air Act. As mandated by the Clean Air Act, the U.S. Environmental Protection Agency (EPA) must periodically review the scientific bases (or criteria) for the various NAAQS by assessing newly available scientific information on a given criteria air pollutant. This document, Air Quality Criteria for Ozone and Other Photochemical Oxidants, is an updated revision of the 1996 Ozone Air Quality Criteria Document (O3 AQCD) that provided scientific bases for the current O3 NAAQS set in 1997. The Clean Air Act mandates periodic review of the National Ambient Air Quality Standards (NAAQS) for six common air pollutants, also referred to as criteria pollutants, including ozone.
Yucha, Carolyn B; Schneider, Barbara St Pierre; Smyer, Tish; Kowalski, Susan; Stowers, Eva
2011-01-01
The methodological quality of nursing education research has not been rigorously studied. The purpose of this study was to evaluate the methodological quality and scientific impact of nursing education research reports. The methodological quality of 133 quantitative nursing education research articles published between July 2006 and December 2007 was evaluated using the Medical Education Research Study Quality Instrument (MERSQI). The mean (+/- SD) MERSQI score was 9.8 +/- 2.2. It correlated (p < .05) with several scientific impact indicators: citation counts from Scopus (r = .223), Google Scholar (r = .224), and journal impact factor (r = .216); it was not associated with Web of Science citation count, funding, or h-index. The similarities between this study's MERSQI ratings for the nursing literature and those reported for the medical literature, coupled with the association with citation counts, suggest that the MERSQI is an appropriate instrument to evaluate the quality of nursing education research.
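The associations reported above are Pearson correlation coefficients between MERSQI scores and impact indicators. As a reminder of how such an r value is computed, here is a small sketch; the score and citation values are synthetic, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))   # unscaled covariance
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))           # unscaled std. dev. of x
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))           # unscaled std. dev. of y
    return cov / (sx * sy)

# Hypothetical MERSQI scores and citation counts (illustrative only).
mersqi = [7.5, 9.0, 10.5, 12.0, 13.5]
citations = [2, 5, 4, 9, 11]
print(round(pearson_r(mersqi, citations), 2))  # 0.94
```

Note that r measures only linear association; with article-level citation counts, which are heavily skewed, a rank correlation is often reported alongside it.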
ERIC Educational Resources Information Center
Wisconsin Department of Public Instruction, 2006
2006-01-01
This bulletin outlines the New Wisconsin Promise program - a commitment to ensuring that every Wisconsin child graduates with the knowledge and skills necessary for success in the twenty-first century global society by: (1) Ensuring quality teachers in every classroom and strong leadership in every school; (2) Improving student achievement with a…
Ensuring quality: a key consideration in scaling-up HIV-related point-of-care testing programs
Fonjungo, Peter N.; Osmanov, Saladin; Kuritsky, Joel; Ndihokubwayo, Jean Bosco; Bachanas, Pam; Peeling, Rosanna W.; Timperi, Ralph; Fine, Glenn; Stevens, Wendy; Habiyambere, Vincent; Nkengasong, John N.
2016-01-01
Objective: The objective of the WHO/US President's Emergency Plan for AIDS Relief consultation was to discuss innovative strategies, offer guidance, and develop a comprehensive policy framework for implementing quality-assured HIV-related point-of-care testing (POCT). Methods: The consultation was attended by representatives from international agencies (WHO, UNICEF, UNITAID, Clinton Health Access Initiative), United States Agency for International Development, Centers for Disease Control and Prevention/President's Emergency Plan for AIDS Relief Cooperative Agreement Partners, and experts from more than 25 countries, including policy makers, clinicians, laboratory experts, and program implementers. Main outcomes: There was strong consensus among all participants that ensuring access to quality-assured POCT represents one of the key challenges for the success of HIV prevention, treatment, and care programs. The following four strategies were recommended: implement a newly proposed concept of a sustainable quality assurance cycle that includes careful planning; definition of goals and targets; timely implementation; continuous monitoring; improvements and adjustments, where necessary; and a detailed evaluation; support a cadre of workers [e.g. volunteer quality corps (Q-Corps)] whose role is to ensure that the quality assurance cycle is followed and sustained; treat implementation of the new strategy as a step-wise process, supported by development of appropriate policies and tools; and build joint partnership under the leadership of the ministries of health to ensure the sustainability of implementing novel approaches. Conclusion: The outcomes of this consultation have been well received by program implementers in the field. The recommendations also laid the groundwork for developing key policy and quality documents for the implementation of HIV-related POCT. PMID:26807969
AIR QUALITY CRITERIA DOCUMENT(S) FOR LEAD
This collection of documents intends to assess the latest scientific information on the health and environmental fate and effects of lead, providing scientific bases for periodic review and possible revision of the National Ambient Air Quality Standards (NAAQS) for lead.
FY 2014 LDRD Annual Report Project Summaries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomchak, Dena
The FY 2014 Laboratory Directed Research and Development (LDRD) Annual Report is a compendium of the diverse research performed to develop and ensure the INL's technical capabilities can support future DOE missions and national research priorities. LDRD is essential to INL - it provides a means for the laboratory to pursue novel scientific and engineering research in areas that are deemed too basic or risky for programmatic investments. This research enhances technical capabilities at the laboratory, providing scientific and engineering staff with opportunities for skill building and partnership development.
IRIS Toxicological Review of 1,2,3-Trichloropropane (External ...
EPA conducted a peer review of the scientific basis supporting the human health hazard and dose-response assessment of 1,2,3-trichloropropane (TCP) that once finalized will appear on the Integrated Risk Information System (IRIS) database. Peer review is meant to ensure that science is used credibly and appropriately in derivation of the dose-response assessments and toxicological characterization. This Tox Review provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to 1,2,3-trichloropropane.
The '3Is' of animal experimentation.
2012-05-29
Animal experimentation in scientific research is a good thing: important, increasing and often irreplaceable. Careful experimental design and reporting are at least as important as attention to welfare in ensuring that the knowledge we gain justifies using live animals as experimental tools.
ERIC Educational Resources Information Center
Mathias, Charles McC., Jr.
1980-01-01
The value of scientific research can only be measured by the resulting benefits to society. Federal policymakers must ensure that funding for research and development is adequate and that research results are applied to the serious problems facing state and local governments. (SK)
Scientific Writing = Thinking in Words
USDA-ARS?s Scientific Manuscript database
Ensuring that research results are reported accurately and effectively is an eternal challenge for scientists. The book Scientific Writing = Thinking in Words (David Lindsay, 2011. CSIRO Publishing) is a primer for researchers who seek to improve their impact through better written (and oral) presentations...
Stavelin, Anne; Albe, Xavier; Meijer, Piet; Sarkany, Erika; MacKenzie, Finlay
2017-01-01
The European Organisation for External Quality Assurance Providers in Laboratory Medicine (EQALM) was founded in 1996 and currently has members from 29 European countries and 6 countries from outside Europe. EQALM provides a forum for co-operation and exchange of knowledge on quality-related matters in laboratory medicine, especially with regard to external quality assessment (EQA) programs in Europe. In addition, EQALM represents the EQA providers in laboratory medicine at the European level vis-à-vis political, professional, scientific and other bodies, including patients' organisations. To this end EQALM promotes activities such as organizing meetings with scientific and practical themes for members and other interested parties, issuing scientific publications, developing EQA projects and representing laboratory medicine EQA activities within other organisations and networks. EQALM is active in scientific and educational work in different fields such as survey frequency, haematology, haemostasis, microbiology, nomenclature, virtual microscopy, traceability, accreditation, and quality assurance of the total testing process. The aim of this paper is to give an overview of the EQALM organisation. PMID:28392724
Challenges of In-Flight Calibrations for the Mars Reconnaissance Orbiter Payload
NASA Technical Reports Server (NTRS)
Xaypraseuth, Peter
2007-01-01
The Mars Reconnaissance Orbiter is the most complex spacecraft that has ever been sent to investigate the Red Planet. A major part of what makes this mission so complex is the suite of instruments that were selected. The instruments on MRO vary from a simple imaging system, not much larger than a pocket knife, to the largest camera ever flown to another planet. Not only does the size of the instruments vary, so do the scientific investigations associated with each instrument. In order to ensure that this payload suite would be able to satisfy all of its science objectives, a major effort was put forth by the MRO Project to ensure these instruments were well calibrated prior to the start of the Primary Science Phase. The in-flight calibration plan for MRO proved quite challenging, given the often conflicting requirements arising from the varying capability of each instrument and the desire to constrain the workload on Mission Operations personnel. The quality of data returned by MRO since the start of the Primary Science Phase is a tribute to the effort that was put forth to characterize the in-flight performance of the instruments. This paper describes the challenges associated with the planning and implementation of the various calibration events on MRO and exhibits some of the results from those calibrations.
Ensuring Quality in AFRINEST and SATT
2013-01-01
Background: Three randomized open-label clinical trials [Simplified Antibiotic Therapy Trial (SATT) Bangladesh, SATT Pakistan and African Neonatal Sepsis Trial (AFRINEST)] were developed to test the equivalence of simplified antibiotic regimens compared with the standard regimen of 7 days of parenteral antibiotics. These trials were originally conceived and designed separately; subsequently, significant efforts were made to develop and implement a common protocol and approach. Previous articles in this supplement briefly describe the specific quality control methods used in the individual trials; this article presents additional information about the systematic approaches used to minimize threats to validity and ensure quality across the trials. Methods: A critical component of quality control for AFRINEST and SATT was striving to eliminate variation in clinical assessments and decisions regarding eligibility, enrollment and treatment outcomes. Ensuring appropriate and consistent clinical judgment was accomplished through standardized approaches applied across the trials, including training, assessment of clinical skills and refresher training. Standardized monitoring procedures were also applied across the trials, including routine (day-to-day) internal monitoring of performance and adherence to protocols, systematic external monitoring by funding agencies and external monitoring by experienced, independent trial monitors. A group of independent experts (Technical Steering Committee/Technical Advisory Group) provided regular monitoring and technical oversight for the trials. Conclusions: Harmonization of AFRINEST and SATT has helped to ensure consistency and quality of implementation, both internally and across the trials as a whole, thereby minimizing potential threats to the validity of the trials' results. PMID:23945575
[Packaging: the guarantee of medicinal quality].
Chaumeil, J-C
2003-01-01
Primary packaging guarantees the pharmaceutical quality of the medicinal preparation received by the patient. Glass bottles containing parenteral solutions, for example, ensure that sterility, quality and optimal stability are preserved until administration. Recent innovations in materials research have led to improvements in parenteral infusions. Multicompartmental bags, allowing extemporaneous mixtures without opening the container, constitute an extremely beneficial advance for the patient, permitting administration of mixtures with solutions and emulsions which would be unstable if stored. Metered dose pressurized inhalers are an excellent example of drug administration devices designed specifically to ensure quality and bioavailability. These examples illustrate the important role of primary packaging and demonstrate the usefulness of research and development in this area.
Strawman Philosophical Guide for Developing International Network of GPM GV Sites
NASA Technical Reports Server (NTRS)
Smith, Eric A.
2005-01-01
The creation of an international network of ground validation (GV) sites that will support the Global Precipitation Measurement (GPM) Mission's international science programme will require detailed planning of mechanisms for exchanging technical information, GV data products, and scientific results. An important component of the planning will be the philosophical guide under which the network will grow and emerge as a successful element of the GPM Mission. This philosophical guide should be able to serve the mission in developing scientific pathways for ground validation research which will ensure the highest possible quality measurement record of global precipitation products. The philosophical issues, in this regard, partly stem from the financial architecture under which the GV network will be developed, i.e., each participating country will provide its own financial support through committed institutions, regardless of whether a national or international space agency is involved. At the 1st International GPM Ground Validation Workshop held in Abingdon, UK in November 2003, most of the basic tenets behind the development of the international GV network were identified and discussed. Therefore, with this progress in mind, this presentation is intended to put forth a strawman philosophical guide supporting the development of the international network of GPM GV sites, noting that the initial progress has been reported in the Proceedings of the 1st International GPM GV Workshop, available online. The central philosophical issues all flow from the fact that each participating institution can only bring to the table GV facilities and scientific personnel that are affordable to the sanctioning (funding) national agency (be that a research, research-support, or operational agency).
This situation imposes on the network heterogeneity in the measuring sensors, data collection periods, data collection procedures, data latencies, and data reporting capabilities. Therefore, in order for the network to be effective in supporting the central scientific goals of the GPM mission, there must be a basic agreed-upon doctrine under which the network participants function, namely: (1) an overriding set of general scientific requirements; (2) a minimal set of policies governing the free flow of GV data between the scientific participants; (3) a few basic definitions concerning the prioritization of measurements and their respective value to the mission; (4) a few basic procedures concerning data formats, data reporting procedures, data access, and data archiving; and (5) a simple means to differentiate GV sites according to their level of effort and ability to perform near real-time data acquisition and data reporting tasks. Most important, should sites choose to operate as near real-time data collection and distribution sites, they would be expected to operate under a fairly narrowly defined protocol needed to ensure smooth GV support operations. This presentation will suggest measures responsive to items (1)-(5) from which to proceed. In addition, this presentation will seek to stimulate discussion and debate concerning how much heterogeneity is tolerable within the eventual GV site network, given that any individual GV site can only be considered scientifically useful if it supports the achievement of the central GPM Mission goals. Only ground validation research that has a direct connection to the space mission should be considered justifiable given the overarching scientific goals of the mission.
Therefore each site will have to seek some level of accommodation to what the GPM Mission requires in the way of retrieval error characterization, retrieval error detection and reporting, and generation of GV data products that support assessment and improvement of the mission's standard precipitation retrieval algorithms. These are all important scientific issues that will be best resolved in open scientific debate.
Spinks, Tracy; Albright, Heidi W.; Feeley, Thomas W.; Walters, Ron; Burke, Thomas W.; Aloia, Thomas; Bruera, Eduardo; Buzdar, Aman; Foxhall, Lewis; Hui, David; Summers, Barbara; Rodriguez, Alma; DuBois, Raymond; Shine, Kenneth I.
2011-01-01
Responding to growing concerns regarding the safety, quality, and efficacy of cancer care in the United States, the Institute of Medicine (IOM) of the National Academy of Sciences commissioned a comprehensive review of cancer care delivery in the US healthcare system in the late 1990s. The National Cancer Policy Board (NCPB), a twenty-member board with broad representation, performed this review. In its review, the NCPB focused on the state of cancer care delivery at that time, its shortcomings, and ways to measure and improve the quality of cancer care. The NCPB described an ideal cancer care system, where patients would have equitable access to coordinated, guideline-based care and novel therapies throughout the course of their disease. In 1999, the IOM published the results of this review in its influential report, Ensuring Quality Cancer Care. This report outlined ten recommendations, which, when implemented, would: 1) improve the quality of cancer care; 2) increase our understanding of quality cancer care; and, 3) reduce or eliminate access barriers to quality cancer care. Despite the fervor generated by this report, there are lingering doubts regarding the safety and quality of cancer care in the United States today. Increased awareness of medical errors and barriers to quality care, coupled with escalating healthcare costs, has prompted national efforts to reform the healthcare system. These efforts by healthcare providers and policymakers should bridge the gap between the ideal state described in Ensuring Quality Cancer Care and the current state of cancer care in the United States. PMID:22045610
Drug quality in South Africa: perceptions of key players involved in medicines distribution.
Patel, Aarti; Norris, Pauline; Gauld, Robin; Rades, Thomas
2009-01-01
Substandard medicines contribute to poor public health and affect development, especially in the developing world. However, knowledge of how manufacturers, distributors and providers understand the concept of drug quality, and of what strategies they adopt to ensure drug quality, is limited, particularly in the developing world. The purpose of this paper is to explore pharmaceutical manufacturers', distributors' and providers' perceptions of drug quality in South Africa and how they ensure the quality of drugs during the distribution process. The approach taken was qualitative data collection through key informant interviews, using a semi-structured interview guide, in Johannesburg, Pretoria and Durban, South Africa; transcripts were analysed thematically. Participants were recruited purposefully from a South African pharmaceutical manufacturer, SA subsidiaries of international manufacturers, national distribution companies, a national wholesaler, public and private sector pharmacists, and a dispensing doctor. In total, ten interviews were conducted. Participants described drug quality in terms of the product and the processes involved in manufacturing and handling the product. Participants identified purchasing registered medicines from licensed suppliers, use of standard operating procedures, and audits between manufacturer and distributor and/or provider as key strategies employed to protect medicine quality. Effective communication amongst all stakeholders, especially in terms of providing feedback regarding complaints about medicine quality, appears to be a potential area of concern which would benefit from further research. The paper highlights that ensuring medicine quality should be a shared responsibility amongst all involved in the distribution process, to prevent medicines moving from one distribution system (public) into another (private).
76 FR 8753 - Final Information Quality Guidelines Policy
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-15
... DEPARTMENT OF HOMELAND SECURITY Final Information Quality Guidelines Policy AGENCY: Department of Homeland Security. ACTION: Notice and request for public comment on Final Information Quality Guidelines. SUMMARY: These guidelines should be used to ensure and maximize the quality of disseminated information...
75 FR 37819 - Proposed Information Quality Guidelines Policy
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... DEPARTMENT OF HOMELAND SECURITY Proposed Information Quality Guidelines Policy ACTION: Notice and request for public comment on Proposed Information Quality Guidelines. SUMMARY: These guidelines should be used to ensure and maximize the quality of disseminated information. The Department's guidelines are...
Design and Implementation of Green Construction Scheme for a High-rise Residential Building Project
NASA Astrophysics Data System (ADS)
Zhou, Yong; Huang, You Zhen
2018-06-01
This paper studies the green construction scheme of a high-rise residential building project. The scheme is analysed from the perspective of the "four savings and one environmental protection" principle: saving materials, water, energy and land, and protecting the environment. By adopting scientific, advanced, reasonable and economical construction technology measures and implementing green construction methods, the project promotes building energy-saving technologies, ensures the sustainable use of resources, maximizes savings of resources and energy, increases energy efficiency, reduces pollution and the adverse environmental impact of construction activities, ensures construction safety, and builds sustainable buildings.
Austin, J Matthew; Demski, Renee; Callender, Tiffany; Lee, K H Ken; Hoffman, Ann; Allen, Lisa; Radke, Deborah A; Kim, Yungjin; Werthman, Ronald J; Peterson, Ronald R; Pronovost, Peter J
2017-04-01
As the health care system in the United States places greater emphasis on the public reporting of quality and safety data and its use to determine payment, provider organizations must implement structures that ensure discipline and rigor regarding these data. An academic health system, as part of a performance management system, applied four key components of a financial reporting structure to support the goal of top-to-bottom accountability for improving quality and safety. The four components implemented by Johns Hopkins Medicine were governance, accountability, reporting of consolidated quality performance statements, and auditing. Governance is provided by the health system's Patient Safety and Quality Board Committee, which reviews goals and strategy for patient safety and quality, reviews quarterly performance for each entity, and holds organizational leaders accountable for performance. An accountability plan includes escalating levels of review corresponding to the number of months an entity misses the defined performance target for a measure. A consolidated quality statement helps inform the Patient Safety and Quality Board Committee and leadership on key quality and safety issues. An audit evaluates the efficiency and effectiveness of processes for data collection, validation, and storage, to ensure the accuracy and completeness of quality measure reporting. If hospitals and health systems truly want to prioritize improvements in safety and quality, they will need to create a performance management system that ensures data validity and supports performance accountability. Without valid data, it is difficult to know whether a performance gap is due to data quality or clinical quality. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
Kremzner, Mary
2016-01-01
Ensuring that the drugs patients take are safe and effective is critical to the Food and Drug Administration (FDA) mission and a major reason for testing an active pharmaceutical ingredient or currently marketed drug product. To address gaps in the assessment of drug quality, FDA's Center for Drug Evaluation and Research (CDER) has created the Office of Pharmaceutical Quality (OPQ). This newly formed "super-office" within CDER launched a concerted new strategy that enhances the surveillance of drug manufacturing and will bring a comprehensive approach to quality oversight. With OPQ and these new performance measures in place, FDA can sharpen its focus on issues critical to quality and can identify and respond to manufacturing issues before they become major systemic problems. Published by Elsevier Inc.
Aboushanab, Tamer; AlSanad, Saud
2018-06-08
Cupping therapy is a popular treatment in various countries and regions, including Saudi Arabia. Cupping therapy is regulated in Saudi Arabia by the National Center for Complementary and Alternative Medicine (NCCAM), Ministry of Health. The authors recommend that this quality model for selecting patients in cupping clinics - first version (QMSPCC-1) - be used routinely as part of clinical practice and quality management in cupping clinics. The aim of the quality model is to ensure the safety of patients and to introduce and facilitate quality and auditing processes in cupping therapy clinics. Clinical evaluation of this tool is recommended. Continued development, re-evaluation and reassessment of this tool are important. Copyright © 2018. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Since its creation in 1946, Argonne National Laboratory has addressed the nation’s most pressing challenges in science, energy, the environment, and national security. United by a common goal – to improve the world – Argonne continues to drive the scientific and technological breakthroughs needed to ensure a sustainable future.
ERIC Educational Resources Information Center
Forman, Paul
1982-01-01
Physicists had assumed that the world is distinguishable from its mirror image and constructed theories to ensure that the corresponding mathematical property (parity) is conserved in all subatomic processes. However, a scientific experiment demonstrated an intrinsic handedness to at least one physical process. The experiment, equipment, and…
33 CFR 385.9 - Implementation principles.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of the Plan at specific time intervals during implementation. Interim targets to evaluate progress on... accordance with § 385.39. Interim goals and interim targets shall be consistent with each other. (c... ensure that new information resulting from changed or unforeseen circumstances, new scientific and...
33 CFR 385.9 - Implementation principles.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of the Plan at specific time intervals during implementation. Interim targets to evaluate progress on... accordance with § 385.39. Interim goals and interim targets shall be consistent with each other. (c... ensure that new information resulting from changed or unforeseen circumstances, new scientific and...
33 CFR 385.9 - Implementation principles.
Code of Federal Regulations, 2011 CFR
2011-07-01
... of the Plan at specific time intervals during implementation. Interim targets to evaluate progress on... accordance with § 385.39. Interim goals and interim targets shall be consistent with each other. (c... ensure that new information resulting from changed or unforeseen circumstances, new scientific and...
33 CFR 385.9 - Implementation principles.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of the Plan at specific time intervals during implementation. Interim targets to evaluate progress on... accordance with § 385.39. Interim goals and interim targets shall be consistent with each other. (c... ensure that new information resulting from changed or unforeseen circumstances, new scientific and...
Enabling Scientists: Serving Sci-Tech Library Users with Disabilities.
ERIC Educational Resources Information Center
Coonin, Bryna
2001-01-01
Discusses how librarians in scientific and technical libraries can contribute to an accessible electronic library environment for users with disabilities to ensure independent access to information. Topics include relevant assistive technologies; creating accessible Web pages; monitoring accessibility of electronic databases; preparing accessible…
Startsev, N; Dimov, P; Grosche, B; Tretyakov, F; Schüz, J; Akleyev, A
2015-01-01
To follow up populations exposed to several radiation accidents in the Southern Urals, a cause-of-death registry was established at the Urals Center, capturing deaths in the Chelyabinsk, Kurgan and Sverdlovsk regions since 1950. When registering deaths over such a long time period, quality measures need to be in place to maintain quality and reduce the impact of individual coders as well as of quality changes in death certificates. To ensure the uniformity of coding, a method for semi-automatic coding was developed, which is described here. Briefly, the method is based on a dynamic thesaurus, database-supported coding and parallel coding by two different individuals. A comparison of the proposed method for organizing the coding process with the common procedure of coding showed good agreement, reaching 70-90% agreement for the three-digit ICD-9 rubrics at the end of the coding process. The semi-automatic method ensures a sufficiently high quality of coding while at the same time providing an opportunity to reduce the labor intensity inherent in the creation of large-volume cause-of-death registries.
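The parallel-coding quality check described above amounts to comparing two coders' assignments at the three-digit ICD-9 rubric level. A minimal sketch of such a comparison follows; this is an illustration only, not the registry's actual software, and the helper names (`three_digit`, `rubric_agreement`) are hypothetical.

```python
def three_digit(code: str) -> str:
    """Truncate an ICD-9 code such as '410.1' to its three-digit rubric '410'."""
    return code.split(".")[0][:3]

def rubric_agreement(coder_a: list[str], coder_b: list[str]) -> float:
    """Fraction of records for which two independent coders assign the
    same three-digit ICD-9 rubric (the agreement level reported above)."""
    if len(coder_a) != len(coder_b):
        raise ValueError("both coders must code the same records")
    matches = sum(three_digit(a) == three_digit(b)
                  for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)
```

Codes that agree on the rubric but differ in the fourth digit (e.g. 410.1 vs. 410.9) still count as agreement, matching the three-digit comparison level used in the abstract.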
Lessons from industry: one school's transformation toward "lean" curricular governance.
Stratton, Terry D; Rudy, David W; Sauer, Marlene J; Perman, Jay A; Jennings, C Darrell
2007-04-01
As medical education grapples with organizational calls for centralized curricular oversight, programs may be compelled to respond by establishing highly vertical, stacked governance structures. Although these models offer discrete advantages over the horizontal, compartmentalized structures they are designed to replace, they pose new challenges to ensuring curricular quality and the educational innovations that drive the curricula. The authors describe a hybrid quality-assurance (QA) governance structure introduced in 2003 at the University of Kentucky College of Medicine (UKCOM) that ensures centralized curricular oversight of the educational product while allowing individualized creative control over the educational process. Based on a Lean production model, this approach draws on industry experiences that strategically separate institutional accountability (management) for a quality curriculum from the decision-making processes required to ensure it (production). In so doing, the authors acknowledge general similarities and key differences between overseeing the manufacture of a complex product and the education of a physician, emphasizing the structured, sequential, and measurable nature of each process. Further, the authors briefly trace the emergence of quality approaches in manufacturing and discuss the philosophical changes that accompany transition to an institutional governance system that relies on vigorous, robust performance measures to offer continuous feedback on curricular quality.
Electron beam processing of fresh produce - A critical review
NASA Astrophysics Data System (ADS)
Pillai, Suresh D.; Shayanfar, Shima
2018-02-01
To meet the increasing global demand for fresh produce, robust processing methods that ensure both the safety and quality of fresh produce are needed. Since fresh produce cannot withstand thermal processing conditions, most of the common safety interventions used in other foods are ineffective. Electron beam (eBeam) processing is a non-thermal technology that can be used to extend the shelf life and ensure the microbiological safety of fresh produce. Studies have documented the application of eBeam to ensure both the safety and quality of fresh produce; however, several areas remain unexplored and need further research. This is a critical review of the current literature on the application of eBeam technology to fresh produce.
Quality Assurance in Higher Education: A Review of Literature
ERIC Educational Resources Information Center
Ryan, Tricia
2015-01-01
This paper examines the literature surrounding quality assurance in global higher education. It provides an overview of accreditation as a mechanism to ensure quality in higher education, examines models of QA, and explores the concept of quality (including definitions of quality and quality assurance). In addition, this paper provides a review of…
Pharmaceutical quality by design: product and process development, understanding, and control.
Yu, Lawrence X
2008-04-01
The purpose of this paper is to discuss pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. The QbD was described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. The QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: defining the target product quality profile; designing the product and manufacturing processes; identifying critical quality attributes, process parameters, and sources of variability; and controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables. Product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.
77 FR 11121 - Scientific Information Request on Treatment of Atrial Fibrillation
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-24
... fibrillation medical devices. Scientific information is being solicited to inform our Comparative Effectiveness... unpublished pertinent scientific information on this device will improve the quality of this comparative effectiveness review. AHRQ is requesting this scientific information and conducting this comparative...
Advances in traffic data collection and management : white paper.
DOT National Transportation Integrated Search
2003-01-31
This white paper identifies innovative approaches for improving data quality through Quality Control. Quality Control emphasizes good data by ensuring selection of the most accurate detector then optimizing detector system performance. This is contra...
Geospatial Science is increasingly becoming an important tool in making Agency decisions. QualIty Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY IN GEOPSPATIAL SOLUTIONS
In 1999, the U.S. Environmental Protection Agency (EPA), Office of Research and Development, Environmental Sciences Division, created the EPA Geospatial Quality Council (GQC) to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. GQC participants inclu...
Quality Program Provisions for Aeronautical and Space System Contractors
NASA Technical Reports Server (NTRS)
1969-01-01
This publication sets forth quality program requirements for NASA aeronautical and space programs, systems, subsystems, and related services. These requirements provide for the effective operation of a quality program which ensures that quality criteria and requirements are recognized, definitized, and performed satisfactorily.
ERIC Educational Resources Information Center
Brickman, Peggy; Gormally, Cara; Francom, Greg; Jardeleza, Sarah E.; Schutte, Virginia G. W.; Jordan, Carly; Kanizay, Lisa
2012-01-01
Students must learn content knowledge and develop scientific literacy skills to evaluate and use scientific information in real-world situations. Recognizing the accessibility of scientific information to the average citizen, we developed an instructional approach to help students learn how to judge the quality of claims. We describe a…
Elliott, Lawrie; Crombie, Iain K; Irvine, Linda; Cantrell, Jane; Taylor, Julie
2004-01-01
In 1999 the Scottish Office, United Kingdom, intimated that the Chief Nursing Officer would undertake a policy review of nurses' contribution to improving the public's health. The importance of reviewing the scientific literature on the effectiveness of public health nursing was recognized as a crucial part of the policy review. A final report was expected within a 6-month period. The reason for the short time period was to fit the policy-making schedule. This paper discusses our literature review for this work. The aim was to conduct a review of the international scientific literature that gave the greatest coverage of the role and potential role of nurses in improving the public's health in relation to 14 major health topics. This paper describes the methods used, outlines the rationale underpinning the methods, discusses the problems encountered and offers solutions to some of these problems. The initial search for relevant scientific literature revealed 709 suitable primary papers. Reviewing this number was beyond the time limit set by the funding organization. Therefore, a decision was made to concentrate on the evidence contained in systematic reviews. Reviewing systematic reviews raises a number of methodological problems to which there are often no predetermined solutions, such as ensuring that important interventions are included, assessing the relevance and quality of the reviews, and grading the strength of the evidence. Reviewing systematic reviews provides the scope to increase the number of topics that might be covered. However, it is possible that a number of interventions may be missed, particularly those that are not subject to review or those assessed using qualitative techniques. The definition of public health nursing used in the present study was also restrictive, and could be widened to include community interventions. 
Finally, assessing the quality of reviews and grading the evidence proved difficult and there is lack of consensus on how these tasks should be achieved. Nevertheless, the review presented policy makers with accessible information on a large number of relevant international studies.
The Higher the Quality of Teaching the Higher the Quality of Education
ERIC Educational Resources Information Center
Sultana, Naveed; Yousuf, Muhammad Imran; Ud Din, Muhammad Naseer; Rehman, Sajid
2009-01-01
Higher education plays a leadership role in the system of education. Quality education can ensure the security, welfare and prosperity of a nation. The key factors influencing the quality of higher education include the quality of faculty, curriculum standards, the technological infrastructure available, the research environment, the accreditation regime,…
Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.
2016-01-01
Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNR-LB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNR-LB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
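The bootstrap procedure described in this abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: SNR is taken here as the ratio of trial-average waveform power to residual (single-trial) power, and the function names and the exclusion criterion value are hypothetical.

```python
import numpy as np

def snr_ci_lower(trials, n_boot=1000, alpha=0.05, seed=0):
    """Bootstrap lower confidence bound on the SNR of a subject's ERP.

    trials: 2D array (n_trials, n_samples) of single-trial epochs.
    For each bootstrap sample, trials are resampled with replacement,
    averaged into an ERP waveform, and SNR is estimated as the ratio of
    the ERP's power (variance over time) to the residual power of the
    single trials around that ERP.
    """
    rng = np.random.default_rng(seed)
    n_trials = trials.shape[0]
    snrs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n_trials, n_trials)  # resample with replacement
        sample = trials[idx]
        erp = sample.mean(axis=0)                  # bootstrap ERP waveform
        signal_power = erp.var()
        noise_power = (sample - erp).var()
        snrs[b] = signal_power / noise_power
    # lower bound of the (1 - alpha) bootstrap confidence interval
    return np.percentile(snrs, 100 * alpha / 2)

def exclude_subject(trials, criterion=0.1):
    """Reject a subject whose lower-bound SNR falls below the criterion."""
    return snr_ci_lower(trials) < criterion
```

A subject with a clear evoked response yields a lower bound well above the criterion, while a subject contributing only noise yields a lower bound near zero, so exclusion no longer depends on visual inspection.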
DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND ...
The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.
Motorcycle helmets in Vietnam: ownership, quality, purchase price, and affordability.
Hung, Dang Viet; Stevenson, Mark R; Ivers, Rebecca Q
2008-06-01
This study investigated motorcycle helmet ownership, quality, purchase price, and affordability in Vietnam. A random sample of motorcyclists was interviewed about helmet ownership and the purchase price and affordability of a motorcycle helmet. Multivariate modeling was conducted to determine factors associated with the purchase price and affordability of motorcycle helmets. Helmet quality was assessed against current legal requirements in Vietnam. The prevalence of helmet use in Vietnam remains low (23.3%) despite a high level of helmet ownership (94%), indicating that this is an important area for public health intervention. Overall, the quality of helmets appeared to be good; however, few helmets displayed legally required information. Motorcyclists with a high income purchase more helmets for their household rather than more expensive helmets. To ensure that helmets are accessible to the community, policy-makers need to consider pricing motorcycle helmets at the levels indicated by the results of this study. Prior to universal motorcycle helmet legislation, the government will also need to ensure that standard helmets are available and that enforcement is at a level sufficient to ensure that motorcycle helmets are actually used.
What Does it Mean to Publish Data in Earth System Science Data Journal?
NASA Astrophysics Data System (ADS)
Carlson, D.; Pfeiffenberger, H.
2015-12-01
The availability of more than 120 data sets in ESSD represents an unprecedented effort by providers, data centers and ESSD. ESSD data sets and their accompanying data descriptions undergo rigorous review. The data sets reside at any of more than 20 cooperating data centers. The ESSD publication process depends on but challenges the concepts of digital object identification and exacerbates the varied interpretations of the phrase 'data publication'. ESSD adopts the digital object identifier (doi). Key questions apply to doi's and other identifiers. How will persistent identifiers point accurately to distributed or replicated data? How should data centers and data publishers use identifier technologies to ensure authenticity and integrity? Should metadata associated with identifiers distinguish among raw, quality controlled and derived data processing levels, or indicate license or copyright status? Data centers publish data sets according to internal metadata standards but without indicators of quality control. Publication in this sense indicates availability. National data portals compile, serve and publish data products as a service to national researchers and, often, to meet national requirements. Publication in this second case indicates availability in a national context; the data themselves may still reside at separate data centers. Data journals such as ESSD or Scientific Data publish peer-reviewed, quality controlled data sets. These data sets almost always reside at a separate data center - the journal and the center maintain explicit identifier linkages. Data journals add quality to the feature of availability. A single data set processed through these layers will generate three independent doi's, but the doi's will provide little information about availability or quality. Could the data world learn from the URL world to consider additions? Suffixes? Could we use our experience with processing levels or data maturity to propose and agree on such extensions?
Dupuytren Disease: Is There Enough Comprehensive Patient Information on the Internet?
Zuk, Grzegorz; Reinisch, Katharina B; Raptis, Dimitri A; Fertsch, Sonia; Guggenheim, Merlin; Palma, Adrian F
2017-06-22
Dupuytren disease is a chronic nonmalignant fibroproliferative disorder that causes finger contractures via proliferation of new tissue under the glabrous skin of the hand, resulting in multiple functional limitations for the patient. As many surgical therapy options exist, patients suffering from this condition actively search for information in their environment before consulting a health professional. As little is known about the quality of Web-based patient information, the aim of this study was to conduct its systematic evaluation using a validated tool. A total of 118 websites were included, and qualitative and quantitative assessment was performed using the modified Ensuring Quality Information for Patients (EQIP) tool. This standardized and reproducible tool consists of 36 items to assess available information in three categories: contents, identification, and structure data. Scientific data with restricted access, duplicates, and irrelevant websites were not included. Only 32 websites addressed more than 19 items, and the scores did not significantly differ among the website developers. The median number of items from the EQIP tool was 16, with the top websites addressing 28 out of 36 items. The quality of the newly developed websites did not increase with passing time. This study revealed several shortcomings in the quality of Web-based information available for patients suffering from Dupuytren disease. In the world of continuously growing and instantly available Web-based information, it reflects health providers' negligence over the last two decades that there are very few good-quality, informative, and educative websites that can be recommended to patients. ©Grzegorz Zuk, Katharina B Reinisch, Dimitri A Raptis, Sonia Fertsch, Merlin Guggenheim, Adrian F Palma. Originally published in the Interactive Journal of Medical Research (http://www.i-jmr.org/), 22.06.2017.
AGU's new task force on scientific ethics and integrity begins work
NASA Astrophysics Data System (ADS)
Gleick, Peter; Townsend, Randy
2011-11-01
In support of the new strategic plan, AGU has established a new task force to review, evaluate, and update the Union's policies on scientific misconduct and the process for investigating and responding to allegations of possible misconduct by AGU members. As noted by AGU president Michael McPhaden, "AGU can only realize its vision of 'collaboratively advancing and communicating science and its power to ensure a sustainable future' if we have the trust of the public and policy makers. That trust is earned by maintaining the highest standards of scientific integrity in all that we do. The work of the Task Force on Scientific Ethics is essential for defining norms of professional conduct that all our members can aspire to and that demonstrate AGU's unwavering commitment to excellence in Earth and space science."
Georg Neumayer and Melbourne Observatory: an institutional legacy
NASA Astrophysics Data System (ADS)
Gillespie, Richard
This paper assesses Georg Neumayer's impact on the Victorian scientific community, and especially his role in the establishment of Melbourne Observatory as a major scientific institution in colonial Australia. Neumayer's arrival in Melbourne to pursue his own scientific project triggered a chain of events that would lead to the creation of Melbourne Observatory and the integration of Neumayer's geomagnetic and meteorological research into the ongoing program of the observatory. The location of the observatory in South Yarra was a direct result of Neumayer's insistence that the site was the most suitable for geomagnetic measurement. Most critically, Neumayer's attempts to get approval for his project highlighted the need for local scientists to establish political and scientific alliances that would ensure endorsement by international, notably British, scientists, and that would persuade local elites and government of the practical value of their research.
NASA Astrophysics Data System (ADS)
Añel, Juan A.
2017-03-01
Nowadays, the majority of the scientific community is unaware of the risks and problems associated with inadequate use of computer systems for research, particularly for the reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright law can have undesirable effects on access to important aspects of the design of experiments and, therefore, on the interpretation of results.
Scientific Integrity Policy Creation and Implementation.
NASA Astrophysics Data System (ADS)
Koizumi, K.
2017-12-01
Ensuring the integrity of science was a priority for the Obama Administration. In March 2009, President Obama issued a Presidential Memorandum that recognized the need for the public to be able to trust the science and scientific process informing public policy decisions. In 2010, the White House Office of Science and Technology Policy (OSTP) issued a Memorandum providing guidelines for Federal departments and agencies to follow in developing scientific integrity policies. This Memorandum describes minimum standards for: (1) strengthening the foundations of scientific integrity in government, including by shielding scientific data and analysis from inappropriate political influence; (2) improving public communication about science and technology by promoting openness and transparency; (3) enhancing the ability of Federal Advisory Committees to provide independent scientific advice; and (4) supporting the professional development of government scientists and engineers. The Memorandum called upon the heads of departments and agencies to develop scientific integrity policies that meet these requirements. At the end of the Obama Administration, 24 Federal departments and agencies had developed and implemented scientific integrity policies consistent with the OSTP guidelines. This year, there are significant questions as to the Trump Administration's commitment to these scientific integrity policies, as well as interest in Congress in codifying these policies in law. The session will provide an update on the status of agency scientific integrity policies and legislation.
48 CFR 1646.201 - Contract Quality Policy.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ensure that services acquired under the FEHB contract conform to the contract's quality and audit requirements. (b) OPM will periodically evaluate the contractor's system of internal controls under the quality...
42 CFR 493.1445 - Standard; Laboratory director responsibilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... quality laboratory services for all aspects of test performance, which includes the preanalytic, analytic... result is found to be unacceptable or unsatisfactory; (5) Ensure that the quality control and quality assessment programs are established and maintained to assure the quality of laboratory services provided and...
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS
This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented including:
o Guidance for Geospatial Data Quality Assurance Project Plans.
o GPS - Tec...
Berger, M; Kooyman, P J; Makkee, M; van der Zee, J S; Sterk, P J; van Dijk, J; Kemper, E M
2016-08-19
Clinical studies investigating medicinal products need to comply with laws concerning good clinical practice (GCP) and good manufacturing practice (GMP) to guarantee the quality and safety of the product, to protect the health of the participating individual, and to assure proper performance of the study. However, there are no specific regulations or guidelines for non-Medicinal Investigational Products (non-MIPs) such as allergens, enriched food supplements, and air pollution components. As a consequence, investigators will avoid clinical research and prefer preclinical models or in vitro testing for, e.g., toxicology studies. The aims of this paper are to: 1) briefly review the current guidelines and regulations for Investigational Medicinal Products; 2) present a standardised approach to ensure the quality and safety of non-MIPs in human in vivo research; and 3) discuss some lessons we have learned. We propose a practical line of approach to compose a clarifying product dossier (PD), comprising the description of the production process, the analysis of the raw and final product, toxicological studies, and a thorough risk-benefit analysis. This is illustrated by an example from a human in vivo research model to study exposure to air pollutants, by challenging volunteers with a suspension of carbon nanoparticles (the component of ink cartridges for laser printers). With this novel risk-based approach, the members of competent authorities are provided with standardised information on the quality of the product in relation to the safety of the participants, and the scientific goal of the study.