Sample records for development production validation

  1. The Importance of Method Selection in Determining Product Integrity for Nutrition Research

    PubMed Central

    Mudge, Elizabeth M; Brown, Paula N

    2016-01-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823

  2. The Importance of Method Selection in Determining Product Integrity for Nutrition Research.

    PubMed

    Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N

    2016-03-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.

  3. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    NASA Astrophysics Data System (ADS)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

    The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by expert assessment, for the Animal Development course. This is development research that produces a product in the form of a learning model, consisting of two sub-products: the syntax of the learning model and student worksheets. All of these products were standardized through expert validation. The research data are the validity levels of all sub-products, obtained using a questionnaire filled in by validators from various fields of expertise (field of study, learning strategy, Bahasa). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced, and that the sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.

  4. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect requirements with design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and validation method for the requirements are introduced in detail, with the hope of providing experience for other civil jet product designs.

  5. Parent Reports of Young Spanish-English Bilingual Children's Productive Vocabulary: A Development and Validation Study

    ERIC Educational Resources Information Center

    Mancilla-Martinez, Jeannette; Gámez, Perla B.; Vagh, Shaher Banu; Lesaux, Nonie K.

    2016-01-01

    Purpose: This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension…

  6. Validated environmental and physiological data from the CELSS Breadboard Projects Biomass Production Chamber. BWT931 (Wheat cv. Yecora Rojo)

    NASA Technical Reports Server (NTRS)

    Stutte, G. W.; Mackowiak, C. L.; Markwell, G. A.; Wheeler, R. M.; Sager, J. C.

    1993-01-01

    This KSC database is being made available to the scientific research community to facilitate the development of crop development models, to test monitoring and control strategies, and to identify environmental limitations in crop production systems. The KSC validated dataset consists of 17 parameters necessary to maintain bioregenerative life support functions: water purification, CO2 removal, O2 production, and biomass production. The data are available on disk as either a DATABASE SUBSET (one week of 5-minute data) or DATABASE SUMMARY (daily averages of parameters). Online access to the VALIDATED DATABASE will be made available to institutions with specific programmatic requirements. Availability and access to the KSC validated database are subject to approval and limitations implicit in KSC computer security policies.
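    The DATABASE SUMMARY described above is simply a daily-average rollup of the 5-minute records. A minimal stdlib sketch of that rollup, using hypothetical timestamps and a single hypothetical reading column (not the actual KSC schema or its 17 parameters):

```python
from datetime import datetime, timedelta
from itertools import groupby
from statistics import mean

# Hypothetical 5-minute readings spanning two days (288 records/day)
start = datetime(1993, 1, 1)
records = [(start + timedelta(minutes=5 * i), 1000 + (i % 288)) for i in range(576)]

# Collapse time-ordered 5-minute records into daily averages,
# as the DATABASE SUMMARY does for each monitored parameter
daily = {day: mean(v for _, v in group)
         for day, group in groupby(records, key=lambda r: r[0].date())}

for day, avg in sorted(daily.items()):
    print(day, round(avg, 1))
```

Note that `itertools.groupby` requires the records to already be in time order, which holds for a logging stream like this one.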

  7. Best Practices in Stability Indicating Method Development and Validation for Non-clinical Dose Formulations.

    PubMed

    Henry, Teresa R; Penn, Lara D; Conerty, Jason R; Wright, Francesca E; Gorman, Gregory; Pack, Brian W

    2016-11-01

    Non-clinical dose formulations (also known as pre-clinical or GLP formulations) play a key role in early drug development. These formulations are used to introduce active pharmaceutical ingredients (APIs) into test organisms for both pharmacokinetic and toxicological studies. Since these studies are ultimately used to support dose and safety ranges in human studies, it is important not only to understand the concentration and PK/PD of the active ingredient but also to generate safety data for likely process impurities and degradation products of the active ingredient. As such, many in the industry have chosen to develop and validate methods which can accurately detect and quantify the active ingredient along with impurities and degradation products. Such methods often provide trendable results which are predictive of stability, thus leading to the name: stability-indicating methods. This document provides an overview of best practices for those choosing to include development and validation of such methods as part of their non-clinical drug development program. It is intended to support teams who are either new to stability-indicating method development and validation or who are less familiar with the requirements of validation due to their position within the product development life cycle.

  8. DEVELOPMENT AND VALIDATION OF A METHOD FOR MEASURING EXEMPT VOLATILE ORGANIC COMPOUNDS AND CARBON DIOXIDE IN CONSUMER PRODUCTS

    EPA Science Inventory

    The report describes the development and validation of a method for measuring exempt volatile organic compounds (VOCs) and carbon dioxide in consumer products. (NOTE: Ground-level ozone can cause a variety of adverse health effects as well as agricultural and ecological damage. C...

  9. 76 FR 58539 - Notice Pursuant to The National Cooperative Research and Production Act of 1993-Cooperative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ... DEPARTMENT OF JUSTICE Antitrust Division Notice Pursuant to The National Cooperative Research and Production Act of 1993--Cooperative Research Group on Development and Validation of FlawPRO for Assessing... Development and Validation of FlawPRO for Assessing Defect Tolerance of Welded Pipes Under Generalized High...

  10. 77 FR 73676 - Notice Pursuant to the National Cooperative Research and Production Act of 1993; Cooperative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-11

    ... DEPARTMENT OF JUSTICE Antitrust Division Notice Pursuant to the National Cooperative Research and Production Act of 1993; Cooperative Research Group on Development and Validation of FlawPRO for Assessing... Development and Validation of FlawPRO for Assessing Defect Tolerance of Welded Pipes Under Generalized High...

  11. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  12. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; the process capability and quality dashboard (PCQd); and an enhanced control strategy. The US FDA guidance on Process Validation: General Principles and Practices (January 2011) encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. The elements of the 3A assessment were developed to address industry and regulatory guidance requirements, and the conclusions provide sufficient information to make a scientific, risk-based decision on product robustness.
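    The process capability element mentioned in this record is commonly summarized with the Cpk index: the distance from the process mean to the nearer specification limit, in units of three standard deviations. A minimal sketch with hypothetical specification limits and batch assay data (not taken from the paper):

```python
import statistics

def cpk(values, lsl, usl):
    """Process capability index: min distance from the mean to a
    specification limit, divided by 3 sample standard deviations."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical assay results (% label claim) for recent batches,
# against hypothetical 95-105% specification limits
assays = [99.1, 100.4, 99.8, 100.1, 99.5, 100.0, 99.7, 100.2]
print(round(cpk(assays, lsl=95.0, usl=105.0), 2))  # prints 3.87
```

A Cpk comfortably above the conventional 1.33 threshold, as here, is the kind of evidence a Stage 3A dashboard would surface as support for process robustness.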

  13. Report on Development and Validation of Utilization Materials to Accompany Two Series of U.S. Office of Education Alcohol Education Films.

    ERIC Educational Resources Information Center

    Finn, Peter

    This report records the development and validation by Abt Associates, Inc. of utilization materials developed to accompany the two U.S. Office of Education film series, Jackson Junior High and Dial A-L-C-O-H-O-L. The first section describes the process by which the nine project products were developed. These products include the following: (1) a…

  14. Validation of a 30-year-old process for the manufacture of L-asparaginase from Erwinia chrysanthemi.

    PubMed

    Gervais, David; Allison, Nigel; Jennings, Alan; Jones, Shane; Marks, Trevor

    2013-04-01

    A 30-year-old manufacturing process for the biologic product L-asparaginase from the plant pathogen Erwinia chrysanthemi was rigorously qualified and validated, with a high level of agreement between validation data and the 6-year process database. L-Asparaginase exists in its native state as a tetrameric protein and is used as a chemotherapeutic agent in the treatment regimen for Acute Lymphoblastic Leukaemia (ALL). The manufacturing process involves fermentation of the production organism, extraction and purification of the L-asparaginase to make drug substance (DS), and finally formulation and lyophilisation to generate drug product (DP). The extensive manufacturing experience with the product was used to establish ranges for all process parameters and product quality attributes. The product and in-process intermediates were rigorously characterised, and new assays, such as size-exclusion and reversed-phase UPLC, were developed, validated, and used to analyse several pre-validation batches. Finally, three prospective process validation batches were manufactured and product quality data generated using both the existing and the new analytical methods. These data demonstrated the process to be robust, highly reproducible and consistent, and the validation was successful, contributing to the granting of an FDA product license in November, 2011.

  15. Operational space weather product development and validation at the joint SMC-AFRL Rapid Prototyping Center

    NASA Astrophysics Data System (ADS)

    Quigley, S.

    The Air Force Research Laboratory (AFRL/VSB) and Detachment 11, Space & Missile Systems Center (SMC, Det 11/CIT) have combined efforts to design, develop, test, and implement graphical products for the Air Force's space weather operations center. These products are generated to analyze, specify, and forecast the effects of the near-earth space environment on Department of Defense systems and communications. Jointly developed products that have been, or will soon be, added to real-time operations include: 1) the Operational Space Environment Network Display (OpSEND) suite, a set of four products that address HF communication, UHF satellite communication scintillation, radar auroral clutter, and GPS single-frequency errors; 2) a solar radio background and burst effects (SoRBE) product suite; and 3) a meteor effects (ME) product suite. The RPC is also involved in a rather substantial "V&V" effort to produce multiple operational product verifications and validations, with an added end goal of a generalized validation software package. The presentation will provide a general overview of the RPC and each of the products mentioned above, including background science, operational history, inputs, outputs, dissemination, and customer uses for each.

  16. Providing a Science Base for the Evaluation of Tobacco Products

    PubMed Central

    Berman, Micah L.; Connolly, Greg; Cummings, K. Michael; Djordjevic, Mirjana V.; Hatsukami, Dorothy K.; Henningfield, Jack E.; Myers, Matthew; O'Connor, Richard J.; Parascandola, Mark; Rees, Vaughan; Rice, Jerry M.

    2015-01-01

    Objective Evidence-based tobacco regulation requires a comprehensive scientific framework to guide the evaluation of new tobacco products and health-related claims made by product manufacturers. Methods The Tobacco Product Assessment Consortium (TobPRAC) employed an iterative process involving consortia investigators, consultants, a workshop of independent scientists and public health experts, and written reviews in order to develop a conceptual framework for evaluating tobacco products. Results The consortium developed a four-phased framework for the scientific evaluation of tobacco products. The four phases addressed by the framework are: (1) pre-market evaluation, (2) pre-claims evaluation, (3) post-market activities, and (4) monitoring and re-evaluation. For each phase, the framework proposes the use of validated testing procedures that will evaluate potential harms at both the individual and population level. Conclusions While the validation of methods for evaluating tobacco products is an ongoing and necessary process, the proposed framework need not wait for fully validated methods to be used in guiding tobacco product regulation today. PMID:26665160

  17. Validation of the second version of the LittlEARS® Early Speech Production Questionnaire (LEESPQ) in German-speaking children with normal hearing.

    PubMed

    Keilmann, Annerose; Friese, Barbara; Lässig, Anne; Hoffmann, Vanessa

    2018-04-01

    The introduction of neonatal hearing screening and the increasingly early age at which children can receive a cochlear implant have intensified the need for a validated questionnaire to assess the speech production of children aged 0‒18 months. Such a questionnaire has been created: the LittlEARS® Early Speech Production Questionnaire (LEESPQ). This study aimed to validate a second, revised edition of the LEESPQ. Questionnaires were returned for 362 children with normal hearing. Completed questionnaires were analysed to determine whether the LEESPQ is reliable, prognostically accurate, and internally consistent, and whether gender or multilingualism affects total scores. Total scores correlated positively with age. The LEESPQ proved reliable, accurate, and consistent, and independent of gender or lingual status. A norm curve was created. This second version of the LEESPQ is thus a valid instrument for assessing the early speech production development of children with normal hearing aged 0‒18 months, regardless of gender, and may be a useful tool for monitoring the development of paediatric hearing device users.

  18. Designing Interactive Electronic Module in Chemistry Lessons

    NASA Astrophysics Data System (ADS)

    Irwansyah, F. S.; Lubab, I.; Farida, I.; Ramdhani, M. A.

    2017-09-01

    This research aims to design an electronic module (e-module) oriented to the development of students’ chemical literacy on the solution colligative properties material. The research undergoes several stages, including concept analysis, discourse analysis, storyboard design, design development, product packaging, validation, and feasibility testing. Overall, it follows three main stages: Define (a preliminary study), Design (designing the e-module), and Develop (validation and model trial). The concept presentation and visualization used in this e-module are oriented to chemical literacy skills; the presentation order covers the aspects of scientific context, process, content, and attitude. Chemists and multimedia experts validated the initial quality of the product and gave feedback for its improvement. The feasibility test results stated that the content presentation and display are valid and feasible to use, with values of 85.77% and 87.94%, respectively. These values indicate that this e-module, oriented to students’ chemical literacy skills for the solution colligative properties material, is feasible to use.

  19. Development and Validation of New Discriminative Dissolution Method for Carvedilol Tablets

    PubMed Central

    Raju, V.; Murthy, K. V. R.

    2011-01-01

    The objective of the present study was to develop and validate a discriminative dissolution method for the evaluation of carvedilol tablets. Different conditions, such as the type of dissolution medium, the volume of dissolution medium, and the rotation speed of the paddle, were evaluated. The best in vitro dissolution profile was obtained using Apparatus II (paddle) at 50 rpm with 900 ml of pH 6.8 phosphate buffer as the dissolution medium. Drug release was evaluated by a high-performance liquid chromatographic method. The dissolution method was validated according to current ICH and FDA guidelines; parameters such as specificity, accuracy, precision, and stability were evaluated, and the results obtained were within the acceptable range. The dissolution profiles obtained for three different products were compared using ANOVA-based, model-dependent, and model-independent methods; the results showed significant differences between the products. The dissolution test developed and validated was adequate for its high discriminative capacity in differentiating the release characteristics of the products tested and could be applied to the development and quality control of carvedilol tablets. PMID:22923865
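    A common model-independent metric of the kind this record mentions is the f2 similarity factor used in FDA and EMA dissolution guidance. A minimal sketch with hypothetical dissolution values (not this study's data):

```python
import math

def f2_similarity(reference, test):
    """f2 similarity factor for two dissolution profiles sampled at
    the same time points (values in percent dissolved)."""
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * math.log10(100 / math.sqrt(1 + msd))

# Hypothetical % dissolved at 10, 20, 30, 45 min for two products
ref_profile  = [35, 58, 79, 92]
test_profile = [30, 52, 74, 90]
print(round(f2_similarity(ref_profile, test_profile), 1))  # prints 65.7
```

By convention, f2 of 50 or above indicates similar profiles (identical profiles give exactly 100), so a discriminative method should drive f2 well below 50 for products with genuinely different release characteristics.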

  20. Development of Level 2 Calibration and Validation Plans for GOES-R; What is a RIMP?

    NASA Technical Reports Server (NTRS)

    Kopp, Thomas J.; Belsma, Leslie O.; Mollner, Andrew K.; Sun, Ziping; Deluccia, Frank

    2017-01-01

    Calibration and Validation (CalVal) plans for Geostationary Operational Environmental Satellite version R (GOES-R) Level 2 (L2) products were documented via Resource, Implementation, and Management Plans (RIMPs) for all of the official L2 products required from the GOES-R Advanced Baseline Imager (ABI). In 2015 the GOES-R program decided to replace the typical CalVal plans with RIMPs that covered, for a given L2 product, what was required from that product, how it would be validated, and what tools would be used to do so. Similar to Level 1b products, the intent was to cover the full spectrum of planning required for the CalVal of L2 ABI products. Instead of focusing on step-by-step procedures, the RIMPs concentrated on the criteria for each stage of the validation process (Beta, Provisional, and Full Validation) and the many elements required to prove when each stage was reached.

  1. EOS-Aura's Ozone Monitoring Instrument (OMI): Validation Requirements

    NASA Technical Reports Server (NTRS)

    Brinksma, E. J.; McPeters, R.; deHaan, J. F.; Levelt, P. F.; Hilsenrath, E.; Bhartia, P. K.

    2003-01-01

    OMI is an advanced hyperspectral instrument that measures backscattered radiation in the UV and visible. It will be flown as part of the EOS Aura mission and provide data on atmospheric chemistry that is highly synergistic with the other Aura instruments HIRDLS, MLS, and TES. OMI is designed to measure total ozone, aerosols, cloud information, and UV irradiances, continuing the TOMS series of global mapped products but with higher spatial resolution. In addition, its hyperspectral capability enables measurements of trace gases such as SO2, NO2, HCHO, BrO, and OClO. A plan for validation of the various OMI products is now being formulated. Validation of the total column and UVB products will rely heavily on existing networks of instruments, like NDSC. NASA and its European partners are planning aircraft missions for the validation of Aura instruments. New instruments and techniques (DOAS systems, for example) will need to be developed, both ground- and aircraft-based. Lidar systems are needed for validation of the vertical distributions of ozone, aerosols, NO2, and possibly SO2. The validation emphasis will be on the retrieval of these products under polluted conditions. This is challenging because the retrievals often depend on the tropospheric profiles of the product in question, and because of large spatial variations in the troposphere. Most existing ground stations are located in, and equipped for, pristine environments. This is also true for almost all NDSC stations. OMI validation will need ground-based sites in polluted environments and specially developed instruments, complementing the existing instrumentation.

  2. Quality of life in postmenopausal women: translation and validation of MSkinQOL questionnaire to measure the effect of a skincare product in USA.

    PubMed

    Segot-Chicq, Evelyne; Fanchon, Chantal

    2013-12-01

    The 28-item Menopausal Skin Quality Of Life (MSkinQOL) questionnaire, a previously validated French instrument developed to assess psychological features of menopausal women and to measure the benefits of using cosmetic skincare products, was translated and validated to assess a skincare product in the USA. Construct validity, reliability, reproducibility, and responsiveness were assessed with two groups of 100 nonmenopausal (NM) and 100 postmenopausal (PM) women. The group of PM women applied a specially developed skincare product twice daily for 1 month and filled in the same questionnaire after 1 month, as well as a general self-assessment questionnaire about the efficacy and cosmetic properties of the product. No ceiling or floor effects were identified. Construct and internal validity were assessed using a multitrait analysis: questionnaire items proved closely correlated, and each dimension covers a different aspect of the women's response profiles. The three dimensions showed good reliability and stability. Baseline values for social effects of skin appearance, health status, and self-esteem were significantly different between PM and NM volunteers. Values of these three dimensions were significantly improved after 2 weeks of product application, and further improved after 4 weeks. This study shows that a careful translation and a rigorous process of validation lead to a reliable tool, adapted to each country, to explore and measure quality of life in healthy PM women. © 2013 Wiley Periodicals, Inc.

  3. Strategic Defense Initiative Demonstration/Validation Program Environmental Assessment. Exoatmospheric Reentry Vehicle Interception System (ERIS),

    DTIC Science & Technology

    1987-08-01

    proceed to Demonstration/Validation for ERIS would not preclude other technologies, nor would it mandate the eventual Full-Scale Development or Production ... Full-Scale Development, and Production/Deployment. These four stages are separated by three major decision points (Milestones I, II, and III). Prior... percent facility population increase would require increased power plant generating capacity. One concern is the nitrogen oxide emissions which is

  4. Parent Reports of Young Spanish-English Bilingual Children's Productive Vocabulary: A Development and Validation Study.

    PubMed

    Mancilla-Martinez, Jeannette; Gámez, Perla B; Vagh, Shaher Banu; Lesaux, Nonie K

    2016-01-01

    This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension (Fenson et al., 2000, 2007; Jackson-Maldonado, Marchman, & Fernald, 2013) and the Spanish Vocabulary Extension for use with parents from low-income homes and their 24- to 48-month-old Spanish-English bilingual children. Study participants were drawn from Early Head Start and Head Start collaborative programs in the Northeastern United States in which English was the primary language used in the classroom. All families reported Spanish or Spanish-English as their home language(s). The MacArthur Communicative Development Inventories as well as the researcher-designed Spanish Vocabulary Extension were used as measures of children's English and Spanish productive vocabularies. Findings revealed the forms' concurrent and discriminant validity, on the basis of standardized measures of vocabulary, as measures of productive vocabulary for this growing bilingual population. These findings suggest that parent reports, including our researcher-designed form, represent a valid, cost-effective mechanism for vocabulary monitoring purposes in early childhood education settings.

  5. Fostering creativity in product and service development: validation in the domain of information technology.

    PubMed

    Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel

    2011-06-01

    This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.

  6. Development of Level 1b Calibration and Validation Readiness, Implementation and Management Plans for GOES-R

    NASA Technical Reports Server (NTRS)

    Kunkee, David B.; Farley, Robert W.; Kwan, Betty P.; Hecht, James H.; Walterscheid, Richard L.; Claudepierre, Seth G.; Bishop, Rebecca L.; Gelinas, Lynette J.; Deluccia, Frank J.

    2017-01-01

    A complement of Readiness, Implementation and Management Plans (RIMPs) to facilitate management of post-launch product test activities for the official Geostationary Operational Environmental Satellite (GOES-R) Level 1b (L1b) products have been developed and documented. Separate plans have been created for each of the GOES-R sensors including: the Advanced Baseline Imager (ABI), the Extreme ultraviolet and X-ray Irradiance Sensors (EXIS), Geostationary Lightning Mapper (GLM), GOES-R Magnetometer (MAG), the Space Environment In-Situ Suite (SEISS), and the Solar Ultraviolet Imager (SUVI). The GOES-R program has implemented these RIMPs in order to address the full scope of CalVal activities required for a successful demonstration of GOES-R L1b data product quality throughout the three validation stages: Beta, Provisional and Full Validation. For each product maturity level, the RIMPs include specific performance criteria and required artifacts that provide evidence a given validation stage has been reached, the timing when each stage will be complete, a description of every applicable Post-Launch Product Test (PLPT), roles and responsibilities of personnel, upstream dependencies, and analysis methods and tools to be employed during validation. Instrument level Post-Launch Tests (PLTs) are also referenced and apply primarily to functional check-out of the instruments.

  7. The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1996-01-01

    Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.

  8. Integration of design and inspection

    NASA Astrophysics Data System (ADS)

    Simmonds, William H.

    1990-08-01

    Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.

  9. Quantitative and Systems Pharmacology. 1. In Silico Prediction of Drug-Target Interactions of Natural Products Enables New Targeted Cancer Therapy.

    PubMed

    Fang, Jiansong; Wu, Zengrui; Cai, Chuipu; Wang, Qi; Tang, Yun; Cheng, Feixiong

    2017-11-27

    Natural products with diverse chemical scaffolds have been recognized as an invaluable source of compounds in drug discovery and development. However, systematic identification of drug targets for natural products at the human proteome level via various experimental assays is highly expensive and time-consuming. In this study, we proposed a systems pharmacology infrastructure to predict new drug targets and anticancer indications of natural products. Specifically, we reconstructed a global drug-target network with 7,314 interactions connecting 751 targets and 2,388 natural products and built predictive network models via a balanced substructure-drug-target network-based inference approach. A high area under the receiver operating characteristic curve of 0.96 was achieved for predicting new targets of natural products during cross-validation. The newly predicted targets of natural products (e.g., resveratrol, genistein, and kaempferol) with high scores were validated by various literature studies. We further built the statistical network models for identification of new anticancer indications of natural products through integration of both experimentally validated and computationally predicted drug-target interactions of natural products with known cancer proteins. We showed that the significantly predicted anticancer indications of multiple natural products (e.g., naringenin, disulfiram, and metformin) with new mechanisms of action were validated by various published experimental evidence. In summary, this study offers powerful computational systems pharmacology approaches and tools for the development of novel targeted cancer therapies by exploiting the polypharmacology of natural products.
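    The network-based inference at the core of such an approach can be sketched as a simple guilt-by-association scorer. In the sketch below, the drug names appear in the abstract, but the target sets, the Jaccard similarity, and the resulting scores are hypothetical illustrations, not the authors' balanced substructure-drug-target method:

```python
# Toy drug-target network: each drug maps to its known protein targets.
# Target assignments here are invented for illustration.
known_targets = {
    "resveratrol": {"PTGS1", "PTGS2"},
    "genistein":   {"PTGS2", "ESR1"},
    "kaempferol":  {"ESR1"},
}

def jaccard(a, b):
    """Similarity between two drugs based on overlap of their known targets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def score(drug, target):
    """Score a candidate drug-target pair by summing similarities to drugs
    already known to hit that target (guilt-by-association inference)."""
    return sum(
        jaccard(known_targets[drug], tgts)
        for other, tgts in known_targets.items()
        if other != drug and target in tgts
    )

# Rank candidate new targets for kaempferol by inferred score.
candidates = {"PTGS1", "PTGS2", "ESR1"} - known_targets["kaempferol"]
ranked = sorted(candidates, key=lambda t: score("kaempferol", t), reverse=True)
print(ranked)  # highest-scoring candidate first
```

In the real study the similarity term is derived from chemical substructures rather than shared targets, and predictions are evaluated by cross-validation; this sketch only conveys the network-propagation idea.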

  10. Developing Online Course Portal to Improve Teachers’ Competency in Creating Action Research (CAR) Proposal Using Learning Management System (LMS) Moodle

    NASA Astrophysics Data System (ADS)

    Muhtar, A. A.

    2017-02-01

    Online courses can offer a flexible and easy way to improve teachers' competency in conducting educational research, especially classroom action research (CAR). Teachers can attend the course without being physically present in class. This research aims to (1) develop an online course portal to improve teachers' competency in creating CAR proposals, and (2) produce a proper online course portal, validated and evaluated on four aspects: learning process, content, graphical user interface, and programming. The online course in this research was developed using the Learning Management System (LMS) Moodle. The research follows a modified Borg & Gall Research and Development (R&D) model, proceeding from preliminary studies through product design and product creation to evaluation. The product was validated by three experts from three universities. The research subjects for the field test were seven teachers from different schools in several provinces in Indonesia. Based on the expert validation and field test results, the product developed in this research was categorized as "very good" in all aspects, and it is suitable for teachers seeking to improve their competency in creating CAR proposals. The online course portal produced in this research can be used as a proper model for online learning in creating CAR proposals.

  11. Risk management in technovigilance: construction and validation of a medical-hospital product evaluation instrument.

    PubMed

    Kuwabara, Cleuza Catsue Takeda; Evora, Yolanda Dora Martinez; de Oliveira, Márcio Mattos Borges

    2010-01-01

    With the continuous incorporation of health technologies, hospital risk management should be implemented to systematize the monitoring of adverse effects, performing actions to control and eliminate their damage. As part of these actions, Technovigilance is active in the procedures of acquisition, use, and quality control of health products and equipment. This study aimed to construct and validate an instrument to evaluate medical-hospital products. This is a quantitative, exploratory, longitudinal and methodological development study, based on the Six Sigma quality management model, which is principally based on the component stages of the DMAIC cycle. For data collection and content validation, the Delphi technique was used with professionals from the Brazilian Sentinel Hospital Network. It was concluded that the instrument developed permitted the evaluation of the product, differentiating between the results of the tested brands, in line with the initial study goal of qualifying the evaluations performed.

  12. Vacuum decay container closure integrity leak test method development and validation for a lyophilized product-package system.

    PubMed

    Patel, Jayshree; Mulhall, Brian; Wolf, Heinz; Klohr, Steven; Guazzo, Dana Morton

    2011-01-01

    A leak test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method was developed and validated for container-closure integrity verification of a lyophilized product in a parenteral vial package system. This nondestructive leak test method is intended for use in manufacturing as an in-process package integrity check, and for testing product stored on stability in lieu of sterility tests. Method development and optimization challenge studies incorporated artificially defective packages representing a range of glass vial wall and sealing surface defects, as well as various elastomeric stopper defects. Method validation required 3 days of random-order replicate testing of a test sample population of negative-control, no-defect packages and positive-control, with-defect packages. Positive-control packages were prepared using vials each with a single hole laser-drilled through the glass vial wall. Hole creation and hole size certification were performed by Lenox Laser. Validation study results successfully demonstrated the vacuum decay leak test method's ability to accurately and reliably detect those packages with laser-drilled holes greater than or equal to approximately 5 μm in nominal diameter. Total test time is less than 1 min per package. All development and validation studies were performed at Whitehouse Analytical Laboratories in Whitehouse, NJ, under the direction of consultant Dana Guazzo of RxPax, LLC, using a VeriPac 455 Micro Leak Test System by Packaging Technologies & Inspection (Tuckahoe, NY). Bristol Myers Squibb (New Brunswick, NJ) fully subsidized all work.

  13. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    PubMed

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association
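    The linearity parameter of such a validation can be illustrated with an ordinary least-squares fit of response versus concentration. The concentrations, absorbance readings, and the R² ≥ 0.99 acceptance criterion below are assumptions for illustration, not values from the study:

```python
# Hypothetical calibration standards as % of a 40-mg label claim,
# with simulated fiber-optic absorbance readings.
conc = [8, 16, 24, 32, 40, 48]
resp = [0.081, 0.159, 0.242, 0.318, 0.401, 0.478]

n = len(conc)
mx, my = sum(conc) / n, sum(resp) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
slope = sxy / sxx
intercept = my - slope * mx

# Coefficient of determination for the fitted calibration line.
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, resp))
ss_tot = sum((y - my) ** 2 for y in resp)
r2 = 1 - ss_res / ss_tot

print(f"slope={slope:.5f}, intercept={intercept:.4f}, R^2={r2:.4f}")
assert r2 >= 0.99, "linearity acceptance criterion not met"
```

Precision, accuracy, specificity, and robustness would each get analogous numerical acceptance criteria in a full validation protocol.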

  14. CEOS WGCV Land Product Validation (LPV) Sub-Group: Current and Potential Roles in Future Decadal Survey Missions

    NASA Technical Reports Server (NTRS)

    Roman, Miguel O.; Nightingale, Joanne; Nickeson, Jaime; Schaepman-Strub, Gabriela

    2011-01-01

    The goals and objectives of the sub-group are: to foster and coordinate quantitative validation of higher-level global land products derived from remotely sensed data, in a traceable way, and to relay results so they are relevant to users; to increase the quality and efficiency of global satellite product validation by developing and promoting international standards and protocols for: (1) field sampling, (2) scaling techniques, (3) accuracy reporting, (4) data/information exchange; and to provide feedback to international structures (GEOSS) for: (1) requirements on product accuracy and quality assurance (QA4EO), (2) terrestrial ECV measurement standards, (3) definitions for future missions.

  15. LAnd surface remote sensing Products VAlidation System (LAPVAS) and its preliminary application

    NASA Astrophysics Data System (ADS)

    Lin, Xingwen; Wen, Jianguang; Tang, Yong; Ma, Mingguo; Dou, Baocheng; Wu, Xiaodan; Meng, Lumin

    2014-11-01

    The long-term record of remote sensing products captures land surface parameters and their spatial and temporal changes, widely supporting regional and global scientific research. Remote sensing products generated by different sensors and different algorithms must be validated to ensure high product quality. Investigation of remote sensing product validation shows that it is a complex process, in terms of both the quality requirements on in-situ data and the methods of precision assessment. A comprehensive validation is needed, covering long time series and multiple land surface types. A system named the LAnd surface remote sensing Products VAlidation System (LAPVAS) is therefore designed in this paper to assess the uncertainty of remote sensing products based on large amounts of in-situ data and validation techniques. The designed validation system platform consists of three parts: a validation database, a precision-analysis subsystem, and the system's internal and external interfaces. These parts are built from essential service modules, such as Data-Read, Data-Insert, Data-Associate, Precision-Analysis, and Scale-Change service modules. To run the validation platform, users select these service modules and choreograph them interactively to complete validation tasks for remote sensing products (such as LAI, albedo, and VI). A service-oriented architecture (SOA) serves as the framework of the system; its benefit is that service modules remain independent of any development environment through standards such as the Web Service Description Language (WSDL). C++ and Java are used as the primary programming languages to create the service modules. One key land surface parameter, albedo, is selected as an example application of the system, illustrating that LAPVAS performs land surface remote sensing product validation well.
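    A precision-analysis step of the kind such a system runs might, at its simplest, compute bias and RMSE between matched satellite and in-situ albedo values. This sketch uses synthetic numbers (the system described is built from C++/Java services; Python is used here only for brevity):

```python
# Synthetic matchup pairs: satellite-retrieved albedo vs. ground measurements.
satellite = [0.18, 0.21, 0.15, 0.30, 0.24]
in_situ   = [0.17, 0.22, 0.16, 0.27, 0.25]

n = len(satellite)
# Mean bias: positive means the retrieval overestimates the ground value.
bias = sum(s - g for s, g in zip(satellite, in_situ)) / n
# RMSE: typical magnitude of the retrieval error.
rmse = (sum((s - g) ** 2 for s, g in zip(satellite, in_situ)) / n) ** 0.5
print(f"bias={bias:+.4f}, RMSE={rmse:.4f}")
```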

  16. Specifications and programs for computer software validation

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Kleir, R.; Davis, T.; Henneman, M.; Haller, A.; Lasseter, G. L.

    1973-01-01

    Three software products developed during the study are reported and include: (1) FORTRAN Automatic Code Evaluation System, (2) the Specification Language System, and (3) the Array Index Validation System.

  17. The validation by measurement theory of proposed object-oriented software metrics

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1994-01-01

    Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics (Li and Henry, 1993; Chidamber and Kemerer, 1994; Lorenz and Kidd, 1994).

  18. Guidelines To Validate Control of Cross-Contamination during Washing of Fresh-Cut Leafy Vegetables.

    PubMed

    Gombas, D; Luo, Y; Brennan, J; Shergill, G; Petran, R; Walsh, R; Hau, H; Khurana, K; Zomorodi, B; Rosen, J; Varley, R; Deng, K

    2017-02-01

    The U.S. Food and Drug Administration requires food processors to implement and validate processes that will result in significantly minimizing or preventing the occurrence of hazards that are reasonably foreseeable in food production. During production of fresh-cut leafy vegetables, microbial contamination that may be present on the product can spread throughout the production batch when the product is washed, thus increasing the risk of illnesses. The use of antimicrobials in the wash water is a critical step in preventing such water-mediated cross-contamination; however, many factors can affect antimicrobial efficacy in the production of fresh-cut leafy vegetables, and the procedures for validating this key preventive control have not been articulated. Producers may consider three options for validating antimicrobial washing as a preventive control for cross-contamination. Option 1 involves the use of a surrogate for the microbial hazard and the demonstration that cross-contamination is prevented by the antimicrobial wash. Option 2 involves the use of antimicrobial sensors and the demonstration that a critical antimicrobial level is maintained during worst-case operating conditions. Option 3 validates the placement of the sensors in the processing equipment with the demonstration that a critical antimicrobial level is maintained at all locations, regardless of operating conditions. These validation options developed for fresh-cut leafy vegetables may serve as examples for validating processes that prevent cross-contamination during washing of other fresh produce commodities.

  19. Global Precipitation Measurement (GPM) Ground Validation (GV) Science Implementation Plan

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Hou, Arthur Y.

    2008-01-01

    For pre-launch algorithm development and post-launch product evaluation, Global Precipitation Measurement (GPM) Ground Validation (GV) goes beyond direct comparisons of surface rain rates between ground and satellite measurements to provide the means for improving retrieval algorithms and model applications. Three approaches to GPM GV include direct statistical validation (at the surface), precipitation physics validation (in a vertical column), and integrated science validation (4-dimensional). These three approaches support five themes: core satellite error characterization; constellation satellite validation; development of physical models of snow, cloud water, and mixed phase; development of cloud-resolving models (CRMs) and land-surface models to bridge observations and algorithms; and development of coupled CRM-land surface modeling for basin-scale water budget studies and natural hazard prediction. This presentation describes the implementation of these approaches.

  20. Climatological Processing and Product Development for the TRMM Ground Validation Program

    NASA Technical Reports Server (NTRS)

    Marks, D. A.; Kulie, M. S.; Robinson, M.; Silberstein, D. S.; Wolff, D. B.; Ferrier, B. S.; Amitai, E.; Fisher, B.; Wang, J.; Augustine, D.; hide

    2000-01-01

    The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. The main purpose of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented. The primary goal of TRMM GV is to provide basic validation of satellite-derived precipitation measurements over monthly climatologies for the following primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, research analysts at NASA Goddard Space Flight Center (GSFC) generate standardized rainfall products using quality-controlled ground-based radar data from the four primary GV sites. This presentation will provide an overview of TRMM GV climatological processing and product generation. A description of the data flow between the primary GV sites, NASA GSFC, and the TRMM Science and Data Information System (TSDIS) will be presented. The radar quality control algorithm, which features eight adjustable height and reflectivity parameters, and its effect on monthly rainfall maps will be described. The methodology used to create monthly, gauge-adjusted rainfall products for each primary site will also be summarized. The standardized monthly rainfall products are developed in discrete, modular steps with distinct intermediate products. A summary of recently reprocessed official GV rainfall products available for TRMM science users will be presented. Updated basic standardized product results involving monthly accumulation, Z-R relationships, and gauge statistics for each primary GV site will also be displayed.
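    The Z-R relationship mentioned above links radar reflectivity Z to rain rate R through a power law, Z = aR^b. The sketch below inverts it using the common a = 300, b = 1.4 coefficients as illustrative defaults; GV sites tune their own coefficients per climate regime:

```python
def rain_rate(dbz, a=300.0, b=1.4):
    """Convert radar reflectivity in dBZ to rain rate in mm/h by
    inverting Z = a * R**b, where Z = 10**(dBZ/10) in mm^6 m^-3."""
    z = 10 ** (dbz / 10.0)
    return (z / a) ** (1.0 / b)

# Rain rate grows rapidly with reflectivity under a power-law Z-R.
for dbz in (20, 30, 40, 50):
    print(f"{dbz} dBZ -> {rain_rate(dbz):.2f} mm/h")
```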

  1. Malaysian consumers’ awareness, perception, and attitude toward cosmetic products: Questionnaire development and pilot testing

    PubMed Central

    Ayob, Ain; Awadh, Ammar Ihsan; Hadi, Hazrina; Jaffri, Juliana; Jamshed, Shazia; Ahmad, Hawa Mas Azmar

    2016-01-01

    Background: Increased usage of cosmetic products has caused a growing concern about the safety of these products, and yet little is known about cosmetics from the consumers’ perspective. Hence, this study's aim is to develop a valid and reliable tool for assessing consumers’ awareness, perceptions, and attitudes toward cosmetic products. Materials and Methods: A questionnaire was developed in the English language based on information collected from a literature search, in-depth interviews conducted with consumers prior to this study and consultations with experts. Subsequently, the questionnaire was subjected to translation, validation, and test-retest reliability. A final version of the questionnaire was piloted among 66 consumers via convenient sampling. A descriptive analysis was performed, and the internal consistency and the differences between variables in the questionnaire were analyzed. Results: The developed and translated questionnaire produced repeatable data for each of the domains (Spearman's correlation ≥ 0.7, P < 0.001). The internal consistency for awareness, perceptions and attitudes indicates good internal consistency (Cronbach's alpha value of more than 0.7 for each domain). Significant differences were found between the perception scores for the race, religion, and monthly expenses for cosmetic products, respectively, and the same pattern was found for the attitude scores, but monthly expenses for cosmetic products was replaced by monthly income. Conclusion: The results achieved via the Bahasa Malaysia questionnaire indicated that the developed and translated questionnaire can be used as a valid and reliable tool for assessing consumers’ awareness, perceptions, and attitudes toward cosmetic products in Malaysia in future studies. PMID:27413348
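    The internal-consistency criterion reported above (Cronbach's alpha above 0.7 for each domain) can be computed directly from item scores. The respondent data below are synthetic, not from the study:

```python
# Synthetic Likert responses: rows = respondents, columns = items in one domain.
items = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

k = len(items[0])  # number of items in the domain
item_vars = [variance([row[j] for row in items]) for j in range(k)]
total_var = variance([sum(row) for row in items])

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / total variance)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```

A value above 0.7, as in the study's domains, is the conventional threshold for acceptable internal consistency.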

  2. Using Android-Based Educational Game for Learning Colloid Material

    NASA Astrophysics Data System (ADS)

    Sari, S.; Anjani, R.; Farida, I.; Ramdhani, M. A.

    2017-09-01

    This research is based on the importance of the development of student’s chemical literacy on Colloid material using Android-based educational game media. Educational game products are developed through research and development design. In the analysis phase, material analysis is performed to generate concept maps, determine chemical literacy indicators, game strategies and set game paths. In the design phase, product packaging is carried out, then validation and feasibility test are performed. Research produces educational game based on Android that has the characteristics that is: Colloid material presented in 12 levels of game in the form of questions and challenges, presents visualization of discourse, images and animation contextually to develop the process of thinking and attitude. Based on the analysis of validation and trial results, the product is considered feasible to use.

  3. Validation of column-based chromatography processes for the purification of proteins. Technical report No. 14.

    PubMed

    2008-01-01

    PDA Technical Report No. 14 has been written to provide current best practices, such as application of risk-based decision making, based in sound science to provide a foundation for the validation of column-based chromatography processes and to expand upon information provided in Technical Report No. 42, Process Validation of Protein Manufacturing. The intent of this technical report is to provide an integrated validation life-cycle approach that begins with the use of process development data for the definition of operational parameters as a basis for validation, confirmation, and/or minor adjustment to these parameters at manufacturing scale during production of conformance batches and maintenance of the validated state throughout the product's life cycle.

  4. Development of an Independent Global Land Cover Validation Dataset

    NASA Astrophysics Data System (ADS)

    Sulla-Menashe, D. J.; Olofsson, P.; Woodcock, C. E.; Holden, C.; Metcalfe, M.; Friedl, M. A.; Stehman, S. V.; Herold, M.; Giri, C.

    2012-12-01

    Accurate information on the global distribution and dynamics of land cover is critical for a large number of global change science questions. A growing number of land cover products have been produced at regional to global scales, but the uncertainty in these products and the relative strengths and weaknesses among available products are poorly characterized. To address this limitation we are compiling a database of high spatial resolution imagery to support international land cover validation studies. Validation sites were selected based on a probability sample, and may therefore be used to estimate statistically defensible accuracy statistics and associated standard errors. Validation site locations were identified using a stratified random design based on 21 strata derived from the intersection of Köppen climate classes and a population density layer. In this way, the two major sources of global variation in land cover (climate and human activity) are explicitly included in the stratification scheme. At each site we are acquiring high spatial resolution (< 1 m) satellite imagery for 5-km x 5-km blocks. The response design uses an object-oriented hierarchical legend that is compatible with the UN FAO Land Cover Classification System. Using this response design, we are classifying each site using a semi-automated algorithm that blends image segmentation with a supervised RandomForest classification algorithm. In the long run, the validation site database is designed to support international efforts to validate land cover products. To illustrate, we use the site database to validate the MODIS Collection 4 Land Cover product, providing a prototype for validating the VIIRS Surface Type Intermediate Product scheduled to start operational production early in 2013. As part of our analysis we evaluate sources of error in coarse resolution products, including semantic issues related to the class definitions, mixed pixels, and poor spectral separation between classes.
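    Under a stratified random design like the one described, overall accuracy and its standard error are estimated with stratum area weights. This is a toy sketch: the weights, sample sizes, and counts are hypothetical, and the real design uses 21 Köppen-by-population strata rather than three:

```python
# Per-stratum tuples: (area weight W_h, sample size n_h, correctly classified x_h).
# Values are invented for illustration; weights must sum to 1.
strata = [
    (0.50, 100, 92),
    (0.30, 100, 81),
    (0.20, 100, 68),
]

# Stratified estimator of overall accuracy: weighted mean of per-stratum accuracies.
acc = sum(w * x / n for w, n, x in strata)

# Variance of the estimator: weighted sum of per-stratum binomial variances.
var = sum(w ** 2 * (x / n) * (1 - x / n) / (n - 1) for w, n, x in strata)
se = var ** 0.5
print(f"overall accuracy = {acc:.3f} +/- {se:.3f} (1 SE)")
```

This weighted form is what makes a probability sample "statistically defensible": each stratum's contribution is scaled by its share of the map area rather than its share of the sample.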

  5. Establishing best practices for the validation of atmospheric composition measurements from satellites

    NASA Astrophysics Data System (ADS)

    Lambert, Jean-Christopher

    As a contribution to the implementation of the Global Earth Observation System of Systems (GEOSS), the Committee on Earth Observation Satellites (CEOS) is developing a data quality strategy for satellite measurements. To achieve the GEOSS requirements of consistency and interoperability (e.g., for comparison and for integrated interpretation) of the measurements and their derived data products, proper uncertainty assessment is essential and needs to be continuously monitored and traceable to standards. Therefore, CEOS has undertaken the task of establishing a set of best practices and guidelines for satellite validation, starting with current practices that can be improved over time. Best practices are not intended to be imposed as firm requirements, but rather to be suggested as a baseline for comparison, which could be used by the widest community and provide guidance to newcomers. The present paper reviews the current development of best practices and guidelines for the validation of atmospheric composition satellites. Terminology and general principles of validation are reviewed. Going beyond elementary definitions of validation, such as the assessment of uncertainties, the specific GEOSS context also calls for validation of individual service components and against user requirements. This paper emphasizes two important aspects. The first is the question of "collocation". Validation generally involves comparisons with "reference" measurements of the same quantities, and the question of what constitutes a valid comparison is not the least of the challenges faced. We present a tentative scheme for defining the validity of a comparison and the necessary "collocation" criteria. The second is the information content of the data product. Validation against user requirements, or verification of the "fitness for purpose" of both the data products and their validation, needs to identify what information in the final product is really contributed by the measurement, as opposed to what is contributed by a priori constraints imposed by the retrieval.

  6. An integrated assessment instrument: Developing and validating instrument for facilitating critical thinking abilities and science process skills on electrolyte and nonelectrolyte solution matter

    NASA Astrophysics Data System (ADS)

    Astuti, Sri Rejeki Dwi; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli

    2017-05-01

    Demands on assessment in the learning process have been affected by policy changes. Nowadays, assessment emphasizes not only knowledge but also skills and attitudes; in practice, however, there are many obstacles to measuring them. This paper aims to describe how an integrated assessment instrument was developed and to verify the instrument's validity, including content validity and construct validity. Instrument development followed McIntire's test development model, and development-process data were acquired at each development step. The initial product was reviewed by three peer reviewers and six expert judges (two subject matter experts, two evaluation experts, and two chemistry teachers) to establish content validity. The research involved 376 first-grade students of two senior high schools in Bantul Regency to establish construct validity. Content validity was analyzed using Aiken's formula, and construct validity was verified by exploratory factor analysis using SPSS ver. 16.0. The results show that all constructs in the integrated assessment instrument are valid according to both content validity and construct validity. The integrated assessment instrument is therefore suitable for measuring critical thinking abilities and science process skills of senior high school students on electrolyte solution matter.
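    Aiken's formula for content validity, as applied to expert ratings like these, is V = Σs / (n(c − 1)), where s is each rating minus the lowest possible category, n is the number of raters, and c is the number of rating categories. A sketch with hypothetical ratings from six judges:

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: ratings are n raters' scores on a lo..hi scale.
    V = sum(s) / (n * (c - 1)), with s = rating - lo and c categories."""
    n = len(ratings)
    c = hi - lo + 1
    s = sum(r - lo for r in ratings)
    return s / (n * (c - 1))

# Hypothetical scores for one instrument item from six expert judges.
item_ratings = [5, 4, 5, 4, 4, 5]
v = aikens_v(item_ratings)
print(f"Aiken's V = {v:.3f}")  # compared against a chosen cutoff, e.g. 0.80
```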

  7. Creating an open access cal/val repository via the LACO-Wiki online validation platform

    NASA Astrophysics Data System (ADS)

    Perger, Christoph; See, Linda; Dresel, Christopher; Weichselbaum, Juergen; Fritz, Steffen

    2017-04-01

    There is a major gap in the amount of in-situ data available on land cover and land use, either as field-based ground truth information or from image interpretation, both of which are used for the calibration and validation (cal/val) of products derived from Earth Observation. Although map producers generally publish their confusion matrices and the accuracy measures associated with their land cover and land use products, the cal/val data (also referred to as reference data) are rarely shared in an open manner. While there have been efforts to compile existing reference datasets and make them openly available, e.g., through the GOFC/GOLD (Global Observation for Forest Cover and Land Dynamics) portal or the European Commission's Copernicus Reference Data Access (CORDA), these represent a tiny fraction of the reference data collected and stored locally around the world. Moreover, the validation of land cover and land use maps is usually undertaken with tools and procedures specific to a particular institute or organization due to the lack of standardized validation procedures; thus, there are currently no incentives to share the reference data more broadly with the land cover and land use community. In an effort to provide a set of standardized, online validation tools and to build an open repository of cal/val data, the LACO-Wiki online validation portal has been developed, which will be presented in this paper. The portal contains transparent, documented and reproducible validation procedures that can be applied to local as well as global products. LACO-Wiki was developed through a user consultation process that resulted in a 4-step wizard-based workflow, which supports the user from uploading the map product for validation, through the sampling process and the validation of those samples, to the processing of the results and the creation of a final report that includes a range of commonly reported accuracy measures.
One of the design goals of LACO-Wiki has been to simplify the workflows as much as possible so that the tool can be used both professionally and in an educational or non-expert context. By using the tool for validation, the user agrees to share their validation samples and therefore contribute to an open access cal/val repository. Interest in the use of LACO-Wiki for validation of national land cover or related products has already been expressed, e.g. by national stakeholders under the umbrella of the European Environment Agency (EEA), and for global products by GOFC/GOLD and the Group on Earth Observation (GEO). Thus, LACO-Wiki has the potential to become the focal point around which an international land cover validation community could be built, and could significantly advance the state-of-the-art in land cover cal/val, particularly given recent developments in opening up of the Landsat archive and the open availability of Sentinel imagery. The platform will also offer open access to crowdsourced in-situ data, for example, from the recently developed LACO-Wiki mobile smartphone app, which can be used to collect additional validation information in the field, as well as to validation data collected via its partner platform, Geo-Wiki, where an already established community of citizen scientists collect land cover and land use data for different research applications.
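    The "commonly reported accuracy measures" produced by such a validation report are conventionally derived from the confusion matrix: overall accuracy plus per-class user's and producer's accuracy. A minimal sketch of that derivation (the row/column convention and the toy two-class matrix are assumptions for illustration):

```python
def accuracy_measures(cm):
    """Standard accuracy measures from a square land-cover confusion matrix.

    cm[i][j] = number of samples mapped as class i whose reference label is
    class j (rows = map classes, columns = reference classes; this layout
    is an assumed convention, not mandated by any particular tool).
    """
    k = len(cm)
    total = sum(sum(row) for row in cm)
    diag = [cm[i][i] for i in range(k)]
    overall = sum(diag) / total
    users = [cm[i][i] / sum(cm[i]) for i in range(k)]                 # 1 - commission error
    producers = [cm[i][i] / sum(row[i] for row in cm) for i in range(k)]  # 1 - omission error
    return overall, users, producers

# toy 2-class example: 50 correct "forest" samples, 35 correct "water" samples
oa, ua, pa = accuracy_measures([[50, 5], [10, 35]])  # oa → 0.85
```

Sharing the underlying sample labels, as LACO-Wiki proposes, lets third parties recompute these measures rather than trusting the published matrix alone.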

  8. FDA perspective on specifications for biotechnology products--from IND to PLA.

    PubMed

    Murano, G

    1997-01-01

    Quality standards are obligatory throughout development, approval and post-marketing phases of biotechnology-derived products, thus assuring product identity, purity, and potency/strength. The process of developing and setting specifications should be based on sound science and should represent a logical progression of actions based on the use of experiential data spanning manufacturing process validation, consistency in production, and characterization of relevant product properties/attributes, by multiple analytical means. This interactive process occurs in phases, varying in rigour. It is best described as encompassing a framework which starts with the implementation of realistic/practical operational quality limits, progressing to the establishment/adoption of more stringent specifications. The historical database is generated from preclinical, toxicology and early clinical lots. This supports the clinical development programme which, as it progresses, allows for further assay method validation/refinement, adoption/addition due to relevant or newly recognized product attributes or rejection due to irrelevance. In the next phase, (licensing/approval) specifications are set through extended experience and validation of both the preparative and analytical processes, to include availability of suitable reference standards and extensive product characterization throughout its proposed dating period. Subsequent to product approval, the incremental database of test results serves as a natural continuum for further evolving/refining specifications. While there is considerable latitude in the kinds of testing modalities finally adopted to establish product quality on a routine basis, for both drugs and drug products, it is important that the selection takes into consideration relevant (significant) product characteristics that appropriately reflect on identity, purity and potency.

  9. Interactive learning media based on flash for basic electronic engineering development for SMK Negeri 1 Driyorejo - Gresik

    NASA Astrophysics Data System (ADS)

    Mandigo Anggana Raras, Gustav

    2018-04-01

    This research aims to produce flash-based interactive learning media for a basic electronic engineering subject that are reliable for use, and to gauge students’ responses to the media. The target of this research was the X-TEI 1 class at SMK Negeri 1 Driyorejo – Gresik. The method used was R&D, limited to seven stages: (1) potential and problems, (2) data collection, (3) product design, (4) product validation, (5) product revision, (6) field test, and (7) analysis and writing. The result is an interactive learning medium named MELDASH. A validation process was used to ensure the interactive learning media are valid. The media validation results give the interactive learning media a 90.83% rating, and students’ responses to the media are very positive, with an 88.89% rating.

  10. Online test application development using framework CodeIgniter

    NASA Astrophysics Data System (ADS)

    Wibawa, S. C.; Wahyuningsih, Y.; Sulistyowati, R.; Abidin, R.; Lestari, Y.; Noviyanti; Maulana, D. A.

    2018-01-01

    The purpose of this study was to develop an online test application for vocational students and to assess user acceptance of the application. The method used was Research and Development (R&D), carried out only up to the product pilot phase. The research procedure comprised three stages: (1) analyzing paper-based examinations in comparison with a web-based online test application; (2) designing the media according to the authors’ design; and (3) testing the product, including administering a questionnaire instrument on the completed application. The researchers carried out tests with class X of the computer and network engineering program at Vocational High School (SMK) Darul Ma’wa Plumpang. It can be concluded that: (1) the online test application, comprising 14 questions, received product-validation scores from the validators ranging from a lowest of 25% to a highest of 100%; after product validation, assessment scores across the three aspects ranged from 81.25 to 100 from two different validators, meaning the developed application is very suitable for use in school; (2) based on User Acceptance Testing (UAT), the application was very well received by the students, who recommended it for final semester examinations and other uses, indicating that it is ready and qualified.

  11. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  12. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  13. Distribution and Validation of CERES Irradiance Global Data Products Via Web Based Tools

    NASA Technical Reports Server (NTRS)

    Rutan, David; Mitrescu, Cristian; Doelling, David; Kato, Seiji

    2016-01-01

    The CERES SYN1deg product provides climate-quality, 3-hourly, globally gridded, and temporally complete maps of top-of-atmosphere, in-atmosphere, and surface fluxes. This product requires efficient release to the public and validation to maintain quality assurance. The CERES team developed web tools for the distribution of both the global gridded products and the grid boxes that contain long-term validation sites maintaining high-quality flux observations at the Earth's surface. These are found at: http://ceres.larc.nasa.gov/order_data.php. In this poster we explore the various tools available to users to subset and download the SYN1deg and Surface-EBAF products and to validate them against surface observations. We also analyze differences found in long-term records from well-maintained land surface sites, such as the ARM central facility, and from high-quality buoy radiometers, which due to their isolated nature cannot be maintained in the same manner as their land-based counterparts.

  14. The iMTA Productivity Cost Questionnaire: A Standardized Instrument for Measuring and Valuing Health-Related Productivity Losses.

    PubMed

    Bouwmans, Clazien; Krol, Marieke; Severens, Hans; Koopmanschap, Marc; Brouwer, Werner; Hakkaart-van Roijen, Leona

    2015-09-01

    Productivity losses often contribute significantly to the total costs in economic evaluations adopting a societal perspective. Currently, no consensus exists on the measurement and valuation of productivity losses. We aimed to develop a standardized instrument for measuring and valuing productivity losses. A group of researchers with extensive experience in measuring and valuing productivity losses designed an instrument suitable for self-completion, building on prior knowledge and evidence on validity. The instrument was designed to cover all domains of productivity losses, thus allowing quantification and valuation of all productivity losses. A feasibility study was performed to check the questionnaire's consistency and intelligibility. The iMTA Productivity Cost Questionnaire (iPCQ) includes three modules measuring productivity losses of paid work due to 1) absenteeism and 2) presenteeism, and productivity losses related to 3) unpaid work. Questions for measuring absenteeism and presenteeism were derived from existing validated questionnaires. Because validated measures of losses of unpaid work are scarce, the questions of this module were newly developed. To enhance the instrument's feasibility, simple language was used. The feasibility study included 195 respondents (response rate 80%) older than 18 years. Seven percent (n = 13) identified problems while filling in the iPCQ, including problems with the questionnaire's instructions and routing (n = 6) and wording (n = 2). Five respondents experienced difficulties in estimating the time that would be needed for other people to make up for lost unpaid work. Most modules of the iPCQ are based on validated questions derived from previously available instruments. The instrument is understandable for most of the general public. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
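    The three iPCQ modules feed a valuation that, in broad terms, multiplies lost productive time by a wage rate. The sketch below illustrates only that general valuation logic; the function name, parameters, and the simple linear presenteeism formula are assumptions for illustration, not the iPCQ scoring manual:

```python
def productivity_costs(absent_days, hours_per_day, wage,
                       impaired_days, efficiency,
                       unpaid_hours, replacement_wage):
    """Illustrative valuation of the three productivity-loss domains.

    absenteeism:  full workdays missed, valued at the gross wage
    presenteeism: days worked while impaired, valued by the efficiency
                  shortfall (efficiency is the fraction of normal output)
    unpaid work:  lost unpaid-work hours, valued at the wage of a
                  hypothetical replacement (e.g., a cleaner's hourly rate)
    """
    absenteeism = absent_days * hours_per_day * wage
    presenteeism = impaired_days * hours_per_day * (1 - efficiency) * wage
    unpaid = unpaid_hours * replacement_wage
    return absenteeism + presenteeism + unpaid

# 2 days absent, 3 impaired days at half output, 4 lost unpaid-work hours
total = productivity_costs(2, 8, 30, 3, 0.5, 4, 15)  # → 900
```

Whether friction-cost or human-capital assumptions apply (e.g., capping the absence period) changes these formulas, which is exactly the methodological consensus the paper says is lacking.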

  15. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh Matrix coupled with multi-generational planning enabled the development of a strong rationale to activate the project and to set the vision and goals. Improved risk management and a product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline.
Among these tools, only certain ones required minimal resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All techniques used provided savings exceeding their investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and the estimated savings generated. Globally, all companies can benefit from DEX by smartly selecting and estimating those tools with the best return on investment at the start of a project. For this, a good understanding of the available company resources, background, and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  16. Medical device development.

    PubMed

    Panescu, Dorin

    2009-01-01

    The development of a successful medical product requires not only engineering design efforts, but also clinical, regulatory, marketing and business expertise. This paper reviews items related to the process of designing medical devices. It discusses the steps required to take a medical product idea from concept, through development, verification and validation, regulatory approvals and market release.

  17. Media fill for validation of a good manufacturing practice-compliant cell production process.

    PubMed

    Serra, Marta; Roseti, Livia; Bassi, Alessandra

    2015-01-01

    According to the European Regulation EC 1394/2007, the clinical use of Advanced Therapy Medicinal Products, such as Human Bone Marrow Mesenchymal Stem Cells expanded for the regeneration of bone tissue or Chondrocytes for Autologous Implantation, requires the development of a process in compliance with the Good Manufacturing Practices. The Media Fill test, consisting of a simulation of the expansion process using a microbial growth medium instead of cells, is considered one of the most effective ways to validate a cell production process. Such a simulation allows the identification of any weaknesses in production that could lead to microbiological contamination of the final cell product, as well as the qualification of operators. Here, we report the critical aspects of designing a Media Fill test to be used as a tool for the further validation of the sterility of a cell-based Good Manufacturing Practice-compliant production process.

  18. Development and validation of instrument for ergonomic evaluation of tablet arm chairs

    PubMed Central

    Tirloni, Adriana Seára; dos Reis, Diogo Cunha; Bornia, Antonio Cezar; de Andrade, Dalton Francisco; Borgatto, Adriano Ferreti; Moro, Antônio Renato Pereira

    2016-01-01

    The purpose of this study was to develop and validate an evaluation instrument for tablet arm chairs based on ergonomic requirements, focused on user perceptions and using Item Response Theory (IRT). This exploratory study involved 1,633 participants (university students and professors) in four steps: a pilot study (n=26), semantic validation (n=430), content validation (n=11) and construct validation (n=1,166). Samejima's graded response model was applied to validate the instrument. The results showed that all the steps (theoretical and practical) of the instrument's development and validation processes were successful and that the group of remaining items (n=45) had a high consistency (0.95). This instrument can be used in the furniture industry by engineers and product designers and in the purchasing process of tablet arm chairs for schools, universities and auditoriums. PMID:28337099

  19. 9 CFR 381.22 - Conditions for receiving inspection.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... have conducted a hazard analysis and developed and validated a HACCP plan, in accordance with §§ 417.2... exceed 90 days, during which period the establishment must validate its HACCP plan. (c) Before producing... developed a HACCP plan applicable to that product in accordance with § 417.2 of this chapter. During a...

  20. 9 CFR 381.22 - Conditions for receiving inspection.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... have conducted a hazard analysis and developed and validated a HACCP plan, in accordance with §§ 417.2... exceed 90 days, during which period the establishment must validate its HACCP plan. (c) Before producing... developed a HACCP plan applicable to that product in accordance with § 417.2 of this chapter. During a...

  1. Evaluation of MuSyQ land surface albedo based on LAnd surface Parameters VAlidation System (LAPVAS)

    NASA Astrophysics Data System (ADS)

    Dou, B.; Wen, J.; Xinwen, L.; Zhiming, F.; Wu, S.; Zhang, Y.

    2016-12-01

    Satellite-derived land surface albedo is an essential climate variable that controls the Earth's energy budget and can be used in applications such as climate change, hydrology, and numerical weather prediction. However, the accuracy and uncertainty of surface albedo products should be evaluated against reliable reference truth data prior to such applications. A new comprehensive and systematic Chinese project, the Remote Sensing Application Network (CRSAN), has been launched in recent years. Two subjects of this project are the development of a Multi-source data Synergized Quantitative Remote Sensing Production System (MuSyQ) and of a web-based validation system named the LAnd surface remote sensing Product VAlidation System (LAPVAS), which aim, respectively, to generate quantitative remote sensing products for ecosystem and environmental monitoring and to validate them against reference validation data within a standard validation system. Land surface BRDF/albedo is one of the product datasets of MuSyQ; it has a pentad (5-day) period with 1 km spatial resolution and is derived by the Multi-sensor Combined BRDF Inversion (MCBI) model. In this evaluation of MuSyQ albedo, a multi-validation strategy is implemented by LAPVAS, including direct and multi-scale validation with field-measured albedo and cross validation with the MODIS albedo product over different land covers. The results reveal that the MuSyQ albedo data, with their 5-day temporal resolution, show higher sensitivity and accuracy during periods of land cover change, e.g., snowfall. Setting aside snow and land cover change, MuSyQ albedo is generally of similar accuracy to MODIS albedo and meets the climate modeling requirement of an absolute accuracy of 0.05.
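    Direct validation of a gridded albedo product against field measurements typically reduces to computing bias and RMSE over matched retrieval/reference pairs and comparing the result with the stated absolute-accuracy target. A minimal sketch of that check (the sample values are invented for illustration):

```python
import math

def albedo_validation(satellite, reference):
    """Bias and RMSE between satellite albedo retrievals and matched
    reference albedo values (field measurements or a cross-validation
    product such as MODIS albedo)."""
    diffs = [s - r for s, r in zip(satellite, reference)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# invented matched samples: retrieved vs. field-measured albedo
sat = [0.16, 0.22, 0.31, 0.45]
ref = [0.15, 0.20, 0.33, 0.43]
bias, rmse = albedo_validation(sat, ref)
meets_requirement = rmse <= 0.05  # absolute-accuracy target cited above
```

Multi-scale validation repeats this comparison after aggregating the point measurements up to the 1 km product grid, so that the scale mismatch between a radiometer footprint and a pixel does not masquerade as retrieval error.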

  2. Development Status for the Stennis Space Center LIDAR Product Characterization Range

    NASA Technical Reports Server (NTRS)

    Zanoni, Vicki; Berglund, Judith; Ross, Kenton

    2004-01-01

    The presentation describes efforts to develop a LIDAR in-flight product characterization range at Stennis Space Center as the next phase of the NASA Verification and Validation activities. It describes the status of surveying efforts on targets of interest to LIDAR vendors as well as the potential guidelines that will be used for product characterization.

  3. 75 FR 25763 - Addition to the List of Validated End-Users: Advanced Micro Devices China, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-10

    .... Additional Validated End-User in the PRC and Its Respective ``Eligible Items (By ECCN)'' and ``Eligible... to the ``development'' of products under ECCN 4A003). This authorization was made based on an... Country Validated end-user Eligible items (by ECCN) Eligible destination China (People's Republic of...

  4. Monitoring Land Based Sources of Pollution over Coral Reefs using VIIRS Ocean Color Products

    NASA Astrophysics Data System (ADS)

    Geiger, E.; Strong, A. E.; Eakin, C. M.; Wang, M.; Hernandez, W. J.; Cardona Maldonado, M. A.; De La Cour, J. L.; Liu, G.; Tirak, K.; Heron, S. F.; Skirving, W. J.; Armstrong, R.; Warner, R. A.

    2016-02-01

    NOAA's Coral Reef Watch (CRW) program and the NESDIS Ocean Color Team are developing new products to monitor land based sources of pollution (LBSP) over coral reef ecosystems using the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the S-NPP satellite. LBSP are a major threat to corals that can cause disease and mortality, disrupt critical ecological reef functions, and impede growth, reproduction, and larval settlement, among other impacts. From VIIRS, near-real-time satellite products of Chlorophyll-a, Kd(490), and sea surface temperature are being developed for three U.S. Coral Reef Task Force priority watershed sites - Ka'anapali (West Maui, Hawai'i), Faga'alu (American Samoa), and Guánica Bay (Puerto Rico). Background climatological levels of these parameters are being developed to construct anomaly products. Time-series data are being generated to monitor changes in water quality in near-real-time and provide information on historical variations, especially following significant rain events. A pilot calibration/validation field study of the VIIRS-based ocean color products is underway in Puerto Rico; we plan to expand this validation effort to the other two watersheds. Working with local resource managers, we have identified a focal area for product development and validation for each watershed and its associated local reefs. This poster will present preliminary results and identify a path forward to ensure marine resource managers understand and correctly use the new ocean color products, and to help NOAA CRW refine its satellite products to maximize their benefit to coral reef management. NOAA - National Oceanic and Atmospheric Administration NESDIS - NOAA/National Environmental Satellite, Data, and Information Service S-NPP - Suomi National Polar-orbiting Partnership

  5. Development and application of a validated stability-indicating high-performance liquid chromatographic method using photodiode array detection for simultaneous determination of granisetron, methylparaben, propylparaben, sodium benzoate, and their main degradation products in oral pharmaceutical preparations.

    PubMed

    Hewala, Ismail; El-Fatatry, Hamed; Emam, Ehab; Mabrouk, Mokhtar

    2011-01-01

    A simple, rapid, and sensitive RP-HPLC method using photodiode array detection was developed and validated for the simultaneous determination of granisetron hydrochloride, 1-methyl-1H-indazole-3-carboxylic acid (the main degradation product of granisetron), sodium benzoate, methylparaben, propylparaben, and 4-hydroxybenzoic acid (the main degradation product of parabens) in granisetron oral drops and solutions. The separation of the compounds was achieved within 8 min on a SymmetryShield RP18 column (100 x 4.6 mm id, 3.5 microm particle size) using the mobile phase acetonitrile--0.05 M KH2PO4 buffered to pH 3 using H3PO4 (3+7, v/v). The photodiode array detector was used to test the purity of the peaks, and the chromatograms were extracted at 240 nm. The method was validated, and validation acceptance criteria were met in all cases. The robust method was successfully applied to the determination of granisetron and preservatives, as well as their degradation products in different batches of granisetron oral drops and solutions. The method proved to be sensitive for determination down to 0.04% (w/w) of granisetron degradation product relative to granisetron and 0.03% (w/w) 4-hydroxybenzoic acid relative to total parabens.

  6. Development of Chemistry Game Card as an Instructional Media in the Subject of Naming Chemical Compound in Grade X

    NASA Astrophysics Data System (ADS)

    Bayharti; Iswendi, I.; Arifin, M. N.

    2018-04-01

    The purpose of this research was to produce a chemistry game card as an instructional medium for the subject of naming chemical compounds, and to determine the degree of validity and practicality of the instructional media produced. The type of this research was Research and Development (R&D), which produced a product. The development model used was the 4-D model, which comprises four stages: (1) define, (2) design, (3) develop, and (4) disseminate. This research was restricted to the development stage. The chemistry game card developed was validated by seven validators, and its practicality was tested with class X6 students of SMAN 5 Padang. The research instrument was a questionnaire consisting of a validity sheet and a practicality sheet. Data were collected by distributing the questionnaire to the validators, chemistry teachers, and students, and were analyzed using Cohen's Kappa. Based on the data analysis, the validity of the chemistry game card was 0.87 (highly valid) and its practicality was 0.91 (highly practical).
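    Cohen's Kappa, as used in analyses like the one above, corrects raw agreement between two raters for the agreement expected by chance: kappa = (observed - expected) / (1 - expected). A minimal sketch for two raters over matched items (the rating labels and example data are invented):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa agreement between two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # chance agreement: product of each rater's marginal category frequencies
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# invented example: two validators judging five items
a = ["valid", "valid", "valid", "invalid", "valid"]
b = ["valid", "valid", "invalid", "invalid", "valid"]
k = cohens_kappa(a, b)  # ≈ 0.545
```

Values around 0.87-0.91, as reported above, fall in the range conventionally interpreted as almost perfect agreement.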

  7. The Copernicus S5P Mission Performance Centre / Validation Data Analysis Facility for TROPOMI operational atmospheric data products

    NASA Astrophysics Data System (ADS)

    Compernolle, Steven; Lambert, Jean-Christopher; Langerock, Bavo; Granville, José; Hubert, Daan; Keppens, Arno; Rasson, Olivier; De Mazière, Martine; Fjæraa, Ann Mari; Niemeijer, Sander

    2017-04-01

    Sentinel-5 Precursor (S5P), to be launched in 2017 as the first atmospheric composition satellite of the Copernicus programme, carries as payload the TROPOspheric Monitoring Instrument (TROPOMI) developed by The Netherlands in close cooperation with ESA. Designed to measure Earth radiance and solar irradiance in the ultraviolet, visible and near infrared, TROPOMI will provide Copernicus with observational data on atmospheric composition at unprecedented geographical resolution. The S5P Mission Performance Center (MPC) provides an operational service-based solution for various QA/QC tasks, including the validation of S5P Level-2 data products and the support to algorithm evolution. Those two tasks are to be accomplished by the MPC Validation Data Analysis Facility (VDAF), one MPC component developed and operated at BIRA-IASB with support from S&T and NILU. The routine validation to be ensured by VDAF is complemented by a list of validation AO projects carried out by ESA's S5P Validation Team (S5PVT), with whom interaction is essential. Here we will introduce the general architecture of VDAF, its relation to the other MPC components, the generic and specific validation strategies applied for each of the official TROPOMI data products, and the expected output of the system. The S5P data products to be validated by VDAF are diverse: O3 (vertical profile, total column, tropospheric column), NO2 (total and tropospheric column), HCHO (tropospheric column), SO2 (column), CO (column), CH4 (column), aerosol layer height and clouds (fractional cover, cloud-top pressure and optical thickness). Starting from a generic validation protocol meeting community-agreed standards, a set of specific validation settings is associated with each data product, as well as the appropriate set of Fiducial Reference Measurements (FRM) to which it will be compared. VDAF collects FRMs from ESA's Validation Data Centre (EVDC) and from other sources (e.g., WMO's GAW, NDACC and TCCON).
Data manipulations on satellite and FRM data (format conversion, filtering, co-location, regridding, and vertical smoothing) are performed by the open source software HARP, while more specific manipulations rely on in-house routines. The paper concludes with a short description of the expected outputs of the system.

  8. Validation of laboratory-scale recycling test method of paper PSA label products

    Treesearch

    Carl Houtman; Karen Scallon; Richard Oldack

    2008-01-01

    Starting with test methods and a specification developed by the U.S. Postal Service (USPS) Environmentally Benign Pressure Sensitive Adhesive Postage Stamp Program, a laboratory-scale test method and a specification were developed and validated for pressure-sensitive adhesive labels, By comparing results from this new test method and pilot-scale tests, which have been...

  9. The development of thematic materials using project based learning for elementary school

    NASA Astrophysics Data System (ADS)

    Yuliana, M.; Wiryawan, S. A.; Riyadi

    2018-05-01

Teaching materials are one of the important factors supporting the learning process. This paper discusses the development of thematic materials using project-based learning. Thematic materials are designed to make students active, creative, and cooperative, and to help them reason through problems. The purpose of the research was to develop valid thematic materials using project-based learning. The research followed the four-stage research and development model proposed by Thiagarajan: (1) definition, (2) design, (3) development, and (4) dissemination. The first stage was research and information collection, in the form of a needs analysis using questionnaires, observation, interviews, and document analysis. The design stage was based on the competencies and indicators. The third, development, stage was used for product validation by experts. The validation involved a media validator, a material validator, and a linguistic validator. The expert validation of the thematic materials showed very good overall ratings on a 1-to-5 Likert scale: media validation had a mean score of 4.83, material validation a mean score of 4.68, and linguistic validation a mean score of 4.74. This showed that the thematic materials using project-based learning were valid and feasible to implement in thematic learning.
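
    Aggregating expert ratings of this kind is straightforward; the following is a minimal sketch of how mean Likert scores and a validity judgment might be computed. The per-item ratings and the 4.0 "valid" cutoff are illustrative assumptions, not values reported in the study.

```python
# Sketch: aggregating expert-validation ratings on a 1-5 Likert scale.
# The rating lists and the 4.0 threshold below are hypothetical.
def mean_rating(scores):
    """Average one validator's per-item ratings on a 1-5 Likert scale."""
    if any(not 1 <= s <= 5 for s in scores):
        raise ValueError("Likert ratings must lie in [1, 5]")
    return sum(scores) / len(scores)

def is_valid(mean_score, threshold=4.0):
    """Judge a component valid when its mean rating meets the cutoff."""
    return mean_score >= threshold

# Hypothetical media-validator ratings averaging about 4.83
media_mean = mean_rating([5, 5, 5, 5, 4, 5])
</imports>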

  10. Content validity--establishing and reporting the evidence in newly developed patient-reported outcomes (PRO) instruments for medical product evaluation: ISPOR PRO good research practices task force report: part 1--eliciting concepts for a new PRO instrument.

    PubMed

    Patrick, Donald L; Burke, Laurie B; Gwaltney, Chad J; Leidy, Nancy Kline; Martin, Mona L; Molsen, Elizabeth; Ring, Lena

    2011-12-01

    The importance of content validity in developing patient reported outcomes (PRO) instruments is stressed by both the US Food and Drug Administration and the European Medicines Agency. Content validity is the extent to which an instrument measures the important aspects of concepts that developers or users purport it to assess. A PRO instrument measures the concepts most significant and relevant to a patient's condition and its treatment. For PRO instruments, items and domains as reflected in the scores of an instrument should be important to the target population and comprehensive with respect to patient concerns. Documentation of target population input in item generation, as well as evaluation of patient understanding through cognitive interviewing, can provide the evidence for content validity. Developing content for, and assessing respondent understanding of, newly developed PRO instruments for medical product evaluation will be discussed in this two-part ISPOR PRO Good Research Practices Task Force Report. Topics include the methods for generating items, documenting item development, coding of qualitative data from item generation, cognitive interviewing, and tracking item development through the various stages of research and preparing this tracking for submission to regulatory agencies. Part 1 covers elicitation of key concepts using qualitative focus groups and/or interviews to inform content and structure of a new PRO instrument. Part 2 covers the instrument development process, the assessment of patient understanding of the draft instrument using cognitive interviews and steps for instrument revision. The two parts are meant to be read together. They are intended to offer suggestions for good practices in planning, executing, and documenting qualitative studies that are used to support the content validity of PRO instruments to be used in medical product evaluation. 
Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  11. Agricultural Production: Task Analysis for Livestock Production. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education in the agricultural production program. Section 1 contains a validated task inventory for the livestock production portion of agricultural production IV and V. Tasks are divided into six duty areas:…

  12. Advances in the Research and Development of Natural Health Products as Main Stream Cancer Therapeutics

    PubMed Central

    Ovadje, Pamela; Roma, Alessia; Steckle, Matthew; Nicoletti, Leah; Arnason, John Thor; Pandey, Siyaram

    2015-01-01

Natural health products (NHPs) are defined as natural extracts containing polychemical mixtures; they play a leading role in the discovery and development of drugs for disease treatment. More than 50% of current cancer therapeutics are derived from natural sources. However, the efficacy of natural extracts in treating cancer has not been explored extensively. Scientific research into the validity and mechanism of action of these products is needed to develop NHPs as mainstream cancer therapy. The preclinical and clinical validation of NHPs would be essential for this development. This review summarizes some of the recent advancements in the area of NHPs with anticancer effects. This review also focuses on various NHPs that have been studied to scientifically validate their claims as anticancer agents. Furthermore, this review emphasizes the efficacy of these NHPs in targeting the multiple vulnerabilities of cancer cells for a more selective and efficacious treatment. The studies reviewed here have paved the way for the introduction of more NHPs from traditional medicine to the forefront of modern medicine, in order to provide alternative, safer, and cheaper complementary treatments for cancer therapy and possibly improve the quality of life of cancer patients. PMID:25883673

  13. Development and validation of a stability indicating HPLC method for determination of lisinopril, lisinopril degradation product and parabens in the lisinopril extemporaneous formulation.

    PubMed

    Beasley, Christopher A; Shaw, Jessica; Zhao, Zack; Reed, Robert A

    2005-03-09

    The purpose of the research described herein was to develop and validate a stability-indicating HPLC method for lisinopril, lisinopril degradation product (DKP), methyl paraben and propyl paraben in a lisinopril extemporaneous formulation. The method developed in this report is selective for the components listed above, in the presence of the complex and chromatographically rich matrix presented by the Bicitra and Ora-Sweet SF formulation diluents. The method was also shown to have adequate sensitivity with a detection limit of 0.0075 microg/mL (0.03% of lisinopril method concentration). The validation elements investigated showed that the method has acceptable specificity, recovery, linearity, solution stability, and method precision. Acceptable robustness indicates that the assay method remains unaffected by small but deliberate variations, which are described in ICH Q2A and Q2B guidelines.
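
    The abstract expresses the detection limit both in concentration units and as a percentage of the method concentration; the relation is a simple ratio. The 25 µg/mL nominal concentration below is back-calculated from the stated figures (0.0075 / 0.0003), not given explicitly in the abstract.

```python
# Sketch: expressing a detection limit as a percentage of the nominal
# method concentration. The 25 ug/mL nominal is an inferred value.
def lod_percent(lod_ug_per_ml, nominal_ug_per_ml):
    """Detection limit as a percent of the nominal assay concentration."""
    return 100.0 * lod_ug_per_ml / nominal_ug_per_ml

pct = lod_percent(0.0075, 25.0)  # 0.03% of method concentration
```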

  14. Development and Validation of The SMAP Enhanced Passive Soil Moisture Product

    NASA Technical Reports Server (NTRS)

    Chan, S.; Bindlish, R.; O'Neill, P.; Jackson, T.; Chaubell, J.; Piepmeier, J.; Dunbar, S.; Colliander, A.; Chen, F.; Entekhabi, D.; hide

    2017-01-01

Since the beginning of its routine science operation in March 2015, the NASA SMAP observatory has been returning interference-mitigated brightness temperature observations at L-band (1.41 GHz) frequency from space. The resulting data enable frequent global mapping of soil moisture with a retrieval uncertainty below 0.040 cu m/cu m at a 36 km spatial scale. This paper describes the development and validation of an enhanced version of the current standard soil moisture product. Compared with the standard product that is posted on a 36 km grid, the new enhanced product is posted on a 9 km grid. Derived from the same time-ordered brightness temperature observations that feed the current standard passive soil moisture product, the enhanced passive soil moisture product leverages the Backus-Gilbert optimal interpolation technique, which more fully utilizes the additional information from the original radiometer observations to achieve global mapping of soil moisture with enhanced clarity. The resulting enhanced soil moisture product was assessed using long-term in situ soil moisture observations from core validation sites located in diverse biomes and was found to exhibit an average retrieval uncertainty below 0.040 cu m/cu m. As of December 2016, the enhanced soil moisture product has been made available to the public from the NASA Distributed Active Archive Center at the National Snow and Ice Data Center.
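
    A common statistic for comparing retrievals against core-site in situ data in SMAP cal/val work is the unbiased RMSE (ubRMSE); whether the assessment above used exactly this statistic is an assumption here, and the sample values below are made up. Only the 0.040 m³/m³ target comes from the abstract.

```python
import math

# Sketch: unbiased (bias-removed) RMSE between retrieved and in situ
# soil moisture. Input values are illustrative, not SMAP data.
def ubrmse(retrieved, in_situ):
    """Bias-removed root-mean-square error, in the units of the inputs."""
    n = len(retrieved)
    bias = sum(r - s for r, s in zip(retrieved, in_situ)) / n
    return math.sqrt(
        sum((r - s - bias) ** 2 for r, s in zip(retrieved, in_situ)) / n
    )

err = ubrmse([0.20, 0.25, 0.30, 0.22], [0.18, 0.24, 0.27, 0.23])
```

    Removing the mean bias first separates systematic offset from random retrieval error, which is why the metric is preferred over plain RMSE for soil moisture validation.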

  15. Process for Low Cost Domestic Production of LIB Cathode Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurston, Anthony

The objective of the research was to determine the best low cost method for the large scale production of the Nickel-Cobalt-Manganese (NCM) layered cathode materials. The research and development focused on scaling up the licensed technology from Argonne National Laboratory in BASF's battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large scale production of the NCM cathode materials, there was a significant amount of development needed to support BASF's already existing research program. During the three year period BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project can be demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is being used in electric vehicles.

  16. Validation of the USGS Landsat Burned Area Essential Climate Variable (BAECV) across the conterminous United States

    USGS Publications Warehouse

    Vanderhoof, Melanie; Fairaux, Nicole; Beal, Yen-Ju G.; Hawbaker, Todd J.

    2017-01-01

The Landsat Burned Area Essential Climate Variable (BAECV), developed by the U.S. Geological Survey (USGS), capitalizes on the long temporal availability of Landsat imagery to identify burned areas across the conterminous United States (CONUS) (1984–2015). Adequate validation of such products is critical for their proper usage and interpretation. Validation of coarse-resolution products often relies on independent data derived from moderate-resolution sensors (e.g., Landsat). Validation of Landsat products, in turn, is challenging because there is no corresponding source of high-resolution, multispectral imagery that has been systematically collected in space and time over the entire temporal extent of the Landsat archive. Because of this, comparison between high-resolution images and Landsat science products can help increase users' confidence in the Landsat science products, but may not, alone, be adequate. In this paper, we demonstrate an approach to systematically validate the Landsat-derived BAECV product. Burned area extent was mapped for Landsat image pairs using a manually trained, semi-automated algorithm whose output was then manually edited, across 28 path/rows and five different years (1988, 1993, 1998, 2003, 2008). Three datasets were independently developed by three analysts and integrated on a pixel-by-pixel basis into renditions requiring that at least one, at least two, or all three analysts agreed a pixel was burned. We found that errors within our Landsat reference dataset could be minimized by using the rendition in which pixels were mapped as burned if at least two of the three analysts agreed. BAECV errors of omission and commission for the detection of burned pixels averaged 42% and 33%, respectively, for CONUS across all five validation years.
Errors of omission and commission were lowest across the western CONUS, for example in the shrub and scrublands of the Arid West (31% and 24%, respectively), and highest in the grasslands and agricultural lands of the Great Plains in central CONUS (62% and 57%, respectively). The BAECV product detected most (> 65%) fire events > 10 ha across the western CONUS (Arid and Mountain West ecoregions). Our approach and results demonstrate that a thorough validation of Landsat science products can be completed with independent Landsat-derived reference data, but could be strengthened by the use of complementary sources of high-resolution data.
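
    Omission and commission rates of this kind follow directly from pixel agreement counts between the product and the reference map. A minimal sketch, with illustrative counts chosen only so the resulting rates echo the CONUS-wide figures in the abstract:

```python
# Sketch: omission/commission error rates for burned-pixel detection.
# tp: burned in both maps; fn: burned only in the reference;
# fp: burned only in the product. Counts are hypothetical.
def omission_commission(tp, fn, fp):
    omission = fn / (tp + fn)    # reference burned pixels the product missed
    commission = fp / (tp + fp)  # product burned pixels absent from reference
    return omission, commission

om, com = omission_commission(tp=580, fn=420, fp=290)  # 0.42, ~0.333
```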

  17. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    1999-01-01

The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community. 
Detailed information about the EOS Terra Validation Program can be found on the EOS Validation Program homepage (http://ospso.gsfc.nasa.gov/validation/valpage.html).

  18. Planning Model of Physics Learning In Senior High School To Develop Problem Solving Creativity Based On National Standard Of Education

    NASA Astrophysics Data System (ADS)

    Putra, A.; Masril, M.; Yurnetti, Y.

    2018-04-01

One of the causes of students' low competence achievement in physics learning in senior high school is a learning process that has not developed students' creativity in problem solving. This indicates that teachers' learning plans are not in accordance with the National Education Standard. This study aims to produce a reconstructed model of physics learning that fulfills the competency standards, content standards, and assessment standards of the applicable curriculum. The development process followed these steps: needs analysis, product design, product development, implementation, and product evaluation. The research process involved two peer judgments, four expert judgments, and two study groups of senior high school students in Padang. Qualitative and quantitative data were collected through documentation, observation, questionnaires, and tests. The research, carried up to the product development stage, yielded a physics learning plan model that meets content and construct validity in terms of the fulfillment of Basic Competence, Content Standards, Process Standards, and Assessment Standards.

  19. Games for Health 2005

    DTIC Science & Technology

    2005-10-01

to be exposed to in terms of validation and research. This is important because the game development community is used to only market validation...dev, aggressively pursuing new frontiers. We are working on ways within our greater project and future events to explain more about game development production...technology providers that permeate the game development space. By bringing over these key leaders we can further grow the overall community of

  20. An Inventory Battery to Predict Navy and Marine Corps Recruiter Performance: Development and Validation

    DTIC Science & Technology

    1979-05-01

Cross-Validation Strategies to Different Sections of the Predictor Battery ... Personality Scales ... he generated several performance indices on the basis of assumptions about the recruiting environment and geographical differences in production

  1. Analytical method for the determination of various arsenic species in rice, rice food products, apple juice, and other juices by ion chromatography-inductively coupled plasma/mass spectrometry.

    PubMed

    Ellingson, David; Zywicki, Richard; Sullivan, Darryl

    2014-01-01

    Recent studies have shown that there are detectable levels of arsenic (As) in rice, rice food products, and apple juice. This has created significant concern to the public, the food industry, and various regulatory bodies. Classic test methods typically measure total As and are unable to differentiate the various As species. Since different As species have greatly different toxicities, an analytical method was needed to separate and quantify the different inorganic and organic species of As. The inorganic species arsenite [As(+3)] and arsenate [As(+5)] are highly toxic. With this in mind, an ion chromatography-inductively coupled plasma (IC-ICP/MS) method was developed and validated for rice and rice food products that can separate and individually measure multiple inorganic and organic species of As. This allows for the evaluation of the safety or risk associated with any product analyzed. The IC-ICP/MS method was validated on rice and rice food products, and it has been used successfully on apple juice. This paper provides details of the validated method as well as some lessons learned during its development. Precision and accuracy data are presented for rice, rice food products, and apple juice.

  2. NASA GPM GV Science Implementation

    NASA Technical Reports Server (NTRS)

    Petersen, W. A.

    2009-01-01

Pre-launch algorithm development & post-launch product evaluation: The GPM GV paradigm moves beyond traditional direct validation/comparison activities by incorporating improved algorithm physics & model applications (end-to-end validation) in the validation process. Three approaches: 1) National Network (surface): Operational networks to identify and resolve first order discrepancies (e.g., bias) between satellite and ground-based precipitation estimates. 2) Physical Process (vertical column): Cloud system and microphysical studies geared toward testing and refinement of physically-based retrieval algorithms. 3) Integrated (4-dimensional): Integration of satellite precipitation products into coupled prediction models to evaluate strengths/limitations of satellite precipitation products.

  3. Analysis of various quality attributes of sunflower and soybean plants by near infra-red reflectance spectroscopy: Development and validation calibration models

    USDA-ARS?s Scientific Manuscript database

    Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...

  4. Development and Validation of a Stability-Indicating Assay of Etofenamate by RP-HPLC and Characterization of Degradation Products

    PubMed Central

    Peraman, Ramalingam; Nayakanti, Devanna; Dugga, Hari Hara Theja; Kodikonda, Sudhakara

    2013-01-01

A validated stability-indicating RP-HPLC method for etofenamate (ETF) was developed by separating its degradation products on a C18 (250 mm × 4.6 mm, 5 μm) Qualisil BDS column using a phosphate buffer (pH adjusted to 6.0 with orthophosphoric acid) and methanol in the ratio of 20:80% v/v as the mobile phase at a flow rate of 1.0 mL/min. The column effluents were monitored by a photodiode array detector set at 286 nm. The method was validated in terms of specificity, linearity, accuracy, precision, detection limit, quantification limit, and robustness. Forced degradation of etofenamate was carried out under acidic, basic, thermal, photo, and peroxide conditions, and the major degradation products of acidic and basic degradation were isolated and characterized by ¹H-NMR, ¹³C-NMR, and mass spectral studies. The mass balance of the method varied between 92% and 99%. PMID:24482770
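
    The mass-balance figure in a forced-degradation study is simply the assayed drug remaining plus the quantified degradation products, expressed against the initial amount. A minimal sketch; the percentages below are illustrative, and only the 92–99% window comes from the abstract.

```python
# Sketch: mass-balance check for a forced-degradation study.
# All percentages are relative to the initial drug amount (hypothetical).
def mass_balance(assay_percent, degradant_percents):
    """Total recovered material as a percent of the initial amount."""
    return assay_percent + sum(degradant_percents)

mb = mass_balance(78.5, [10.2, 6.8])  # 95.5, inside the 92-99% window
```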

  5. Implementation effect of productive 4-stage field orientation on the student technopreneur skill in vocational schools

    NASA Astrophysics Data System (ADS)

    Ismail, Edy; Samsudi, Widjanarko, Dwi; Joyce, Peter; Stearns, Roman

    2018-03-01

This model integrates project-based learning by creating a product based on environmental needs. The Produktif Orientasi Lapangan 4 Tahap (POL4T) model combines technical skills and entrepreneurial elements in the learning process. This study implements the result of developing an environment-oriented technopreneurship learning model that combines technology and entrepreneurship components in the Machining Skill Program. The study applies a research and development design with an experimental subject. Data were obtained from questionnaires, learning material validation, interpersonal and intrapersonal observation forms, skills, products, teacher and student responses, and cognitive tasks. Expert validation and t-test calculation were applied to evaluate the effectiveness of the POL4T learning model. The result of the study is a four-stage learning model that enhances interpersonal and intrapersonal attitudes and develops practical products oriented to society and appropriate technology, so that the products can have high selling value. The model is effective: the students' post-test results were better than their pre-test results, and the product obtained from the POL4T model proved better than that of conventional productive learning. The POL4T model is recommended for grade XI students, as it can develop entrepreneurial attitudes oriented to the environment, community needs, and students' technical competencies.

  6. Development and validation of in vitro-in vivo correlation (IVIVC) for estradiol transdermal drug delivery systems.

    PubMed

    Yang, Yang; Manda, Prashanth; Pavurala, Naresh; Khan, Mansoor A; Krishnaiah, Yellela S R

    2015-07-28

The objective of this study was to develop a level A in vitro-in vivo correlation (IVIVC) for drug-in-adhesive (DIA) type estradiol transdermal drug delivery systems (TDDS). In vitro drug permeation studies across human skin were carried out to obtain the percent of estradiol permeation from marketed products. The in vivo time versus plasma concentration data of three estradiol TDDS at drug loadings of 2.0, 3.8 and 7.6 mg (delivery rates of 25, 50 and 100 μg/day, respectively) was deconvoluted using the Wagner-Nelson method to obtain the percent of in vivo drug absorption in postmenopausal women. The IVIVC between the in vitro percent of drug permeation (X) and the in vivo percent of drug absorption (Y) for these three estradiol TDDS was constructed using GastroPlus® software. There was a high correlation (R² = 1.0) with a polynomial regression of Y = −0.227X² + 0.331X − 0.001. These three estradiol TDDS were used for internal validation, whereas another two products of the same formulation design (with delivery rates of 60 and 100 μg/day) were used for external validation. The predicted estradiol serum concentrations (convoluted from in vitro skin permeation data) were compared with the observed serum concentrations for the respective products. The developed IVIVC model passed both the internal and external validations, as the prediction errors (%PE) for Cmax and AUC were less than 15%. When another marketed estradiol TDDS with a delivery rate of 100 μg/day but with a slight variation in formulation design was chosen, it did not pass external validation, indicating the product-specific nature of the IVIVC model. Results suggest that the IVIVC model developed in this study can be used to successfully predict the in vivo performance of the same estradiol TDDS with in vivo delivery rates ranging from 25 to 100 μg/day. Published by Elsevier B.V.
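
    The percent-prediction-error acceptance check described above is easy to state concretely. The <15% criterion for Cmax and AUC comes from the abstract; the observed/predicted values in this sketch are illustrative, not the study's data.

```python
# Sketch: %PE acceptance check for an IVIVC model (hypothetical values).
def percent_pe(observed, predicted):
    """Absolute prediction error as a percent of the observed value."""
    return 100.0 * abs(observed - predicted) / observed

def ivivc_passes(obs_cmax, pred_cmax, obs_auc, pred_auc, limit=15.0):
    """Accept only if both the Cmax and AUC errors are under the limit."""
    return (percent_pe(obs_cmax, pred_cmax) < limit
            and percent_pe(obs_auc, pred_auc) < limit)

ok = ivivc_passes(obs_cmax=50.0, pred_cmax=46.0,
                  obs_auc=1200.0, pred_auc=1290.0)  # 8% and 7.5% -> True
```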

  7. Sea Temperature Fiducial Reference Measurements for the Validation and Data Gap Bridging of Satellite SST Data Products

    NASA Astrophysics Data System (ADS)

    Wimmer, Werenfrid

    2016-08-01

The Infrared Sea surface temperature Autonomous Radiometer (ISAR) was developed to provide reference data for the validation of satellite Sea Surface Temperature at the Skin interface (SSTskin) temperature data products, particularly the Advanced Along Track Scanning Radiometer (AATSR). Since March 2004 ISAR instruments have been deployed nearly continuously on ferries crossing the English Channel and the Bay of Biscay, between Portsmouth (UK) and Bilbao/Santander (Spain). The resulting twelve years of ISAR data, including an individual uncertainty estimate for each SST record, are calibrated with traceability to national standards (National Institute of Standards and Technology, USA (NIST) and National Physical Laboratory, Teddington, UK (NPL), Fiducial Reference Measurements for satellite derived surface temperature product validation (FRM4STS)). They provide a unique independent in situ reference dataset against which to validate satellite derived products. We present results of the AATSR validation, and show the use of ISAR fiducial reference measurements as a common traceable validation data source for both AATSR and the Sea and Land Surface Temperature Radiometer (SLSTR). ISAR data were also used to review performance of the Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) Sea Surface Temperature (SST) analysis before and after the demise of ESA's Environmental Satellite (Envisat), when AATSR inputs ceased. This demonstrates use of the ISAR reference data set for validating the SST climatologies that will bridge the data gap between AATSR and SLSTR.

  8. User Oriented Product Evaluation.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.; Wingard, Joseph

    While the educational product development field has expanded tremendously over the last 15 years, there is a paucity of conveniently assembled and readily interpretable information that would enable users to make accurate and informed evaluations of different, but comparable, instructional products. Minimum types of validation data which should be…

  9. Labor Productivity Standards in Texas School Foodservice Operations

    ERIC Educational Resources Information Center

    Sherrin, A. Rachelle; Bednar, Carolyn; Kwon, Junehee

    2009-01-01

    Purpose: Purpose of this research was to investigate utilization of labor productivity standards and variables that affect productivity in Texas school foodservice operations. Methods: A questionnaire was developed, validated, and pilot tested, then mailed to 200 randomly selected Texas school foodservice directors. Descriptive statistics for…

  10. Development and validation of an HPLC-DAD method for simultaneous determination of cocaine, benzoic acid, benzoylecgonine and the main adulterants found in products based on cocaine.

    PubMed

    Floriani, Gisele; Gasparetto, João Cleverson; Pontarolo, Roberto; Gonçalves, Alan Guilherme

    2014-02-01

Here, an HPLC-DAD method was developed and validated for simultaneous determination of cocaine, two cocaine degradation products (benzoylecgonine and benzoic acid), and the main adulterants found in products based on cocaine (caffeine, lidocaine, phenacetin, benzocaine and diltiazem). The new method was developed and validated using an XBridge C18 4.6 mm × 250 mm, 5 μm particle size column maintained at 60 °C. The mobile phase consisted of a gradient of acetonitrile and 0.05 M ammonium formate (pH 3.1), eluted at 1.0 mL/min. The volume of injection was 10 μL and the DAD detector was set at 274 nm. Method validation assays demonstrated suitable sensitivity, selectivity, linearity, precision and accuracy. For the selectivity assay, an MS detection system could be directly adapted to the method without the need for any change in the chromatographic conditions. The robustness study indicated that the flow rate, temperature and pH of the mobile phase are critical parameters and should not be changed considering the conditions herein determined. The new method was then successfully applied for determining cocaine, benzoylecgonine, benzoic acid, caffeine, lidocaine, phenacetin, benzocaine and diltiazem in 115 samples, seized in Brazil (2007-2012), which consisted of cocaine paste, cocaine base and salt cocaine samples. This study revealed cocaine contents that ranged from undetectable to 97.2%, with 97 samples presenting at least one of the degradation products or adulterants here evaluated. All of the studied degradation products and adulterants were observed among the seized samples, justifying the application of the method, which can be used as a screening and quantification tool in forensic analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Development and Validation of an Instrument to Measure University Students' Biotechnology Attitude

    NASA Astrophysics Data System (ADS)

    Erdogan, Mehmet; Özel, Murat; Uşak, Muhammet; Prokop, Pavol

    2009-06-01

The impact of biotechnologies on people's everyday lives continuously increases. Measuring young people's attitudes toward biotechnologies is therefore very important, and the results are useful not only for science curriculum developers and policy makers, but also for producers and distributors of genetically modified products. Despite the substantial number of instruments that focus on measuring student attitudes toward biotechnology, a majority of them were not rigorously validated. This study deals with the development and validation of an attitude questionnaire toward biotechnology. Detailed information on the development and validation process of the instrument is provided. Data gathered from 326 university students provided evidence for the validity and reliability of the new instrument, which consists of 28 attitude items on a five-point Likert-type scale. It is believed that the instrument will serve as a valuable tool for both instructors and researchers in science education to assess students' biotechnology attitudes.
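
    For Likert-type instruments like this one, internal-consistency reliability is commonly reported as Cronbach's alpha. The abstract does not give its reliability statistics, so the tiny dataset below (3 items, 4 respondents) is purely illustrative of how the coefficient is computed.

```python
# Sketch: Cronbach's alpha from per-item respondent scores.
# items: one list of respondent scores per item, aligned by respondent.
def cronbach_alpha(items):
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    total_scores = [sum(item[j] for item in items) for j in range(n)]
    item_var_sum = sum(sample_var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(total_scores))

alpha = cronbach_alpha([[4, 5, 3, 4], [4, 4, 3, 5], [5, 5, 2, 4]])
```

    Values of alpha around 0.7 or above are conventionally read as acceptable internal consistency for attitude scales.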

  12. Research-to-operations (R2O) for the Space Environmental Effects Fusion System (SEEFS) system-impact products

    NASA Astrophysics Data System (ADS)

    Quigley, Stephen

    The Space Vehicles Directorate of the Air Force Research Laboratory (AFRL/RVBX) and the Space Environment Branch of the Space and Missile Systems Center (SMC SLG/WMLE) have combined efforts to design, develop, test, implement, and validate numerical and graphical products for Air Force Space Command's (AFSPC) Space Environmental Effects Fusion System (SEEFS). These products are generated to analyze, specify, and forecast the effects of the near-earth space environment on Department of Defense weapons, navigation, communications, and surveillance systems. Jointly developed projects that have been completed as prototypes and are undergoing development for real-time operations include a SEEFS architecture and database, five system-impact products, and a high-level decision aid product. This first round of SEEFS products includes the Solar Radio Burst Effects (SoRBE) on radar and satellite communications, Radar Auroral Clutter (RAC), Scintillation Effects on radar and satellite communications (RadScint and SatScint), and Satellite Surface and Deep Charge/Discharge (Char/D) products. This presentation will provide overviews of the current system impact products, along with plans and potentials for future products expected for the SEEFS program. The overviews will include information on applicable research-to-operations (R2O) issues, to include input data coverage and quality control, output confidence levels, modeling standards, and validation efforts.

  13. An Analysis Of Additive Manufacturing Production Problems And Solutions

    DTIC Science & Technology

    2016-12-01

    democratization of manufacturing (Hornick, 2015). AM has three distinct advantages over subtractive manufacturing: product customization, design flexibility...58 develops software and other technology solutions for the design, analysis, testing, manufacture, and validation of products (Diane Ryan, personal...NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA MBA PROFESSIONAL REPORT AN ANALYSIS OF ADDITIVE MANUFACTURING PRODUCTION

  14. Rhetorical Approaches to Crisis Communication: The Research, Development, and Validation of an Image Repair Situational Theory for Educational Leaders

    ERIC Educational Resources Information Center

    Vogelaar, Robert J.

    2005-01-01

    In this project a product to aid educational leaders in the process of communicating in crisis situations is presented. The product was created and received a formative evaluation using an educational research and development methodology. Ultimately, an administrative training course that utilized an Image Repair Situational Theory was developed.…

  15. ME science as mobile learning based on virtual reality

    NASA Astrophysics Data System (ADS)

    Fradika, H. D.; Surjono, H. D.

    2018-04-01

    The purpose of this article is to describe ME Science (Mobile Education Science), a mobile learning application for learning Fisika Inti (nuclear physics). ME Science is a product of research and development (R&D) using the Alessi and Trollip model. The Alessi and Trollip model consists of three stages: (a) planning, which includes analysis of problems, goals, needs, and ideas for the product under development; (b) designing, which includes collecting materials, designing material content, creating storyboards, and evaluating and reviewing the product; (c) developing, which includes development of the product, alpha testing, revision of the product, validation of the product, beta testing, and evaluation of the product. This article describes ME Science only through the development stages. The development has generated a mobile learning application based on virtual reality that runs on Android-based smartphones. The application includes a brief description of the learning material, quizzes, a video summary of the material, and learning material based on virtual reality.

  16. Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) using Currently Existing Flight Resources

    NASA Technical Reports Server (NTRS)

    Bingham, Gail; Bates, Scott; Bugbee, Bruce; Garland, Jay; Podolski, Igor; Levinskikh, Rita; Sychev, Vladimir; Gushin, Vadim

    2009-01-01

    Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) is a study to advance the technology required for plant growth in microgravity and to research related food safety issues. Lada-VPU-P3R also investigates the non-nutritional value to the flight crew of developing plants on-orbit. The Lada-VPU-P3R uses the Lada hardware on the ISS and falls under a cooperative agreement between National Aeronautics and Space Administration (NASA) and the Russian Federal Space Association (FSA). Research Summary: Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) will optimize hardware and

  17. Image quality validation of Sentinel 2 Level-1 products: performance status at the beginning of the constellation routine phase

    NASA Astrophysics Data System (ADS)

    Francesconi, Benjamin; Neveu-VanMalle, Marion; Espesset, Aude; Alhammoud, Bahjat; Bouzinac, Catherine; Clerc, Sébastien; Gascon, Ferran

    2017-09-01

    Sentinel-2 is an Earth Observation mission developed by the European Space Agency (ESA) in the frame of the Copernicus program of the European Commission. The mission is based on a constellation of two satellites: Sentinel-2A, launched in June 2015, and Sentinel-2B, launched in March 2017. It offers an unprecedented combination of systematic global coverage of land and coastal areas, a high revisit of five days at the equator and two days at mid-latitudes under the same viewing conditions, high spatial resolution, and a wide field of view for multispectral observations in 13 bands in the visible, near-infrared and shortwave infrared range of the electromagnetic spectrum. The mission performances are routinely and closely monitored by the S2 Mission Performance Centre (MPC), which includes a consortium of Expert Support Laboratories (ESL). This publication focuses on the Sentinel-2 Level-1 product quality validation activities performed by the MPC. It presents an up-to-date status of the Level-1 mission performances at the beginning of the constellation routine phase. Routinely performed Level-1 validations cover Level-1 Radiometric Validation (Equalisation Validation, Absolute Radiometry Vicarious Validation, Absolute Radiometry Cross-Mission Validation, Multi-temporal Relative Radiometry Vicarious Validation and SNR Validation) and Level-1 Geometric Validation (Geolocation Uncertainty Validation, Multi-spectral Registration Uncertainty Validation and Multi-temporal Registration Uncertainty Validation). Overall, the Sentinel-2 mission is proving very successful in terms of product quality, thereby fulfilling the promises of the Copernicus program.

  18. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available to guide an organization in estimating software development, and existing estimation models often underestimate software development efforts by as much as 500 to 600 percent. When existing models are calibrated using local data with a small sample size, the resulting estimates do not offer improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. The practical implications of this study allow practitioners to concentrate on the specific constructs of interest that provide the best value for the least amount of time. This study outlines the key contributing constructs that are unique to Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. The study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers.
Theoretical contributions of this study include an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results. The hypothesized project productivity model provides statistical support and validation of the expert opinions used by practitioners in the field of software project estimation.

  19. Model development of production management unit to enhance entrepreneurship attitude of vocational school students from fashion department

    NASA Astrophysics Data System (ADS)

    Sumaryani, Sri

    2018-03-01

    The purpose of this study is to develop a model of a production management unit to enhance the entrepreneurship attitude of vocational school students in the fashion department. The study focuses on developing students' entrepreneurship attitude in management, which includes planning, organizing, implementing and evaluating. The study uses a Research and Development (R&D) approach with three main steps: preliminary study, development, and product validation. The research subjects were vocational school teachers from fashion departments in Semarang, Salatiga and Demak. This study yields a development model of a production management unit that can enhance the entrepreneurship attitude of vocational school students in the fashion department. The results show that the research subjects understood the production management unit in vocational schools (SMK).

  20. Impact of Learning Model Based on Cognitive Conflict toward Student’s Conceptual Understanding

    NASA Astrophysics Data System (ADS)

    Mufit, F.; Festiyed, F.; Fauzan, A.; Lufri, L.

    2018-04-01

    Problems that often occur in the learning of physics are misconceptions and low understanding of concepts. Misconceptions occur not only among students, but also among college students and teachers. Existing learning models have not had much impact on improving conceptual understanding or remediating student misconceptions. This study aims to assess the impact of a cognitive conflict-based learning model in improving conceptual understanding and remediating student misconceptions. The research method used is design/development research. The product developed is a cognitive conflict-based learning model along with its components. This article reports the product design results, validity tests, and practicality tests. The study resulted in the design of a cognitive conflict-based learning model with four learning syntaxes, namely (1) preconception activation, (2) presentation of cognitive conflict, (3) discovery of concepts and equations, and (4) reflection. The results of validity tests by several experts on aspects of content, didactics, and appearance or language indicate very valid criteria. Product trial results also show the product is very practical to use. Based on pretest and posttest results, the cognitive conflict-based learning model has a good impact on improving conceptual understanding and remediating misconceptions, especially in high-ability students.

  1. NPOESS Preparatory Project Validation Program for Ocean Data Products from VIIRS

    NASA Astrophysics Data System (ADS)

    Arnone, R.; Jackson, J. M.

    2009-12-01

    The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Program, in partnership with the National Aeronautics and Space Administration (NASA), will launch the NPOESS Preparatory Project (NPP), a risk reduction and data continuity mission, prior to the first operational NPOESS launch. The NPOESS Program, in partnership with Northrop Grumman Aerospace Systems (NGAS), will execute the NPP validation program to ensure the data products comply with the requirements of the sponsoring agencies. Data from the NPP Visible/Infrared Imager/Radiometer Suite (VIIRS) will be used to produce Environmental Data Records (EDRs) of Ocean Color/Chlorophyll and Sea Surface Temperature. The ocean Cal/Val program is designed to address an “end to end” capability from sensor to end product and is built on existing government satellite ocean remote sensing capabilities currently in use for NASA research and for Navy and NOAA operational products. The plan therefore focuses on extending known reliable methods and capabilities currently used with the heritage sensors to the NPP and NPOESS ocean product Cal/Val effort. This is not a fully “new” approach; rather, it is designed to be the most reliable and cost-effective approach to developing an automated Cal/Val system for VIIRS while retaining highly accurate procedures and protocols. This presentation will provide an overview of the approaches, data and schedule for the validation of the NPP VIIRS ocean environmental data products.

  2. Describing the Situational Contexts of Sweetened Product Consumption in a Middle Eastern Canadian Community: Application of a Mixed Method Design

    PubMed Central

    Moubarac, Jean-Claude; Cargo, Margaret; Receveur, Olivier; Daniel, Mark

    2012-01-01

    Little is known about the situational contexts in which individuals consume processed sources of dietary sugars. This study aimed to describe the situational contexts associated with the consumption of sweetened food and drink products in a Catholic Middle Eastern Canadian community. A two-stage exploratory sequential mixed-method design was employed with a rationale of triangulation. In stage 1 (n = 62), items and themes describing the situational contexts of sweetened food and drink product consumption were identified from semi-structured interviews and were used to develop the content for the Situational Context Instrument for Sweetened Product Consumption (SCISPC). Face validity, readability and cultural relevance of the instrument were assessed. In stage 2 (n = 192), a cross-sectional study was conducted and exploratory factor analysis was used to examine the structure of themes that emerged from the qualitative analysis as a means of furthering construct validation. The SCISPC's reliability and predictive validity for the daily consumption of sweetened products were also assessed. In stage 1, six themes and 40 items describing the situational contexts of sweetened product consumption emerged from the qualitative analysis and were used to construct the first draft of the SCISPC. In stage 2, factor analysis enabled the clarification and/or expansion of the instrument's initial thematic structure. The revised SCISPC has seven factors and 31 items describing the situational contexts of sweetened product consumption. Initial validation of the instrument indicated it has excellent internal consistency and adequate test-retest reliability. Two factors of the SCISPC (Snacking and Energy demands) had predictive validity for the daily consumption of total sugar from sweetened products, while the other factors (Socialization, Indulgence, Constraints, Visual Stimuli and Emotional needs) were instead associated with occasional consumption of these products. PMID:23028597

  3. Validating long-term satellite-derived disturbance products: the case of burned areas

    NASA Astrophysics Data System (ADS)

    Boschetti, L.; Roy, D. P.

    2015-12-01

    The potential research, policy and management applications of satellite products place a high priority on providing statements about their accuracy. A number of NASA-, ESA- and EU-funded global and continental burned area products have been developed using coarse spatial resolution satellite data, and have the potential to become part of a long-term fire Climate Data Record. These products have usually been validated by comparison with reference burned area maps derived by visual interpretation of Landsat or similar spatial resolution data selected on an ad hoc basis. More optimally, a design-based validation method should be adopted, characterized by the selection of reference data via probability sampling that can subsequently be used to compute accuracy metrics, taking into account the sampling probability. Design-based techniques have been used for annual land cover and land cover change product validation, but have not been widely used for burned area products, or for the validation of global products that are highly variable in time and space (e.g., snow, floods or other non-permanent phenomena). This has been due to the challenge of designing an appropriate sampling strategy, and to the cost of collecting independent reference data. We propose a three-dimensional sampling grid that allows for probability sampling of Landsat data in time and in space. To sample the globe in the spatial domain with non-overlapping sampling units, the Thiessen Scene Area (TSA) tessellation of the Landsat WRS path/rows is used. The TSA grid is then combined with the 16-day Landsat acquisition calendar to provide three-dimensional elements (voxels). This allows the implementation of a sampling design where not only the location but also the time interval of the reference data is explicitly drawn by probability sampling. The proposed sampling design is a stratified random sampling, with two-level stratification of the voxels based on biomes and fire activity (Figure 1).
This novel validation approach, used for the validation of the MODIS and forthcoming VIIRS global burned area products, is a general one: it could be used for the validation of other global products that are highly variable in space and time, and it is required to assess the accuracy of climate records. The approach is demonstrated using a 1-year dataset of MODIS fire products.
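    A stratified random draw over space-time sampling units of the kind described above can be sketched as follows; the stratum labels, voxel counts, and per-stratum sample size are illustrative assumptions, not the study's actual design.

```python
# Hedged sketch of a stratified random draw over space-time sampling units
# ("voxels" = spatial tile x 16-day period), grouped by (biome, fire-activity)
# stratum with a fixed number drawn per stratum. All names/counts are
# illustrative, not the study's actual design.
import random

random.seed(42)  # reproducible draw

# Hypothetical voxel population: (voxel_id, biome, fire_activity)
voxels = [(i, biome, act)
          for i, (biome, act) in enumerate(
              (b, a) for b in ("tropical", "boreal", "temperate")
                     for a in ("low", "high") for _ in range(50))]

per_stratum = 3
strata = {}
for v in voxels:                       # group voxels by stratum
    strata.setdefault((v[1], v[2]), []).append(v)

sample = []
for key, members in sorted(strata.items()):
    sample.extend(random.sample(members, per_stratum))

# Equal-probability draw within each stratum -> known inclusion probability,
# which design-based accuracy estimators weight by.
inclusion_p = {key: per_stratum / len(members)
               for key, members in strata.items()}
print(len(sample), inclusion_p[("boreal", "low")])
```

    Because every voxel's inclusion probability is known, accuracy metrics computed from the reference data can be weighted to give design-unbiased estimates for the whole population.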

  4. Hydrogen Reduction of Lunar Regolith Simulants for Oxygen Production

    NASA Technical Reports Server (NTRS)

    Hegde, U.; Balasubramaniam, R.; Gokoglu, S. A.; Rogers, K.; Reddington, M.; Oryshchyn, L.

    2011-01-01

    Hydrogen reduction of the lunar regolith simulants JSC-1A and LHT-2M is investigated in this paper. Experiments conducted at NASA Johnson Space Center are described and analyzed using a previously validated model developed by the authors at NASA Glenn Research Center. The effects of regolith sintering and clumping, likely in actual production operations, on the oxygen production rate are studied. Interpretations of the results on the basis of the validated model are provided and linked to an increase in the effective particle size and a reduction in the intra-particle species diffusion rates. Initial results on the pressure dependence of the oxygen production rate are also presented and discussed.

  5. Measurement Sets and Sites Commonly Used for High Spatial Resolution Image Product Characterization

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary

    2006-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site has enabled the in-flight characterization of satellite high spatial resolution remote sensing system products from Space Imaging IKONOS, DigitalGlobe QuickBird, and ORBIMAGE OrbView, as well as advanced multispectral airborne digital camera products. SSC utilizes engineered geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment and its Instrument Validation Laboratory to characterize high spatial resolution remote sensing data products. This presentation describes the SSC characterization capabilities and techniques in the visible through near-infrared spectrum, with examples of calibration results.

  6. Soil moisture mapping using Sentinel 1 images: the proposed approach and its preliminary validation carried out in view of an operational product

    NASA Astrophysics Data System (ADS)

    Paloscia, S.; Pettinato, S.; Santi, E.; Pierdicca, N.; Pulvirenti, L.; Notarnicola, C.; Pace, G.; Reppucci, A.

    2011-11-01

    The main objective of this research is to develop, test and validate a soil moisture content (SMC) algorithm for the GMES Sentinel-1 characteristics, within the framework of an ESA project. The SMC product, to be generated from Sentinel-1 data, requires an algorithm able to run operationally in near-real time and deliver the product to the GMES services within 3 hours of observation. Two complementary approaches have been proposed: an Artificial Neural Network (ANN), which represents the best compromise between retrieval accuracy and processing time, thus allowing compliance with the timeliness requirements, and a Bayesian multi-temporal approach, which increases retrieval accuracy, especially in cases where little ancillary data are available, at the cost of computational efficiency, taking advantage of the frequent revisit time achieved by Sentinel-1. The algorithm was validated in several test areas in Italy, the US and Australia, and finally in Spain with a 'blind' validation. The multi-temporal Bayesian algorithm was validated in Central Italy. The validation results are in all cases very much in line with the requirements. However, the blind validation results were penalized by the availability of only VV-polarization SAR images and low-resolution MODIS NDVI, and the RMS error is slightly above 4%.

  7. GLASS daytime all-wave net radiation product: Algorithm development and preliminary validation

    DOE PAGES

    Jiang, Bo; Liang, Shunlin; Ma, Han; ...

    2016-03-09

    Mapping surface all-wave net radiation (Rn) is critically needed for various applications. Several existing Rn products from numerical models and satellite observations have coarse spatial resolutions, and their accuracies may not meet the requirements of land applications. In this study, we develop the Global LAnd Surface Satellite (GLASS) daytime Rn product at a 5 km spatial resolution. Its algorithm for converting shortwave radiation to all-wave net radiation using the Multivariate Adaptive Regression Splines (MARS) model was determined after comparison with three other algorithms. Validation of the GLASS Rn product against high-quality in situ measurements in the United States shows a coefficient of determination of 0.879, an average root mean square error of 31.61 W m^-2, and an average bias of 17.59 W m^-2. Furthermore, we also compare our product/algorithm with another satellite product (CERES-SYN) and two reanalysis products (MERRA and JRA55), and find that the accuracy of the much higher spatial resolution GLASS Rn product is satisfactory. The GLASS Rn product from 2000 to the present is operational and freely available to the public.
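    The validation statistics quoted above (coefficient of determination, RMSE, bias) can be computed from paired product and in situ values; the pairs below are hypothetical, not GLASS data.

```python
# Hedged sketch of standard product-validation statistics (R^2, RMSE, bias)
# from paired product / in situ values. The numbers are hypothetical.
def validation_stats(product, insitu):
    n = len(product)
    diffs = [p - o for p, o in zip(product, insitu)]
    bias = sum(diffs) / n                       # mean product - observed
    rmse = (sum(d * d for d in diffs) / n) ** 0.5
    mo = sum(insitu) / n
    mp = sum(product) / n
    sxy = sum((o - mo) * (p - mp) for o, p in zip(insitu, product))
    sxx = sum((o - mo) ** 2 for o in insitu)
    syy = sum((p - mp) ** 2 for p in product)
    r2 = sxy * sxy / (sxx * syy)                # coefficient of determination
    return r2, rmse, bias

# Hypothetical daytime net radiation pairs (W m^-2)
prod = [120.0, 240.0, 310.0, 180.0, 415.0]
obs = [110.0, 230.0, 290.0, 170.0, 400.0]
r2, rmse, bias = validation_stats(prod, obs)
print(f"R^2={r2:.3f} RMSE={rmse:.2f} bias={bias:.2f}")
```

    A positive bias, as in the abstract's 17.59 W m^-2, means the product systematically overestimates relative to the in situ reference.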

  8. GLASS daytime all-wave net radiation product: Algorithm development and preliminary validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Bo; Liang, Shunlin; Ma, Han

    Mapping surface all-wave net radiation (Rn) is critically needed for various applications. Several existing Rn products from numerical models and satellite observations have coarse spatial resolutions, and their accuracies may not meet the requirements of land applications. In this study, we develop the Global LAnd Surface Satellite (GLASS) daytime Rn product at a 5 km spatial resolution. Its algorithm for converting shortwave radiation to all-wave net radiation using the Multivariate Adaptive Regression Splines (MARS) model was determined after comparison with three other algorithms. Validation of the GLASS Rn product against high-quality in situ measurements in the United States shows a coefficient of determination of 0.879, an average root mean square error of 31.61 W m^-2, and an average bias of 17.59 W m^-2. Furthermore, we also compare our product/algorithm with another satellite product (CERES-SYN) and two reanalysis products (MERRA and JRA55), and find that the accuracy of the much higher spatial resolution GLASS Rn product is satisfactory. The GLASS Rn product from 2000 to the present is operational and freely available to the public.

  9. Geometry in flipbook multimedia, a role of technology to improve mathematics learning quality: the case in madiun, east java

    NASA Astrophysics Data System (ADS)

    Andini, S.; Fitriana, L.; Budiyono

    2018-04-01

    This research aims to describe the process and product of developing learning material using a flipbook. The learning material is developed for geometry, especially quadrilaterals. This research belongs to Research and Development (R&D). The procedure follows the steps of the Budiyono model: conducting preliminary research, planning and developing a theoretical and prototype product, and determining product quality (validity, practicality, and effectiveness). The experts' average assessment of the theoretical product is 4.54, while their validity rating of the prototype product is 4.62. Practicality was assessed during the implementation of the flipbook prototype in each meeting of the limited-scale tryout based on learning observation, with an average score of 4.10, increasing to 4.50 in the wide-scale tryout. The effectiveness of the prototype product was assessed from pretest and posttest results in the limited-scale and wide-scale tryouts. The pretest results showed a significant increase in average score from the limited-scale to the wide-scale tryout of 25.2, and the average posttest score increased by 8.16 between the limited-scale and wide-scale tryouts. From the product quality results it can be concluded that the flipbook media can be used for geometry learning in elementary schools that implement the 2013 curriculum.

  10. Evaluation of coarse scale land surface remote sensing albedo product over rugged terrain

    NASA Astrophysics Data System (ADS)

    Wen, J.; Xinwen, L.; You, D.; Dou, B.

    2017-12-01

    Satellite-derived land surface albedo is an essential climate variable that controls the earth's energy budget and can be used in applications such as climate change, hydrology, and numerical weather prediction. The accuracy and uncertainty of surface albedo products should be evaluated against reliable reference truth data prior to application. Most of the literature has investigated validation methods for albedo over flat or homogeneous surfaces; however, albedo product performance over rugged terrain is still unknown due to the limitations of existing validation methods. A multi-validation strategy is implemented here to give a comprehensive albedo validation, involving high-resolution albedo processing, high-resolution albedo validation based on in situ albedo, and a method to upscale the high-resolution albedo to a coarse scale. Among these, high-resolution albedo generation and the upscaling method are the core steps for coarse-scale albedo validation. In this paper, the high-resolution albedo is generated by the Angular Bin algorithm, and an albedo upscaling method over rugged terrain is developed to obtain the coarse-scale albedo truth. In situ albedo from 40 sites in mountain areas, selected globally, is used to validate the high-resolution albedo, which is then upscaled to the coarse scale by the upscaling method. This paper takes the MODIS and GLASS albedo products as examples; the preliminary results show that the RMSEs of the MODIS and GLASS albedo products over rugged terrain are 0.047 and 0.057, respectively, compared with an RMSE of 0.036 for the high-resolution albedo.

  11. Simulator validation results and proposed reporting format from flight testing a software model of a complex, high-performance airplane.

    DOT National Transportation Integrated Search

    2008-01-01

    Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...

  12. Design and Development Computer-Based E-Learning Teaching Material for Improving Mathematical Understanding Ability and Spatial Sense of Junior High School Students

    NASA Astrophysics Data System (ADS)

    Nurjanah; Dahlan, J. A.; Wibisono, Y.

    2017-02-01

    This paper presents the design and development of computer-based e-learning teaching material for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) producing the teaching material design, evaluation model, and instruments to measure the mathematical understanding ability and spatial sense of junior high school students; (2) conducting trials of the computer-based e-learning teaching material model, assessment, and instruments; (3) completing the teaching material models of computer-based e-learning and assessment; and (4) delivering the resulting research product, the computer-based e-learning teaching materials, in the form of an interactive learning disc. The research method used in this study is developmental research, conducted by thought experiment and instruction experiment. The results showed that the teaching materials could be used very well, based on validation of the computer-based e-learning teaching materials by 5 multimedia experts. The five validators gave consistent judgements of the face and content validity of each test item for mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, which is very high, while the validity of both tests meets high and very high criteria.

  13. Development and application of a validated stability-indicating HPLC method for simultaneous determination of granisetron hydrochloride, benzyl alcohol and their main degradation products in parenteral dosage forms.

    PubMed

    Hewala, Ismail; El-Fatatre, Hamed; Emam, Ehab; Mubrouk, Mokhtar

    2010-06-30

    A simple, rapid and sensitive reversed-phase high performance liquid chromatographic method using photodiode array detection was developed and validated for the simultaneous determination of granisetron hydrochloride, benzyl alcohol, 1-methyl-1H-indazole-3-carboxylic acid (the main degradation product of granisetron) and benzaldehyde (the main degradation product of benzyl alcohol) in granisetron injections. The separation was achieved on a Hypersil BDS C8 (250 mm x 4.6 mm i.d., 5 µm particle diameter) column using a mobile phase consisting of acetonitrile:0.05 M KH2PO4:triethylamine (22:100:0.15) adjusted to pH 4.8. The column was maintained at 25 °C and 20 µL of the solutions was injected. The photodiode array detector was used to test peak purity, and the chromatograms were extracted at 210 nm. Naphazoline hydrochloride was used as the internal standard. The method was validated with respect to specificity, linearity, accuracy, precision, limit of quantitation and limit of detection. The validation acceptance criteria were met in all cases. Identification of the pure peaks was carried out using a library match program and the wavelengths of derivative optima of the spectrograms of the peaks. The method was successfully applied to the determination of the investigated drugs and their degradation products in different batches of granisetron injections. The method proved sensitive enough for determination down to 0.03 and 0.01% of the granisetron degradation product and benzaldehyde, respectively, which are far below the compendial limits for testing these degradation products in their corresponding intact drugs. Copyright 2010 Elsevier B.V. All rights reserved.

  14. VIIRS Land Science Investigator-Led Processing System

    NASA Astrophysics Data System (ADS)

    Devadiga, S.; Mauoka, E.; Roman, M. O.; Wolfe, R. E.; Kalb, V.; Davidson, C. C.; Ye, G.

    2015-12-01

    The objective of NASA's Suomi National Polar-orbiting Partnership (S-NPP) Land Science Investigator-led Processing System (Land SIPS), housed at the NASA Goddard Space Flight Center (GSFC), is to produce high-quality land products from the Visible Infrared Imaging Radiometer Suite (VIIRS) to extend the Earth System Data Records (ESDRs) developed from NASA's heritage Earth Observing System (EOS) Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the EOS Terra and Aqua satellites. In this paper we present the functional description and capabilities of the S-NPP Land SIPS, including system development phases and production schedules, the timeline for processing, and delivery of land science products based on coordination with the S-NPP Land science team members. The Land SIPS processing stream is expected to be operational by December 2016, generating land products using either the NASA science team-delivered algorithms or the "best-of" science algorithms currently in operation at NASA's Land Product Evaluation and Algorithm Testing Element (PEATE). In addition to generating the standard land science products by processing NASA's VIIRS Level 0 data record, the Land SIPS processing system is also used to produce a suite of near-real-time products for NASA's application community. Land SIPS will also deliver the standard products, ancillary data sets, software and supporting documentation (ATBDs) to the assigned Distributed Active Archive Centers (DAACs) for archival and distribution. 
Quality assessment and validation will be an integral part of the Land SIPS processing system: the former performed at the Land Data Operational Product Evaluation (LDOPE) facility, and the latter under the auspices of the CEOS Working Group on Calibration & Validation (WGCV) Land Product Validation (LPV) Subgroup, adopting the best practices and tools used to assess the quality of heritage EOS-MODIS products generated at the MODIS Adaptive Processing System (MODAPS).

  15. Meat mixture detection in Iberian pork sausages.

    PubMed

    Ortiz-Somovilla, V; España-España, F; De Pedro-Sanz, E J; Gaitán-Jurado, A J

    2005-11-01

    Five homogenized meat mixture treatments of Iberian (I) and/or Standard (S) pork were set up. Each treatment was analyzed by NIRS as a fresh product (N=75) and as dry-cured sausage (N=75). Spectra acquisition was carried out using DA 7000 equipment (Perten Instruments), obtaining a total of 750 spectra. Several absorption peaks and bands were selected as the most representative for homogenized dry-cured and fresh sausages. Discriminant analyses and mixture prediction equations were developed from the spectral data gathered. The best results using discriminant models were for fresh products, with 98.3% (calibration) and 60% (validation) correct classification. For dry-cured sausages, 91.7% (calibration) and 80% (validation) of the samples were correctly classified. Models developed using mixture prediction equations showed SECV=4.7, r(2)=0.98 (calibration), and 73.3% of the validation set was correctly classified for the fresh product. The corresponding values for dry-cured sausages were SECV=5.9, r(2)=0.99 (calibration), and 93.3% correctly classified for validation.
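
    The calibration/validation workflow reported above (fit a classifier on one sample set, report percent correctly classified on a held-out set) can be sketched in miniature. The synthetic "spectra" and the nearest-centroid rule below are illustrative stand-ins, not the NIRS data or discriminant models of the study:

```python
# Hedged sketch of a calibration/validation split for spectral classification.
# Data are simulated; a nearest-centroid rule stands in for the discriminant models.
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_cal, n_val, n_wavelengths = 5, 12, 3, 40

# Class-specific mean spectra plus noise (purely synthetic)
means = rng.normal(size=(n_classes, n_wavelengths))

def simulate(n_per_class):
    X = np.vstack([means[c] + 0.3 * rng.normal(size=(n_per_class, n_wavelengths))
                   for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)
    return X, y

X_cal, y_cal = simulate(n_cal)   # "calibration" set
X_val, y_val = simulate(n_val)   # held-out "validation" set

# Calibration: estimate one centroid spectrum per mixture class
centroids = np.vstack([X_cal[y_cal == c].mean(axis=0) for c in range(n_classes)])

# Validation: assign each held-out spectrum to the nearest centroid
dists = np.linalg.norm(X_val[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
pct_correct = 100.0 * (pred == y_val).mean()
```

The percent-correct figure on the validation set is the analogue of the 60-93.3% validation rates quoted in the record.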

  16. Evaluation of a moderate resolution, satellite-based impervious surface map using an independent, high-resolution validation data set

    USGS Publications Warehouse

    Jones, J.W.; Jarnagin, T.

    2009-01-01

    Given the relatively high cost of mapping impervious surfaces at regional scales, substantial effort is being expended in the development of moderate-resolution, satellite-based methods for estimating impervious surface area (ISA). To rigorously assess the accuracy of these data products, high-quality, independently derived validation data are needed. High-resolution data were collected across a gradient of development within the Mid-Atlantic region to assess the accuracy of National Land Cover Data (NLCD) Landsat-based ISA estimates. Absolute error (satellite-predicted area - "reference area") and relative error [(satellite-predicted area - "reference area") / "reference area"] were calculated for each of 240 sample regions, each more than 15 Landsat pixels on a side. The ability to compile and examine ancillary data in a geographic information system environment provided for evaluation of both the validation and NLCD data and afforded efficient exploration of observed errors. In a minority of cases, errors could be explained by temporal discontinuities between the date of satellite image capture and the validation source data in rapidly changing places. In others, errors were created by vegetation cover over impervious surfaces and by other factors that bias the satellite processing algorithms. On average in the Mid-Atlantic region, the NLCD product underestimates ISA by approximately 5%. While the error range varies between 2 and 8%, this underestimation occurs regardless of development intensity. Through such analyses the errors, strengths, and weaknesses of particular satellite products can be explored to suggest appropriate uses for regional, satellite-based data in rapidly developing areas of environmental significance. © 2009 ASCE.
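
    The two error metrics defined in this record compute directly per sample region; the ISA values below are made-up illustrative numbers, not NLCD data:

```python
# Per-region error metrics as defined in the abstract:
#   absolute error = predicted - reference
#   relative error = (predicted - reference) / reference
# Values are illustrative, not actual NLCD or validation data.
predicted = [42.0, 10.5, 88.0]   # satellite-predicted ISA per region (e.g., hectares)
reference = [45.0, 10.0, 90.0]   # high-resolution "reference" ISA

abs_err = [p - r for p, r in zip(predicted, reference)]
rel_err = [(p - r) / r for p, r in zip(predicted, reference)]

# Mean relative error across regions; a negative value indicates underestimation,
# consistent in sign with the ~5% underestimation reported for the Mid-Atlantic.
mean_rel_err_pct = 100.0 * sum(rel_err) / len(rel_err)
```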

  17. Increased importance of the documented development stage in process validation.

    PubMed

    Mohammed-Ziegler, Ildikó; Medgyesi, Ildikó

    2012-07-01

    Current trends in pharmaceutical quality assurance shifted when the Food and Drug Administration (FDA) of the USA published its new guideline on process validation in 2011. This guidance introduced the lifecycle approach to process validation. In this short communication, some typical changes in the practice of API production are addressed in the light of inspection experiences, and some details are compared with the European regulations.

  18. Future Food Production System Development Pulling From Space Biology Crop Growth Testing in Veggie

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Romeyn, Matt; Fritsche, Ralph

    2017-01-01

    Preliminary crop testing using Veggie indicates that the environmental conditions provided by the ISS are generally suitable for food crop production. When plant samples were returned to Earth for analysis, their levels of nutrients were comparable to Earth-grown ground controls. Food safety microbiology analysis of Veggie-grown produce indicated that space-grown crops are safe to consume. Produce-sanitizing wipes were used on orbit to further reduce the risk of foodborne illness. Validation growth tests indicated abiotic challenges of insufficient or excess fluid delivery, potentially reduced air flow leading to excess water, elevated CO2 leading to physiological responses, and microorganisms that became opportunistic pathogens. As NASA works to develop future space food production, several areas of research to define these systems pull from the Veggie technology validation tests. Research into effective, reusable water delivery and water recovery methods for future food production systems arises from the abiotic challenges observed. Additionally, the impacts of elevated CO2 need to be assessed and the fertilizer and light recipes for crops refined. Biotic pulls include methods or technologies to effectively sanitize produce with few consumables and low inputs; work to understand the phytomicrobiome and potentially use it to protect crops or enhance growth; selection of crops with high harvest index and desirable flavors for supplemental nutrition; crops that provide psychosocial benefits; and custom space crop development. Planning for future food production in a deep space gateway or a deep space transit vehicle requires methods of handling and storing seeds, and ensuring space seeds are free of contaminants and long-lived. Space food production systems may require mechanization and autonomous operation, with preliminary testing initiated to identify operations and capabilities that are candidates for automation. 
Food production design is also pulling from Veggie logistics lessons, as we learn about growing at different scales and move toward developing systems that require less launch mass. Veggie will be used as a test bed for novel food production technologies. Veggie is a relatively simple precursor food production system, but the knowledge gained from space biology validation tests in Veggie will have far-reaching repercussions on future exploration food production. This work is supported by NASA.

  19. Future Food Production System Development Pulling from Space Biology Crop Growth Testing in Veggie

    NASA Technical Reports Server (NTRS)

    Massa, G. D.; Romeyn, M. W.; Fritsche, R. F.

    2017-01-01

    Preliminary crop testing using Veggie indicates that the environmental conditions provided by the ISS are generally suitable for food crop production. When plant samples were returned to Earth for analysis, their levels of nutrients were comparable to Earth-grown ground controls. Food safety microbiology analysis of Veggie-grown produce indicated that space-grown crops are safe to consume. Produce-sanitizing wipes were used on orbit to further reduce the risk of foodborne illness. Validation growth tests indicated abiotic challenges of insufficient or excess fluid delivery, potentially reduced air flow leading to excess water, elevated CO2 leading to physiological responses, and microorganisms that became opportunistic pathogens. As NASA works to develop future space food production, several areas of research to define these systems pull from the Veggie technology validation tests. Research into effective, reusable water delivery and water recovery methods for future food production systems arises from the abiotic challenges observed. Additionally, the impacts of elevated CO2 need to be assessed and the fertilizer and light recipes for crops refined. Biotic pulls include methods or technologies to effectively sanitize produce with few consumables and low inputs; work to understand the phytomicrobiome and potentially use it to protect crops or enhance growth; selection of crops with high harvest index and desirable flavors for supplemental nutrition; crops that provide psychosocial benefits; and custom space crop development. Planning for future food production in a deep space gateway or a deep space transit vehicle requires methods of handling and storing seeds, and ensuring space seeds are free of contaminants and long-lived. Space food production systems may require mechanization and autonomous operation, with preliminary testing initiated to identify operations and capabilities that are candidates for automation. 
Food production design is also pulling from Veggie logistics lessons, as we learn about growing at different scales and move toward developing systems that require less launch mass. Veggie will be used as a test bed for novel food production technologies. Veggie is a relatively simple precursor food production system, but the knowledge gained from space biology validation tests in Veggie will have far-reaching repercussions on future exploration food production.

  20. MODIS Land Data Products: Generation, Quality Assurance and Validation

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Morisette, Jeffery; Sinno, Scott; Teague, Michael; Saleous, Nazmi; Devadiga, Sadashiva; Justice, Christopher; Nickeson, Jaime

    2008-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on board NASA's Earth Observing System (EOS) Terra and Aqua satellites are key sources of data on global land, atmosphere, and ocean dynamics. Derived MODIS land, atmosphere and ocean products are central to NASA's mission to monitor and understand the Earth system. NASA has developed and generated on a systematic basis a suite of MODIS products, starting with the first Terra MODIS data sensed February 22, 2000 and continuing with the first Aqua MODIS data sensed July 2, 2002. The MODIS Land products are divided into three product suites: radiation budget products, ecosystem products, and land cover characterization products. The production and distribution of the MODIS Land products are described, from initial software delivery by the MODIS Land Science Team, to operational product generation and quality assurance, delivery to EOS archival and distribution centers, and product accuracy assessment and validation. Progress and lessons learned since the first MODIS data were acquired in early 2000 are described.

  1. Recommendations for designing and conducting cold-fill hold challenge studies for acidified food products

    USDA-ARS?s Scientific Manuscript database

    A scheduled process developed for manufacture of acidified foods must be validated with data from existing literature or from a product-specific challenge study, either of which can establish both safety and shelf stability. The challenge study would evaluate the ability of a particular food product...

  2. Estimating and validating ground-based timber harvesting production through computer simulation

    Treesearch

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating the production of ground-based timber harvesting systems with an object-oriented methodology was investigated. The estimation model developed generates stands of trees; simulates chain saw, drive-to-tree feller-buncher, and swing-to-tree single-grip harvester felling, as well as grapple skidder and forwarder extraction activities; and analyzes costs and productivity. It also...

  3. Global Land Product Validation Protocols: An Initiative of the CEOS Working Group on Calibration and Validation to Evaluate Satellite-derived Essential Climate Variables

    NASA Astrophysics Data System (ADS)

    Guillevic, P. C.; Nickeson, J. E.; Roman, M. O.; camacho De Coca, F.; Wang, Z.; Schaepman-Strub, G.

    2016-12-01

    The Global Climate Observing System (GCOS) has specified the need to systematically produce and validate Essential Climate Variables (ECVs). The Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV), and in particular its subgroup on Land Product Validation (LPV), is playing a key coordination role, leveraging the international expertise required to address actions related to the validation of global land ECVs. The primary objective of the LPV subgroup is to set standards for validation methods and reporting in order to provide traceable and reliable uncertainty estimates for scientists and stakeholders. The subgroup comprises 9 focus areas that encompass 10 land surface variables. The activities of each focus area are coordinated by two international co-leads and currently include leaf area index (LAI) and fraction of absorbed photosynthetically active radiation (FAPAR), vegetation phenology, surface albedo, fire disturbance, snow cover, land cover and land use change, soil moisture, and land surface temperature (LST) and emissivity. Recent additions to the focus areas include vegetation indices and biomass. The development of best-practice validation protocols is a core activity of CEOS LPV, with the objective of standardizing the evaluation of land surface products. LPV has identified four validation levels corresponding to increasing spatial and temporal representativeness of the reference samples used to perform validation. Best-practice validation protocols (1) provide the definition of variables, ancillary information and uncertainty metrics, (2) describe available data sources and methods to establish reference validation datasets with SI traceability, and (3) describe evaluation methods and reporting. An overview of validation best-practice components will be presented based on the LAI and LST protocol efforts to date.

  4. Pre-Launch Tasks Proposed in our Contract of December 1991

    NASA Technical Reports Server (NTRS)

    1998-01-01

    We propose, during the pre-EOS phase, to: (1) develop, with other MODIS Team Members, a means of discriminating different major biome types with NDVI and other AVHRR-based data; (2) develop a simple ecosystem process model for each of these biomes, BIOME-BGC; (3) relate the seasonal trend of weekly composite NDVI to vegetation phenology and temperature limits to develop a satellite-defined growing season for vegetation; and (4) define physiologically based energy-to-mass conversion factors for carbon and water for each biome. Our final core at-launch product will be simplified, completely satellite-driven, biome-specific models for net primary production. We will build these biome-specific satellite-driven algorithms using a family of simple ecosystem process models as calibration models, collectively called BIOME-BGC, and establish coordination with an existing network of ecological study sites in order to test and validate these products. Field datasets will then be available for BIOME-BGC development and testing, for algorithm development by other MODIS Team Members, and ultimately as our first test point for MODIS land vegetation products upon launch. We will use field sites from the National Science Foundation Long-Term Ecological Research network, and develop Glacier National Park as a major site for intensive validation.

  5. Pre-Launch Tasks Proposed in our Contract of December 1991

    NASA Technical Reports Server (NTRS)

    Running, Steven W.; Nemani, Ramakrishna R.; Glassy, Joseph

    1997-01-01

    We propose, during the pre-EOS phase, to: (1) develop, with other MODIS Team Members, a means of discriminating different major biome types with NDVI and other AVHRR-based data; (2) develop a simple ecosystem process model for each of these biomes, BIOME-BGC; (3) relate the seasonal trend of weekly composite NDVI to vegetation phenology and temperature limits to develop a satellite-defined growing season for vegetation; and (4) define physiologically based energy-to-mass conversion factors for carbon and water for each biome. Our final core at-launch product will be simplified, completely satellite-driven, biome-specific models for net primary production. We will build these biome-specific satellite-driven algorithms using a family of simple ecosystem process models as calibration models, collectively called BIOME-BGC, and establish coordination with an existing network of ecological study sites in order to test and validate these products. Field datasets will then be available for BIOME-BGC development and testing, for algorithm development by other MODIS Team Members, and ultimately as our first test point for MODIS land vegetation products upon launch. We will use field sites from the National Science Foundation Long-Term Ecological Research network, and develop Glacier National Park as a major site for intensive validation.

  6. Validation of Suomi NPP OMPS Limb Profiler Ozone Measurements

    NASA Astrophysics Data System (ADS)

    Buckner, S. N.; Flynn, L. E.; McCormick, M. P.; Anderson, J.

    2017-12-01

    The Ozone Mapping and Profiler Suite (OMPS) Limb Profiler onboard the Suomi National Polar-orbiting Partnership satellite (SNPP) makes measurements of limb-scattered solar radiances at ultraviolet and visible wavelengths. These measurements are used in retrieval algorithms to create high-vertical-resolution ozone profiles, helping monitor the evolution of the atmospheric ozone layer. NOAA is in the process of implementing these algorithms to make near-real-time versions of these products. The main objective of this project is to generate estimates of the accuracy and precision of the OMPS Limb products through matchup comparisons with similar products from the Earth Observing System Microwave Limb Sounder (EOS Aura MLS). The studies investigated the sources of errors and classified them with respect to height, geographic location, and atmospheric and observation conditions. In addition, this project included working with the algorithm developers to develop corrections and adjustments. Collocation and zonal mean comparisons were made, and statistics were gathered on both a daily and monthly basis encompassing the entire OMPS data record. This validation effort for the OMPS-LP data will be used to help validate data from the Stratospheric Aerosol and Gas Experiment III on the International Space Station (SAGE III ISS) and will also be used in conjunction with the NOAA Total Ozone from Assimilation of Stratosphere and Troposphere (TOAST) product to develop a new a priori for the NOAA Unique Combined Atmosphere Processing System (NUCAPS) ozone product. The current NUCAPS ozone product uses a combination of Cross-track Infrared Sounder (CrIS) data for the troposphere and a tropopause-based climatology derived from ozonesonde data as the stratospheric a priori. The latest version of TOAST uses a combination of both CrIS and OMPS-LP data. 
We will further develop the newest version of TOAST and incorporate it into the NUCAPS system as a new a priori, in the hope of creating a better global ozone product.

  7. Overview of SCIAMACHY validation: 2002 2004

    NASA Astrophysics Data System (ADS)

    Piters, A. J. M.; Bramstedt, K.; Lambert, J.-C.; Kirchhoff, B.

    2005-08-01

    SCIAMACHY, on board Envisat, has now been in operation for almost three years. This UV/visible/NIR spectrometer measures the solar irradiance, the earthshine radiance scattered at nadir and from the limb, and the attenuation of solar radiation by the atmosphere during sunrise and sunset, from 240 to 2380 nm and at moderate spectral resolution. Vertical columns and profiles of a variety of atmospheric constituents are inferred from the SCIAMACHY radiometric measurements by dedicated retrieval algorithms. With the support of ESA and several international partners, a methodical SCIAMACHY validation programme has been developed jointly by Germany, the Netherlands and Belgium (the three instrument-providing countries) to face complex requirements in terms of measured species, altitude range, spatial and temporal scales, geophysical states and intended scientific applications. This summary paper describes the approach adopted to address those requirements. The actual validation of the operational SCIAMACHY processors established at DLR on behalf of ESA has been hampered by data distribution and processor problems. Since the first data releases in summer 2002, operational processors have been upgraded regularly and some data products - level-1b spectra and level-2 O3, NO2, BrO and cloud data - have improved significantly. Validation results summarised in this paper indicate that for limited periods and geographical domains these products can already be used for atmospheric research. Nevertheless, remaining processor problems cause major errors that prevent scientific use in other periods and domains. Untied to the constraints of operational processing, seven scientific institutes (BIRA-IASB, IFE, IUP-Heidelberg, KNMI, MPI, SAO and SRON) have developed their own retrieval algorithms and generated SCIAMACHY data products, together addressing nearly all targeted constituents. 
Most of the UV-visible data products (both columns and profiles) already have acceptable, if not excellent, quality. Several near-infrared column products are still in development but they have already demonstrated their potential for a variety of applications. In any case, scientific users are advised to read carefully validation reports before using the data. It is required and anticipated that SCIAMACHY validation will continue throughout instrument lifetime and beyond. The actual amount of work will obviously depend on funding considerations.

  8. The Consumer Motivation Scale: A detailed review of item generation, exploration, confirmation, and validation procedures.

    PubMed

    Barbopoulos, I; Johansson, L-O

    2017-08-01

    This data article offers a detailed description of analyses pertaining to the development of the Consumer Motivation Scale (CMS), from item generation and the extraction of factors, to confirmation of the factor structure and validation of the emergent dimensions. The established goal structure - consisting of the sub-goals Value for Money, Quality, Safety, Stimulation, Comfort, Ethics, and Social Acceptance - is shown to be related to a variety of consumption behaviors in different contexts and for different products, and should thereby prove useful in standard marketing research, as well as in the development of tailored marketing strategies, and the segmentation of consumer groups, settings, brands, and products.

  9. Evolving Improvements to TRMM Ground Validation Rainfall Estimates

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Kulie, M. S.; Marks, D. A.; Wolff, D. B.; Ferrier, B. S.; Amitai, E.; Silberstein, D. S.; Fisher, B. L.; Wang, J.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. Since the successful 1997 launch of the TRMM satellite, GV rainfall estimates have demonstrated systematic improvements directly related to improved radar and rain gauge data, modified science techniques, and software revisions. Improved rainfall estimates have resulted in higher quality GV rainfall products and subsequently, much improved evaluation products for the satellite-based precipitation estimates from TRMM. This presentation will demonstrate how TRMM GV rainfall products created in a semi-automated, operational environment have evolved and improved through successive generations. Monthly rainfall maps and rainfall accumulation statistics for each primary site will be presented for each stage of GV product development. Contributions from individual product modifications involving radar reflectivity (Ze)-rain rate (R) relationship refinements, improvements in rain gauge bulk-adjustment and data quality control processes, and improved radar and gauge data will be discussed. Finally, it will be demonstrated that as GV rainfall products have improved, rainfall estimation comparisons between GV and satellite have converged, lending confidence to the satellite-derived precipitation measurements from TRMM.
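
    For context, the reflectivity-rain rate (Ze-R) relationships these GV products refine are power laws of the form Z = a R^b. A minimal sketch using the classic Marshall-Palmer coefficients (a = 200, b = 1.6); actual TRMM GV relationships are site-tuned and differ from these values:

```python
# Hedged sketch of a Ze-R conversion of the kind TRMM GV refines.
# Uses the classic Marshall-Palmer power law Z = 200 * R**1.6; operational
# GV coefficients are tuned per site and rain regime.
a, b = 200.0, 1.6

def rain_rate_from_dbz(dbz):
    """Convert radar reflectivity in dBZ to rain rate R (mm/h) via Z = a * R**b."""
    z_linear = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity factor (mm^6/m^3)
    return (z_linear / a) ** (1.0 / b)   # invert the power law for R

rate = rain_rate_from_dbz(40.0)  # a moderate-rain reflectivity, roughly 11-12 mm/h
```

Refining the (a, b) pair against gauge data is one of the "Ze-R relationship refinements" the record describes.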

  10. Developing a tool for the preparation of GMP audit of pharmaceutical contract manufacturer.

    PubMed

    Linna, Anu; Korhonen, Mirka; Mannermaa, Jukka-Pekka; Airaksinen, Marja; Juppo, Anne Mari

    2008-06-01

    Outsourcing is growing rapidly in the pharmaceutical industry. When manufacturing activities are outsourced, control of the product's quality has to be maintained. One way to confirm a contract manufacturer's GMP (Good Manufacturing Practice) compliance is auditing. Audits can be supported, for instance, by GMP questionnaires. The objective of this study was to develop a tool for the audit preparation of pharmaceutical contract manufacturers and to validate its contents using the Delphi method. At this phase of the study the tool was developed for non-sterile finished-product contract manufacturers. A modified Delphi method was used with an expert panel consisting of 14 experts from the pharmaceutical industry, authorities and a university. The content validity of the developed tool was assessed by a Delphi questionnaire round, in which the response rate was 86%. The tool consisted of 103 quality items, of which 90 (87%) achieved the pre-defined agreement rate level (75%). The 13 quality items that did not achieve the pre-defined agreement rate were excluded from the tool. The expert panel suggested only minor changes to the tool. The results show that the content validity of the developed audit preparation tool was good.
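
    The item-screening arithmetic described above (retain a quality item only if expert agreement meets the pre-defined 75% threshold) can be sketched as follows; the item names and vote counts are hypothetical, only the panel size and threshold come from the record:

```python
# Hedged sketch of Delphi content-validity screening: keep an item only if its
# agreement rate (agreeing experts / panel size) meets the 75% threshold.
# Item names and vote counts are hypothetical illustrations.
n_experts = 14
threshold = 0.75

votes = {"premises_hygiene": 13, "batch_record_review": 11, "pest_control": 9}

agreement = {item: agree / n_experts for item, agree in votes.items()}
retained = [item for item, rate in agreement.items() if rate >= threshold]
# 13/14 and 11/14 clear the bar; 9/14 (~64%) is excluded, as 13 of the
# study's 103 items were.
```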

  11. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, from early steps in model development to the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time series at stations, vertical profiles, surface fields or along-track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, covers all parts of the validation process, ranging from read-in procedures for datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. the read-in procedures, forms a module in which all available functions of that particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. The functions are set up for certain validation tasks, and new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned to each validation task in user-specific settings, which are stored externally in so-called namelists and gather all information about the datasets used, as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. 
In this way the performance of any new product version is compared with that of the previous version. Although the toolbox has so far been tested mainly for the Baltic Sea, it can easily be adapted to different datasets and parameters, regardless of geographic region. In this presentation the usability of the toolbox is demonstrated along with several results of the validation process.
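
    The modular design described in this record (standardized reader functions per data source feeding one validation schedule that emits comparison metrics, with the choice driven by a namelist) can be sketched in miniature. The toolbox itself is MATLAB; the function names, namelist fields and metrics below are illustrative assumptions, not its actual interface:

```python
# Hedged sketch of a modular validation toolbox: one reader per data source,
# a metrics module, and a single "validation schedule" driven by a namelist.
# All names here are illustrative; the real toolbox is MATLAB (MyOcean-2).
import math

# --- "read-in" module: one function per data source, common return format ---
def read_station_timeseries(namelist):
    # Placeholder returning (model, observation) value pairs, e.g. SST in deg C
    return [(7.1, 7.0), (6.8, 7.2), (7.4, 7.3)]

READERS = {"station": read_station_timeseries}

# --- "metrics" module ---
def bias(pairs):
    return sum(m - o for m, o in pairs) / len(pairs)

def rmse(pairs):
    return math.sqrt(sum((m - o) ** 2 for m, o in pairs) / len(pairs))

# --- the single validation schedule: reader chosen via the namelist ---
def validate(namelist):
    pairs = READERS[namelist["source"]](namelist)
    return {"bias": bias(pairs), "rmse": rmse(pairs)}

result = validate({"source": "station", "path": "baltic_sst.nc"})
```

New data sources slot in by adding a reader to `READERS` without touching `validate`, mirroring the "new functions can be implemented into the appropriate module" design point.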

  12. Validation of antimicrobial interventions for small and very small processors: a how-to guide to develop and conduct validations

    USDA-ARS?s Scientific Manuscript database

    It is increasingly important to assure that antimicrobial interventions applied on/into foods to control pathogenic microorganisms are functioning properly and achieving the desired goal of preventing, reducing and/or eliminating microbial hazards associated with a defined food product. This approac...

  13. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    ERIC Educational Resources Information Center

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…

  14. Exploring the Validity of a Second Language Intercultural Pragmatics Assessment Tool

    ERIC Educational Resources Information Center

    Timpe-Laughlin, Veronika; Choi, Ikkyu

    2017-01-01

    Pragmatics has been a key component of language competence frameworks. While the majority of second/foreign language (L2) pragmatics tests have targeted productive skills, the assessment of receptive pragmatic skills remains a developing field. This study explores validation evidence for a test of receptive L2 pragmatic ability called the American…

  15. GOES-R L1b Readiness Implementation and Management Plan

    NASA Technical Reports Server (NTRS)

    Kunkee, David; Farley, Robert; Kwan, Betty; Walterscheid, Richard; Hecht, James; Claudepierre, Seth; De Luccia, Frank

    2017-01-01

    A complement of Readiness, Implementation and Management Plans (RIMPs) to facilitate management of post-launch product test activities for the official Geostationary Operational Environmental Satellite (GOES-R) Level 1b (L1b) products has been developed and documented. Separate plans have been created for each of the GOES-R sensors, including the Advanced Baseline Imager (ABI), the Extreme ultraviolet and X-ray Irradiance Sensors (EXIS), the Geostationary Lightning Mapper (GLM), the GOES-R Magnetometer (MAG), the Space Environment In-Situ Suite (SEISS), and the Solar Ultraviolet Imager (SUVI). The GOES-R program has implemented these RIMPs in order to address the full scope of CalVal activities required for a successful demonstration of GOES-R L1b data product quality throughout the three validation stages: Beta, Provisional, and Full Validation. For each product maturity level, the RIMPs include specific performance criteria and required artifacts that provide evidence a given validation stage has been reached, the timing when each stage will be complete, a description of every applicable Post-Launch Product Test (PLPT), roles and responsibilities of personnel, upstream dependencies, and analysis methods and tools to be employed during validation. Instrument-level Post-Launch Tests (PLTs) are also referenced and apply primarily to functional check-out of the instruments.

  16. Do placebo based validation standards mimic real batch products behaviour? Case studies.

    PubMed

    Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E

    2011-06-01

    Analytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability that comes from the manufacturing process. The question that remains at the end of the validation step concerns the transferability of quantitative performance from validation standards to real, authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels, as well as using samples coming from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were various degrees of difference in the accuracy of the results obtained with the two types of samples. Nonetheless, the use of spiked placebo validation standards was shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples into the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve, and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.
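    The accuracy comparison described in this record can be sketched numerically: for each sample type, replicate recoveries against the nominal content yield a relative bias and a repeatability RSD, which can then be compared between spiked placebo standards and authentic batch samples. The function and all replicate values below are invented for illustration; they are not the study's data.

```python
from statistics import mean, stdev

def accuracy_profile(measured, nominal):
    """Relative bias (%) and repeatability RSD (%) at one concentration level."""
    recoveries = [100.0 * m / nominal for m in measured]
    rel_bias = mean(recoveries) - 100.0
    rsd = 100.0 * stdev(recoveries) / mean(recoveries)
    return rel_bias, rsd

# Hypothetical replicate results (mg per unit) at a 100 mg nominal level:
# spiked placebo standards vs. authentic batch samples, the latter carrying
# extra variability from the manufacturing process.
placebo_spikes = [99.1, 100.4, 98.7, 101.2, 99.8]
authentic_batch = [97.2, 98.9, 96.5, 99.4, 97.8]

for name, data in [("placebo", placebo_spikes), ("batch", authentic_batch)]:
    bias, rsd = accuracy_profile(data, 100.0)
    print(f"{name}: relative bias = {bias:+.1f}%, RSD = {rsd:.1f}%")
```

    A gap between the two bias estimates is exactly the transferability question the study raises: performance on placebo standards need not match performance on real batches.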

  17. Federal COBOL Compiler Testing Service Compiler Validation Request Information.

    DTIC Science & Technology

    1977-05-09

    background of the Federal COBOL Compiler Testing Service which was set up by a memorandum of agreement between the National Bureau of Standards and the...Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products...produced by the software Development Division in support of the FCCTS as well as the Validation Summary Reports produced as a result of discharging the

  18. Biomimicry in Product Design through Materials Selection and Computer Aided Engineering

    NASA Astrophysics Data System (ADS)

    Alexandridis, G.; Tzetzis, D.; Kyratsis, P.

    2016-11-01

    The aim of this study is to demonstrate a 7-step methodology that describes the way nature can act as a source of inspiration for the design and development of a product. Furthermore, it suggests specialized computer tools and methods for optimizing the product with regard to its environmental impact, i.e. material selection and production methods. For validation purposes, a garden chaise lounge that imitates the form of a scorpion was developed as a case study to demonstrate the methodology.

  19. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
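    The Method of Manufactured Solutions mentioned above can be illustrated on a toy operator: choose an exact solution, derive the source term analytically, and confirm under grid refinement that the discretization error shrinks at the formal order of accuracy. The sketch below verifies a second-order central difference on u(x) = sin(x); it is a minimal illustration of the technique, not LAVA itself.

```python
import math

def residual_norm(n):
    """Max truncation residual of the second-order central difference applied
    to the manufactured solution u(x) = sin(x) of -u'' = sin(x) on [0, pi]."""
    h = math.pi / n
    u = [math.sin(i * h) for i in range(n + 1)]
    r = 0.0
    for i in range(1, n):
        d2u = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / (h * h)
        r = max(r, abs(-d2u - math.sin(i * h)))
    return r

# Under grid refinement the observed order should approach the formal order, 2.
e_coarse, e_fine = residual_norm(32), residual_norm(64)
print(f"observed order = {math.log2(e_coarse / e_fine):.2f}")
```

    An observed order matching the scheme's formal order is the usual MMS pass criterion; a mismatch signals a coding or consistency error in the discrete operator.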

  20. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
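    The Monte Carlo use of an integrated process model can be sketched as follows: draw process parameters from their distributions, propagate them through transfer functions for each stacked unit operation, and count how often the final CQA violates its specification. All distributions, transfer functions, and the 25 ppm specification below are hypothetical illustrations, not values from the study.

```python
import random

random.seed(42)

def simulate_batch():
    """One virtual batch through two stacked unit operations (hypothetical
    transfer functions and parameter distributions, for illustration only)."""
    # Unit operation 1: fermentation. Host cell protein (HCP) load rises with titer.
    titer = random.gauss(5.0, 0.4)                 # g/L
    hcp = 2000.0 + 150.0 * (titer - 5.0)           # ppm
    # Unit operation 2: capture chromatography. Clearance drops with load density.
    load = random.gauss(30.0, 2.0)                 # g protein per L resin
    log_clearance = 2.0 - 0.02 * (load - 30.0)
    return hcp / 10.0 ** log_clearance             # final HCP (ppm) in drug substance

runs = [simulate_batch() for _ in range(20000)]
oos = sum(r > 25.0 for r in runs) / len(runs)      # hypothetical spec: <= 25 ppm HCP
print(f"predicted OOS probability: {oos:.2%}")
```

    Because the two unit operations are chained, the simulated OOS rate reflects interactions between their parameters — the point of modeling the process as an integrated whole rather than unit by unit.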

  1. Validation of a novel sequential cultivation method for the production of enzymatic cocktails from Trichoderma strains.

    PubMed

    Florencio, C; Cunha, F M; Badino, A C; Farinas, C S

    2015-02-01

    The development of new cost-effective bioprocesses for the production of cellulolytic enzymes is needed in order to ensure that the conversion of biomass becomes economically viable. The aim of this study was to determine whether a novel sequential solid-state and submerged fermentation method (SF) could be validated for different strains of the Trichoderma genus. Cultivation of the Trichoderma reesei Rut-C30 reference strain under SF using sugarcane bagasse as substrate was shown to be favorable for endoglucanase (EGase) production, resulting in up to 4.2-fold improvement compared with conventional submerged fermentation. Characterization of the enzymes in terms of the optimum pH and temperature for EGase activity and comparison of the hydrolysis profiles obtained using a synthetic substrate did not reveal any qualitative differences among the different cultivation conditions investigated. However, the thermostability of the EGase was influenced by the type of carbon source and cultivation system. All three strains of Trichoderma tested (T. reesei Rut-C30, Trichoderma harzianum, and Trichoderma sp INPA 666) achieved higher enzymatic productivity when cultivated under SF, hence validating the proposed SF method for use with different Trichoderma strains. The results suggest that this bioprocess configuration is a very promising development for the cellulosic biofuels industry.

  2. NASA sea ice and snow validation plan for the Defense Meteorological Satellite Program special sensor microwave/imager

    NASA Technical Reports Server (NTRS)

    Cavalieri, Donald J. (Editor); Swift, Calvin T. (Editor)

    1987-01-01

    This document addresses the task of developing and executing a plan for validating the algorithm used for initial processing of sea ice data from the Special Sensor Microwave/Imager (SSMI). The document outlines a plan for monitoring the performance of the SSMI, for validating the derived sea ice parameters, and for providing quality data products before distribution to the research community. Because of recent advances in the application of passive microwave remote sensing to snow cover on land, the validation of snow algorithms is also addressed.

  3. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-02-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in-situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecast System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  4. MULTIPLY: Development of a European HSRL Airborne Facility

    NASA Astrophysics Data System (ADS)

    Binietoglou, Ioannis; Serikov, Ilya; Nicolae, Doina; Amiridis, Vassillis; Belegante, Livio; Boscornea, Andrea; Brugmann, Bjorn; Costa Suros, Montserrat; Hellmann, David; Kokkalis, Panagiotis; Linne, Holger; Stachlewska, Iwona; Vajaiac, Sorin-Nicolae

    2016-08-01

    MULTIPLY is a novel airborne high spectral resolution lidar (HSRL) currently under development by a consortium of European institutions from Romania, Germany, Greece, and Poland. Its aim is to contribute to the calibration and validation activities of upcoming ESA aerosol-sensing missions such as ADM-Aeolus, EarthCARE, and Sentinel-3/-4/-5/-5P, which include products related to atmospheric aerosols. The effectiveness of these missions depends on independent airborne measurements to develop and test the retrieval methods, and to validate mission products following launch. The aim of ESA's MULTIPLY project is to design, develop, and test a multi-wavelength depolarization HSRL for airborne applications. The MULTIPLY lidar will deliver aerosol extinction and backscatter coefficient profiles at three wavelengths (355 nm, 532 nm, 1064 nm), as well as profiles of aerosol intensive parameters (Ångström exponents, extinction-to-backscatter ratios, and linear particle depolarization ratios).

  5. Smart Aquarium as Physics Learning Media for Renewable Energy

    NASA Astrophysics Data System (ADS)

    Desnita, D.; Raihanati, R.; Susanti, D.

    2018-04-01

    A smart aquarium has been developed as a learning medium to visualize a Micro Hydro Power Generator (MHPG). It uses an aquarium water circulation system and a Wind Power Generator (WPG) driven through a wheel as its source. It is also used to teach energy conversion, circular motion and wheel coupling, electromagnetic effects, and AC power circuits. The output power and system efficiency are adjusted by varying the water level and wind speed. The specific targets of this research are to: (i) develop green aquarium technology suitable for use as a medium of physics learning, and (ii) improve the quality of the learning process and outcomes for senior high school students. The research followed the development method of Borg and Gall, which includes preliminary studies, design, product development, expert validation, product feasibility testing, and finalization. Expert validation established that the apparatus is feasible for use, and limited trials showed that it can improve students' science process skills.

  6. Development and validation of an affinity chromatography step using a peptide ligand for cGMP production of factor VIII.

    PubMed

    Kelley, Brian D; Tannatt, Molly; Magnusson, Robert; Hagelberg, Sigrid; Booth, James

    2004-08-05

    An affinity chromatography step was developed for purification of recombinant B-Domain Deleted Factor VIII (BDDrFVIII) using a peptide ligand selected from a phage display library. The peptide library had variegated residues both within a disulfide bond-constrained ring and flanking the ring. The peptide ligand binds to BDDrFVIII with a dissociation constant of approximately 1 microM, both in free solution and when immobilized on a chromatographic resin. The peptide is chemically synthesized and the affinity resin is produced by coupling the peptide to an agarose matrix preactivated with N-hydroxysuccinimide. Coupling conditions were optimized to give consistent and complete ligand incorporation and validated with a robustness study that tested various combinations of processing limits. The peptide affinity chromatographic operation employs conditions very similar to an immunoaffinity chromatography step currently in use for BDDrFVIII manufacture. The process step provides excellent recovery of BDDrFVIII from a complex feed stream and reduces host cell protein and DNA by 3-4 logs. Process validation studies established resin reuse over 26 cycles without changes in product recovery or purity. A robustness study using a factorial design was performed and showed that the step was insensitive to small changes in process conditions that represent normal variation in commercial manufacturing. A scaled-down model of the process step was qualified and used for virus removal studies. A validation package addressing the safety of the leached peptide included leaching rate measurements under process conditions, testing of peptide levels in product pools, demonstration of robust removal downstream by spiking studies, end product testing, and toxicological profiling of the ligand. The peptide ligand affinity step was scaled up for cGMP production of BDDrFVIII for clinical trials.

  7. Development of a new analytical tool for assessing the mutagen 2-methyl-1,4-dinitro-pyrrole in meat products by LC-ESI-MS/MS.

    PubMed

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano

    2018-08-01

    The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance with Commission Decision 2002/657/EC, and method applicability was checked in several samples of meat products. A simple procedure, with low-temperature partitioning solid-liquid extraction, was developed. Nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as the stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg kg⁻¹, respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.
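    Quantitation in a validated LC-MS/MS method of this kind typically rests on an ordinary least-squares calibration curve with back-calculation of unknown samples from their instrument response. The calibration levels and responses below are invented numbers for illustration, not data from the paper.

```python
from statistics import mean

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    mx, my = mean(x), mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical DNMP calibration: concentration (ug/kg) vs. peak-area ratio
# against an internal standard.
conc = [10, 25, 50, 100, 150]
response = [0.21, 0.52, 1.01, 2.03, 3.02]
slope, intercept = fit_line(conc, response)

# Back-calculate an unknown sample from its measured peak-area ratio.
unknown_response = 1.36
print(f"back-calculated: {(unknown_response - intercept) / slope:.0f} ug/kg")
```

    In a 2002/657/EC-style validation the same back-calculation on spiked samples also feeds the trueness and precision estimates at each fortification level.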

  8. A guided search genetic algorithm using mined rules for optimal affective product design

    NASA Astrophysics Data System (ADS)

    Fung, Chris K. Y.; Kwong, C. K.; Chan, Kit Yan; Jiang, H.

    2014-08-01

    Affective design is an important aspect of new product development, especially for consumer products, to achieve a competitive edge in the marketplace. It can help companies to develop new products that can better satisfy the emotional needs of customers. However, product designers usually encounter difficulties in determining the optimal settings of the design attributes for affective design. In this article, a novel guided search genetic algorithm (GA) approach is proposed to determine the optimal design attribute settings for affective design. The optimization model formulated based on the proposed approach applied constraints and guided search operators, which were formulated based on mined rules, to guide the GA search and to achieve desirable solutions. A case study on the affective design of mobile phones was conducted to illustrate the proposed approach and validate its effectiveness. Validation tests were conducted, and the results show that the guided search GA approach outperforms the GA approach without the guided search strategy in terms of GA convergence and computational time. In addition, the guided search optimization model is capable of improving GA to generate good solutions for affective design.
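    One common way to realize a "guided search" of the kind the article describes is to apply mined rules as a repair operator on offspring, so that the GA explores only rule-consistent attribute settings. The attribute levels, fitness weights, and the single IF-THEN rule below are invented for illustration; the paper's actual model and operators are more elaborate.

```python
import random

random.seed(7)

N_ATTRS, N_LEVELS = 3, 3   # e.g. three mobile-phone design attributes, three levels each

def fitness(chrom):
    """Hypothetical affective-utility score for one setting vector."""
    weights = [(3, 1, 2), (1, 3, 1), (2, 2, 3)]
    return sum(weights[i][level] for i, level in enumerate(chrom))

def repair(chrom):
    """Guided-search operator enforcing a mined rule:
    IF attribute 0 is at level 2 THEN attribute 1 must be at level >= 1."""
    if chrom[0] == 2 and chrom[1] < 1:
        chrom[1] = 1
    return chrom

def evolve(pop_size=20, gens=30):
    pop = [[random.randrange(N_LEVELS) for _ in range(N_ATTRS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]                 # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_ATTRS)      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # mutation
                child[random.randrange(N_ATTRS)] = random.randrange(N_LEVELS)
            children.append(repair(child))          # guided repair step
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

    Repairing offspring instead of discarding them keeps every evaluation inside the feasible region, which is one mechanism behind the faster convergence the validation tests report.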

  9. WHO Expert Committee on Specifications for Pharmaceutical Preparations. Forty-ninth report.

    PubMed

    2015-01-01

    The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use. Revised procedure for the development of monographs and other texts for The International Pharmacopoeia; Revised updating mechanism for the section on radiopharmaceuticals in The International Pharmacopoeia; Revision of the supplementary guidelines on good manufacturing practices: validation, Appendix 7: non-sterile process validation; General guidance for inspectors on hold-time studies; 16 technical supplements to Model guidance for the storage and transport of time- and temperature-sensitive pharmaceutical products; Recommendations for quality requirements when plant-derived artemisinin is used as a starting material in the production of antimalarial active pharmaceutical ingredients; Multisource (generic) pharmaceutical products: guidelines on registration requirements to establish interchangeability: revision; Guidance on the selection of comparator pharmaceutical products for equivalence assessment of interchangeable multisource (generic) products: revision; and Good review practices: guidelines for national and regional regulatory authorities.

  10. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and the Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS), and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud, and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation.
The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.

  11. Guided Inquiry Facilitated Blended Learning to Improve Metacognitive and Learning Outcome of High School Students

    NASA Astrophysics Data System (ADS)

    Suwono, H.; Susanti, S.; Lestari, U.

    2017-04-01

    Learning activities that involve students actively are one characteristic of quality education, and guided inquiry is a learning strategy that engages students in active learning. Key challenges in learning today are developing metacognitive skills and cognitive learning outcomes. This study is the research and development of a learning module using the 4D model of Thiagarajan. The first phase is Define, which analyses the problems and needs prior to preparation of the module. The second phase is Design, which formulates the learning design and devices to obtain an initial draft of the learning module. The third stage is Develop, which comprises developing and writing the module, module validation, product testing, and revision, resulting in the final module. The fourth stage is Disseminate, in which the validated product is distributed. The module was validated by education experts, practitioners, subject matter experts, and an expert in online media. The validation results indicated that the module was valid and could be used in teaching and learning. In the testing phase of validation, an experiment was conducted to determine the differences in metacognitive skills and learning outcomes between the control group and the experimental group. The experimental design was a one-group pretest-posttest design. Data analysis showed that the module could enhance metacognitive skills and learning outcomes. The advantages of this module are as follows: 1) the module is accompanied by a video link on a website containing practical activities appropriate to Curriculum 2013; 2) the module is accompanied by a video link on a website containing the manual laboratory activities to be used in face-to-face classes, so that students are prepared when doing laboratory work; 3) the module can be used online through chat to increase students' understanding.
The disadvantage of this module is that the material presented is limited. For better utilisation of the online activities, it is suggested that students be present at every meeting so that all students participate actively, and that schools set up facilities to support blended learning.

  12. Monitoring Progress in Vocal Development in Young Cochlear Implant Recipients: Relationships between Speech Samples and Scores from the Conditioned Assessment of Speech Production (CASP)

    PubMed Central

    Ertmer, David J.; Jung, Jongmin

    2012-01-01

    Background Evidence of auditory-guided speech development can be heard as the prelinguistic vocalizations of young cochlear implant recipients become increasingly complex, phonetically diverse, and speech-like. In research settings, these changes are most often documented by collecting and analyzing speech samples. Sampling, however, may be too time-consuming and impractical for widespread use in clinical settings. The Conditioned Assessment of Speech Production (CASP; Ertmer & Stoel-Gammon, 2008) is an easily administered and time-efficient alternative to speech sample analysis. The current investigation examined the concurrent validity of the CASP and data obtained from speech samples recorded at the same intervals. Methods Nineteen deaf children who received CIs before their third birthdays participated in the study. Speech samples and CASP scores were gathered at 6, 12, 18, and 24 months post-activation. Correlation analyses were conducted to assess the concurrent validity of CASP scores and data from samples. Results CASP scores showed strong concurrent validity with scores from speech samples gathered across all recording sessions (6 – 24 months). Conclusions The CASP was found to be a valid, reliable, and time-efficient tool for assessing progress in vocal development during young CI recipients' first 2 years of device experience. PMID:22628109
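    Concurrent validity of the kind reported here is typically quantified with a Pearson product-moment correlation between paired scores from the two instruments. The score lists below are fabricated for illustration only; they are not the study's data.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired scores for the same children at one recording interval:
# CASP totals vs. a summary measure derived from a speech sample.
casp_scores = [4, 7, 9, 12, 15, 18, 21, 25]
sample_scores = [10, 14, 20, 22, 30, 33, 41, 44]
print(f"r = {pearson_r(casp_scores, sample_scores):.2f}")
```

    A strong positive r between the quick measure and the labor-intensive one is what supports substituting the former in clinical settings.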

  13. Calibration and Validation Plan for the L2A Processor and Products of the SENTINEL-2 Mission

    NASA Astrophysics Data System (ADS)

    Main-Knorn, M.; Pflug, B.; Debaecker, V.; Louis, J.

    2015-04-01

    The Copernicus programme is a European initiative for the implementation of information services based on observation data received from Earth Observation (EO) satellites and ground-based information. In the frame of this programme, ESA is developing the Sentinel-2 optical imaging mission, which will deliver optical data products designed to feed downstream services mainly related to land monitoring, emergency management and security. To ensure the highest quality of service, ESA has set up the Sentinel-2 Mission Performance Centre (MPC), in charge of the overall performance monitoring of the Sentinel-2 mission. TPZ F and DLR have teamed up in order to provide the best added-value support to the MPC for calibration and validation of the Level-2A processor (Sen2Cor) and products. This paper gives an overview of the planned L2A calibration and validation activities. Level-2A processing is applied to Top-Of-Atmosphere (TOA) Level-1C ortho-image reflectance products. The main Level-2A output is the Bottom-Of-Atmosphere (BOA) corrected reflectance product. Additional outputs are an Aerosol Optical Thickness (AOT) map, a Water Vapour (WV) map, and a Scene Classification (SC) map with quality indicators for cloud and snow probabilities. The Level-2A BOA, AOT and WV outputs are calibrated and validated using ground-based data from automatically operating stations and data from in-situ campaigns. Scene classification is validated by visual inspection of test datasets and cross-sensor comparison, supplemented by meteorological data where available. Contributions from external in-situ campaigns would enlarge the reference dataset and enable an extended validation exercise; we therefore welcome external contributors.

  14. The NASA Soil Moisture Active Passive (SMAP) Mission - Science and Data Product Development Status

    NASA Technical Reports Server (NTRS)

    Njoku, E.; Entekhabi, D.; O'Neill, P.

    2012-01-01

    The Soil Moisture Active Passive (SMAP) mission, planned for launch in late 2014, has the objective of frequent, global mapping of near-surface soil moisture and its freeze-thaw state. The SMAP measurement system utilizes an L-band radar and radiometer sharing a rotating 6-meter mesh reflector antenna. The instruments will operate on a spacecraft in a 685 km polar orbit with 6am/6pm nodal crossings, viewing the surface at a constant 40-degree incidence angle with a 1000-km swath width, providing 3-day global coverage. Data from the instruments will yield global maps of soil moisture and freeze/thaw state at 10 km and 3 km resolutions, respectively, every two to three days. The 10-km soil moisture product will be generated using a combined radar and radiometer retrieval algorithm. SMAP will also provide a radiometer-only soil moisture product at 40-km spatial resolution and a radar-only soil moisture product at 3-km resolution. The relative accuracies of these products will vary regionally and will depend on surface characteristics such as vegetation water content, vegetation type, surface roughness, and landscape heterogeneity. The SMAP soil moisture and freeze/thaw measurements will enable significantly improved estimates of the fluxes of water, energy and carbon between the land and atmosphere. Soil moisture and freeze/thaw controls of these fluxes are key factors in the performance of models used for weather and climate predictions and for quantifying the global carbon balance. Soil moisture measurements are also of importance in modeling and predicting extreme events such as floods and droughts. The algorithms and data products for SMAP are being developed in the SMAP Science Data System (SDS) Testbed. In the Testbed, algorithms are developed and evaluated using simulated SMAP observations as well as observational data from current airborne and spaceborne L-band sensors, including data from the SMOS and Aquarius missions. 
We report here on the development status of the SMAP data products. The Testbed simulations are designed to capture various sources of errors in the products, including environmental effects, instrument effects (nonideal aspects of the measurement system), and retrieval algorithm errors. The SMAP project has developed a Calibration and Validation (Cal/Val) Plan that is designed to support algorithm development (pre-launch) and data product validation (post-launch). A key component of the Cal/Val Plan is the identification, characterization, and instrumentation of sites that can be used to calibrate and validate the sensor data (Level 1) and derived geophysical products (Level 2 and higher).

  15. Validation study and routine control monitoring of moist heat sterilization procedures.

    PubMed

    Shintani, Hideharu

    2012-06-01

    The proposed approach to validation of steam sterilization in autoclaves follows the basic life cycle concepts applicable to all validation programs: understand the function of the sterilization process, develop and understand the cycles that carry out the process, and define a suitable test or series of tests to confirm that the function of the process is suitably ensured by the structure provided. Sterilization of product, and of components and parts that come in direct contact with sterilized product, is the most critical of pharmaceutical processes. Consequently, this process requires the most rigorous and detailed approach to validation. An understanding of the process requires a basic understanding of microbial death, the parameters that facilitate that death, the accepted definition of sterility, and the relationship between that definition and the sterilization parameters. Autoclaves and support systems need to be designed, installed, and qualified in a manner that ensures their continued reliability. Lastly, the test program must be complete and definitive. In this paper, in addition to the validation study, documentation of IQ, OQ, and PQ is described in concrete terms.
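    The relationship between microbial death and the sterilization parameters mentioned above is conventionally quantified through accumulated lethality (F0). A minimal sketch, assuming the standard reference temperature of 121.1 °C and a z-value of 10 °C (conventions not stated in the abstract):

```python
def f0_lethality(temps_c, interval_min, t_ref=121.1, z=10.0):
    """Accumulated lethality F0 (equivalent minutes at t_ref) from a
    temperature profile sampled at fixed intervals.

    temps_c:      product temperatures in deg C, one per sample
    interval_min: sampling interval in minutes
    """
    return sum(interval_min * 10 ** ((t - t_ref) / z) for t in temps_c)

# A cycle holding exactly 121.1 degC for 15 one-minute samples
# accumulates F0 = 15 equivalent minutes.
profile = [121.1] * 15
print(round(f0_lethality(profile, 1.0), 2))  # 15.0
```

    Temperatures below the reference contribute exponentially less lethality (10 °C lower contributes one tenth per minute), which is why cycle qualification focuses on the coldest location in the load.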

  16. A Premiere example of the illusion of harm reduction cigarettes in the 1990s.

    PubMed

    Pollay, R W; Dewhirst, T

    2003-09-01

    To use the product launch of Player's Premiere as a case study for understanding the new cigarette product development process during the 1990s. We determine the (in)validity of industry claims that: (1) development of the physical product preceded the promotional promise of "less irritation"; (2) "less irritation" was actually realised; (3) advertising informed consumers; and (4) advertising regulations caused the product's failure in the marketplace. Court proceedings assessing the constitutionality of Canada's Tobacco Act, which substantially restricts cigarette advertising. The 2002 Quebec Superior Court trial yielded a new collection of internal documents from Imperial Tobacco Ltd (ITL), including several about the development and marketing of Player's Premiere. Trial testimony and corporate documents were reviewed to determine the validity of the industry representations about the new cigarette product development process, focusing on the case history of Player's Premiere. In direct contradiction to industry testimony, the documentary evidence demonstrates that (1) communications for Player's Premiere, which claimed less irritation, were developed long before finding a product that could deliver on the promise; (2) ITL did not sell a "less irritating" product that matched its promotional promise; (3) the advertising and other communications for Player's Premiere were extensive, relying on the hi-tech appearances ("tangible credibility") of a "unique" filter, yet were uninformative and vague; and (4) Player's Premiere failed in the marketplace, despite extensive advertising and retail support, because it was an inferior product that did not live up to its promotional promise, not because of regulation of commercial speech. New product development entails extensive consumer research to craft all communications tools in fine detail. 
In the case of Player's Premiere, this crafting created a false and misleading impression of technological advances producing a "less irritating" cigarette. This product was solely a massive marketing ploy with neither consumer benefits, nor public health benefits. The industry attempted to deceive both consumers and the court.

  17. A Premiere example of the illusion of harm reduction cigarettes in the 1990s

    PubMed Central

    Pollay, R; Dewhirst, T

    2003-01-01

    Objective: To use the product launch of Player's Premiere as a case study for understanding the new cigarette product development process during the 1990s. We determine the (in)validity of industry claims that: (1) development of the physical product preceded the promotional promise of "less irritation"; (2) "less irritation" was actually realised; (3) advertising informed consumers; and (4) advertising regulations caused the product's failure in the marketplace. Setting: Court proceedings assessing the constitutionality of Canada's Tobacco Act, which substantially restricts cigarette advertising. The 2002 Quebec Superior Court trial yielded a new collection of internal documents from Imperial Tobacco Ltd (ITL), including several about the development and marketing of Player's Premiere. Method: Trial testimony and corporate documents were reviewed to determine the validity of the industry representations about the new cigarette product development process, focusing on the case history of Player's Premiere. Results: In direct contradiction to industry testimony, the documentary evidence demonstrates that (1) communications for Player's Premiere, which claimed less irritation, were developed long before finding a product that could deliver on the promise; (2) ITL did not sell a "less irritating" product that matched its promotional promise; (3) the advertising and other communications for Player's Premiere were extensive, relying on the hi-tech appearances ("tangible credibility") of a "unique" filter, yet were uninformative and vague; and (4) Player's Premiere failed in the marketplace, despite extensive advertising and retail support, because it was an inferior product that did not live up to its promotional promise, not because of regulation of commercial speech. Conclusions: New product development entails extensive consumer research to craft all communications tools in fine detail. 
In the case of Player's Premiere, this crafting created a false and misleading impression of technological advances producing a "less irritating" cigarette. This product was solely a massive marketing ploy with neither consumer benefits, nor public health benefits. The industry attempted to deceive both consumers and the court. PMID:12958396

  18. In-vitro Equilibrium Phosphate Binding Study of Sevelamer Carbonate by UV-Vis Spectrophotometry.

    PubMed

    Prasaja, Budi; Syabani, M Maulana; Sari, Endah; Chilmi, Uci; Cahyaningsih, Prawitasari; Kosasih, Theresia Weliana

    2018-06-12

    Sevelamer carbonate is a cross-linked polymeric amine; it is the active ingredient in Renvela® tablets. The US FDA provides a recommendation for demonstrating bioequivalence in the development of a generic sevelamer carbonate product using an in-vitro equilibrium binding study. A simple UV-vis spectrophotometry method was developed and validated for quantification of free phosphate to determine the binding parameter constants of sevelamer. The method validation demonstrated the specificity, limit of quantification, accuracy, and precision of the measurements. The validated method has been successfully used to analyze samples in an in-vitro equilibrium binding study for demonstrating bioequivalence. © Georg Thieme Verlag KG Stuttgart · New York.
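    Equilibrium phosphate-binding studies of this kind are typically reduced to Langmuir binding constants (k1, binding capacity; k2, affinity) estimated from the measured free-phosphate concentrations. A hypothetical sketch using the reciprocal linearization of the Langmuir isotherm (the data and variable names are illustrative, not from the paper):

```python
def langmuir_fit(c, q):
    """Estimate Langmuir constants from equilibrium binding data by linear
    regression on the reciprocal form  1/q = 1/k1 + 1/(k1*k2*c).

    c: free phosphate concentrations at equilibrium (e.g. mmol/L)
    q: amount bound per gram of binder (e.g. mmol/g)
    Returns (k1, k2): binding capacity and affinity constant.
    """
    xs = [1.0 / ci for ci in c]
    ys = [1.0 / qi for qi in q]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    k1 = 1.0 / intercept       # capacity: intercept of 1/q vs 1/c is 1/k1
    k2 = intercept / slope     # affinity: slope is 1/(k1*k2)
    return k1, k2

# Synthetic data generated from k1 = 6, k2 = 0.5; the fit recovers them.
k1, k2 = langmuir_fit([1.0, 2.0, 4.0, 8.0], [2.0, 3.0, 4.0, 4.8])
```

    In practice the free-phosphate values feeding this fit would come from the validated UV-vis quantification described in the abstract.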

  19. The Effectiveness of Guided Inquiry-based Learning Material on Students’ Science Literacy Skills

    NASA Astrophysics Data System (ADS)

    Aulia, E. V.; Poedjiastoeti, S.; Agustini, R.

    2018-01-01

    The purpose of this research is to describe the effectiveness of guided inquiry-based learning material in improving students’ science literacy skills on solubility and solubility product concepts. This study used a Research and Development (R&D) design and was implemented with the 11th graders of Muhammadiyah 4 Senior High School Surabaya in the 2016/2017 academic year, with a one-group pre-test and post-test design. The data collection techniques used were validation, observation, tests, and questionnaires. The results showed that the students’ science literacy skills differed after implementation of the guided inquiry-based learning material. The guided inquiry-based learning material is effective in improving students’ science literacy skills on solubility and solubility product concepts, obtaining N-gain scores in the medium and high categories. This improvement arose because the developed learning materials, such as the lesson plan, student worksheet, and science literacy skill tests, were categorized as valid and very valid. In addition, each of the learning phases in the lesson plan was well implemented. Therefore, it can be concluded that the guided inquiry-based learning material is effective in improving students’ science literacy skills on solubility and solubility product concepts in senior high school.
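    The N-gain categories reported here are conventionally computed with Hake's normalized gain; a minimal sketch (the 0.3 and 0.7 category thresholds are the usual conventions, assumed rather than stated in the abstract):

```python
def n_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: (post - pre) / (max - pre)."""
    return (post - pre) / (max_score - pre)

def category(g):
    """Conventional N-gain bands: high >= 0.7, medium >= 0.3, else low."""
    if g >= 0.7:
        return "high"
    if g >= 0.3:
        return "medium"
    return "low"

# Illustrative scores: pre-test 40, post-test 82, out of 100.
g = n_gain(pre=40, post=82)
print(round(g, 2), category(g))  # 0.7 high
```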

  20. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  1. Validation plays the role of a "bridge" in connecting remote sensing research and applications

    NASA Astrophysics Data System (ADS)

    Wang, Zhiqiang; Deng, Ying; Fan, Yida

    2018-07-01

    Remote sensing products contribute to improving earth observations over space and time. Uncertainties exist in products of different levels; thus, validation of these products before and during their applications is critical. This study discusses the meaning of validation in depth and proposes a new definition of reliability for use with such products. In this context, validation should include three aspects: a description of the relevant uncertainties, quantitative measurement results and a qualitative judgment that considers the needs of users. A literature overview is then presented evidencing improvements in the concepts associated with validation. It shows that the root mean squared error (RMSE) is widely used to express accuracy; increasing numbers of remote sensing products have been validated; research institutes contribute most validation efforts; and sufficient validation studies encourage the application of remote sensing products. Validation plays a connecting role in the distribution and application of remote sensing products. Validation connects simple remote sensing subjects with other disciplines, and it connects primary research with practical applications. Based on the above findings, it is suggested that validation efforts that include wider cooperation among research institutes and full consideration of the needs of users should be promoted.
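    The RMSE that the review identifies as the dominant accuracy metric compares retrieved product values against reference (e.g. in situ) values; a minimal sketch with hypothetical soil-moisture numbers:

```python
import math

def rmse(retrieved, reference):
    """Root mean squared error between product values and reference values."""
    return math.sqrt(
        sum((r - t) ** 2 for r, t in zip(retrieved, reference))
        / len(retrieved)
    )

# Hypothetical retrievals vs. ground truth (volumetric soil moisture, m^3/m^3)
print(round(rmse([0.22, 0.31, 0.18], [0.20, 0.28, 0.21]), 3))  # 0.027
```

    As the review notes, a quantitative figure like this only becomes a validation result when paired with a description of the uncertainties in both datasets and a judgment against user requirements.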

  2. Prototype of NASA's Global Precipitation Measurement Mission Ground Validation System

    NASA Technical Reports Server (NTRS)

    Schwaller, M. R.; Morris, K. R.; Petersen, W. A.

    2007-01-01

    NASA is developing a Ground Validation System (GVS) as one of its contributions to the Global Precipitation Mission (GPM). The GPM GVS provides an independent means for evaluation, diagnosis, and ultimately improvement of GPM spaceborne measurements and precipitation products. NASA's GPM GVS consists of three elements: field campaigns/physical validation, direct network validation, and modeling and simulation. The GVS prototype of direct network validation compares Tropical Rainfall Measuring Mission (TRMM) satellite-borne radar data to similar measurements from the U.S. national network of operational weather radars. A prototype field campaign has also been conducted; modeling and simulation prototypes are under consideration.

  3. Functional and Behavioral Product Information Representation and Consistency Validation for Collaboration in Product Lifecycle Activities

    ERIC Educational Resources Information Center

    Baysal, Mehmet Murat

    2012-01-01

    Information models that represent the function, assembly and behavior of artifacts are critical in the conceptual development of a product and its evaluation. Much research has been conducted in this area; however, existing models do not relate function, behavior and structure in a comprehensive and consistent way. In this work, NIST's Core…

  4. Identification student’s misconception of heat and temperature using three-tier diagnostic test

    NASA Astrophysics Data System (ADS)

    Suliyanah; Putri, H. N. P. A.; Rohmawati, L.

    2018-03-01

    The objective of this research is to develop a Three-Tier Diagnostic Test (TTDT) to identify students’ misconceptions of heat and temperature. The stages of development were: analysis, planning, design, development, evaluation, and revision. The results of this study show that (1) the quality of the three-tier diagnostic test instrument developed is good, with the following details: (a) internal validity of 88.19%, in the valid category; (b) external validity, from an empirical construct validity test using the Pearson product-moment correlation, of 0.43, with the test yielding 6.1% false positives and 5.9% false negatives, so the instrument was judged valid; (c) test reliability, by Cronbach’s alpha, of 0.98, which is acceptable; (d) a difficulty level of 80%, indicating a fairly difficult test. (2) Based on the second test, student misconceptions on the heat and temperature material were highest at 84% and lowest at 21%, with 7% of students holding no misconception. (3) The most frequent cause of misconception among students was associative thinking (22%) and the least frequent was incomplete reasoning (11%). The Three-Tier Diagnostic Test (TTDT) could thus identify students’ misconceptions of heat and temperature.
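    The reliability figure reported here is Cronbach's alpha, computable directly from per-item scores; a self-contained sketch on purely illustrative dichotomous item data (not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal-consistency reliability.

    items: one list of scores per test item, each of length n_respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three hypothetical items scored 0/1 for four respondents.
items = [[1, 1, 0, 1],
         [1, 0, 0, 1],
         [1, 1, 0, 1]]
print(round(cronbach_alpha(items), 3))  # 0.875
```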

  5. Model for heat and mass transfer in freeze-drying of pellets.

    PubMed

    Trelea, Ioan Cristian; Passot, Stéphanie; Marin, Michèle; Fonseca, Fernanda

    2009-07-01

    Lyophilizing frozen pellets, and especially spray freeze-drying, have been receiving growing interest. To design efficient and safe freeze-drying cycles, local temperature and moisture content in the product bed have to be known, but both are difficult to measure in the industry. Mathematical modeling of heat and mass transfer helps to determine local freeze-drying conditions and predict effects of operation policy, and equipment and recipe changes on drying time and product quality. Representative pellets situated at different positions in the product slab were considered. One-dimensional transfer in the slab and radial transfer in the pellets were assumed. Coupled heat and vapor transfer equations between the temperature-controlled shelf, the product bulk, the sublimation front inside the pellets, and the chamber were established and solved numerically. The model was validated based on bulk temperature measurement performed at two different locations in the product slab and on partial vapor pressure measurement in the freeze-drying chamber. Fair agreement between measured and calculated values was found. In contrast, a previously developed model for compact product layer was found inadequate in describing freeze-drying of pellets. The developed model represents a good starting basis for studying freeze-drying of pellets. It has to be further improved and validated for a variety of product types and freeze-drying conditions (shelf temperature, total chamber pressure, pellet size, slab thickness, etc.). It could be used to develop freeze-drying cycles based on product quality criteria such as local moisture content and glass transition temperature.

  6. Guided Inquiry with Cognitive Conflict Strategy: Drilling Indonesian High School Students’ Creative Thinking Skills

    NASA Astrophysics Data System (ADS)

    Syadzili, A. F.; Soetjipto; Tukiran

    2018-01-01

    This research aims to produce physics learning materials for Indonesian high schools using guided inquiry with a cognitive conflict strategy to drill students’ creative thinking skills in static fluid learning. This development research used the 4D model with a one-group pre-test and post-test design, implemented with eleventh-grade students in the second semester of the 2016/2017 academic year. The data were collected through validation sheets, questionnaires, tests, and observations, and analyzed by descriptive quantitative analysis. This research obtained several findings: the learning material developed had an average validity score in the very valid category; the lesson plan was implemented very well; students’ responses to the learning process were very positive, with students interested in following the lessons; and students’ creative thinking skills, inadequate before the product was implemented, were very creative afterwards. The findings suggest that guided inquiry may stimulate students to think creatively.

  7. ARM Radiosondes for National Polar-Orbiting Operational Environmental Satellite System Preparatory Project Validation Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borg, Lori; Tobin, David; Reale, Anthony

    This IOP has been a coordinated effort involving the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility, the University of Wisconsin (UW)-Madison, and the JPSS project to validate SNPP NOAA Unique Combined Atmospheric Processing System (NUCAPS) temperature and moisture sounding products from the Cross-track Infrared Sounder (CrIS) and the Advanced Technology Microwave Sounder (ATMS). In this arrangement, funding for radiosondes was provided by the JPSS project to ARM. These radiosondes were launched coincident with the SNPP satellite overpasses (OP) at four of the ARM field sites beginning in July 2012 and running through September 2017. Combined with other ARM data, an assessment of the radiosonde data quality was performed and post-processing corrections applied, producing an ARM site Best Estimate (BE) product. The SNPP targeted radiosondes were integrated into the NOAA Products Validation System (NPROVS+), which collocated the radiosondes with satellite products (NOAA, National Aeronautics and Space Administration [NASA], European Organisation for the Exploitation of Meteorological Satellites [EUMETSAT], Geostationary Operational Environmental Satellite [GOES], Constellation Observing System for Meteorology, Ionosphere, and Climate [COSMIC]) and Numerical Weather Prediction (NWP) forecasts for use in product assessment and algorithm development. This work was a fundamental, integral, and cost-effective part of the SNPP validation effort and provided critical accuracy assessments of the SNPP temperature and water vapor soundings.

  8. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    NASA Technical Reports Server (NTRS)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. Key features include: (1) a range of satellite data products and surface observations used to generate the land analysis products; (2) global coverage at 1/4-degree spatial resolution; and (3) model analyses generated at 3-hour intervals. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.

  9. SeaWiFS Technical Report Series. Volume 42; Satellite Primary Productivity Data and Algorithm Development: A Science Plan for Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Falkowski, Paul G.; Behrenfeld, Michael J.; Esaias, Wayne E.; Balch, William; Campbell, Janet W.; Iverson, Richard L.; Kiefer, Dale A.; Morel, Andre; Yoder, James A.; Hooker, Stanford B. (Editor)

    1998-01-01

    Two issues regarding primary productivity, as it pertains to the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Program and the National Aeronautics and Space Administration (NASA) Mission to Planet Earth (MTPE) are presented in this volume. Chapter 1 describes the development of a science plan for deriving primary production for the world ocean using satellite measurements, by the Ocean Primary Productivity Working Group (OPPWG). Chapter 2 presents discussions by the same group, of algorithm classification, algorithm parameterization and data availability, algorithm testing and validation, and the benefits of a consensus primary productivity algorithm.

  10. Applying innovative approach “Nature of Science (NoS) within inquiry” for developing scientific literacy in the student worksheet

    NASA Astrophysics Data System (ADS)

    Widowati, A.; Anjarsari, P.; Zuhdan, K. P.; Dita, A.

    2018-03-01

    The challenges of the 21st century require innovative solutions. Education must be able to build an understanding of science learning that leads to the formation of scientifically literate learners. This research was conducted to produce a prototype science worksheet based on the Nature of Science (NoS) within inquiry approach and to determine the product's effectiveness for developing scientific literacy. The research followed a research and development design, drawing on the Four-D and Borg & Gall models. There were 4 main phases (define, design, develop, disseminate) and additional phases (preliminary field testing, main product revision, main field testing, and operational product revision). Research subjects were junior high school students in Yogyakarta. The instruments used included a product validation questionnaire sheet and a scientific literacy test. The validation data were analyzed descriptively; the test results were analyzed with an N-gain score. The results showed that the worksheet applying the NoS within inquiry-based learning approach was judged eligible, rated excellent by experts and teachers, and that students’ scientific literacy improved with a high-category N-gain score of 0.71 when using the worksheet.

  11. MODIS Snow and Sea Ice Products

    NASA Technical Reports Server (NTRS)

    Hall, Dorothy K.; Riggs, George A.; Salomonson, Vincent V.

    2004-01-01

    In this chapter, we describe the suite of Earth Observing System (EOS) Moderate-Resolution Imaging Spectroradiometer (MODIS) Terra and Aqua snow and sea ice products. Global, daily products, developed at Goddard Space Flight Center, are archived and distributed through the National Snow and Ice Data Center at various resolutions and on different grids useful for different communities. Snow products include binary snow cover, snow albedo, and, in the near future, fraction of snow in a 500-m pixel. Sea ice products include ice extent determined with two different algorithms, and sea ice surface temperature. The algorithms used to develop these products are described. Both the snow and sea ice products, available since February 24, 2000, are useful for modelers. Validation of the products is also discussed.
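    The binary snow-cover product is based on the Normalized Difference Snow Index (NDSI), computed from green (MODIS band 4) and shortwave-infrared (band 6 on Terra) reflectances. A simplified sketch; the 0.4 NDSI threshold and the NIR screen against water are the classic criteria, and the operational algorithm applies further tests (thermal, forest-specific) not shown here:

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index from green and SWIR reflectances."""
    return (green - swir) / (green + swir)

def is_snow(green, swir, nir, threshold=0.4):
    """Classic snow test: snow is bright in the visible and dark in the SWIR,
    so NDSI is high; the NIR check screens out open water, which is dark
    at all these wavelengths."""
    return ndsi(green, swir) > threshold and nir > 0.11

print(is_snow(green=0.70, swir=0.10, nir=0.60))  # True: bright, low SWIR
print(is_snow(green=0.05, swir=0.03, nir=0.02))  # False: dark water
```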

  12. 47 CFR 7.7 - Product design, development, and evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: (1) Where market research is undertaken, including individuals with disabilities in target... appropriate disability-related organizations; and (4) Making reasonable efforts to validate any unproven...

  13. Strategic system development toward biofuel, desertification, and crop production monitoring in continental scales using satellite-based photosynthesis models

    NASA Astrophysics Data System (ADS)

    Kaneko, Daijiro

    2013-10-01

    The author regards fundamental root functions as underpinning photosynthetic activity by vegetation and as affecting environmental issues, grain production, and desertification. This paper describes the present development of monitoring and near-real-time forecasting of environmental projects and crop production, approaching established operational monitoring step by step. The author has been developing a thematic monitoring structure (named the RSEM system) that stands on satellite-based photosynthesis models over several continents for operational support in the environmental fields mentioned above. Validation methods stand not on FLUXNET but on carbon partitioning validation (CPV). The models demand continuing parameterization. The entire frame system has been built using Reanalysis meteorological data, but model accuracy remains insufficient except for paddy rice. The author aims to complete a system that incorporates global environmental forcings. Regarding crop production applications, industrialization in developing countries, achieved through direct investment by economically developed nations, raises incomes, resulting in increased food demand. Last year, China began to import rice, as it had in the past with maize, wheat, and soybeans. Important agro-potential countries are making efforts to cultivate new crop lands in South America, Africa, and Eastern Europe. Trends toward less food sustainability and stability are continuing, exacerbated by rapid social and climate changes. Operational monitoring of carbon sequestration by herbaceous and woody plants converges with efforts in bio-energy, crop production monitoring, and socio-environmental projects such as CDM A/R, combating desertification, and biodiversity.

  14. A Framework for the Generation and Dissemination of Drop Size Distribution (DSD) Characteristics Using Multiple Platforms

    NASA Technical Reports Server (NTRS)

    Wolf, David B.; Tokay, Ali; Petersen, Walt; Williams, Christopher; Gatlin, Patrick; Wingo, Mathew

    2010-01-01

    Proper characterization of the precipitation drop size distribution (DSD) is integral to providing realistic and accurate space- and ground-based precipitation retrievals. Current technology allows for the development of DSD products from a variety of platforms, including disdrometers, vertical profilers and dual-polarization radars. Up to now, however, the dissemination or availability of such products has been limited to individual sites and/or field campaigns, in a variety of formats, often using inconsistent algorithms for computing the integral DSD parameters, such as the median- and mass-weighted drop diameter, total number concentration, liquid water content, rain rate, etc. We propose to develop a framework for the generation and dissemination of DSD characteristic products using a unified structure, capable of handling the myriad collection of disdrometers, profilers, and dual-polarization radar data currently available and to be collected during several upcoming GPM Ground Validation field campaigns. This DSD super-structure paradigm is an adaptation of the radar super-structure developed for NASA's Radar Software Library (RSL) and RSL_in_IDL. The goal is to provide the DSD products in a well-documented format, most likely NetCDF, along with tools to ingest and analyze the products. In so doing, we can develop a robust archive of DSD products from multiple sites and platforms, which should greatly benefit the development and validation of precipitation retrieval algorithms for GPM and other precipitation missions. An outline of this proposed framework will be provided as well as a discussion of the algorithms used to calculate the DSD parameters.
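    The integral DSD parameters named above follow from moments of the measured size distribution. A minimal sketch of the standard formulas on binned disdrometer data (the bin values are illustrative; operational codes also weight by fall speed for rain rate, which is omitted here):

```python
import math

def dsd_parameters(diam_mm, conc, bin_width_mm):
    """Integral DSD parameters from binned drop concentrations.

    diam_mm:      bin-center diameters (mm)
    conc:         number concentration N(D) per bin (mm^-1 m^-3)
    bin_width_mm: bin width (mm)
    """
    # Third and fourth moments of the distribution
    m3 = sum(n * d ** 3 * bin_width_mm for d, n in zip(diam_mm, conc))
    m4 = sum(n * d ** 4 * bin_width_mm for d, n in zip(diam_mm, conc))
    nt = sum(n * bin_width_mm for n in conc)   # total number concentration (m^-3)
    dm = m4 / m3                               # mass-weighted mean diameter (mm)
    lwc = math.pi / 6.0 * 1e-3 * m3            # liquid water content (g m^-3)
    return {"Nt": nt, "Dm": dm, "LWC": lwc}

# Single illustrative bin: 100 drops mm^-1 m^-3 at D = 2 mm, 0.5 mm wide.
params = dsd_parameters([2.0], [100.0], 0.5)
```

    Computing these parameters from identical moment definitions across disdrometers, profilers, and radar retrievals is exactly the consistency the proposed framework is meant to enforce.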

  15. Quantitative determination and sampling of azathioprine residues for cleaning validation in production area.

    PubMed

    Fazio, Tatiana Tatit; Singh, Anil Kumar; Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Inês Rocha Miritello

    2007-03-12

    Cleaning validation is an integral part of current good manufacturing practices in any pharmaceutical industry. Nowadays, azathioprine and several other pharmacologically potent pharmaceuticals are manufactured in the same production area. Carefully designed cleaning validation and its evaluation can ensure that residues of azathioprine will not carry over and cross contaminate the subsequent product. The aim of this study was to validate a simple analytical method for verification of residual azathioprine on equipment used in the production area and to confirm the efficiency of the cleaning procedure. The HPLC method was validated on a LC system using Nova-Pak C18 (3.9 mm x 150 mm, 4 microm) and methanol-water-acetic acid (20:80:1, v/v/v) as mobile phase at a flow rate of 1.0 mL min(-1). UV detection was made at 280 nm. The calibration curve was linear over a concentration range from 2.0 to 22.0 microg mL(-1) with a correlation coefficient of 0.9998. The detection limit (DL) and quantitation limit (QL) were 0.09 and 0.29 microg mL(-1), respectively. The intra-day and inter-day precision expressed as relative standard deviation (R.S.D.) were below 2.0%. The mean recovery of the method was 99.19%. The mean extraction-recovery from manufacturing equipment was 83.5%. The developed UV spectrophotometric method could only be used as a limit method to qualify or reject the cleaning procedure in the production area. Nevertheless, the simplicity of the spectrophotometric method makes it useful for routine analysis of azathioprine residues on cleaned surfaces and as an alternative to the proposed HPLC method.
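    Calibration figures like those above follow the standard ICH approach: a least-squares line through the standards, with DL = 3.3 σ/S and QL = 10 σ/S, where σ is the residual standard deviation and S the slope. A minimal sketch on hypothetical absorbance data (the standards below are illustrative, not the paper's):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def dl_ql(x, y):
    """ICH-style detection and quantitation limits from a calibration line:
    DL = 3.3*sigma/S, QL = 10*sigma/S, with sigma the residual std dev."""
    slope, intercept = fit_line(x, y)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sigma = (sum(r ** 2 for r in resid) / (len(x) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical standards: concentration (ug/mL) vs. absorbance at 280 nm
conc = [2.0, 6.0, 10.0, 14.0, 18.0, 22.0]
absb = [0.101, 0.302, 0.498, 0.703, 0.899, 1.102]
dl, ql = dl_ql(conc, absb)
```

    By construction QL is roughly three times DL, matching the ratio of the 0.09 and 0.29 microg mL(-1) values reported in the abstract.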

  16. NOAA activities in support of in situ validation observations for satellite ocean color products and related ocean science research

    NASA Astrophysics Data System (ADS)

    Lance, V. P.; DiGiacomo, P. M.; Ondrusek, M.; Stengel, E.; Soracco, M.; Wang, M.

    2016-02-01

    The NOAA/STAR ocean color program is focused on "end-to-end" production of high quality satellite ocean color products. In situ validation of satellite data is essential to produce the high quality, "fit for purpose" ocean color products that support users and applications in all NOAA line offices, as well as external (both applied and research) users. The first NOAA/OMAO (Office of Marine and Aviation Operations) sponsored research cruise dedicated to VIIRS SNPP validation was completed aboard the NOAA Ship Nancy Foster in November 2014. The goals and objectives of the 2014 cruise are highlighted in the recently published NOAA/NESDIS Technical Report. A second dedicated validation cruise is planned for December 2015 and will have been completed by the time of this meeting. The goals and objectives of the 2015 cruise will be discussed in the presentation. Participants and observations made will be reported. The NOAA Ocean Color Calibration/Validation (Cal/Val) team also works collaboratively with other programs. A recent collaboration with the NOAA Ocean Acidification program on the East Coast Ocean Acidification (ECOA) cruise during June-July 2015, where biogeochemical and optical measurements were made together, allows for the leveraging of in situ observations for satellite validation and for their use in the development of future ocean acidification satellite products. Datasets from these cruises will be formally archived at NOAA and Digital Object Identifier (DOI) numbers will be assigned. In addition, the NOAA Coast/OceanWatch Program is working to establish a searchable database. The beta version will begin with cruise data and additional in situ calibration/validation related data collected by the NOAA Ocean Color Cal/Val team members. A more comprehensive searchable NOAA database, with contributions from other NOAA ocean observation platforms and cruise collaborations, is envisioned. Progress on these activities will be reported.

  17. Validation of the MODIS MOD21 and MOD11 land surface temperature and emissivity products in an arid area of Northwest China

    NASA Astrophysics Data System (ADS)

    Li, H.; Yang, Y.; Yongming, D.; Cao, B.; Qinhuo, L.

    2017-12-01

    Land surface temperature (LST) is a key parameter for hydrological, meteorological, climatological and environmental studies. During the past decades, many efforts have been devoted to establishing methodology for retrieving LST from remote sensing data, and significant progress has been achieved. Many operational LST products have been generated using different remote sensing data. The MODIS LST product (MOD11) is one of the most commonly used LST products, produced using a generalized split-window algorithm. Many validation studies have shown that the MOD11 LST product agrees well with ground measurements over vegetated and inland water surfaces; however, large negative biases of up to 5 K are present over arid regions. In addition, the land surface emissivity of MOD11 is estimated by assigning fixed emissivities according to a land cover classification dataset, which may introduce large errors into the LST product due to misclassification of the land cover. Therefore, a new MODIS LST&E product (MOD21) has been developed based on the temperature emissivity separation (TES) algorithm, and the water vapor scaling (WVS) method has also been incorporated into the MODIS TES algorithm to improve the accuracy of the atmospheric correction. The MOD21 product will be released with MODIS Collection 6 Tier-2 land products in 2017. Because the MOD21 products are not yet available, the MODTES algorithm, including the TES and WVS methods, was implemented as detailed in the MOD21 Algorithm Theoretical Basis Document. The MOD21 and MOD11 C6 LST products are validated using ground measurements and ASTER LST products collected in an arid area of Northwest China during the Heihe Watershed Allied Telemetry Experimental Research (HiWATER) experiment. In addition, lab emissivity spectra of four sand dunes in Northwest China are also used to validate the MOD21 and MOD11 emissivity products.
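    The MOD11 retrieval named above is a generalized split-window algorithm of the Wan and Dozier form, combining the band 31/32 brightness temperatures with the mean emissivity and the emissivity difference. A hedged sketch of that structure (the coefficients below are purely illustrative; the operational values are tabulated by viewing angle, column water vapor and air temperature):

```python
def split_window_lst(t31, t32, emis31, emis32, coeffs):
    """Generalized split-window LST (Wan & Dozier form).

    t31, t32       : brightness temperatures [K] in MODIS bands 31/32
    emis31, emis32 : band emissivities
    coeffs         : (c, a1, a2, a3, b1, b2, b3) -- ILLUSTRATIVE values only
    """
    eps = 0.5 * (emis31 + emis32)    # mean emissivity
    deps = emis31 - emis32           # emissivity difference
    c, a1, a2, a3, b1, b2, b3 = coeffs
    t_avg = 0.5 * (t31 + t32)
    t_dif = 0.5 * (t31 - t32)
    return (c
            + (a1 + a2 * (1 - eps) / eps + a3 * deps / eps**2) * t_avg
            + (b1 + b2 * (1 - eps) / eps + b3 * deps / eps**2) * t_dif)

# Hypothetical coefficient set, for illustration of the functional form only
demo_coeffs = (0.0, 1.0, 0.2, -0.3, 4.0, 10.0, -15.0)
lst = split_window_lst(300.0, 299.0, 0.972, 0.968, demo_coeffs)
```

The emissivity-dependent terms make clear why a misclassified land cover (and hence a wrong assigned emissivity) propagates directly into the MOD11 LST, which is the weakness the TES-based MOD21 product addresses.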

  18. Reproducibility and relative validity of food group intake in a food frequency questionnaire developed for Nepalese diet.

    PubMed

    Shrestha, Archana; Koju, Rajendra Prasad; Beresford, Shirley A A; Chan, Kwun Chuen Gary; Connell, Frederik A; Karmacharya, Biraj Man; Shrestha, Pramita; Fitzpatrick, Annette L

    2017-08-01

    We developed a food frequency questionnaire (FFQ) designed to measure the dietary practices of adult Nepalese. The present study examined the validity and reproducibility of the FFQ. To evaluate the reproducibility of the FFQ, 116 subjects completed two 115-item FFQs across a four-month interval. Six 24-h dietary recalls were collected (one each month) to assess the validity of the FFQ. Seven major food groups and 23 subgroups were clustered from the FFQ based on macronutrient composition. Spearman correlation coefficients evaluating reproducibility for all food groups were greater than 0.5, with the exception of oil. The correlations varied from 0.41 (oil) to 0.81 (vegetables). All crude Spearman coefficients for validity were greater than 0.5 except for dairy products, pizzas/pastas and sausage/burgers. The FFQ was found to be reliable and valid for ranking the intake of food groups in the Nepalese diet.
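    Reproducibility coefficients of this kind are rank correlations between the two FFQ administrations. A self-contained Spearman sketch with tie-aware ranking (the servings data below are hypothetical, not the study's):

```python
def _ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly servings of one food group on the two FFQ administrations
ffq1 = [3, 1, 4, 2, 7, 5]
ffq2 = [2, 1, 5, 3, 8, 4]
rho = spearman(ffq1, ffq2)
```

With no ties this reduces to the familiar 1 - 6*sum(d^2)/(n(n^2-1)) formula, which makes hand-checking small examples easy.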

  19. High-voltage leak detection of a parenteral proteinaceous solution product packaged in form-fill-seal plastic laminate bags. Part 1. Method development and validation.

    PubMed

    Damgaard, Rasmus; Rasmussen, Mats; Buus, Peter; Mulhall, Brian; Guazzo, Dana Morton

    2013-01-01

    In Part 1 of this three-part research series, a leak test performed using high-voltage leak detection (HVLD) technology, also referred to as an electrical conductivity and capacitance leak test, was developed and validated for container-closure integrity verification of a small-volume laminate plastic bag containing an aqueous solution for injection. The sterile parenteral product is the rapid-acting insulin analogue, insulin aspart (NovoRapid®/NovoLog®, by Novo Nordisk A/S, Bagsværd, Denmark). The aseptically filled and sealed package is designed to preserve product sterility through expiry. Method development and validation work incorporated positive control packages with a single hole laser-drilled through the laminate film of each bag. A unique HVLD method characterized by specific high-voltage and potentiometer set points was established for testing bags positioned in each of three possible orientations as they are conveyed through the instrument's test zone in each of two possible directions, resulting in a total of six different test method options. Validation study results successfully demonstrated the ability of all six methods to accurately and reliably detect those packages with laser-drilled holes from 2.5 to 11.2 μm in nominal diameter. Part 2 of this series will further explore HVLD test results as a function of package seal and product storage variables. The final Part 3 will report the impact of HVLD exposure on product physico-chemical stability. In this Part 1 of a three-part research series, a leak test method based on electrical conductivity and capacitance, called high voltage leak detection (HVLD), was used to find leaks in small plastic bags filled with an insulin pharmaceutical solution for human injection by Novo Nordisk A/S (Bagsværd, Denmark). To perform the test, the package is electrically grounded while being conveyed past an electrode linked to a high-voltage, low-amperage transformer. 
The instrument measures the current that passes from the transformer to the electrode, through the packaged product and along the package walls, to the ground. Plastic packages without defect are relatively nonconductive and yield a low voltage reading; a leaking package with electrically conductive solution located in or near the leak triggers a spike in voltage reading. Test methods were optimized and validated, enabling the detection of leaking packages with holes as small as 2.5 μm in diameter. Part 2 of this series will further explore HVLD test results as a function of package seal and product storage variables. The final Part 3 will report the impact of HVLD exposure on product stability.
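    As a generic illustration of the go/no-go logic described above (not Novo Nordisk's validated set points), a leaking package can be flagged when its peak reading exceeds a threshold placed between the known-good and known-defect response ranges established with positive controls:

```python
def classify_packages(peak_readings, good_max, defect_min):
    """Flag leaking packages from HVLD peak readings.

    good_max   : highest peak reading observed on known-good packages
    defect_min : lowest peak reading observed on laser-drilled positive controls
    The threshold is set midway between the two (a simple illustrative choice).
    """
    threshold = 0.5 * (good_max + defect_min)
    return [reading > threshold for reading in peak_readings], threshold

# Hypothetical peak readings in arbitrary instrument units
flags, thr = classify_packages([0.8, 1.1, 6.3, 0.9], good_max=1.5, defect_min=4.0)
```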

  20. [The added value of information summaries supporting clinical decisions at the point-of-care].

    PubMed

    Banzi, Rita; González-Lorenzo, Marien; Kwag, Koren Hyogene; Bonovas, Stefanos; Moja, Lorenzo

    2016-11-01

    Evidence-based healthcare requires the integration of the best research evidence with clinical expertise and patients' values. International publishers are developing evidence-based information services and resources designed to overcome the difficulties in retrieving, assessing and updating medical information, as well as to facilitate rapid access to valid clinical knowledge. Point-of-care information summaries are defined as web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. Their validity must be assessed against marketing claims that they are evidence-based. We periodically evaluate the content development processes of several international point-of-care information summaries. The number of these products has increased along with their quality. The last analysis, done in 2014, identified 26 products and found that three of them (Best Practice, DynaMed, and UpToDate) scored the highest across all evaluated dimensions (volume, quality of the editorial process, and evidence-based methodology). Point-of-care information summaries, as stand-alone products or integrated with other systems, are gaining ground to support clinical decisions. The choice of one product over another depends both on the properties of the service and the preferences of users. However, even the most innovative information system must rely on transparent and valid contents. Individuals and institutions should regularly assess the value of point-of-care summaries, as their quality changes rapidly over time.

  1. Monitoring Rainfall by Combining Ground-based Observed Precipitation and PERSIANN Satellite Product (Case Study Area: Lake Urmia Basin)

    NASA Astrophysics Data System (ADS)

    Abrishamchi, A.; Mirshahi, A.

    2015-12-01

    The global coverage, quick access, and appropriate spatial-temporal resolution of satellite precipitation data render the data appropriate for hydrologic studies, especially in regions without a sufficient rain-gauge network. On the other hand, satellite precipitation products may have major errors. The present study aims at reducing the estimation error of the PERSIANN satellite precipitation product. Bayesian logic was employed to develop a statistical relationship between historical ground-based and satellite precipitation data. This relationship can then be used to reduce satellite precipitation product error in near real time, when there is no ground-based precipitation observation. The method was evaluated in the Lake Urmia basin at a monthly time scale: November to May of 2000-2008 for model development, and the two years 2009 and 2010 for validation of the established relationships. Moreover, the Kriging interpolation method was employed to estimate the average rainfall in the basin. Furthermore, to downscale the satellite precipitation product from 0.25° to 0.05°, a data-location downscaling algorithm was used. During the validation period, the final product had less error than the raw satellite precipitation in 76 percent of months. Additionally, its performance was marginally better than the adjusted PERSIANN product.
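    As a simplified stand-in for the paper's Bayesian relationship, a least-squares fit of historical gauge values against satellite estimates illustrates the correction step: fit on the development period, then apply to satellite-only months. The rainfall values below are synthetic:

```python
def fit_linear(sat, gauge):
    """Least-squares fit of gauge = a + b * sat over historical months
    (a simple stand-in for the paper's Bayesian updating)."""
    n = len(sat)
    mx = sum(sat) / n
    my = sum(gauge) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(sat, gauge))
         / sum((x - mx) ** 2 for x in sat))
    a = my - b * mx
    return a, b

def correct(sat_value, a, b):
    """Apply the fitted relationship to a new satellite estimate."""
    return a + b * sat_value

# Synthetic example: satellite overestimates truth by 30% plus a 5 mm offset
truth = [10.0, 25.0, 40.0, 55.0, 70.0, 85.0]
sat = [1.3 * t + 5.0 for t in truth]
a, b = fit_linear(sat, truth)
corrected = [correct(s, a, b) for s in sat]
```

Because the synthetic error here is exactly linear, the correction recovers the gauge values; with real data the fit only reduces, rather than removes, the satellite error.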

  2. Development and validation of a predictive model for the influences of selected product and process variables on ascorbic acid degradation in simulated fruit juice.

    PubMed

    Gabriel, Alonzo A; Cayabyab, Jochelle Elysse C; Tan, Athalie Kaye L; Corook, Mark Lester F; Ables, Errol John O; Tiangson-Bayaga, Cecile Leah P

    2015-06-15

    A predictive response surface model for the influences of product (soluble solids and titratable acidity) and process (temperature and heating time) parameters on the degradation of ascorbic acid (AA) in heated simulated fruit juices (SFJs) was established. Physicochemical property ranges of freshly squeezed and processed juices, and previously established decimal reduction times of Escherichia coli O157:H7 at different heating temperatures, were used in establishing a Central Composite Design of Experiments that determined the combinations of product and process variables used in the model building. Only the individual linear effects of temperature and heating time significantly (P<0.05) affected AA reduction (%AAr). Validating systems either over- or underestimated actual %AAr, with bias factors of 0.80-1.20. However, all validating systems still resulted in acceptable predictive efficacy, with accuracy factors of 1.00-1.26. The model may be useful in establishing unique process schedules for specific products, for the simultaneous control and improvement of food safety and quality. Copyright © 2015 Elsevier Ltd. All rights reserved.
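    The bias and accuracy factors quoted above are the standard predictive-microbiology validation measures of Ross (1996): Bf = 10^(mean log10(predicted/observed)) and Af = 10^(mean |log10(predicted/observed)|). A minimal sketch:

```python
import math

def bias_accuracy_factors(predicted, observed):
    """Ross (1996) bias factor Bf and accuracy factor Af.

    Bf = 1 means no systematic over/underestimation on average;
    Af = 1 means perfect agreement (Af >= max(Bf, 1/Bf) always).
    """
    logs = [math.log10(p / o) for p, o in zip(predicted, observed)]
    bf = 10 ** (sum(logs) / len(logs))
    af = 10 ** (sum(abs(l) for l in logs) / len(logs))
    return bf, af

# Hypothetical predicted vs observed %AAr values
bf, af = bias_accuracy_factors([12.0, 18.0, 30.0], [10.0, 20.0, 28.0])
```

Note that opposite-signed errors cancel in Bf but not in Af, which is why a system can show Bf near 1.0 while Af reveals its real scatter.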

  3. Validation of the MODIS Collection 6 MCD64 Global Burned Area Product

    NASA Astrophysics Data System (ADS)

    Boschetti, L.; Roy, D. P.; Giglio, L.; Stehman, S. V.; Humber, M. L.; Sathyachandran, S. K.; Zubkova, M.; Melchiorre, A.; Huang, H.; Huo, L. Z.

    2017-12-01

    The research, policy and management applications of satellite products place a high priority on rigorously assessing their accuracy. A number of NASA, ESA and EU funded global and continental burned area products have been developed using coarse spatial resolution satellite data, and have the potential to become part of a long-term fire Essential Climate Variable. These products have usually been validated by comparison with reference burned area maps derived by visual interpretation of Landsat or similar spatial resolution data selected on an ad hoc basis. More optimally, a design-based validation method should be adopted, characterized by the selection of reference data via probability sampling. Design-based techniques have been used for annual land cover and land cover change product validation, but have not been widely used for burned area products, or for other products that are highly variable in time and space (e.g. snow, floods, other non-permanent phenomena). This has been due to the challenge of designing an appropriate sampling strategy, and to the cost and limited availability of independent reference data. This paper describes the validation procedure adopted for the latest Collection 6 version of the MODIS Global Burned Area product (MCD64, Giglio et al, 2009). We used a three-dimensional sampling grid that allows for probability sampling of Landsat data in time and in space (Boschetti et al, 2016). To sample the globe in the spatial domain with non-overlapping sampling units, the Thiessen Scene Area (TSA) tessellation of the Landsat WRS path/rows is used. The TSA grid is then combined with the 16-day Landsat acquisition calendar to provide three-dimensional elements (voxels). This allows the implementation of a sampling design where not only the location but also the time interval of the reference data is explicitly drawn through stratified random sampling. 
The novel sampling approach was used for the selection of a reference dataset consisting of 700 Landsat 8 image pairs, interpreted according to the CEOS Burned Area Validation Protocol (Boschetti et al., 2009). Standard quantitative burned area product accuracy measures that are important for different types of fire users (Boschetti et al, 2016, Roy and Boschetti, 2009, Boschetti et al, 2004) are computed to characterize the accuracy of the MCD64 product.
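    The accuracy measures computed from product-versus-reference agreement counts typically include overall accuracy, commission and omission error, and the Dice coefficient. A sketch with illustrative pixel counts (not the study's actual results):

```python
def burned_area_accuracy(tp, fp, fn, tn):
    """Standard accuracy measures from pixel-count agreement between a
    burned area product and a reference map.

    tp : mapped burned, reference burned
    fp : mapped burned, reference unburned   (commission)
    fn : mapped unburned, reference burned   (omission)
    tn : mapped unburned, reference unburned
    """
    return {
        "overall_accuracy": (tp + tn) / (tp + fp + fn + tn),
        "commission_error": fp / (tp + fp),
        "omission_error": fn / (tp + fn),
        "dice_coefficient": 2 * tp / (2 * tp + fp + fn),
    }

# Illustrative counts for one sampled voxel
m = burned_area_accuracy(tp=80, fp=20, fn=10, tn=890)
```

Because burned pixels are rare, overall accuracy can look excellent even when commission/omission errors are large; the Dice coefficient ignores the dominant true-negative class and is therefore a more informative summary for burned area users.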

  4. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-11-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols, and greenhouse gases, and is based on the Integrated Forecasting System of the European Centre for Medium-Range Weather Forecasts (ECMWF). The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past 3 years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high-pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.
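    Among the scoring methods used in validation reports of this kind are the modified normalized mean bias (MNMB) and the fractional gross error (FGE), which are bounded and treat forecast and observation symmetrically. A minimal sketch (the concentration values below are illustrative, not MACC data):

```python
def mnmb(forecast, obs):
    """Modified normalized mean bias, bounded in [-2, 2]; 0 is unbiased."""
    return 2.0 * sum((f - o) / (f + o) for f, o in zip(forecast, obs)) / len(obs)

def fge(forecast, obs):
    """Fractional gross error, bounded in [0, 2]; 0 is a perfect forecast."""
    return 2.0 * sum(abs(f - o) / (f + o) for f, o in zip(forecast, obs)) / len(obs)

# Hypothetical forecast/observation pairs (e.g. ozone in microg/m^3)
forecast = [40.0, 55.0, 62.0]
obs = [50.0, 50.0, 60.0]
score_bias = mnmb(forecast, obs)
score_error = fge(forecast, obs)
```

Unlike a plain mean bias, these scores do not let a few high-concentration episodes dominate, which matters when the same report must cover both background sites and high-pollution events.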

  5. Development and validation of a comprehensive model for MAP of fruits based on enzyme kinetics theory and the Arrhenius relation.

    PubMed

    Mangaraj, S; K Goswami, T; Mahajan, P V

    2015-07-01

    MAP is a dynamic system where respiration of the packaged product and gas permeation through the packaging film take place simultaneously. The desired level of O2 and CO2 in a package is achieved by matching film permeation rates for O2 and CO2 with the respiration rate of the packaged product. A mathematical model for MAP of fresh fruits applying an enzyme kinetics-based respiration equation coupled with an Arrhenius-type model was developed. The model was solved numerically using a MATLAB program. The model was used to determine the time to reach the equilibrium concentration inside the MA package and the levels of O2 and CO2 concentration at the equilibrium state. The developed model for prediction of equilibrium O2 and CO2 concentrations was validated using experimental data for MA packaging of apple, guava and litchi.
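    The model structure described above (film permeation balanced against Michaelis-Menten respiration) can be sketched with a simple Euler integration; all parameter names and values below are illustrative, not the paper's fitted constants, and the Arrhenius temperature dependence is omitted by holding the rate parameters fixed:

```python
def simulate_map(hours, dt=0.1, y_o2=20.9, y_co2=0.03,
                 k_o2=0.5, k_co2=2.0,      # film transmission [mL h^-1 per % gradient]
                 vm=20.0, km=5.0, rq=1.0,  # Michaelis-Menten respiration constants
                 w=0.5, v_free=1000.0):    # produce mass [kg], package headspace [mL]
    """Euler integration of headspace O2/CO2 (% v/v) in a MA package.

    Gas balance per step: film flux toward ambient (20.9% O2, 0.03% CO2)
    minus/plus respiration, converted to % of the free volume.
    """
    for _ in range(int(hours / dt)):
        r_o2 = vm * y_o2 / (km + y_o2)     # respiration rate [mL O2 kg^-1 h^-1]
        d_o2 = (k_o2 * (20.9 - y_o2) - r_o2 * w) * 100.0 / v_free
        d_co2 = (r_o2 * rq * w - k_co2 * (y_co2 - 0.03)) * 100.0 / v_free
        y_o2 += d_o2 * dt
        y_co2 += d_co2 * dt
    return y_o2, y_co2

# Run long enough to approach the equilibrium modified atmosphere
eq_o2, eq_co2 = simulate_map(hours=600.0)
```

At equilibrium the O2 permeation influx exactly balances O2 consumption, which is the condition the paper exploits to predict the steady-state package atmosphere.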

  6. Development of a framework for international certification by OIE of diagnostic tests validated as fit for purpose.

    PubMed

    Wright, P; Edwards, S; Diallo, A; Jacobson, R

    2006-01-01

    Historically, the OIE has focused on test methods applicable to trade and the international movement of animals and animal products. With its expanding role as the World Organisation for Animal Health, the OIE has recognised the need to evaluate test methods relative to specific diagnostic applications other than trade. In collaboration with its international partners, the OIE solicited input from experts through consultants' meetings on the development of guidelines for validation and certification of diagnostic assays for infectious animal diseases. Recommendations from the first meeting were formally adopted and have subsequently been acted upon by the OIE. A validation template has been developed that specifically requires a test to be fit or suited for its intended purpose (e.g. as a screening or a confirmatory test). This is a key criterion for validation. The template incorporates four distinct stages of validation, each of which has bearing on the evaluation of fitness for purpose. The OIE has just recently created a registry for diagnostic tests that fulfil these validation requirements. Assay developers are invited to submit validation dossiers to the OIE for evaluation by a panel of experts. Recognising that validation is an incremental process, test methods achieving at least the first stages of validation may be provisionally accepted. To provide additional confidence in assay performance, the OIE, through its network of Reference Laboratories, has embarked on the development of evaluation panels. These panels would contain specially selected test samples that would assist in verifying fitness for purpose.

  7. Development of a framework for international certification by the OIE of diagnostic tests validated as fit for purpose.

    PubMed

    Wright, P; Edwards, S; Diallo, A; Jacobson, R

    2007-01-01

    Historically, the OIE has focussed on test methods applicable to trade and the international movement of animals and animal products. With its expanding role as the World Organisation for Animal Health, the OIE has recognised the need to evaluate test methods relative to specific diagnostic applications other than trade. In collaboration with its international partners, the OIE solicited input from experts through consultants' meetings on the development of guidelines for validation and certification of diagnostic assays for infectious animal diseases. Recommendations from the first meeting were formally adopted and have subsequently been acted upon by the OIE. A validation template has been developed that specifically requires a test to be fit or suited for its intended purpose (e.g. as a screening or a confirmatory test). This is a key criterion for validation. The template incorporates four distinct stages of validation, each of which has bearing on the evaluation of fitness for purpose. The OIE has just recently created a registry for diagnostic tests that fulfil these validation requirements. Assay developers are invited to submit validation dossiers to the OIE for evaluation by a panel of experts. Recognising that validation is an incremental process, test methods achieving at least the first stages of validation may be provisionally accepted. To provide additional confidence in assay performance, the OIE, through its network of Reference Laboratories, has embarked on the development of evaluation panels. These panels would contain specially selected test samples that would assist in verifying fitness for purpose.

  8. Biomarkers of exposure to new and emerging tobacco delivery products.

    PubMed

    Schick, Suzaynn F; Blount, Benjamin C; Jacob, Peyton; Saliba, Najat A; Bernert, John T; El Hellani, Ahmad; Jatlow, Peter; Pappas, R Steven; Wang, Lanqing; Foulds, Jonathan; Ghosh, Arunava; Hecht, Stephen S; Gomez, John C; Martin, Jessica R; Mesaros, Clementina; Srivastava, Sanjay; St Helen, Gideon; Tarran, Robert; Lorkiewicz, Pawel K; Blair, Ian A; Kimmel, Heather L; Doerschuk, Claire M; Benowitz, Neal L; Bhatnagar, Aruni

    2017-09-01

    Accurate and reliable measurements of exposure to tobacco products are essential for identifying and confirming patterns of tobacco product use and for assessing their potential biological effects in both human populations and experimental systems. Due to the introduction of new tobacco-derived products and the development of novel ways to modify and use conventional tobacco products, precise and specific assessments of exposure to tobacco are now more important than ever. Biomarkers that were developed and validated to measure exposure to cigarettes are being evaluated to assess their use for measuring exposure to these new products. Here, we review current methods for measuring exposure to new and emerging tobacco products, such as electronic cigarettes, little cigars, water pipes, and cigarillos. Rigorously validated biomarkers specific to these new products have not yet been identified. Here, we discuss the strengths and limitations of current approaches, including whether they provide reliable exposure estimates for new and emerging products. We provide specific guidance for choosing practical and economical biomarkers for different study designs and experimental conditions. Our goal is to help both new and experienced investigators measure exposure to tobacco products accurately and avoid common experimental errors. With the identification of the capacity gaps in biomarker research on new and emerging tobacco products, we hope to provide researchers, policymakers, and funding agencies with a clear action plan for conducting and promoting research on the patterns of use and health effects of these products.

  9. A novel heuristic for optimization aggregate production problem: Evidence from flat panel display in Malaysia

    NASA Astrophysics Data System (ADS)

    Al-Kuhali, K.; Hussain M., I.; Zain Z., M.; Mullenix, P.

    2015-05-01

    Aim: This paper contributes to the flat panel display industry in terms of aggregate production planning. Methodology: To minimize the total production cost of LCD manufacturing, a linear programming model was applied. The decision variables are general production costs, additional cost incurred for overtime production, additional cost incurred for subcontracting, inventory carrying cost, backorder costs and adjustments for changes incurred within labour levels. The model considers a manufacturer with several product types, up to a maximum of N, over a total time period of T. Results: An industrial case study based in Malaysia is presented to test and validate the developed linear programming model for aggregate production planning. Conclusion: The developed model is fit for use under stable environmental conditions. Overall, the proven linear programming model can be recommended for adaptation to production planning in the Malaysian flat panel display industry.
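    The full formulation is a linear program over all periods and product types; as a much-simplified illustration (single period, no inventory carry-over, backorders, or workforce adjustment), demand can be met by filling capacity tiers in order of increasing unit cost, which is optimal for this reduced case. The tier names, capacities and costs below are hypothetical:

```python
def plan_period(demand, tiers):
    """Fill one period's demand from the cheapest capacity tier first.

    tiers : list of (name, capacity, unit_cost) tuples, e.g. regular
            production, overtime, subcontracting (illustrative names).
    Returns the quantity assigned to each tier and the total cost.
    """
    plan, cost, remaining = {}, 0.0, demand
    for name, cap, unit_cost in sorted(tiers, key=lambda t: t[2]):
        qty = min(cap, remaining)
        plan[name] = qty
        cost += qty * unit_cost
        remaining -= qty
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan, cost

# Hypothetical single-period instance: 1200 units of demand
tiers = [("regular", 1000, 10.0), ("overtime", 300, 15.0), ("subcontract", 500, 22.0)]
plan, cost = plan_period(1200, tiers)
```

Once inventory links periods together (carrying stock now to avoid overtime later), this greedy rule is no longer optimal and a genuine multi-period LP, as in the paper, is needed.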

  10. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    The research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: 1) Develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation. 2) Perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator. 3) Perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport. 4) Test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production. 5) Develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: 1) A true-3D hydro-thermal fracturing computer code that is particularly suited to EGS, 2) Documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock, 3) Documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications, and 4) Database of monitoring data, with a focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  11. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    2013-12-31

    This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and a database of monitoring data, with a focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  12. Data survey on the effect of product features on competitive advantage of selected firms in Nigeria.

    PubMed

    Olokundun, Maxwell; Iyiola, Oladele; Ibidunni, Stephen; Falola, Hezekiah; Salau, Odunayo; Amaihian, Augusta; Peter, Fred; Borishade, Taiye

    2018-06-01

    The main objective of this study was to present a data article that investigates the effect of product features on a firm's competitive advantage. Few studies have examined how the features of a product could help drive the competitive advantage of a firm. A descriptive research method was used. The Statistical Package for the Social Sciences (SPSS 22) was used to analyze one hundred and fifty (150) valid questionnaires completed by small business owners registered under the Small and Medium Enterprises Development Agency of Nigeria (SMEDAN). Stratified and simple random sampling techniques were employed; reliability and validity procedures were also confirmed. The field data set is made publicly available to enable critical or extended analysis.

  13. The contributions of the European cosmetics industry to the development of alternatives to animal testing: dialogue with ECVAM and future challenges.

    PubMed

    de Silva, Odile

    2002-12-01

    COLIPA (the European Federation of the Cosmetics Industry) represents 24 international companies and 2000 small and medium-sized enterprises. Together with ECVAM, COLIPA has been involved in the development and validation of alternative methods since the beginning of the validation efforts. The work of the Steering Committee on Alternatives to Animal Testing (SCAAT) is based on collaboration between companies, but also with academia, trade associations, the Scientific Committee on Cosmetics and Non-Food Products (SCCNFP), European Commission Directorates General, and ECVAM. Some success has been achieved, but some validation efforts have failed. One lesson is that the search for alternatives requires a lot of humility.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fraile-Garcia, Esteban, E-mail: esteban.fraile@unirioja.es; Ferreiro-Cabello, Javier, E-mail: javier.ferreiro@unirioja.es; Qualiberica S.L.

    The European Committee for Standardization (CEN), through its Technical Committee CEN/TC-350, is developing a series of standards for assessing building sustainability at both the product and building levels. The practical selection (decision making) of structural alternatives made with one-way slabs sits at an intermediate level between the product and the building. The present study therefore addresses this decision-making problem, following the CEN guidelines and incorporating relevant aspects of architectural design into residential construction. A life cycle assessment (LCA) is developed in order to obtain valid information for the decision-making process (the LCA was developed applying the CML methodology, although Eco-indicator 99 was used in order to facilitate comparison of the values); this information (the carbon footprint values) is contrasted with other databases and with the information from the Environmental Product Declaration (EPD) of one of the lightening materials (expanded polystyrene) in order to validate the results. Solutions with different column dispositions and geometries are evaluated against the three pillars of sustainable residential construction: social, economic and environmental. The quantitative analysis of the variables used in this study enables and facilitates an objective comparison at the design stage by the responsible technician; applying the proposed methodology reduces the possible solutions to be evaluated by the expert to 12.22% of the options for low values of the column index and to 26.67% for the highest values. - Highlights: • Methodology for selection of structural alternatives in buildings with one-way slabs • Adapted to CEN guidelines (CEN/TC-350) for assessing building sustainability • LCA developed to obtain valid information for the decision-making process • Results validated by comparing carbon footprints, databases and Environmental Product Declarations • The proposal reduces the solutions to be evaluated to between 12.22% and 26.67%.

  15. Materials Compatibility and Agent Operational Validation for Halon 1211 Replacement: Phases 1 2, and 3. Volume 1

    DTIC Science & Technology

    1993-03-01

    Excerpt (fragments extracted from the report): a table of advanced composites in military aircraft listing, e.g., the KC-135 Gl-epoxy winglet, with status codes 1 = experimental, 2 = prototype development, 3 = production (Table 3, concluded); and a note that the agent was specially blended for related agent testing and, due to its high production cost, would not be available for regular distribution.

  16. Prototype Testing in Instructional Development. SWRL Working Papers: 1972.

    ERIC Educational Resources Information Center

    Niedermeyer, Fred C., Ed.

    When properly implemented, prototype testing appears to provide one of the most direct and economical methods for identifying means to optimize the effectiveness of a product and, ultimately, to validate a product's effect. The nine papers in this volume exemplify several categories of prototype testing conducted at different stages of the…

  17. Estimating and validating harvesting system production through computer simulation

    Treesearch

    John E. Baumgras; Curt C. Hassler; Chris B. LeDoux

    1993-01-01

    A Ground Based Harvesting System Simulation model (GB-SIM) has been developed to estimate stump-to-truck production rates and multiproduct yields for conventional ground-based timber harvesting systems in Appalachian hardwood stands. Simulation results reflect inputs that define harvest site and timber stand attributes, wood utilization options, and key attributes of...

  18. The Validity of the Instructional Reading Level.

    ERIC Educational Resources Information Center

    Powell, William R.

    Presented is a critical inquiry about the product of the informal reading inventory (IRI) and about some of the elements used in the process of determining that product. Recent developments on this topic are briefly reviewed. Questions are raised concerning what is a suitable criterion level for word recognition. The original criterion of 95…

  19. A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.

    PubMed

    Faya, Paul; Stamey, James D; Seaman, John W

    2017-01-01

    For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known DT, z, and F0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. 
An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
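    The resistance parameters estimated in the paper feed into the classical lethality calculation. As a hedged illustration (this is the standard textbook F0 integral, not the paper's Bayesian model), the sketch below accumulates equivalent sterilization minutes at the 121.1 °C reference temperature, with the z-value setting the temperature dependence:

```python
# Minimal sketch of the classical F0 lethality calculation.
# F0 = sum over time steps of 10**((T - T_ref)/z) * dt, with
# T_ref = 121.1 C and z = 10 C as the conventional defaults.
import numpy as np

def f0(temps_c, dt_min, t_ref=121.1, z=10.0):
    """Accumulated lethality (equivalent minutes at t_ref) for a
    temperature profile sampled every dt_min minutes."""
    temps = np.asarray(temps_c, dtype=float)
    return float(np.sum(10.0 ** ((temps - t_ref) / z)) * dt_min)

# 30 one-minute readings exactly at 121.1 C give F0 = 30 min
assert abs(f0([121.1] * 30, 1.0) - 30.0) < 1e-9
```

A Bayesian treatment, as in the paper, would replace the fixed z (and the D-value used to convert F0 into expected log reductions) with posterior distributions, propagating their uncertainty into the sterility-assurance conclusion.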

  20. Quantitative impurity analysis of monoclonal antibody size heterogeneity by CE-LIF: example of development and validation through a quality-by-design framework.

    PubMed

    Michels, David A; Parker, Monica; Salas-Solano, Oscar

    2012-03-01

    This paper describes the framework of quality by design applied to the development, optimization and validation of a sensitive capillary electrophoresis-sodium dodecyl sulfate (CE-SDS) assay for monitoring impurities, produced in the manufacture of therapeutic MAb products, that potentially impact drug efficacy or patient safety. Drug substance or drug product samples are derivatized with fluorogenic 3-(2-furoyl)quinoline-2-carboxaldehyde and nucleophilic cyanide before separation by CE-SDS coupled to LIF detection. Three design-of-experiments studies enabled critical labeling parameters to meet method requirements for detecting minor impurities while building precision and robustness into the assay during development. The screening design predicted optimal conditions to control labeling artifacts, while two full factorial designs demonstrated method robustness through control of the temperature and cyanide parameters within the normal operating range. Subsequent validation according to the guidelines of the International Conference on Harmonisation showed the CE-SDS/LIF assay was specific, accurate, and precise (RSD ≤ 0.8%) for relative peak distribution and linear (R > 0.997) over the range of 0.5-1.5 mg/mL, with LOD and LOQ of 10 ng/mL and 35 ng/mL, respectively. Validation confirmed the system suitability criteria used as a level of control to ensure reliable method performance. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
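    As a hedged sketch of one common ICH-Q2-style way LOD/LOQ figures like those above are derived from a calibration line (LOD = 3.3·σ/slope, LOQ = 10·σ/slope, with σ the residual standard deviation of the fit); the calibration points below are illustrative, not the paper's data:

```python
# Illustrative LOD/LOQ estimate from a linear calibration curve
# (ICH-style 3.3*sigma/slope and 10*sigma/slope formulas).
import numpy as np

conc = np.array([0.5, 0.75, 1.0, 1.25, 1.5])        # mg/mL (illustrative)
resp = np.array([51.0, 76.2, 99.8, 125.5, 149.9])   # detector response (arbitrary units)

slope, intercept = np.polyfit(conc, resp, 1)        # least-squares line
resid = resp - (slope * conc + intercept)
sigma = resid.std(ddof=2)                           # residual std dev (n - 2 dof)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.4f} mg/mL, LOQ = {loq:.4f} mg/mL")
```

By construction LOQ/LOD = 10/3.3, so the two limits always scale together; what the calibration data control is their absolute magnitude.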

  1. GPM Ground Validation: Pre to Post-Launch Era

    NASA Astrophysics Data System (ADS)

    Petersen, Walt; Skofronick-Jackson, Gail; Huffman, George

    2015-04-01

    NASA GPM Ground Validation (GV) activities have transitioned from the pre- to the post-launch era. Prior to launch, direct validation networks and associated partner institutions were identified world-wide, covering a plethora of precipitation regimes. In the U.S., direct GV efforts focused on the use of new operational products such as the NOAA Multi-Radar Multi-Sensor suite (MRMS) for TRMM validation and GPM radiometer algorithm database development. In the post-launch era, MRMS products including precipitation rate, accumulation, types and data quality are being routinely generated to facilitate statistical GV of instantaneous (e.g., Level II orbit) and merged (e.g., IMERG) GPM products. Toward assessing precipitation column impacts on product uncertainties, range-gate to pixel-level validation of both Dual-Frequency Precipitation Radar (DPR) and GPM Microwave Imager data is performed using GPM Validation Network (VN) ground radar and satellite data processing software. VN software ingests quality-controlled volumetric radar datasets and geo-matches those data to coincident DPR and radiometer Level-II data. When combined, MRMS and VN datasets enable more comprehensive interpretation of both ground- and satellite-based estimation uncertainties. To support physical validation efforts, eight (one) field campaigns have been conducted in the pre- (post-) launch era. The campaigns span regimes from northern-latitude cold-season snow to warm tropical rain. Most recently, the Integrated Precipitation and Hydrology Experiment (IPHEx) took place in the mountains of North Carolina and involved combined airborne and ground-based measurements of orographic precipitation and hydrologic processes underneath the GPM Core satellite. One more U.S. GV field campaign (OLYMPEX) is planned for late 2015 and will address cold-season precipitation estimation, processes and hydrology in the orographic and oceanic domains of western Washington State. 
Finally, continuous direct and physical validation measurements are also being conducted at the NASA Wallops Flight Facility multi-radar, gauge and disdrometer facility located in coastal Virginia. This presentation will summarize the evolution of the NASA GPM GV program from pre to post-launch eras and place focus on evaluation of year-1 post-launch GPM satellite datasets including Level II GPROF, DPR and Combined algorithms, and Level III IMERG products.

  2. The GPM Ground Validation Program: Pre to Post-Launch

    NASA Astrophysics Data System (ADS)

    Petersen, W. A.

    2014-12-01

    NASA GPM Ground Validation (GV) activities have transitioned from the pre- to the post-launch era. Prior to launch, direct validation networks and associated partner institutions were identified world-wide, covering a plethora of precipitation regimes. In the U.S., direct GV efforts focused on the use of new operational products such as the NOAA Multi-Radar Multi-Sensor suite (MRMS) for TRMM validation and GPM radiometer algorithm database development. In the post-launch era, MRMS products including precipitation rate, types and data quality are being routinely generated to facilitate statistical GV of instantaneous and merged GPM products. To assess precipitation column impacts on product uncertainties, range-gate to pixel-level validation of both Dual-Frequency Precipitation Radar (DPR) and GPM Microwave Imager data is performed using GPM Validation Network (VN) ground radar and satellite data processing software. VN software ingests quality-controlled volumetric radar datasets and geo-matches those data to coincident DPR and radiometer Level-II data. When combined, MRMS and VN datasets enable more comprehensive interpretation of ground-satellite estimation uncertainties. To support physical validation efforts, eight (one) field campaigns have been conducted in the pre- (post-) launch era. The campaigns span regimes from northern-latitude cold-season snow to warm tropical rain. Most recently, the Integrated Precipitation and Hydrology Experiment (IPHEx) took place in the mountains of North Carolina and involved combined airborne and ground-based measurements of orographic precipitation and hydrologic processes underneath the GPM Core satellite. One more U.S. GV field campaign (OLYMPEX) is planned for late 2015 and will address cold-season precipitation estimation, processes and hydrology in the orographic and oceanic domains of western Washington State. 
Finally, continuous direct and physical validation measurements are also being conducted at the NASA Wallops Flight Facility multi-radar, gauge and disdrometer facility located in coastal Virginia. This presentation will summarize the evolution of the NASA GPM GV program from pre to post-launch eras and highlight early evaluations of GPM satellite datasets.

  3. 2005 AG20/20 Annual Review

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney D.

    2005-01-01

    Topics covered include: Implementation and Validation of Sensor-Based Site-Specific Crop Management; Enhanced Management of Agricultural Perennial Systems (EMAPS) Using GIS and Remote Sensing; Validation and Application of Geospatial Information for Early Identification of Stress in Wheat; Adapting and Validating Precision Technologies for Cotton Production in the Mid-Southern United States - 2004 Progress Report; Development of a System to Automatically Geo-Rectify Images; Economics of Precision Agriculture Technologies in Cotton Production-AG 2020 Prescription Farming Automation Algorithms; Field Testing a Sensor-Based Applicator for Nitrogen and Phosphorus Application; Early Detection of Citrus Diseases Using Machine Vision and DGPS; Remote Sensing of Citrus Tree Stress Levels and Factors; Spectral-based Nitrogen Sensing for Citrus; Characterization of Tree Canopies; In-field Sensing of Shallow Water Tables and Hydromorphic Soils with an Electromagnetic Induction Profiler; Maintaining the Competitiveness of Tree Fruit Production Through Precision Agriculture; Modeling and Visualizing Terrain and Remote Sensing Data for Research and Education in Precision Agriculture; Thematic Soil Mapping and Crop-Based Strategies for Site-Specific Management; and Crop-Based Strategies for Site-Specific Management.

  4. Enabling high speed friction stir welding of aluminum tailor welded blanks

    NASA Astrophysics Data System (ADS)

    Hovanski, Yuri

    Current welding technologies for production of aluminum tailor-welded blanks (TWBs) are utilized in low-volume and niche applications, and have yet to be scaled for the high-volume vehicle market. This study targeted further weight reduction, part reduction, and cost savings by enabling tailor-welded blank technology for aluminum alloys at high volumes. While friction stir welding (FSW) has traditionally been applied at linear velocities less than one meter per minute, high-volume production applications demand the process be extended to higher velocities more amenable to cost-sensitive production environments. Unfortunately, weld parameters and performance developed and characterized at low to moderate welding velocities do not directly translate to high-speed linear friction stir welding. Therefore, in order to facilitate production of high-volume aluminum FSW components, parameters were developed with a minimum welding velocity of three meters per minute. With an emphasis on weld quality, welded blanks were evaluated for post-weld formability using a combination of numerical and experimental methods. Evaluation across scales was ultimately validated by stamping full-size production door inner panels made from dissimilar thickness aluminum tailor-welded blanks, which provided validation of the numerical and experimental analysis of laboratory-scale tests.

  5. High-Speed Friction-Stir Welding to Enable Aluminum Tailor-Welded Blanks

    NASA Astrophysics Data System (ADS)

    Hovanski, Yuri; Upadhyay, Piyush; Carsley, John; Luzanski, Tom; Carlson, Blair; Eisenmenger, Mark; Soulami, Ayoub; Marshall, Dustin; Landino, Brandon; Hartfield-Wunsch, Susan

    2015-05-01

    Current welding technologies for production of aluminum tailor-welded blanks (TWBs) are utilized in low-volume and niche applications, and they have yet to be scaled for the high-volume vehicle market. This study targeted further weight reduction, part reduction, and cost savings by enabling tailor-welded blank technology for aluminum alloys at high volumes. While friction-stir welding (FSW) has been traditionally applied at linear velocities less than 1 m/min, high-volume production applications demand the process be extended to higher velocities more amenable to cost-sensitive production environments. Unfortunately, weld parameters and performance developed and characterized at low-to-moderate welding velocities do not directly translate to high-speed linear FSW. Therefore, to facilitate production of high-volume aluminum FSW components, parameters were developed with a minimum welding velocity of 3 m/min. With an emphasis on weld quality, welded blanks were evaluated for postweld formability using a combination of numerical and experimental methods. An evaluation across scales was ultimately validated by stamping full-size production door inner panels made from dissimilar thickness aluminum TWBs, which provided validation of the numerical and experimental analysis of laboratory-scale tests.

  6. The ESA DUE GlobVapour Project

    NASA Astrophysics Data System (ADS)

    Schröder, M.; ESA Due Globvapour Project Team

    2010-12-01

    The European Space Agency (ESA) Data User Element (DUE) project series aims at bridging the gap between research projects and the sustainable provision of Earth Observation (EO) climate data products at an information level that fully responds to the operational needs of user communities. The ultimate objective of GlobVapour is to provide long-term coherent water vapour data sets exploiting the synergistic capabilities of different EO missions, aiming at improved accuracies and at temporal and spatial sampling better than those provided by the single sources. The project seeks to utilize the increasing potential of the synergistic capabilities of past, existing and upcoming satellite missions (ERS-1 and -2, ENVISAT, METOP, MSG, as well as relevant non-European missions and in-situ data) in order to meet the increasing need for coherent long-term water vapour datasets required by the scientific community. GlobVapour develops, validates and applies novel water vapour climate data sets derived from various sensors. More specifically, the primary objectives of the GlobVapour project are: 1) the development of multi-annual global water vapour data sets, inclusive of error estimates, based on carefully calibrated and inter-calibrated radiances; 2) the validation of the water vapour products against ground-based, airborne and other satellite-based measurements; 3) the provision of an assessment of the quality of different IASI water vapour profile algorithms developed by the project partners and other groups; 4) the provision of a complete processing system that can further strengthen operational production of the developed products; 5) a demonstration of the use of the products in the field of climate modelling, including applying alternative ways of climate model validation using forward radiation operators; and 6) the promotion of the strategy of data set construction, and of the data sets themselves, to the global research and operational community. 
The ultimate goal of the DUE GlobVapour project is the preparation of recognised data sets and successful concepts that can be used to ensure a sustainable provision of such data from operational entities such as the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility (SAF) network. Key scientific questions which GlobVapour data can contribute to are climate monitoring and attribution, assimilation of different water vapour datasets to form a consistent analysis, model process studies, evaluation of in-situ water vapour measurements, validation of climate models and reanalyses, assessing the relationship between water vapour and dynamics, research and development for operational applications and input to atmospheric reanalyses. This presentation will introduce the GlobVapour project and concept as well as the products which are the global total column water vapour (TCWV) time series from a combination of MERIS and SSM/I as well as TCWV data sets derived from the GOME/SCIAMACHY/GOME-2 and the (A)ATSR instruments. A shorter time series of water vapour profiles will be derived from a combination of IASI and SEVIRI. The retrieval and combination methods as well as first validation results will also be discussed.

  7. Development, optimization and validation of a rapid colorimetric microplate bioassay for neomycin sulfate in pharmaceutical drug products.

    PubMed

    Francisco, Fabiane Lacerda; Saviano, Alessandro Morais; Pinto, Terezinha de Jesus Andreoli; Lourenço, Felipe Rebello

    2014-08-01

    Microbiological assays have been used to evaluate antimicrobial activity since the discovery of the first antibiotics. Despite their limitations, microbiological assays are widely employed to determine the antibiotic potency of pharmaceutical dosage forms, since they provide a measure of biological activity. The aim of this work was to develop, optimize and validate a rapid colorimetric microplate bioassay for the potency of neomycin in pharmaceutical drug products. Factorial and response surface methodologies were used in the development and optimization of the choice of microorganism, the culture medium composition, the amount of inoculum, the triphenyltetrazolium chloride (TTC) concentration and the neomycin concentration. The optimized bioassay method was validated by the assessment of linearity (range 3.0 to 5.0 μg/mL; r = 0.998 and 0.994 for the standard and sample curves, respectively), precision (relative standard deviation (RSD) of 2.8% and 4.0% for repeatability and intermediate precision, respectively), accuracy (mean recovery = 100.2%) and robustness. Statistical analysis showed equivalency between the agar diffusion microbiological assay and the rapid colorimetric microplate bioassay. In addition, the microplate bioassay had advantages concerning the sensitivity of response, the time of incubation, and the amount of culture medium and solutions required. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Real-time quality monitoring in debutanizer column with regression tree and ANFIS

    NASA Astrophysics Data System (ADS)

    Siddharth, Kumar; Pathak, Amey; Pani, Ajaya Kumar

    2018-05-01

    A debutanizer column is an integral part of any petroleum refinery. Online composition monitoring of debutanizer column outlet streams is highly desirable in order to maximize the production of liquefied petroleum gas. In this article, data-driven models for the debutanizer column are developed for real-time composition monitoring. The dataset used has seven process variables as inputs, and the output is the butane concentration in the debutanizer column bottom product. The input-output dataset is divided equally into a training (calibration) set and a validation (testing) set. The training set data were used to develop fuzzy inference, adaptive neuro-fuzzy inference system (ANFIS) and regression tree models for the debutanizer column. The accuracy of the developed models was evaluated by simulating the models with the validation dataset. It is observed that the ANFIS model has better estimation accuracy than the other models developed in this work and than many data-driven models proposed so far in the literature for the debutanizer column.
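    As a hedged, self-contained sketch of the regression-tree idea behind such soft sensors (reduced to a depth-1 tree for brevity, and run on synthetic stand-in data rather than the actual debutanizer dataset):

```python
# Sketch of a regression-tree soft sensor: a depth-1 tree ("stump")
# fitted on a calibration half and scored on a validation half,
# mirroring the equal train/validation split described in the article.
import numpy as np

def fit_stump(X, y):
    """Choose the (feature, threshold) split that minimizes the summed
    squared error around the two leaf means."""
    best = None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue
            sse = (((y[left] - y[left].mean()) ** 2).sum()
                   + ((y[~left] - y[~left].mean()) ** 2).sum())
            if best is None or sse < best[0]:
                best = (sse, j, t, y[left].mean(), y[~left].mean())
    _, j, t, m_left, m_right = best
    return j, t, m_left, m_right

def predict_stump(stump, X):
    j, t, m_left, m_right = stump
    return np.where(X[:, j] <= t, m_left, m_right)

# Synthetic stand-in: 7 "process variables"; the target (a proxy for
# bottom-product butane concentration) is driven mainly by variable 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 7))
y = np.where(X[:, 2] <= 0.0, 1.0, 3.0) + 0.05 * rng.normal(size=400)

stump = fit_stump(X[:200], y[:200])              # calibration half
pred = predict_stump(stump, X[200:])             # validation half
rmse = float(np.sqrt(np.mean((pred - y[200:]) ** 2)))
print(f"split on variable {stump[0]}, validation RMSE: {rmse:.3f}")
```

A full regression tree recurses this split inside each leaf; ANFIS instead blends fuzzy rules with neural-network training, which is why it can track smoother nonlinearities than a piecewise-constant tree.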

  9. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  10. Quality assurance after process changes of the production of a therapeutic antibody.

    PubMed

    Brass, J M; Krummen, K; Moll-Kaufmann, C

    1996-12-01

    Process development for the production of a therapeutic humanised antibody is a very complex operation. It involves recombinant genetics, verification of a strong expression system, gene amplification, characterisation of a stable host cell expression system, optimisation and design of the mammalian cell culture fermentation system, and development of an efficient recovery process resulting in high yields and product quality. Rapid progress in the field and the wish of some pharmaceutical companies to outsource their production are the driving forces for process changes relatively late in the development phase. This literature survey is aimed at identifying the limits of acceptable process changes in scale-up of the fermentation and downstream processing of biopharmaceuticals, and at defining the demands on production validation to prove the equivalency and identity of the isolated, purified therapeutic antibody.

  11. Ground validation of DPR precipitation rate over Italy using H-SAF validation methodology

    NASA Astrophysics Data System (ADS)

    Puca, Silvia; Petracca, Marco; Sebastianelli, Stefano; Vulpiani, Gianfranco

    2017-04-01

    The H-SAF project (Satellite Application Facility on Support to Operational Hydrology and Water Management, funded by EUMETSAT) is aimed at retrieving key hydrological parameters such as precipitation, soil moisture and snow cover. Within the H-SAF consortium, the Product Precipitation Validation Group (PPVG) evaluates the accuracy of instantaneous and accumulated precipitation products with respect to ground radar and rain gauge data, adopting the same methodology (using a Unique Common Code) throughout Europe. The adopted validation methodology can be summarized in the following steps: (1) ground data (radar and rain gauge) quality control; (2) spatial interpolation of rain gauge measurements; (3) up-scaling of radar data to the satellite native grid; (4) temporal comparison of satellite and ground-based precipitation products; and (5) production and evaluation of continuous and multi-categorical statistical scores for long time series and case studies. The statistical scores are evaluated on the satellite product's native grid. With the advent of the GPM era, starting in March 2014, many new global precipitation products are available, and the validation methodology developed in H-SAF is easily applicable to them. In this work, we have validated instantaneous precipitation data estimated from the DPR (Dual-frequency Precipitation Radar) instrument onboard the GPM-CO (Global Precipitation Measurement Core Observatory) satellite. In particular, we have analyzed the near-surface and estimated precipitation fields provided in the Level-2A product for 3 different scans (NS, MS and HS). The Italian radar mosaic managed by the National Department of Civil Protection, available operationally every 10 minutes, is used as the ground reference. The results obtained highlight the capability of the DPR to identify precipitation areas properly, with higher accuracy in estimating stratiform precipitation (especially for the HS scan). 
    An underestimation of the rainfall rate is observed in the retrieval of some convective case studies. The analysis of several (stratiform and convective) events that occurred in the Mediterranean area over the last two years highlights the capability of the DPR to observe interesting features of the precipitating clouds and to estimate the ground rain intensity.
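    As a hedged illustration of the multi-categorical scores produced in step (5) of the methodology above, a minimal rain/no-rain contingency-table computation (POD, FAR, CSI); the detection arrays are toy data, not H-SAF or DPR products:

```python
# Dichotomous (rain / no-rain) categorical validation scores from a
# 2x2 contingency table between satellite and ground-reference fields.
import numpy as np

def categorical_scores(sat_rain, gnd_rain):
    """Return (POD, FAR, CSI) given boolean rain detections."""
    sat = np.asarray(sat_rain, dtype=bool)
    gnd = np.asarray(gnd_rain, dtype=bool)
    hits = np.sum(sat & gnd)            # both detect rain
    misses = np.sum(~sat & gnd)         # ground rain missed by satellite
    false_alarms = np.sum(sat & ~gnd)   # satellite rain not on the ground
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index
    return pod, far, csi

sat = [1, 1, 0, 1, 0, 0, 1, 0]   # toy satellite detections
gnd = [1, 0, 0, 1, 1, 0, 1, 0]   # toy ground-reference detections
pod, far, csi = categorical_scores(sat, gnd)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
```

Continuous scores (bias, RMSE, correlation) are computed analogously on the matched rain-rate pairs rather than on the binary detections.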

  12. Development and Validation of Discriminating and Biorelevant Dissolution Test for Lornoxicam Tablets

    PubMed Central

    Anumolu, P. D.; Sunitha, G.; Bindu, S. Hima; Satheshbabu, P. R.; Subrahmanyam, C. V. S.

    2015-01-01

    The establishment of a biorelevant and discriminating dissolution procedure for drug products with limited water solubility is a useful technique for qualitative forecasting of the in vivo behavior of formulations. It also characterizes drug product performance in pharmaceutical development. Lornoxicam, a BCS class-II drug, is a nonsteroidal antiinflammatory drug of the oxicam class with no official dissolution media available in the literature. The objective of the present work was to develop and validate a discriminating and biorelevant dissolution test for lornoxicam tablet dosage forms. To quantify lornoxicam in dissolution samples, a UV spectrophotometric method was developed using 0.01 M sodium hydroxide solution as solvent at λmax 376 nm. After evaluation of the saturation solubility, dissolution, sink conditions and stability of lornoxicam bulk drug in different pH solutions and biorelevant media, the dissolution method was optimized using a USP paddle-type apparatus at 50 rpm rotation speed with 500 ml simulated intestinal fluid as the discriminating and biorelevant dissolution medium. The similarity factor (f2) was investigated for formulations with changes in composition and manufacturing variations; the values revealed that the dissolution method has discriminating power, and the method was validated as per standard guidelines. The proposed dissolution method can be effectively applied for routine quality control in vitro dissolution studies of lornoxicam in tablets and is helpful to pharmacopoeias. PMID:26180277

  13. Development and Validation of Discriminating and Biorelevant Dissolution Test for Lornoxicam Tablets.

    PubMed

    Anumolu, P D; Sunitha, G; Bindu, S Hima; Satheshbabu, P R; Subrahmanyam, C V S

    2015-01-01

    The establishment of a biorelevant and discriminating dissolution procedure for drug products with limited water solubility is a useful technique for qualitative forecasting of the in vivo behavior of formulations. It also characterizes drug product performance in pharmaceutical development. Lornoxicam, a BCS class-II drug, is a nonsteroidal antiinflammatory drug of the oxicam class with no official dissolution media available in the literature. The objective of the present work was to develop and validate a discriminating and biorelevant dissolution test for lornoxicam tablet dosage forms. To quantify lornoxicam in dissolution samples, a UV spectrophotometric method was developed using 0.01 M sodium hydroxide solution as solvent at λmax 376 nm. After evaluation of the saturation solubility, dissolution, sink conditions and stability of lornoxicam bulk drug in different pH solutions and biorelevant media, the dissolution method was optimized using a USP paddle-type apparatus at 50 rpm rotation speed with 500 ml simulated intestinal fluid as the discriminating and biorelevant dissolution medium. The similarity factor (f2) was investigated for formulations with changes in composition and manufacturing variations; the values revealed that the dissolution method has discriminating power, and the method was validated as per standard guidelines. The proposed dissolution method can be effectively applied for routine quality control in vitro dissolution studies of lornoxicam in tablets and is helpful to pharmacopoeias.
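    As a hedged sketch of the similarity factor f2 used in the study (f2 = 50·log10{100/√(1 + (1/n)Σ(R_t − T_t)²)}, where R_t and T_t are percent dissolved for reference and test at time point t, and values ≥ 50 are conventionally read as similar profiles); the dissolution percentages below are illustrative, not the paper's data:

```python
# Dissolution-profile similarity factor f2 for reference vs. test profiles.
import math

def f2(ref, test):
    """f2 = 50 * log10(100 / sqrt(1 + mean squared difference))."""
    if len(ref) != len(test):
        raise ValueError("profiles must share the same time points")
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

ref = [20.0, 45.0, 70.0, 90.0]           # % dissolved (illustrative)
shifted = [30.0, 55.0, 80.0, 100.0]      # test profile offset by 10 points
print(f"f2 (identical) = {f2(ref, ref):.1f}")
print(f"f2 (shifted)   = {f2(ref, shifted):.1f}")
```

Identical profiles give f2 = 100, and a uniform 10-point offset lands almost exactly at the 50 boundary, which is why the 10%-average-difference rule of thumb is often quoted alongside f2.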

  14. Developing Guided Worksheet for Cognitive Apprenticeship Approach in teaching Formal Definition of The Limit of A Function

    NASA Astrophysics Data System (ADS)

    Oktaviyanthi, R.; Dahlan, J. A.

    2018-04-01

    This study aims to develop student worksheets that correspond to the Cognitive Apprenticeship learning approach. The main subject of the worksheet is Functions and Limits, with the subtopic Continuity and Limits of Functions. Two learning-achievement indicators are developed in the student worksheet: (1) the student can explain the concept of limit by using the formal definition of the limit, and (2) the student can evaluate the limit of a function using epsilon and delta. The type of research used is development research following Plomp's product development model. The research flow runs from literature review, observation, and interviews to worksheet design, expert validity testing, and a limited trial with first-year students in the 2016-2017 academic year at Universitas Serang Raya, STKIP Pelita Pratama Al-Azhar Serang, and Universitas Mathla’ul Anwar Pandeglang. The product development results show that the student worksheets corresponding to the Cognitive Apprenticeship learning approach are valid and reliable.
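    The epsilon-delta skill targeted by indicator (2) can be illustrated numerically; a small sketch (not from the worksheet itself) that samples a delta-neighborhood to check a candidate delta for a simple linear function:

```python
# For f(x) = 2x + 1 at a = 1 with limit L = 3, |f(x) - L| = 2|x - 1|,
# so delta = epsilon / 2 satisfies the formal definition. This sampler
# cannot prove the limit, but it illustrates the quantifier structure:
# every sampled x with 0 < |x - a| < delta must give |f(x) - L| < epsilon.
def check_limit(f, a, L, epsilon, delta, samples=1000):
    for i in range(1, samples + 1):
        for sign in (-1, 1):
            x = a + sign * delta * i / (samples + 1)  # 0 < |x - a| < delta
            if abs(f(x) - L) >= epsilon:
                return False
    return True

eps = 0.01
print(check_limit(lambda x: 2 * x + 1, 1.0, 3.0, eps, eps / 2))  # -> True
print(check_limit(lambda x: 2 * x + 1, 1.0, 3.0, eps, 0.1))     # -> False
```

    The failing second call shows that not every delta works, which is the point students must argue when producing a valid delta for a given epsilon.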

  15. Learning Crude Oil by Using Scientific Literacy Comics

    NASA Astrophysics Data System (ADS)

    Aisyah, R.; Zakiyah, I. A.; Farida, I.; Ramdhani, M. A.

    2017-09-01

    A study was conducted to create a crude-oil learning medium in the form of a scientific literacy-oriented comic. The research included several phases: concept analysis, transformation of the material into a concept map, and identification of indicators and aspects of scientific literacy. The product was made from flowcharts and storyboards that had been validated by expert validators. The product has the following characteristics: 1) it develops indicators and aspects of scientific literacy; 2) it presents the material as a story in the science-fiction genre; 3) its characters embody levels of scientific literacy; and 4) it offers optional storylines, depending on the questions posed to develop scientific literacy in terms of content, context, process, and attitude. Based on a feasibility test, the product is feasible for use as a learning medium. An expanded experiment is suggested to examine its effectiveness in improving scientific literacy and in growing students’ awareness of the energy crisis and the environmental impacts of fossil fuel use.

  16. Development and validation of an Opioid Attractiveness Scale: a novel measure of the attractiveness of opioid products to potential abusers

    PubMed Central

    Butler, Stephen F; Benoit, Christine; Budman, Simon H; Fernandez, Kathrine C; McCormick, Cynthia; Venuti, Synne Wing; Katz, Nathaniel

    2006-01-01

    Background The growing trends in opioid abuse, assessment of the abuse liability of prescription opioid products, and growing efforts by the pharmaceutical industry to develop 'abuse-resistant' formulations highlight a need to understand the features that make one product more 'attractive' than another to potential abusers. We developed a scale to measure the 'attractiveness' of prescription opioids to potential abusers, and used the scale to measure the relative attractiveness of 14 opioid analgesic products. Methods First, the concept of attractiveness was empirically defined with a group of prescription opioid abusers and experts in opioid abuse using a process called Concept Mapping. Abuse liability consisted of two components: factors intrinsic to the drug formulation (e.g., speed of onset, duration) and factors extrinsic to drug formulation (e.g., availability, availability of alternatives, cost). A 17-item Opioid Attractiveness Scale (OAS) was constructed, focusing on factors intrinsic to the drug product. Results A total of 144 individuals participated in tests of validity and reliability. Internal consistency was excellent (Cronbach's α = 0.85–0.94). Drug rankings based on OAS scores achieved good inter-rater agreement (Kendall's W 0.37, p < 0.001). Agreement on drug OAS scores between the developmental sample and a confirmation sample was good (IntraClass Correlations [ICC] of 0.65–0.69). Global ratings of overall attractiveness of the 14 selected opioid products by substance abuse counselors corresponded with the rankings based on OAS ratings of the abuser group. Finally, substance abuse counselors completed the OAS, yielding a high level of correspondence with ratings by the abuser group (ICC = 0.83, p = 0.002). The OAS differentiated attractiveness among 14 selected pharmaceutical opioid products. OxyContin, Dilaudid, and Percocet were ranked highest (most attractive); Talwin NX and Duragesic were ranked lowest (least attractive). 
Conclusion An initial examination of the psychometric properties of the OAS suggests that it is a valid and reliable scale. The OAS may be useful in providing important guidance on product features that are attractive to potential abusers. PMID:16457713

  17. Development and validation of an Opioid Attractiveness Scale: a novel measure of the attractiveness of opioid products to potential abusers.

    PubMed

    Butler, Stephen F; Benoit, Christine; Budman, Simon H; Fernandez, Kathrine C; McCormick, Cynthia; Venuti, Synne Wing; Katz, Nathaniel

    2006-02-02

    The growing trends in opioid abuse, assessment of the abuse liability of prescription opioid products, and growing efforts by the pharmaceutical industry to develop 'abuse-resistant' formulations highlight a need to understand the features that make one product more 'attractive' than another to potential abusers. We developed a scale to measure the 'attractiveness' of prescription opioids to potential abusers, and used the scale to measure the relative attractiveness of 14 opioid analgesic products. First, the concept of attractiveness was empirically defined with a group of prescription opioid abusers and experts in opioid abuse using a process called Concept Mapping. Abuse liability consisted of two components: factors intrinsic to the drug formulation (e.g., speed of onset, duration) and factors extrinsic to drug formulation (e.g., availability, availability of alternatives, cost). A 17-item Opioid Attractiveness Scale (OAS) was constructed, focusing on factors intrinsic to the drug product. A total of 144 individuals participated in tests of validity and reliability. Internal consistency was excellent (Cronbach's alpha = 0.85-0.94). Drug rankings based on OAS scores achieved good inter-rater agreement (Kendall's W 0.37, p < 0.001). Agreement on drug OAS scores between the developmental sample and a confirmation sample was good (IntraClass Correlations [ICC] of 0.65-0.69). Global ratings of overall attractiveness of the 14 selected opioid products by substance abuse counselors corresponded with the rankings based on OAS ratings of the abuser group. Finally, substance abuse counselors completed the OAS, yielding a high level of correspondence with ratings by the abuser group (ICC = 0.83, p = 0.002). The OAS differentiated attractiveness among 14 selected pharmaceutical opioid products. OxyContin, Dilaudid, and Percocet were ranked highest (most attractive); Talwin NX and Duragesic were ranked lowest (least attractive). 
An initial examination of the psychometric properties of the OAS suggests that it is a valid and reliable scale. The OAS may be useful in providing important guidance on product features that are attractive to potential abusers.
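    The internal-consistency statistic reported above, Cronbach's alpha, can be computed directly from item scores; a minimal sketch with toy data (not the OAS dataset):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from per-item score lists, all in the same
    respondent order: items[j][i] is respondent i's score on item j.
    Uses sample variance (n - 1 denominator)."""
    k, n = len(items), len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

# Two strongly related toy items yield a high alpha.
print(round(cronbach_alpha([[1, 2, 3, 4], [2, 4, 6, 8]]), 3))  # -> 0.889
```

    Values in the study's reported range (0.85-0.94) indicate that the 17 OAS items measure a common underlying construct.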

  18. Development and validation of a questionnaire to evaluate how a cosmetic product for oily skin is able to improve well-being in women.

    PubMed

    Segot-Chicq, E; Compan-Zaouati, D; Wolkenstein, P; Consoli, S; Rodary, C; Delvigne, V; Guillou, V; Poli, F

    2007-10-01

    Skin diseases are known to negatively affect self-image and to have detrimental psychosocial effects. Oily skin is a cosmetic skin problem that women often describe as 'invalidating'. To develop and validate a questionnaire to assess the psychological and psychosocial effects of oily skin condition in women and the outcome of a targeted cosmetic skincare treatment. We developed and validated a concise 18-item questionnaire [oily skin self-image questionnaire (OSSIQ)] to assess perception, behavioural, and emotional consequences associated with oily skin condition. The questionnaire was then used to assess the effects of a skincare treatment for oily skin and compare them with sebum level measurements. The 18-item questionnaire clearly distinguished the oily skin group from the control group. Responsiveness, reliability, and construct validity showed satisfactory performance. The questionnaire provided a relevant assessment of the psychological benefits associated with the skincare programme. The OSSIQ is a valid tool that can be used to monitor the benefits of cosmetic skincare treatments.

  19. Estimating added sugars in US consumer packaged goods: An application to beverages in 2007-08.

    PubMed

    Ng, Shu Wen; Bricker, Gregory; Li, Kuo-Ping; Yoon, Emily Ford; Kang, Jiyoung; Westrich, Brian

    2015-11-01

    This study developed a method to estimate added sugar content in consumer packaged goods (CPG) that can keep pace with the dynamic food system. A team including registered dietitians, a food scientist and programmers developed a batch-mode ingredient matching and linear programming (LP) approach to estimate the amount of each ingredient needed in a given product to produce a nutrient profile similar to that reported on its nutrition facts label (NFL). Added sugar content was estimated for 7021 products available in 2007-08 that contain sugar from ten beverage categories. Of these, flavored waters had the lowest added sugar amounts (4.3 g/100 g), while sweetened dairy and dairy alternative beverages had the smallest percentage of added sugars (65.6% of Total Sugars; 33.8% of Calories). Estimation validity was determined by comparing LP estimated values to NFL values, as well as in a small validation study. LP estimates appeared reasonable compared to NFL values for calories, carbohydrates and total sugars, and performed well in the validation test; however, further work is needed to obtain more definitive conclusions on the accuracy of added sugar estimates in CPGs. As nutrition labeling regulations evolve, this approach can be adapted to test for potential product-specific, category-level, and population-level implications.
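    The label-matching idea behind the LP approach can be shown in miniature: with as many independent label nutrients as unknown ingredients, the problem reduces to a linear system. A toy two-ingredient sketch with hypothetical per-gram nutrient values (not the study's ingredient database or solver):

```python
# Toy illustration (not the study's code). Nutrients per gram of
# ingredient (hypothetical values):
#                       sugars  carbs
#   sucrose:            1.00    1.00
#   juice concentrate:  0.10    0.12
# Label per 100 g of product: total sugars 10.9 g, carbohydrates 11.08 g.
# The real method solves a larger LP with ingredient-order constraints.
def solve_2x2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ [x, y] = [e, f] by Cramer's rule."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

sucrose_g, juice_g = solve_2x2(1.00, 0.10, 1.00, 0.12, 10.9, 11.08)
added_sugar = sucrose_g * 1.00  # here only sucrose counts as added sugar
print(round(sucrose_g, 2), round(juice_g, 2), round(added_sugar, 2))
# -> 10.0 9.0 10.0
```

    Real products have far more ingredients than label facts, which is why the study needed an LP with additional constraints (such as ingredient-list order) rather than a square linear solve.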

  20. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  1. Edible moisture barriers: how to assess their potential and limits in food products shelf-life extension?

    PubMed

    Bourlieu, C; Guillard, V; Vallès-Pamiès, B; Guilbert, S; Gontard, N

    2009-05-01

    Control of moisture transfer inside composite food products or between a food and its environment remains a major challenge in food preservation. A wide range of film-forming compounds is now available, facilitating the tailoring of moisture barriers with optimized functional properties. Despite this potential, a realistic assessment of film or coating efficacy is still critical. Due to nonlinear water sorption isotherms, water-dependent diffusivities, and variations of physical state, modelling transport phenomena through edible barriers is complex. Water vapor permeability can hardly be considered an inherent property of films and only gives a relative indication of barrier efficacy. The formal or mechanistic models reported in the literature that describe the influence of testing conditions on the barrier properties of edible films are reviewed and discussed. Most of these models have been validated over only a narrow range of conditions. Conversely, a few original predictive models based on Fick's second law have been developed to assess the shelf-life extension of food products that include barriers. These models, which rest on complex and realistic hypotheses, have been validated in various model foods. The development of nondestructive methods of moisture content measurement should speed up model validation and allow a better understanding of moisture transfer through edible films.

  2. Estimating added sugars in US consumer packaged goods: An application to beverages in 2007–08

    PubMed Central

    Ng, Shu Wen; Bricker, Gregory; Li, Kuo-ping; Yoon, Emily Ford; Kang, Jiyoung; Westrich, Brian

    2015-01-01

    This study developed a method to estimate added sugar content in consumer packaged goods (CPG) that can keep pace with the dynamic food system. A team including registered dietitians, a food scientist and programmers developed a batch-mode ingredient matching and linear programming (LP) approach to estimate the amount of each ingredient needed in a given product to produce a nutrient profile similar to that reported on its nutrition facts label (NFL). Added sugar content was estimated for 7021 products available in 2007-08 that contain sugar from ten beverage categories. Of these, flavored waters had the lowest added sugar amounts (4.3 g/100 g), while sweetened dairy and dairy alternative beverages had the smallest percentage of added sugars (65.6% of Total Sugars; 33.8% of Calories). Estimation validity was determined by comparing LP estimated values to NFL values, as well as in a small validation study. LP estimates appeared reasonable compared to NFL values for calories, carbohydrates and total sugars, and performed well in the validation test; however, further work is needed to obtain more definitive conclusions on the accuracy of added sugar estimates in CPGs. As nutrition labeling regulations evolve, this approach can be adapted to test for potential product-specific, category-level, and population-level implications. PMID:26273127

  3. A Stability-Indicating HPLC-DAD Method for Determination of Stiripentol: Development, Validation, Kinetics, Structure Elucidation and Application to Commercial Dosage Form

    PubMed Central

    Darwish, Hany W.; Abdelhameed, Ali S.; Bakheit, Ahmed H.; Khalil, Nasr Y.; Al-Majed, Abdulrahman A.

    2014-01-01

    A rapid, simple, sensitive, and accurate isocratic reversed-phase stability-indicating high performance liquid chromatography method has been developed and validated for the determination of stiripentol and its degradation product in its bulk form and pharmaceutical dosage form. Chromatographic separation was achieved on a Symmetry C18 column and quantification was achieved using photodiode array detector (DAD). The method was validated in accordance with the ICH requirements showing specificity, linearity (r 2 = 0.9996, range of 1–25 μg/mL), precision (relative standard deviation lower than 2%), accuracy (mean recovery 100.08 ± 1.73), limits of detection and quantitation (LOD = 0.024 and LOQ = 0.081 μg/mL), and robustness. Stiripentol was subjected to various stress conditions and it has shown marked stability under alkaline hydrolytic stress conditions, thermal, oxidative, and photolytic conditions. Stiripentol degraded only under acidic conditions, forming a single degradation product which was well resolved from the pure drug with significantly different retention time values. This degradation product was characterized by 1H-NMR and 13C-NMR spectroscopy as well as ion trap mass spectrometry. The results demonstrated that the method would have a great value when applied in quality control and stability studies for stiripentol. PMID:25371844

  4. A review of health-related workplace productivity loss instruments.

    PubMed

    Lofland, Jennifer H; Pizzi, Laura; Frick, Kevin D

    2004-01-01

    The objective of this review was to identify health-related workplace productivity loss survey instruments, with particular emphasis on those that capture a metric suitable for direct translation into a monetary figure. A literature search using the Medline, HealthSTAR, PsycINFO and Econlit databases between 1966 and 2002, and a telephone-administered survey of business leaders and researchers, were conducted to identify health-related workplace productivity measurement survey instruments. This review was conducted from the societal perspective. Each identified instrument was reviewed for the following: (i) reliability; (ii) content validity; (iii) construct validity; (iv) criterion validity; (v) productivity metric(s); (vi) instrument scoring technique; (vii) suitability for direct translation into a monetary figure; (viii) number of items; (ix) mode(s) of administration; and (x) disease state(s) in which it had been tested. Reliability and validity testing have been performed for 8 of the 11 identified surveys. Of the 11 instruments identified, six captured metrics that are suitable for direct translation into a monetary figure. Of those six, one instrument measured absenteeism, while the other five measured both absenteeism and presenteeism. All of the identified instruments except one were available as paper, self-administered questionnaires, and many were available in languages other than English. This review provides a comprehensive overview of the published, peer-reviewed survey instruments available to measure health-related workplace productivity loss. As the field of productivity measurement matures, tools may be developed that will allow researchers to accurately calculate lost productivity costs when performing cost-effectiveness and cost-benefit analyses.
Using data captured by these instruments, society and healthcare decision makers will be able to make better informed decisions concerning the value of the medications, disease management and health promotion programmes that individuals receive.

  5. Nonclinical dose formulation analysis method validation and sample analysis.

    PubMed

    Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D

    2010-12-01

    Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and to determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters that overlap between bioanalysis and formulation analysis validations include recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence between bioanalytical and drug product validations typically centers on the acceptance criteria used. As dose formulation samples are not true "unknowns", the concept of quality control samples covering the entire range of the standard curve to establish confidence in the data generated from the "unknown" study samples may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.
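    Two of the overlapping validation parameters, accuracy and precision, are commonly summarized as percent of target and percent relative standard deviation; a minimal sketch with illustrative replicate data, implying no particular acceptance limits:

```python
import statistics

# Illustrative only: replicate concentration results for one formulation
# level, with a nominal (target) concentration of 10 mg/mL. The
# acceptance criteria applied to such numbers are exactly the point of
# divergence discussed in the abstract.
def accuracy_pct(measured, target):
    """Mean result as a percentage of the target concentration."""
    return statistics.mean(measured) / target * 100

def rsd_pct(measured):
    """Precision as percent relative standard deviation (sample stdev)."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100

replicates = [9.8, 10.1, 9.9, 10.2, 10.0]  # mg/mL
print(round(accuracy_pct(replicates, 10.0), 1), round(rsd_pct(replicates), 2))
# -> 100.0 1.58
```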

  6. Comprehensive Calibration and Validation Site for Information Remote Sensing

    NASA Astrophysics Data System (ADS)

    Li, C. R.; Tang, L. L.; Ma, L. L.; Zhou, Y. S.; Gao, C. X.; Wang, N.; Li, X. H.; Wang, X. H.; Zhu, X. H.

    2015-04-01

    As an integral part of information technology, Remote Sensing (RS) is strongly required to provide precise and accurate information products to serve industry, academia, and the public in the information-economy era. To meet the need for high-quality RS products, building a fully functional and advanced calibration system, including measuring instruments, measuring approaches, and target sites, is extremely important. Supported by MOST of China through the national plan, great progress has been made in constructing a comprehensive calibration and validation (Cal&Val) site at Baotou, 600 km west of Beijing, which integrates most functions of RS sensor aviation testing, EO satellite on-orbit calibration and performance assessment, and RS product validation. The site is equipped with various artificial standard targets, both portable and permanent, supporting long-term calibration and validation. A number of finely designed ground measuring instruments and airborne standard sensors have been developed to realize high-accuracy stepwise validation, an approach that avoids or reduces uncertainties caused by nonsynchronized measurements. As part of its contribution to the worldwide Cal&Val effort coordinated by CEOS-WGCV, the Baotou site supports the Radiometric Calibration Network of Automated Instruments (RadCalNet), which aims to provide a demonstrated global standard automated radiometric calibration service in cooperation with ESA, NASA, CNES and NPL. Furthermore, several Cal&Val campaigns have been performed during the past years to calibrate and validate spaceborne/airborne optical and SAR sensors, and the results of some typical demonstrations are discussed in this study.

  7. Measuring production loss due to health and work environment problems: construct validity and implications.

    PubMed

    Karlsson, Malin Lohela; Bergström, Gunnar; Björklund, Christina; Hagberg, Jan; Jensen, Irene

    2013-12-01

    The aim was to validate two measures of production loss, health-related and work environment-related production loss, concerning their associations with health status and work environment factors. Validity was assessed by evaluating construct validity. Health-related and work environment-related problems (or factors) were included in separate analyses and evaluated with regard to the significant difference in the proportion of explained variation (R) of production loss. Health-related production loss was not found to fulfill the criteria for convergent validity in this study; however, the measure of work environment-related production loss did fulfill the criteria that were set up. The measure of work environment-related production loss can be used to screen for production loss due to work environment problems, as well as an outcome measure when evaluating the effects of organizational interventions.

  8. Measuring the emotional climate of an organization.

    PubMed

    Yurtsever, Gülçimen; De Rivera, Joseph

    2010-04-01

    Emotional climate has gained interest in the organizational climate literature. However, few studies have concentrated on adequately measuring the emotional climate of organizations. In this study, a reliable and valid scale was developed to measure the most important aspects of emotional climate in different organizations. This study presents evidence of reliability and validity, across four separate studies, for 28 items constructed to measure emotional climate in an organization. The data were obtained by self-administered questionnaires from working people in four different organizations. The findings indicate that three factors (Trust, Hope, and Security) underlie the 28-item scale. Validation data also included correlations with duration of employment. Criterion validity was further assessed by comparing mean scores in organizations with differing productivity; the organization with more productive members had a significantly higher mean score on emotional climate and its subscales. The generalizability of the results to private businesses was also assessed.

  9. Sensitivity Study for Sensor Optical and Electric Crosstalk Based on Spectral Measurements: An Application to Developmental Sensors Using Heritage Sensors Such As MODIS

    NASA Technical Reports Server (NTRS)

    Butler, James J.; Oudrari, Hassan; Xiong, Sanxiong; Che, Nianzeng; Xiong, Xiaoxiong

    2007-01-01

    The process of developing new sensors for space flight frequently builds upon the designs of, and experience with, existing heritage space flight sensors. In the development and testing of new sensors, problems are often encountered that risk seriously impacting the successful retrieval of geophysical products. This paper describes an approach to assessing the importance of optical and electronic cross-talk on the retrieval of geophysical products from new MODIS-like sensors through the use of MODIS data sets. The approach may be extended to any sensor characteristic and any sensor for which that characteristic may impact the Level 1 products, so long as validated geophysical products are being developed from the heritage sensor. In this study, a set of electronic and/or optical cross-talk coefficients is postulated. These coefficients are sender-receiver influence coefficients and represent signal contamination on any detector on a focal plane when another band's detectors on that focal plane are stimulated with monochromatic light. The approach involves applying the postulated cross-talk coefficients to an actual set of MODIS data granules. The original MODIS data granules and the cross-talk-impacted granules are then used with validated geophysical algorithms to create the derived products. Comparison of the products produced with the original and cross-talk-impacted granules indicates potential problems, if any, with the studied characteristics of the developmental sensor.

  10. Kinetic modeling of sporulation and product formation in stationary phase by Bacillus coagulans RK-02 vis-à-vis other Bacilli.

    PubMed

    Das, Subhasish; Sen, Ramkrishna

    2011-10-01

    A logistic kinetic model was derived and validated to characterize the dynamics of a sporogenous bacterium in stationary phase with respect to sporulation and product formation. The kinetic constants as determined using this model are particularly important for describing intrinsic properties of a sporogenous bacterial culture in stationary phase. Non-linear curve fitting of the experimental data into the mathematical model showed very good correlation with the predicted values for sporulation and lipase production by Bacillus coagulans RK-02 culture in minimal media. Model fitting of literature data of sporulation and product (protease and amylase) formation in the stationary phase by some other Bacilli and comparison of the results of model fitting with those of Bacillus coagulans helped validate the significance and robustness of the developed kinetic model. Copyright © 2011 Elsevier Ltd. All rights reserved.
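    The logistic form underlying such models can be written down and checked directly; a minimal sketch of a generic logistic growth curve (the paper's specific equations, variables, and fitted constants are not reproduced here):

```python
import math

# Generic logistic curve N(t) = K / (1 + (K/N0 - 1) * exp(-r*t)),
# with N0 the initial value, K the carrying capacity (plateau), and
# r the rate constant; illustrative parameter values only.
def logistic(t, n0, k, r):
    return k / (1 + (k / n0 - 1) * math.exp(-r * t))

# The logistic curve satisfies dN/dt = r * N * (1 - N/K); verify
# numerically with a central difference at t = 2.
n0, k, r, t, h = 0.1, 1.0, 0.8, 2.0, 1e-5
dndt = (logistic(t + h, n0, k, r) - logistic(t - h, n0, k, r)) / (2 * h)
n = logistic(t, n0, k, r)
print(abs(dndt - r * n * (1 - n / k)) < 1e-6)  # -> True
```

    In practice the kinetic constants are obtained by nonlinear least-squares fitting of such a curve to the stationary-phase sporulation or product-formation data, as described in the abstract.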

  11. In-house validation of a method for determination of silver nanoparticles in chicken meat based on asymmetric flow field-flow fractionation and inductively coupled plasma mass spectrometric detection.

    PubMed

    Loeschner, Katrin; Navratilova, Jana; Grombe, Ringo; Linsinger, Thomas P J; Købler, Carsten; Mølhave, Kristian; Larsen, Erik H

    2015-08-15

    Nanomaterials are increasingly used in food production and packaging, and validated methods for detection of nanoparticles (NPs) in foodstuffs need to be developed both for regulatory purposes and product development. Asymmetric flow field-flow fractionation with inductively coupled plasma mass spectrometric detection (AF(4)-ICP-MS) was applied for quantitative analysis of silver nanoparticles (AgNPs) in a chicken meat matrix following enzymatic sample preparation. For the first time an analytical validation of nanoparticle detection in a food matrix by AF(4)-ICP-MS has been carried out and the results showed repeatable and intermediately reproducible determination of AgNP mass fraction and size. The findings demonstrated the potential of AF(4)-ICP-MS for quantitative analysis of NPs in complex food matrices for use in food monitoring and control. The accurate determination of AgNP size distribution remained challenging due to the lack of certified size standards. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Virtual Factory Framework for Supporting Production Planning and Control.

    PubMed

    Kibira, Deogratias; Shao, Guodong

    2017-01-01

    Developing optimal production plans for smart manufacturing systems is challenging because shop floor events change dynamically. A virtual factory incorporating engineering tools, simulation, and optimization generates and communicates performance data to guide wise decision making for different control levels. This paper describes such a platform specifically for production planning. We also discuss verification and validation of the constituent models. A case study of a machine shop is used to demonstrate data generation for production planning in a virtual factory.

  13. GlobPermafrost - How Space-Based Earth Observation Supports Understanding of Permafrost

    NASA Astrophysics Data System (ADS)

    Bartsch, Annett; Grosse, Guido; Kaab, Andreas; Westermann, Sebastian; Strozzi, Tazio; Wiesmann, Andreas; Duguay, Claude; Seifert, Frank Martin; Obu, Jaroslav; Goler, Robert

    2016-08-01

    The GlobPermafrost project develops, validates and implements Earth Observation (EO) products to support research communities and international organisations in their work toward a better understanding of permafrost characteristics and dynamics. To make these products usable for the target audience, user requirements for the planned products were collected through an online community survey and through interviews. This paper provides an overview of the planned thematic EO products as well as the results of the user requirement survey.

  14. The Backyard Human Performance Technologist: Applying the Development Research Methodology to Develop and Validate a New Instructional Design Framework

    ERIC Educational Resources Information Center

    Brock, Timothy R.

    2009-01-01

    Development research methodology (DRM) has been recommended as a viable research approach to expand the practice-to-theory/theory-to-practice literature that human performance technology (HPT) practitioners can integrate into the day-to-day work flow they already use to develop instructional products. However, little has been written about how it…

  15. Development, upscaling and validation of the purification process for human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII produced in a human cell-line.

    PubMed

    Winge, Stefan; Yderland, Louise; Kannicht, Christoph; Hermans, Pim; Adema, Simon; Schmidt, Torben; Gilljam, Gustav; Linhult, Martin; Tiemeyer, Maya; Belyanskaya, Larisa; Walter, Olaf

    2015-11-01

    Human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII (rFVIII), is the first rFVIII produced in a human cell line approved by the European Medicines Agency. The objective was to describe the development, upscaling and process validation of industrial-scale human-cl rhFVIII purification. The purification process involves one centrifugation step, two filtration steps, five chromatography columns, and two dedicated pathogen clearance steps (solvent/detergent treatment and 20 nm nanofiltration). The key purification step uses an affinity resin (VIIISelect) with high specificity for FVIII, removing essentially all host-cell proteins with >80% product recovery. The production-scale multi-step purification process efficiently removes process- and product-related impurities and results in a high-purity rhFVIII product, with an overall yield of ∼50%. Specific activity of the final product was >9000 IU/mg, and the ratio of active FVIII to total FVIII protein was >0.9. The entire production process is free of animal-derived products. Leaching of potentially harmful compounds from chromatography resins, and all pathogens tested, were below the limit of quantification in the final product. Human-cl rhFVIII can be produced at 500 L bioreactor scale while maintaining high purity and recoveries. The innovative purification process ensures a high-purity, high-quality human-cl rhFVIII product with a high pathogen safety margin. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Technical Report Series on Global Modeling and Data Assimilation. Volume 40; Soil Moisture Active Passive (SMAP) Project Assessment Report for the Beta-Release L4_SM Data Product

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Reichle, Rolf H.; De Lannoy, Gabrielle J. M.; Liu, Qing; Colliander, Andreas; Conaty, Austin; Jackson, Thomas; Kimball, John

    2015-01-01

    During the post-launch SMAP calibration and validation (Cal/Val) phase there are two objectives for each science data product team: 1) calibrate, verify, and improve the performance of the science algorithm, and 2) validate the accuracy of the science data product as specified in the science requirements and according to the Cal/Val schedule. This report provides an assessment of the SMAP Level 4 Surface and Root Zone Soil Moisture Passive (L4_SM) product specifically for the product's public beta release scheduled for 30 October 2015. The primary objective of the beta release is to allow users to familiarize themselves with the data product before the validated product becomes available. The beta release also allows users to conduct their own assessment of the data and to provide feedback to the L4_SM science data product team. The assessment of the L4_SM data product includes comparisons of SMAP L4_SM soil moisture estimates with in situ soil moisture observations from core validation sites and sparse networks. The assessment further includes a global evaluation of the internal diagnostics from the ensemble-based data assimilation system that is used to generate the L4_SM product. This evaluation focuses on the statistics of the observation-minus-forecast (O-F) residuals and the analysis increments. Together, the core validation site comparisons and the statistics of the assimilation diagnostics are considered primary validation methodologies for the L4_SM product. Comparisons against in situ measurements from regional-scale sparse networks are considered a secondary validation methodology because such in situ measurements are subject to upscaling errors from the point-scale to the grid cell scale of the data product. Based on the limited set of core validation sites, the assessment presented here meets the criteria established by the Committee on Earth Observing Satellites for Stage 1 validation and supports the beta release of the data. 
The validation against sparse network measurements and the evaluation of the assimilation diagnostics address Stage 2 validation criteria by expanding the assessment to regional and global scales.
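
    In the simplest case, the assimilation diagnostics described above reduce to summary statistics of the O-F residuals. A minimal sketch of that calculation (not the SMAP L4_SM project code; all values are hypothetical):

```python
# Sketch only (not the SMAP L4_SM project code): summary statistics of
# observation-minus-forecast (O-F) residuals, a primary internal
# diagnostic of an ensemble assimilation system. Values are hypothetical.
from statistics import mean, stdev

def of_residual_stats(observations, forecasts):
    """Return (mean, stdev) of the O-F residuals.

    A well-calibrated, unbiased system shows an O-F mean near zero;
    the stdev reflects combined observation and forecast uncertainty.
    """
    residuals = [o - f for o, f in zip(observations, forecasts)]
    return mean(residuals), stdev(residuals)

obs = [0.21, 0.25, 0.19, 0.23, 0.22]    # hypothetical soil moisture obs
fcst = [0.20, 0.24, 0.21, 0.22, 0.23]   # corresponding model forecasts
m, s = of_residual_stats(obs, fcst)
```

    A mean far from zero would indicate bias in the observations or the forecast model; a stdev inconsistent with the assumed error levels would indicate mis-specified uncertainties.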

  17. The foodscape: classification and field validation of secondary data sources.

    PubMed

    Lake, Amelia A; Burgoine, Thomas; Greenhalgh, Fiona; Stamp, Elaine; Tyrrell, Rachel

    2010-07-01

    The aims were to develop a food environment classification tool and to test the acceptability and validity of three secondary sources of food environment data within a defined urban area of Newcastle-upon-Tyne, using a field validation method. A 21-point classification tool (with 77 sub-categories) was developed. The fieldwork recorded 617 establishments selling food and/or food products. Sensitivity against the fieldwork was good for the Newcastle City Council data (83.6%), but low for Yell.com and the Yellow Pages (51.2% and 50.9%, respectively). To improve the quality of secondary data, multiple sources should be used in order to achieve a realistic picture of the foodscape. 2010 Elsevier Ltd. All rights reserved.
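
    The sensitivity figures quoted above amount to the share of field-validated outlets that each secondary source also lists. A minimal sketch of that calculation, with hypothetical outlet names (not the study's data or code):

```python
# Sketch (not the study's code or data): "sensitivity" of a secondary
# food-outlet source against a fieldwork census, i.e. the percentage of
# field-validated outlets that the secondary source also lists.
# Outlet names are hypothetical.
def sensitivity(field_outlets, secondary_outlets):
    field = set(field_outlets)
    return 100.0 * len(field & set(secondary_outlets)) / len(field)

field = {"cafe_a", "takeaway_b", "grocer_c", "bakery_d", "pub_e"}
council = {"cafe_a", "takeaway_b", "grocer_c", "newsagent_x"}
# 3 of the 5 field outlets appear in the secondary source
score = sensitivity(field, council)
```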

  18. Development of Finer Spatial Resolution Optical Properties from MODIS

    DTIC Science & Technology

    2008-02-04

    infrared (SWIR) channels at 1240 nm and 2130 nm. The increased resolution spectral Rrs channels are input into bio-optical algorithms (Quasi...processes. Additionally, increased resolution is required for validation of ocean color products in coastal regions due to the shorter spatial scales of...with in situ Rrs data to determine the "best" method in coastal regimes. We demonstrate that finer resolution is required for validation of coastal

  19. Coarse Scale In Situ Albedo Observations over Heterogeneous Land Surfaces and Validation Strategy

    NASA Astrophysics Data System (ADS)

    Xiao, Q.; Wu, X.; Wen, J.; BAI, J., Sr.

    2017-12-01

    To evaluate and improve the quality of coarse-pixel land surface albedo products, validation against ground measurements of albedo is crucial over spatially and temporally heterogeneous land surfaces. The performance of albedo validation depends on the quality of ground-based albedo measurements at the corresponding coarse-pixel scale, which can be conceptualized as the "truth" value of albedo at that scale. Wireless sensor network (WSN) technology provides continuous observations at the large pixel scale. Taking albedo products as an example, this paper is dedicated to the validation of coarse-scale albedo products over heterogeneous surfaces based on WSN observations, aiming to narrow the uncertainty caused by the spatial-scale mismatch between satellite and ground measurements. The reference value of albedo at the coarse-pixel scale can be obtained through an upscaling transform function based on all of the observations for that pixel. In future work we will further improve and develop new methods that are better able to account for the spatio-temporal characteristics of surface albedo. Additionally, how to use the widely distributed single-site measurements over heterogeneous surfaces is also a question to be answered. Keywords: Remote sensing; Albedo; Validation; Wireless sensor network (WSN); Upscaling; Heterogeneous land surface; Albedo truth at coarse-pixel scale
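
    The abstract does not specify the upscaling transform function; one simple form consistent with the idea is an area-fraction-weighted average of the WSN point albedos within the coarse pixel. A hedged sketch with invented values:

```python
# Illustrative sketch only: the paper's actual upscaling transform is
# not given in the abstract. One simple form is an area-fraction-
# weighted average of WSN point albedos within the coarse pixel.
# Site albedos and land-cover fractions below are invented.
def upscale_albedo(site_albedos, area_fractions):
    if abs(sum(area_fractions) - 1.0) > 1e-9:
        raise ValueError("area fractions must sum to 1")
    return sum(a * w for a, w in zip(site_albedos, area_fractions))

sites = [0.18, 0.22, 0.35]   # e.g. cropland, grassland, bare soil
weights = [0.5, 0.3, 0.2]    # land-cover fractions within the pixel
pixel_albedo = upscale_albedo(sites, weights)
```

    In practice the weights would come from a high-resolution land-cover or reflectance map of the pixel footprint rather than being assumed.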

  20. Evaluation Instruments for Quality of Life Related to Melasma: An Integrative Review.

    PubMed

    Pollo, Camila Fernandes; Meneguin, Silmara; Miot, Helio Amante

    2018-05-21

    The aim of this study was to analyze scientific production concerning the validation and cultural adaptation of quality of life evaluation instruments for patients with melasma and to offer a critical reflection on these methods. A literature review was performed based on a search of the Web of Science, Bireme, PubMed, Elsevier Scopus, and Google Scholar databases. All published articles from indexed periodicals in these electronic databases up to December 2015 were included. Eight articles were identified, of which only one (12.5%) referred to the development and validation of a specific instrument for evaluation of the quality of life of melasma patients. An additional six articles (75%) referred to transcultural adjustment and validation of the same instrument in other languages, and another (12.5%) article reported the development of a generic instrument for evaluation of quality of life in patients with pigment disorders. This review revealed only one specific instrument developed and validated in different cultures. Despite being widely used, this instrument did not follow the classic construction steps for psychometric instruments, which paves the way for future studies to develop novel instruments.

  1. Tool development to assess the work related neck and upper limb musculoskeletal disorders among female garment workers in Sri-Lanka.

    PubMed

    Amarasinghe, Nirmalie Champika; De Alwis Senevirathne, Rohini

    2016-10-17

    Musculoskeletal disorders (MSDs) have been identified as a predisposing factor for lower productivity, but no validated tool has been developed to assess them in the Sri Lankan context. The objective was to develop a validated tool to assess neck and upper limb MSDs. Tool development comprised three components: item selection, item reduction using principal component analysis, and validation. A tentative self-administered questionnaire was developed, translated, and pre-tested. Four important domains - neck, shoulder, elbow and wrist - were identified through principal component analysis. Prevalence of any MSD was 38.1%, and prevalence of neck, shoulder, elbow and wrist MSDs was 12.85%, 13.71%, 12%, and 13.71%, respectively. Content and criterion validity of the tool were assessed. Separate ROC curves were produced and the sensitivity and specificity for the neck (83.1%, 71.7%), shoulder (97.6%, 91.9%), elbow (98.2%, 87.2%), and wrist (97.6%, 94.9%) were determined. Cronbach's alpha and the correlation coefficient were above 0.7. The tool has high sensitivity, specificity, internal consistency, and test-retest reliability.
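
    The reported validity and reliability figures are standard computations. A sketch, assuming nothing about the study's actual data, of sensitivity/specificity from a 2x2 confusion table and Cronbach's alpha from item scores:

```python
# Hedged sketch (not the study's code): the standard computations behind
# the reported statistics -- sensitivity/specificity from a 2x2
# confusion table, and Cronbach's alpha for internal consistency.
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k, n = len(items), len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

se, sp = sens_spec(tp=83, fn=17, tn=72, fp=28)   # hypothetical counts
```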

  2. Evaluation Instruments for Quality of Life Related to Melasma: An Integrative Review

    PubMed Central

    Pollo, Camila Fernandes; Meneguin, Silmara; Miot, Helio Amante

    2018-01-01

    The aim of this study was to analyze scientific production concerning the validation and cultural adaptation of quality of life evaluation instruments for patients with melasma and to offer a critical reflection on these methods. A literature review was performed based on a search of the Web of Science, Bireme, PubMed, Elsevier Scopus, and Google Scholar databases. All published articles from indexed periodicals in these electronic databases up to December 2015 were included. Eight articles were identified, of which only one (12.5%) referred to the development and validation of a specific instrument for evaluation of the quality of life of melasma patients. An additional six articles (75%) referred to transcultural adjustment and validation of the same instrument in other languages, and another (12.5%) article reported the development of a generic instrument for evaluation of quality of life in patients with pigment disorders. This review revealed only one specific instrument developed and validated in different cultures. Despite being widely used, this instrument did not follow the classic construction steps for psychometric instruments, which paves the way for future studies to develop novel instruments. PMID:29791603

  3. Quality of life assessment in cosmetics: specificity and interest of the international BeautyQol instrument.

    PubMed

    Beresniak, Ariel; Auray, Jean-Paul; Duru, Gérard; Aractingi, Selim; Krueger, Gerald G; Talarico, Sergio; Tsutani, Kiichiro; Dupont, Danielle; de Linares, Yolaine

    2015-09-01

    The wide use of cosmetics and their perceived benefits for well-being call for objective description of their effects on the different dimensions contributing to quality of life (QoL). Such a goal calls for relevant, validated scientific instruments with robust measurement methods. This paper discusses the value of the new validated questionnaire BeautyQoL, specifically designed to assess the effect of cosmetic products on physical appearance and QoL. After a review of skin appearance and QoL, three phases of international codevelopment were carried out in the following sequence: semi-directed interviews (Phase 1), acceptability study (Phase 2), and validation study (Phase 3). Data collection and the validation process were carried out in 16 languages. This review confirms that QoL instruments developed in dermatology are not suitable for assessing cosmetic products, mainly because of their lack of sensitivity. General acceptability of BeautyQoL was very good. Forty-two questions were structured in five dimensions that explained 76.7% of the total variance: Social Life, Self-confidence, Mood, Vitality, and Attractiveness. Cronbach's alpha coefficients are between 0.932 and 0.978, confirming the good internal consistency of the results. The BeautyQoL questionnaire is the first international instrument specific to cosmetic products and physical appearance that has been validated in 16 languages; it could be used in a number of clinical trials and descriptive studies to demonstrate the added value of these products for QoL. © 2015 Wiley Periodicals, Inc.

  4. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  5. Developing Good Workers. Research Brief.

    ERIC Educational Resources Information Center

    Peterson, Robert M.

    Developing the productive capacities of students is a valid function of schooling and is not in conflict or competition with other educational purposes, such as academic excellence. Employers and young workers in the San Francisco Bay area noted attributes that workers need for success in entry-level unskilled or junior professional jobs. These…

  6. Development and Validation of an Instrument to Measure University Students' Biotechnology Attitude

    ERIC Educational Resources Information Center

    Erdogan, Mehmet; Ozel, Murat; Usak, Muhammet; Prokop, Pavol

    2009-01-01

    The impact of biotechnologies on people's everyday lives continuously increases. Measuring young people's attitudes toward biotechnologies is therefore very important, and its results are useful not only for science curriculum developers and policy makers, but also for producers and distributors of genetically modified products. Despite of…

  7. De novo genome assembly of Cercospora beticola for microsatellite marker development and validation

    USDA-ARS?s Scientific Manuscript database

    Cercospora leaf spot caused by Cercospora beticola is a significant threat to the production of sugar and table beet worldwide. A de novo genome assembly of C. beticola was used to develop eight polymorphic and reproducible microsatellite markers for population genetic analyses. These markers were u...

  8. Current Status of Japanese Global Precipitation Measurement (GPM) Research Project

    NASA Astrophysics Data System (ADS)

    Kachi, Misako; Oki, Riko; Kubota, Takuji; Masaki, Takeshi; Kida, Satoshi; Iguchi, Toshio; Nakamura, Kenji; Takayabu, Yukari N.

    2013-04-01

    The Global Precipitation Measurement (GPM) mission is led by the Japan Aerospace Exploration Agency (JAXA) and the National Aeronautics and Space Administration (NASA) in collaboration with many international partners, who will provide a constellation of satellites carrying microwave radiometer instruments. The GPM Core Observatory carries the Dual-frequency Precipitation Radar (DPR), developed by JAXA and the National Institute of Information and Communications Technology (NICT), and the GPM Microwave Imager (GMI), developed by NASA. The GPM Core Observatory is scheduled for launch in early 2014. JAXA also provides the Global Change Observation Mission 1st - Water (GCOM-W1) satellite, named "SHIZUKU," as one of the constellation satellites. SHIZUKU was launched on 18 May 2012 from JAXA's Tanegashima Space Center, and public release of data from the Advanced Microwave Scanning Radiometer 2 (AMSR2) on board was planned for Level 1 products in January 2013 and for Level 2 products, including precipitation, in May 2013. The Japanese GPM research project conducts scientific activities on algorithm development, ground validation, and application research, including production of research products. In addition, we promote collaborative studies in Japan and other Asian countries, and public relations activities to broaden the base of potential users of satellite precipitation products. In the pre-launch phase, most of our activities are focused on algorithm development and the associated ground validation. For the GPM standard products, JAXA develops the DPR Level 1 algorithm, and the NASA-JAXA Joint Algorithm Team develops the DPR Level 2 and DPR-GMI combined Level 2 algorithms. JAXA also develops the Global Rainfall Map product as a national product, distributing an hourly, 0.1-degree horizontal resolution rainfall map. 
All standard algorithms, including the Japan-US joint algorithms, will be reviewed by the Japan-US Joint Precipitation Measuring Mission (PMM) Science Team (JPST) before release. The DPR Level 2 algorithm is being developed by the DPR Algorithm Team, led by Japan, under the NASA-JAXA Joint Algorithm Team. The Level 2 algorithms will provide KuPR-only products, KaPR-only products, and dual-frequency precipitation products, with estimated precipitation rate, radar reflectivity, and precipitation information such as drop size distribution and bright band height. At-launch code was developed in December 2012. In addition, JAXA and NASA have provided synthetic DPR L1 data, and tests have been performed using them. The Japanese Global Rainfall Map algorithm for the GPM mission has been developed by the Global Rainfall Map Algorithm Development Team in Japan. The algorithm builds on the heritage of the Global Satellite Mapping of Precipitation (GSMaP) project, which was sponsored by the Japan Science and Technology Agency (JST) under the Core Research for Evolutional Science and Technology (CREST) framework between 2002 and 2007. The GSMaP near-real-time version and reanalysis version have been in operation at JAXA, with browse images and binary data available at the GSMaP web site (http://sharaku.eorc.jaxa.jp/GSMaP/). The GSMaP algorithm for GPM is developed in collaboration with the AMSR2 standard algorithm for the precipitation product, and their validation studies are closely related. As the JAXA GPM product, we will provide a 0.1-degree grid, hourly product for standard and near-real-time processing. Outputs will include hourly rainfall, gauge-calibrated hourly rainfall, and several quality flags (satellite information, time information, and gauge quality) over global areas from 60°S to 60°N. The at-launch code of GSMaP for GPM is under development and will be delivered to the JAXA GPM Mission Operation System by April 2013. 
At-launch code will include several updates of microwave imager and sounder algorithms and databases, and introduction of rain-gauge correction.

  9. Modeling pure culture heterotrophic production of polyhydroxybutyrate (PHB).

    PubMed

    Mozumder, Md Salatul Islam; Goormachtigh, Laurens; Garcia-Gonzalez, Linsey; De Wever, Heleen; Volcke, Eveline I P

    2014-03-01

    In this contribution a mechanistic model describing the production of polyhydroxybutyrate (PHB) through pure-culture fermentation was developed, calibrated and validated for two different substrates, namely glucose and waste glycerol. In both cases, non-growth-associated PHB production was triggered by applying nitrogen limitation. The occurrence of some growth-associated PHB production besides non-growth-associated PHB production was demonstrated, although it is inhibited in the presence of nitrogen. Other phenomena observed experimentally and described by the model included biomass growth on PHB and non-linear product inhibition of PHB production. The accumulated impurities from the waste substrate negatively affected the obtained maximum PHB content. Overall, the developed mathematical model provided an accurate prediction of the dynamic behavior of heterotrophic biomass growth and PHB production in a two-phase pure culture system. Copyright © 2013 Elsevier Ltd. All rights reserved.
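
    The abstract does not give the model equations; the following is only a qualitative toy sketch of the behaviour it describes (nitrogen-limited growth, nitrogen-inhibited non-growth-associated PHB production, product inhibition), with every rate constant invented for illustration:

```python
# Qualitative toy model only -- the published model's equations and
# parameters are not reproduced here; every constant below is invented.
# It mimics the described behaviour: nitrogen-limited growth, PHB
# production inhibited by nitrogen, and product inhibition of PHB.
def simulate(hours=48.0, dt=0.1):
    X, N, P = 0.1, 0.5, 0.0                 # biomass, nitrogen, PHB (g/L)
    mu_max, qp_max = 0.2, 0.05              # max specific rates (1/h)
    Y_nx, K_n, K_i, P_max = 0.1, 0.01, 0.02, 5.0
    for _ in range(int(hours / dt)):        # simple explicit Euler steps
        mu = mu_max * N / (K_n + N)                      # growth needs N
        qp = qp_max * K_i / (K_i + N) * (1 - P / P_max)  # N + product inhibition
        X += mu * X * dt
        N = max(0.0, N - Y_nx * mu * X * dt)
        P += qp * X * dt
    return X, N, P

X, N, P = simulate()   # PHB accumulates mainly after nitrogen is depleted
```

    A real calibration, as in the paper, would fit such parameters to fermentation time series and use a stiff ODE solver rather than fixed-step Euler integration.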

  10. Evaluation of MODIS and VIIRS Albedo Products Using Ground and Airborne Measurements and Development of Ceos/Wgcv/Lpv Albedo Ecv Protocols

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Roman, M. O.; Schaaf, C.; Sun, Q.; Liu, Y.; Saenz, E. J.; Gatebe, C. K.

    2014-12-01

    Surface albedo, defined as the ratio of the hemispheric reflected solar radiation flux to the incident flux upon the surface, is one of the essential climate variables and quantifies the radiation interaction between the atmosphere and the land surface. An absolute accuracy of 0.02-0.05 for global surface albedo is required by climate models. The MODerate resolution Imaging Spectroradiometer (MODIS) standard BRDF/albedo product makes use of a linear "kernel-driven" RossThick-LiSparse Reciprocal (RTLSR) BRDF model to describe the reflectance anisotropy. The surface albedo is calculated by integrating the BRDF over the above ground hemisphere. While MODIS Terra was launched in Dec 1999 and MODIS Aqua in 2002, the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi-NPP satellite was launched more recently on October 28, 2011. Thus a long term record of BRDF, albedo and Nadir BRDF-Adjusted Reflectance (NBAR) products from VIIRS can be generated through MODIS heritage algorithms. Several investigations have evaluated the MODIS albedo products during the growing season, as well as during dormant and snow covered periods. The Land Product Validation (LPV) sub-group of the Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV) aims to address the challenges associated with the validation of global land products. The validation of global surface radiation/albedo products is one of the LPV subgroup activities. In this research, a reference dataset covering various land surface types and vegetation structure is assembled to assess the accuracy of satellite albedo products. This dataset includes in situ data (Baseline Surface Radiation Network (BSRN), FLUXNET and Long Term Ecological Research network (LTER) etc.) and airborne measurements (e.g. Cloud Absorption Radiometer (CAR)). 
Spatially representative analysis is applied to each site to establish whether the ground measurements can adequately represent moderate spatial resolution remotely sensed albedo products.
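
    The kernel-driven RTLSR formulation makes the albedo computation a linear combination of the retrieved kernel weights and fixed kernel hemispherical integrals. A sketch using the white-sky integral constants of the published MODIS formulation (Lucht et al., 2000); the retrieved weights below are hypothetical:

```python
# Sketch of the kernel-driven albedo idea. Once RTLSR BRDF parameters
# (f_iso, f_vol, f_geo) are retrieved for a band, white-sky albedo is a
# fixed linear combination of the kernel hemispherical integrals; the
# integral constants follow the published MODIS formulation (Lucht et
# al., 2000). The retrieved weights below are hypothetical.
WSA_VOL = 0.189184     # RossThick kernel white-sky integral
WSA_GEO = -1.377622    # LiSparse-Reciprocal kernel white-sky integral

def white_sky_albedo(f_iso, f_vol, f_geo):
    return f_iso + f_vol * WSA_VOL + f_geo * WSA_GEO

wsa = white_sky_albedo(0.12, 0.05, 0.02)   # hypothetical vegetated pixel
```

    Black-sky albedo follows the same pattern with solar-zenith-dependent kernel integrals, which is what allows a single parameter retrieval to serve both albedo quantities.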

  11. Concurrent validity of caregiver/parent report measures of language for children who are learning both English and Spanish.

    PubMed

    Marchman, Virginia A; Martinez-Sussmann, Carmen

    2002-10-01

    The validity of two analogous caregiver/parent report measures of early language development in young children who are learning both English and Spanish is examined. Caregiver/parent report indices of vocabulary production and grammar were obtained for 26 children using the MacArthur Communicative Development Inventory: Words & Sentences (CDI; Fenson et al., 1994) and the Inventario del Desarrollo de Habilidades Comunicativas: Palabras y Enunciados (IDHC; Jackson-Maldonado, Bates, & Thal, 1992). Scores were significantly correlated with analogous laboratory measures in both English and Spanish, including a real-object naming task and spontaneous language use during free-play. The findings offer evidence that the CDI and IDHC provide valid assessments of early language milestones in young English- and Spanish-speaking children. Factors that may influence the validity of these tools for use with this population are also discussed.

  12. Construction and validation of clinical contents for development of learning objects.

    PubMed

    Hortense, Flávia Tatiana Pedrolo; Bergerot, Cristiane Decat; Domenico, Edvane Birelo Lopes de

    2018-01-01

    The aim was to describe the process of construction and validation of clinical content for health learning objects, aimed at patients in treatment for head and neck cancer. This was a descriptive, methodological study. The development of the script and the storyboard was based on scientific evidence, and both were submitted to specialists for content validation. The agreement index was checked quantitatively and the suggestions were evaluated qualitatively. The items described in the script were approved by 99% of the expert evaluators. The suggestions for adjustments were incorporated in their entirety into the final version. The free-marginal kappa statistic for multiple evaluators was 0.68, indicating substantial agreement. The steps taken in the construction and validation of the content for the production of educational material for patients with head and neck cancer were adequate and relevant, and the approach is suitable for use with other subjects.
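
    Free-marginal multirater kappa (in Randolph's formulation; the study's exact variant may differ) assumes chance agreement of 1/k for k equally likely categories. A hedged sketch with hypothetical ratings:

```python
# Sketch (not the study's code): free-marginal multirater kappa in
# Randolph's formulation, where chance agreement with k equally likely
# categories is 1/k, so kappa = (P_obs - 1/k) / (1 - 1/k).
# The example ratings are hypothetical.
def free_marginal_kappa(ratings, k):
    """ratings: one list of rater labels per case; k: number of categories."""
    n_raters = len(ratings[0])
    pairs = n_raters * (n_raters - 1)       # ordered rater pairs per case
    p_obs = 0.0
    for case in ratings:
        agree = sum(case.count(c) * (case.count(c) - 1) for c in set(case))
        p_obs += agree / pairs
    p_obs /= len(ratings)
    return (p_obs - 1.0 / k) / (1.0 - 1.0 / k)

ratings = [["ok", "ok", "ok"], ["ok", "revise", "ok"], ["ok", "ok", "ok"]]
kappa = free_marginal_kappa(ratings, k=2)
```

    Free-marginal kappa is appropriate when raters are not constrained to assign a fixed number of cases to each category, which is typically the situation in expert content validation.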

  13. Development of reference practices for the calibration and validation of atmospheric composition satellites

    NASA Astrophysics Data System (ADS)

    Lambert, Jean-Christopher; Bojkov, Bojan

    The Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV) is developing a global data quality strategy for the Global Earth Observation System of Systems (GEOSS). In this context, CEOS WGCV elaborated the GEOSS Quality Assurance framework for Earth Observation (QA4EO, http://qa4eo.org). QA4EO encompasses a documentary framework and a set of ten guidelines, which describe the top-level approach of QA activities and the key requirements that drive the QA process. QA4EO is applicable to virtually all Earth Observation data. Calibration and validation activities are a cornerstone of the GEOSS data quality strategy. Proper uncertainty assessment of the satellite measurements and their derived data products is essential, and needs to be continuously monitored and traceable to standards. As a practical application of QA4EO, CEOS WGCV has undertaken to establish a set of best practices, methodologies and guidelines for satellite calibration and validation. The present paper reviews current developments of best practices and guidelines for the validation of atmospheric composition satellites. Framed as a community effort, the approach is to start with current practices that can be improved over time. The present review addresses current validation capabilities, achievements, caveats, harmonization efforts, and challenges. Terminologies and general principles of validation are recalled. Going beyond elementary definitions of validation, such as the assessment of uncertainties, the specific GEOSS context also requires considering the validation of individual service components and validation against user requirements.

  14. Current Status of the Validation of the Atmospheric Chemistry Instruments on Envisat

    NASA Astrophysics Data System (ADS)

    Lecomte, P.; Koopman, R.; Zehner, C.; Laur, H.; Attema, E.; Wursteisen, P.; Snoeij, P.

    2003-04-01

    Envisat is ESA's advanced Earth observing satellite launched in March 2002 and is designed to provide measurements of the atmosphere, ocean, land and ice over a five-year period. After the launch and the switch-on period, a six-month commissioning phase has taken place for instrument calibration and geophysical validation, concluded with the Envisat Calibration Review held in September 2002. In addition to ESA and its industrial partners in the Envisat consortium, many other companies and research institutes have contributed to the calibration and validation programme under ESA contract as expert support laboratories (ESLs). A major contribution has also been made by the Principal Investigators of approved proposals submitted to ESA in response to a worldwide "Announcement of Opportunity for the Exploitation of the Envisat Data Products" in 1998. Working teams have been formed in which the different participants worked side by side to achieve the objectives of the calibration and validation programme. Validation is a comparison of Envisat level-2 data products and estimates of the different geophysical variables obtained by independent means, the validation instruments. Validation is closely linked to calibration because inconsistencies discovered in the comparison of Envisat Level 2 data products to well-known external instruments can have many different sources, including inaccuracies of the Envisat instrument calibration and the data calibration algorithms. Therefore, initial validation of the geophysical variables has provided feedback to calibration, de-bugging and algorithm improvement. The initial validation phase ended in December 2002 with the Envisat Validation Workshop at which, for a number of products, a final quality statement was given. 
Full validation of all data products available from the Atmospheric Chemistry Instruments on Envisat (MIPAS, GOMOS and SCIAMACHY) is quite a challenge, and therefore it has been decided to adopt a step-wise approach. As a first step, the intention is to arrive at a first quality assessment of the data products for near-real-time distribution. This core validation was performed during the commissioning and validation phase of Envisat, and the results were presented at the Envisat Validation Workshop. It was already anticipated early in the programme that more work would be needed after this workshop on all Envisat data products, both for near-real-time and for off-line distribution. The algorithms designed to derive estimates of the atmospheric constituents need to be verified. For this, a large number of correlative observations under a wide range of conditions are needed to arrive at a representative and statistically significant data quality assessment, and to provide insight into sources of error both in the Envisat data and in the correlative data sets. In order to achieve this within the tight time schedule, the best use must be made of the available resources. For the Atmospheric Chemistry Instruments on Envisat it has therefore been decided to plan a joint geophysical validation programme that is not instrument specific but serves all three instruments. For the coordination of the activities, the Atmospheric Chemistry Validation Team (ACVT) was formed. The ACVT methods can roughly be categorised into different approaches, and consistent with these the group is divided into subgroups on balloon and aircraft campaigns, ground-based measurements, and model assimilation and satellite intercomparison. The data coming from the various validation campaigns are stored in a central data storage facility established at the Norwegian Institute for Air Research (NILU) in Norway. 
NILU provides access to correlative measurements from sensors on board satellites, aircraft, balloons and ships, as well as from ground-based instruments and numerical models, such as that of the ECMWF. Particular emphasis has been put on the quality control of such data. Users are able to connect to the database to add or retrieve data according to their requirements. Access to such a range of data has strengthened the statistical significance of the results and increased the chances of detecting errors in the processing algorithms. Two types of data are stored in the NILU database: fixed-point and transect data. Transect data are only provided for inclusion in the database for selected times that correspond to the satellite overpass. Envisat data are not stored in the NILU database, although other correlative satellite data are included to facilitate comparison with data acquired by Envisat. The European Space Agency (ESA) organised a workshop in Frascati from 9 to 13 December 2002 to review the first results of the validation of the geophysical data products from its environmental satellite Envisat. The objectives of the Envisat Validation Workshop were: to review the Level 2 product algorithms using the results of the validation campaigns; to review the geophysical consistency of the Level 2 processor products; to provide an error estimation of the Level 2 products; and to recommend instrument re-calibration and algorithm development where needed. At the workshop, scientists and engineers presented analyses of the exhaustive series of tests that had been run on each of Envisat's sensors since the spacecraft was launched in March. On the basis of the workshop results it was decided that most of the 73 data products provided by the Envisat instruments are ready for operational delivery. 
Although the main validation phase for the atmospheric instruments of Envisat will be completed this year, validation activities will continue throughout the lifetime of the Envisat mission. More specifically, the main validation phase (i.e. with intensive validation activities) will be completed in 2003, whereas the long-term validation phase will:
- provide assurance of data quality and accuracy for applications such as climate change research
- investigate the fully representative range of geophysical conditions
- investigate the fully representative range of seasonal cycles
- perform long-term monitoring for instrumental drifts and other artefacts
- validate new products.
The paper will discuss the general status of the calibration and validation activities for GOMOS, MIPAS and SCIAMACHY, and will present the short-term and long-term validation plans.

  15. Pharmaceutical product development: A quality by design approach

    PubMed Central

    Pramod, Kannissery; Tahir, M. Abu; Charoo, Naseem A.; Ansari, Shahid H.; Ali, Javed

    2016-01-01

    The application of quality by design (QbD) in pharmaceutical product development is now a thrust area for regulatory authorities and the pharmaceutical industry. The International Conference on Harmonization and the United States Food and Drug Administration (USFDA) have emphasized the principles and applications of QbD in pharmaceutical development in their guidance for industry. QbD attributes are addressed in the question-based review developed by the USFDA for the chemistry, manufacturing, and controls section of abbreviated new drug applications. QbD principles, when implemented, lead to successful product development and prompt regulatory approval, reduce the exhaustive validation burden, and significantly reduce post-approval changes. The key elements of QbD, viz. target product quality profile, critical quality attributes, risk assessments, design space, control strategy, product lifecycle management, and continual improvement, are discussed to understand the performance of dosage forms within the design space. Design of experiments, risk assessment tools, and process analytical technology are also discussed for their role in QbD. This review underlines the importance of QbD in inculcating a science-based approach in pharmaceutical product development. PMID:27606256

  16. Pharmaceutical product development: A quality by design approach.

    PubMed

    Pramod, Kannissery; Tahir, M Abu; Charoo, Naseem A; Ansari, Shahid H; Ali, Javed

    2016-01-01

    The application of quality by design (QbD) in pharmaceutical product development is now a thrust area for regulatory authorities and the pharmaceutical industry. The International Conference on Harmonization and the United States Food and Drug Administration (USFDA) have emphasized the principles and applications of QbD in pharmaceutical development in their guidance for industry. QbD attributes are addressed in the question-based review developed by the USFDA for the chemistry, manufacturing, and controls section of abbreviated new drug applications. QbD principles, when implemented, lead to successful product development and prompt regulatory approval, reduce the exhaustive validation burden, and significantly reduce post-approval changes. The key elements of QbD, viz. target product quality profile, critical quality attributes, risk assessments, design space, control strategy, product lifecycle management, and continual improvement, are discussed to understand the performance of dosage forms within the design space. Design of experiments, risk assessment tools, and process analytical technology are also discussed for their role in QbD. This review underlines the importance of QbD in inculcating a science-based approach in pharmaceutical product development.

  17. Hydrogen Infrastructure Testing and Research Facility | Hydrogen and Fuel

    Science.gov Websites

    stations, enabling NREL to validate current industry standards and methods for hydrogen fueling, and allowing the HITRF to develop, quantify the performance of, and improve renewable hydrogen production methods

  18. Need for certification of household water treatment products: examples from Haiti.

    PubMed

    Murray, Anna; Pierre-Louis, Jocelyne; Joseph, Flaurine; Sylvain, Ginelove; Patrick, Molly; Lantagne, Daniele

    2015-04-01

    To evaluate four household water treatment (HWT) products currently seeking approval for distribution in Haiti, through the application of a recently developed national HWT product certification process. Four chemical treatment products were evaluated against the certification process validation stage by verifying international product certifications confirming treatment efficacy and reviewing laboratory efficacy data against WHO HWT microbiological performance targets; and against the approval stage by confirming product composition, evaluating treated water chemical content against national and international drinking water quality guidelines and reviewing packaging for dosing ability and usage directions in Creole. None of the four evaluated products fulfilled validation or approval stage requirements. None was certified by an international agency as efficacious for drinking water treatment, and none had data demonstrating its ability to meet WHO HWT performance targets. All product sample compositions differed from labelled composition by >20%, and no packaging included complete usage directions in Creole. Product manufacturers provided information that was inapplicable, did not demonstrate product efficacy, and was insufficient to ensure safe product use. Capacity building is needed with country regulatory agencies to objectively evaluate HWT products. Products should be internationally assessed against WHO performance targets and also locally approved, considering language, culture and usability, to ensure effective HWT. © 2014 John Wiley & Sons Ltd.

  19. Need for certification of household water treatment products: examples from Haiti

    PubMed Central

    Murray, Anna; Pierre-Louis, Jocelyne; Joseph, Flaurine; Sylvain, Ginelove; Patrick, Molly; Lantagne, Daniele

    2015-01-01

    OBJECTIVE To evaluate four household water treatment (HWT) products currently seeking approval for distribution in Haiti, through the application of a recently developed national HWT product certification process. METHODS Four chemical treatment products were evaluated against the certification process validation stage by verifying international product certifications confirming treatment efficacy and reviewing laboratory efficacy data against WHO HWT microbiological performance targets; and against the approval stage by confirming product composition, evaluating treated water chemical content against national and international drinking water quality guidelines and reviewing packaging for dosing ability and usage directions in Creole. RESULTS None of the four evaluated products fulfilled validation or approval stage requirements. None was certified by an international agency as efficacious for drinking water treatment, and none had data demonstrating its ability to meet WHO HWT performance targets. All product sample compositions differed from labelled composition by >20%, and no packaging included complete usage directions in Creole. CONCLUSIONS Product manufacturers provided information that was inapplicable, did not demonstrate product efficacy, and was insufficient to ensure safe product use. Capacity building is needed with country regulatory agencies to objectively evaluate HWT products. Products should be internationally assessed against WHO performance targets and also locally approved, considering language, culture and usability, to ensure effective HWT. PMID:25441711

  20. The Development and Single-Laboratory Validation of a Method for the Determination of Steroid Residues in Fish and Fish Products.

    PubMed

    Watson, Lynn; Potter, Ross; Murphy, Cory; Gibbs, Ryan

    2015-01-01

    Due to potential use in aquacultured fish products, the Canadian Food Inspection Agency has identified residue testing for steroids as a priority. These compounds are used in aquaculture primarily to direct sexual differentiation with both androgens and estrogens applied depending on the desired outcome. Published research is lacking with respect to steroid residue testing in fish; however, recent studies in other matrixes provided transferable cleanup techniques. A simple, rapid, and sensitive method was developed and validated for use in monitoring aquacultured fish products for the presence of methyltestosterone, nandrolone, epi-nandrolone, boldenone, and epi-boldenone residues. The developed method consists of solvent extraction followed by cleanup using hexane and dual cartridge SPE with analysis by ultra-HPLC-MS/MS. The method is capable of detecting and confirming steroid residue levels ranging from 0.05 to 25 ng/g in salmon and tilapia, depending on the analyte. Recoveries ranged from 88 to 130% for the analytes. Instrument repeatability was less than 13% for all compounds, while intermediate precision ranged from 5 to 25% RSD. HorRat values were within acceptable ranges.

  1. Moderate Resolution Imaging Spectrometer (MODIS) design evolution and associated development and verification of data product efforts

    NASA Technical Reports Server (NTRS)

    Salomonson, Vincent V.

    1991-01-01

    The Moderate Resolution Imaging Spectrometer (MODIS) is a key observing facility to be flown on the Earth Observing System (EOS). The facility is composed of two instruments called MODIS-N (nadir) and MODIS-T (tilt). The MODIS-N is being built under contract to NASA by the Santa Barbara Research Center. The MODIS-T is being fabricated by the Engineering Directorate at the Goddard Space Flight Center. The MODIS Science Team has defined nearly 40 biogeophysical data products for studies of the ocean and land surface and properties of the atmosphere including clouds that can be expected to be produced from the MODIS instruments shortly after the launch of EOS. The ocean, land, atmosphere, and calibration groups of the MODIS Science Team are now proceeding to plan and implement the operations and facilities involving the analysis of data from existing spaceborne, airborne, and in-situ sensors required to develop and validate the algorithms that will produce the geophysical data products. These algorithm development and validation efforts will be accomplished wherever possible within the context of existing or planned national and international experiments or programs such as those in the World Climate Research Program.

  2. Developing a consumer evaluation tool of weight control strategy advertisements on the Internet.

    PubMed

    Luevorasirikul, Kanokrat; Gray, Nicola J; Anderson, Claire W

    2008-06-01

    To develop two evaluation tools for weight loss and weight gain advertisements on the Internet in order to help consumers to evaluate the quality of information within these advertisements. One hundred websites identified by Internet search engines for weight loss and weight gain strategies (50 websites each) were evaluated using two specific scoring instruments, developed by adapting questions from the 'DISCERN' tool and reviewing all related weight control guidelines and advertising regulations. The validity and reliability of the adapted tools were tested. Our evaluation tools rated the information from most websites as poor quality (70%). In the case of weight loss strategies, statements about rapid (18%) and permanent (28%) weight loss caused concern as well as lack of sensible advice about dieting and a lack of product warnings (84%). Safety concerns relating to weight gain products were the lack of warnings about side effects in products containing steroids and creatine (92%). The adapted tools exhibited acceptable validity and reliability. Quality of information within weight control advertisements on the Internet was generally poor. Problems of false claims, little advice on healthy ways to modify weight and few warnings on side effects have been highlighted in this study.

  3. Surveillance methods for identifying, characterizing, and monitoring tobacco products: potential reduced exposure products as an example

    PubMed Central

    O’Connor, Richard J.; Cummings, K. Michael; Rees, Vaughan W.; Connolly, Gregory N.; Norton, Kaila J.; Sweanor, David; Parascandola, Mark; Hatsukami, Dorothy K.; Shields, Peter G.

    2015-01-01

    Tobacco products are widely sold and marketed, yet integrated data systems for identifying, tracking, and characterizing products are lacking. Tobacco manufacturers recently have developed potential reduced exposure products (PREPs) with implied or explicit health claims. Currently, a systematic approach for identifying, defining, and evaluating PREPs sold at the local, state or national levels in the US has not been developed. Identifying, characterizing, and monitoring new tobacco products could be greatly enhanced with a responsive surveillance system. This paper critically reviews available surveillance data sources for identifying and tracking tobacco products, including PREPs, evaluating strengths and weaknesses of potential data sources in light of their reliability and validity. Absent regulations mandating disclosure of product-specific information, it is likely that public health officials will need to rely on a variety of imperfect data sources to help identify, characterize, and monitor tobacco products, including PREPs. PMID:19959680

  4. Plumes and Blooms: Observations, Analysis and Modeling for SIMBIOS

    NASA Technical Reports Server (NTRS)

    Maritorena, S.; Siegel, D. A.; Nelson, N. B.

    2004-01-01

    The goal of the Plumes and Blooms (PnB) project is to develop, validate and apply to imagery state-of-the-art ocean color algorithms for quantifying sediment plumes and phytoplankton blooms in the Case II environment of the Santa Barbara Channel. We conduct monthly to twice-monthly transect observations across the Santa Barbara Channel to build an algorithm development and product validation data set. A primary goal is to use the PnB field data set to objectively tune semi-analytical models of ocean color for this site and apply them using available satellite imagery (SeaWiFS and MODIS). However, the comparison between PnB field observations and satellite estimates of primary products has been disappointing. We find that field estimates of water-leaving radiance correspond poorly to satellite estimates for both SeaWiFS and MODIS local area coverage imagery. We believe this is due to poor atmospheric correction caused by the complex mixtures of aerosol types found in these near-coastal regions.

  5. Development of prolonged standing strain index to quantify risk levels of standing jobs.

    PubMed

    Halim, Isa; Omar, Abdul Rahman

    2012-01-01

    Many industrial occupations, such as metal stamping, electronics parts assembly, automotive welding, and lathe operation, require working in a standing posture for a long time. Prolonged standing can contribute to discomfort and muscle fatigue, particularly in the back and legs. This study developed the prolonged standing strain index (PSSI) to quantify the risk levels caused by standing jobs, and proposed recommendations to minimize those levels. Risk factors associated with standing jobs, such as working posture, muscle activity, standing duration, holding time, whole-body vibration, and indoor air quality, were the basis for developing the PSSI. All risk factors were assigned multipliers, and the PSSI was the product of those multipliers. Recommendations for improvement are based on the PSSI; however, extensive studies are required to validate their effectiveness.
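    The abstract describes the PSSI as a simple product of risk-factor multipliers. A minimal sketch of that multiplicative scheme follows; the factor names are taken from the abstract, but the multiplier values and the function name are hypothetical placeholders, not the published PSSI tables:

```python
# Illustrative sketch of a multiplicative strain index in the spirit of the
# PSSI: the overall index is the product of the individual risk-factor
# multipliers. All numeric values below are invented for the example.
from math import prod

def strain_index(multipliers):
    """Overall index = product of the individual risk-factor multipliers."""
    return prod(multipliers.values())

job = {
    "working_posture": 1.5,      # e.g. constrained standing (hypothetical)
    "standing_duration": 2.0,    # e.g. > 4 h per shift (hypothetical)
    "holding_time": 1.0,
    "whole_body_vibration": 1.2,
}
print(round(strain_index(job), 2))  # → 3.6
```

    A higher product then maps to a higher risk category; the actual PSSI multiplier tables and category cut-offs are defined in the original paper.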

  6. Development and Validation of an RP-HPLC Method for the Determination of Vinpocetine and Folic Acid in the Presence of a Vinpocetine Alkaline Degradation Product in Bulk and in Capsule Form.

    PubMed

    Elkady, Ehab F; Tammam, Marwa H; Mohamed, Ayman A

    2017-05-01

    An alkaline-forced degradation hydrolytic product of vinpocetine was prepared and characterized by 1H-NMR, FTIR spectroscopy, and MS. Subsequently, a simple, selective, and validated reversed-phase HPLC method was developed for the simultaneous estimation of vinpocetine and folic acid in the presence of a vinpocetine alkaline degradation product. Chromatographic separation was achieved using an isocratic mobile phase consisting of acetonitrile-0.02 M KH2PO4 [containing 0.2% (v/v) triethylamine and adjusted to pH 6 with orthophosphoric acid; (80 + 20, v/v)] at a flow rate of 0.9 mL/min at ambient temperature on a Eurospher II C18 (250 × 4.6 mm, 5 μm) column, with UV detection at 280 nm for folic acid and 230 nm for vinpocetine and its alkaline hydrolytic product. Linearity, accuracy, and precision were found to be acceptable over a concentration range of 12.5-200 μg/mL for vinpocetine and 1-16 μg/mL for folic acid. The proposed method was successfully applied for the determination of both drugs and a vinpocetine hydrolysis product in a laboratory-prepared mixture and in a capsule containing both drugs.

  7. FUSION-Guided Hypothesis Development Leads to the Identification of N6,N6-Dimethyladenosine, a Marine-Derived AKT Pathway Inhibitor

    PubMed Central

    Vaden, Rachel M.; Oswald, Nathaniel W.; Potts, Malia B.; MacMillan, John B.; White, Michael A.

    2017-01-01

    Chemicals found in nature have evolved over geological time scales to productively interact with biological molecules, and thus represent an effective resource for pharmaceutical development. Marine-derived bacteria are rich sources of chemically diverse, bioactive secondary metabolites, but harnessing this diversity for biomedical benefit is limited by challenges associated with natural product purification and determination of biochemical mechanism. Using Functional Signature Ontology (FUSION), we report the parallel isolation and characterization of a marine-derived natural product, N6,N6-dimethyladenosine, that robustly inhibits AKT signaling in a variety of non-small cell lung cancer cell lines. Upon validation of the elucidated structure by comparison with a commercially available sample, experiments were initiated to understand the small molecule’s breadth of effect in a biological setting. One such experiment, a reverse phase protein array (RPPA) analysis of >50 kinases, indicated a specific cellular response to treatment. In all, leveraging the FUSION platform allowed for the rapid generation and validation of a biological mechanism of action hypothesis for an unknown natural product and permitted accelerated purification of the bioactive component from a chemically complex fraction. PMID:28294973

  8. High-Speed Friction-Stir Welding To Enable Aluminum Tailor-Welded Blanks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Upadhyay, Piyush; Carsley, John

    Current joining technologies for automotive aluminum alloys are utilized in low-volume and niche applications, and have yet to be scaled for the high-volume vehicle market. This study targeted further weight reduction, part reduction, and cost savings by enabling tailor-welded blank technology for aluminum alloys at high volumes. While friction stir welding has traditionally been applied at linear velocities of less than one meter per minute, high-volume production applications demand that the process be extended to higher velocities more amenable to cost-sensitive production environments. Unfortunately, weld parameters and performance developed and characterized at low to moderate welding velocities do not directly translate to high-speed linear friction stir welding. Therefore, in order to facilitate production of high-volume aluminum welded components, parameters were developed with a minimum welding velocity of three meters per minute. With an emphasis on weld quality, welded blanks were evaluated for post-weld formability utilizing a combination of numerical and experimental methods. Evaluation across scales was ultimately validated by stamping full-size production door inner panels made from dissimilar-thickness aluminum tailor-welded blanks, which provided validation of the numerical and experimental analysis of laboratory-scale tests.

  9. Modernization of AOAC Nutrient Methods by Stakeholder Panel on Infant Formula and Adult Nutritionals.

    PubMed

    Sullivan, Darryl

    2016-01-01

    Infant formula is one of the most highly regulated products in the world. To comply with global regulations and to ensure the products are manufactured within product specifications, accurate analytical testing is required. Most of the AOAC INTERNATIONAL legacy test methods for infant formula were developed and validated in the 1980s and 1990s. Although these methods performed very well for many years, infant formulas have been updated, and today's products contain many new and novel ingredients. There were a number of cases in which the legacy AOAC methods began to result in problems with the analysis of modern infant formulas, and the use of these methods caused some disputes with regulatory agencies. In 2010, AOAC reached an agreement with the International Formula Council, which has changed its name to the Infant Nutrition Council of America, regarding a project to modernize these AOAC infant-formula test methods. This agreement led to the development of Standard Method Performance Requirements (SMPRs®) for 28 nutrients. After SMPR approval, methods were collected, evaluated, validated, and approved through the AOAC Official Methods℠ process. Forty-seven methods have been approved as AOAC First Action methods, and eight have been approved as Final Action methods.

  10. Development and Applications of a New, High-Resolution, Operational MISR Aerosol Product

    NASA Astrophysics Data System (ADS)

    Garay, M. J.; Diner, D. J.; Kalashnikova, O.

    2014-12-01

    Since early 2000, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite has been providing aerosol optical depth (AOD) and particle property retrievals at 17.6 km spatial resolution. Capitalizing on the capabilities provided by multi-angle viewing, the operational MISR algorithm performs well, with about 75% of MISR AOD retrievals falling within 0.05 or 20% × AOD of the paired validation data from the ground-based Aerosol Robotic Network (AERONET), and is able to distinguish aerosol particles by size and sphericity, over both land and water. These attributes enable a variety of applications, including aerosol transport model validation and global air quality assessment. Motivated by the adverse impacts of aerosols on human health at the local level, and taking advantage of computational speed advances that have occurred since the launch of Terra, we have implemented an operational MISR aerosol product with 4.4 km spatial resolution that maintains, and sometimes improves upon, the quality of the 17.6 km resolution product. We will describe the performance of this product relative to the heritage 17.6 km product, the global AERONET validation network, and high spatial density AERONET-DRAGON sites. Other changes that simplify product content, and make working with the data much easier for users, will also be discussed. Examples of how the new product demonstrates finer spatial variability of aerosol fields than previously retrieved, and ways this new dataset can be used for studies of local aerosol effects, will be shown.
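    The agreement statistic quoted above (about 75% of MISR AOD retrievals falling within 0.05 or 20% × AOD of the AERONET value) can be read as a tolerance envelope. A small illustrative sketch of that check, with invented AOD pairs rather than real MISR/AERONET data:

```python
# Sketch of the MISR-vs-AERONET agreement envelope described above: a
# retrieval counts as matching when it falls within ±(0.05 or 20% × AOD),
# whichever is larger. The AOD pairs below are invented for the example.
def within_envelope(aod_misr, aod_aeronet):
    tolerance = max(0.05, 0.20 * aod_aeronet)
    return abs(aod_misr - aod_aeronet) <= tolerance

# (MISR retrieval, AERONET reference) pairs -- hypothetical values
pairs = [(0.10, 0.08), (0.45, 0.30), (0.52, 0.50)]
fraction = sum(within_envelope(m, a) for m, a in pairs) / len(pairs)
print(fraction)  # fraction of retrievals inside the envelope
```

    Computing this fraction over a large set of coincident retrievals is the kind of summary statistic used to compare the 4.4 km product against the heritage 17.6 km product.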

  11. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    NASA Astrophysics Data System (ADS)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support a better management of the marine environment (maritime security, environmental and resources protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system in the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger-scale Mediterranean Forecasting System (MFS) with a spatial resolution of 1.5-2 km. WMOP aims at reproducing both the basin-scale ocean circulation and the mesoscale variability, which is known to play a crucial role due to its strong interaction with the large-scale circulation in this region. An operational validation system has been developed to systematically assess the model outputs at daily, monthly and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures allow the general realism of the daily production of the ocean forecasting system to be monitored and certified before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, transports in key sections) are computed every day, both at the basin scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.
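    At its core, this kind of operational validation reduces to point-wise comparison statistics between model fields and matched observations. A minimal sketch of two standard metrics (bias and RMSE) on sea surface temperature; the SST values are invented for illustration and do not come from WMOP:

```python
# Illustrative model-vs-observation comparison of the kind an operational
# validation system computes. The matched SST values (deg C) are invented.
import math

model_sst = [18.2, 18.9, 19.4, 20.1]  # model values at observation points
obs_sst   = [18.0, 19.1, 19.2, 20.4]  # matched observations

diffs = [m - o for m, o in zip(model_sst, obs_sst)]
bias = sum(diffs) / len(diffs)                           # mean error
rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs)) # spread of error
print(round(bias, 3), round(rmse, 3))
```

    Tracking such statistics per day and per sub-region is what turns raw comparisons into the routine quality indicators described above.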

  12. Southern Africa Validation of NASA's Earth Observing System (SAVE EOS)

    NASA Technical Reports Server (NTRS)

    Privette, Jeffrey L.

    2000-01-01

    Southern Africa Validation of EOS (SAVE) is a 4-year, multidisciplinary effort to validate operational and experimental products from Terra, the flagship satellite of NASA's Earth Observing System (EOS). At test sites from Zambia to South Africa, we are measuring soil, vegetation and atmospheric parameters over a range of ecosystems for comparison with products from Terra, Landsat 7, AVHRR and SeaWiFS. The data are also employed to parameterize and improve vegetation process models. Fixed-point and mobile "transect" sampling are used to collect the ground data. These are extrapolated over larger areas with fine-resolution multispectral imagery. We describe the sites, infrastructure, and measurement strategies developed under SAVE, as well as initial results from our participation in the first Intensive Field Campaign of SAFARI 2000. We also describe SAVE's role in the Kalahari Transect Campaign (February/March 2000) in Zambia and Botswana.

  13. Development of measures assessing attitudes toward contraband tobacco among a web-based sample of smokers.

    PubMed

    Adkison, Sarah E; O'Connor, Richard J; Chaiton, Michael; Schwartz, Robert

    2015-01-01

    As regulation of tobacco products tightens, there are concerns that illicit markets may develop to supply restricted products. However, there are few validated measures to assess attitudes or purchase intentions toward contraband tobacco (CT). As such, it is important to investigate individual level characteristics that are associated with the purchase and use of contraband tobacco. In May 2013, a pilot survey assessed attitudes, behaviors, and purchase intentions for contraband tobacco based on previous research regarding non-tobacco contraband. The survey was administered via Amazon Mechanical Turk, a crowdsourcing resource, among current smoking respondents in the United States and Canada. Structural equation modeling was used to evaluate the validity of the proposed model for understanding attitudes toward contraband tobacco. CT purchasers were more likely to report norms supportive of counterfeit products, more intentions toward purchasing counterfeit products, a lowered risk associated with these products, and to have more favorable attitudes toward CT than those who had not purchased CT. Attitudes toward CT mediated the relationship between subjective norms and prior purchase with behavior intentions. Perceived risk had a significant direct effect on intentions and an indirect effect through attitudes toward CT. The structural model fit the data well and accounted for over half (53%) of the variance in attitudes toward tobacco. Understanding the mechanisms associated with CT attitudes and purchase behaviors may provide insight for how to mitigate possible iatrogenic consequences of newly implemented regulations. The measures developed here elucidate some elements that influence attitudes and purchase intentions for CT and may inform policy efforts to curtail the development of illicit markets.

  14. Stress Degradation Studies on Varenicline Tartrate and Development of a Validated Stability-Indicating HPLC Method

    PubMed Central

    Pujeri, Sudhakar S.; Khader, Addagadde M. A.; Seetharamappa, Jaldappagari

    2012-01-01

    A simple, rapid and stability-indicating reversed-phase liquid chromatographic method was developed for the assay of varenicline tartrate (VRT) in the presence of its degradation products generated from forced decomposition studies. The HPLC separation was achieved on a C18 Inertsil column (250 mm × 4.6 mm i.d., 5 μm particle size) employing a mobile phase consisting of ammonium acetate buffer containing trifluoroacetic acid (0.02 M; pH 4) and acetonitrile in gradient mode at a flow rate of 1.0 mL min⁻¹. The UV detector was operated at 237 nm while the column temperature was maintained at 40 °C. The developed method was validated as per ICH guidelines with respect to specificity, linearity, precision, accuracy, robustness and limit of quantification. The method was found to be simple, specific, precise and accurate. Selectivity of the proposed method was validated by subjecting the stock solution of VRT to acidic, basic, photolytic, oxidative and thermal degradation. The calibration curve was found to be linear in the concentration range of 0.1–192 μg mL⁻¹ (R² = 0.9994). The peaks of degradation products did not interfere with that of pure VRT. The utility of the developed method was examined by analyzing tablets containing VRT. The results of the analysis were subjected to statistical analysis. PMID:22396908

  15. Development of Servo Motor Trainer for Basic Control System in Laboratory of Electrical Engineering Control System Faculty of Engineering Universitas Negeri Surabaya

    NASA Astrophysics Data System (ADS)

    Endryansyah; Wanarti Rusimamto, Puput; Ridianto, Adam; Sugiarto, Hariyadi

    2018-04-01

    The Department of Electrical Engineering, FT Unesa, has three study programs: S1 Electrical Engineering Education, S1 Electrical Engineering, and D3 Electrical Engineering. The Basic Control System course is part of the curriculum of all three programs. The course's lecturer team sought a learning innovation, focused on developing a trainer for student practicum in the control systems laboratory. The trainer developed is a servo motor together with a lab module containing a range of theory about the servo motor and a practicum guide. This is development research using the Research & Development (R&D) method, with the following steps: identify the potential and existing problems, gather information and study the literature, design the product, validate the design, revise the design, and conduct a limited trial. The validation results for the learning materials were as follows: the learning-device validation score was 3.64, the servo motor lab module validation score was 3.47, and the student questionnaire response score was 3.73. All validation scores lie in the interval from 3.25 to 4, corresponding to the category “Very Valid”, so it can be concluded that all instruments have a “Very Valid” level of validity and are suitable for use in further learning.

  16. Model development and experimental validation of capnophilic lactic fermentation and hydrogen synthesis by Thermotoga neapolitana.

    PubMed

    Pradhan, Nirakar; Dipasquale, Laura; d'Ippolito, Giuliana; Fontana, Angelo; Panico, Antonio; Pirozzi, Francesco; Lens, Piet N L; Esposito, Giovanni

    2016-08-01

    The aim of the present study was to develop a kinetic model for a recently proposed unique and novel metabolic process called the capnophilic (CO2-requiring) lactic fermentation (CLF) pathway in Thermotoga neapolitana. The model was based on Monod kinetics, and the mathematical expressions were developed to enable the simulation of biomass growth, substrate consumption and product formation. The calibrated kinetic parameters, namely the maximum specific uptake rate (k), semi-saturation constant (kS), biomass yield coefficient (Y) and endogenous decay rate (kd), were 1.30 h−1, 1.42 g/L, 0.1195 and 0.0205 h−1, respectively. A high correlation (>0.98) was obtained between the experimental data and model predictions for both model validation and cross-validation processes. An increase of the lactate production in the range of 40-80% was obtained through the CLF pathway compared to the classic dark fermentation model. The proposed kinetic model is the first mechanistically based model for the CLF pathway. This model provides useful information to improve the knowledge about how acetate and CO2 are recycled back by Thermotoga neapolitana to produce lactate without compromising the overall hydrogen yield. Copyright © 2016 Elsevier Ltd. All rights reserved.
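    The Monod-type structure described in this abstract can be illustrated with a minimal forward-Euler integration. The parameter values are the calibrated ones quoted above; the simple two-state (biomass/substrate) form and the initial conditions are illustrative assumptions, not the full CLF model from the paper.

```python
# Sketch of Monod kinetics: substrate uptake q = k*S/(kS+S),
# biomass growth Y*q*X minus endogenous decay kd*X.

k, kS, Y, kd = 1.30, 1.42, 0.1195, 0.0205   # h^-1, g/L, -, h^-1 (from abstract)

def simulate(S0=10.0, X0=0.05, t_end=48.0, dt=0.01):
    """Integrate dS/dt = -q*X and dX/dt = Y*q*X - kd*X with forward Euler."""
    S, X, t = S0, X0, 0.0
    while t < t_end:
        q = k * S / (kS + S)          # specific substrate uptake rate
        S = max(S - q * X * dt, 0.0)
        X = max(X + (Y * q * X - kd * X) * dt, 0.0)
        t += dt
    return S, X

S_end, X_end = simulate()   # substrate is largely consumed within 48 h
```

    A calibrated model of this kind would be fitted by minimizing the error between simulated and measured time courses, then checked by cross-validation as the authors describe.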

  17. A validated stability-indicating RP-HPLC method for levofloxacin in the presence of degradation products, its process related impurities and identification of oxidative degradant.

    PubMed

    Lalitha Devi, M; Chandrasekhar, K B

    2009-12-05

    The objective of the current study was to develop a validated, specific, stability-indicating reversed-phase liquid chromatographic method for the quantitative determination of levofloxacin and its related substances in bulk samples and pharmaceutical dosage forms in the presence of degradation products and process-related impurities. Forced degradation studies were performed on a bulk sample of levofloxacin under ICH-prescribed stress conditions, using acid, base, oxidative, water hydrolysis, thermal stress and photolytic degradation, to show the stability-indicating power of the method. Significant degradation was observed under oxidative stress, and the degradation product formed was identified by LC-MS/MS; slight degradation occurred under acidic stress, and no degradation was observed under the other stress conditions. The chromatographic method was optimized using the samples generated from the forced degradation studies and the impurity-spiked solution. Good resolution of the peaks corresponding to process-related impurities and degradation products from the analyte peak was achieved on an ACE C18 column using a mobile phase consisting of a mixture of 0.5% (v/v) triethylamine in sodium dihydrogen orthophosphate dihydrate buffer (25 mM; pH 6.0) and methanol in a simple linear gradient. The detection was carried out at 294 nm. The limit of detection and the limit of quantitation for levofloxacin and its process-related impurities were established. The stressed test solutions were assayed against the qualified working standard of levofloxacin, and the mass balance in each case was between 99.4 and 99.8%, indicating that the developed LC method was stability indicating. Validation of the developed LC method was carried out as per ICH requirements. The developed LC method was found to be suitable to check the quality of bulk samples of levofloxacin at the time of batch release and also during its stability studies (long-term and accelerated stability).

  18. Validation of multisource electronic health record data: an application to blood transfusion data.

    PubMed

    Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B

    2017-07-14

    Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. Therefore, we developed a practical approach for validating routinely collected data from multiple sources and applied it to a blood transfusion data warehouse to evaluate its usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) and internal consistency concepts which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to a transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and, in some cases, adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products was initially 2.2%, but was reduced to 0.17% by adjusting the data extraction criteria. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, this contributes to increased transparency and greater reliability of research based on routinely collected electronic health records.

  19. STAR Algorithm Integration Team - Facilitating operational algorithm development

    NASA Astrophysics Data System (ADS)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  20. Transformation in the pharmaceutical industry: transformation-induced quality risks--a survey.

    PubMed

    Shafiei, Nader; Ford, James L; Morecroft, Charles W; Lisboa, Paulo J; Taylor, Mark J; Mouzughi, Yusra

    2013-01-01

    This paper is the fourth in a series that explores ongoing transformation in the pharmaceutical industry and its impact on pharmaceutical quality from the perspective of risk identification. The aim of this paper is to validate proposed quality risks through elicitation of expert opinion and to define the resultant quality risk model. Expert opinion was obtained using a questionnaire-based survey of participants with recognized expertise in pharmaceutical regulation, product lifecycle, or technology. The results of the survey validate the theoretical and operational evidence in support of the four main pharmaceutical transformation triggers previously identified. The quality risk model resulting from the survey indicated a firm relationship between the pharmaceutical quality risks and regulatory compliance outcomes during the marketing approval and post-marketing phases of the product lifecycle, and a weaker relationship during the pre-market evaluation phase. In this paper, the proposed quality risks carried forward from an earlier part of the research are validated through an expert opinion survey, and the resultant quality risk model is defined. The quality risk model indicates that transformation-related risks have a larger regulatory compliance impact during product approval, manufacturing, distribution, and commercial use than during the development phase.

  1. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  2. The development of an integrated assessment instrument for measuring analytical thinking and science process skills

    NASA Astrophysics Data System (ADS)

    Irwanto; Rohaeti, Eli; LFX, Endang Widjajanti; Suyanta

    2017-05-01

    This research aims to develop an instrument and determine the characteristics of an integrated assessment instrument. This research uses the 4-D model, which comprises the define, design, develop, and disseminate stages. The primary product was validated by expert judgment, its readability was tested by students, and its feasibility was assessed by chemistry teachers. This research involved 246 students of grade XI from four senior high schools in Yogyakarta, Indonesia. Data collection techniques included interview, questionnaire, and test. Data collection instruments included an interview guideline, item validation sheet, users' response questionnaire, instrument readability questionnaire, and essay test. The results show that the integrated assessment instrument has an Aiken validity value of 0.95. Item reliability was 0.99 and person reliability was 0.69. Teachers' response to the integrated assessment instrument was very good. Therefore, the integrated assessment instrument is feasible to be applied to measure the students' analytical thinking and science process skills.

  3. Validity of a Semi-Quantitative Food Frequency Questionnaire for Collegiate Athletes

    PubMed Central

    Sasaki, Kazuto; Suzuki, Yoshio; Oguma, Nobuhide; Ishihara, Junko; Nakai, Ayumi; Yasuda, Jun; Yokoyama, Yuri; Yoshizaki, Takahiro; Tada, Yuki; Hida, Azumi; Kawano, Yukari

    2016-01-01

    Background Food frequency questionnaires (FFQs) have been developed and validated for various populations. To our knowledge, however, no FFQ has been validated for young athletes. Here, we investigated whether an FFQ that was developed and validated to estimate dietary intake in middle-aged persons was also valid for estimating that in young athletes. Methods We applied an FFQ that had been developed for the Japan Public Health Center-based Prospective Cohort Study with modification to the duration of recollection. A total of 156 participants (92 males) completed the FFQ and a 3-day non-consecutive 24-hour dietary recall (24hDR). Validity of the mean estimates was evaluated by calculating the percentage differences between the 24hDR and FFQ. Ranking estimation was validated using Spearman’s correlation coefficient (CC), and the degree of miscategorization was determined by joint classification. Results The FFQ underestimated energy intake by approximately 10% for both males and females. For 35 nutrients, the median (range) deattenuated CC was 0.30 (0.10 to 0.57) for males and 0.32 (−0.08 to 0.62) for females. For 19 food groups, the median (range) deattenuated CC was 0.32 (0.17 to 0.72) for males and 0.34 (−0.11 to 0.58) for females. For both nutrient and food group intakes, cross-classification analysis indicated extreme miscategorization rates of 3% to 5%. Conclusions An FFQ developed and validated for middle-aged persons had comparable validity among young athletes. This FFQ might be useful for assessing habitual dietary intake in collegiate athletes, especially for calcium, vitamin C, vegetables, fruits, and milk and dairy products. PMID:26902164
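    The two validity statistics used in this record, a rank correlation between FFQ and 24hDR intakes and the "extreme miscategorization" rate from joint quartile classification, can be sketched as below. The intake values are fabricated illustrations, not study data, and the plain Spearman coefficient shown here omits the deattenuation correction the authors applied.

```python
# Spearman's rank correlation plus joint quartile cross-classification,
# the standard checks for ranking validity of a food frequency questionnaire.

def ranks(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for idx in order[i:j + 1]:
            r[idx] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def quartile(values):
    """Assign each value to a quartile 0-3 by rank."""
    n = len(values)
    q = [0] * n
    for pos, idx in enumerate(sorted(range(n), key=lambda i: values[i])):
        q[idx] = min(3, pos * 4 // n)
    return q

ffq  = [52, 88, 61, 79, 95, 40, 70, 83]   # hypothetical intake, g/day, FFQ
dr24 = [50, 90, 55, 85, 99, 45, 65, 80]   # same nutrient from 24hDR
cc = spearman(ffq, dr24)
extreme = sum(1 for a, b in zip(quartile(ffq), quartile(dr24))
              if abs(a - b) == 3) / len(ffq)   # opposite-quartile rate
```

    In the study, deattenuated coefficients additionally correct the observed correlation for day-to-day (within-person) variation in the 24hDR reference.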

  4. Validation of GEOLAND-2 Spot/vgt Albedo Products by Using Ceos Olive Methodology

    NASA Astrophysics Data System (ADS)

    Camacho de Coca, F.; Sanchez, J.; Schaaf, C.; Baret, F.; Weiss, M.; Cescatti, A.; Lacaze, R. N.

    2012-12-01

    This study evaluates the scientific merit of the global surface albedo products developed in the framework of the Geoland-2 project based on SPOT/VEGETATION observations. The methodology follows the OLIVE (On-Line Validation Exercise) approach supported by the CEOS Land Product Validation subgroup (calvalportal.ceos.org/cvp/web/olive). First, the spatial and temporal consistency of SPOT/VGT albedo products was assessed by intercomparison with reference global products (MODIS/Terra+Aqua and POLDER-3/PARASOL) for the period 2006-2007. A bulk statistical analysis over a global network of 420 homogeneous sites (BELMANIP-2) was performed and analyzed per biome type. Additional sites were included to study albedo under snow conditions. Second, the accuracy and realism of temporal variations were evaluated using ground measurements from FLUXNET sites suitable for direct comparison with the co-located satellite data. Our results show that SPOT/VGT albedo products present a reliable spatial and temporal distribution of retrievals. The SPOT/VGT albedo compares very well with MODIS, with a mean bias and RMSE for the shortwave black-sky albedo over BELMANIP-2 sites lower than 0.006 and 0.03 (13% in relative terms), respectively, and even better for snow-free pixels. Similar results were found for the white-sky albedo quantities. Discrepancies are larger when comparing with POLDER-3 products: for the shortwave black-sky albedo, a mean bias of -0.014 and an RMSE of 0.04 (20%) were found. These overall performance figures are, however, land-cover dependent, and larger uncertainties were found over some biomes (or regions) and specific periods (e.g., winter in the Northern Hemisphere). The comparison of SPOT/VGT blue-sky albedo estimates with ground measurements (mainly over needleleaf forest sites) shows an RMSE of 0.04 and a bias of 0.003 when only snow-free pixels are considered. Moreover, this work shows that the OLIVE tool is also suitable for validation of global albedo products.
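    The intercomparison statistics quoted in this record (mean bias, RMSE, and RMSE in relative terms) can be computed for any pair of collocated retrievals as follows. The albedo pairs below are invented stand-ins for matched SPOT/VGT and MODIS values over validation sites.

```python
# Mean bias, RMSE and relative RMSE between a test product and a reference.

def bias_rmse(test, ref):
    n = len(test)
    diffs = [t - r for t, r in zip(test, ref)]
    bias = sum(diffs) / n                          # mean bias
    rmse = (sum(d * d for d in diffs) / n) ** 0.5  # root-mean-square error
    rel = rmse / (sum(ref) / n)                    # RMSE relative to mean reference
    return bias, rmse, rel

vgt   = [0.12, 0.15, 0.22, 0.18, 0.30]   # hypothetical SPOT/VGT black-sky albedo
modis = [0.11, 0.16, 0.21, 0.19, 0.28]   # collocated MODIS reference
bias, rmse, rel_rmse = bias_rmse(vgt, modis)
```

    In the OLIVE workflow, these statistics would be aggregated per biome type and per season before drawing conclusions about product quality.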

  5. Aura Atmospheric Data Products and Their Availability from NASA Goddard Earth Sciences DAAC

    NASA Technical Reports Server (NTRS)

    Ahmad, S.; Johnson, J.; Gopalan, A.; Smith, P.; Leptoukh, G.; Kempler, S.

    2004-01-01

    NASA's EOS-Aura spacecraft was launched successfully on July 15, 2004. The four instruments onboard the spacecraft are the Microwave Limb Sounder (MLS), the Ozone Monitoring Instrument (OMI), the Tropospheric Emission Spectrometer (TES), and the High Resolution Dynamics Limb Sounder (HIRDLS). The Aura instruments are designed to gather earth science measurements across the ultraviolet, visible, infrared, thermal and microwave regions of the electromagnetic spectrum. Aura will provide over 70 distinct standard atmospheric data products for use in ozone layer and surface UV-B monitoring, air quality forecasting, and atmospheric chemistry and climate change studies (http://eosaura.gsfc.nasa.gov/). These products include earth-atmosphere radiances and solar spectral irradiances; total column, tropospheric, and profile measurements of ozone and other trace gases; surface UV-B flux; cloud and aerosol characteristics; and temperature, geopotential height, and water vapor profiles. The MLS, OMI, and HIRDLS data products will be archived at the NASA Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC), while data from TES will be archived at the NASA Langley Research Center DAAC. Some of the standard products which have gone through quick preliminary checks are already archived at the GES DAAC (http://daac.gsfc.nasa.gov/) and are available to the Aura science team and data validation team members for data validation, and to application and visualization software developers for testing their application modules. Once data are corrected for obvious calibration problems and partially validated using in-situ observations, they will be made available to the broader user community. This presentation will provide details of the whole suite of Aura atmospheric data products, and the timeline of the availability of the remaining preliminary products and of the partially validated provisional products. Software and tools available for data access, visualization, and data mining will also be discussed.

  6. Development of a New Time-Temperature Indicator for Enzymatic Validation of Pasteurization of Meat Products.

    PubMed

    Brizio, Ana Paula Dutra Resem; Prentice, Carlos

    2015-06-01

    This paper presents the development of a new smart time-temperature indicator (TTI) of pasteurization whose operating principle is based on the complexation reaction between starch and iodine, and the subsequent action of an amylase on this complex causing its discoloration at a rate dependent on the time and temperature of the medium. Laboratory simulations and tests in a manufacturing plant evaluated different enzyme concentrations in the TTI prototypes when exposed to pasteurization conditions. The results showed that the color response of the indicators could be interpreted visually as well as measured with appropriate equipment, with satisfactory reliability in all conditions studied. The TTI containing 6.5% amylase gave the best results and was suitable for use in validating the cooking of hams. When attached to the primary packaging of the product, this TTI indicated the pasteurization process inexpensively, easily, accurately, and nondestructively. © 2015 Institute of Food Technologists®

  7. Online Assessment of Satellite-Derived Global Precipitation Products

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, D.; Teng, W.; Kempler, S.

    2012-01-01

    Precipitation is difficult to measure and predict. Each year, droughts and floods cause severe property damage and human casualties around the world, so accurate measurement and forecasting are important for mitigation and preparedness efforts. Significant progress has been made over the past decade in satellite precipitation product development. In particular, products' spatial and temporal resolutions as well as their timely availability have been improved by blended techniques, and the resulting products are widely used in various research and applications. However, biases and uncertainties are common among precipitation products, and it remains difficult to quickly assess product quality, biases and behavior at a local or regional scale, that is, over user-defined areas or points of interest. Current online inter-comparison and validation services have not addressed this issue adequately. To address it, we have developed a prototype to inter-compare satellite-derived daily products in the TRMM Online Visualization and Analysis System (TOVAS). Despite its limited functionality and datasets, users can use this tool to generate customized plots within the United States for 2005. In addition, users can download customized data for further analysis, e.g., comparison with their own gauge data. To meet increasing demands, we plan to increase the temporal coverage and expand the spatial coverage from the United States to the globe. More products have been added as well. In this poster, we present two new tools: inter-comparison of 3B42RT and 3B42, and inter-comparison of V6 and V7 TRMM Level-3 monthly products. Future plans include integrating IPWG (International Precipitation Working Group) validation algorithms and statistics, allowing users to generate customized plots and data. In addition, we will extend the current daily products to monthly and climatology products. Whenever the TRMM science team changes the product version number, users would like to know the differences by inter-comparing both versions of TRMM products in their areas of interest; making this service available will help them better understand the associated changes. We plan to implement this inter-comparison on TRMM standard monthly products with the IPWG algorithms. The plans outlined above will complement and accelerate the existing and ongoing validation activities in the community as well as enhance data services for TRMM and the future Global Precipitation Measurement (GPM) mission.

  8. Validation of an aggregate exposure model for substances in consumer products: a case study of diethyl phthalate in personal care products

    PubMed Central

    Delmaar, Christiaan; Bokkers, Bas; ter Burg, Wouter; Schuur, Gerlienke

    2015-01-01

    As personal care products (PCPs) are used in close contact with a person, they are a major source of consumer exposure to chemical substances contained in these products. The estimation of realistic consumer exposure to substances in PCPs is currently hampered by the lack of appropriate data and methods. To estimate aggregate exposure of consumers to substances contained in PCPs, a person-oriented consumer exposure model has been developed (the Probabilistic Aggregate Consumer Exposure Model, PACEM). The model simulates daily exposure in a population based on product use data collected from a survey among the Dutch population. The model is validated by comparing diethyl phthalate (DEP) dose estimates to dose estimates based on biomonitoring data. It was found that the model's estimates compared well with the estimates based on biomonitoring data. This suggests that the person-oriented PACEM model is a practical tool for assessing realistic aggregate exposures to substances in PCPs. In the future, PACEM will be extended with use pattern data on other product groups. This will allow for assessing aggregate exposure to substances in consumer products across different product groups. PMID:25352161

  9. Utility of Hantzsch reaction for development of highly sensitive spectrofluorimetric method for determination of alfuzosin and terazosin in bulk, dosage forms and human plasma.

    PubMed

    Hammad, Mohamed A; Omar, Mahmoud A; Salman, Baher I

    2017-09-01

    A highly sensitive, cheap, simple and accurate spectrofluorimetric method has been developed and validated for the determination of alfuzosin hydrochloride and terazosin hydrochloride in their pharmaceutical dosage forms and in human plasma. The developed method is based on the reaction of the primary amine moiety in the studied drugs with acetylacetone and formaldehyde according to the Hantzsch reaction, producing yellow fluorescent products that can be measured spectrofluorimetrically at 480 nm after excitation at 415 nm. Different experimental parameters affecting the development and stability of the reaction products were carefully studied and optimized. The fluorescence-concentration plots of alfuzosin and terazosin were rectilinear over a concentration range of 70–900 ng mL−1, with quantitation limits of 27.1 and 32.2 ng mL−1 for alfuzosin and terazosin, respectively. The proposed method was validated according to ICH guidelines and successfully applied to the analysis of the investigated drugs in dosage forms, content uniformity testing and spiked human plasma with high accuracy. Copyright © 2017 John Wiley & Sons, Ltd.

  10. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata

    2016-09-01

    Understanding and characterizing degradation mechanisms is critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, the Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely Damp Heat and Thermal Cycling. The method is based on the design of an accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix, a modeling scheme is developed to predict field performance from the results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data become available. While the demonstration of the method in this work is for thin-film flexible PV modules, the framework and methodology can be adapted to other PV products.
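    One common form of acceleration factor model for temperature-driven degradation is the Arrhenius relation; the sketch below is a generic illustration of that idea, not the specific SMART model of the paper, and the activation energy and temperatures are assumed values.

```python
# Arrhenius acceleration factor: how much faster a thermally activated
# degradation mechanism proceeds at test temperature than in field use.
import math

K_B = 8.617e-5   # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_test_c):
    """Acceleration factor of a test at t_test_c relative to use at t_use_c."""
    t_use = t_use_c + 273.15
    t_test = t_test_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_test))

# e.g. a Damp Heat chamber at 85 C vs. a 25 C use environment,
# with an assumed activation energy of 0.7 eV
af = arrhenius_af(0.7, 25.0, 85.0)
```

    A validated acceleration factor of this kind lets hours of chamber exposure be translated into equivalent years of field exposure for the failure mode in question.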

  11. Development and in-line validation of a Process Analytical Technology to facilitate the scale up of coating processes.

    PubMed

    Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P

    2013-05-05

    Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which can be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a benchtop laboratory analytical method and is usually not implemented in the production process. Many scientific approaches stop at the level of feasibility studies and never make the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method, and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing, could be shown. Finally, this method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
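    The calibration-and-endpoint idea described here can be sketched with a univariate least-squares stand-in for PLS: correlate one Raman band intensity with the coated API amount, then predict the amount for a new in-line spectrum and stop coating at the target. The intensities and amounts are fabricated for illustration; a real PLS model would use full spectra and a multivariate library.

```python
# Simplified in-line calibration: band intensity vs. coated API amount,
# used for coating-endpoint determination (univariate stand-in for PLS).

def fit_line(x, y):
    """Least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y)) /
             sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# hypothetical Raman band intensity (a.u.) vs. coated API amount (mg)
intensity = [100, 220, 340, 470, 590]
api_mg    = [1.0, 2.0, 3.0, 4.0, 5.0]
slope, intercept = fit_line(intensity, api_mg)

def predict(i):
    """Predict coated API amount from a new in-line intensity reading."""
    return slope * i + intercept

# endpoint check: stop the coater once the predicted amount reaches target
target_mg = 4.5
reached = predict(540) >= target_mg
```

    The scale-up question in the paper is whether such a model, calibrated on lab-scale runs, still predicts accurately on production-scale equipment.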

  12. The relationship between organizational trust and nurse administrators’ productivity in hospitals

    PubMed Central

    Bahrami, Susan; Hasanpour, Marzieh; Rajaeepour, Saeed; Aghahosseni, Taghi; Hodhodineghad, Nilofar

    2012-01-01

    Context: Management of health care organizations based on employees' mutual trust will increase improvement in functions and tasks. Aims: The present study investigated the relationship between organizational trust and nurse administrators' productivity in health-education centers of Isfahan University of Medical Sciences. Settings and Design: This was a descriptive, correlational study. Materials and Methods: The population included all nurse administrators; 165 nurses were selected through a random sampling method. Data collection instruments were an organizational trust questionnaire based on Robbins's model and a productivity questionnaire based on Hersey and Blanchard's model. Validity of these questionnaires was established through content validity, and their reliability was calculated with Cronbach's alpha. Statistical Analysis Used: Data were analyzed using SPSS 18. Results: The indicators of organizational trust, such as loyalty, competence, honesty, and stability, were above the average level, but the explicitness indicator was at the average level. The components of productivity, such as ability, job knowledge, environmental compatibility, performance feedback, and validity, were above the average level, but the motivation factor was at the average level and organizational support was below the average level. There was a significant multiple correlation between organizational trust and productivity. Beta coefficients between organizational trust and productivity were significant, no autocorrelation existed, and the regression model was significant. Conclusions: Committed employees, timely performance of tasks, and developing a sense of responsibility among employees can enhance production and productivity in health care organizations. PMID:23922588
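    Cronbach's alpha, the reliability measure cited in this record, can be computed directly from item scores. The 5-point responses below are fabricated for illustration; only the formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), is standard.

```python
# Cronbach's alpha: internal-consistency reliability of a k-item scale.

def variance(xs):
    """Sample variance (n-1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]   # per-respondent total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# 4 items x 6 respondents (Likert 1-5), loosely correlated responses
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
    [3, 5, 3, 4, 2, 4],
]
alpha = cronbach_alpha(items)
```

    Values around 0.7 or higher are conventionally read as acceptable internal consistency for a research questionnaire.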

  13. Overview of cancer vaccines

    PubMed Central

    Kudrin, Alex

    2012-01-01

    Cancer immunotherapy has seen a tremendous number of failures and only a few recent regulatory successes. This review is dedicated to identifying the major regulatory and developmental issues around cancer immunotherapeutics. A three-pillar approach should be used in setting a development path: discovery platforms and a sufficient pool of validated tumor antigens; a product development strategy that brings the product closer to the patient; and a clinical development strategy accounting for the competitive landscape, treatment paradigm, and technical and commercial risks. The regulatory framework existing around cancer vaccines in the EU, US, Japan and some developing countries is outlined. In addition, the review covers some specific issues on the design and conduct of clinical trials with cancer vaccines. PMID:22894970

  14. Development of the Assessment of Belief Conflict in Relationship-14 (ABCR-14).

    PubMed

    Kyougoku, Makoto; Teraoka, Mutsumi; Masuda, Noriko; Ooura, Mariko; Abe, Yasushi

    2015-01-01

    Nurses and other healthcare workers frequently experience belief conflict, one of the most important new stress-related problems in both academic and clinical fields. In this study, using a sample of 1,683 nursing practitioners, we developed the Assessment of Belief Conflict in Relationship-14 (ABCR-14), a new scale that assesses belief conflict in the healthcare field. Standard psychometric procedures were used to develop and test the scale, including qualitative framework-concept and item-pool development, item reduction, and scale development. We analyzed the psychometric properties of the ABCR-14 using entropy, polyserial correlation coefficients, exploratory factor analysis, confirmatory factor analysis, average variance extracted, Cronbach's alpha, the Pearson product-moment correlation coefficient, and multidimensional item response theory (MIRT). The results of the analysis supported a three-factor model consisting of 14 items. The validity and reliability of the ABCR-14 were supported by evidence of high construct validity, structural validity, hypothesis testing, internal consistency reliability, and concurrent validity. The results of the MIRT offered strong support for good item slope and difficulty parameters. However, the ABCR-14 Likert scale might need to be explored further from the MIRT point of view. Nevertheless, as mentioned above, there is sufficient evidence that the ABCR-14 has high validity and reliability. The ABCR-14 demonstrates good psychometric properties for nursing belief conflict. Further studies are recommended to confirm its application in clinical practice.

  15. Validation and Development of the GPCP Experimental One-Degree Daily (1DD) Global Precipitation Product

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Bolvin, David T.; Einaud, Franco (Technical Monitor)

    2000-01-01

    The One-Degree Daily (1DD) precipitation dataset has been developed for the Global Precipitation Climatology Project (GPCP) and is currently in beta test preparatory to release as an official GPCP product. The 1DD provides a globally complete, observation-only estimate of precipitation on a daily 1 deg. x 1 deg. grid for the period 1997 through early 2000 (by the time of the conference). In the latitude band 40N-40S the 1DD uses the Threshold-Matched Precipitation Index (TMPI), a GPI-like IR product with the pixel-level T(sub b) threshold and (single) conditional rain rate determined locally for each month by the frequency of precipitation in the GPROF SSM/I product and by the precipitation amount in the GPCP monthly satellite-gauge (SG) combination. Outside 40N-40S the 1DD uses a scaled TOVS precipitation estimate that has month-by-month adjustments based on the TMPI and the SG. Early validation results are encouraging. The 1DD shows relatively large scatter about the daily validation values in individual grid boxes, as expected for a technique that depends on cloud-sensing schemes such as the TMPI and TOVS. On the other hand, the time series of the 1DD shows good correlation with validation data in individual boxes. For example, the 1997-1998 time series of 1DD and Oklahoma Mesonet values in a grid box in northeastern Oklahoma have a correlation coefficient of 0.73. Looking more carefully at these two time series, the number of raining days for the 1DD is within 7% of the Mesonet value, while the distribution of daily rain values is very similar. Other tests indicate that area- or time-averaging improves the error characteristics, making the dataset highly attractive to users interested in stream flow, short-term regional climatology, and model comparisons. The second generation of the 1DD product is currently under development; it is designed to directly incorporate TRMM and other high-quality precipitation estimates. These data are generally sparse because they are observed by low-orbit satellites, so a fair amount of work must be devoted to analyzing the effect of data boundaries. This work is laying the groundwork for effective use of the NASA Global Precipitation Mission, which will have full global coverage by low-orbit passive microwave satellites every three hours.
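
    The box-level validation described above rests on the correlation between daily satellite and gauge time series. A minimal sketch of that comparison, with hypothetical daily rain amounts standing in for the 1DD and Mesonet data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length daily series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Hypothetical daily rain amounts (mm) for one 1 deg. grid box
satellite = [0.0, 2.1, 5.4, 0.0, 1.2, 8.9, 0.3]
gauge     = [0.0, 1.8, 6.0, 0.1, 0.9, 9.5, 0.0]
r = pearson_r(satellite, gauge)
```

    The same series also support the simpler raining-day comparison quoted in the abstract (counting days above a rain/no-rain threshold in each series).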

  16. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
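
    As a rough illustration of the idea of effort points counted from a preliminary object-oriented analysis, the sketch below sums weighted counts of model elements. The element types and weights are illustrative assumptions, not the counting rules proposed in the paper:

```python
# Hypothetical effort-point weights per object-model element.
WEIGHTS = {"class": 4, "method": 1, "attribute": 0.5, "relationship": 2}

def effort_points(model):
    """Sum weighted counts of elements found in a preliminary OO analysis."""
    return sum(WEIGHTS[kind] * count for kind, count in model.items())

# Hypothetical counts from an early object model
analysis = {"class": 12, "method": 96, "attribute": 60, "relationship": 18}
points = effort_points(analysis)
```

    Unlike SLOC, such a count is available before any code exists and does not reward verbosity or low-level languages.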

  17. Development of an image analysis screen for estrogen receptor alpha (ERα) ligands through measurement of nuclear translocation dynamics.

    PubMed

    Dull, Angie; Goncharova, Ekaterina; Hager, Gordon; McMahon, James B

    2010-11-01

    We have developed a robust high-content assay to screen for novel estrogen receptor alpha (ERα) agonists and antagonists by quantitation of cytoplasmic-to-nuclear translocation of an estrogen receptor chimera in 384-well plates. The screen utilizes a green fluorescent protein-tagged glucocorticoid/estrogen receptor (GFP-GRER) chimera, which consists of the N-terminus of the glucocorticoid receptor fused to the human ER ligand binding domain. The GFP-GRER exhibited cytoplasmic localization in the absence of ERα ligands and translocated to the nucleus in response to stimulation with ERα agonists or antagonists. The BD Pathway 435 imaging system was used for image acquisition, analysis of translocation dynamics, and cytotoxicity measurements. The assay was validated with known ERα agonists and antagonists and with the Library of Pharmacologically Active Compounds (LOPAC1280). Additionally, screening of crude natural product extracts demonstrated the robustness of the assay and the ability to quantitate the effects of toxicity on nuclear translocation dynamics. The GFP-GRER nuclear translocation assay was very robust, with z' values >0.7 and CVs <5%; it has been validated with known ER ligands, and the inclusion of cytotoxicity filters will facilitate screening of natural product extracts. This assay has been developed for future primary screening of the synthetic compound, pure natural product, and natural product extract libraries available at the National Cancer Institute at Frederick. Copyright © 2010 Elsevier Ltd. All rights reserved.
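
    The z' value cited above is the standard screening-assay quality metric (Z'-factor) of Zhang et al. A minimal sketch of its computation from plate controls, using hypothetical fluorescence readouts:

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor assay-quality metric: 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# Hypothetical plate-control readouts (arbitrary fluorescence units)
positive = [98, 101, 99, 102, 100]
negative = [10, 12, 11, 9, 10]
q = z_prime(positive, negative)
```

    Values above roughly 0.5 are conventionally taken to indicate an excellent separation between positive and negative controls, so z' > 0.7 is a strong result.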

  18. Systematic Product Development of Control and Diagnosis Functionalities

    NASA Astrophysics Data System (ADS)

    Stetter, R.; Simundsson, A.

    2017-01-01

    In the scientific field of systematic product development, a wide range of helpful methods, guidelines, and tools has been generated and published in recent years. Until now, little attention has been given to design guidelines aimed at supporting product development engineers in designing products that allow and support control or diagnosis functions. The general trend toward ubiquitous computing and the first development steps toward cognitive systems, as well as a general trend toward higher product safety, higher reliability, and reduced total cost of ownership (TCO) in many engineering fields, lead to a greater importance of control and diagnosis. In this paper, a first attempt is made to formulate generally valid guidelines for how products can be developed to allow and achieve effective and efficient control and diagnosis. The guidelines are elucidated using the example of an automated guided vehicle. One main concern of this paper is the integration of control and diagnosis functionalities into the development of complete systems that include mechanical, electrical, and electronic subsystems. For the development of such systems, the strategies, methods, and tools of Systematic Product Development have attracted significant attention during the last decades. Today, the functionality and safety of most products is to a large degree dependent on control and diagnosis functionalities. Still, there is comparatively little research concentrating on the integration of the development of these functionalities into the overall product development process. The paper starts with a background describing Systematic Product Development. The second section deals with the product development of the sample product. The third part clarifies the notions of monitoring, control, and diagnosis. The following parts summarize some insights and formulate first hypotheses concerning control and diagnosis in Systematic Product Development.

  19. Strategies of bringing drug product marketing applications to meet current regulatory standards.

    PubMed

    Wu, Yan; Freed, Anita; Lavrich, David; Raghavachari, Ramesh; Huynh-Ba, Kim; Shah, Ketan; Alasandro, Mark

    2015-08-01

    In the past decade, many guidance documents have been issued through the collaboration of global organizations and regulatory authorities. Most of these are applicable to new products, but there is a risk that currently marketed products will not meet the new compliance standards during audits and inspections as companies continue to make changes through the product life cycle for continuous improvement or market demands. This discussion presents different strategies for bringing drug product marketing applications into line with current and emerging standards. It also discusses stability and method designs to meet process validation and global development efforts.

  20. Land Surface Temperature Measurements from EOS MODIS Data

    NASA Technical Reports Server (NTRS)

    Wan, Zheng-Ming

    2004-01-01

    This report summarizes the accomplishments made by the MODIS LST (Land-Surface Temperature) group at the University of California, Santa Barbara, under NASA contract. Version 1 of the MODIS Land-Surface Temperature Algorithm Theoretical Basis Document (ATBD) was reviewed in June 1994, version 2 in November 1994, version 3.1 in August 1996, and version 3.3 updated in April 1999. Based on the ATBD, two LST algorithms were developed: one is the generalized split-window algorithm and the other is the physics-based day/night LST algorithm. These two LST algorithms were implemented into the product generation executive code (PGE16) for the daily standard MODIS LST products at level-2 (MOD11_L2) and level-3 (MOD11A1 at 1 km resolution and MOD11B1 at 5 km resolution). PGE codes for the 8-day 1 km LST product (MOD11A2) and the daily, 8-day, and monthly LST products on 0.05 degree latitude/longitude climate model grids (CMG) were also delivered. Four to six field campaigns were conducted each year since 2000 to validate the daily LST products generated by PGE16 and the calibration accuracies of the MODIS TIR bands used for the LST/emissivity retrieval from versions 2-4 of Terra MODIS data and versions 3-4 of Aqua MODIS data. Validation results from temperature-based and radiance-based methods indicate that the MODIS LST accuracy is better than 1 C in most clear-sky cases in the range from -10 to 58 C. One of the major lessons learned from multi-year temporal analysis of the consistent V4 daily Terra MODIS LST products in 2000-2003 over selected target areas, including lakes, snow/ice fields, and semi-arid sites, is that there are variable numbers of cloud-contaminated LSTs in the MODIS LST products, depending on surface elevation, land cover types, and atmospheric conditions. A cloud-screening scheme with constraints on spatial and temporal variations in LSTs was developed to remove cloud-contaminated LSTs. The 5 km LST product was indirectly validated through comparisons to the 1 km LST product. Twenty-three papers related to the LST research work were published in journals over the last decade.
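
    A much-simplified sketch of the temporal side of such a cloud screen: daily values that fall far below a local running median are flagged as likely cloud-contaminated, since undetected clouds bias retrieved LSTs cold. The window and threshold are illustrative assumptions, and the operational scheme also applies spatial constraints:

```python
from statistics import median

def screen_lst(series, window=5, max_drop=10.0):
    """Replace likely cloud-contaminated LSTs (deg C) with None.
    A value is flagged if it sits more than max_drop below the
    median of its temporal neighborhood."""
    keep = []
    for i, v in enumerate(series):
        lo = max(0, i - window // 2)
        nbhd = series[lo:lo + window]
        keep.append(v if v >= median(nbhd) - max_drop else None)
    return keep

daily_lst = [21.0, 22.5, 3.0, 23.0, 21.8, 22.1]  # 3.0 C: suspect cold outlier
cleaned = screen_lst(daily_lst)
```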

  1. Retrieval with Infrared Atmospheric Sounding Interferometer and Validation during JAIVEx

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, William L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    A state-of-the-art IR-only retrieval algorithm has been developed, using an all-season global EOF physical regression followed by a 1-D Var physical iterative retrieval, for IASI, AIRS, and NAST-I. The benefits of this retrieval are the production of atmospheric structure at single-FOV horizontal resolution (approx. 15 km for IASI and AIRS), accurate profiles above cloud (at least) or down to the surface, surface parameters, and/or cloud microphysical parameters. An initial case study and validation indicate that surface, cloud, and atmospheric structure (including the TBL) are well captured by IASI and AIRS measurements. Coincident dropsondes during the IASI and AIRS overpasses are used to validate atmospheric conditions, and accurate retrievals are obtained with the expected vertical resolution. JAIVEx has provided the data needed to validate the retrieval algorithm and its products, which allows us to assess the instrument's ability and performance. Retrievals with global coverage are under investigation for detailed retrieval assessment. It is greatly desired that these products be used for testing their impact on atmospheric data assimilation and/or numerical weather prediction.

  2. A new validated method for the simultaneous determination of benzocaine, propylparaben and benzyl alcohol in a bioadhesive gel by HPLC.

    PubMed

    Pérez-Lozano, P; García-Montoya, E; Orriols, A; Miñarro, M; Ticó, J R; Suñé-Negre, J M

    2005-10-04

    A new RP-HPLC method has been developed and validated for the simultaneous determination of benzocaine, two preservatives (propylparaben (nipasol) and benzyl alcohol), and degradation products of benzocaine in a semisolid pharmaceutical dosage form (benzocaine gel). The method uses a Nucleosil 120 C18 column and gradient elution. The mobile phase consisted of a mixture of methanol and glacial acetic acid (10%, v/v) in proportions varied according to a time-schedule programme, pumped at a flow rate of 2.0 ml min(-1). The DAD detector was set at 258 nm. The validation study was carried out following the ICH guidelines to demonstrate that the new analytical method meets the required reliability characteristics and maintains, over time, the fundamental validation criteria: selectivity, linearity, precision, accuracy, and sensitivity. The method was applied during quality control of benzocaine gel to quantify the drug (benzocaine), preservatives, and degradation products, and proved to be a rapid and reliable method for routine quality control.

  3. Estimating Computer-Based Training Development Times

    DTIC Science & Technology

    1987-10-14

    beginners, must be sure they interpret terms correctly. As a result of this informal validation, the authors suggest refinements in the tool which...Productivity tools available: automated design tools, text processor interfaces, flowcharting software, software interfaces, multimedia interfaces

  4. Free Trade, A New National Security Policy for the 21st Century

    DTIC Science & Technology

    1990-03-30

    view had some validity prior to the industrial revolution as countries were basically self-sufficient. However, with the growth and spread of the...eliminated complete self-sufficiency. 3 As the Industrial Revolution expanded, communities and then regions within nations became interdependent and...prosperous national economies emerged. A significant by-product of the industrial revolution was the development and massive production of weapons that

  5. Discrete-event system simulation on small and medium enterprises productivity improvement

    NASA Astrophysics Data System (ADS)

    Sulistio, J.; Hidayah, N. A.

    2017-12-01

    Small and medium industries in Indonesia are currently developing. The problem faced by SMEs is the difficulty of meeting the growing demand coming into the company. Therefore, SMEs need analysis and evaluation of the production process in order to meet all orders. The purpose of this research is to increase the productivity of the SME production floor by applying discrete-event system simulation. This method is preferred because it can solve complex problems arising from the dynamic and stochastic nature of the system. To increase the credibility of the simulation, the model was validated by comparing the averages of two trials, the variances of two trials, and a chi-square test. Afterwards, the Bonferroni method was applied to develop several alternatives. The article concludes that the productivity of the SME production floor can be increased by up to 50% by adding capacity to the dyeing and drying machines.
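
    A crude sketch of the kind of output-comparison validation described above. The study used formal two-trial mean and variance comparisons plus a chi-square test; the tolerances and throughput figures below are illustrative assumptions:

```python
from statistics import mean, variance

def validate_model(real, simulated, mean_tol=0.10, var_tol=0.50):
    """Face-validity check: simulated output should reproduce the real
    system's mean (within mean_tol, relative) and variance (within
    var_tol, relative). Assumes real data has nonzero mean and variance."""
    mean_ok = abs(mean(simulated) - mean(real)) / mean(real) <= mean_tol
    var_ok = abs(variance(simulated) - variance(real)) / variance(real) <= var_tol
    return mean_ok and var_ok

# Hypothetical daily throughput (units) at the dyeing/drying stage
observed  = [40, 42, 39, 41, 43, 40]
simulated = [41, 40, 42, 39, 42, 41]
ok = validate_model(observed, simulated)
```

    Only once such checks pass is it credible to compare alternatives (e.g., added machine capacity) against the baseline model.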

  6. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia

    PubMed Central

    Bouchoucha, Mongia; Akrout, Mouna; Bellali, Hédia; Bouchoucha, Rim; Tarhouni, Fadwa; Mansour, Abderraouf Ben; Zouari, Béchir

    2016-01-01

    Background Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. Methods A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland–Altman limits of agreement. In total, 31 male and female volunteers aged 9–89 participated in the study. Results We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. Conclusion The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys. PMID:27585631
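
    The Bland-Altman limits of agreement used for validation here can be sketched as follows; the portion weights are hypothetical:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement between two
    measurement methods (e.g. photo-assisted 24-hour recall vs.
    food weighing). Limits are bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)
    return bias, bias - spread, bias + spread

# Hypothetical portion estimates (g) for one food item
recall  = [105, 98, 120, 87, 110, 95]
weighed = [100, 95, 125, 90, 105, 98]
bias, lo, hi = bland_altman(recall, weighed)
```

    Agreement is judged by whether the bias is near zero and the limits are narrow enough to be clinically or nutritionally acceptable, not by the correlation coefficient alone.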

  7. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia.

    PubMed

    Bouchoucha, Mongia; Akrout, Mouna; Bellali, Hédia; Bouchoucha, Rim; Tarhouni, Fadwa; Mansour, Abderraouf Ben; Zouari, Béchir

    2016-01-01

    Background Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. Methods A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland-Altman limits of agreement. In total, 31 male and female volunteers aged 9-89 participated in the study. Results We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. Conclusion The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.

  8. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia.

    PubMed

    Bouchoucha, Mongia; Akrout, Mouna; Bellali, Hédia; Bouchoucha, Rim; Tarhouni, Fadwa; Mansour, Abderraouf Ben; Zouari, Béchir

    2016-01-01

    Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland-Altman limits of agreement. In total, 31 male and female volunteers aged 9-89 participated in the study. We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.

  9. A toolbox of immunoprecipitation-grade monoclonal antibodies to human transcription factors.

    PubMed

    Venkataraman, Anand; Yang, Kun; Irizarry, Jose; Mackiewicz, Mark; Mita, Paolo; Kuang, Zheng; Xue, Lin; Ghosh, Devlina; Liu, Shuang; Ramos, Pedro; Hu, Shaohui; Bayron Kain, Diane; Keegan, Sarah; Saul, Richard; Colantonio, Simona; Zhang, Hongyan; Behn, Florencia Pauli; Song, Guang; Albino, Edisa; Asencio, Lillyann; Ramos, Leonardo; Lugo, Luvir; Morell, Gloriner; Rivera, Javier; Ruiz, Kimberly; Almodovar, Ruth; Nazario, Luis; Murphy, Keven; Vargas, Ivan; Rivera-Pacheco, Zully Ann; Rosa, Christian; Vargas, Moises; McDade, Jessica; Clark, Brian S; Yoo, Sooyeon; Khambadkone, Seva G; de Melo, Jimmy; Stevanovic, Milanka; Jiang, Lizhi; Li, Yana; Yap, Wendy Y; Jones, Brittany; Tandon, Atul; Campbell, Elliot; Montelione, Gaetano T; Anderson, Stephen; Myers, Richard M; Boeke, Jef D; Fenyö, David; Whiteley, Gordon; Bader, Joel S; Pino, Ignacio; Eichinger, Daniel J; Zhu, Heng; Blackshaw, Seth

    2018-03-19

    A key component of efforts to address the reproducibility crisis in biomedical research is the development of rigorously validated and renewable protein-affinity reagents. As part of the US National Institutes of Health (NIH) Protein Capture Reagents Program (PCRP), we have generated a collection of 1,406 highly validated immunoprecipitation- and/or immunoblotting-grade mouse monoclonal antibodies (mAbs) to 737 human transcription factors, using an integrated production and validation pipeline. We used HuProt human protein microarrays as a primary validation tool to identify mAbs with high specificity for their cognate targets. We further validated PCRP mAbs by means of multiple experimental applications, including immunoprecipitation, immunoblotting, chromatin immunoprecipitation followed by sequencing (ChIP-seq), and immunohistochemistry. We also conducted a meta-analysis that identified critical variables that contribute to the generation of high-quality mAbs. All validation data, protocols, and links to PCRP mAb suppliers are available at http://proteincapture.org.

  10. Development and application of a predictive model of Aspergillus candidus growth as a tool to improve shelf life of bakery products.

    PubMed

    Huchet, V; Pavan, S; Lochardet, A; Divanac'h, M L; Postollec, F; Thuault, D

    2013-12-01

    Molds are responsible for spoilage of bakery products during storage. A modeling approach to predict the effect of water activity (aw) and temperature on the appearance time of Aspergillus candidus was developed and validated on cakes. The gamma concept of Zwietering was adapted to model fungal growth, taking into account the impact of temperature and aw. We hypothesized that the same model could be used to calculate the time for mycelium to become visible (tv), by substituting the matrix parameter by tv. Cardinal values of A. candidus were determined on potato dextrose agar, and predicted tv were further validated by challenge-tests run on 51 pastries. Taking into account the aw dynamics recorded in pastries during reasonable conditions of storage, high correlation was shown between predicted and observed tv when the aw at equilibrium (after 14 days of storage) was used for modeling (Af = 1.072, Bf = 0.979). Validation studies on industrial cakes confirmed the experimental results and demonstrated the suitability of the model to predict tv in food as a function of aw and temperature. Copyright © 2013 Elsevier Ltd. All rights reserved.
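
    The gamma concept mentioned above multiplies an optimal growth rate by dimensionless temperature and water-activity terms. A sketch using the common cardinal temperature model (Rosso) and a simplified water-activity term; the cardinal values below are illustrative, not the study's measured parameters for A. candidus:

```python
def gamma_temp(t, t_min, t_opt, t_max):
    """Cardinal temperature model (Rosso): fraction of optimal growth rate."""
    if not t_min < t < t_max:
        return 0.0
    num = (t - t_max) * (t - t_min) ** 2
    den = (t_opt - t_min) * ((t_opt - t_min) * (t - t_opt)
                             - (t_opt - t_max) * (t_opt + t_min - 2 * t))
    return num / den

def gamma_aw(aw, aw_min):
    """Simplified water-activity term of the gamma concept."""
    return max(0.0, (aw - aw_min) / (1 - aw_min))

# Hypothetical cardinal values; growth rate mu in 1/h
mu = 0.05 * gamma_temp(25, 5, 30, 42) * gamma_aw(0.90, 0.75)
```

    Substituting the matrix-dependent parameter with the time-to-visible-mycelium tv, as the authors did, lets the same multiplicative structure predict tv as a function of temperature and aw.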

  11. LACO-Wiki: A land cover validation tool and a new, innovative teaching resource for remote sensing and the geosciences

    NASA Astrophysics Data System (ADS)

    See, Linda; Perger, Christoph; Dresel, Christopher; Hofer, Martin; Weichselbaum, Juergen; Mondel, Thomas; Steffen, Fritz

    2016-04-01

    The validation of land cover products is an important step in the workflow of generating a land cover map from remotely-sensed imagery. Many students of remote sensing will be given exercises on classifying a land cover map followed by the validation process. Many algorithms exist for classification, embedded within proprietary image processing software or increasingly as open source tools. However, there is little standardization for land cover validation, nor a set of open tools available for implementing this process. The LACO-Wiki tool was developed as a way of filling this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for validation sample interpretation; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users including producers of land cover maps, researchers, teachers and students. The use of such a tool could be embedded within the curriculum of remote sensing courses at a university level but is simple enough for use by students aged 13-18. A beta version of the tool is available for testing at: http://www.laco-wiki.net.
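
    The accuracy report such a validation workflow generates can be sketched from a confusion matrix of mapped vs. reference classes; the class counts below are hypothetical:

```python
def accuracy_report(matrix):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = mapped class, columns = reference class)."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    p_o = sum(matrix[i][i] for i in range(k)) / n              # observed agreement
    p_e = sum(sum(matrix[i]) * sum(r[i] for r in matrix)       # chance agreement
              for i in range(k)) / n ** 2
    return p_o, (p_o - p_e) / (1 - p_e)

# Hypothetical 3-class validation sample (forest, cropland, urban)
cm = [[50, 3, 2],
      [4, 40, 6],
      [1, 5, 39]]
overall, kappa = accuracy_report(cm)
```

    A sound sampling design, as emphasized above, is what makes these numbers unbiased estimates of map accuracy rather than artifacts of where the samples happened to fall.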

  12. Allergen source materials: state-of-the-art.

    PubMed

    Esch, Robert E

    2009-01-01

    A variety of positive outcomes can be realized from validation and risk management activities (see Table 4). They depend on the participation of multiple functional groups, including the quality unit, regulatory and legal affairs, engineering and production operations, research and development, and sales and marketing. Quality risk management is receiving increased attention in the areas of public health, pharmacovigilance, and pharmaceutical manufacturing. Recent examples of its regulatory use in our industry include the assessment of the potential risks of transmissible spongiform encephalopathy (TSE) agents transmitted through contaminated products, the risks of precipitates in allergenic extracts, and the revision of the potency limits for standardized dust mite and grass allergen vaccines. Its application to allergen source material process validation activities allowed for a practical strategy, especially in a complex manufacturing environment involving hundreds of products with multiple intended uses. In addition, the use of tools such as FMEA was useful in evaluating proposed changes to manufacturing procedures and product specifications, new regulatory actions, and customer feedback or complaints. The success of such quality assurance programs will ultimately be reflected in the elimination or reduction of product failures, improvement in the detection and prediction of potential product failures, and increased confidence in product quality.

  13. Consumer product in vitro digestion model: Bioaccessibility of contaminants and its application in risk assessment.

    PubMed

    Brandon, Esther F A; Oomen, Agnes G; Rompelberg, Cathy J M; Versantvoort, Carolien H M; van Engelen, Jacqueline G M; Sips, Adrienne J A M

    2006-03-01

    This paper describes the applicability of in vitro digestion models as a tool for consumer products in (ad hoc) risk assessment. In current risk assessment, oral bioavailability from a specific product is considered to be equal to the bioavailability found in toxicity studies, in which contaminants are usually ingested via liquids or food matrices. To become bioavailable, contaminants must first be released from the product during the digestion process (i.e., become bioaccessible). Contaminants in consumer products may be less bioaccessible than contaminants in liquid or food. Therefore, the actual risk after oral exposure could be overestimated. This paper describes the applicability of a simple, reliable, fast, and relatively inexpensive in vitro method for determining the bioaccessibility of a contaminant from a consumer product. Different models, representing sucking and/or swallowing, were developed. The experimental design of each model can be adjusted to the appropriate exposure scenarios as determined by the risk assessor. Several contaminated consumer products were tested in the various models. Although relevant in vivo data are scarce, we were able to preliminarily validate the model for one case. This case showed good correlation and never underestimated bioavailability. However, validation needs to be continued.

  14. Effective virus inactivation and removal by steps of Biotest Pharmaceuticals IGIV production process

    PubMed Central

    Dichtelmüller, Herbert O.; Flechsig, Eckhard; Sananes, Frank; Kretschmar, Michael; Dougherty, Christopher J.

    2012-01-01

    The virus validation of three steps of the Biotest Pharmaceuticals IGIV production process is described here. The steps validated are precipitation and removal of fraction III of the cold ethanol fractionation process, solvent/detergent treatment, and 35 nm virus filtration. Virus validation was performed under combined worst-case conditions. These validated steps achieve sufficient virus inactivation/removal, resulting in a virus-safe product. PMID:24371563

  15. The Transition from Spacecraft Development to Flight Operation: Human Factor Considerations

    NASA Technical Reports Server (NTRS)

    Basilio, Ralph R.

    2000-01-01

    In the field of aeronautics and astronautics, a paradigm shift has been witnessed by those in academia, research and development, and private industry. Long development life cycles and the budgets to support such programs and projects have given way to aggressive task schedules and leaner resources to draw from, all the while challenging assigned individuals to create and produce improved products and processes. However, this "faster, better, cheaper" concept cannot merely be applied to the design, development, and test of complex systems such as earth-orbiting or interplanetary robotic spacecraft. Full advantage is not possible without due consideration and application to mission operations planning and flight operations. Equally as important as the flight system, the mission operations system, consisting of qualified personnel, ground hardware and software tools, and verified and validated operational processes, should also be regarded as a complex system requiring personnel to draw upon formal education, training, related experiences, and heuristic reasoning in engineering an effective and efficient system. Unquestionably, qualified personnel are the most important element of a mission operations system. This paper examines the experiences of the Deep Space 1 Project, the first in a series of new-technology in-flight validation missions sponsored by the United States National Aeronautics and Space Administration (NASA), specifically in developing a subsystems analysis and technology validation team comprised of former spacecraft development personnel. Human factor considerations are investigated from initial concept/vision formulation; through operational process development; personnel test and training; to initial uplink product development and test support. Emphasis has been placed on challenges and applied or recommended solutions, so as to provide opportunities for future programs and projects to address and disposition potential issues and concerns as early as possible and to reap the benefits of learning from others' past experiences.

  16. Bigfoot Field Manual

    NASA Astrophysics Data System (ADS)

    Campbell, J. L.; Burrows, S.; Gower, S. T.; Cohen, W. B.

    1999-09-01

    The BigFoot Project is funded by the Earth Science Enterprise to collect and organize data to be used in the EOS Validation Program. The data collected by the BigFoot Project are unique in being ground-based observations coincident with satellite overpasses. In addition to collecting data, the BigFoot project will develop and test new algorithms for scaling point measurements to the same spatial scales as the EOS satellite products. This BigFoot Field Manual will be used to achieve completeness and consistency of data collected at four initial BigFoot sites and at future sites that may collect similar validation data. Therefore, validation datasets submitted to the ORNL DAAC that have been compiled in a manner consistent with the field manual will be especially valuable in the validation program.

  17. Mathematical modeling of enzyme production using Trichoderma harzianum P49P11 and sugarcane bagasse as carbon source.

    PubMed

    Gelain, Lucas; da Cruz Pradella, José Geraldo; da Costa, Aline Carvalho

    2015-12-01

    A mathematical model to describe the kinetics of enzyme production by the filamentous fungus Trichoderma harzianum P49P11 was developed using a low-cost substrate as the main carbon source (pretreated sugarcane bagasse). The model describes cell growth, variation of substrate concentration, and production of three kinds of enzymes (cellulases, beta-glucosidase and xylanase) at different sugarcane bagasse concentrations (5, 10, 20, 30, and 40 g L(-1)). The 10 g L(-1) assay was used to validate the model and the others for parameter estimation. The model for enzyme production has terms implicitly representing induction and repression. Substrate variation was represented by a simple degradation rate. The models seem to represent the kinetics well, with a good fit for the majority of the assays. Validation results indicate that the models are adequate to represent the kinetics of a biotechnological process. Copyright © 2015 Elsevier Ltd. All rights reserved.
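    The workflow this abstract describes (estimate kinetic parameters from most assays, then check the model against a held-out condition) can be sketched with a generic growth-plus-production model. This is an illustrative stand-in, not the authors' actual T. harzianum model: the logistic/Luedeking-Piret structure, the parameter names, and the synthetic "data" below are all assumptions for demonstration.

    ```python
    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import curve_fit

    def model(y, t, mu_max, Xmax, Ys, alpha):
        # X: biomass, S: substrate, P: enzyme activity (all hypothetical units)
        X, S, P = y
        growth = mu_max * X * (1.0 - X / Xmax)  # logistic cell growth
        dX = growth
        dS = -growth / Ys                       # substrate consumed for growth
        dP = alpha * growth                     # growth-associated production
        return [dX, dS, dP]

    def simulate(t, mu_max, Xmax, Ys, alpha, y0=(0.1, 10.0, 0.0)):
        return odeint(model, y0, t, args=(mu_max, Xmax, Ys, alpha))

    # Synthetic "observations" generated from known parameters, then refit:
    t = np.linspace(0, 96, 25)                  # hours
    true_params = (0.12, 4.0, 0.45, 2.0)
    obs = simulate(t, *true_params)

    def fit_target(t, mu_max, Xmax, Ys, alpha):
        # curve_fit expects a flat vector, so all three state trajectories
        # are concatenated and fitted simultaneously
        return simulate(t, mu_max, Xmax, Ys, alpha).ravel()

    popt, _ = curve_fit(fit_target, t, obs.ravel(), p0=(0.1, 3.0, 0.5, 1.5))
    ```

    In the paper's setup, the 10 g/L assay would play the role of held-out validation data; here, with noiseless synthetic data, the fit is simply checked against the generating parameters.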

  18. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses--Criticality (keff) Predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scaglione, John M; Mueller, Don; Wagner, John C

    2011-01-01

    One of the most significant remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of depletion and criticality calculations used in the safety evaluation - in particular, the availability and use of applicable measured data to support validation, especially for fission products. Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of clear technical basis or approach for use of the data. U.S. Nuclear Regulatory Commission (NRC) staff have noted that the rationale for restricting their Interim Staff Guidance on burnup credit (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issue of validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach (both depletion and criticality) for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the criticality (k_eff) validation approach, and resulting observations and recommendations. Validation of the isotopic composition (depletion) calculations is addressed in a companion paper at this conference.
For criticality validation, the approach is to utilize (1) available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion (HTC) program to support validation of the principal actinides and (2) calculated sensitivities, nuclear data uncertainties, and the limited available fission product LCE data to predict and verify individual biases for relevant minor actinides and fission products. This paper (1) provides a detailed description of the approach and its technical bases, (2) describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, (3) provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data, and (4) provides recommendations for application of the results and methods to other code and data packages.

  19. Production of Reliable Flight Crucial Software: Validation Methods Research for Fault Tolerant Avionics and Control Systems Sub-Working Group Meeting

    NASA Technical Reports Server (NTRS)

    Dunham, J. R. (Editor); Knight, J. C. (Editor)

    1982-01-01

    The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.

  20. Mars Observer data production, transfer, and archival: The data production assembly line

    NASA Technical Reports Server (NTRS)

    Childs, David B.

    1993-01-01

    This paper describes the data production, transfer, and archival process designed for the Mars Observer Flight Project. It addresses the developmental and operational aspects of the archive collection production process. The developmental aspects cover the design and packaging of data products for archival and distribution to the planetary community. Also discussed is the design and development of a data transfer and volume production process capable of handling the large throughput and complexity of the Mars Observer data products. The operational aspects cover the main functions of the process: creating data and engineering products, collecting the data products and ancillary products in a central repository, producing archive volumes, validating volumes, archiving, and distributing the data to the planetary community.

  1. [Status of research and development for control of tropical diseases: hypocrisy, indifference or lack of coordination].

    PubMed

    Millet, P

    2006-12-01

    Tropical diseases neglected by the pharmaceutical industry usually involve developing countries. Neglected diseases can now be divided into two groups. The first includes the "big three" infections, i.e., malaria, HIV/AIDS, and tuberculosis, which have strategic and political overtones. The second group includes a host of other fatal infections including worms, trypanosomiasis, and leishmaniasis. Fundamental research on neglected diseases has been highly productive, but there has been little success in transferring research findings to a pharmaceutical industry unwilling to take the risks associated with developing new drugs on its own. However, several public-private initiatives have revived hopes of developing new products with growing involvement of industries in developing countries (India and Brazil), despite the high risks associated with fluctuating demand for medicines or funding shortages. To meet the need for testing new drugs, more clinical facilities and better patient recruitment will be needed in endemic countries. Although these new efforts to control neglected diseases are encouraging, there is now a need for coordination. Clinical research in developing countries must be organized in compliance with international principles of ethics. Testing must be aimed at validating fundamental data from industrialized countries. Appropriate incentives must be given to ensure that pharmaceutical companies use research findings for new product development. In this context, the time seems ripe for the establishment of an independent laboratory for technological innovation in neglected diseases. Such a facility could not only validate scientific data but also supervise the development of clinical applications from research data.

  2. Separating the Wheat from the Chaff: The Role of Vocational Education in Economic Development.

    ERIC Educational Resources Information Center

    Grubb, W. Norton; Stern, David

    This paper states that, although education has been linked historically to economic development, there is no clear evidence that this link is valid. It investigates under what conditions educational programs are likely to be effective and which are likely to shift resources without any net effects on employment, wage levels, productivity, or…

  3. Grand canonical validation of the bipartite international trade network.

    PubMed

    Straka, Mika J; Caldarelli, Guido; Saracco, Fabio

    2017-08-01

    Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.

  4. Grand canonical validation of the bipartite international trade network

    NASA Astrophysics Data System (ADS)

    Straka, Mika J.; Caldarelli, Guido; Saracco, Fabio

    2017-08-01

    Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.

  5. A new dataset validation system for the Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Zender, J.; Heather, D.; Martinez, S.

    2007-08-01

    The Planetary Science Archive is the official archive for the Mars Express mission. It received its first data at the end of 2004. These data are delivered by the PI teams to the PSA team as datasets, which are formatted to conform to the Planetary Data System (PDS) standard. The PI teams are responsible for analyzing and calibrating the instrument data as well as for the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for the long-term archiving and distribution of the data to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of the archive's content has been missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data should be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall make it possible to track anomalies in datasets and to control their completeness. It shall ensure that the PSA end-users: (1) can rely on the result of their queries, (2) will get data products that are suitable for scientific analysis, (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks the dataset content against pre-defined top-level criteria, which represent the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results, as well as those interfacing with the PSA database.
The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.

  6. Extensive validation of CM SAF surface radiation products over Europe.

    PubMed

    Urraca, Ruben; Gracia-Amillo, Ana M; Koubli, Elena; Huld, Thomas; Trentmann, Jörg; Riihelä, Aku; Lindfors, Anders V; Palmer, Diane; Gottschalg, Ralph; Antonanzas-Torres, Fernando

    2017-09-15

    This work presents a validation of three satellite-based radiation products over an extensive network of 313 pyranometers across Europe, from 2005 to 2015. The products used have been developed by the Satellite Application Facility on Climate Monitoring (CM SAF) and are one geostationary climate dataset (SARAH-JRC), one polar-orbiting climate dataset (CLARA-A2) and one geostationary operational product. Further, the ERA-Interim reanalysis is also included in the comparison. The main objective is to determine the quality level of the daily means of CM SAF datasets, identifying their limitations, as well as analyzing the different factors that can interfere with an adequate validation of the products. The quality of the pyranometer was the most critical source of uncertainty identified. In this respect, the use of records from Second Class pyranometers and silicon-based photodiodes increased the absolute error and the bias, as well as the dispersion of both metrics, preventing an adequate validation of the daily means. The best spatial estimates for the three datasets were obtained in Central Europe with a Mean Absolute Deviation (MAD) within 8-13 W/m2, whereas the MAD always increased at high latitudes, snow-covered surfaces, high mountain ranges and coastal areas. Overall, SARAH-JRC's accuracy was demonstrated over a dense network of stations, making it the most consistent dataset for climate monitoring applications. The operational dataset was comparable to SARAH-JRC in Central Europe but lacked the temporal stability of climate datasets, while CLARA-A2 did not achieve the same level of accuracy even though its predictions showed high uniformity with a small negative bias. The ERA-Interim reanalysis shows by far the largest deviations from the surface reference measurements.
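    The comparison metrics named in this abstract — bias and Mean Absolute Deviation (MAD) between satellite estimates and ground measurements — reduce to simple averages over daily-mean pairs. A minimal sketch, using made-up irradiance values rather than CM SAF or pyranometer data:

    ```python
    import numpy as np

    def mean_bias_deviation(sat, ground):
        """Average signed difference: positive means the satellite overestimates."""
        return np.mean(np.asarray(sat) - np.asarray(ground))

    def mean_absolute_deviation(sat, ground):
        """Average magnitude of the daily differences."""
        return np.mean(np.abs(np.asarray(sat) - np.asarray(ground)))

    # Hypothetical daily-mean surface irradiance pairs (W/m2)
    ground = np.array([150.0, 180.0, 210.0, 120.0])
    sat = np.array([158.0, 172.0, 221.0, 115.0])

    print(mean_bias_deviation(sat, ground))      # 1.5  (small positive bias)
    print(mean_absolute_deviation(sat, ground))  # 8.0  (typical daily error)
    ```

    A small bias with a larger MAD, as in this toy example, is the pattern the abstract describes for CLARA-A2: errors of both signs largely cancel in the bias but not in the absolute deviation.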

  7. A Radiation Shielding Code for Spacecraft and Its Validation

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.

    2000-01-01

    The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The content of incident radiations is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code against recent Monte Carlo benchmarks and laboratory and flight measurements is also included.

  8. The Importance of Measurement Errors for Deriving Accurate Reference Leaf Area Index Maps for Validation of Moderate-Resolution Satellite LAI Products

    NASA Technical Reports Server (NTRS)

    Huang, Dong; Yang, Wenze; Tan, Bin; Rautiainen, Miina; Zhang, Ping; Hu, Jiannan; Shabanov, Nikolay V.; Linder, Sune; Knyazikhin, Yuri; Myneni, Ranga B.

    2006-01-01

    The validation of moderate-resolution satellite leaf area index (LAI) products such as those operationally generated from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor data requires reference LAI maps developed from field LAI measurements and fine-resolution satellite data. Errors in field measurements and satellite data determine the accuracy of the reference LAI maps. This paper describes a method by which reference maps of known accuracy can be generated with knowledge of errors in fine-resolution satellite data. The method is demonstrated with data from an international field campaign in a boreal coniferous forest in northern Sweden, and Enhanced Thematic Mapper Plus images. The reference LAI map thus generated is used to assess modifications to the MODIS LAI/fPAR algorithm recently implemented to derive the next generation of the MODIS LAI/fPAR product for this important biome type.

  9. 76 FR 4360 - Guidance for Industry on Process Validation: General Principles and Practices; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... elements of process validation for the manufacture of human and animal drug and biological products... process validation for the manufacture of human and animal drug and biological products, including APIs. This guidance describes process validation activities in three stages: In Stage 1, Process Design, the...

  10. When is good, good enough? Methodological pragmatism for sustainable guideline development.

    PubMed

    Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C

    2015-03-06

    Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.

  11. FIRE Science Results 1989

    NASA Technical Reports Server (NTRS)

    Mcdougal, David S. (Editor)

    1990-01-01

    FIRE (First ISCCP Regional Experiment) is a U.S. cloud-radiation research program formed in 1984 to increase the basic understanding of cirrus and marine stratocumulus cloud systems, to develop realistic parameterizations for these systems, and to validate and improve ISCCP cloud product retrievals. Presentations of results culminating the first 5 years of FIRE research activities were highlighted. The 1986 Cirrus Intensive Field Observations (IFO), the 1987 Marine Stratocumulus IFO, the Extended Time Observations (ETO), and modeling activities are described. Collaborative efforts involving the comparison of multiple data sets, incorporation of data measurements into modeling activities, validation of ISCCP cloud parameters, and development of parameterization schemes for General Circulation Models (GCMs) are described.

  12. A Comprehensive Plan for the Long-Term Calibration and Validation of Oceanic Biogeochemical Satellite Data

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B.; McClain, Charles R.; Mannino, Antonio

    2007-01-01

    The primary objective of this planning document is to establish a long-term capability for calibrating and validating oceanic biogeochemical satellite data. It is a pragmatic solution to a practical problem based primarily on the lessons learned from prior satellite missions. All of the plan's elements are seen to be interdependent, so a horizontal organizational scheme is anticipated wherein the overall leadership comes from the NASA Ocean Biology and Biogeochemistry (OBB) Program Manager and the entire enterprise is split into two components of equal stature: calibration and validation plus satellite data processing. The detailed elements of the activity are based on the basic tasks of the two main components plus the current objectives of the Carbon Cycle and Ecosystems Roadmap. The former is distinguished by an internal core set of responsibilities and the latter is facilitated through an external connecting-core ring of competed or contracted activities. The core elements for the calibration and validation component include a) publish protocols and performance metrics; b) verify uncertainty budgets; c) manage the development and evaluation of instrumentation; and d) coordinate international partnerships. The core elements for the satellite data processing component are e) process and reprocess multisensor data; f) acquire, distribute, and archive data products; and g) implement new data products. Both components have shared responsibilities for initializing and temporally monitoring satellite calibration. Connecting-core elements include (but are not restricted to) atmospheric correction and characterization, standards and traceability, instrument and analysis round robins, field campaigns and vicarious calibration sites, in situ databases, bio-optical algorithm (and product) validation, satellite characterization and vicarious calibration, and image processing software.
The plan also includes an accountability process, creating a Calibration and Validation Team (to help manage the activity), and a discussion of issues associated with the plan's scientific focus.

  13. A meta-heuristic approach supported by NSGA-II for the design and plan of supply chain networks considering new product development

    NASA Astrophysics Data System (ADS)

    Alizadeh Afrouzy, Zahra; Paydar, Mohammad Mahdi; Nasseri, Seyed Hadi; Mahdavi, Iraj

    2018-03-01

    There are many reasons for the growing interest in developing new product projects for any firm. The most prominent is survival in a highly competitive industry in which customer tastes change rapidly. A well-managed supply chain network that accounts for new product development can provide the most profit for a firm. Along with profit, customer satisfaction and production of new products are goals which lead to a more efficient supply chain. As new products appear in the market, old products may become obsolete and then be phased out. The most important parameter in a supply chain that considers new and developed products is the time at which developed and new products are introduced and old products are phased out. With consideration of the factors noted above, this study proposes a tri-objective, multi-echelon, multi-product, multi-period supply chain model, which incorporates product development and new product production and their effects on supply chain configuration. The supply chain under consideration is assumed to consist of suppliers, manufacturers, distributors and customer groups. To overcome the NP-hardness of the proposed model and solve the complicated problem, a non-dominated sorting genetic algorithm is employed. As there is no benchmark available in the literature, a non-dominated ranking genetic algorithm is developed to validate the results obtained, and some test problems are provided to show the applicability of the proposed methodology and evaluate the performance of the algorithms.

  14. Development and validation of a questionnaire to evaluate patient satisfaction with diabetes disease management.

    PubMed

    Paddock, L E; Veloski, J; Chatterton, M L; Gevirtz, F O; Nash, D B

    2000-07-01

    To develop a reliable and valid questionnaire to measure patient satisfaction with diabetes disease management programs. Questions related to structure, process, and outcomes were categorized into 14 domains defining the essential elements of diabetes disease management. Health professionals confirmed the content validity. Face validity was established by a patient focus group. The questionnaire was mailed to 711 patients with diabetes who participated in a disease management program. To reduce the number of questionnaire items, a principal components analysis was performed using a varimax rotation. The Scree test was used to select significant components. To further assess reliability and validity, Cronbach's alpha and product-moment correlations were calculated for components having three or more items with loadings >0.50. The validated 73-item mailed satisfaction survey had a 34.1% response rate. Principal components analysis yielded 13 components with eigenvalues >1.0. The Scree test proposed a 6-component solution (39 items), which explained 59% of the total variation. Internal consistency reliabilities computed for the first 6 components (alpha = 0.79-0.95) were acceptable. The final questionnaire, the Diabetes Management Evaluation Tool (DMET), was designed to assess patient satisfaction with diabetes disease management programs. Although more extensive testing of the questionnaire is appropriate, preliminary reliability and validity of the DMET have been demonstrated.
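    Two of the psychometric steps named in this abstract — retaining components whose eigenvalues exceed 1.0 (the Kaiser criterion) and computing Cronbach's alpha for an item group — can be sketched as follows. The 200-respondent, 6-item response matrix is synthetic: this is not the DMET data, and the authors' full analysis (varimax rotation, Scree test) is omitted.

    ```python
    import numpy as np

    # Synthetic responses driven by one latent satisfaction factor plus noise
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    items = latent @ np.ones((1, 6)) + 0.5 * rng.normal(size=(200, 6))

    # Eigenvalues of the item correlation matrix, largest first;
    # the Kaiser criterion keeps components with eigenvalue > 1.0
    eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
    n_keep = int(np.sum(eigvals > 1.0))

    def cronbach_alpha(X):
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        X = np.asarray(X, dtype=float)
        k = X.shape[1]
        item_var = X.var(axis=0, ddof=1).sum()
        total_var = X.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    alpha = cronbach_alpha(items)
    ```

    Because the synthetic items all load on a single factor, the eigenvalue criterion retains one component and alpha lands in the high range the abstract reports (0.79-0.95) for acceptable internal consistency.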

  15. Remote Sensing of Ocean Color in the Arctic: Algorithm Development and Comparative Validation. Chapter 9

    NASA Technical Reports Server (NTRS)

    Cota, Glenn F.

    2001-01-01

    The overall goal of this effort is to acquire a large bio-optical database, encompassing most environmental variability in the Arctic, to develop algorithms for phytoplankton biomass and production and other optically active constituents. A large suite of bio-optical and biogeochemical observations has been collected in a variety of high-latitude ecosystems at different seasons. The Ocean Research Consortium of the Arctic (ORCA) is a collaborative effort between G.F. Cota of Old Dominion University (ODU), W.G. Harrison and T. Platt of the Bedford Institute of Oceanography (BIO), S. Sathyendranath of Dalhousie University and S. Saitoh of Hokkaido University. ORCA has now conducted 12 cruises and collected over 500 in-water optical profiles plus a variety of ancillary data. Observational suites typically include apparent optical properties (AOPs), inherent optical properties (IOPs), and a variety of ancillary observations including sun photometry, biogeochemical profiles, and productivity measurements. All quality-assured data have been submitted to NASA's SeaWiFS Bio-Optical Archive and Storage System (SeaBASS) data archive. Our algorithm development efforts address most of the potential bio-optical data products for the Sea-Viewing Wide Field-of-view Sensor (SeaWiFS), Moderate Resolution Imaging Spectroradiometer (MODIS), and GLI, and provide validation for specific areas of concern, i.e., high latitudes and coastal waters.

  16. The Well-Being 5: Development and Validation of a Diagnostic Instrument to Improve Population Well-being

    PubMed Central

    Sears, Lindsay E.; Agrawal, Sangeeta; Sidney, James A.; Castle, Patricia H.; Coberley, Carter R.; Witters, Dan; Pope, James E.; Harter, James K.

    2014-01-01

    Building upon extensive research from 2 validated well-being instruments, the objective of this research was to develop and validate a comprehensive and actionable well-being instrument that informs and facilitates improvement of well-being for individuals, communities, and nations. The goals of the measure were comprehensiveness, validity and reliability, significant relationships with health and performance outcomes, and diagnostic capability for intervention. For measure development and validation, questions from the Well-being Assessment and Wellbeing Finder were simultaneously administered as a test item pool to over 13,000 individuals across 3 independent samples. Exploratory factor analysis was conducted on a random selection from the first sample and confirmed in the other samples. Further evidence of validity was established through correlations to the established well-being scores from the Well-Being Assessment and Wellbeing Finder, and individual outcomes capturing health care utilization and productivity. Results showed the Well-Being 5 score comprehensively captures the known constructs within well-being, demonstrates good reliability and validity, significantly relates to health and performance outcomes, is diagnostic and informative for intervention, and can track and compare well-being over time and across groups. With this tool, well-being deficiencies within a population can be effectively identified, prioritized, and addressed, yielding the potential for substantial improvements to the health status, performance, and quality of life for individuals and cost savings for stakeholders. (Population Health Management 2014;17:357–365) PMID:24892873

  17. Validity And Practicality of Experiment Integrated Guided Inquiry-Based Module on Topic of Colloidal Chemistry for Senior High School Learning

    NASA Astrophysics Data System (ADS)

    Andromeda, A.; Lufri; Festiyed; Ellizar, E.; Iryani, I.; Guspatni, G.; Fitri, L.

    2018-04-01

    This research and development study aims to produce a valid and practical experiment-integrated guided inquiry-based module on the topic of colloidal chemistry. The 4D instructional design model was selected for this study. A limited trial of the product was conducted at SMAN 7 Padang. The instruments used were validity and practicality questionnaires. Validity and practicality data were analyzed using the Kappa moment. Analysis of the data shows that the Kappa moment for validity was 0.88, indicating a very high degree of validity. The Kappa moments for practicality from students and teachers were 0.89 and 0.95, respectively, indicating a high degree of practicality. Analysis of the modules filled in by students shows that 91.37% of students could correctly answer the critical thinking, exercise, prelab, postlab, and worksheet questions asked in the module. These findings indicate that the experiment-integrated guided inquiry-based module on the topic of colloidal chemistry is valid and practical for chemistry learning in senior high school.

  18. Development of Naphthalene PLIF for Making Quantitative Measurements of Ablation Products Transport in Supersonic Flows

    NASA Astrophysics Data System (ADS)

    Combs, Christopher; Clemens, Noel

    2014-11-01

    Ablation is a multi-physics process involving heat and mass transfer and codes aiming to predict ablation are in need of experimental data pertaining to the turbulent transport of ablation products for validation. Low-temperature sublimating ablators such as naphthalene can be used to create a limited physics problem and simulate ablation at relatively low temperature conditions. At The University of Texas at Austin, a technique is being developed that uses planar laser-induced fluorescence (PLIF) of naphthalene to visualize the transport of ablation products in a supersonic flow. In the current work, naphthalene PLIF will be used to make quantitative measurements of the concentration of ablation products in a Mach 5 turbulent boundary layer. For this technique to be used for quantitative research in supersonic wind tunnel facilities, the fluorescence properties of naphthalene must first be investigated over a wide range of state conditions and excitation wavelengths. The resulting calibration of naphthalene fluorescence will be applied to the PLIF images of ablation from a boundary layer plug, yielding 2-D fields of naphthalene mole fraction. These images may help provide data necessary to validate computational models of ablative thermal protection systems for reentry vehicles. Work supported by NASA Space Technology Research Fellowship Program under grant NNX11AN55H.

  19. Validation of GOES-9 Satellite-Derived Cloud Properties over the Tropical Western Pacific Region

    NASA Technical Reports Server (NTRS)

    Khaiyer, Mandana M.; Nordeen, Michele L.; Doeling, David R.; Chakrapani, Venkatasan; Minnis, Patrick; Smith, William L., Jr.

    2004-01-01

    Real-time processing of hourly GOES-9 images in the ARM TWP region began operationally in October 2003 and is continuing. The ARM sites provide an excellent source for validating this new satellite-derived cloud and radiation property dataset. Derived cloud amounts, heights, and broadband shortwave fluxes are compared with similar quantities derived from ground-based instrumentation. The results will provide guidance for estimating uncertainties in the GOES-9 products and for developing improvements in the retrieval methodologies and input.

  20. Development and validation of chromatographic methods (HPLC and GC) for the determination of the active components (benzocaine, tyrothricin and menthol) of a pharmaceutical preparation.

    PubMed

    Ortiz-Boyer, F; Tena, M T; Luque de Castro, M D; Valcárcel, M

    1995-10-01

    Methods are reported for the determination of tyrothricin and benzocaine by HPLC and of menthol by GC in the analysis of throat lozenges (tablets) containing all three compounds. After optimization of the variables involved in both the HPLC and GC procedures, the methods were characterized and validated according to the guidelines of the Spanish Pharmacopoeia and applied to both the monitoring of the manufacturing process and the quality control of the final product.

  1. A Programming Environment Evaluation Methodology for Object-Oriented Systems. Ph.D Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1987-01-01

    The object-oriented design strategy as both a problem decomposition and system development paradigm has made impressive inroads into the various areas of the computing sciences. Substantial development productivity improvements have been demonstrated in areas ranging from artificial intelligence to user interface design. However, there has been very little progress in the formal characterization of these productivity improvements and in the identification of the underlying cognitive mechanisms. The development and validation of models and metrics of this sort require large amounts of systematically-gathered structural and productivity data. There has, however, been a notable lack of systematically-gathered information on these development environments. A large part of this problem is attributable to the lack of a systematic programming environment evaluation methodology that is appropriate to the evaluation of object-oriented systems.

  2. Developing evaluation instrument based on CIPP models on the implementation of portfolio assessment

    NASA Astrophysics Data System (ADS)

    Kurnia, Feni; Rosana, Dadan; Supahar

    2017-08-01

    This study aimed to develop an evaluation instrument constructed by the CIPP model for the implementation of portfolio assessment in science learning. This study used the research and development (R & D) method, adapting the 4-D model to the development of a non-test instrument, with the evaluation instrument constructed by the CIPP model. CIPP is the abbreviation of Context, Input, Process, and Product. The techniques of data collection were interviews, questionnaires, and observations. The data collection instruments were: 1) interview guidelines for the analysis of the problems and the needs, 2) a questionnaire to gauge the level of accomplishment of the portfolio assessment instrument, and 3) observation sheets for teachers and students to gather responses to the portfolio assessment instrument. The data obtained were quantitative data from several validators. The validators consisted of two lecturers as evaluation experts, two practitioners (science teachers), and three colleagues. This paper shows the results of content validity obtained from the validators and the analysis of the data using Aiken's V formula. The results of this study show that the evaluation instrument based on the CIPP model is appropriate for evaluating the implementation of portfolio assessment instruments. Based on the judgments of the experts, practitioners, and colleagues, the Aiken's V coefficient was between 0.86 and 1.00, which means that the instrument is valid and can be used in the limited trial and operational field trial.
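    As an illustrative aside, the Aiken's V content-validity coefficient used above can be computed directly from validator ratings. A minimal sketch with hypothetical ratings on a 1-5 scale (not the study's data):

    ```python
    def aikens_v(ratings, lo=1, hi=5):
        """Aiken's V content-validity coefficient for a single item:
        V = sum(r_i - lo) / (n * (hi - lo)), ranging from 0 to 1,
        where lo and hi are the lowest and highest possible ratings."""
        n = len(ratings)
        return sum(r - lo for r in ratings) / (n * (hi - lo))

    # Seven hypothetical validators (e.g., two experts, two practitioners,
    # three colleagues), each rating one item on a 1-5 scale
    item_ratings = [5, 5, 4, 5, 5, 4, 5]
    print(round(aikens_v(item_ratings), 2))  # -> 0.93
    ```

    Coefficients near 1 indicate strong rater agreement that the item is content-valid, which is how a 0.86-1.00 range supports instrument validity.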

  3. The Development and Evaluation of a Computer-Based System for Managing the Design and Pilot-Testing of Interactive Videodisc Programs. Training and Development Research Center, Project Number Forty-Three.

    ERIC Educational Resources Information Center

    Sayre, Scott Alan

    The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive video (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…

  4. Hydrogen production by the hyperthermophilic bacterium Thermotoga maritima Part II: modeling and experimental approaches for hydrogen production.

    PubMed

    Auria, Richard; Boileau, Céline; Davidson, Sylvain; Casalot, Laurence; Christen, Pierre; Liebgott, Pierre Pol; Combet-Blanc, Yannick

    2016-01-01

    Thermotoga maritima is a hyperthermophilic bacterium known to produce hydrogen from a large variety of substrates. The aim of the present study is to propose a mathematical model incorporating kinetics of growth, consumption of substrates, product formation, and inhibition by hydrogen in order to predict hydrogen production under defined culture conditions. Our mathematical model, incorporating data concerning growth, substrates, and products, was developed to predict hydrogen production from batch fermentations of the hyperthermophilic bacterium T. maritima. It includes the inhibition by hydrogen and the liquid-to-gas mass transfer of H2, CO2, and H2S. Most kinetic parameters of the model were obtained from batch experiments without any fitting. The mathematical model is adequate for glucose, yeast extract, and thiosulfate concentrations ranging from 2.5 to 20 mmol/L, 0.2 to 0.5 g/L, and 0.01 to 0.06 mmol/L, respectively, corresponding to one of these compounds being the growth-limiting factor of T. maritima. When glucose, yeast extract, and thiosulfate concentrations are all higher than these ranges, the model overestimates all the variables. Within the window of the model's validity, predictions of the model show that the combination of both variables (increase in limiting-factor concentration and in inlet gas stream) leads to up to a twofold increase of the maximum H2-specific productivity with the lowest inhibition. A mathematical model predicting H2 production in T. maritima was successfully designed and confirmed in this study. However, it shows the limits of validity of such mathematical models: their applicability must take into account the range of validity in which the parameters were established.

  5. A new food frequency questionnaire to assess chocolate and cocoa consumption.

    PubMed

    Vicente, Filipa; Saldaña-Ruíz, Sandra; Rabanal, Manel; Rodríguez-Lagunas, María J; Pereira, Paula; Pérez-Cano, Francisco J; Castell, Margarida

    2016-01-01

    Cocoa has been highlighted as a food with potential benefits to human health because of its polyphenol content. However, few studies show the contribution of cocoa and chocolate products in polyphenol intake. The aim of this work was to develop a food frequency questionnaire (FFQ) for evaluating the intake of food products containing cocoa (C-FFQ). A sample of 50 university students was recruited to complete the 90-item questionnaire, a validated questionnaire (called here European Food Safety Authority [EFSA]-Q) as well as a 24-hour dietary recall (24 HDR). Spearman correlation test, Bland-Altman plots, and quintile classification analysis were conducted together with the Wilcoxon test and descriptive statistics. Significant correlations between the C-FFQ and the EFSA-Q for the most common cocoa/chocolate products were observed (P < 0.05), as well as between data from the C-FFQ and 24 HDR (P < 0.05). However, a number of cocoa/chocolate products frequently consumed by the participants were detected by the C-FFQ and 24 HDR which were not included in the EFSA-Q. According to the C-FFQ, chocolate bars were the main source of cocoa in university students, but dairy products also provided an important amount of cocoa. The developed C-FFQ questionnaire can be considered as a valid option for assessing the consumption frequency of cocoa/chocolate-derived products, thereby allowing the evaluation of cocoa polyphenol intake in further studies. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Characterization of a whole, inactivated influenza (H5N1) vaccine.

    PubMed

    Tada, Yoshikazu

    2008-11-01

    Effective vaccines against the highly pathogenic influenza A/H5N1 virus are being developed worldwide. In Japan, two adjuvanted, inactivated, whole-virion influenza vaccines were recently developed and licensed as mock-up, pre-pandemic vaccine formulations by the Ministry of Health, Labour and Welfare of Japan. During the vaccine design and development process, various obstacles were overcome and, in this report, we introduce the nonclinical studies, production, human immunogenicity data, and development process associated with the egg-derived adjuvanted, inactivated, whole-virion influenza A (H5N1) vaccine. Pilot lots of H5N1 vaccine were produced using the avirulent H5N1 reference strain A/Vietnam/1194/2004 (H5N1) NIBRG-14 and administered following adsorption with aluminum hydroxide as an adjuvant. Quality control and formulation stability tests were performed before clinical trials were initiated (phase I-III). The Research Foundation for Microbial Diseases of Osaka University (BIKEN) carried out vaccine production, quality control, stability testing, and the phase I clinical trial, in addition to overseeing the licensing of this vaccine. Mitsubishi Chemical Safety Institute Ltd. carried out the nonclinical pharmacological toxicity and safety studies, and the Japanese medical association carried out the phase II/III trials. Phase I-III trials took place in 2006. The production processes were well controlled by established tests and validations. Vaccine quality was confirmed by quality control, stability, and pre-clinical tests, and the vaccine was approved as a mock-up, pre-pandemic vaccine by the Ministry of Health, Labour and Welfare of Japan. Numerous safety and efficacy procedures were carried out prior to the approval of the described vaccine formulation. Some of these procedures were of particular importance, e.g., vaccine development, validation, and quality control tests that included strict monitoring of the hemagglutinin (HA) content of the vaccine formulations. Improving vaccine productivity, shortening the production period, and improving the antigen yield of the avirulent vaccine strains were also considered important vaccine development criteria.

  7. Challenges and opportunities in bioanalytical support for gene therapy medicinal product development.

    PubMed

    Ma, Mark; Balasubramanian, Nanda; Dodge, Robert; Zhang, Yan

    2017-09-01

    Gene and nucleic acid therapies have demonstrated patient benefits in addressing unmet medical needs. Besides considerations regarding the biological nature of the gene therapy, the quality of bioanalytical methods plays an important role in ensuring the success of these novel therapies. Inconsistent approaches among bioanalytical labs during the preclinical and clinical phases have been observed. There are many underlying reasons for this inconsistency: the various platforms and reagents used in quantitative methods, the lack of detailed regulatory guidance on method validation, and uncertainty in immunogenicity strategies supporting gene therapy may all be influential. This review summarizes recent practices and considerations in bioanalytical support of pharmacokinetic/pharmacodynamic and immunogenicity evaluations in gene therapy development, with insight into method design, development, and validation.

  8. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    PubMed Central

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. The HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R2 = 1.00). The limit of detection of maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the data of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD of 0.21–1.65%. The developed method was applied successfully to quantify maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
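    The precision (RSD) and recovery figures quoted in validation reports like this one follow standard formulas. A hedged illustration with hypothetical replicate data, not the study's measurements:

    ```python
    import statistics

    def rsd_percent(values):
        """Relative standard deviation: %RSD = 100 * sample SD / mean."""
        return 100 * statistics.stdev(values) / statistics.mean(values)

    def recovery_percent(measured, spiked):
        """Spike recovery: % = 100 * amount found / amount spiked."""
        return 100 * measured / spiked

    # Hypothetical replicate peak areas from an intra-day precision run
    areas = [1001.0, 998.0, 1003.0, 999.0, 1002.0, 997.0]
    print(round(rsd_percent(areas), 3))
    print(round(recovery_percent(10.15, 10.0), 2))  # -> 101.5
    ```

    Low %RSD across replicates demonstrates precision, and recovery near 100% demonstrates accuracy, which is what ranges such as 101.35-101.75% with RSD 0.21-1.65% convey.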

  9. Sourcing Life Cycle Inventory Data

    EPA Science Inventory

    The collection and validation of quality lifecycle inventory (LCI) data can be the most difficult and time-consuming aspect of developing a life cycle assessment (LCA). Large amounts of process and production data are needed to complete the LCI. For many studies, the LCA analyst ...

  10. Bio-Optical Measurement and Modeling of the California Current and Southern Oceans

    NASA Technical Reports Server (NTRS)

    Mitchell, B. Gregg; Mitchell, B. Greg

    2003-01-01

    The SIMBIOS project's principal goals are to validate standard or experimental ocean color products through detailed bio-optical and biogeochemical measurements, and to combine ocean optical observations with modeling to contribute to satellite vicarious radiometric calibration and algorithm development.

  11. Validation of genetic markers associated with chalkbrood resistance

    USDA-ARS?s Scientific Manuscript database

    Chalkbrood is one of the major fungal diseases of honey bee brood. Systemic mycoses caused by the fungus, Ascosphaera apis, may significantly reduce brood population, and consequently, colony strength and productivity. Developing genetic marker(s) associated with the enhanced brood survival will be ...

  12. Measuring metacognitive ability based on science literacy in dynamic electricity topic

    NASA Astrophysics Data System (ADS)

    Warni; Sunyono; Rosidin

    2018-01-01

    This study aims to produce a theoretically and empirically feasible instrument for assessing metacognitive ability based on science literacy for the topic of dynamic electricity. The feasibility of the assessment instrument includes theoretical validity in the material, construction, and language aspects, as well as empirical validity, reliability, difficulty, discrimination, and distractor indices. The development of the assessment instrument refers to the Dick and Carey development model, which includes a preliminary study stage, initial product development, validation and revision, and piloting. The instrument was tested on 32 students of class IX in SMP Negeri 20 Bandar Lampung, using a One-Group Pretest-Posttest Design. The results show that the metacognitive ability assessment instrument based on science literacy is theoretically feasible, with a theoretical validity percentage of 95.44%, and empirically valid, with 43.75% of items in the high category, 43.75% in the medium category, and 12.50% in the low category. The reliability of the assessment instrument was 0.83 (high category). The difficulty level was difficult for about 31.25% of items and medium for 68.75%. Items with very good discriminating power made up 12.50%, good 62.50%, and medium 25.00%. The distractor function of the multiple-choice items was good for 80.00% of items and medium for 20.00%.
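    The item difficulty and discrimination indices referred to above are simple proportions. A minimal sketch with hypothetical counts (not the study's data):

    ```python
    def difficulty_index(n_correct, n_students):
        """Proportion-correct difficulty index p; lower p means a harder item."""
        return n_correct / n_students

    def discrimination_index(upper_correct, lower_correct, group_size):
        """D = p_upper - p_lower: how much better the top-scoring group
        answered the item than the bottom-scoring group."""
        return (upper_correct - lower_correct) / group_size

    # Hypothetical item from a 32-student class, top and bottom groups of 8
    print(difficulty_index(16, 32))        # -> 0.5 (medium difficulty)
    print(discrimination_index(7, 3, 8))   # -> 0.5 (good discrimination)
    ```

    Items that most students miss (low p) fall in the "difficult" category, while a large gap between the top and bottom groups indicates strong discriminating power.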

  13. Developing self-concept instrument for pre-service mathematics teachers

    NASA Astrophysics Data System (ADS)

    Afgani, M. W.; Suryadi, D.; Dahlan, J. A.

    2018-01-01

    This study aimed to develop a self-concept instrument for undergraduate students of mathematics education in Palembang, Indonesia. This was a development study of a non-test instrument in questionnaire form. The validity of the instrument was assessed with a construct validity test using the Pearson product-moment correlation and factor analysis, while the reliability test used Cronbach's alpha. The instrument was tested on 65 undergraduate students of mathematics education at one of the universities in Palembang, Indonesia. The instrument consisted of 43 items covering 7 aspects of self-concept: individual concern, social identity, individual personality, view of the future, the influence of others who become role models, the influence of the environment inside or outside the classroom, and view of mathematics. The validity test revealed one invalid item, because its Pearson's r value of 0.107 was less than the critical value (0.244; α = 0.05). The item belonged to the social identity aspect. After the invalid item was removed, the construct validity test with factor analysis generated only one factor. The Kaiser-Meyer-Olkin (KMO) coefficient was 0.846 and the reliability coefficient was 0.91. From these results, we concluded that the self-concept instrument for undergraduate students of mathematics education in Palembang, Indonesia was valid and reliable with 42 items.

  14. Development and validation of a cancer awareness questionnaire for Malaysian undergraduate students of Chinese ethnicity.

    PubMed

    Loo, Jo Lin; Ang, Yee Kwang; Yim, Hip Seng

    2013-01-01

    To describe the development and validation of a cancer awareness questionnaire (CAQ) based on a literature review of previous studies focusing on cancer awareness and prevention. A total of 388 Chinese undergraduate students at a private university in Kuala Lumpur, Malaysia, were recruited to evaluate the developed self-administered questionnaire. The CAQ consisted of four sections: awareness of cancer warning signs and screening tests; knowledge of cancer risk factors; barriers in seeking medical advice; and attitudes towards cancer and cancer prevention. The questionnaire was evaluated for construct validity using principal component analysis and internal consistency using Cronbach's alpha (α) coefficient. Test-retest reliability was assessed over a 10-14 day interval and measured using the Pearson product-moment correlation. The initial 77-item CAQ was reduced to 63 items, with satisfactory construct validity and a high total internal consistency (Cronbach's α=0.77). A total of 143 students completed the questionnaire for the test-retest reliability, obtaining an overall correlation of 0.72 (p<0.001). The CAQ could provide a reliable and valid measure to assess cancer awareness among local Chinese undergraduate students. However, further studies among students from different backgrounds (e.g., ethnicity) are required in order to facilitate the use of the cancer awareness questionnaire among all university students.
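    Test-retest reliability via the Pearson product-moment correlation, as used above, can be sketched as follows (hypothetical scores, standard library only, not the study's data):

    ```python
    import math

    def pearson_r(x, y):
        """Pearson product-moment correlation coefficient between paired scores."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return sxy / (sx * sy)

    # Hypothetical first-administration and retest totals for five respondents
    test = [50, 42, 57, 61, 45]
    retest = [52, 40, 55, 63, 47]
    print(round(pearson_r(test, retest), 2))
    ```

    A coefficient near 1 means respondents are ranked almost identically across the two administrations; values around 0.7, as reported above, are conventionally taken as acceptable test-retest stability.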

  15. A calibration hierarchy for risk models was defined: from utopia to empirical data.

    PubMed

    Van Calster, Ben; Nieboer, Daan; Vergouwe, Yvonne; De Cock, Bavo; Pencina, Michael J; Steyerberg, Ewout W

    2016-06-01

    Calibrated risk models are vital for valid decision support. We define four levels of calibration and describe implications for model development and external validation of predictions. We present results based on simulated data sets. A common definition of calibration is "having an event rate of R% among patients with a predicted risk of R%," which we refer to as "moderate calibration." Weaker forms of calibration only require the average predicted risk (mean calibration) or the average prediction effects (weak calibration) to be correct. "Strong calibration" requires that the event rate equal the predicted risk for every covariate pattern. This implies that the model is fully correct for the validation setting. We argue that this is unrealistic: the model type may be incorrect, the linear predictor is only asymptotically unbiased, and all nonlinear and interaction effects would have to be correctly modeled. In addition, we prove that moderate calibration guarantees nonharmful decision making. Finally, results indicate that a flexible assessment of calibration in small validation data sets is problematic. Strong calibration is desirable for individualized decision support but unrealistic and counterproductive, because it stimulates the development of overly complex models. Model development and external validation should focus on moderate calibration. Copyright © 2016 Elsevier Inc. All rights reserved.
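    The weaker calibration levels described above can be illustrated numerically. A hedged sketch with simulated data (NumPy assumed): mean calibration compares the average predicted risk with the observed event rate, and moderate calibration compares them within predicted-risk deciles:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated validation set: outcomes are drawn from the predicted risks,
    # so the model is well calibrated by construction.
    pred = rng.uniform(0.05, 0.95, size=20_000)
    events = rng.binomial(1, pred)

    # Mean calibration ("calibration-in-the-large"):
    # the average predicted risk should match the observed event rate.
    print(f"mean predicted {pred.mean():.3f} vs observed {events.mean():.3f}")

    # Moderate calibration: within each predicted-risk decile, the observed
    # event rate should be close to the mean predicted risk in that decile.
    order = np.argsort(pred)
    for decile in np.array_split(order, 10):
        print(f"predicted {pred[decile].mean():.2f} "
              f"-> observed {events[decile].mean():.2f}")
    ```

    Strong calibration would additionally demand agreement for every covariate pattern, not just on average within risk groups, which is why the authors call it unrealistic for empirical data.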

  16. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    PubMed

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and a quality-by-design (QbD) approach is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on the critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots enabled the design space (DS) (the method operable design region), where all CQAs fulfilled the requirements, to be established. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Development of the Assessment of Belief Conflict in Relationship-14 (ABCR-14)

    PubMed Central

    Kyougoku, Makoto; Teraoka, Mutsumi; Masuda, Noriko; Ooura, Mariko; Abe, Yasushi

    2015-01-01

    Purpose: Nurses and other healthcare workers frequently experience belief conflict, one of the most important new stress-related problems in both academic and clinical fields. Methods: In this study, using a sample of 1,683 nursing practitioners, we developed the Assessment of Belief Conflict in Relationship-14 (ABCR-14), a new scale that assesses belief conflict in the healthcare field. Standard psychometric procedures were used to develop and test the scale, including qualitative framework-concept and item-pool development, item reduction, and scale development. We analyzed the psychometric properties of the ABCR-14 using entropy, polyserial correlation coefficients, exploratory factor analysis, confirmatory factor analysis, average variance extracted, Cronbach's alpha, the Pearson product-moment correlation coefficient, and multidimensional item response theory (MIRT). Results: The results of the analysis supported a three-factor model consisting of 14 items. The validity and reliability of the ABCR-14 were supported by evidence of high construct validity, structural validity, hypothesis testing, internal consistency reliability, and concurrent validity. The MIRT results offered strong support, with good item slope parameters and difficulty parameters. However, the ABCR-14 Likert scale might need to be explored further from the MIRT point of view. Yet, as mentioned above, there is sufficient evidence to support that the ABCR-14 has high validity and reliability. Conclusion: The ABCR-14 demonstrates good psychometric properties for measuring nursing belief conflict. Further studies are recommended to confirm its application in clinical practice. PMID:26247356

  18. Hypermedia and visual technology

    NASA Technical Reports Server (NTRS)

    Walker, Lloyd

    1990-01-01

    Applications of a codified professional practice that uses visual representations of the thoughts and ideas of a working group are reported in order to improve productivity, problem solving, and innovation. This visual technology process was developed under the auspices of General Foods as part of a multi-year study. The study resulted in the validation of this professional service as a way to use art and design to facilitate productivity and innovation and to define new opportunities. It was also used by NASA for planning Lunar/Mars exploration and by other companies for general business and advanced strategic planning, developing new product concepts, and litigation support. General Foods has continued to use the service for packaging innovation studies.

  19. Was There a Significantly Negative Anomaly of Global Land Surface Net Radiation from 2001-2006?

    NASA Astrophysics Data System (ADS)

    Liang, S.; Jia, A.; Jiang, B.

    2016-12-01

    Surface net radiation, which characterizes the surface energy budget, can be estimated from in-situ measurements, satellite products, model simulations, and reanalysis. Satellite products are usually validated using ground measurements to characterize their uncertainties. The surface net radiation product from CERES (Clouds and the Earth's Radiant Energy System) has been widely used. After validating it using extensive ground measurements, we also verified that the CERES surface net radiation product is highly accurate. When we evaluated the temporal variations of the averaged global land surface net radiation from the CERES product, we found a significantly negative anomaly starting in 2001, reaching its maximum in 2004, and gradually returning to normal in 2006. The valley has a magnitude of approximately 3 W m-2, centered at 2004. After comparing with the high-resolution GLASS (Global LAnd Surface Satellite) net radiation product developed at Beijing Normal University, the CMIP5 model simulations, and the ERA-Interim reanalysis dataset, we concluded that the significant decreasing pattern of land surface net radiation from 2001-2006 is an artifact, mainly due to inaccurate longwave net radiation in the CERES surface net radiation product. The current ground measurement networks are not spatially dense enough to capture the false negative anomaly in the CERES product, which calls for more ground measurements.

  20. Operational Processing of Ground Validation Data for the Tropical Rainfall Measuring Mission

    NASA Technical Reports Server (NTRS)

    Kulie, Mark S.; Robinson, Mike; Marks, David A.; Ferrier, Brad S.; Rosenfeld, Danny; Wolff, David B.

    1999-01-01

The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. A primary goal of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented for this mission. A key component of GV is the analysis and quality control of meteorological ground-based radar data from four primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, the Joint Center for Earth Systems Technology (JCET) at the University of Maryland, Baltimore County, has been tasked with developing and implementing an operational system to quality control (QC), archive, and provide data for subsequent rainfall product generation from the four primary GV sites. This paper provides an overview of the JCET operational environment. A description of the QC algorithm and performance, in addition to the data flow procedure between JCET and the TRMM Science and Data Information System (TSDIS), are presented. The impact of quality-controlled data on higher level rainfall and reflectivity products will also be addressed. Finally, a brief description of JCET's expanded role in producing reference rainfall products will be discussed.

  1. A validated stability-indicating UPLC method for desloratadine and its impurities in pharmaceutical dosage forms.

    PubMed

    Rao, Dantu Durga; Satyanarayana, N V; Malleswara Reddy, A; Sait, Shakil S; Chakole, Dinesh; Mukkanti, K

    2010-02-05

A novel stability-indicating gradient reverse phase ultra-performance liquid chromatographic (RP-UPLC) method was developed for the determination of the purity of desloratadine in the presence of its impurities and forced degradation products. The method was developed using a Waters Acquity BEH C18 column with a mobile phase containing a gradient mixture of solvents A and B. The eluted compounds were monitored at 280 nm. The run time was 8 min, within which desloratadine and its five impurities were well separated. Desloratadine was subjected to the stress conditions of oxidative, acid, base, hydrolytic, thermal and photolytic degradation. Desloratadine was found to degrade significantly under oxidative and thermal stress conditions and to be stable under acid, base, hydrolytic and photolytic degradation conditions. The degradation products were well resolved from the main peak and its impurities, thus proving the stability-indicating power of the method. The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of detection, limit of quantification, accuracy, precision and robustness. This method is also suitable for the assay determination of desloratadine in pharmaceutical dosage forms.

  2. Analysis of the Validity of Environmental Kuznets Curve for the Baltic States

    NASA Astrophysics Data System (ADS)

    Lapinskienė, Giedrė; Tvaronavičienė, Manuela; Vaitkus, Pranas

    2013-12-01

The paper analyses the traditional Environmental Kuznets Curve (EKC) relationship between greenhouse gases (GHG) and gross domestic product (GDP), extending the research to include additional factors such as environmental tax, research and development expenditure, the implicit tax rate on energy, primary production of coal and lignite, and the energy intensity of the economy, taken from the Eurostat database. The EKC indicates that, at the early stages of economic growth, pollution increases with the growing use of resources, but when a certain level of income per capita is reached, the trend reverses, so that at a higher development stage further economic growth leads to the improvement of the environment. In the first part of the research, the validity of the reduced EKC for the Baltic region for the period 1995-2008 is determined. In the second part, the impact of the selected factors is statistically tested. In both cases, the standard cubic equation is used, because this model is believed to be the most accurate for the development stage of this region. The research results may be useful for climate change policy design.
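The "standard cubic equation" form of the EKC is typically a cubic polynomial in log income fitted by ordinary least squares. A minimal sketch of that specification; the coefficients and data below are synthetic illustrations, not the paper's estimates:

```python
import numpy as np

def fit_cubic_ekc(ln_gdp, ln_ghg):
    """Fit ln(GHG) = b0 + b1*x + b2*x^2 + b3*x^3, with x = ln(GDP per
    capita), by ordinary least squares (the cubic EKC specification)."""
    X = np.column_stack([np.ones_like(ln_gdp), ln_gdp, ln_gdp**2, ln_gdp**3])
    beta, *_ = np.linalg.lstsq(X, ln_ghg, rcond=None)
    return beta  # [b0, b1, b2, b3]

# Hypothetical noise-free data following a cubic income-emissions curve
x = np.linspace(8.0, 10.5, 50)              # ln GDP per capita
y = -2.0 + 1.5*x - 0.08*x**2 + 0.001*x**3   # synthetic ln GHG
b = fit_cubic_ekc(x, y)
```

The signs of b1, b2 and b3 determine whether the fitted curve has the inverted-U (or N) shape the EKC hypothesis predicts.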

  3. Development and validation of an extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. in seafood and meat products.

    PubMed

    Mejlholm, Ole; Dalgaard, Paw

    2013-10-15

    A new and extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. was developed and validated for processed and unprocessed products of seafood and meat. The new model was developed by refitting and expanding an existing cardinal parameter model for growth and the growth boundary of lactic acid bacteria (LAB) in processed seafood (O. Mejlholm and P. Dalgaard, J. Food Prot. 70. 2485-2497, 2007). Initially, to estimate values for the maximum specific growth rate at the reference temperature of 25 °C (μref) and the theoretical minimum temperature that prevents growth of psychrotolerant LAB (T(min)), the existing LAB model was refitted to data from experiments with seafood and meat products reported not to include nitrite or any of the four organic acids evaluated in the present study. Next, dimensionless terms modelling the antimicrobial effect of nitrite, and acetic, benzoic, citric and sorbic acids on growth of Lactobacillus sakei were added to the refitted model, together with minimum inhibitory concentrations determined for the five environmental parameters. The new model including the effect of 12 environmental parameters, as well as their interactive effects, was successfully validated using 229 growth rates (μ(max) values) for psychrotolerant Lactobacillus spp. in seafood and meat products. Average bias and accuracy factor values of 1.08 and 1.27, respectively, were obtained when observed and predicted μ(max) values of psychrotolerant Lactobacillus spp. were compared. Thus, on average μ(max) values were only overestimated by 8%. The performance of the new model was equally good for seafood and meat products, and the importance of including the effect of acetic, benzoic, citric and sorbic acids and to a lesser extent nitrite in order to accurately predict growth of psychrotolerant Lactobacillus spp. was clearly demonstrated. The new model can be used to predict growth of psychrotolerant Lactobacillus spp. 
in seafood and meat products; e.g., prediction of the time to reach a critical cell concentration is useful for establishing shelf life. In addition, the high number of environmental parameters included in the new model makes it flexible and suitable for product development, as the effect of substituting one combination of preservatives with another can be predicted. In general, the performance of the new model was unacceptable for other types of LAB, including Carnobacterium spp., Leuconostoc spp. and Weissella spp. © 2013.
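The bias and accuracy factors quoted above (1.08 and 1.27) are the standard validation indices of predictive microbiology (Ross-type definitions): the bias factor measures systematic over- or under-prediction, and the accuracy factor the average spread. A minimal sketch with hypothetical μmax values:

```python
import math

def bias_accuracy_factors(observed, predicted):
    """Bias factor Bf = 10**mean(log10(pred/obs)); accuracy factor
    Af = 10**mean(|log10(pred/obs)|). Bf > 1 means growth rates are
    overestimated on average; Af >= Bf always."""
    logs = [math.log10(p / o) for o, p in zip(observed, predicted)]
    n = len(logs)
    bf = 10 ** (sum(logs) / n)
    af = 10 ** (sum(abs(l) for l in logs) / n)
    return bf, af

# Hypothetical mu_max values (1/h): predictions uniformly 8% above observations
obs = [0.10, 0.20, 0.30]
pred = [0.108, 0.216, 0.324]
bf, af = bias_accuracy_factors(obs, pred)  # bf = af = 1.08
```

A Bf of 1.08, as reported for the new model, corresponds to an average 8% overestimation of μmax.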

  4. Overview of calibration and validation activities for the EUMETSAT polar system: second generation (EPS-SG) visible/infrared imager (METimage)

    NASA Astrophysics Data System (ADS)

    Phillips, P.; Bonsignori, R.; Schlüssel, P.; Schmülling, F.; Spezzi, L.; Watts, P.; Zerfowski, I.

    2016-10-01

    The EPS-SG Visible/Infrared Imaging (VII) mission is dedicated to supporting the optical imagery user needs for Numerical Weather Prediction (NWP), Nowcasting (NWC) and climate in the timeframe beyond 2020. The VII mission is fulfilled by the METimage instrument, developed by the German Space Agency (DLR) and funded by the German government and EUMETSAT. Following on from an important list of predecessors such as the Advanced Very High Resolution Radiometer (AVHRR) and the Moderate resolution Imaging Spectro-radiometer (MODIS), METimage will fly in the mid-morning orbit of the Joint Polar System, whilst the early-afternoon orbits are served by the JPSS (U.S. Joint Polar Satellite System) Visible Infrared Imager Radiometer Suite (VIIRS). METimage itself is a cross-purpose medium resolution, multi-spectral optical imager, measuring the optical spectrum of radiation emitted and reflected by the Earth from a low-altitude sun synchronous orbit over a minimum swath width of 2700 km. The top of the atmosphere outgoing radiance will be sampled every 500 m (at nadir) with measurements made in 20 spectral channels ranging from 443 nm in the visible up to 13.345 μm in the thermal infrared. The three major objectives of the EPS-SG METimage calibration and validation activities are: • Verification of the instrument performances through continuous in-flight calibration and characterisation, including monitoring of long term stability. • Provision of validated level 1 and level 2 METimage products. • Revision of product processing facilities, i.e. algorithms and auxiliary data sets, to assure that products conform with user requirements, and then, if possible, exceed user expectations. This paper will describe the overall Calibration and Validation (Cal/Val) logic and the methods adopted to ensure that the METimage data products meet performance specifications for the lifetime of the mission. 
Such methods include inter-comparisons with other missions through simultaneous nadir overpasses, comparisons with ground-based observations, analysis of algorithm internal diagnostics to confirm retrieval performance for geophysical products, and vicarious calibration to assist with validation of the instrument's on-board calibration. Any identified deficiencies in the products will lead either to an update of the auxiliary data sets (e.g. calibration key data) that are used to configure the product processors or to a revision of the algorithms themselves. The Cal/Val activities are mostly foreseen during commissioning but will inevitably extend into routine operations in order to take on board seasonal variations and ensure long-term stability of the calibrated radiances and geophysical products. A prerequisite to validation of the products at the scientific level is that the satellite and the instrument itself have been verified against their respective specifications, both pre-launch and during the satellite in-orbit verification phase.

  5. A validated stability-indicating LC method for the separation of enantiomer and potential impurities of Linezolid using polar organic mode.

    PubMed

    Satyanarayana Raju, T; Vishweshwari Kutty, O; Ganesh, V; Yadagiri Swamy, P

    2012-08-01

Although a number of methods are available for evaluating Linezolid and its possible impurities, a single efficient method that separates its potential impurities, degradants and enantiomer has remained unavailable. With the objective of developing an advanced method with shorter run times, a simple, precise, accurate stability-indicating LC method was developed for the determination of the purity of Linezolid drug substance and drug products in bulk samples and pharmaceutical dosage forms in the presence of its impurities and degradation products. This method is capable of separating all the related substances of Linezolid along with the chiral impurity. This method can also be used for the estimation of the assay of Linezolid in drug substance as well as in drug product. The method was developed using a Chiralpak IA (250 mm×4.6 mm, 5 μm) column. A mixture of acetonitrile, ethanol, n-butylamine and trifluoroacetic acid in a 96:4:0.10:0.16 (v/v/v/v) ratio was used as the mobile phase. The eluted compounds were monitored at 254 nm. Linezolid was subjected to the stress conditions of oxidative, acid, base, hydrolytic, thermal and photolytic degradation. The degradation products were well resolved from the main peak and its impurities, proving the stability-indicating power of the method. The developed method was validated as per International Conference on Harmonization (ICH) guidelines with respect to specificity, limit of detection, limit of quantification, precision, linearity, accuracy, robustness and system suitability.

  6. Validated stability-indicating spectrophotometric methods for the determination of cefixime trihydrate in the presence of its acid and alkali degradation products.

    PubMed

    Mostafa, Nadia M; Abdel-Fattah, Laila; Weshahy, Soheir A; Hassan, Nagiba Y; Boltia, Shereen A

    2015-01-01

Five simple, accurate, precise, and economical spectrophotometric methods have been developed for the determination of cefixime trihydrate (CFX) in the presence of its acid and alkali degradation products without prior separation. In the first method, second derivative (2D) and first derivative (1D) spectrophotometry was applied to the absorption spectra of CFX and its acid (2D) or alkali (1D) degradation products by measuring the amplitude at 289 and 308 nm, respectively. The second method was a first derivative of ratio spectra (1DD) spectrophotometric method, where the peak amplitudes were measured at 311 nm in the presence of the acid degradation product, and at 273 and 306 nm in the presence of the alkali degradation product. The third method was ratio subtraction spectrophotometry, where the drug is determined at 286 nm in laboratory-prepared mixtures of CFX and its acid or alkali degradation product. The fourth method was based on dual wavelength analysis; two wavelengths were selected at which the absorbances of one component were the same, so wavelengths 209 and 252 nm were used to determine CFX in the presence of its acid degradation product and 310 and 321 nm in the presence of its alkali degradation product. The fifth method was bivariate spectrophotometric calibration based on four linear regression equations obtained at the wavelengths 231 and 290 nm, and 231 and 285 nm, for the binary mixture of CFX with either its acid or alkali degradation product, respectively. The developed methods were successfully applied to the analysis of CFX in laboratory-prepared mixtures and pharmaceutical formulations with good recoveries, and their validation was carried out following the International Conference on Harmonization guidelines. The results obtained were statistically compared with each other and showed no significant difference with respect to accuracy and precision.
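The dual-wavelength principle behind the fourth method can be sketched in a few lines: by Beer-Lambert additivity, if the interfering degradation product absorbs equally at the two chosen wavelengths, its contribution cancels in the absorbance difference, which then tracks the drug alone. All absorbance values below are hypothetical:

```python
def dual_wavelength_signal(spectrum, wl1, wl2):
    """Absorbance difference A(wl1) - A(wl2). With wavelengths chosen so
    the interferent absorbs equally at both, the difference is free of the
    interferent and proportional to the analyte concentration."""
    return spectrum[wl1] - spectrum[wl2]

# Hypothetical absorbances at the 209/252 nm pair used for the acid mixture
drug = {209: 0.50, 252: 0.20}
degradant = {209: 0.30, 252: 0.30}   # equal absorbance at both wavelengths
mixture = {w: drug[w] + degradant[w] for w in (209, 252)}
signal = dual_wavelength_signal(mixture, 209, 252)
```

The mixture's difference equals the drug-only difference, so a calibration line of ΔA versus concentration determines CFX despite the degradant.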

  7. Cloud parameters from zenith transmittances measured by sky radiometer at surface: Method development and satellite product validation

    NASA Astrophysics Data System (ADS)

    Khatri, Pradeep; Hayasaka, Tadahiro; Iwabuchi, Hironobu; Takamura, Tamio; Irie, Hitoshi; Nakajima, Takashi Y.; Letu, Husi; Kai, Qin

    2017-04-01

Clouds are known to have profound impacts on atmospheric radiation, the water budget, climate change, and atmosphere-surface interaction. Cloud optical thickness (COT) and effective radius (Re) are two fundamental cloud parameters required to study clouds from a climatological and hydrological point of view. The large spatial-temporal coverage of these cloud parameters from space observation has proved very useful for cloud research; however, validation of space-based products is still a challenging task due to a lack of reliable data. Ground-based remote sensing instruments, such as the sky radiometers distributed around the world through the international observation networks SKYNET (http://atmos2.cr.chiba-u.jp/skynet/) and AERONET (https://aeronet.gsfc.nasa.gov/), have great potential to produce ground-truth cloud parameters in different parts of the globe to validate satellite products. For the sky radiometers of SKYNET and AERONET, a few cloud retrieval methods exist, but they have difficulty when the cloud is optically thin: the observed transmittances at two wavelengths can originate from more than one set of COT and Re, and the choice of the most plausible set is difficult. At the same time, calibration, especially for the near-infrared (NIR) wavelengths that are important for retrieving Re, is also difficult at present; as a result, instruments need to be calibrated on a high mountain or calibration terms need to be transferred from a standard instrument. Taking those points into account, we developed a new retrieval method designed to overcome the above-mentioned difficulties. We used observed transmittances at multiple wavelengths to overcome the first problem, and we further proposed a method to obtain the calibration constant of the NIR wavelength channel using observation data.
Our cloud retrieval method is found to produce relatively accurate COT and Re when validated using data from a collocated narrow-field-of-view radiometer at one SKYNET site. Though the method was developed for the sky radiometers of SKYNET, it can still be used for the sky radiometers of AERONET and other instruments observing spectral zenith transmittances. The proposed retrieval method is then applied to retrieve cloud parameters at key SKYNET sites within Japan, which are then used to validate cloud products obtained from space observations by the MODIS sensors onboard the TERRA/AQUA satellites and Himawari 8, a Japanese geostationary satellite. Our analyses suggest an underestimation (overestimation) of COT (Re) from space observations.
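The core of a multi-wavelength transmittance retrieval of this kind can be sketched as a lookup-table match: choose the (COT, Re) pair whose simulated transmittances best fit the observations, where using more than two wavelengths helps discriminate the multiple solutions that arise for optically thin clouds. The table below is a hypothetical stand-in for radiative-transfer output, not the authors' actual tables:

```python
import numpy as np

def retrieve_cloud(obs_trans, lut_trans, lut_cot, lut_re):
    """Least-squares lookup-table retrieval: return the (COT, Re) entry
    whose simulated multi-wavelength transmittances are closest to the
    observed ones."""
    cost = np.sum((lut_trans - obs_trans) ** 2, axis=1)
    i = int(np.argmin(cost))
    return lut_cot[i], lut_re[i]

# Hypothetical 3-entry table of zenith transmittances at 3 wavelengths
lut_trans = np.array([[0.90, 0.80, 0.70],
                      [0.60, 0.50, 0.40],
                      [0.30, 0.25, 0.20]])
lut_cot = [1.0, 5.0, 20.0]
lut_re = [8.0, 10.0, 12.0]
obs = np.array([0.61, 0.49, 0.41])   # closest to the middle entry
cot, re = retrieve_cloud(obs, lut_trans, lut_cot, lut_re)  # (5.0, 10.0)
```

With only two wavelengths, two table entries can yield nearly identical costs for thin clouds; extra channels break that degeneracy.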

  8. Screening for Small Molecule Inhibitors of Statin-Induced APP C-terminal Toxic Fragment Production

    PubMed Central

    Poksay, Karen S.; Sheffler, Douglas J.; Spilman, Patricia; Campagna, Jesus; Jagodzinska, Barbara; Descamps, Olivier; Gorostiza, Olivia; Matalis, Alex; Mullenix, Michael; Bredesen, Dale E.; Cosford, Nicholas D. P.; John, Varghese

    2017-01-01

    Alzheimer’s disease (AD) is characterized by neuronal and synaptic loss. One process that could contribute to this loss is the intracellular caspase cleavage of the amyloid precursor protein (APP) resulting in release of the toxic C-terminal 31-amino acid peptide APP-C31 along with the production of APPΔC31, full-length APP minus the C-terminal 31 amino acids. We previously found that a mutation in APP that prevents this caspase cleavage ameliorated synaptic loss and cognitive impairment in a murine AD model. Thus, inhibition of this cleavage is a reasonable target for new therapeutic development. In order to identify small molecules that inhibit the generation of APP-C31, we first used an APPΔC31 cleavage site-specific antibody to develop an AlphaLISA to screen several chemical compound libraries for the level of N-terminal fragment production. This antibody was also used to develop an ELISA for validation studies. In both high throughput screening (HTS) and validation testing, the ability of compounds to inhibit simvastatin- (HTS) or cerivastatin- (validation studies) induced caspase cleavage at the APP-D720 cleavage site was determined in Chinese hamster ovary (CHO) cells stably transfected with wildtype (wt) human APP (CHO-7W). Several compounds, as well as control pan-caspase inhibitor Q-VD-OPh, inhibited APPΔC31 production (measured fragment) and rescued cell death in a dose-dependent manner. The effective compounds fell into several classes including SERCA inhibitors, inhibitors of Wnt signaling, and calcium channel antagonists. Further studies are underway to evaluate the efficacy of lead compounds – identified here using cells and tissues expressing wt human APP – in mouse models of AD expressing mutated human APP, as well as to identify additional compounds and determine the mechanisms by which they exert their effects. PMID:28261092

  9. A model for methane production in sewers.

    PubMed

    Chaosakul, Thitirat; Koottatep, Thammarat; Polprasert, Chongrak

    2014-09-19

Most sewers in developing countries are combined sewers which receive stormwater and effluent from septic tanks or cesspools of households and buildings. Although the wastewater strength in these sewers is usually lower than in developed countries, due to improper construction and maintenance the hydraulic retention time (HRT) can be relatively long, resulting in considerable greenhouse gas (GHG) production. This study proposed an empirical model to predict the quantity of methane production in gravity-flow sewers based on relevant parameters such as the surface area to volume ratio (A/V) of the sewer, HRT and wastewater temperature. The model was developed from field survey data of gravity-flow sewers located in a peri-urban area of central Thailand and validated with field data from a sewer system in the Gold Coast area, Queensland, Australia. Application of this model to improve the construction and maintenance of gravity-flow sewers so as to minimize GHG production and reduce global warming is presented.
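Empirical sewer-methane models of this kind typically scale production with A/V and HRT and apply a temperature-correction factor. A minimal sketch of that functional form; the rate constant `k` and temperature coefficient `theta` below are hypothetical placeholders, not the coefficients fitted in the study:

```python
def methane_production(a_v, hrt_h, temp_c, k=0.05, theta=1.05):
    """Sketch of an empirical gravity-sewer methane model: production
    scales with the surface-area-to-volume ratio a_v (1/m) and hydraulic
    retention time hrt_h (h), with an Arrhenius-style temperature
    correction theta**(T - 20). k and theta are illustrative only."""
    return k * a_v * hrt_h * theta ** (temp_c - 20.0)

# Hypothetical sewer reach: A/V = 4 1/m, HRT = 6 h
cool = methane_production(4.0, 6.0, 20.0)   # at the 20 °C reference
warm = methane_production(4.0, 6.0, 30.0)   # 10 °C warmer
```

With theta = 1.05, a 10 °C temperature rise increases predicted production by a factor of 1.05**10 (about 63%), illustrating why wastewater temperature enters the model.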

  10. The Geostationary Operational Environmental Satellite (GOES) Product Generation System

    NASA Technical Reports Server (NTRS)

    Haines, S. L.; Suggs, R. J.; Jedlovec, G. J.

    2004-01-01

    The Geostationary Operational Environmental Satellite (GOES) Product Generation System (GPGS) is introduced and described. GPGS is a set of computer programs developed and maintained at the Global Hydrology and Climate Center and is designed to generate meteorological data products using visible and infrared measurements from the GOES-East Imager and Sounder instruments. The products that are produced by GPGS are skin temperature, total precipitable water, cloud top pressure, cloud albedo, surface albedo, and surface insolation. A robust cloud mask is also generated. The retrieval methodology for each product is described to include algorithm descriptions and required inputs and outputs for the programs. Validation is supplied where applicable.

  11. Fast emission spectroscopy for monitoring condensed carbon in detonation products of oxygen-deficient high explosives

    NASA Astrophysics Data System (ADS)

    Poeuf, Sandra; Baudin, Gerard; Genetier, Marc; Lefrançois, Alexandre; Cinnayya, Ashwin; Laurent, Jacquet

    2017-06-01

A new thermochemical code, SIAME, dedicated to the study of high explosives, is currently being developed. New experimental data on the expansion of detonation products are required to validate the code, with a particular focus on solid carbon products. Two different high explosive formulations are used: a melt-cast one (RDX/TNT 60/40 % wt.) and a pressed one (HMX/Viton 96/4 % wt.). The experimental setup allows the expansion of the products at pressures below 1 GPa in an inert medium (vacuum, helium, nitrogen and PMMA). The results of fast emission dynamic spectroscopy measurements used to monitor the detonation carbon products are reported. Two spectral signatures are identified: the first is associated with ionized gases and the second with carbon thermal radiation. The experimental spectral lines are compared with simulated spectra. The trajectory of the shock wave front is continuously recorded with a high-frequency interferometer. Comparisons with numerical simulations using the hydrodynamic code Ouranos have been performed. These two measurements, using the different inert media, enable one step forward in the validation of the detonation product equations of state implemented in the SIAME code.

  12. Skill Transfer and Virtual Training for IND Response Decision-Making: Project Summary and Next Steps

    DTIC Science & Technology

    2016-04-12

are likely to be very productive partners—independent video-game developers and academic game degree programs—are not familiar with working with...experimental validation. • Independent Video-Game Developers. Small companies and individuals that pursue video-game design and development can be...complexity, such as an improvised nuclear device (IND) detonation. The effort has examined game-based training methods to determine their suitability

  13. Skill Transfer and Virtual Training for IND Response Decision-Making: Project Summary and Next Steps

    DTIC Science & Technology

    2016-01-01

to be very productive partners—independent video-game developers and academic game degree programs—are not familiar with working with government...experimental validation. • Independent Video-Game Developers. Small companies and individuals that pursue video-game design and development can be inexpensive...Emergency Management Agency (FEMA) that examines alternative mechanisms for training and evaluation of emergency managers (EMs) to augment and

  14. Mathematical model for dynamic cell formation in fast fashion apparel manufacturing stage

    NASA Astrophysics Data System (ADS)

    Perera, Gayathri; Ratnayake, Vijitha

    2018-05-01

This paper presents a mathematical programming model for dynamic cell formation that minimizes changeover-related costs (i.e., machine relocation costs and machine setup costs) and inter-cell material handling cost, to cope with the volatile production environments of the apparel manufacturing industry. The model is formulated from the findings of a comprehensive literature review. The developed model is validated using data collected from three different factories in the apparel industry manufacturing fast fashion products. A program code was developed using the Lingo 16.0 software package to generate optimal cells for the developed model and to determine the possible cost-saving percentage when the existing layouts used in the three factories are replaced by the generated optimal cells. The optimal cells generated by the developed mathematical model result in significant cost savings when compared with the existing product layouts used in the production/assembly departments of the selected factories. The developed model can be considered effective in minimizing the considered cost terms in the dynamic production environment of fast fashion apparel manufacturing. The findings of this paper can be used in further research on minimizing changeover-related costs in the fast fashion apparel production stage.
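The objective the model minimizes combines three cost terms: machine relocations, machine setups, and inter-cell material handling. As a sketch of that objective (all counts and unit costs below are hypothetical, not the paper's data):

```python
def layout_cost(relocations, relocation_cost,
                setups, setup_cost,
                intercell_trips, handling_cost):
    """Total cost of a candidate cell configuration for one planning
    period: machine relocations + machine setups + inter-cell material
    handling, the three terms the dynamic cell formation model minimizes."""
    return (relocations * relocation_cost
            + setups * setup_cost
            + intercell_trips * handling_cost)

# Hypothetical configuration: 2 relocations, 3 setups, 10 inter-cell trips
total = layout_cost(2, 100.0, 3, 50.0, 10, 5.0)  # 400.0
```

A solver such as the Lingo code mentioned above searches over machine-to-cell assignments per period to minimize exactly this kind of sum.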

  15. Advances in identification and validation of protein targets of natural products without chemical modification.

    PubMed

    Chang, J; Kim, Y; Kwon, H J

    2016-05-04

Covering: up to February 2016. Identification of the target proteins of natural products is pivotal to understanding their mechanisms of action and to developing natural products as molecular probes and potential therapeutic drugs. Affinity chromatography of immobilized natural products has conventionally been used to identify target proteins, and has yielded good results. However, this method has limitations, in that labeling or tagging for immobilization and affinity purification often results in reduced or altered activity of the natural product. New strategies have recently been developed and applied to identify the target proteins of natural products and synthetic small molecules without chemical modification of the natural product. These direct and indirect methods for target identification of label-free natural products include drug affinity responsive target stability (DARTS), stability of proteins from rates of oxidation (SPROX), the cellular thermal shift assay (CETSA), thermal proteome profiling (TPP), and bioinformatics-based analysis of connectivity. This review focuses on and reports case studies of the latest advances in target protein identification methods for label-free natural products. The integration of newly developed technologies will provide new insights and highlight the value of natural products for use as biological probes and new drug candidates.

  16. Multiscale soil moisture estimates using static and roving cosmic-ray soil moisture sensors

    NASA Astrophysics Data System (ADS)

    McJannet, David; Hawdon, Aaron; Baker, Brett; Renzullo, Luigi; Searle, Ross

    2017-12-01

Soil moisture plays a critical role in land surface processes, and as such there has been a recent increase in the number and resolution of satellite soil moisture observations and in the development of land surface process models with ever increasing resolution. Despite these developments, validation and calibration of these products have been limited by a lack of observations at corresponding scales. A recently developed mobile soil moisture monitoring platform, known as the rover, offers opportunities to overcome this scale issue. This paper describes methods, results and testing of soil moisture estimates produced using rover surveys on a range of scales that are commensurate with model and satellite retrievals. Our investigation involved static cosmic-ray neutron sensors and rover surveys across both broad (36 × 36 km at 9 km resolution) and intensive (10 × 10 km at 1 km resolution) scales in a cropping district in the Mallee region of Victoria, Australia. We describe approaches for converting rover survey neutron counts to soil moisture and discuss the factors controlling soil moisture variability. We use independent gravimetric and modelled soil moisture estimates collected across both space and time to validate rover soil moisture products. Measurements revealed that temporal patterns in soil moisture were preserved through time, and regression modelling approaches were utilised to produce time series of property-scale soil moisture which may also have applications in calibration and validation studies or local farm management. Intensive-scale rover surveys produced reliable soil moisture estimates at 1 km resolution, while broad-scale surveys produced soil moisture estimates at 9 km resolution. We conclude that the multiscale soil moisture products produced in this study are well suited to future analysis of satellite soil moisture retrievals and finer-scale soil moisture models.
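Converting cosmic-ray neutron counts to soil moisture is commonly done with the Desilets-style calibration function (fewer neutrons over wetter soil). The sketch below uses the standard published shape coefficients, but the dry-soil count rate `n0` and bulk density are hypothetical and, in practice, site-calibrated, so this is illustrative rather than the authors' exact procedure:

```python
def neutron_to_soil_moisture(n, n0, rho_bd=1.4,
                             a0=0.0808, a1=0.372, a2=0.115):
    """Desilets-style calibration widely used for cosmic-ray probes:
    gravimetric water content = a0/(N/N0 - a1) - a2, converted to
    volumetric (cm^3/cm^3) via the soil bulk density rho_bd (g/cm^3).
    N0 is the count rate over dry soil, found by field calibration."""
    theta_g = a0 / (n / n0 - a1) - a2
    return theta_g * rho_bd

# Hypothetical counts against a hypothetical N0 of 2600 counts/h
wet = neutron_to_soil_moisture(2000, 2600)   # fewer neutrons -> wetter
dry = neutron_to_soil_moisture(2400, 2600)
```

The inverse relationship (lower count rate, higher moisture) is what lets both the static sensors and the rover surveys map soil moisture from neutron intensity alone.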

  17. Development and Validation of National Phenology Data Products

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Rosemartin, A.; Crimmins, T. M.; Gerst, K.

    2015-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database (NPDb) maintained by USA-NPN contains almost 6 million in-situ observation records for plants and animals for the period 1954-2015. These data have been used in a number of science, conservation and natural resource management applications, including national assessments of historical and potential future trends in phenology and regional assessments of spatio-temporal variation in organismal activity. Customizable downloads of raw or summarized data, freely available from www.usanpn.org, are accompanied by metadata, data-use and data-attribution policies, published protocols, version/change control, documentation of QA/QC, and links to publications that use historical or contemporary data held in the NPDb. The National Coordinating Office of USA-NPN is developing a suite of standard data products (e.g., quality-controlled raw or summarized status data) and tools (e.g., a new visualization tool released in 2015) to facilitate use and application by a diverse set of data users. This presentation outlines a workflow for the development and validation of spatially gridded phenology products, drawing on recent work related to the Spring Indices now included in two national Indicator systems. In addition, we discuss how we engage observers to collect in-situ data to validate model predictions. Preliminary analyses indicate high fidelity between historical in-situ and modeled observations on a national scale, but with considerable variability at the regional scale. Regions with strong differences between expected and observed data are identified and will be the focus of in-situ data collection campaigns using USA-NPN's Nature's Notebook on-line user interface (www.nn.usanpn.org).

  18. Identification and Structure Elucidation of Forced Degradation Products of the Novel Propionic acid Derivative Loxoprofen: Development of Stability-Indicating Chromatographic Methods Validated as per ICH Guidelines.

    PubMed

    Eissa, Maya S; Abd El-Sattar, Osama I

    2017-04-01

    Loxoprofen sodium (LOX) is a recently developed novel propionic acid derivative. Owing to its instability under both hydrolytic and oxidative conditions, the development of simple, rapid, and sensitive methods for its determination in the presence of its possible forced degradation products is essential. Two simple chromatographic methods with ultraviolet (UV) detection were developed: high-performance thin-layer chromatography (HPTLC) and high-performance liquid chromatography (HPLC). In the HPTLC-densitometric method, the separation of LOX from its degradation products was achieved using silica gel F254 plates and toluene:acetone:acetic acid (1.8:1.0:0.1, v/v/v) as the developing system, followed by densitometric scanning at 220 nm. In the HPLC-UV method, the separation was performed using an isocratic elution system of acetonitrile: 0.15% triethylamine (pH 2.2) (50:50, v/v) on a C18 analytical column. The flow rate was optimized at 1.0 mL·min-1 and UV detection was achieved at 220 nm. Validation was performed in accordance with the International Conference on Harmonization guidelines, and the methods were successfully applied for the determination of LOX in its pharmaceutical preparation. The results obtained were statistically compared with those obtained by the official HPLC method, and no significant difference was found with respect to precision and accuracy. The possible degradation product formed under alkaline hydrolytic conditions and the product of oxidative degradation with hydrogen peroxide were identified and structurally elucidated by infrared and mass spectrometry analyses. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
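    The statistical comparison with the official method described above is conventionally done with a Student's t-test on the means (accuracy) and an F-test on the variances (precision). A minimal sketch, using hypothetical % recovery data and standard table critical values rather than any values from the paper:

```python
import statistics

def compare_methods(proposed, official, t_crit, f_crit):
    """Student's t-test (means) and F-test (variances) for two
    independent sets of assay results, e.g. % recovery."""
    n1, n2 = len(proposed), len(official)
    m1, m2 = statistics.mean(proposed), statistics.mean(official)
    v1, v2 = statistics.variance(proposed), statistics.variance(official)
    # pooled-variance t statistic (df = n1 + n2 - 2)
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    t = abs(m1 - m2) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5
    # F ratio: larger variance over the smaller one
    f = max(v1, v2) / min(v1, v2)
    return {"t": t, "F": f,
            "accuracy_ok": t < t_crit,    # means not significantly different
            "precision_ok": f < f_crit}   # variances not significantly different
```

    With six replicates per method, the tabulated critical values would be t(0.05, df=10) = 2.228 and F(0.05, 5, 5) = 5.05.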

  19. Approach for validating actinide and fission product compositions for burnup credit criticality safety analyses

    DOE PAGES

    Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...

    2014-11-01

    This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in the effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
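    The Monte Carlo uncertainty sampling step can be sketched as follows. This is an illustrative toy, not the SCALE/ISG-8 procedure: the surrogate keff model, the nuclide list and sensitivities, and the normal correction-factor distributions are all hypothetical stand-ins for the paper's calculated compositions and measurement-derived probability distributions.

```python
import random
import statistics

def mc_keff_bias(keff_model, nominal, factor_dists, n_samples=2000, seed=7):
    """Perturb calculated nuclide concentrations with sampled correction
    factors and collect the resulting spread of keff about the nominal
    composition, yielding a bias and bias uncertainty."""
    rng = random.Random(seed)
    k_nom = keff_model(nominal)
    deltas = []
    for _ in range(n_samples):
        perturbed = {nuc: conc * rng.gauss(*factor_dists[nuc])
                     for nuc, conc in nominal.items()}
        deltas.append(keff_model(perturbed) - k_nom)
    return statistics.mean(deltas), statistics.stdev(deltas)

# Hypothetical linear surrogate: keff responds linearly to each nuclide's
# deviation from a unit reference concentration (absorbers are negative).
SENS = {"u235": 0.35, "pu239": 0.25, "sm149": -0.04, "gd155": -0.03}

def toy_keff(comp):
    return 0.94 + sum(s * (comp[n] - 1.0) for n, s in SENS.items())
```

    With unbiased correction factors (mean 1.0), the sampled bias is near zero and the spread reflects the combined concentration uncertainties.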

  20. Development of a Mathematics Learning Strategy Module Based on Higher Order Thinking Skills (HOTS) to Improve Mathematical Communication and Self-Efficacy of Mathematics Department Students

    NASA Astrophysics Data System (ADS)

    Andriani, Ade; Dewi, Izwita; Halomoan, Budi

    2018-03-01

    In general, this research was conducted to improve the quality of lectures on mathematics learning strategy in the Mathematics Department. Its specific objective is to develop a mathematics learning strategy instrument based on Higher Order Thinking Skills (HOTS) that can be used to improve the mathematical communication and self-efficacy of mathematics education students. This is development research (Research & Development), which aims to develop a new product or improve an existing one. It follows the Four-D model, consisting of four stages: defining, designing, developing, and disseminating. The research instruments are the validation sheet and the student response sheet.

  1. Development and Validation of a Fluorescent Multiplexed Immunoassay for Measurement of Transgenic Proteins in Cotton (Gossypium hirsutum).

    PubMed

    Yeaman, Grant R; Paul, Sudakshina; Nahirna, Iryna; Wang, Yongcheng; Deffenbaugh, Andrew E; Liu, Zi Lucy; Glenn, Kevin C

    2016-06-22

    In order to provide farmers with better and more customized alternatives to improve yields, combining multiple genetically modified (GM) traits into a single product (called stacked trait crops) is becoming prevalent. Trait protein expression levels are used to characterize new GM products and establish exposure limits, two important components of safety assessment. Developing a multiplexed immunoassay capable of measuring all trait proteins in the same sample allows for higher sample throughput and savings in both time and expense. Fluorescent (bead-based) multiplexed immunoassays (FMI) have gained wide acceptance in mammalian research and in clinical applications. In order to facilitate the measurement of stacked GM traits, we have developed and validated an FMI assay that can measure five different proteins (β-glucuronidase, neomycin phosphotransferase II, Cry1Ac, Cry2Ab2, and CP4 5-enolpyruvyl-shikimate-3-phosphate synthase) present in cotton leaf from a stacked trait product. Expression levels of the five proteins determined by FMI in cotton leaf tissues have been evaluated relative to expression levels determined by enzyme-linked immunosorbent assays (ELISAs) of the individual proteins and shown to be comparable. The FMI met characterization requirements similar to those used for ELISA. Therefore, it is reasonable to conclude that FMI results are equivalent to those determined by conventional individual ELISAs to measure GM protein expression levels in stacked trait products but with significantly higher throughput, reduced time, and more efficient use of resources.

  2. Comparison of Satellite and Aircraft Measurements of Cloud Microphysical Properties in Icing Conditions During ATREC/AIRS-II

    NASA Technical Reports Server (NTRS)

    Nguyen, Louis; Minnis, Patrick; Spangenberg, Douglas A.; Nordeen, Michele L.; Palikonda, Rabindra; Khaiyer, Mandana M.; Gultepe, Ismail; Reehorst, Andrew L.

    2004-01-01

    Satellites are ideal for continuous monitoring of aircraft icing conditions in many situations over extensive areas. The satellite imager data are used to diagnose a number of cloud properties that can be used to develop icing intensity indices. Developing and validating these indices requires comparison with objective "cloud truth" data in addition to conventional pilot reports (PIREPS) of icing conditions. Minnis et al. examined the relationships between PIREPS icing and satellite-derived cloud properties. The Atlantic-THORPEX Regional Campaign (ATReC) and the second Alliance Icing Research Study (AIRS-II) field programs were conducted over the northeastern USA and southeastern Canada during late 2003 and early 2004. The aircraft and surface measurements are concerned primarily with the icing characteristics of clouds and, thus, are ideal for providing some validation information for the satellite remote sensing product. This paper starts the process of comparing cloud properties and icing indices derived from the Geostationary Operational Environmental Satellite (GOES) with in situ aircraft measurements of several cloud properties taken during these campaigns. The comparisons include cloud phase, particle size, icing intensity, base and top altitudes, temperatures, and liquid water path. The results of this study are crucial for developing a more reliable and objective icing product from satellite data. This icing product, currently being derived from GOES data over the USA, is an important complement to more conventional products based on forecasts and PIREPS.

  3. Development and validation of breeder-friendly KASPar markers for er1, a powdery mildew resistance gene in pea (Pisum sativum L.)

    USDA-ARS?s Scientific Manuscript database

    Powdery mildew of pea is caused by Erysiphe pisi DC and is a serious threat to pea (Pisum sativum L.) production throughout much of the world. Development and utilization of genetic resistance to powdery mildew is considered an effective and sustainable strategy to manage this disease. One gene, er1...

  4. Climatological Processing of Radar Data for the TRMM Ground Validation Program

    NASA Technical Reports Server (NTRS)

    Kulie, Mark; Marks, David; Robinson, Michael; Silberstein, David; Wolff, David; Ferrier, Brad; Amitai, Eyal; Fisher, Brad; Wang, Jian-Xin; Augustine, David

    2000-01-01

    The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November, 1997. The main purpose of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented. The primary goal of TRMM GV is to provide basic validation of satellite-derived precipitation measurements over monthly climatologies for the following primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, research analysts at NASA Goddard Space Flight Center (GSFC) generate standardized TRMM GV products using quality-controlled ground-based radar data from the four primary GV sites as input. This presentation will provide an overview of the TRMM GV climatological processing system. A description of the data flow between the primary GV sites, NASA GSFC, and the TRMM Science and Data Information System (TSDIS) will be presented. The radar quality control algorithm, which features eight adjustable height and reflectivity parameters, and its effect on monthly rainfall maps will be described. The methodology used to create monthly, gauge-adjusted rainfall products for each primary site will also be summarized. The standardized monthly rainfall products are developed in discrete, modular steps with distinct intermediate products. These developmental steps include: (1) extracting radar data over the locations of rain gauges, (2) merging rain gauge and radar data in time and space with user-defined options, (3) automated quality control of radar and gauge merged data by tracking accumulations from each instrument, and (4) deriving Z-R relationships from the quality-controlled merged data over monthly time scales. A summary of recently reprocessed official GV rainfall products available for TRMM science users will be presented. 
Updated basic standardized product results and trends involving monthly accumulation, Z-R relationships, and gauge statistics for each primary GV site will also be displayed.
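    Step (4) above, deriving a Z-R relationship from the quality-controlled merged radar-gauge data, amounts to fitting Z = aR^b, which is linear in log10 space. A minimal sketch under that assumption (the actual GV processing applies far more screening than this):

```python
import math

def fit_zr(radar_dbz, gauge_rain):
    """Fit Z = a * R**b by linear least squares in log10 space from
    time/space-merged radar reflectivity (dBZ) and gauge rain rate (mm/h)."""
    pairs = [(z, r) for z, r in zip(radar_dbz, gauge_rain) if r > 0]
    xs = [math.log10(r) for _, r in pairs]   # log10 R
    ys = [z / 10.0 for z, _ in pairs]        # dBZ -> log10 Z
    n = len(pairs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = 10.0 ** (my - b * mx)
    return a, b
```

    The exponent b comes straight from the log-log slope, and a from the intercept, so monthly Z-R coefficients can be recomputed from each month's merged data.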

  5. Quality Assessment of Landsat Surface Reflectance Products Using MODIS Data

    NASA Technical Reports Server (NTRS)

    Feng, Min; Huang, Chengquan; Channan, Saurabh; Vermote, Eric; Masek, Jeffrey G.; Townshend, John R.

    2012-01-01

    Surface reflectance adjusted for atmospheric effects is a primary input for land cover change detection and for developing many higher level surface geophysical parameters. With the development of automated atmospheric correction algorithms, it is now feasible to produce large quantities of surface reflectance products using Landsat images. Validation of these products requires in situ measurements, which either do not exist or are difficult to obtain for most Landsat images. The surface reflectance products derived using data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS), however, have been validated more comprehensively. Because MODIS on the Terra platform and Landsat 7 follow the same orbit only half an hour apart, and each of the 6 Landsat spectral bands overlaps with a MODIS band, good agreement between MODIS and Landsat surface reflectance values can be considered an indicator of the reliability of the Landsat products, while disagreements may suggest potential quality problems that need to be investigated further. Here we developed a system called the Landsat-MODIS Consistency Checking System (LMCCS). This system automatically matches Landsat data with MODIS observations acquired on the same date over the same locations and uses them to calculate a set of agreement metrics. To maximize its portability, Java and open-source libraries were used in developing this system, and object-oriented programming (OOP) principles were followed to make it more flexible for future expansion. As a highly automated system designed to run as a stand-alone package or as a component of other Landsat data processing systems, this system can be used to assess the quality of essentially every Landsat surface reflectance image where spatially and temporally matching MODIS data are available. 
The effectiveness of this system was demonstrated by using it to assess preliminary surface reflectance products derived from the Global Land Survey (GLS) Landsat images for the 2000 epoch. As surface reflectance will likely be a standard product for future Landsat missions, the approach developed in this study can be adapted as an operational quality assessment system for those missions.
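    The agreement metrics a system like LMCCS computes over matched pixel pairs can be sketched as below; the specific metric set here (mean bias, RMSD, correlation) is illustrative, not necessarily the exact set LMCCS reports, and the reflectance values are hypothetical.

```python
def agreement_metrics(landsat, modis):
    """Per-band agreement between spatially/temporally matched Landsat
    and MODIS surface reflectance samples."""
    n = len(landsat)
    bias = sum(l - m for l, m in zip(landsat, modis)) / n
    rmsd = (sum((l - m) ** 2 for l, m in zip(landsat, modis)) / n) ** 0.5
    ml, mm = sum(landsat) / n, sum(modis) / n
    cov = sum((l - ml) * (m - mm) for l, m in zip(landsat, modis))
    sl = sum((l - ml) ** 2 for l in landsat) ** 0.5
    sm = sum((m - mm) ** 2 for m in modis) ** 0.5
    r = cov / (sl * sm)  # Pearson correlation of the matched samples
    return {"bias": bias, "rmsd": rmsd, "r": r}
```

    A large bias or RMSD for one band, with the others agreeing well, is the kind of signal that flags an image for further quality investigation.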

  6. A methodology to estimate representativeness of LAI station observation for validation: a case study with Chinese Ecosystem Research Network (CERN) in situ data

    NASA Astrophysics Data System (ADS)

    Xu, Baodong; Li, Jing; Liu, Qinhuo; Zeng, Yelu; Yin, Gaofei

    2014-11-01

    Leaf Area Index (LAI) is a key vegetation biophysical variable. To use remote sensing LAI products effectively in various disciplines, it is critical to understand their accuracy. The common method for validating LAI products is to first establish an empirical relationship between field data and high-resolution imagery to derive LAI maps, and then aggregate the high-resolution LAI maps to match the moderate-resolution LAI products. This method is only suited to small regions, and its measurement frequency is limited. Therefore, continuous LAI observations from ground station networks are important for validating multi-temporal LAI products. However, because of the scale mismatch between point observations at ground stations and pixel observations, direct comparison introduces scale error. The representativeness of a station measurement within the product pixel must therefore be evaluated for reasonable validation. In this paper, a case study with Chinese Ecosystem Research Network (CERN) in situ data introduces a methodology for estimating the representativeness of LAI station observations for validating LAI products. We first analyzed indicators for evaluating observation representativeness and then graded the station measurement data. Finally, the LAI measurements that represent the pixel scale were used to validate the MODIS, GLASS and GEOV1 LAI products. The results show that the best agreement is reached between GLASS and GEOV1, while the lowest uncertainty is achieved by GEOV1, followed by GLASS and MODIS. We conclude that ground station measurements can objectively validate multi-temporal LAI products when screened with these representativeness indicators, which also improves the reliability of remote sensing product validation.
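    The aggregation step of the common validation method, averaging a high-resolution LAI map up to the coarser product grid, can be sketched as follows. This is a plain block-mean, assuming the map dimensions are exact multiples of the aggregation block size:

```python
def aggregate_lai(grid, block):
    """Aggregate a high-resolution LAI map (list of rows) to a coarser
    grid by averaging non-overlapping block x block windows."""
    nrows, ncols = len(grid), len(grid[0])
    out = []
    for i in range(0, nrows, block):
        row = []
        for j in range(0, ncols, block):
            vals = [grid[a][b]
                    for a in range(i, i + block)
                    for b in range(j, j + block)]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out
```

    For example, a 30 m LAI map would be aggregated with a block of roughly 33 cells to compare against a 1 km product pixel.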

  7. Variability of Creativity Judgments

    ERIC Educational Resources Information Center

    Caroff, Xavier; Besancon, Maud

    2008-01-01

    The Consensual Assessment Technique (CAT), developed by Amabile [Amabile, T.M. (1982). "Social psychology of creativity: A consensual assessment technique." "Journal of Personality and Social Psychology," 43, 997-1013], is frequently used to evaluate the creativity of productions. Judgments obtained with CAT are usually reliable and valid.…

  8. An Electrochemical Impedance Spectroscopy-Based Technique to Identify and Quantify Fermentable Sugars in Pineapple Waste Valorization for Bioethanol Production

    PubMed Central

    Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás

    2015-01-01

    Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify the fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical data treatment then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their respective combinations. The resulting prediction models are robust and reliable, and they are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for monitoring enzymatic hydrolysis. PMID:26378537

  9. Agroecology and Health: Lessons from Indigenous Populations.

    PubMed

    Suárez-Torres, José; Suárez-López, José Ricardo; López-Paredes, Dolores; Morocho, Hilario; Cachiguango-Cachiguango, Luis Enrique; Dellai, William

    2017-06-01

    The article aims to systematize and disseminate the main contributions of indigenous ancestral wisdom to the agroecological production of food, especially in Latin America. For this purpose, it is necessary to ask whether such knowledge can be accepted by academic research groups and international forums as a valid alternative that could contribute to overcoming the world's nutritional problems. Although no new findings are being made, the validity of ancestral knowledge and agroecology is recognized by scientific research and by international forums organized by agencies of the United Nations. These recommend that governments implement them in their development policies and allocate funds to support these initiatives. Agroecology and ancestral knowledge are being adopted by a growing number of organizations, indigenous peoples and social groups in various parts of the world, as development alternatives that respond to local needs and worldviews. Their productive potential is progressively being recognized at an international level as a model that contributes to improving people's nutritional condition.

  10. Development and Validation of RP-LC Method for the Determination of Cinnarizine/Piracetam and Cinnarizine/Heptaminol Acefyllinate in Presence of Cinnarizine Reported Degradation Products

    PubMed Central

    EL-Houssini, Ola M.; Zawilla, Nagwan H.; Mohammad, Mohammad A.

    2013-01-01

    A specific stability-indicating reverse-phase liquid chromatography (RP-LC) assay method (SIAM) was developed for the determination of cinnarizine (Cinn)/piracetam (Pira) and cinnarizine (Cinn)/heptaminol acefyllinate (Hept) in the presence of the reported degradation products of Cinn. A C18 column and a gradient mobile phase were applied for good resolution of all peaks. Detection was achieved at 210 nm and 254 nm for Cinn/Pira and Cinn/Hept, respectively. The responses were linear over concentration ranges of 20–200, 20–1000, and 25–1000 μg mL−1 for Cinn, Pira, and Hept, respectively. The proposed method was validated for linearity, accuracy, repeatability, intermediate precision, and robustness via statistical analysis of the data. The method was shown to be precise, accurate, reproducible, sensitive, and selective for the analysis of Cinn/Pira and Cinn/Hept in laboratory-prepared mixtures and in pharmaceutical formulations. PMID:24137049

  11. Quantitative Determination of α-Arbutin, β-Arbutin, Kojic Acid, Nicotinamide, Hydroquinone, Resorcinol, 4-Methoxyphenol, 4-Ethoxyphenol, and Ascorbic Acid from Skin Whitening Products by HPLC-UV.

    PubMed

    Wang, Yan-Hong; Avonto, Cristina; Avula, Bharathi; Wang, Mei; Rua, Diego; Khan, Ikhlas A

    2015-01-01

    An HPLC-UV method was developed for the quantitative analysis of nine skin whitening agents in a single injection. These compounds are α-arbutin, β-arbutin, kojic acid, nicotinamide, resorcinol, ascorbic acid, hydroquinone, 4-methoxyphenol, and 4-ethoxyphenol. The separation was achieved on a reversed-phase C18 column within 30 min. The mobile phase was composed of water and methanol, both containing 0.1% acetic acid (v/v). The stability of the analytes was evaluated at different pH values between 2.3 and 7.6, and the extraction procedure was validated for different types of skin whitening product matrixes, which included two creams, a soap bar, and a capsule. The best solvent system for sample preparation was 20 mM NaH2PO4 containing 10% methanol at pH 2.3. The analytical method was validated for accuracy, precision, LOD, and LOQ. The developed HPLC-UV method was applied for the quantitation of the nine analytes in 59 skin whitening products including creams, lotions, sera, foams, gels, mask sheets, soap bars, tablets, and capsules.
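    The LOD and LOQ reported for validated methods like this one are commonly computed from the calibration line as 3.3σ/S and 10σ/S, where σ is the residual standard deviation and S the slope. A sketch of that calculation with hypothetical calibration data:

```python
def lod_loq(conc, signal):
    """ICH-style LOD/LOQ from a calibration line: fit signal = S*c + b
    by least squares, take the residual standard deviation sigma, then
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S."""
    n = len(conc)
    mc, ms = sum(conc) / n, sum(signal) / n
    S = sum((c - mc) * (s - ms) for c, s in zip(conc, signal)) / \
        sum((c - mc) ** 2 for c in conc)
    b = ms - S * mc
    resid = [s - (S * c + b) for c, s in zip(conc, signal)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5  # n-2 dof
    return 3.3 * sigma / S, 10.0 * sigma / S
```

    By construction LOQ is about three times LOD, and both shrink as the calibration scatter decreases.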

  12. Spatially Varying Spectral Thresholds for MODIS Cloud Detection

    NASA Technical Reports Server (NTRS)

    Haines, S. L.; Jedlovec, G. J.; Lafontaine, F.

    2004-01-01

    The EOS science team has developed an elaborate global MODIS cloud detection procedure, and the resulting MODIS product (MOD35) is used in the retrieval process of several geophysical parameters to mask out clouds. While the global application of the cloud detection approach appears quite robust, the product has some shortcomings on the regional scale, often over-determining clouds in a variety of settings, particularly at night. This over-determination of clouds can cause a reduction in the spatial coverage of MODIS-derived clear-sky products. To minimize this problem, a new regional cloud detection method for use with MODIS data has been developed at NASA's Global Hydrology and Climate Center (GHCC). The approach is similar to that used by the GHCC for GOES data over the continental United States. Several spatially varying thresholds are applied to MODIS spectral data to produce a set of tests for detecting clouds. The thresholds are valid for each MODIS orbital pass, and are derived from 20-day composites of GOES channels with similar wavelengths to MODIS. This paper and accompanying poster will introduce the GHCC MODIS cloud mask, provide some examples, and present some preliminary validation.
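    A single spatially varying threshold test of the kind described, flagging a pixel as cloudy when its observed value falls more than a local offset below the composite-derived clear-sky value, can be sketched as below. The choice of an 11 μm brightness-temperature test and the offsets are illustrative only; the GHCC mask combines several such tests.

```python
def cloud_mask(bt11, composite_bt11, offsets):
    """One spatially varying threshold test: a pixel is cloudy when its
    11-um brightness temperature (K) falls more than the local offset
    below the clear-sky composite value for that location."""
    return [[bt < comp - off
             for bt, comp, off in zip(brow, crow, orow)]
            for brow, crow, orow in zip(bt11, composite_bt11, offsets)]
```

    Because the offsets vary pixel by pixel, the test can be made tolerant in regions where the composite is noisy and strict where it is stable.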

  13. Validation and Spatiotemporal Analysis of CERES Surface Net Radiation Product

    DOE PAGES

    Jia, Aolin; Jiang, Bo; Liang, Shunlin; ...

    2016-01-23

    The Clouds and the Earth’s Radiant Energy System (CERES) generates one of the few global satellite radiation products. The CERES ARM Validation Experiment (CAVE) has been providing long-term in situ observations for the validation of the CERES products. However, the number of these sites is low and their distribution is globally sparse, and the surface net radiation product in particular has not yet been rigorously validated. Therefore, additional validation efforts are highly required to determine the accuracy of the CERES radiation products. In this study, global land surface measurements were comprehensively collected for use in the validation of the CERES net radiation (Rn) product on a daily (340 sites) and a monthly (260 sites) basis, respectively. The validation results demonstrated that the CERES Rn product was, overall, highly accurate. The daily validations had a Mean Bias Error (MBE) of 3.43 W·m−2, a Root Mean Square Error (RMSE) of 33.56 W·m−2, and an R2 of 0.79, and the monthly validations had an MBE of 3.40 W·m−2, an RMSE of 25.57 W·m−2, and an R2 of 0.84. The accuracy was slightly lower for the high latitudes. Following the validation, the monthly CERES Rn product, from March 2000 to July 2014, was used for further analysis. We analysed the global spatiotemporal variation of Rn over the measurement period. In addition, two hot spot regions, the southern Great Plains and south-central Africa, were selected for use in determining the driving factors, or attribution, of the Rn variation. We determined that Rn over the southern Great Plains decreased by −0.33 W·m−2 per year, driven mainly by changes in surface green vegetation and precipitation. In south-central Africa, Rn decreased at a rate of −0.63 W·m−2 per year, the major driving factor of which was surface green vegetation.
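    The reported trends (e.g., −0.33 and −0.63 W·m−2 per year) are slopes of Rn against time. A least-squares sketch of that trend estimate, using synthetic data rather than the CERES series:

```python
def rn_trend(years, rn):
    """Ordinary least-squares slope of net radiation (W·m−2) against
    time in years, i.e. the trend in W·m−2 per year."""
    n = len(years)
    my = sum(years) / n
    mr = sum(rn) / n
    num = sum((y - my) * (r - mr) for y, r in zip(years, rn))
    den = sum((y - my) ** 2 for y in years)
    return num / den
```

    Applied to annual means of the monthly product over 2000-2014, this returns the per-year trend directly in W·m−2.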

  14. Validation and Spatiotemporal Analysis of CERES Surface Net Radiation Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, Aolin; Jiang, Bo; Liang, Shunlin

    The Clouds and the Earth’s Radiant Energy System (CERES) generates one of the few global satellite radiation products. The CERES ARM Validation Experiment (CAVE) has been providing long-term in situ observations for the validation of the CERES products. However, the number of these sites is low and their distribution is globally sparse, and the surface net radiation product in particular has not yet been rigorously validated. Therefore, additional validation efforts are highly required to determine the accuracy of the CERES radiation products. In this study, global land surface measurements were comprehensively collected for use in the validation of the CERES net radiation (Rn) product on a daily (340 sites) and a monthly (260 sites) basis, respectively. The validation results demonstrated that the CERES Rn product was, overall, highly accurate. The daily validations had a Mean Bias Error (MBE) of 3.43 W·m−2, a Root Mean Square Error (RMSE) of 33.56 W·m−2, and an R2 of 0.79, and the monthly validations had an MBE of 3.40 W·m−2, an RMSE of 25.57 W·m−2, and an R2 of 0.84. The accuracy was slightly lower for the high latitudes. Following the validation, the monthly CERES Rn product, from March 2000 to July 2014, was used for further analysis. We analysed the global spatiotemporal variation of Rn over the measurement period. In addition, two hot spot regions, the southern Great Plains and south-central Africa, were selected for use in determining the driving factors, or attribution, of the Rn variation. We determined that Rn over the southern Great Plains decreased by −0.33 W·m−2 per year, driven mainly by changes in surface green vegetation and precipitation. In south-central Africa, Rn decreased at a rate of −0.63 W·m−2 per year, the major driving factor of which was surface green vegetation.

  15. Consensus categorization of cheese based on water activity and pH-A rational approach to systemizing cheese diversity.

    PubMed

    Trmčić, A; Ralyea, R; Meunier-Goddik, L; Donnelly, C; Glass, K; D'Amico, D; Meredith, E; Kehler, M; Tranchina, N; McCue, C; Wiedmann, M

    2017-01-01

    Development of science-based interventions in raw milk cheese production is challenging due to the large diversity of production procedures and final products. Without an agreed upon categorization scheme, science-based food safety evaluations and validation of preventive controls would have to be completed separately on each individual cheese product, which is not feasible considering the large diversity of products and the typically small scale of production. Thus, a need exists to systematically group raw milk cheeses into logically agreed upon categories to be used for food safety evaluations. This paper proposes and outlines one such categorization scheme that provides for 30 general categories of cheese. As a base for this systematization and categorization of raw milk cheese, we used Table B of the US Food and Drug Administration's 2013 Food Code, which represents the interaction of pH and water activity for control of vegetative cells and spores in non-heat-treated food. Building on this table, we defined a set of more granular pH and water activity categories to better represent the pH and water activity range of different raw milk cheeses. The resulting categorization scheme was effectively validated using pH and water activity values determined for 273 different cheese samples collected in the marketplace throughout New York State, indicating the distribution of commercially available cheeses among the categories proposed here. This consensus categorization of cheese provides a foundation for a feasible approach to developing science-based solutions to assure compliance of the cheese processors with food safety regulations, such as those required by the US Food Safety Modernization Act. The key purpose of the cheese categorization proposed here is to facilitate product assessment for food safety risks and provide scientifically validated guidance on effective interventions for general cheese categories. 
Once preventive controls for a given category have been defined, these categories would represent safe havens for cheesemakers, allowing them to safely and legally produce raw milk cheeses that meet appropriate science-based safety requirements (e.g., risk to human health equivalent to that of pasteurized milk cheeses). Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
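    A categorization scheme of this kind reduces to binning each cheese by pH and water activity. The sketch below uses hypothetical bin edges, chosen only so that the grid yields 30 cells, matching the number of categories proposed; the paper's actual breakpoints build on Food Code Table B and are not reproduced here.

```python
# Hypothetical upper bin edges for illustration only.
PH_BINS = [4.2, 4.6, 5.0, 5.6]            # 5 pH bins (last: pH > 5.6)
AW_BINS = [0.88, 0.92, 0.95, 0.97, 0.99]  # 6 aw bins (last: aw > 0.99)

def cheese_category(ph, aw):
    """Map a cheese sample's measured pH and water activity to a
    (pH bin, aw bin) category index on the 5 x 6 = 30-cell grid."""
    i = sum(ph > edge for edge in PH_BINS)
    j = sum(aw > edge for edge in AW_BINS)
    return i, j
```

    Each of the 273 marketplace samples would land in exactly one cell, so the observed distribution across cells validates whether the grid resolves commercial cheese diversity.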

  16. Discriminant validity, responsiveness and reliability of the arthritis-specific Work Productivity Survey assessing workplace and household productivity in patients with psoriatic arthritis

    PubMed Central

    2014-01-01

    Introduction The novel arthritis-specific Work Productivity Survey (WPS) was developed to estimate patient productivity limitations associated with arthritis within and outside the home, which is an unmet need in psoriatic arthritis (PsA). The WPS has been validated in rheumatoid arthritis. This report assesses the discriminant validity, responsiveness and reliability of the WPS in adult-onset PsA. Methods Psychometric properties were assessed using data from the RAPID-PsA trial (NCT01087788) investigating certolizumab pegol (CZP) efficacy and safety in PsA. WPS was completed at baseline and every 4 weeks until Week 24. Validity was evaluated at baseline via known-groups defined using first and third quartiles of patients’ Disease Activity Score 28 based on C-reactive protein (DAS28(CRP)), Health Assessment Questionnaire-Disability Index (HAQ-DI), Short Form-36 (SF-36) items and PsA Quality of Life (PsAQoL) scores. Responsiveness and reliability were assessed by comparing WPS mean changes at Week 12 in American College of Rheumatology 20% improvement criteria (ACR20) or HAQ-DI Minimal Clinically Important Difference (MCID) 0.3 responders versus non-responders, as well as using standardized response means (SRM). All comparisons were conducted on the observed cases in the Randomized Set, regardless of the randomization group, using a non-parametric bootstrap-t method. Results Compared with patients with a better health state, patients with a worse health state had on average 2 to 6 times more household work days lost, more days with reduced household productivity, more days missed of family/social/leisure activities, more days with outside help hired and a significantly higher interference of arthritis per month. 
Among employed patients, those with a worse health state had 2 to 4 times more workplace days lost, more days with patient workplace productivity reduced, and a significantly higher interference of arthritis on patient workplace productivity versus patients with a better health state. WPS was also responsive to clinical changes, with responders having significantly larger improvements at Week 12 in WPS scores versus non-responders. The effect sizes for changes in productivity in ACR20 or HAQ-DI MCID responders were moderate (0.5 < SRM < 0.8) or small. Conclusions These analyses demonstrate the validity, responsiveness and reliability of the WPS, as an instrument for the measurement of patient productivity within and outside the home in an adult-onset PsA population. PMID:24996416
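    The standardized response mean (SRM) used above to judge responsiveness is the mean paired change divided by the standard deviation of that change. A minimal sketch with hypothetical baseline and Week 12 scores:

```python
import statistics

def srm(baseline, week12):
    """Standardized response mean: mean paired change divided by the
    standard deviation of the changes."""
    changes = [w - b for b, w in zip(baseline, week12)]
    return statistics.mean(changes) / statistics.stdev(changes)
```

    By the conventions cited in the abstract, |SRM| between 0.5 and 0.8 is a moderate effect and above 0.8 a large one.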

  17. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    PubMed

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The validation strategy was applied to a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of qualitative screening methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method can detect and identify 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.
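The CRL sensitivity criterion mentioned above reduces to a simple check. A hedged sketch, assuming the commonly cited requirement that the false-negative rate among replicates spiked at the level of interest (typically 20 samples fortified at the MRL) not exceed 5%:

```python
def screening_valid(detections, n_replicates=20, max_false_neg=0.05):
    """A screening method is considered fit for purpose at the level of
    interest if the false-negative rate among spiked replicates stays
    at or below the tolerated beta error (here 5%)."""
    false_neg_rate = 1 - detections / n_replicates
    return false_neg_rate <= max_false_neg

print(screening_valid(detections=20))  # all 20 spikes detected -> True
print(screening_valid(detections=17))  # 15% false negatives -> False
```

The replicate count and 5% threshold are assumptions for illustration; the applicable guideline fixes the actual values.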

  18. CREST-SAFE: Snow LST validation, wetness profiler creation, and depth/SWE product development

    NASA Astrophysics Data System (ADS)

    Perez Diaz, C. L.; Lakhankar, T.; Romanov, P.; Khanbilvardi, R.; Munoz Barreto, J.; Yu, Y.

    2017-12-01

    The Field Snow Research Station (also referred to as Snow Analysis and Field Experiment, SAFE) is operated by the NOAA Center for Earth System Sciences and Remote Sensing Technologies (CREST) at the City University of New York (CUNY). The field station is located within the premises of the Caribou Municipal Airport (46°52'59'' N, 68°01'07'' W) and in close proximity to the National Weather Service (NWS) Regional Forecast Office. The station was established in 2010 to support studies in snow physics and snow remote sensing. The Visible Infrared Imaging Radiometer Suite (VIIRS) Land Surface Temperature (LST) Environmental Data Record (EDR) and Moderate Resolution Imaging Spectroradiometer (MODIS) LST product (provided by the Terra and Aqua Earth Observing System satellites) were validated using in situ LST (T-skin) and near-surface air temperature (T-air) observations recorded at CREST-SAFE for the winters of 2013 and 2014. Results indicate that T-air correlates better than T-skin with VIIRS LST data and that the accuracy of nighttime LST retrievals is considerably better than that of daytime retrievals. Several trends in the MODIS LST data were observed, including the underestimation of both daytime and nighttime values. Results indicate that, although all the data sets showed high correlation with ground measurements, day values yielded slightly higher accuracy (approximately 1°C). Additionally, we created a liquid water content (LWC)-profiling instrument using time-domain reflectometry (TDR) at CREST-SAFE and tested it during the snowmelt period (February-April) immediately after installation in 2014. Results displayed high agreement when compared to LWC estimates obtained using empirical formulas developed in previous studies, and minor improvement over wet snow LWC estimates.
Lastly, to improve on global snow cover mapping, a snow product capable of estimating snow depth and snow water equivalent (SWE) using microwave remote sensing and the CREST Snow Depth Regression Tree Model (SDRTM) was developed. Data from AMSR2 onboard the JAXA GCOM-W1 satellite are used to produce daily global snow depth and SWE maps in an automated fashion at 10-km resolution.
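Validation figures of the kind quoted above (accuracy of roughly 1°C) are typically expressed as the mean bias (accuracy) and RMSE (precision) of satellite retrievals against station observations. A minimal sketch with invented numbers, not CREST-SAFE data:

```python
import math

def bias_and_rmse(satellite, in_situ):
    """Accuracy (mean bias) and precision (RMSE) of satellite LST
    retrievals relative to ground-station observations."""
    diffs = [s - g for s, g in zip(satellite, in_situ)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical nighttime retrievals vs. in situ T-skin (deg C)
sat = [-12.1, -8.4, -15.3, -10.0]
obs = [-11.5, -8.9, -14.8, -10.6]
b, r = bias_and_rmse(sat, obs)
print(round(b, 2), round(r, 2))
```

Reporting both statistics separates systematic offset (bias) from scatter (RMSE), which is why day/night retrievals can rank differently on the two measures.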

  19. Development of GK-2A cloud optical and microphysical properties retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Yum, S. S.; Um, J.

    2017-12-01

    Cloud and aerosol radiative forcing is known to be one of the largest uncertainties in climate change prediction. To reduce this uncertainty, remote sensing observations of cloud radiative and microphysical properties have been used since the 1970s, and the corresponding remote sensing techniques and instruments have been developed. As part of this effort, Geo-KOMPSAT-2A (Geostationary Korea Multi-Purpose Satellite-2A, GK-2A) will be launched in 2018. On GK-2A, the Advanced Meteorological Imager (AMI) is the primary instrument, with 3 visible, 3 near-infrared, and 10 infrared channels. To retrieve optical and microphysical properties of clouds using AMI measurements, a preliminary version of a new cloud retrieval algorithm for GK-2A was developed and several validation tests were conducted. The algorithm retrieves cloud optical thickness (COT), cloud effective radius (CER), liquid water path (LWP), and ice water path (IWP), so we named it the Daytime Cloud Optical thickness, Effective radius and liquid and ice Water path (DCOEW) algorithm. The DCOEW uses cloud reflectance at visible and near-infrared channels as input data. An optimal estimation (OE) approach that requires appropriate a priori values and measurement error information is used to retrieve COT and CER. LWP and IWP are calculated using empirical relationships between COT/CER and cloud water path that were determined previously. To validate the retrieved cloud properties, we compared DCOEW output data with other operational satellite data. For COT and CER validation, we used two different data sets. To compare against algorithms that use cloud reflectance at visible and near-IR channels as input data, the MODIS MYD06 cloud product was selected. For validation against cloud products based on microwave measurements, COT (2B-TAU) and CER (2C-ICE) data retrieved from the CloudSat cloud profiling radar (W-band, 94 GHz) were used. For cloud water path validation, AMSR-2 Level-3 cloud liquid water data were used.
Detailed results will be shown at the conference.
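The optimal estimation step described above can be illustrated generically. This is a sketch under simplifying assumptions, not the DCOEW implementation: a toy linear forward model stands in for the reflectance-vs-(COT, CER) relationship, and all names and numbers are illustrative. Gauss-Newton iteration minimizes the standard OE cost J(x) = (y - F(x))' Se^-1 (y - F(x)) + (x - xa)' Sa^-1 (x - xa).

```python
import numpy as np

def oe_retrieval(y, forward, jacobian, x_a, S_a, S_e, n_iter=10):
    """Gauss-Newton optimal estimation: iterates toward the state x
    that balances fit to the measurement y against the a priori x_a,
    weighted by the error covariances S_e and S_a."""
    x = x_a.copy()
    Sa_inv, Se_inv = np.linalg.inv(S_a), np.linalg.inv(S_e)
    for _ in range(n_iter):
        K = jacobian(x)
        lhs = Sa_inv + K.T @ Se_inv @ K
        rhs = K.T @ Se_inv @ (y - forward(x)) - Sa_inv @ (x - x_a)
        x = x + np.linalg.solve(lhs, rhs)
    return x

# Toy linear "forward model" mapping (COT, CER) to two reflectances
A = np.array([[0.8, 0.1], [0.2, 0.9]])
truth = np.array([10.0, 12.0])
y = A @ truth                      # noise-free synthetic measurement
x_hat = oe_retrieval(y, lambda x: A @ x, lambda x: A,
                     x_a=np.array([5.0, 8.0]),
                     S_a=np.eye(2) * 100.0, S_e=np.eye(2) * 1e-4)
print(np.round(x_hat, 2))
```

With a loose prior and tight measurement errors, the retrieval recovers the synthetic truth; a real retrieval would use a radiative-transfer forward model and empirically characterized covariances.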

  20. Quality appraisal of generic self-reported instruments measuring health-related productivity changes: a systematic review

    PubMed Central

    2014-01-01

    Background Health impairments can result in disability and changed work productivity, imposing considerable costs on the employee, employer and society as a whole. A large number of instruments exist to measure health-related productivity changes; however, their methodological quality remains unclear. This systematic review critically appraised the measurement properties of generic self-reported instruments that measure health-related productivity changes to recommend appropriate instruments for use in occupational and economic health practice. Methods PubMed, PsycINFO, Econlit and Embase were systematically searched for studies in which: (i) instruments measured health-related productivity changes; (ii) the aim was to evaluate instrument measurement properties; (iii) instruments were generic; (iv) ratings were self-reported; (v) full texts were available. Next, methodological quality appraisal was based on COSMIN elements: (i) internal consistency; (ii) reliability; (iii) measurement error; (iv) content validity; (v) structural validity; (vi) hypotheses testing; (vii) cross-cultural validity; (viii) criterion validity; and (ix) responsiveness. Recommendations are based on evidence syntheses. Results This review included 25 articles assessing the reliability, validity and responsiveness of 15 different generic self-reported instruments measuring health-related productivity changes. Most studies evaluated criterion validity, none evaluated cross-cultural validity, and information on measurement error is lacking. The Work Limitation Questionnaire (WLQ) was most frequently evaluated, with moderate and strong positive evidence for content and structural validity, respectively, and negative evidence for reliability, hypothesis testing and responsiveness. Less frequently evaluated, the Stanford Presenteeism Scale (SPS) showed strong positive evidence for internal consistency and structural validity, and moderate positive evidence for hypotheses testing and criterion validity.
The Productivity and Disease Questionnaire (PRODISQ) yielded strong positive evidence for content validity; evidence for its other properties is lacking. The other instruments received mostly fair-to-poor quality ratings with limited evidence. Conclusions Decisions based on the content of the instrument, usage purpose, target country and population, and available evidence are recommended. Until high-quality studies are in place to accurately assess the measurement properties of the currently available instruments, the WLQ and, in a Dutch context, the PRODISQ are cautiously preferred based on their strong positive evidence for content validity. Based on its strong positive evidence for internal consistency and structural validity, the SPS is cautiously recommended. PMID:24495301

  1. Development of the evaluation instrument use CIPP on the implementation of project assessment topic optik

    NASA Astrophysics Data System (ADS)

    Asfaroh, Jati Aurum; Rosana, Dadan; Supahar

    2017-08-01

    This research aims to develop a valid and reliable CIPP-model evaluation instrument and to determine its feasibility and practicality. The instrument evaluates the implementation of project assessment on the topic of optics, measuring the problem-solving skills of junior high school grade VIII students in the Yogyakarta region. The research followed the 4-D development model. The subjects of the product trials were grade VIII students at SMP N 1 Galur and SMP N 1 Sleman. Data were collected using non-test techniques, including interviews, questionnaires and observations. Validity was analyzed using Aiken's V, and reliability using the intraclass correlation coefficient (ICC). Seven raters took part: two expert lecturers (expert judgment), two practitioners (science teachers) and three colleagues. The result of this research is a CIPP-model instrument for evaluating the implementation of the project assessment instruments. The instrument's Aiken's V values range from 0.86 to 1, indicating validity, and its reliability value of 0.836 falls into the good category, so it is fit for use as an evaluation instrument.
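Aiken's V, the content-validity coefficient used above, is straightforward to compute for one item rated by n raters on a c-point scale: V = sum(r - lo) / (n * (c - 1)), ranging from 0 to 1. A minimal sketch with hypothetical ratings from 7 raters:

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for a single item:
    V = sum(r - lo) / (n * (hi - lo)); 1.0 means unanimous top ratings."""
    s = sum(r - lo for r in ratings)
    return s / (len(ratings) * (hi - lo))

# Hypothetical scores from 7 raters on a 1-5 scale
print(aikens_v([5, 5, 4, 5, 4, 5, 5]))  # ~0.93, inside the 0.86-1 range above
```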

  2. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields

    PubMed Central

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin

    2015-01-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. PMID:26590280
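Odds ratios like those reported for proximity to water and pasture come from a 2x2 exposure-by-outcome table (equivalently, the exponentiated logistic regression coefficient). A sketch with invented swab counts, not the study's data:

```python
def odds_ratio(pos_exposed, neg_exposed, pos_unexposed, neg_unexposed):
    """Odds ratio from a 2x2 table: (a/b) / (c/d), where a/b are
    positive/negative counts in the exposed group and c/d in the
    unexposed group."""
    return (pos_exposed / neg_exposed) / (pos_unexposed / neg_unexposed)

# Hypothetical counts: L. monocytogenes-positive/negative swabs
# collected near vs. far from surface water
print(round(odds_ratio(30, 70, 13, 87), 1))
```

An OR near 3 means the odds of isolating the pathogen are about three times higher in the exposed (near-water) plots, matching the scale of the effects reported above.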

  3. Development and validation of procedures for assessment of competency of non-pharmacists in extemporaneous dispensing.

    PubMed

    Donnelly, Ryan F; McNally, Martin J; Barry, Johanne G

    2009-02-01

    To develop and validate procedures that may be suitable for assessment of competency of two groups of non-pharmacist staff (pharmacy students and trainee support staff) in extemporaneous dispensing. This is important given the prospect of remote supervision of community pharmacies in the UK. Analytical methods were validated according to International Conference on Harmonisation specifications and procedures were optimized to allow efficient drug extraction. This permitted straightforward determination of drug content in extemporaneously prepared lidocaine hydrochloride mouthwashes and norfloxacin creams and suspensions prepared by 10 participants recruited to represent the two groups of non-pharmacist staff. All 10 participants had completed the extemporaneous dispensing of all three products within 90 min. Extraction and analysis took approximately 15 min for each lidocaine hydrochloride mouthwash and 30 min for each diluted norfloxacin cream and norfloxacin suspension. The mean drug concentrations in lidocaine hydrochloride mouthwashes and diluted norfloxacin creams were within what are generally accepted as being pharmaceutically acceptable limits for drug content (100 +/- 5%) for both groups of participants. There was no significant difference in the mean drug concentration of norfloxacin suspensions prepared by the participant groups. However, it was notable that only one participant prepared a suspension containing a norfloxacin concentration that was within pharmaceutically acceptable limits (101.51%). A laboratory possessing suitable equipment and appropriately trained staff could cope readily with the large number of products prepared, for example, by a cohort of pre-registration students. Consequently, the validated procedures developed here could usefully be incorporated into the pre-registration examination for pharmacy students and a final qualifying examination for dispensers and pharmacy technicians. 
We believe that this is essential if the public and the profession are to have confidence in extemporaneous dispensing carried out in the absence of a pharmacist.
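The pass/fail criterion used in the assessment above is percent of label claim. A minimal sketch, with hypothetical assay values:

```python
def percent_label_claim(measured_mg_per_ml, nominal_mg_per_ml):
    """Drug content as % of label claim; 100 +/- 5% is the
    pharmaceutically acceptable window cited above."""
    return 100 * measured_mg_per_ml / nominal_mg_per_ml

# Hypothetical lidocaine hydrochloride mouthwash assay
pct = percent_label_claim(19.6, 20.0)
print(round(pct, 2), 95 <= pct <= 105)
```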

  4. Optimization and Validation of ELISA for Pre-Clinical Trials of Influenza Vaccine.

    PubMed

    Mitic, K; Muhandes, L; Minic, R; Petrusic, V; Zivkovic, I

    2016-01-01

    Testing of every new vaccine involves investigation of its immunogenicity, which is based on monitoring its ability to induce specific antibodies in animals. The fastest and most sensitive method used for this purpose is the enzyme-linked immunosorbent assay (ELISA). However, commercial ELISA kits with whole influenza virus antigens are not available on the market, and it is therefore essential to establish an adequate assay for testing influenza virus-specific antibodies. We developed an ELISA with whole influenza virus strains for the 2011/2012 season as antigens and validated it by checking its specificity, accuracy, linearity, range, precision, and sensitivity. The results show that we developed a high-quality ELISA that can be used to test the immunogenicity of newly produced seasonal or pandemic vaccines in mice. The pre-existence of a validated ELISA shortens the time from vaccine production to use in patients, which is particularly important in the case of a pandemic.

  5. The ground prototype processor: Level-1 production during Sentinel-2 in-orbit acceptance

    NASA Astrophysics Data System (ADS)

    Petrucci, B.; Dechoz, C.; Lachérade, S.; L'Helguen, C.; Raynaud, J.-L.; Trémas, T.; Picard, C.; Rolland, A.

    2015-10-01

    Jointly with the European Commission, the European Space Agency (ESA) is developing the Sentinel-2 earth observation optical mission. Relying on a constellation of satellites put into orbit starting mid-2015, Sentinel-2 will be devoted to the monitoring of land and coastal areas worldwide, providing high-revisit (5 days with two satellites), high-resolution (10 m, 20 m and 60 m), large-swath (290 km), multi-spectral imagery (13 bands in the visible and shortwave infrared). In this framework, the French Space Agency (CNES: Centre National d'Etudes Spatiales) supports ESA in the activities related to image quality, defining the image products and prototyping the processing techniques. The scope of this paper is to present the Ground Prototype Processor (GPP) that will be in charge of Level-1 production during the Sentinel-2 In-Orbit Acceptance phase. GPP has been developed by a European industrial consortium composed of Advanced Computer Systems (ACS), Magellium and DLR on the basis of the CNES technical specification of Sentinel-2 data processing, under the joint management of ESA-ESTEC and CNES. It will assure the generation of the products used for calibration and validation activities and will provide the reference data for Sentinel-2 Payload Data Ground Segment validation. First, the definition of the Sentinel-2 end-user products is recalled, with the associated radiometric and geometric performances; second, the methods implemented are presented, with an overview of the ground image processing parameters that need to be tuned during the In-Orbit Acceptance phase to assure the required performance of the products. Finally, the complexity of the processing having been shown, the challenges of the production in terms of data volume and processing time are highlighted. The first Sentinel-2 Level-1 products are shown.

  6. A modeling approach to direct interspecies electron transfer process in anaerobic transformation of ethanol to methane.

    PubMed

    Liu, Yiwen; Zhang, Yaobin; Zhao, Zhiqiang; Ngo, Huu Hao; Guo, Wenshan; Zhou, Junliang; Peng, Lai; Ni, Bing-Jie

    2017-01-01

    Recent studies have shown that direct interspecies electron transfer (DIET) plays an important part in methane production from anaerobic digestion. However, the anaerobic digestion models proposed so far consider only two pathways for methane production, acetoclastic methanogenesis and hydrogenotrophic methanogenesis via indirect interspecies hydrogen transfer, and lack an effective way to incorporate DIET into this paradigm. In this work, a new mathematical model is developed to describe the DIET process in anaerobic digestion by introducing extracellular electron transfer as a new pathway for methane production, taking the anaerobic transformation of ethanol to methane as an example. The developed model successfully predicted experimental data on methane dynamics under different experimental conditions, supporting its validity. Modeling predictions clearly demonstrated that DIET plays an important role in overall methane production (up to 33%) and that adding a conductive material (i.e., carbon cloth) would significantly promote DIET by increasing the ethanol conversion rate and the methane production rate. The model developed in this work will potentially enhance our current understanding of syntrophic metabolism via DIET.
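Models of this kind typically describe substrate uptake with Monod kinetics, and the DIET pathway can be pictured as a fraction of the ethanol-derived electron flow diverted from interspecies hydrogen transfer. A purely illustrative sketch; the rate constants and the 33% split are placeholders, not the paper's calibrated parameters:

```python
def monod_rate(k_max, S, K_S):
    """Monod-type substrate uptake rate, as used in ADM1-style models:
    rate = k_max * S / (K_S + S)."""
    return k_max * S / (K_S + S)

# Hypothetical ethanol uptake split between the hydrogen-transfer
# pathway and DIET (units and values illustrative only)
total = monod_rate(k_max=10.0, S=2.0, K_S=0.5)  # mmol/(L*d)
diet_fraction = 0.33                            # up to ~33% per the abstract
print(round(total * diet_fraction, 2),
      round(total * (1 - diet_fraction), 2))
```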

  7. Air Quality Monitoring and Forecasting Applications of Suomi NPP VIIRS Aerosol Products

    NASA Astrophysics Data System (ADS)

    Kondragunta, Shobha

    The Suomi National Polar-orbiting Partnership (NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) instrument was launched on October 28, 2011. It provides Aerosol Optical Thickness (AOT) at two different spatial resolutions: a pixel level (~750 m at nadir) product called the Intermediate Product (IP) and an aggregated (~6 km at nadir) product called the Environmental Data Record (EDR), and a Suspended Matter (SM) EDR that provides aerosol type (dust, smoke, sea salt, and volcanic ash) information. An extensive validation of VIIRS best quality aerosol products with ground based L1.5 Aerosol Robotic NETwork (AERONET) data shows that the AOT EDR product has an accuracy/precision of -0.01/0.11 and 0.01/0.08 over land and ocean respectively. Globally, VIIRS mean AOT EDR (0.20) is similar to Aqua MODIS (0.16) with some important regional and seasonal differences. The accuracy of the SM product, however, is found to be very low (20 percent) when compared to Cloud Aerosol Lidar with Orthogonal Polarization (CALIOP) and AERONET. Several algorithm updates which include a better approach to retrieve surface reflectance have been developed for AOT retrieval. For dust aerosol type retrieval, a new approach that takes advantage of spectral dependence of Rayleigh scattering, surface reflectance, dust absorption in the deep blue (412 nm), blue (440 nm), and mid-IR (2.2 um) has been developed that detects dust with an accuracy of ~80 percent. For smoke plume identification, a source apportionment algorithm that combines fire hot spots with AOT imagery has been developed that provides smoke plume extent with an accuracy of ~70 percent. The VIIRS aerosol products will provide continuity to the current operational use of aerosol products from Aqua and Terra MODIS. 
These include aerosol data assimilation in the Naval Research Laboratory (NRL) global aerosol model, verification of National Weather Service (NWS) dust and smoke forecasts, exceptional-events monitoring by different states, and air quality warnings by the Environmental Protection Agency (EPA). This talk will provide an overview of the VIIRS algorithms, aerosol product validation, and examples of various applications, with a discussion of the relevance of product accuracy.

  8. Computer-assisted Biology Learning Materials: Designing and Developing an Interactive CD on Spermatogenesis

    NASA Astrophysics Data System (ADS)

    Haviz, M.

    2018-04-01

    The purpose of this article is to design and develop an interactive CD on spermatogenesis. This is a research and development study. The development procedure comprised outlining the media program, making a flowchart, making a storyboard, gathering materials, programming, and finishing. Quantitative data were analyzed with descriptive statistics; qualitative data were analyzed with the Miles and Huberman technique. The instrument used was a validation sheet. The CD, designed with Macromedia Flash MX, comprises 17 slides. The prototype was judged valid after a self-review technique with many revisions, especially to sound and programming. This finding suggests that the process of spermatogenesis can be audio-visualized into a more comprehensive form of learning media. However, this interactive CD product needs further testing to determine its consistency and resistance to revisions.

  9. Overview of SCIAMACHY validation: 2002-2004

    NASA Astrophysics Data System (ADS)

    Piters, A. J. M.; Bramstedt, K.; Lambert, J.-C.; Kirchhoff, B.

    2006-01-01

    SCIAMACHY, on board Envisat, has been in operation now for almost three years. This UV/visible/NIR spectrometer measures the solar irradiance, the earthshine radiance scattered at nadir and from the limb, and the attenuation of solar radiation by the atmosphere during sunrise and sunset, from 240 to 2380 nm and at moderate spectral resolution. Vertical columns and profiles of a variety of atmospheric constituents are inferred from the SCIAMACHY radiometric measurements by dedicated retrieval algorithms. With the support of ESA and several international partners, a methodical SCIAMACHY validation programme has been developed jointly by Germany, the Netherlands and Belgium (the three instrument providing countries) to face complex requirements in terms of measured species, altitude range, spatial and temporal scales, geophysical states and intended scientific applications. This summary paper describes the approach adopted to address those requirements.

    Since provisional releases of limited data sets in summer 2002, the operational SCIAMACHY processors established at DLR on behalf of ESA have been upgraded regularly and some data products - level-1b spectra, level-2 O3, NO2, BrO and cloud data - have improved significantly. Validation results summarised in this paper and also reported in this special issue conclude that for limited periods and geographical domains they can already be used for atmospheric research. Nevertheless, current processor versions still have known limitations that hamper scientific usability in other periods and domains. Free from the constraints of operational processing, seven scientific institutes (BIRA-IASB, IFE/IUP-Bremen, IUP-Heidelberg, KNMI, MPI, SAO and SRON) have developed their own retrieval algorithms and generated SCIAMACHY data products, together addressing nearly all targeted constituents. Most of the UV-visible data products - O3, NO2, SO2, H2O total columns; BrO, OClO slant columns; O3, NO2, BrO profiles - already have acceptable, if not excellent, quality. Provisional near-infrared column products - CO, CH4, N2O and CO2 - have already demonstrated their potential for a variety of applications. Cloud and aerosol parameters are also retrieved but, with the exception of cloud cover, suffer from calibration limitations. In any case, scientific users are advised to read the validation reports carefully before using the data. It is required and anticipated that SCIAMACHY validation will continue throughout the instrument lifetime and beyond and will accompany regular processor upgrades.

  10. A new roadmap for biopharmaceutical drug product development: Integrating development, validation, and quality by design.

    PubMed

    Martin-Moe, Sheryl; Lim, Fredric J; Wong, Rita L; Sreedhara, Alavattam; Sundaram, Jagannathan; Sane, Samir U

    2011-08-01

    Quality by design (QbD) is a science- and risk-based approach to drug product development. Although pharmaceutical companies have historically used many of the same principles during development, this knowledge was not always formally captured or proactively submitted to regulators. In recent years, the US Food and Drug Administration has also recognized the need for more controls in the drug manufacturing processes, especially for biological therapeutics, and it has recently launched an initiative for Pharmaceutical Quality for the 21st Century to modernize pharmaceutical manufacturing and improve product quality. In the biopharmaceutical world, the QbD efforts have been mainly focused on active pharmaceutical ingredient processes with little emphasis on drug product development. We present a systematic approach to biopharmaceutical drug product development using a monoclonal antibody as an example. The approach presented herein leverages scientific understanding of products and processes, risk assessments, and rational experimental design to deliver processes that are consistent with QbD philosophy without excessive incremental effort. Data generated using these approaches will not only strengthen data packages to support specifications and manufacturing ranges but hopefully simplify implementation of postapproval changes. We anticipate that this approach will positively impact cost for companies, regulatory agencies, and patients, alike. Copyright © 2011 Wiley-Liss, Inc.

  11. SAMICS Validation. SAMICS Support Study, Phase 3

    NASA Technical Reports Server (NTRS)

    1979-01-01

    SAMICS provides a consistent basis for estimating array costs and compares production technology costs. A review and validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships used by the model; and (3) to provide independent verification to users of the model's value in decision making for the allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.

  12. Demonstration of automated proximity and docking technologies

    NASA Astrophysics Data System (ADS)

    Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.

    An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined; the areas of sensors, docking hardware, propulsion, and avionics are included in the design. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined, including high-level decision-making, mission planning, and mission contingency recovery. The next step is to do flight demonstrations. After the presentation the following question was asked: How do you define validation? There are two components to the definition of validation: software simulation with formal and vigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.

  13. A Citizen Science Campaign to Validate Snow Remote-Sensing Products

    NASA Astrophysics Data System (ADS)

    Wikstrom Jones, K.; Wolken, G. J.; Arendt, A. A.; Hill, D. F.; Crumley, R. L.; Setiawan, L.; Markle, B.

    2017-12-01

    The ability to quantify seasonal water retention and storage in mountain snowpacks has implications for an array of important topics, including ecosystem function, water resources, hazard mitigation, validation of remote sensing products, climate modeling, and the economy. Runoff simulation models, which typically rely on gridded climate data and snow remote sensing products, would be greatly improved if uncertainties in estimates of snow depth distribution in high-elevation complex terrain could be reduced. This requires an increase in the spatial and temporal coverage of observational snow data in high-elevation data-poor regions. To this end, we launched Community Snow Observations (CSO). Participating citizen scientists use Mountain Hub, a multi-platform mobile and web-based crowdsourcing application that allows users to record, submit, and instantly share geo-located snow depth, snow water equivalent (SWE) measurements, measurement location photos, and snow grain information with project scientists and other citizen scientists. The snow observations are used to validate remote sensing products and modeled snow depth distribution. The project's prototype phase focused on Thompson Pass in south-central Alaska, an important infrastructure corridor that includes avalanche terrain and the Lowe River drainage and is essential to the City of Valdez and the fisheries of Prince William Sound. This year's efforts included website development, expansion of the Mountain Hub tool, and recruitment of citizen scientists through a combination of social media outreach, community presentations, and targeted recruitment of local avalanche professionals. We also conducted two intensive field data collection campaigns that coincided with an aerial photogrammetric survey. With more than 400 snow depth observations, we have generated a new snow remote-sensing product that better matches actual SWE quantities for Thompson Pass.
In the next phase of the citizen science portion of this project we will focus on expanding our group of participants to a larger geographic area in Alaska, further develop our partnership with Mountain Hub, and build relationships in new communities as we conduct a photogrammetric survey in a different region next year.

  14. NASA SMD Airborne Science Capabilities for Development and Testing of New Instruments

    NASA Technical Reports Server (NTRS)

    Fladeland, Matthew

    2015-01-01

    The SMD NASA Airborne Science Program operates and maintains a fleet of highly modified aircraft to support instrument development, satellite instrument calibration, data product validation and earth science process studies. This poster will provide an overview of aircraft available to NASA researchers including performance specifications and modifications for instrument support, processes for requesting aircraft time and developing cost estimates for proposals, and policies and procedures required to ensure safety of flight.

  15. 78 FR 74146 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-10

    ... for, the manufacture, preproduction design validation (including a process to assess the performance... requirements governing the design, manufacture, packing, labeling, storage, installation, and servicing of all... for Quality Assurance in Design/Development, Production, Installation, and Servicing.'' The CGMP/QS...

  16. 75 FR 63834 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ... facilities and controls used for, the manufacture, preproduction design validation (including a process to... requirements governing the design, manufacture, packing, labeling, storage, installation, and servicing of all... Model for Quality Assurance in Design/Development, Production, Installation, and Servicing.'' The CGMP...

  17. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Format validation software testing... CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES CERTIFICATION REQUIREMENTS FOR... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying...

  18. Development, repeatability and validity regarding energy and macronutrient intake of a semi-quantitative food frequency questionnaire: methodological considerations.

    PubMed

    Bountziouka, V; Bathrellou, E; Giotopoulou, A; Katsagoni, C; Bonou, M; Vallianou, N; Barbetseas, J; Avgerinos, P C; Panagiotakos, D B

    2012-08-01

The aim of this work was to evaluate the repeatability and validity of a food frequency questionnaire (FFQ) and to discuss the methodological framework of such procedures. The semi-quantitative FFQ included 69 questions regarding the frequency of consumption of all main food groups and beverages usually consumed and 7 questions regarding eating behaviors. Five hundred individuals (37 ± 15 yrs, 38% males) were recruited for the repeatability process, while another 432 (46 ± 16 yrs, 40% males) also completed 3-Day Diaries (3DD) for the validation process. The repeatability of the FFQ was adequate for all food items tested (Kendall's tau-b: 0.26-0.67, p < 0.05) and for energy and macronutrient intake (energy-adjusted correlation coefficients ranged between 0.56 and 0.69, p < 0.05). Moderate validity of the FFQ was observed for "dairy products", "fruit", "alcohol" and "stimulants" (tau-b: 0.31-0.60, p < 0.05), whereas low agreement was shown for "starchy products", "legumes", "vegetables", "meat", "fish", "sweets", "eggs", "fats and oils" (tau-b < 0.30, p < 0.05). The FFQ was also valid regarding energy and macronutrient intake. Sensitivity analyses by sex and BMI category (< or ≥25 kg/m²) showed similar validity of the FFQ for all food groups (apart from "fats and oils" intake), as well as for energy and nutrient intake. The proposed FFQ has proven repeatable and relatively valid for food intake and could therefore be used for nutritional assessment purposes. Copyright © 2010 Elsevier B.V. All rights reserved.
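
Repeatability of ordinal FFQ items is assessed with Kendall's tau-b, which corrects for the ties that frequency categories inevitably produce. A minimal sketch with hypothetical responses (scipy's `kendalltau` computes the tau-b variant by default):

```python
from scipy.stats import kendalltau

# Hypothetical frequency scores (1 = never ... 5 = daily) for one food item,
# reported by the same respondents at two administrations of the FFQ.
first  = [1, 2, 2, 3, 4, 5, 3, 2, 4, 5, 1, 3]
second = [1, 2, 3, 3, 4, 5, 2, 2, 4, 4, 1, 3]

# kendalltau uses the tau-b variant, which accounts for tied ranks,
# appropriate for ordinal frequency categories.
tau, p_value = kendalltau(first, second)
print(f"Kendall's tau-b = {tau:.2f} (p = {p_value:.3f})")
```

Values in the 0.26-0.67 range reported above would indicate fair-to-good agreement between administrations.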

  19. Application and validation of the TTI based chill chain management system SMAS (Safety Monitoring and Assurance System) on shelf life optimization of vacuum packed chilled tuna.

    PubMed

    Tsironi, Theofania; Gogou, Eleni; Velliou, Eirini; Taoukis, Petros S

    2008-11-30

The objective of the study was to establish a validated kinetic model for the growth of spoilage bacteria on vacuum-packed tuna slices in the temperature range of 0 to 15 degrees C and to evaluate the applicability of the TTI (Time Temperature Integrator) based SMAS (Safety Monitoring and Assurance System) to improve tuna product quality at the time of consumption in comparison to the conventional First In First Out (FIFO) approach. The overall measurements of total flora and lactic acid bacteria (LAB) on the tuna samples used in a laboratory-simulated field test were in close agreement with the predictions of the developed kinetic model. The spoilage profile of the TTI-bearing products handled with SMAS was improved: three of the thirty products handled randomly according to the FIFO approach were already spoiled at the time of consumption (log N_LAB > 6.5), compared with no spoiled products when handled with the SMAS approach.
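
Kinetic models of spoilage growth in the chill chain commonly express the temperature dependence of the maximum specific growth rate with a Ratkowsky-type square-root model. The sketch below integrates LAB growth over a hypothetical temperature history; the parameters `B` and `T_MIN` are illustrative values, not the ones fitted in the study.

```python
# Ratkowsky square-root model: sqrt(mu_max) = B * (T - T_MIN).
# B and T_MIN are illustrative, not the study's fitted parameters.
B, T_MIN = 0.023, -5.0  # units: 1/(h^0.5 * degC), degC

def mu_max(temp_c):
    """Maximum specific growth rate (log10 units per hour) at temp_c."""
    return (B * (temp_c - T_MIN)) ** 2

# Hypothetical chill-chain temperature history: (duration in hours, deg C).
history = [(24, 0), (48, 4), (24, 8)]

log_n = 2.0  # initial LAB load, log10 CFU/g
for hours, temp in history:
    # Exponential-phase growth, capped at a maximum population density.
    log_n = min(log_n + mu_max(temp) * hours, 8.5)
print(f"Predicted LAB level: {log_n:.1f} log10 CFU/g")
```

A TTI on the package integrates the same temperature history, which is what lets SMAS rank individual products by remaining shelf life instead of treating them all as FIFO-equivalent.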

  20. Development and validation of a portable gas phase standard generation and calibration system for volatile organic compounds

    Treesearch

    P. Veres; J. B. Gilman; J. M. Roberts; W. C. Kuster; C. Warneke; I. R. Burling; J. de Gouw

    2010-01-01

    We report on the development of an accurate, portable, dynamic calibration system for volatile organic compounds (VOCs). The Mobile Organic Carbon Calibration System (MOCCS) combines the production of gas-phase VOC standards using permeation or diffusion sources with quantitative total organic carbon (TOC) conversion on a palladium surface to CO2 in the presence of...

  1. Occupational Survey Report on Business Data Programmers: Task Data From Workers and Supervisors Indicating Job Relevance and Training Criticalness. Research and Development Series No. 108.

    ERIC Educational Resources Information Center

    Ammerman, Harry L.; Pratzner, Frank C.

The Center for Vocational Education is continuing its programmatic research efforts to develop more effective procedures for identifying valid and necessary curriculum content. The occupational task survey report for the occupation of business data programmer is a product resulting from this effort. The task inventory data summarized were collected…

  2. Remote Sensing Applications to Water Quality Management in Florida

    NASA Astrophysics Data System (ADS)

    Lehrter, J. C.; Schaeffer, B. A.; Hagy, J.; Spiering, B.; Barnes, B.; Hu, C.; Le, C.; McEachron, L.; Underwood, L. W.; Ellis, C.; Fisher, B.

    2013-12-01

Optical datasets from estuarine and coastal systems are increasingly available for remote sensing algorithm development, validation, and application. With validated algorithms, the data streams from satellite sensors can provide unprecedented spatial and temporal data for local and regional coastal water quality management. Our presentation will highlight two recent applications of optical data and remote sensing to water quality decision-making in coastal regions of the state of Florida: (1) informing the development of estuarine and coastal nutrient criteria for the state of Florida and (2) informing the rezoning of the Florida Keys National Marine Sanctuary. These efforts involved building up the underlying science to demonstrate the applicability of satellite data, as well as an outreach component to educate decision-makers about the use, utility, and uncertainties of remote sensing data products. Scientific developments included testing existing algorithms and generating new algorithms for water clarity and chlorophyll-a in case II (CDOM- or turbidity-dominated) estuarine and coastal waters and demonstrating the accuracy of remote sensing data products in comparison to traditional field-based measurements. Including members from decision-making organizations on the research team and interacting with decision-makers early and often in the process were key factors for the success of the outreach efforts and the eventual adoption of satellite data into the data records and analyses used in decision-making. [Figure captions: Florida coastal water bodies (black boxes) for which remote sensing imagery was applied to derive numeric nutrient criteria, and in situ observations (black dots) used to validate imagery. Florida ocean color applied to development of numeric nutrient criteria.]

  3. The way forward

    USGS Publications Warehouse

    Estes, John; Belward, Alan; Loveland, Thomas; Scepan, Joseph; Strahler, Alan H.; Townshend, John B.; Justice, Chris

    1999-01-01

This paper focuses on the lessons learned in the conduct of the International Geosphere-Biosphere Programme's Data and Information System (IGBP-DIS) global 1-km Land-Cover Mapping Project (DISCover). There is still considerable fundamental research to be conducted dealing with the development and validation of thematic geospatial products derived from a combination of remotely sensed and ancillary data. Issues include database and data product development, classification legend definitions, processing and analysis techniques, and sampling strategies. A significant infrastructure is required to support an effort such as DISCover. The infrastructure put in place under the auspices of the IGBP-DIS serves as a model, and must be put in place to enable replication and development of projects such as DISCover.

  4. Effect of Tutorial Giving on The Topic of Special Theory of Relativity in Modern Physics Course Towards Students’ Problem-Solving Ability

    NASA Astrophysics Data System (ADS)

    Hartatiek; Yudyanto; Haryoto, Dwi

    2017-05-01

A Special Theory of Relativity handbook was developed to guide student tutorial activity in the Modern Physics course. Tutorials were added to the lecture class to address students' low problem-solving ability, since the limited class time during the course left little opportunity for problem-solving exercises. The explicit problem-solving based tutorial handbook was written around five problem-solving strategies: (1) focus on the problem, (2) picture the physical facts, (3) plan the solution, (4) solve the problem, and (5) check the result. This research and development (R&D) study consisted of 3 main steps: (1) preliminary study, (2) draft product development, and (3) product validation. The draft product was validated by experts, who assessed the feasibility of the material and predicted the effect of the tutorials by means of questionnaires on a scale of 1 to 4. Students' problem-solving ability in the Special Theory of Relativity showed very good qualification, implying that the tutorials, supported by the handbook, increased students' problem-solving ability. The empirical test revealed that the developed handbook significantly improved students' concept mastery and problem-solving ability; both were in the middle category, with gains of 0.31 and 0.41, respectively.

  5. Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1.

    PubMed

    Zahel, Thomas; Marschall, Lukas; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Mueller, Eric M; Murphy, Patrick; Natschläger, Thomas; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-12

Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters with large uncertainty, whose true effects might drive product quality beyond a critical limit, may be missed. This can occur during the evaluation of experiments when the residual (un-modelled) variance in the experiments is larger than expected a priori. Estimating this risk is the task of the presented novel retrospective power analysis permutation test. It is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance, and increasing patient safety.
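
The retrospective question here is: given the residual variance actually observed, what was the chance of detecting an effect as large as the acceptance limit? The authors use a permutation test; the sketch below substitutes a plain Monte Carlo power calculation with a normal approximation, and every number in it is illustrative rather than taken from the paper.

```python
import random
import statistics

random.seed(1)

def power_two_group(effect, resid_sd, n_per_group, n_sim=2000):
    """Monte Carlo power: fraction of simulated two-level DoE contrasts whose
    two-sample t statistic exceeds ~1.96 (normal approximation to alpha=0.05)."""
    hits = 0
    for _ in range(n_sim):
        lo = [random.gauss(0.0, resid_sd) for _ in range(n_per_group)]
        hi = [random.gauss(effect, resid_sd) for _ in range(n_per_group)]
        se = ((statistics.variance(lo) + statistics.variance(hi)) / n_per_group) ** 0.5
        t = (statistics.mean(hi) - statistics.mean(lo)) / se
        hits += abs(t) > 1.96
    return hits / n_sim

# If residual variance turns out larger than planned, the power to detect an
# effect as large as the acceptance limit drops: the risk the workflow quantifies.
p_planned = power_two_group(effect=1.0, resid_sd=0.5, n_per_group=4)
p_actual = power_two_group(effect=1.0, resid_sd=1.5, n_per_group=4)
print(p_planned, p_actual)
```

A parameter that was "non-significant" in the second scenario gives little assurance of being non-critical, which is exactly the case the retrospective analysis is meant to flag.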

  6. Modelling of polymer photodegradation for solar cell modules

    NASA Technical Reports Server (NTRS)

    Somersall, A. C.; Guillet, J. E.

    1981-01-01

A computer program is evaluated that models, by numerical integration, the varying concentrations of chemical species formed during photooxidation of a polymeric material over time, using as input a chosen set of elementary reactions, the corresponding rate constants, and a convenient set of starting conditions. Attempts were made to validate the proposed mechanism by experimentally monitoring the photooxidation products of small liquid alkanes, which are useful starting models for the ethylene segments of polymers such as EVA. The model system proved inappropriate for the intended purposes, and another validation model is recommended.

  7. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, Mohammed Omair

    2012-01-01

    Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation - well before actual hardware exists. Although simulations focused around data processing procedures at subsystem and device level, they can also be applied to system level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but handling time for time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

  8. Prime mission results of the dual-frequency precipitation radar on the global precipitation measurement core spacecraft and the version 5 GPM standard products

    NASA Astrophysics Data System (ADS)

    Furukawa, K.; Nio, T.; Oki, R.; Kubota, T.; Iguchi, T.

    2017-09-01

The Dual-frequency Precipitation Radar (DPR) on the Global Precipitation Measurement (GPM) core satellite was developed by the Japan Aerospace Exploration Agency (JAXA) and the National Institute of Information and Communications Technology (NICT). The objective of the GPM mission is to observe global precipitation more frequently and accurately. The GPM core satellite is a joint product of the National Aeronautics and Space Administration (NASA), JAXA and NICT. NASA developed the satellite bus and the GPM Microwave Imager (GMI), and JAXA and NICT developed the DPR. The inclination of the GPM core satellite is 65 degrees, and the nominal flight altitude is 407 km. The non-sun-synchronous circular orbit is necessary for measuring the diurnal change of rainfall. The DPR consists of two radars: the Ku-band precipitation radar (KuPR) and the Ka-band precipitation radar (KaPR). The GPM Core Observatory was successfully launched by an H2A launch vehicle on Feb. 28, 2014. DPR orbital checkout was completed in May 2014. DPR products were released to the public on Sep. 2, 2014, and the normal observation operation period began. JAXA is continuing DPR trend monitoring, calibration and validation operations to confirm that DPR keeps its function and performance on orbit. The results of DPR trend monitoring, calibration and validation show that DPR kept its function and performance on orbit during the 3-year, 2-month prime mission period, which was completed in May 2017. The version 5 GPM products were released to the public in 2017. JAXA confirmed that GPM/DPR total system performance and the GPM version 5 products achieved the success criteria and the performance indicators that were defined for the JAXA GPM/DPR mission.

  9. Developing a Model to Estimate Freshwater Gross Primary Production Using MODIS Surface Temperature Observations

    NASA Astrophysics Data System (ADS)

    Saberi, S. J.; Weathers, K. C.; Norouzi, H.; Prakash, S.; Solomon, C.; Boucher, J. M.

    2016-12-01

Lakes contribute to local and regional climate conditions, cycle nutrients, and are viable indicators of climate change due to their sensitivity to disturbances in their water and airsheds. Utilizing spaceborne remote sensing (RS) techniques has considerable potential for studying lake dynamics because it allows coherent and consistent spatial and temporal observations, as well as estimates of lake functions, without in situ measurements. However, for RS products to be useful, algorithms that relate in situ measurements to RS data must be developed. Estimates of lake metabolic rates are of particular scientific interest since they are indicative of lakes' roles in carbon cycling and ecological function. Currently, there are few existing algorithms relating remote sensing products to in-lake estimates of metabolic rates, and more in-depth studies are still required. Here we use satellite surface temperature observations from the Moderate Resolution Imaging Spectroradiometer (MODIS) product MYD11A2 and published in-lake gross primary production (GPP) estimates for eleven globally distributed lakes during a one-year period to produce a univariate quadratic model. The general model was validated using other lakes during an equivalent one-year period (R² = 0.76). The statistical analyses reveal significant positive relationships between MODIS temperature data and the previously modeled in-lake GPP. Lake-specific models for Lake Mendota (USA), Rotorua (New Zealand), and Taihu (China) showed stronger relationships than the general combined model, pointing to local influences, such as watershed characteristics, on in-lake GPP in some cases. These validation data suggest that the developed algorithm has the potential to predict lake GPP on a global scale.
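
A univariate quadratic model of this kind can be reproduced in outline with a least-squares polynomial fit; the temperature-GPP pairs below are invented for illustration and are not the study's eleven lakes.

```python
import numpy as np

# Hypothetical pairs of MODIS-derived lake surface temperature (deg C) and
# modeled in-lake GPP (arbitrary units); the numbers are illustrative only.
temp = np.array([4.0, 8.0, 12.0, 16.0, 20.0, 24.0, 28.0])
gpp = np.array([2.1, 4.8, 9.5, 16.2, 24.9, 35.8, 49.1])

# Univariate quadratic model GPP = a*T^2 + b*T + c.
coeffs = np.polyfit(temp, gpp, deg=2)
predicted = np.polyval(coeffs, temp)

# Coefficient of determination on the fitting data.
ss_res = np.sum((gpp - predicted) ** 2)
ss_tot = np.sum((gpp - np.mean(gpp)) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

In the study the analogous fit, validated against independent lakes, yielded R² = 0.76; validating on held-out lakes rather than the fitting data is what distinguishes the reported figure from the in-sample R² computed here.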

  10. Development and Validation of a High Throughput System for Discovery of Antigens for Autoantibody Detection

    PubMed Central

    Macdonald, Isabel K.; Allen, Jared; Murray, Andrea; Parsy-Kowalska, Celine B.; Healey, Graham F.; Chapman, Caroline J.; Sewell, Herbert F.; Robertson, John F. R.

    2012-01-01

An assay employing a panel of tumor-associated antigens has been validated and is available commercially (EarlyCDT®-Lung) to aid the early detection of lung cancer by measurement of serum autoantibodies. The high throughput (HTP) strategy described herein was pursued to identify new antigens to add to the EarlyCDT-Lung panel and to assist in the development of new panels for other cancers. Two ligation-independent cloning vectors were designed and synthesized, producing fusion proteins suitable for the autoantibody ELISA. We developed an abridged HTP version of the validated autoantibody ELISA and, by comparing results on both formats, determined that its results reflected the performance of the EarlyCDT assay. Once validated, this HTP ELISA was used to screen multiple fusion proteins prepared at small scale by an HTP expression screen. We determined whether the assay performance for these HTP protein batches was an accurate reflection of the performance of R&D or commercial batches. An HTP discovery platform for the identification and optimal production of tumor-associated antigens for autoantibody detection has been developed and validated. The most favorable conditions for the exposure of immunogenic epitopes were assessed to produce discriminatory proteins for use in a commercial ELISA. This process is rapid and cost-effective compared with standard cloning and screening technologies and enables rapid advancement in the field of autoantibody assay discovery. This approach will significantly reduce the timescale and costs of developing similar panels of autoantibody assays for the detection of other cancer types, with the ultimate aim of improved overall survival due to early diagnosis and treatment. PMID:22815807

  11. Sterilization of allograft bone: is 25 kGy the gold standard for gamma irradiation?

    PubMed

    Nguyen, Huynh; Morgan, David A F; Forwood, Mark R

    2007-01-01

For several decades, a dose of 25 kGy of gamma irradiation has been recommended for terminal sterilization of medical products, including bone allografts. In practice, the applied gamma dose varies from tissue bank to tissue bank. While many banks use 25 kGy, some have adopted higher doses, some choose lower doses, and others do not use irradiation for terminal sterilization. A revolution in quality control in the tissue banking industry has occurred in line with the development of quality assurance standards. These have resulted in significant reductions in the risk of contamination of final graft products by microorganisms. In light of these developments, there is sufficient rationale to establish a new standard dose, sufficient to sterilize allograft bone while minimizing the adverse effects of gamma radiation on tissue properties. Using valid modifications, several authors have applied ISO standards to establish a radiation dose for bone allografts that is specific to the systems employed in bone banking. These standards, and their verification, suggest that the actual dose could be significantly reduced from 25 kGy while maintaining a valid sterility assurance level (SAL) of 10⁻⁶. The current paper reviews the methods that have been used to develop radiation doses for terminal sterilization of medical products, and the current trend for selection of a specific dose for tissue banks.
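
Dose-setting methods of the kind reviewed here rest on log-linear inactivation: each D10 of dose cuts the viable count tenfold, so the dose needed to reach a SAL of 10⁻⁶ scales with the starting bioburden. The sketch below uses illustrative numbers and is not the ISO 11137 dose-substantiation procedure itself, but it shows why lower, better-controlled bioburdens justify doses below 25 kGy.

```python
import math

def sterilization_dose(bioburden_cfu, d10_kgy, sal=1e-6):
    """Dose (kGy) under log-linear inactivation: the bioburden must be
    reduced by log10(N0) + |log10(SAL)| decades, each costing one D10."""
    decades = math.log10(bioburden_cfu) - math.log10(sal)
    return d10_kgy * decades

# Illustrative numbers: a resistant reference organism with D10 = 2 kGy.
# Lower bioburden from better process control means a lower required dose.
print(f"{sterilization_dose(1000, 2.0):.0f} kGy")  # 1000 CFU per graft
print(f"{sterilization_dose(10, 2.0):.0f} kGy")    # 10 CFU per graft
```

Both figures fall below the traditional 25 kGy, which is the arithmetic behind the review's argument for dose reduction once bioburden is demonstrably controlled.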

  12. Development and validation of a stochastic model for potential growth of Listeria monocytogenes in naturally contaminated lightly preserved seafood.

    PubMed

    Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw

    2015-02-01

A new stochastic model for the simultaneous growth of Listeria monocytogenes and lactic acid bacteria (LAB) was developed and validated on data from naturally contaminated samples of cold-smoked Greenland halibut (CSGH) and cold-smoked salmon (CSS). Acetic and/or lactic acid had been added to these samples during industrial processing. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD) values of L. monocytogenes in naturally contaminated samples of CSGH and CSS were accurately predicted by the stochastic model based on measured variability in product characteristics and storage conditions. Results comparable to those from the stochastic model were obtained when product characteristics of the least and most preserved samples of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction need to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth of L. monocytogenes, as both types of products were in compliance with the EU regulation on ready-to-eat foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
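
The full model couples 12 environmental parameters; the toy Monte Carlo sketch below illustrates only the stochastic principle. Lag time, growth rates, and the interaction cap (the Jameson effect, by which background LAB halt pathogen growth at their maximum density) are sampled from hypothetical distributions, not the authors' fitted ones.

```python
import random

random.seed(42)

def simulate_one(days=21):
    """One product: sample lag and growth rates, grow L. monocytogenes until
    it reaches its own ceiling or LAB reach maximum density (Jameson effect)."""
    lag = random.uniform(2, 6)           # days (hypothetical)
    mu_lm = random.uniform(0.05, 0.25)   # log10 CFU/g per day (hypothetical)
    mu_lab = random.uniform(0.3, 0.6)
    lm, lab = -1.0, 3.0                  # initial log10 CFU/g
    for _ in range(days):
        lab = min(lab + mu_lab, 8.5)
        if lag > 0:
            lag -= 1                     # still in lag phase
        elif lab < 8.5:                  # LAB not yet at max density
            lm = min(lm + mu_lm, 7.0)
    return lm

# Distribution of predicted L. monocytogenes levels across 1000 products.
results = sorted(simulate_one() for _ in range(1000))
print(f"median: {results[500]:.1f}, 95th pct: {results[950]:.1f} log10 CFU/g")
```

Reporting a distribution (median, upper percentiles) rather than a single prediction is what lets a stochastic model be compared against observed MPD values in naturally contaminated samples.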

  13. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses-Isotopic Composition Predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radulescu, Georgeta; Gauld, Ian C; Ilas, Germina

    2011-01-01

The expanded use of burnup credit in the United States (U.S.) for storage and transport casks, particularly in the acceptance of credit for fission products, has been constrained by the availability of experimental fission product data to support code validation. The U.S. Nuclear Regulatory Commission (NRC) staff has noted that the rationale for restricting the Interim Staff Guidance on burnup credit for storage and transportation casks (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issues of burnup credit criticality validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the isotopic composition (depletion) validation approach and resulting observations and recommendations. Validation of the criticality calculations is addressed in a companion paper at this conference. For isotopic composition validation, the approach is to determine burnup-dependent bias and uncertainty in the effective neutron multiplication factor (keff) due to bias and uncertainty in isotopic predictions, via comparisons of isotopic composition predictions (calculated) and measured isotopic compositions from destructive radiochemical assay utilizing as much assay data as is available, and a best-estimate Monte Carlo based method.
This paper (1) provides a detailed description of the burnup credit isotopic validation approach and its technical bases, (2) describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, and (3) provides reference bias and uncertainty results based on a quality-assurance-controlled prerelease version of the Scale 6.1 code package and the ENDF/B-VII nuclear cross section data.
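
The heart of the depletion validation is comparing calculated with measured nuclide concentrations. As a minimal illustration, per-nuclide bias statistics might be summarized from calculated-to-experimental (C/E) ratios as below; the ratios are hypothetical, and the paper's propagation of such statistics to keff (via a Monte Carlo based method) is not shown.

```python
import statistics

# Hypothetical C/E ratios for one fission product across several
# destructive radiochemical assay samples (illustrative numbers only).
c_over_e = [0.97, 1.02, 0.95, 1.04, 0.99, 0.96, 1.01, 0.98]

bias = statistics.mean(c_over_e) - 1.0  # relative bias of the prediction
uncert = statistics.stdev(c_over_e)     # sample standard deviation
print(f"bias = {bias:+.3f}, uncertainty = {uncert:.3f}")
```

A negative bias (underprediction of a neutron absorber, say) and its uncertainty would then be converted into a burnup-dependent penalty on keff in the safety analysis.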

  14. Advanced composites wing study program, volume 2

    NASA Technical Reports Server (NTRS)

    Harvey, S. T.; Michaelson, G. L.

    1978-01-01

    The study on utilization of advanced composites in commercial aircraft wing structures was conducted as a part of the NASA Aircraft Energy Efficiency Program to establish, by the mid-1980s, the technology for the design of a subsonic commercial transport aircraft leading to a 40% fuel savings. The study objective was to develop a plan to define the effort needed to support a production commitment for the extensive use of composite materials in wings of new generation aircraft that will enter service in the 1985-1990 time period. Identification and analysis of what was needed to meet the above plan requirements resulted in a program plan consisting of three key development areas: (1) technology development; (2) production capability development; and (3) integration and validation by designing, building, and testing major development hardware.

  15. All sky imaging observations in visible and infrared waveband for validation of satellite cloud and aerosol products

    NASA Astrophysics Data System (ADS)

    Lu, Daren; Huo, Juan; Zhang, W.; Liu, J.

A series of satellite sensors operating in visible and infrared wavelengths has been successfully flown on board a number of research satellites, e.g. NOAA/AVHRR and the MODIS onboard Terra and Aqua. A number of cloud and aerosol products have been produced and released in recent years. However, validating the quality and accuracy of these products is still a challenge for the atmospheric remote sensing community. In this paper, we suggest a ground-based validation scheme for satellite-derived cloud and aerosol products using combined visible and thermal infrared all-sky imaging observations as well as surface meteorological observations. In the scheme, a visible digital camera with a fish-eye lens continuously monitors the whole sky with a view angle greater than 180 deg. The camera system is calibrated both geometrically and radiometrically (broad blue, green, and red bands) so that a retrieval method can be used to detect the clear- and cloudy-sky spatial distribution and its temporal variation. A calibrated scanning thermal infrared thermometer monitors the all-sky brightness temperature distribution. An algorithm is developed to detect clear and cloudy sky as well as cloud base height, using the sky brightness distribution and surface temperature and humidity as input. The resulting composite retrievals of clear- and cloudy-sky distribution can be used to validate the satellite retrievals through both direct simultaneous comparisons and statistics. This talk presents the results of the field observations and comparisons completed in Beijing (40 deg N, 116.5 deg E) in 2003 and 2004. This work is supported by NSFC grant No. 4002700 and MOST grant No. 2001CCA02200.

  16. Meeting Report: Long Term Monitoring of Global Vegetation using Moderate Resolution Satellites

    NASA Technical Reports Server (NTRS)

    Morisette, Jeffrey; Heinsch, Fath Ann; Running, Steven W.

    2006-01-01

    The international community has long recognized the need to coordinate observations of Earth from space. In 1984, this situation provided the impetus for creating the Committee on Earth Observation Satellites (CEOS), an international coordinating mechanism charged with coordinating international civil spaceborne missions designed to observe and study planet Earth. Within CEOS, its Working Group on Calibration and Validation (WGCV) is tasked with coordinating satellite-based global observations of vegetation. Currently, several international organizations are focusing on the requirements for Earth observation from space to address key science questions and societal benefits related to our terrestrial environment. The Global Vegetation Workshop, sponsored by the WGCV and held in Missoula, Montana, 7-10 August, 2006, was organized to establish a framework to understand the inter-relationships among multiple, global vegetation products and identify opportunities for: 1) Increasing knowledge through combined products, 2) Realizing efficiency by avoiding redundancy, and 3) Developing near- and long-term plans to avoid gaps in our understanding of critical global vegetation information. The Global Vegetation Workshop brought together 135 researchers from 25 states and 14 countries to advance these themes and formulate recommendations for CEOS members and the Global Earth Observation System of Systems (GEOSS). The eighteen oral presentations and most of the 74 posters presented at the meeting can be downloaded from the meeting website (www.ntsg.umt.edu/VEGMTG/). Meeting attendees were given a copy of the July 2006 IEEE Transactions on Geoscience and Remote Sensing Special Issue on Global Land Product Validation, coordinated by the CEOS Working Group on Calibration and Validation (WGCV). This issue contains 29 articles focusing on validation products from several of the sensors discussed during the workshop.

  17. A practical approach for the validation of sterility, endotoxin and potency testing of bone marrow mononucleated cells used in cardiac regeneration in compliance with good manufacturing practice.

    PubMed

    Soncin, Sabrina; Lo Cicero, Viviana; Astori, Giuseppe; Soldati, Gianni; Gola, Mauro; Sürder, Daniel; Moccetti, Tiziano

    2009-09-08

    The main scope of the EU and FDA regulations is to establish classification criteria for advanced therapy medicinal products (ATMPs). The regulations require that ATMPs be prepared under good manufacturing practice (GMP). We have validated a commercial system for the determination of bacterial endotoxins in compliance with EU Pharmacopoeia 2.6.14, the sterility testing in compliance with EU Pharmacopoeia 2.6.1, and a potency assay for an ATMP consisting of mononucleated cells used in cardiac regeneration. For the potency assay, cells were placed in the upper part of a modified Boyden chamber containing Endocult Basal Medium with supplements, and transmigrated cells were scored. The invasion index was expressed as the ratio of the number of invading cells to cell migration through a control insert membrane. For endotoxins, we used a commercially available system based on the kinetic chromogenic LAL test. Validation of sterility was performed by direct inoculation of TSB and FTM media with the cell product following the EU Ph 2.6.1 guideline. The calculated MVD and endotoxin limit were 780x and 39 EU/ml, respectively. The 1:10 and 1:100 dilutions were selected for the validation. For sterility, all the FTM cultures were positive after 3 days. For TSB cultures, Mycetes and B. subtilis were positive after 5 and 3 days, respectively. The detection limit was 1-10 colonies. A total of four invasion assays were performed: the calculated invasion index was 28.89 +/- 16.82% (mean +/- SD). We have validated a strategy for endotoxin, sterility and potency testing in an ATMP used in cardiac regeneration. Unlike pharmaceutical products, many stem-cell-based products may originate in hospitals where personnel are unfamiliar with the applicable regulations. As new ATMPs are developed, the regulatory framework is likely to evolve. Meanwhile, existing regulations provide an appropriate structure for ensuring the safety and efficacy of the next generation of ATMPs.
Personnel must be adequately trained on relevant methods and their application to stem-cell-based products.
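
    The MVD and endotoxin-limit figures above follow the standard compendial relationship MVD = endotoxin limit / lysate sensitivity (λ). A minimal sketch, assuming a typical kinetic chromogenic LAL sensitivity of λ = 0.05 EU/ml, which is not stated in the record:

```python
# Maximum Valid Dilution (MVD) for a kinetic chromogenic LAL test.
# The endotoxin limit (39 EU/ml) comes from the record above; the lysate
# sensitivity (lambda) is an assumed typical value, not from the record.

def max_valid_dilution(endotoxin_limit_eu_ml: float, lambda_eu_ml: float) -> float:
    """MVD = endotoxin limit / labelled lysate sensitivity (lambda)."""
    return endotoxin_limit_eu_ml / lambda_eu_ml

endotoxin_limit = 39.0   # EU/ml, from the validation described above
lam = 0.05               # EU/ml, assumed LAL sensitivity (hypothetical)

mvd = max_valid_dilution(endotoxin_limit, lam)
print(mvd)  # 780.0 -> samples may be diluted up to 1:780 and remain valid
```

    Under that assumed λ the calculation reproduces the reported MVD of 780x, which is why the 1:10 and 1:100 dilutions selected for validation sit comfortably inside the valid range.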

  18. Improved Global Ocean Color Using Polymer Algorithm

    NASA Astrophysics Data System (ADS)

    Steinmetz, Francois; Ramon, Didier; Deschamps, Pierre-Yves; Stum, Jacques

    2010-12-01

    A global ocean color product has been developed based on the use of the POLYMER algorithm to correct for atmospheric scattering and sun glint and to process the data to a Level 2 ocean color product. Thanks to this algorithm, the coverage and accuracy of the MERIS ocean color product have been significantly improved compared to the standard product, increasing its usefulness for global ocean monitoring applications like GLOBCOLOUR. We will present the latest developments of the algorithm, its first application to MODIS data, and its validation against in-situ data from the MERMAID database. Examples will be shown of global NRT chlorophyll maps produced by CLS with POLYMER for operational applications such as fisheries and the oil and gas industry, as well as its use by Scripps for a NASA study of the Beaufort and Chukchi seas.

  19. Validation and implementation of Planova™ BioEX virus filters in the manufacture of a new liquid intravenous immunoglobulin in China.

    PubMed

    Ma, Shan; Pang, Guang Li; Shao, Yu Juan; Hongo-Hirasaki, Tomoko; Shang, Meng Xian; Inouye, Marcus; Jian, Chang Yong; Zhu, Meng Zhao; Yang, Hu Hu; Gao, Jian Feng; Xi, Zhi Ying; Song, Dian Wei

    2018-03-01

    There is a continuous need to improve the viral safety of plasma products, and we here report the development and optimization of a manufacturing-scale virus removal nanofiltration step for intravenous immunoglobulin (IVIG) using the recently introduced Planova™ BioEX filter. IVIG throughput was examined for various operating parameters: transmembrane pressure, temperature, protein concentration, and prefiltration methods. The developed procedure was based on filtering undiluted process solution (50.0 g/l IVIG) under constant transmembrane pressure filtration at 294 kPa and 25 °C following prefiltration with a 0.1 μm MILLEX VV filter. The recovery of IgG was approximately 98%, and no substantial changes in biochemical characteristics were observed before and after nanofiltration in scaled-up production. A viral clearance validation study with parvovirus under worst-case conditions performed at the National Institutes for Food and Drug Control of China (NIFDC) showed PPV logarithmic reduction value (LRV) > 4. Improved viral safety of IVIG can be assured by implementing a Planova BioEX nanofiltration step to ensure effective parvovirus clearance under conditions providing excellent protein recovery and no detectable impact on product biochemical properties. This plasma-derived IVIG product is the first to be certified for parvovirus safety by the NIFDC in China. Copyright © 2018 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  20. Methods to Detect Nitric Oxide and its Metabolites in Biological Samples

    PubMed Central

    Bryan, Nathan S.; Grisham, Matthew B.

    2007-01-01

    Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes, including regulation of blood pressure, immune response and neural communication. Therefore, its accurate detection and quantification are critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather a description of the most commonly used and practical methods which allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129

  1. Validation of biological activity testing procedure of recombinant human interleukin-7.

    PubMed

    Lutsenko, T N; Kovalenko, M V; Galkin, O Yu

    2017-01-01

    A validation procedure for the method of monitoring the biological activity of recombinant human interleukin-7 has been developed and conducted according to the requirements of national and international recommendations. The method is based on the ability of recombinant human interleukin-7 to induce proliferation of T lymphocytes. It has been shown that either peripheral blood mononuclear cells (PBMCs) derived from blood or cell lines can be used to control the biological activity of recombinant human interleukin-7. The validation characteristics that should be determined depend on the method, the type of product or object of the test/measurement, and the biological test systems used in the research. The validation procedure for the method of controlling the biological activity of recombinant human interleukin-7 in peripheral blood mononuclear cells showed satisfactory results on all parameters tested, such as specificity, accuracy, precision and linearity.

  2. SkinEthic Laboratories, a company devoted to develop and produce in vitro alternative methods to animal use.

    PubMed

    de Brugerolle, Anne

    2007-01-01

    SkinEthic Laboratories is a France-based biotechnology company recognised as a world leader in tissue engineering. SkinEthic is devoted to developing and producing reliable and robust in vitro alternatives to animal use in the cosmetic, chemical and pharmaceutical industries. SkinEthic models provide relevant tools for efficacy and safety screening tests in order to support integrated decision-making during research and development phases. Some screening tests are referenced and validated as alternatives to animal use (Episkin); others are in the process of validation under ECVAM and OECD guidelines. SkinEthic laboratories provide a unique, combined experience of more than 20 years from Episkin SNC and SkinEthic SA. Their unique cell culture process allows in vitro reconstructed human tissues with well-characterized histology, functionality and ultrastructure features to be mass produced. Our product line includes skin models: a reconstructed human epidermis with a collagen layer (Episkin), reconstructed human epidermis without or with melanocytes (with a tanning degree from phototype II to VI), and reconstructed human epithelia, i.e. cornea and other mucosa (oral, gingival, oesophageal and vaginal). Our philosophy is based on 3 main commitments: to support our customers by providing robust and reliable models; to ensure training and education in using validated protocols, allowing a large array of raw materials, active ingredients and finished products in solid, liquid, powder, cream or gel form to be screened; and to provide a dedicated service to our partners.

  3. The use of main concept analysis to measure discourse production in Cantonese-speaking persons with aphasia: a preliminary report.

    PubMed

    Kong, Anthony Pak-Hin

    2009-01-01

    Discourse produced by speakers with aphasia contains rich and valuable information for researchers to understand the manifestation of aphasia as well as for clinicians to plan specific treatment components for their clients. Various approaches to investigate aphasic discourse have been proposed in the English literature. However, this is not the case in Chinese. As a result, clinical evaluation of aphasic discourse has not been a common practice. This problem is further compounded by the lack of validated stimuli that are culturally appropriate for language elicitation. The purpose of this study was twofold: (a) to develop and validate four sequential pictorial stimuli for elicitation of language samples in Cantonese speakers with aphasia, and (b) to investigate the use of a main concept measurement, a clinically oriented quantitative system, to analyze the elicited language samples. Twenty speakers with aphasia and ten normal speakers were invited to participate in this study. The aphasic group produced significantly less key information than the normal group. More importantly, a strong relationship was also found between aphasia severity and production of main concepts. The inter-rater and intra-rater reliability results suggested the scoring system to be reliable, and the test-retest results yielded strong and significant correlations across two testing sessions one to three weeks apart. Readers will demonstrate better understanding of (1) the development and validation of newly devised sequential pictorial stimuli to elicit oral language production, and (2) the use of a main concept measurement to quantify aphasic connected speech in Cantonese Chinese.

  4. Proposal of an environmental performance index to assess solid waste treatment technologies.

    PubMed

    Coelho, Hosmanny Mauro Goulart; Lange, Liséte Celina; Coelho, Lineker Max Goulart

    2012-07-01

    Although concern with sustainable development and environmental protection has grown considerably in recent years, the majority of decision-making models and tools are still either excessively tied to economic aspects or geared to the production process. Moreover, existing models focus on the priority steps of solid waste management, beyond waste energy recovery and disposal. To address the lack of models and tools aimed at waste treatment and final disposal, a new concept is proposed: Cleaner Treatment, which is based on the Cleaner Production principles. This paper focuses on the development and validation of the Cleaner Treatment Index (CTI) to assess the environmental performance of waste treatment technologies based on the Cleaner Treatment concept. The index is formed by aggregation (summation or product) of several indicators consisting of operational parameters. The weights of the indicators were established by the Delphi method and Brazilian environmental laws. In addition, sensitivity analyses were carried out comparing both aggregation methods. Finally, index validation was carried out by applying the CTI to data from 10 waste-to-energy plants. From the sensitivity analysis and validation results it is possible to infer that the summation model is the most suitable aggregation method. For the summation method, CTI results were above 0.5 (on a scale from 0 to 1) for most facilities evaluated. This study therefore demonstrates that the CTI is a simple and robust tool to assess and compare the environmental performance of different treatment plants, and an excellent quantitative tool to support Cleaner Treatment implementation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions.

    PubMed

    Belal, Tarek S; El-Kafrawy, Dina S; Mahrous, Mohamed S; Abdel-Khalek, Magdi M; Abo-Gharam, Amira H

    2016-02-15

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method where no significant differences were observed between the proposed methods and reference method. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions

    NASA Astrophysics Data System (ADS)

    Belal, Tarek S.; El-Kafrawy, Dina S.; Mahrous, Mohamed S.; Abdel-Khalek, Magdi M.; Abo-Gharam, Amira H.

    2016-02-01

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method where no significant differences were observed between the proposed methods and reference method.
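
    Quantitation with any of the four color reactions reduces to inverting a linear calibration curve of absorbance against concentration. A minimal sketch over the p-CA working range (24-144 μg/mL); only the range comes from the record, the absorbance values are invented for illustration:

```python
import numpy as np

# Linear calibration sketch for a charge-transfer assay (illustrative data:
# the concentration range matches the p-CA method, absorbances are synthetic).
conc = np.array([24, 48, 72, 96, 120, 144], dtype=float)  # standards, µg/mL
absorbance = 0.004 * conc + 0.02                          # synthetic A at 524 nm

# least-squares straight line through the calibration points
slope, intercept = np.polyfit(conc, absorbance, 1)

def concentration_from_absorbance(a: float) -> float:
    """Invert the calibration line: C = (A - intercept) / slope."""
    return (a - intercept) / slope

print(round(concentration_from_absorbance(0.308), 1))  # 72.0 µg/mL
```

    In a validated method the same fit also yields the figures of merit reported above: the residuals feed the precision estimate, and the slope and blank noise give the detection and quantification limits.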

  7. Optimization and Validation of a Plaque Reduction Neutralization Test for the Detection of Neutralizing Antibodies to Four Serotypes of Dengue Virus Used in Support of Dengue Vaccine Development

    PubMed Central

    Timiryasova, Tatyana M.; Bonaparte, Matthew I.; Luo, Ping; Zedar, Rebecca; Hu, Branda T.; Hildreth, Stephen W.

    2013-01-01

    A dengue plaque reduction neutralization test (PRNT) to measure dengue serotype–specific neutralizing antibodies for all four virus serotypes was developed, optimized, and validated in accordance with guidelines for validation of bioanalytical test methods using human serum samples from dengue-infected persons and persons receiving a dengue vaccine candidate. Production and characterization of dengue challenge viruses used in the assay was standardized. Once virus stocks were characterized, the dengue PRNT50 for each of the four serotypes was optimized according to a factorial design of experiments approach for critical test parameters, including days of cell seeding before testing, percentage of overlay carboxymethylcellulose medium, and days of incubation post-infection to generate a robust assay. The PRNT50 was then validated and demonstrated to be suitable to detect and measure dengue serotype-specific neutralizing antibodies in human serum samples with acceptable intra-assay and inter-assay precision, accuracy/dilutability, specificity, and with a lower limit of quantitation of 10. PMID:23458954
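
    A PRNT50 endpoint is the reciprocal serum dilution at which plaque counts fall to 50% of the virus-only control, usually found by interpolating on a log-dilution scale. A hedged sketch with made-up plaque counts; the validated assay's actual data and curve-fitting procedure are not given in the record:

```python
import numpy as np

# Sketch of a PRNT50 endpoint calculation (illustrative counts, not assay data):
# percent plaque reduction at each serum dilution is interpolated on a
# log10-dilution scale to find the reciprocal dilution giving 50% reduction.
virus_only_plaques = 60.0
dilutions = np.array([10, 40, 160, 640], dtype=float)   # reciprocal dilutions
plaques = np.array([6, 21, 39, 54], dtype=float)        # counts with serum

reduction = 100.0 * (1.0 - plaques / virus_only_plaques)  # [90, 65, 35, 10] %

# np.interp needs increasing x, so reverse the (reduction, log-dilution) pairs
log_titer = np.interp(50.0, reduction[::-1], np.log10(dilutions)[::-1])
print(round(10 ** log_titer))  # 80 -> PRNT50 titer of 1:80
```

    The lower limit of quantitation of 10 reported above corresponds to the starting dilution: a serum that fails to reach 50% reduction at 1:10 has no quantifiable titer.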

  8. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    PubMed

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  9. IT Project Success w/7120 and 7123 NPRs to Achieve Project Success

    NASA Technical Reports Server (NTRS)

    Walley, Tina L.

    2009-01-01

    This slide presentation reviews management techniques to assure information technology development project success. Details include the work products, the work breakdown structure (WBS), system integration, independent verification and validation (IV&V), and deployment and operations. An example, the NASA Consolidated Active Directory (NCAD), is reviewed.

  10. Ultraviolet light (UV) and UV-ozone interventions reduce shiga toxin-producing Escherichia coli (STEC) on contaminated fresh beef

    USDA-ARS?s Scientific Manuscript database

    Although numerous chemical interventions have been implemented and validated to decontaminate meat and meat products during the harvesting process, more novel technologies are under development. UV light ionizing irradiation has been used extensively in pharmaceutical and medical device companies to...

  11. Fuel Cell and Hydrogen Technologies Program | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    Through its Fuel Cell and Hydrogen Technologies Program, NREL researches, develops, analyzes, and validates fuel cell and hydrogen production, delivery, and storage technologies for transportation

  12. Identifying Behavioral Measures of Stress in Individuals with Aphasia

    ERIC Educational Resources Information Center

    Laures-Gore, Jacqueline S.; DuBay, Michaela F.; Duff, Melissa C.; Buchanan, Tony W.

    2010-01-01

    Purpose: To develop valid indicators of stress in individuals with aphasia (IWA) by examining the relationship between certain language variables (error frequency [EF] and word productivity [WP]) and cortisol reactivity. Method: Fourteen IWA and 10 controls participated in a speaking task. Salivary cortisol was collected pre- and posttask. WP and…

  13. FPL roof temperature and moisture model : description and verification

    Treesearch

    A. TenWolde

    This paper describes a mathematical model developed by the Forest Products Laboratory to predict attic temperatures, relative humidities, and roof sheathing moisture content. Comparison of data from model simulation and measured data provided limited validation of the model and led to the following conclusions: (1) the model can...

  14. Development and validation of an environmental fragility index (EFI) for the neotropical savannah biome.

    PubMed

    Macedo, Diego R; Hughes, Robert M; Kaufmann, Philip R; Callisto, Marcos

    2018-04-23

    Augmented production and transport of fine sediments resulting from increased human activities are major threats to freshwater ecosystems, including reservoirs and their ecosystem services. To support large scale assessment of the likelihood of soil erosion and reservoir sedimentation, we developed and validated an environmental fragility index (EFI) for the Brazilian neotropical savannah. The EFI was derived from measured geoclimatic controls on sediment production (rainfall, variation of elevation and slope, geology) and anthropogenic pressures (natural cover, road density, distance from roads and urban centers) in 111 catchments upstream of four large hydroelectric reservoirs. We evaluated the effectiveness of the EFI by regressing it against a relative bed stability index (LRBS) that assesses the degree to which stream sites draining into the reservoirs are affected by excess fine sediments. We developed the EFI on 111 of these sites and validated our model on the remaining 37 independent sites. We also compared the effectiveness of the EFI in predicting LRBS with that of a multiple linear regression model (via best-subset procedure) using 7 independent variables. The EFI was significantly correlated with the LRBS, with regression R² values of 0.32 and 0.40, respectively, in development and validation sites. Although the EFI and multiple regression explained similar amounts of variability (R² = 0.32 vs 0.36), the EFI had a higher F-ratio (51.6 vs 8.5) and a better AICc value (333 vs 338). Because the sites were randomly selected and well-distributed across geoclimatic controlling factors, we were able to calculate spatially-explicit EFI values for all hydrologic units within the study area (~38,500 km²). This model-based inference showed that over 65% of those units had high or extreme fragility.
This methodology has great potential for application in the management, recovery, and preservation of hydroelectric reservoirs and streams in tropical river basins. Copyright © 2018 Elsevier B.V. All rights reserved.
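
    The EFI-versus-multiple-regression comparison above ranks competing least-squares models by AICc, which trades goodness of fit against parameter count. A minimal sketch of that kind of comparison on synthetic data; the variable names and values are illustrative, not the study's:

```python
import numpy as np

# Compare two OLS models for the same response by AICc (synthetic data only;
# x1 stands in for a composite index like the EFI, y for a response like LRBS).
rng = np.random.default_rng(0)
n = 111
x1 = rng.normal(size=n)                        # composite predictor
x2 = rng.normal(size=n)                        # one extra candidate predictor
y = 0.6 * x1 + rng.normal(scale=0.8, size=n)   # response

def aicc(y, X):
    """AICc for an OLS fit: n*ln(RSS/n) + 2k + 2k(k+1)/(n-k-1), k = #columns."""
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    n_obs, k = X.shape
    rss = float(rss[0]) if len(rss) else float(np.sum((y - X @ beta) ** 2))
    return n_obs * np.log(rss / n_obs) + 2 * k + 2 * k * (k + 1) / (n_obs - k - 1)

ones = np.ones(n)
simple = np.column_stack([ones, x1])       # index-only model
full = np.column_stack([ones, x1, x2])     # multiple-regression model

# the model with the lower AICc is preferred
print(round(aicc(y, simple), 1), round(aicc(y, full), 1))
```

    As in the study, a simpler model can win on AICc even when a larger model explains slightly more variance, because the penalty term grows with the number of fitted parameters.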

  15. A new validated HPLC-FLD method for detecting ochratoxin A in dry-cured meat and in blue cheese.

    PubMed

    Dall'asta, C; Galaverna, G; De Dea Lindner, J; Virgili, R; Neviani, E; Dossena, A

    2007-09-01

    In the present study, a fast and sensitive method for the quantification of ochratoxin A in two lipidic-proteic food matrices has been developed. In particular, the sample preparation procedure has been optimized for dry-cured meat products and blue cheeses and tested for several validation parameters (LOD, LOQ, recovery, repeatability and within-laboratory precision). The procedure has then been applied to several dry-cured meat products and blue cheeses from the market. Ochratoxin A has been occasionally found in dry-cured and smoked ham from the market, and the contamination occurred both in the outer and in the inner part of the products. Concerning blue cheese, the occurrence of ochratoxin A is reported for the first time: OTA was occasionally found at low levels (0.1-3 μg/kg) in commercial samples of Roquefort from France and Gorgonzola from Italy, opening a new issue for risk assessment and quality control.

  16. Validation of SMAP surface soil moisture products with core validation sites

    USDA-ARS?s Scientific Manuscript database

    The NASA Soil Moisture Active Passive (SMAP) mission has utilized a set of core validation sites as the primary methodology in assessing the soil moisture retrieval algorithm performance. Those sites provide well-calibrated in situ soil moisture measurements within SMAP product grid pixels for diver...

  17. Technical note: Intercomparison of three AATSR Level 2 (L2) AOD products over China

    NASA Astrophysics Data System (ADS)

    Che, Yahui; Xue, Yong; Mei, Linlu; Guang, Jie; She, Lu; Guo, Jianping; Hu, Yincui; Xu, Hui; He, Xingwei; Di, Aojie; Fan, Cheng

    2016-08-01

    One of four main focus areas of the PEEX initiative is to establish and sustain long-term, continuous, and comprehensive ground-based, airborne, and seaborne observation infrastructure together with satellite data. The Advanced Along-Track Scanning Radiometer (AATSR) aboard ENVISAT is used to observe the Earth in dual view. The AATSR data can be used to retrieve aerosol optical depth (AOD) over both land and ocean, which is an important parameter in the characterization of aerosol properties. In recent years, aerosol retrieval algorithms have been developed both over land and ocean, taking advantage of the features of dual view, which can help eliminate the contribution of Earth's surface to top-of-atmosphere (TOA) reflectance. The Aerosol_cci project, as a part of the Climate Change Initiative (CCI), provides users with three AOD retrieval algorithms for AATSR data, including the Swansea algorithm (SU), the ATSR-2/AATSR dual-view aerosol retrieval algorithm (ADV), and the Oxford-RAL Retrieval of Aerosol and Cloud algorithm (ORAC). The validation team of the Aerosol_cci project has validated AOD (both Level 2 and Level 3 products) and AE (Ångström Exponent) (Level 2 product only) against the AERONET data in a round-robin evaluation using the validation tool of the AeroCOM (Aerosol Comparison between Observations and Models) project. For the purpose of evaluating the different performances of these three algorithms in calculating AODs over mainland China, we introduce ground-based data from CARSNET (China Aerosol Remote Sensing Network), which was designed for aerosol observations in China. Because China is vast in territory and has great differences in terms of land surfaces, the combination of the AERONET and CARSNET data can validate the L2 AOD products more comprehensively. The validation results show different performances of these products in 2007, 2008, and 2010.
    The SU algorithm performs very well over sites with different surface conditions in mainland China from March to October, but it slightly underestimates AOD over barren or sparsely vegetated surfaces in western China, with mean bias error (MBE) ranging from 0.05 to 0.10. The ADV product shows similar precision, with a low root mean square error (RMSE) below 0.2 over most sites, and the same error distribution as the SU product. The main limits of the ADV algorithm are underestimation and applicability; underestimation is particularly obvious over the sites of Datong, Lanzhou, and Urumchi, where the dominant land cover is grassland, with an MBE larger than 0.2, and the main aerosol sources are coal combustion and dust. The ORAC algorithm has the ability to retrieve AOD at different ranges, including high AOD (larger than 1.0); however, the stability decreases significantly with increasing AOD, especially when AOD > 1.0. In addition, the ORAC product is consistent with the CARSNET product in winter (December, January, and February), whereas other validation results lack matches during winter.
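
    The MBE and RMSE statistics quoted above are simple aggregates of retrieval-minus-reference differences. A short sketch with illustrative AOD values, not actual CARSNET/AERONET matchup data:

```python
import numpy as np

# Mean bias error (MBE) and root mean square error (RMSE), the two summary
# statistics used above to compare satellite AOD against ground truth.
def mbe(retrieved, reference):
    """Mean of (retrieved - reference); sign shows over/underestimation."""
    return float(np.mean(np.asarray(retrieved) - np.asarray(reference)))

def rmse(retrieved, reference):
    """Root mean square of (retrieved - reference); overall scatter."""
    diff = np.asarray(retrieved) - np.asarray(reference)
    return float(np.sqrt(np.mean(diff ** 2)))

sat = np.array([0.35, 0.52, 0.18, 0.41])   # retrieved AOD (illustrative)
gnd = np.array([0.30, 0.50, 0.25, 0.45])   # ground-based AOD (illustrative)

print(round(mbe(sat, gnd), 3), round(rmse(sat, gnd), 3))  # -0.01 0.048
```

    A negative MBE, as in this toy case, is the "underestimation" flagged for the ADV algorithm over grassland sites; RMSE captures the scatter regardless of sign.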

  18. Optimizing adherence in HIV prevention product trials: Development and psychometric evaluation of simple tools for screening and adherence counseling.

    PubMed

    Tolley, Elizabeth E; Guthrie, Kate Morrow; Zissette, Seth; Fava, Joseph L; Gill, Katherine; Louw, Cheryl E; Kotze, Philip; Reddy, Krishnaveni; MacQueen, Kathleen

    2018-01-01

    Low adherence in recent HIV prevention clinical trials highlights the need to better understand, measure, and support product use within clinical trials. Conventional self-reported adherence instruments within HIV prevention trials, often relying on single-item questions, have proven ineffective. While objective adherence measures are desirable, none currently exist that apply to both active and placebo arms. Scales are composed of multiple items in the form of questions or statements that, when combined, measure a more complex construct that may not be directly observable. When psychometrically validated, such measures may better assess the multiple factors contributing to adherence/non-adherence. This study aimed to develop and psychometrically evaluate tools to screen and monitor trial participants' adherence to HIV prevention products within the context of clinical trial research. Based on an extensive literature review and conceptual framework, we identified and refined 86 items assessing potential predictors of adherence and 48 items assessing adherence experience. A structured survey, including adherence items and other variables, was administered to former ASPIRE and Ring Study participants and similar non-trial participants (n = 709). We conducted exploratory factor analyses (EFA) to identify a reduced set of constructs and items that could be used at screening to predict potential adherence, and at follow-up to monitor and intervene on adherence. We examined associations with other variables to assess content and construct validity. The EFA of screener items resulted in a 6-factor solution with acceptable to very good internal reliability (α: .62-.84). Similar to our conceptual framework, factors represent trial-related commitment (Distrust of Research and Commitment to Research); alignment with trial requirements (Visit Adherence and Trial Incompatibility); Belief in Trial Benefits and Partner Disclosure. 
The EFA on monitoring items resulted in 4 Product-specific factors that represent Vaginal Ring Doubts, Vaginal Ring Benefits, Ring Removal, and Side Effects with good to very good internal reliability (α = .71-.82). Evidence of content and construct validity was found; relationship to social desirability bias was examined. These scales are easy and inexpensive to administer, available in several languages, and are applicable regardless of randomization. Once validated prospectively, they could (1) screen for propensity to adhere, (2) target adherence support/counselling, and (3) complement biomarker measures in determining true efficacy of the experimental product.
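
    The scale-development workflow described above (exploratory factor analysis to reduce survey items to factors, then an internal-reliability check per factor) can be sketched as follows. This is an illustrative sketch on simulated Likert-style data, not the study's data or code; only the respondent, item, and factor counts mirror the numbers reported.

```python
# Sketch of EFA with varimax rotation plus per-factor Cronbach's alpha.
# Data are simulated; requires scikit-learn >= 0.24 for rotation support.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 709, 86, 6

# Simulate survey responses with a latent factor structure.
latent = rng.normal(size=(n_respondents, n_factors))
loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
responses = latent @ loadings + rng.normal(size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(responses)

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Assign each item to its highest-loading factor; report alpha per factor.
assignment = np.abs(fa.components_).argmax(axis=0)
for f in range(n_factors):
    subset = responses[:, assignment == f]
    if subset.shape[1] > 1:
        print(f"factor {f}: {subset.shape[1]} items, "
              f"alpha = {cronbach_alpha(subset):.2f}")
```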

  19. Validation and calibration of a TDLAS oxygen sensor for in-line measurement on flow-packed products

    NASA Astrophysics Data System (ADS)

    Cocola, L.; Fedel, M.; Allermann, H.; Landa, S.; Tondello, G.; Bardenstein, A.; Poletto, L.

    2016-05-01

    A device based on Tunable Diode Laser Absorption Spectroscopy has been developed for non-invasive evaluation of gaseous oxygen concentration inside packed food containers. This work has been done in the context of the SAFETYPACK European project in order to enable full, automated product testing on a production line. The chosen samples at the end of the manufacturing process are modified atmosphere bags of processed mozzarella, in which the target oxygen concentration is required to be below 5%. The spectrometer allows in-line measurement of moving samples passing on a conveyor belt, with an optical layout optimized for bags made of a flexible scattering material, and works by sensing the gas phase in the headspace at the top of the package. A field-applicable method for the calibration of this device has been identified and validated against traditional, industry-standard, invasive measurement techniques. This allows some degrees of freedom for the end-user regarding packaging dimensions and shape. After deployment and setup of the instrument at the end-user manufacturing site, performance has been evaluated on a different range of samples in order to validate the choice of electro-optical and geometrical parameters regarding sample handling and measurement timing under the actual measurement conditions.
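
    A rough sketch of the measurement principle: a Beer-Lambert absorbance from transmitted laser intensity, mapped to oxygen concentration through a linear calibration against invasive reference measurements. All intensities and calibration points are invented, and a real TDLAS instrument fits absorption line shapes rather than a single absorbance value; this only illustrates the calibration-against-reference idea.

```python
# Illustrative Beer-Lambert absorbance plus a linear calibration fit.
# All numbers are hypothetical, not from the SAFETYPACK instrument.
import numpy as np

def absorbance(i_transmitted: float, i_incident: float) -> float:
    """Beer-Lambert absorbance A = ln(I0 / I)."""
    return np.log(i_incident / i_transmitted)

# Hypothetical calibration set: absorbance vs. reference O2 concentration (%)
# measured with an invasive industry-standard analyzer.
cal_absorbance = np.array([0.010, 0.052, 0.101, 0.148, 0.205])
cal_o2_percent = np.array([0.5, 2.5, 5.0, 7.5, 10.0])

# Least-squares linear fit: concentration = slope * A + intercept.
slope, intercept = np.polyfit(cal_absorbance, cal_o2_percent, 1)

def o2_concentration(a: float) -> float:
    return slope * a + intercept

# A packed sample passing on the belt: check against the < 5 % target.
a_sample = absorbance(i_transmitted=0.92, i_incident=1.00)
print(f"estimated O2: {o2_concentration(a_sample):.2f} %  (target < 5 %)")
```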

  20. Fundamental movement skills testing in children with cerebral palsy.

    PubMed

    Capio, Catherine M; Sit, Cindy H P; Abernethy, Bruce

    2011-01-01

    To examine the inter-rater reliability and comparative validity of product-oriented and process-oriented measures of fundamental movement skills among children with cerebral palsy (CP). In total, 30 children with CP aged 6 to 14 years (Mean = 9.83, SD = 2.5) and classified in Gross Motor Function Classification System (GMFCS) levels I-III performed tasks of catching, throwing, kicking, horizontal jumping and running. Process-oriented assessment was undertaken using a number of components of the Test of Gross Motor Development (TGMD-2), while product-oriented assessment included measures of time taken, distance covered and number of successful task completions. Cohen's kappa, Spearman's rank correlation coefficient and tests to compare correlated correlation coefficients were performed. Very good inter-rater reliability was found. Process-oriented measures for running and jumping had significant associations with GMFCS, as did seven product-oriented measures for catching, throwing, kicking, running and jumping. Product-oriented measures of catching, kicking and running had stronger associations with GMFCS than the corresponding process-oriented measures. Findings support the validity of process-oriented measures for running and jumping and of product-oriented measures of catching, throwing, kicking, running and jumping. However, product-oriented measures for catching, kicking and running appear to have stronger associations with functional abilities of children with CP, and are thus recommended for use in rehabilitation processes.
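
    The two statistics central to the analysis above, Cohen's kappa for inter-rater reliability and Spearman's rank correlation against GMFCS level, can be computed as in this sketch. Scores are simulated; only the sample size mirrors the study.

```python
# Sketch of inter-rater agreement (Cohen's kappa) and rank correlation
# between a product-oriented skill measure and GMFCS level, on fake data.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
n_children = 30

# Two raters scoring the same skill components (0-2 scale); rater 2
# mostly agrees with rater 1.
rater1 = rng.integers(0, 3, size=n_children)
disagree = rng.random(n_children) < 0.1
rater2 = np.where(disagree, rng.integers(0, 3, size=n_children), rater1)
kappa = cohen_kappa_score(rater1, rater2)

# Simulated product-oriented measure (e.g. successful catches out of 10)
# vs. GMFCS level I-III: higher level, lower performance.
gmfcs = rng.integers(1, 4, size=n_children)
catches = np.clip(10 - 2 * gmfcs + rng.normal(0, 1, n_children), 0, 10)
rho, p_value = spearmanr(catches, gmfcs)

print(f"inter-rater kappa = {kappa:.2f}")
print(f"Spearman rho(catches, GMFCS) = {rho:.2f} (p = {p_value:.3f})")
```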

  1. Validation of projective mapping as potential sensory screening tool for application by the honeybush herbal tea industry.

    PubMed

    Moelich, Erika Ilette; Muller, Magdalena; Joubert, Elizabeth; Næs, Tormod; Kidd, Martin

    2017-09-01

    Honeybush herbal tea is produced from the endemic South African Cyclopia species. Plant material subjected to a high-temperature oxidation step ("fermentation") forms the bulk of production. Production lags behind demand forcing tea merchants to use blends of available material to supply local and international markets. The distinct differences in the sensory profiles of the herbal tea produced from the different Cyclopia species require that special care is given to blending to ensure a consistent, high quality product. Although conventional descriptive sensory analysis (DSA) is highly effective in providing a detailed sensory profile of herbal tea infusions, industry requires a method that is more time- and cost-effective. Recent advances in sensory science have led to the development of rapid profiling methodologies. The question is whether projective mapping can successfully be used for the sensory characterisation of herbal tea infusions. Trained assessors performed global and partial projective mapping to determine the validity of this technique for the sensory characterisation of infusions of five Cyclopia species. Similar product configurations were obtained when comparing results of DSA and global and partial projective mapping. Comparison of replicate sessions showed RV coefficients >0.8. A similarity index, based on multifactor analysis, was calculated to determine assessor repeatability. Global projective mapping, demonstrated to be a valid method for providing a broad sensory characterisation of Cyclopia species, is thus suitable as a rapid quality control method of honeybush infusions. Its application by the honeybush industry could improve the consistency of the sensory profile of blended products. Copyright © 2017 Elsevier Ltd. All rights reserved.
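
    The RV coefficient used above to compare product configurations from two methods (e.g. DSA vs. projective mapping) can be computed directly from the centred configuration matrices. A minimal sketch with simulated coordinates, not the study's sensory data:

```python
# RV coefficient between two sample configurations; values near 1 mean
# near-identical relative positioning of the products.
import numpy as np

def rv_coefficient(x: np.ndarray, y: np.ndarray) -> float:
    """RV = trace(XX'YY') / sqrt(trace(XX'XX') * trace(YY'YY')),
    computed on column-centred configuration matrices."""
    x = x - x.mean(axis=0)
    y = y - y.mean(axis=0)
    xx, yy = x @ x.T, y @ y.T
    return np.trace(xx @ yy) / np.sqrt(np.trace(xx @ xx) * np.trace(yy @ yy))

rng = np.random.default_rng(2)
# Five samples positioned in a 2-D sensory space by one method, and the
# same configuration slightly perturbed, standing in for a second method.
config_a = rng.normal(size=(5, 2))
config_b = config_a + rng.normal(scale=0.1, size=(5, 2))

print(f"RV = {rv_coefficient(config_a, config_b):.3f}")
```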

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halvorsen, T.

    The next generation subsea developments will be facing a number of new challenges which have to be solved to maintain a cost-efficient solution for production of oil and gas: (1) Smaller fields, i.e. cost reduction through volume will no longer be valid. (2) Freedom in configuration of subsea development. The current idea of standardization will not be directly applicable for cost reduction. (3) Various water depths. The same technology should be applicable for both guideline and guideline-less water depths. (4) Development in new areas of the world where a drilling-rig-deployable system is a must. (5) Various types of fluid processing may be required as an integral part of a subsea production system. The next generation subsea production system should be universal and applicable to any subsea field development. Kongsberg Offshore a.s. (KOS) has gained extensive experience in supplying standardized total subsea systems. The paper presents the approach taken by KOS to develop the next generation subsea system, and discusses the challenges associated with this.

  3. Enhancing Dairy Manufacturing through customer feedback: A statistical approach

    NASA Astrophysics Data System (ADS)

    Vineesh, D.; Anbuudayasankar, S. P.; Narassima, M. S.

    2018-02-01

    Dairy products have become an inevitable part of the habitual diet. This study aims to investigate consumers’ satisfaction with dairy products so as to provide useful information for manufacturers, serving as input for enriching the quality of products delivered. The study involved consumers of dairy products from various demographic backgrounds across South India. The questionnaire focussed on quality aspects of dairy products and also the service provided. A customer satisfaction model was developed based on various factors identified, with robust hypotheses that govern the use of the product. The developed model proved to be statistically significant as it passed the required statistical tests for reliability, construct validity and interdependency between the constructs. Some major concerns detected were regarding the fat content, taste and odour of packaged milk. A minor proportion of respondents (15.64%) were dissatisfied with the quality of service provided, which is another issue to be addressed to eliminate the sense of dissatisfaction in the minds of consumers.

  4. Modeling Manpower and Equipment Productivity in Tall Building Construction Projects

    NASA Astrophysics Data System (ADS)

    Mudumbai Krishnaswamy, Parthasarathy; Rajiah, Murugasan; Vasan, Ramya

    2017-12-01

    Tall building construction projects involve two critical resources: manpower and equipment. Their usage, however, varies widely due to several factors affecting their productivity. Currently, no systematic study for estimating and increasing their productivity is available; what is prevalent is the use of empirical data, experience from similar projects and assumptions. As tall building projects are here to stay and increase, to meet emerging demands in ever-shrinking urban spaces, it is imperative to develop scientific productivity models for the basic construction activities of concreting, reinforcement, formwork, block work and plastering, relating the inputs of specific resources in a mixed environment of manpower and equipment usage. Data pertaining to 72 tall building projects in India were collected and analyzed. Suitable productivity estimation models were then developed using multiple linear regression analysis and validated using independent field data. It is hoped that the models developed in this study will be useful for quantity surveyors, cost engineers and project managers to estimate the productivity of resources in tall building projects.
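
    The modelling approach described, multiple linear regression fit on project data and then checked against independent (held-out) data, can be sketched as follows. The predictors here are hypothetical stand-ins, not the study's variables; only the project count mirrors the study.

```python
# Sketch: regress a synthetic activity productivity on hypothetical
# project factors, then validate on a held-out split.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_projects = 72  # mirrors the number of projects surveyed

# Hypothetical predictors: crew size, equipment hours/day, building height.
X = np.column_stack([
    rng.integers(10, 60, n_projects),   # crew size
    rng.uniform(4, 16, n_projects),     # equipment hours per day
    rng.uniform(50, 250, n_projects),   # height (m)
])
# Synthetic concreting productivity (m3/day): known coefficients + noise.
y = 5.0 + 0.8 * X[:, 0] + 2.5 * X[:, 1] - 0.1 * X[:, 2] \
    + rng.normal(0, 3, n_projects)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)

# Validation against data the model never saw, as the study does with
# independent field data.
print(f"coefficients: {model.coef_.round(2)}")
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```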

  5. Joint investigation of working conditions, environmental and system performance at recycling centres--development of instruments and their usage.

    PubMed

    Engkvist, I-L; Eklund, J; Krook, J; Björkman, M; Sundin, E; Svensson, R; Eklund, M

    2010-05-01

    Recycling is a new and developing industry, which has only been researched to a limited extent. This article describes the development and use of instruments for data collection within a multidisciplinary research programme "Recycling centres in Sweden - working conditions, environmental and system performance". The overall purpose of the programme was to form a basis for improving the function of recycling centres with respect to these three perspectives and the disciplines of: ergonomics, safety, external environment, and production systems. A total of 10 instruments were developed for collecting data from employees, managers and visitors at recycling centres, including one instrument for observing visitors. Validation tests were performed in several steps. This, along with the quality of the collected data, and experience from the data collection, showed that the instruments and methodology used were valid and suitable for their purpose. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  6. Assessing physics learning identity: Survey development and validation

    NASA Astrophysics Data System (ADS)

    Li, Sissi L.; Demaree, Dedra

    2012-02-01

    Innovative curricula aim to improve content knowledge and to help students develop the practices and skills of authentic scientists through active-engagement learning. To students, these classroom practices often seem very different from their previous learning experiences in terms of behavioral expectations, learning attitudes, and what learning means. We propose that productive participation in these learning environments requires students to modify their identity as learners in addition to refining their science conceptual understanding. In order to measure changes in learning identity, we developed a 49-item survey to assess students' 1) expectations of student and teacher roles, 2) self-efficacy towards skills supported in the Investigative Science Learning Environment (ISLE) and 3) attitudes towards social learning. Using principal components exploratory factor analysis, we have established two reliable factors with subscales that measure these student characteristics. This paper presents the survey development, validation and pilot study results.

  7. Issues and challenges of involving users in medical device development.

    PubMed

    Bridgelal Ram, Mala; Grocott, Patricia R; Weir, Heather C M

    2008-03-01

    User engagement has become a central tenet of health-care policy. This paper reports on a case study in progress that highlights user engagement in the research process in relation to medical device development. The aims were to work with a specific group of medical device users to uncover unmet needs, translating these into design concepts, novel technologies and products, and to validate a knowledge transfer model that may be replicated for a range of medical device applications and user groups. An in-depth qualitative case study was used to elicit and analyse user needs. The focus is on identifying design concepts for medical device applications from unmet needs, and validating these in an iterative feedback loop with the users. The case study has highlighted three interrelated challenges: ensuring unmet needs drive new design concepts and technology development; managing user expectations; and managing the research process. Despite the challenges, active participation of users is crucial to developing usable and clinically effective devices.

  8. Field analysis for explosives: TNT and RDX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elcoate, W.; Mapes, J.

    The EPA has listed as hazardous many of the compounds used in the production of ammunition and other explosive ordnance. The contamination of soil with TNT (2,4,6-trinitrotoluene), the major component of many munitions formulations, and to a lesser degree RDX (hexahydro-1,3,5-trinitro-1,3,5-triazine), is a significant problem at many ammunition manufacturing facilities, depots, and ordnance disposal sites. Field test kits for the explosives TNT and RDX were developed based on the methods of T.F. Jenkins and M.E. Walsh and T.F. Jenkins. EnSys Environmental Products, Inc., with technical support from T.F. Jenkins, took the original TNT procedure, modified it for easier field use, performed validation studies to ensure that it met or exceeded the method specifications for both the T.F. Jenkins and SW-846 methods, and developed an easy-to-use test format for the field testing of TNT. The RDX procedure has gone through the development cycle and is presently in the field validation phase. This paper describes the test protocol and performance characteristics of the TNT test procedure.

  9. Comparison of C5 and C6 Aqua-MODIS Dark Target Aerosol Validation

    NASA Technical Reports Server (NTRS)

    Munchak, Leigh A.; Levy, Robert C.; Mattoo, Shana

    2014-01-01

    We compare C5 and C6 validation statistics to evaluate the C6 10 km aerosol product against the well-validated and trusted C5 aerosol product on global and regional scales. Only the 10 km aerosol product is evaluated in this study; validation of the new C6 3 km aerosol product still needs to be performed. Not all of the time series has been processed yet for C5 or C6, and the years processed for the two products are not exactly the same (this work is preliminary!). To reduce the impact of outlier observations, MODIS is spatially averaged within 27.5 km of the AERONET site, and AERONET is temporally averaged within 30 minutes of the MODIS overpass time. Only high-quality (QA = 3 over land, QA greater than 0 over ocean) pixels are included in the mean.
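
    The collocation rule described above (spatial averaging within 27.5 km of the site, temporal averaging within 30 minutes of overpass, QA filtering) can be sketched with invented pixel data:

```python
# Sketch of MODIS-AERONET collocation: distance-filter and QA-filter the
# satellite pixels, time-filter the ground observations, compare means.
import numpy as np

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between points (degrees in, km out)."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

# Hypothetical MODIS AOD pixels (lat, lon, AOD, QA flag) near a site.
site_lat, site_lon = 38.99, -76.84
pixels = np.array([
    # lat     lon     aod   qa
    [39.05, -76.80, 0.21, 3],
    [38.90, -76.95, 0.19, 3],
    [39.30, -76.40, 0.35, 3],   # ~50 km away: excluded by distance
    [39.00, -76.85, 0.50, 1],   # low QA: excluded over land (QA must be 3)
])
dist = haversine_km(pixels[:, 0], pixels[:, 1], site_lat, site_lon)
keep = (dist <= 27.5) & (pixels[:, 3] == 3)
modis_mean = pixels[keep, 2].mean()

# AERONET observations: (minutes from overpass, AOD).
aeronet = np.array([[-40, 0.22], [-10, 0.20], [5, 0.21], [45, 0.25]])
in_window = np.abs(aeronet[:, 0]) <= 30
aeronet_mean = aeronet[in_window, 1].mean()

print(f"MODIS mean AOD = {modis_mean:.3f}, "
      f"AERONET mean AOD = {aeronet_mean:.3f}")
```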

  10. The NASA Soil Moisture Active Passive (SMAP) Mission - Algorithm and Cal/Val Activities and Synergies with SMOS and Other L-Band Missions

    NASA Technical Reports Server (NTRS)

    Njoku, Eni; Entekhabi, Dara; O'Neill, Peggy; Jackson, Tom; Kellogg, Kent; Entin, Jared

    2011-01-01

    NASA's Soil Moisture Active Passive (SMAP) mission, planned for launch in late 2014, has as its key measurement objective the frequent, global mapping of near-surface soil moisture and its freeze-thaw state. SMAP soil moisture and freeze/thaw measurements at 10 km and 3 km resolutions, respectively, would enable significantly improved estimates of water, energy and carbon transfers between the land and atmosphere. Soil moisture control of these fluxes is a key factor in the performance of atmospheric models used for weather forecasts and climate projections. Soil moisture measurements are also of great importance in assessing floods and for monitoring drought. In addition, observations of soil moisture and freeze/thaw timing over the boreal latitudes can help reduce uncertainties in quantifying the global carbon balance. The SMAP measurement concept utilizes an L-band radar and radiometer sharing a rotating 6-meter mesh reflector antenna. The SMAP radiometer and radar flight hardware and ground processing designs incorporate approaches to identify and mitigate potential terrestrial radio frequency interference (RFI). The radar and radiometer instruments are planned to operate in a 680 km polar orbit, viewing the surface at a constant 40-degree incidence angle with a 1000-km swath width, providing 3-day global coverage. Data from the instruments would yield global maps of soil moisture and freeze/thaw state at 10 km and 3 km resolutions, respectively, every two to three days. Plans are also to provide a radiometer-only soil moisture product at 40-km spatial resolution. This product and the underlying brightness temperatures have characteristics similar to those provided by the Soil Moisture and Ocean Salinity (SMOS) mission. As a result, there are unique opportunities for common data product development and continuity between the two missions.
SMAP also has commonalities with other satellite missions having L-band radiometer and/or radar sensors applicable to soil moisture measurement, such as Aquarius, SAOCOM, and ALOS-2. The algorithms and data products for SMAP are being developed in the SMAP Science Data System (SDS) Testbed. The algorithms are developed and evaluated in the SDS Testbed using simulated SMAP observations as well as observational data from current airborne and spaceborne L-band sensors, including SMOS. The SMAP project is developing a Calibration and Validation (Cal/Val) Plan that is designed to support algorithm development (pre-launch) and data product validation (post-launch). A key component of the Cal/Val Plan is the identification, characterization, and instrumentation of sites that can be used to calibrate and validate the sensor data (Level 1) and derived geophysical products (Level 2 and higher). In this presentation we report on the development status of the SMAP data product algorithms, and the planning and implementation of the SMAP Cal/Val program. Several components of the SMAP algorithm development and Cal/Val plans have commonality with those of SMOS, and for this reason there are shared activities and resources that can be utilized between the missions, including in situ networks, ancillary data sets, and long-term monitoring sites.

  11. Land Product Validation (LPV)

    NASA Technical Reports Server (NTRS)

    Schaepman, Gabriela; Roman, Miguel O.

    2013-01-01

    This presentation will discuss Land Product Validation (LPV) objectives and goals, LPV structure update, interactions with other initiatives during report period, outreach to the science community, future meetings and next steps.

  12. Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Terrie, Greg; Berglund, Judith

    2006-01-01

    This presentation introduces a draft plan for characterizing commercial data products for Earth science research. The general approach to commercial product verification and validation includes focused selection of readily available commercial remote sensing products that support Earth science research. Ongoing product verification and characterization will assess whether a product meets its specifications and will examine its fundamental properties, potential and limitations. Validation will encourage product evaluation for specific science research and applications. Specific commercial products included in the characterization plan include high-spatial-resolution multispectral (HSMS) imagery and LIDAR data products. Future efforts in this process will include briefing NASA headquarters and modifying plans based on feedback, increased engagement with the science community and refinement of details, coordination with commercial vendors and the Joint Agency Commercial Imagery Evaluation (JACIE) for HSMS satellite acquisitions, acquiring waveform LIDAR data, and performing verification and validation.

  13. A Cloud Mask for AIRS

    NASA Technical Reports Server (NTRS)

    Brubaker, N.; Jedlovec, G. J.

    2004-01-01

    With the preliminary release of AIRS Level 1 and 2 data to the scientific community, there is a growing need for an accurate AIRS cloud mask for data assimilation studies and for producing products derived from cloud-free radiances. Current cloud information provided with the AIRS data is limited or based on simplified threshold tests. A multispectral cloud detection approach has been developed for AIRS that utilizes its hyperspectral capabilities to detect clouds based on specific cloud signatures across the shortwave and longwave infrared window regions. This new AIRS cloud mask has been validated against the existing AIRS Level 2 cloud product and cloud information derived from MODIS. Preliminary results for both day and night applications over the continental U.S. are encouraging. Details of the cloud detection approach and validation results will be presented at the conference.

  14. Validation of the Italian version of the Stanford Presenteeism Scale in nurses.

    PubMed

    Cicolini, Giancarlo; Della Pelle, Carlo; Cerratti, Francesca; Franza, Marcello; Flacco, Maria E

    2016-07-01

    To ascertain the validity and reliability of the Italian version of the Stanford Presenteeism Scale (SPS-6). Presenteeism has been associated with reduced work productivity, a lower quality of work and an increased risk of developing health disorders. It is particularly high among nurses and needs valid tools to be assessed. A validation study was carried out from July to September 2014. A three-section tool, comprising a demographic form, the Stanford Presenteeism Scale (SPS-6) and the Perceived Stress Scale (PSS-10), was administered to a sample of nurses enrolled in three Italian hospitals. Cronbach's α for the entire sample (229 nurses) was 0.72. A significant negative correlation between SPS and perceived stress scores provided evidence of external validity. The factor analysis showed a two-component solution, accounting for 71.2% of the variance. The confirmatory factor analysis showed an adequate fit. The Italian SPS-6 is a valid and reliable tool for workplace surveys. Since the validity and reliability of the SPS-6 have been confirmed for the Italian version, we now have a valid tool that can measure levels of presenteeism among Italian nurses. © 2016 John Wiley & Sons Ltd.
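
    The external-validity check reported above, a significant negative correlation between SPS-6 and PSS-10 total scores, can be sketched on simulated data. Only the sample size mirrors the study; scores and effect sizes are invented.

```python
# Sketch: Pearson correlation between simulated presenteeism (SPS-6) and
# perceived stress (PSS-10) scores, expected to be negative.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n_nurses = 229  # mirrors the sample size only

# Simulate stress scores (0-40) and SPS-6 scores (6-30) that decrease
# as stress increases, plus noise.
pss10 = np.clip(rng.normal(20, 6, n_nurses), 0, 40)
sps6 = np.clip(30 - 0.4 * pss10 + rng.normal(0, 3, n_nurses), 6, 30)

r, p = pearsonr(sps6, pss10)
print(f"r = {r:.2f}, p = {p:.4f}  (negative, as expected for validity)")
```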

  15. Validation of selected analytical methods using accuracy profiles to assess the impact of a Tobacco Heating System on indoor air quality.

    PubMed

    Mottier, Nicolas; Tharin, Manuel; Cluse, Camille; Crudo, Jean-René; Lueso, María Gómez; Goujon-Ginglinger, Catherine G; Jaquier, Anne; Mitova, Maya I; Rouget, Emmanuel G R; Schaller, Mathieu; Solioz, Jennifer

    2016-09-01

    Studies in environmentally controlled rooms have been used over the years to assess the impact of environmental tobacco smoke on indoor air quality. As new tobacco products are developed, it is important to determine their impact on air quality when used indoors. Before such an assessment can take place it is essential that the analytical methods used to assess indoor air quality are validated and shown to be fit for their intended purpose. Consequently, for this assessment, an environmentally controlled room was built and seven analytical methods, representing eighteen analytes, were validated. The validations were carried out with smoking machines using a matrix-based approach applying the accuracy profile procedure. The performances of the methods were compared for all three matrices under investigation: background air samples, the environmental aerosol of Tobacco Heating System THS 2.2, a heat-not-burn tobacco product developed by Philip Morris International, and the environmental tobacco smoke of a cigarette. The environmental aerosol generated by the THS 2.2 device did not have any appreciable impact on the performances of the methods. The comparison between the background and THS 2.2 environmental aerosol samples generated by smoking machines showed that only five compounds were higher when THS 2.2 was used in the environmentally controlled room. Regarding environmental tobacco smoke from cigarettes, the yields of all analytes were clearly above those obtained with the other two air sample types. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  17. The Shock and Vibration Digest. Volume 16, Number 9

    DTIC Science & Technology

    1984-09-01

    The behavior of the interface between dry Ottawa sand and concrete has been studied using a new device developed for ... Key Words: Underground structures, Concretes ... National Aeronautical Establishment, Ottawa, Ontario, Canada. Rept. No. NAE-AN-7, NRC-21276, 153 pp (Apr 1983). Recently developed analytical models for the ... elements. The final phase of validation included simulation of dynamic tests of production ...

  18. Simultaneous quantification of five major active components in capsules of the traditional Chinese medicine ‘Shu-Jin-Zhi-Tong’ by high performance liquid chromatography

    PubMed Central

    Yang, Xing-Xin; Zhang, Xiao-Xia; Chang, Rui-Miao; Wang, Yan-Wei; Li, Xiao-Ni

    2011-01-01

    A simple and reliable high performance liquid chromatography (HPLC) method has been developed for the simultaneous quantification of five major bioactive components in ‘Shu-Jin-Zhi-Tong’ capsules (SJZTC), for the purposes of quality control of this commonly prescribed traditional Chinese medicine. Under the optimum conditions, excellent separation was achieved, and the assay was fully validated in terms of linearity, precision, repeatability, stability and accuracy. The validated method was applied successfully to the determination of the five compounds in SJZTC samples from different production batches. The HPLC method can be used as a valid analytical method to evaluate the intrinsic quality of SJZTC. PMID:29403711
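
    Two of the validation characteristics named above, linearity and accuracy, are commonly checked as in this sketch: a least-squares calibration curve with an r² criterion, and percent recovery of a spiked sample. All numbers are invented and not tied to the SJZTC assay.

```python
# Sketch of HPLC calibration linearity and spike-recovery accuracy
# on hypothetical data for a single analyte.
import numpy as np

# Hypothetical calibration standards.
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])        # ug/mL
peak_area = np.array([52.0, 101.0, 255.0, 498.0, 1003.0])

slope, intercept = np.polyfit(conc, peak_area, 1)
predicted = slope * conc + intercept
ss_res = ((peak_area - predicted) ** 2).sum()
ss_tot = ((peak_area - peak_area.mean()) ** 2).sum()
r_squared = 1 - ss_res / ss_tot   # linearity criterion, e.g. r^2 >= 0.999

# Accuracy: recovery of a sample spiked at a known concentration.
spiked_conc = 40.0
measured_area = 402.0
found_conc = (measured_area - intercept) / slope
recovery_pct = 100 * found_conc / spiked_conc

print(f"r^2 = {r_squared:.4f}, recovery = {recovery_pct:.1f} %")
```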

  19. Investigation of the presence of β-hydroxy-β-methylbutyric acid and α-hydroxyisocaproic acid in bovine whole milk and fermented dairy products by a validated liquid chromatography-mass spectrometry method.

    PubMed

    Ehling, Stefan; Reddy, Todime M

    2014-02-19

    A simple, rugged, quantitative, and confirmatory method based on liquid chromatography-mass spectrometry was developed and comprehensively validated for the analysis of the leucine metabolites β-hydroxy-β-methylbutyric acid (HMB) and α-hydroxyisocaproic acid (HICA) in bovine whole milk and yogurt. Mean accuracy (90-110% for HMB and 85-115% for HICA) and total precision (<10% RSD in most cases, except for <20% RSD for HMB at the limit of quantitation) at four concentration levels across three validation runs have been determined. Limits of quantitation for HMB and HICA in whole milk were 20 and 5 μg/L, respectively. Measured concentrations of HMB and HICA were <20-29 and 32-37 μg/L, respectively, in bovine whole milk and <5 and 3.0-15.2 mg/L, respectively, in yogurt. These concentrations are insufficient by large margins to deliver any musculoskeletal benefits, and fortification of milk and dairy products with HMB and/or HICA appears to be justified.

  20. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; ONeal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2005-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial and moderate resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists are also using the SSC V&V site to characterize thermal infrared systems and active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  1. Assessment of Magical Beliefs about Food and Health.

    PubMed

    Lindeman, M; Keskivaara, P; Roschier, M

    2000-03-01

    The Magical Beliefs About Food and Health scale (MFH) was developed to assess individual differences in the tendency to adopt eating and health instructions that many magazines, health care books and food ideologies regard as valid but which obey universal laws of similarity and contagion. In a study of 216 individuals, the total MFH score showed good internal consistency and it was associated with various validity criteria as hypothesized (e.g. vegetarianism and other ideological commitments to food choice, female gender, increased neuroticism, experiential thinking, positive attitudes towards alternative medicine, low sensation seeking and endorsement of universalism values). Factor analysis yielded two factors: General Magical Beliefs and Animal Products as Food Contaminants. In addition, three other items (the Animal Products as Personality Contaminants scale) cross-loaded on the two factors. The factor structure and test-retest reliability were confirmed with separate samples. The results showed that the total MFH score is a reliable and valid measure of magical food and health beliefs, and that the subscales may prove useful when a multidimensional assessment of magical beliefs is needed.

  2. Examining the ecological validity of the Talent Development Environment Questionnaire.

    PubMed

    Martindale, Russell J J; Collins, Dave; Douglas, Carl; Whike, Ally

    2013-01-01

    It is clear that high class expertise and effective practice exists within many talent development environments across the world. However, there is also a general consensus that widespread evidence-based policy and practice is lacking. As such, it is crucial to develop solutions which can facilitate effective dissemination of knowledge and promotion of evidence-based talent development systems. While the Talent Development Environment Questionnaire (Martindale et al., 2010) provides a method through which this could be facilitated, its ecological validity has remained untested. As such, this study aimed to investigate the real world applicability of the questionnaire through discriminant function analysis. Athletes across ten distinct regional squads and academies were identified and separated into two broad levels, 'higher quality' (n = 48) and 'lower quality' (n = 51) environments, based on their process quality and productivity. Results revealed that the Talent Development Environment Questionnaire was able to discriminate with 77.8% accuracy. Furthermore, in addition to the questionnaire as a whole, two individual features, 'quality preparation' (P < 0.01) and 'understanding the athlete' (P < 0.01), were found to be significant discriminators. In conclusion, the results indicate robust structural properties and sound ecological validity, allowing the questionnaire to be used with more confidence in applied and research settings.

  3. NASA's Carbon Monitoring System (CMS) Applications and Application Readiness Levels (ARLs)-An assessment of how all CMS ARLs provide societal benefit.

    NASA Astrophysics Data System (ADS)

    Escobar, V. M.; Sepulveda Carlo, E.; Delgado Arias, S.

    2016-12-01

    During the past six years, the NASA Carbon Monitoring System (CMS) Applications effort has been engaging with stakeholders in an effort to make the 52 CMS projects user-friendly and policy relevant. Congressionally directed, the CMS initiative is a NASA endeavor providing carbon data products that help characterize and understand carbon sources and sinks at local and international scales. All data are freely available and scaled for local, state, regional, national and international-level resource management. To facilitate user feedback during development, as well as understanding of the types of use and application the CMS data products can support, the Applications project uses the NASA Applied Sciences Program's nine-step Application Readiness Level (ARL) indices. These are used to track and manage the progression and distribution of funded projects. ARLs are an adaptation of NASA's technology readiness levels (TRLs), used for managing technology and risk, and reflect the three main tiers of a project: research, development and deployment. The ARLs are scaled from 1 to 9, from research and development (ARL 1) to operational and/or decision-making-ready products (ARL 9). The ARLs can be broken up into three phases: Phase 1, discovery and feasibility (ARL 1-3); Phase 2, development, testing and validation (ARL 4-6); and Phase 3, integration into partners' systems (ARL 7-9). The ARLs are designed to inform both scientist and end user of the product's maturity and application capability. The CMS initiative has products that range across all ARLs, providing societal benefit at multiple scales. Lower-ARL products contribute to formal documents such as the IPCC reports, while others at higher levels provide decision support quantifying the value of carbon data for greenhouse gas (GHG) reduction planning. Most CMS products are at ARL 5 (validation of a product in a relevant environment), meaning the CMS carbon science is actively in a state of science-user engagement. For the user community, ARLs are a litmus test for knowing the type of user feedback and advocacy that can be implemented into the product design. For the scientist, ARLs help communicate (1) the maturity of their science to users who would like to use it for decision making and (2) the intended use of the product.
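    The three-phase breakdown of the ARL scale described above can be sketched as a simple lookup. This is an illustrative mapping only; the function name and phase labels are not NASA-defined identifiers:

```python
# Hypothetical sketch: mapping Application Readiness Levels (ARL 1-9)
# to the three phases described in the abstract. Labels are illustrative.

def arl_phase(arl: int) -> str:
    """Return the project phase for a given Application Readiness Level."""
    if not 1 <= arl <= 9:
        raise ValueError("ARL must be between 1 and 9")
    if arl <= 3:
        return "Phase 1: discovery and feasibility"
    if arl <= 6:
        return "Phase 2: development, testing and validation"
    return "Phase 3: integration into partner systems"

print(arl_phase(5))  # most CMS products sit at ARL 5
```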

  4. Consumer preference models: fuzzy theory approach

    NASA Astrophysics Data System (ADS)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
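    The linguistic variables that such fuzzy set preference models represent are built from membership functions; a toy sketch of one such function, with breakpoints invented for illustration (not taken from the article):

```python
# Triangular fuzzy membership function: degree to which a crisp value x
# belongs to a linguistic category (e.g. an "affordable" price).
# The breakpoints a, b, c below are illustrative assumptions.

def triangular(x, a, b, c):
    """Triangular membership: 0 at a, rising to 1 at b, falling to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degree to which a $25 price is "affordable" for breakpoints (10, 20, 40).
print(triangular(25.0, 10.0, 20.0, 40.0))
```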

  5. Bio-markers: traceability in food safety issues.

    PubMed

    Raspor, Peter

    2005-01-01

    Research and practice are focusing on the development, validation and harmonization of technologies and methodologies to ensure a complete traceability process throughout the food chain. The main goals are: scale-up, implementation and validation of methods in whole food chains, assurance of authenticity, validity of labelling and application of HACCP (hazard analysis and critical control point) to the entire food chain. The aim of the current review is to summarize the scientific and technological basis for ensuring complete traceability. Tracing and tracking (traceability) of foods are complex processes because of the (bio)markers, technical solutions and differing circumstances in the different technologies that produce various foods (processed, semi-processed or raw). Since food is produced for human or animal consumption, we need suitable markers that are stable and traceable along the whole production chain. Specific biomarkers can have a function both in technology and in nutrition. Such an approach would make this development faster and more comprehensive and would make it possible to monitor the effect of food with the same set of biomarkers in the consumer. This would help to develop and implement food safety standards based on the real physiological function of particular food components.

  6. Electrochemical sensors for identifying pyocyanin production in clinical Pseudomonas aeruginosa isolates.

    PubMed

    Sismaet, Hunter J; Pinto, Ameet J; Goluch, Edgar D

    2017-11-15

    In clinical practice, delays in obtaining culture results impact patient care and the ability to tailor antibiotic therapy. Despite the advancement of rapid molecular diagnostics, the use of plate cultures inoculated from swab samples continues to be the standard practice in clinical care. Because the inoculation culture process can take between 24 and 48 h before a positive identification test can be run, there is an unmet need to develop rapid throughput methods for bacterial identification. Previous work has shown that pyocyanin can be used as a rapid, redox-active biomarker for identifying Pseudomonas aeruginosa in clinical infections. However, further validation is needed to confirm pyocyanin production occurs in all clinical strains of P. aeruginosa. Here, we validate this electrochemical detection strategy using clinical isolates obtained from patients with hospital-acquired infections or with cystic fibrosis. Square-wave voltammetric scans of 94 different clinical P. aeruginosa isolates were taken to measure the concentration of pyocyanin. The results showed that all isolates produced measurable concentrations of pyocyanin, with production rates correlated with patient symptoms and comorbidity. Further bioinformatics analysis confirmed that 1649 genetically sequenced strains (99.9%) of P. aeruginosa possess the two genes (PhzM and PhzS) necessary to produce pyocyanin, supporting the specificity of this biomarker. Confirming the production of pyocyanin by all clinically-relevant strains of P. aeruginosa is a significant step towards validating this strategy for rapid, point-of-care diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Influence of process parameters on content uniformity of a low dose active pharmaceutical ingredient in a tablet formulation according to GMP.

    PubMed

    Muselík, Jan; Franc, Aleš; Doležel, Petr; Goněc, Roman; Krondlová, Anna; Lukášová, Ivana

    2014-09-01

    The article describes the development and production of tablets using direct compression of powder mixtures. The aim was to describe the impact of filler particle size and the time of lubricant addition during mixing on content uniformity according to the Good Manufacturing Practice (GMP) process validation requirements. Processes are regulated by complex directives, forcing the producers to validate, using sophisticated methods, the content uniformity of intermediates as well as final products. Cutting down of production time and material, shortening of analyses, and fast and reliable statistical evaluation of results can reduce the final price without affecting product quality. The manufacturing process of directly compressed tablets containing the low dose active pharmaceutical ingredient (API) warfarin, with content uniformity passing validation criteria, is used as a model example. Statistical methods have proved that the manufacturing process is reproducible. Methods suitable for elucidation of various properties of the final blend, e.g., measurement of electrostatic charge by Faraday pail and evaluation of mutual influences of researched variables by partial least square (PLS) regression, were used. Using these methods, it was proved that the filler with higher particle size increased the content uniformity of both blends and the ensuing tablets. Addition of the lubricant, magnesium stearate, during the blending process improved the content uniformity of blends containing the filler with larger particles. This seems to be caused by reduced sampling error due to the suppression of electrostatic charge.

  8. Protocol for the validation of microbiological control of cellular products according to German regulators' recommendations--Boon and Bane for the manufacturer.

    PubMed

    Störmer, M; Radojska, S; Hos, N J; Gathof, B S

    2015-04-01

    In order to generate standardized conditions for the microbiological control of HPCs, the PEI recommended defined steps for validation that will lead to extensive validation as shown in this study, where a possible validation principle for the microbiological control of allogeneic SCPs is presented. Although it could be demonstrated that automated culture improves microbial safety of cellular products, the requirement for extensive validation studies needs to be considered. © 2014 International Society of Blood Transfusion.

  9. Effects of export concentration on CO2 emissions in developed countries: an empirical analysis.

    PubMed

    Apergis, Nicholas; Can, Muhlis; Gozgor, Giray; Lau, Chi Keung Marco

    2018-03-08

    This paper provides the evidence on the short- and the long-run effects of the export product concentration on the level of CO2 emissions in 19 developed (high-income) economies, spanning the period 1962-2010. To this end, the paper makes use of the nonlinear panel unit root and cointegration tests with multiple endogenous structural breaks. It also considers the mean group estimations, the autoregressive distributed lag model, and the panel quantile regression estimations. The findings illustrate that the environmental Kuznets curve (EKC) hypothesis is valid in the panel dataset of 19 developed economies. In addition, it documents that a higher level of the product concentration of exports leads to lower CO2 emissions. The results from the panel quantile regressions also indicate that the effect of the export product concentration upon the per capita CO2 emissions is relatively high at the higher quantiles.

  10. Hydrazine Catalyst Production: Sustaining S-405 Technology

    NASA Technical Reports Server (NTRS)

    Wucherer, E. J.; Cook, Timothy; Stiefel, Mark; Humphries, Randy, Jr.; Parker, Janet

    2003-01-01

    The development of the iridium-based Shell 405 catalyst for spontaneous decomposition of hydrazine was one of the key enabling technologies for today's spacecraft and launch vehicles. To ensure that this crucial technology was not lost when Shell elected to exit the business, Aerojet, supported by NASA, has developed a dedicated catalyst production facility that will supply catalyst for future spacecraft and launch vehicle requirements. We have undertaken a program to transfer catalyst production from Shell Chemical USA (Houston, TX) to Aerojet's Redmond, WA location. This technology transition was aided by Aerojet's 30 years of catalyst manufacturing experience and NASA diligence and support in sustaining essential technologies. The facility has produced and tested S-405 catalyst to existing Shell 405 specifications and standards. Our presentation will describe the technology transition effort including development of the manufacturing facility, capture of the manufacturing process, test equipment validation, initial batch build and final testing.

  11. Status of the ion sources developments for the Spiral2 project at GANIL

    NASA Astrophysics Data System (ADS)

    Lehérissier, P.; Bajeat, O.; Barué, C.; Canet, C.; Dubois, M.; Dupuis, M.; Flambard, J. L.; Frigot, R.; Jardin, P.; Leboucher, C.; Lemagnen, F.; Maunoury, L.; Osmond, B.; Pacquet, J. Y.; Pichard, A.; Thuillier, T.; Peaucelle, C.

    2012-02-01

    The SPIRAL 2 facility is now under construction and will deliver either stable or radioactive ion beams. First tests of nickel beam production have been performed at GANIL with a new version of the large capacity oven, and a calcium beam has been produced on the heavy ion low energy beam transport line of SPIRAL 2, installed at LPSC Grenoble. For the production of radioactive beams, several target/ion-source systems (TISSs) are under development at GANIL, such as the 2.45 GHz electron cyclotron resonance ion source, the surface ionization source, and the oven prototype for heating the uranium carbide target up to 2000 °C. The existing test bench has been upgraded for these developments, and a new one, dedicated to the validation of the TISS before mounting in the production module, is under design. Results and current status of these activities are presented.

  12. SMERGE: A multi-decadal root-zone soil moisture product for CONUS

    NASA Astrophysics Data System (ADS)

    Crow, W. T.; Dong, J.; Tobin, K. J.; Torres, R.

    2017-12-01

    Multi-decadal root-zone soil moisture products are of value for a range of water resource and climate applications. The NASA-funded root-zone soil moisture merging project (SMERGE) seeks to develop such products through the optimal merging of land surface model predictions with surface soil moisture retrievals acquired from multi-sensor remote sensing products. This presentation will describe the creation and validation of a daily, multi-decadal (1979-2015), vertically-integrated (both surface to 40 cm and surface to 100 cm), 0.125-degree root-zone product over the contiguous United States (CONUS). The modeling backbone of the system is based on hourly root-zone soil moisture simulations generated by the Noah model (v3.2) operating within the North American Land Data Assimilation System (NLDAS-2). Remotely-sensed surface soil moisture retrievals are taken from the multi-sensor European Space Agency Climate Change Initiative soil moisture data set (ESA CCI SM). In particular, the talk will detail: 1) the exponential smoothing approach used to convert surface ESA CCI SM retrievals into root-zone soil moisture estimates, 2) the averaging technique applied to merge (temporally-sporadic) remotely-sensed with (continuous) NLDAS-2 land surface model estimates of root-zone soil moisture into the unified SMERGE product, and 3) the validation of the SMERGE product using long-term, ground-based soil moisture datasets available within CONUS.
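    The exponential smoothing step described in point 1 is commonly implemented as a recursive filter over sporadic surface retrievals. A minimal sketch follows; the characteristic time scale T and the observation series are illustrative assumptions, not SMERGE's actual configuration:

```python
# Sketch of a recursive exponential filter that propagates sporadic surface
# soil moisture retrievals to a root-zone estimate. T (days) and the sample
# series below are invented for illustration, not SMERGE's settings.
import math

def exp_filter(times, surface_sm, T=20.0):
    """Recursively smooth surface retrievals into a root-zone estimate."""
    swi = [surface_sm[0]]   # initialize with the first retrieval
    gain = 1.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]           # gap between observations
        gain = gain / (gain + math.exp(-dt / T))
        swi.append(swi[-1] + gain * (surface_sm[i] - swi[-1]))
    return swi

times = [0, 3, 5, 10, 14]                      # observation days (sporadic)
surface = [0.30, 0.25, 0.35, 0.20, 0.28]       # volumetric surface soil moisture
print(exp_filter(times, surface))
```

    Because the gain stays between 0 and 1, each update is a convex combination of the previous root-zone estimate and the new surface retrieval, so the filtered series is smoother than, and bounded by, the inputs.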

  13. Microbial production of polyhydroxybutyrate with tailor-made properties: an integrated modelling approach and experimental validation.

    PubMed

    Penloglou, Giannis; Chatzidoukas, Christos; Kiparissides, Costas

    2012-01-01

    The microbial production of polyhydroxybutyrate (PHB) is a complex process in which the final quantity and quality of the PHB depend on a large number of process operating variables. Consequently, the design and optimal dynamic operation of a microbial process for the efficient production of PHB with tailor-made molecular properties is an extremely interesting problem. The present study investigates how key process operating variables (i.e., nutritional and aeration conditions) affect the biomass production rate and the PHB accumulation in the cells and its associated molecular weight distribution. A combined metabolic/polymerization/macroscopic modelling approach, relating the process performance and product quality with the process variables, was developed and validated using an extensive series of experiments and measurements. The model predicts the dynamic evolution of the biomass growth, the polymer accumulation, the consumption of carbon and nitrogen sources and the average molecular weights of the PHB in a bioreactor, under batch and fed-batch operating conditions. The proposed integrated model was used for the model-based optimization of the production of PHB with tailor-made molecular properties in Azohydromonas lata bacteria. The process optimization led to a high intracellular PHB accumulation (up to 95% g of PHB per g of DCW) and the production of different grades (i.e., different molecular weight distributions) of PHB. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Vending machine assessment methodology. A systematic review.

    PubMed

    Matthews, Melissa A; Horacek, Tanya M

    2015-07-01

    The nutritional quality of food and beverage products sold in vending machines has been implicated as a contributing factor to the development of an obesogenic food environment. How comprehensive, reliable, and valid are the current assessment tools for vending machines to support or refute these claims? A systematic review was conducted to summarize, compare, and evaluate the current methodologies and available tools for vending machine assessment. A total of 24 relevant research studies published between 1981 and 2013 met inclusion criteria for this review. The methodological variables reviewed in this study include assessment tool type, study location, machine accessibility, product availability, healthfulness criteria, portion size, price, product promotion, and quality of scientific practice. There were wide variations in the depth of the assessment methodologies and product healthfulness criteria utilized among the reviewed studies. Of the reviewed studies, 39% evaluated machine accessibility, 91% evaluated product availability, 96% established healthfulness criteria, 70% evaluated portion size, 48% evaluated price, 52% evaluated product promotion, and 22% evaluated the quality of scientific practice. Of all reviewed articles, 87% reached conclusions that provided insight into the healthfulness of vended products and/or vending environment. Product healthfulness criteria and complexity for snack and beverage products was also found to be variable between the reviewed studies. These findings make it difficult to compare results between studies. A universal, valid, and reliable vending machine assessment tool that is comprehensive yet user-friendly is recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. The construct of food involvement in behavioral research: scale development and validation.

    PubMed

    Bell, Rick; Marshall, David W

    2003-06-01

    The construct of involvement has been found to influence brand loyalty, product information search processing, responses to advertising communications, diffusion of innovations, and ultimately, product choice decisions. Traditionally, involvement has been defined as being a characteristic of either a product or of an individual. In the present research, we make an assumption that an individual's 'food involvement' is a somewhat stable characteristic and we hypothesized that involvement with foods would vary between individuals, that individuals who are more highly involved with food would be better able to discriminate between a set of food samples than would less food involved individuals, and that this discrimination would operate both in affective and perceptive relative judgments. Using standard scale construction techniques, we developed a measure of the characteristic of food involvement, based on activities relating to food acquisition, preparation, cooking, eating and disposal. After several iterations, a final 12-item measure was found to have good test-retest reliability and internal consistency within two subscales. A behavioral validation study demonstrated that measures of food involvement were associated with discrimination and hedonic ratings for a range of foods in a laboratory setting. These findings suggest that food involvement, as measured by the Food Involvement Scale, may be an important mediator to consider when undertaking research with food and food habits.
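    Internal consistency of the kind reported for the two subscales above is conventionally quantified with Cronbach's alpha; a minimal sketch using fabricated Likert responses (not the study's data):

```python
# Cronbach's alpha, the usual internal-consistency statistic for a scale
# like the 12-item Food Involvement Scale. The response matrix below is
# fabricated for illustration; it is not data from the study.

def cronbach_alpha(items):
    """items: one list of scores per item, all of equal length (respondents)."""
    k = len(items)

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in items)
    n = len(items[0])
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Three items, five respondents (hypothetical 1-5 Likert responses).
scores = [[4, 5, 3, 4, 2],
          [4, 4, 3, 5, 2],
          [5, 4, 2, 4, 3]]
print(round(cronbach_alpha(scores), 3))
```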

  16. Development of an effective dose coefficient database using a computational human phantom and Monte Carlo simulations to evaluate exposure dose for the usage of NORM-added consumer products.

    PubMed

    Yoo, Do Hyeon; Shin, Wook-Geun; Lee, Jaekook; Yeom, Yeon Soo; Kim, Chan Hyeong; Chang, Byung-Uck; Min, Chul Hee

    2017-11-01

    After the Fukushima accident in Japan, the Korean Government implemented the "Act on Protective Action Guidelines Against Radiation in the Natural Environment" to regulate unnecessary radiation exposure to the public. However, despite the law, which came into effect in July 2012, an appropriate method to evaluate the equivalent and effective doses from naturally occurring radioactive material (NORM) in consumer products is not available. The aim of the present study is to develop and validate an effective dose coefficient database enabling the simple and correct evaluation of the effective dose due to the usage of NORM-added consumer products. To construct the database, we used a skin source method with a computational human phantom and Monte Carlo (MC) simulation. For the validation, the effective dose was compared between the database, using an interpolation method, and the original MC method. Our results showed similar equivalent doses across the 26 organs, with the corresponding average doses from the database and the MC calculations differing by less than 5%. The differences in the effective doses were even smaller, and the results generally show that equivalent and effective doses can be quickly calculated with the database with sufficient accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
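    A database lookup of the kind described, in which pre-tabulated dose coefficients are interpolated between grid points, can be sketched as follows. The energy grid and coefficient values are invented for illustration and do not come from the study:

```python
# Linear interpolation over a pre-tabulated (energy, dose coefficient) table,
# the generic mechanism behind interpolation-based dose databases.
# All numbers below are hypothetical placeholders.

def interpolate_coefficient(energy_kev, table):
    """Linearly interpolate a table of (energy, coefficient) pairs, sorted by energy."""
    energies = [e for e, _ in table]
    if not energies[0] <= energy_kev <= energies[-1]:
        raise ValueError("energy outside tabulated range")
    for (e0, c0), (e1, c1) in zip(table, table[1:]):
        if e0 <= energy_kev <= e1:
            frac = (energy_kev - e0) / (e1 - e0)
            return c0 + frac * (c1 - c0)

# Hypothetical effective dose coefficients on a coarse energy grid (keV).
table = [(100, 1.0e-18), (300, 2.5e-18), (600, 4.0e-18), (1000, 6.0e-18)]
print(interpolate_coefficient(450, table))
```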

  17. Verification of MCNP6.2 for Nuclear Criticality Safety Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-05-10

    Several suites of verification/validation benchmark problems were run in early 2017 to verify that the new production release of MCNP6.2 performs correctly for nuclear criticality safety (NCS) applications. MCNP6.2 results for several NCS validation suites were compared to the results from MCNP6.1 [1] and MCNP6.1.1 [2]. MCNP6.1 is the production version of MCNP® released in 2013, and MCNP6.1.1 is the update released in 2014. MCNP6.2 includes all of the standard features for NCS calculations that have been available for the past 15 years, along with new features for sensitivity-uncertainty based methods for NCS validation [3]. Results from the benchmark suites were compared with results from previous verification testing [4-8]. Criticality safety analysts should consider testing MCNP6.2 on their particular problems and validation suites. No further development of MCNP5 is planned. MCNP6.1 is now 4 years old, and MCNP6.1.1 is now 3 years old. In general, released versions of MCNP are supported only for about 5 years, due to resource limitations. All future MCNP improvements, bug fixes, user support, and new capabilities are targeted only to MCNP6.2 and beyond.

  18. Review of the cultivation program within the National Alliance for Advanced Biofuels and Bioproducts

    DOE PAGES

    Lammers, Peter J.; Huesemann, Michael; Boeing, Wiebke; ...

    2016-12-12

    The cultivation efforts within the National Alliance for Advanced Biofuels and Bioproducts (NAABB) were developed to address four major goals for the consortium: biomass production for downstream experimentation, development of new assessment tools for cultivation, development of new cultivation reactor technologies, and development of methods for robust cultivation. The NAABB consortium testbeds produced over 1500 kg of biomass for downstream processing. The biomass production included a number of model production strains, but also took into production some of the more promising strains found through the prospecting efforts of the consortium. Cultivation efforts at large scale are intensive and costly; therefore, the consortium developed tools and models to assess the productivity of strains under various environmental conditions at lab scale, and validated these against scaled outdoor production systems. Two new pond-based bioreactor designs were tested for their ability to minimize energy consumption while maintaining, and even exceeding, the productivity of algae cultivation compared to traditional systems. Also, molecular markers were developed for quality control and to facilitate detection of bacterial communities associated with cultivated algal species, including the Chlorella spp. pathogen, Vampirovibrio chlorellavorus, which was identified in at least two test site locations in Arizona and New Mexico. Finally, the consortium worked on understanding methods to utilize compromised municipal wastewater streams for cultivation. In conclusion, this review provides an overview of the cultivation methods and tools developed by the NAABB consortium to produce algae biomass, in robust low-energy systems, for biofuel production.

  19. Safety validation test equipment operation

    NASA Astrophysics Data System (ADS)

    Kurosaki, Tadaaki; Watanabe, Takashi

    1992-08-01

    An overview of the activities conducted on safety validation test equipment operation for materials used for NASA manned missions is presented. Safety validation tests, such as flammability, odor, and offgassing tests, were conducted in accordance with NASA-NHB-8060.1C using test subjects common with those used by NASA, and the equipment used was qualified for its functions and performance in accordance with NASDA-CR-99124 'Safety Validation Test Qualification Procedures.' Test procedure systems were established by preparing 'Common Procedures for Safety Validation Test' as well as test procedures for flammability, offgassing, and odor tests. The test operation organization, chaired by the General Manager of the Parts and Material Laboratory of NASDA (National Space Development Agency of Japan), was established, and the test leaders and operators in the organization were qualified in accordance with the specified procedures. One hundred and one tests had been conducted so far by the Parts and Material Laboratory according to the requests submitted by the manufacturers through the Space Station Group and the Safety and Product Assurance for Manned Systems Office.

  20. The CMEMS-Med-MFC-Biogeochemistry operational system: implementation of NRT and Multi-Year validation tools

    NASA Astrophysics Data System (ADS)

    Salon, Stefano; Cossarini, Gianpiero; Bolzon, Giorgio; Teruzzi, Anna

    2017-04-01

    The Mediterranean Monitoring and Forecasting Centre (Med-MFC) is one of the regional production centres of the EU Copernicus Marine Environment Monitoring Service (CMEMS). Med-MFC manages a suite of numerical model systems for the operational delivery of the CMEMS products, providing continuous monitoring and forecasting of the Mediterranean marine environment. The CMEMS products of fundamental biogeochemical variables (chlorophyll, nitrate, phosphate, oxygen, phytoplankton biomass, primary productivity, pH, pCO2) are organised as gridded datasets and are available at the marine.copernicus.eu web portal. Quantitative estimates of CMEMS product accuracy are prerequisites for releasing reliable information to intermediate users, end users and other downstream services. In particular, validation activities aim to deliver accuracy information on the model products and to serve as long-term monitoring of the performance of the modelling systems. The quality assessment of model output is implemented using a multiple-stage approach, inspired by the classic "GODAE 4 Classes" metrics and criteria (consistency, quality, performance and benefit). Firstly, pre-operational runs qualify the operational model system against historical data, also providing a verification of the improvements of the new model system release with respect to the previous version. Then, the near real time (NRT) validation delivers a sustained on-line skill assessment of the model analysis and forecast, relying on the relevant observations available in NRT (e.g. in situ, Bio-Argo and satellite observations). NRT validation results are produced on a weekly basis and published on the MEDEAF web portal (www.medeaf.inogs.it). On a quarterly basis, the integration of the NRT validation activities delivers a comprehensive view of the accuracy of model forecasts through the official CMEMS validation webpage. Multi-Year production (e.g. reanalysis runs) follows a similar procedure, and the validation is achieved using the same metrics on available historical observations (e.g. the World Ocean Atlas 2013 dataset). Results of the validation activities show that the comparison of the different variables of the CMEMS products with experimental data is feasible at different levels (i.e. both as skill assessment of the short-term forecast and as model consistency through different system versions) and at different spatial and temporal scales. In particular, the accuracy of some variables (chlorophyll, nitrate, oxygen) can be provided at weekly scale and sub-mesoscale, others (carbonate system, phosphate) at quarterly/annual and sub-basin scale, and others (phytoplankton biomass, primary production) only at the level of consistency of model functioning (e.g. literature- or climatology-based). Although a wide literature on model validation has been produced so far, maintaining a validation framework in the biogeochemical operational context that fulfils GODAE criteria is still a challenge. Recent results of the validation activities and a potential new validation framework at the Med-MFC will be presented in our contribution.
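    Skill assessment of the sort described above typically rests on simple pairwise metrics between model output and matched observations; a minimal sketch of two such metrics (bias and RMSE), with all values invented for illustration:

```python
# Basic forecast skill metrics against matched observations.
# The model/observation values below are illustrative, not CMEMS data.

def bias(model, obs):
    """Mean signed difference (model minus observation)."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error between model and observations."""
    return (sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)) ** 0.5

model = [0.21, 0.35, 0.18, 0.40]   # e.g. forecast chlorophyll (mg m-3)
obs   = [0.20, 0.30, 0.25, 0.38]   # matched observations
print(bias(model, obs), rmse(model, obs))
```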
