Theater Blood Application Was Not Effectively Developed and Implemented
2015-07-17
blood product by unit; and • monitor non-Food and Drug Administration Blood Product Testing. The CONOPS document also identified over 400 specific ... time of a transfusion. However, this requirement was not identified in the CONOPS document. Further, PEO DHCS officials provided a traceability ... the CONOPS document, requirements management database, and the traceability matrix increased the risk that the Theater Blood Application ...
Framework for the quality assurance of 'omics technologies considering GLP requirements.
Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben
2017-12-01
'Omics technologies are gaining importance to support regulatory toxicity studies. Prerequisites for performing 'omics studies considering GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Workshop Applying 'omics technologies in Chemical Risk Assessment. A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections by independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Dykema, John A.; Anderson, James G.
2006-06-01
A methodology to achieve spectral thermal radiance measurements from space with demonstrable on-orbit traceability to the International System of Units (SI) is described. This technique results in measurements of infrared spectral radiance R(ν̃), with spectral index ν̃ in cm⁻¹, with a relative combined uncertainty u_c[R(ν̃)] of 0.0015 (k = 1) for the average mid-infrared radiance emitted by the Earth. This combined uncertainty, expressed in brightness temperature units, is equivalent to ±0.1 K at 250 K at 750 cm⁻¹. This measurement goal is achieved by utilizing a new method for infrared scale realization combined with an instrument design optimized to minimize component uncertainties and admit tests of radiometric performance. The SI traceability of the instrument scale is established by evaluation against source-based and detector-based infrared scales in defined laboratory protocols before launch. A novel strategy is executed to ensure fidelity of on-orbit calibration to the pre-launch scale. This strategy for on-orbit validation relies on the overdetermination of instrument calibration. The pre-launch calibration against scales derived from physically independent paths to the base SI units provides the foundation for a critical analysis of the overdetermined on-orbit calibration to establish an SI-traceable estimate of the combined measurement uncertainty. Redundant calibration sources and built-in diagnostic tests to assess component measurement uncertainties verify the SI traceability of the instrument calibration over the mission lifetime. This measurement strategy can be realized by a practical instrument, a prototype Fourier-transform spectrometer under development for deployment on a small satellite. The measurement record resulting from the methodology described here meets the observational requirements for climate monitoring and climate model testing and improvement.
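The stated equivalence (a relative radiance uncertainty of 0.0015 corresponding to roughly ±0.1 K at 250 K and 750 cm⁻¹) can be checked from Planck's law; the sketch below assumes the standard second radiation constant and is an illustration, not the paper's own uncertainty analysis:

```python
import math

C2 = 1.4388  # second radiation constant, cm·K

def brightness_temp_uncertainty(nu_cm, T, rel_radiance_unc):
    """Convert a relative radiance uncertainty u_c[R]/R into an equivalent
    brightness-temperature uncertainty using the Planck-law sensitivity
    d(ln B)/dT = (C2*nu/T^2) * exp(x)/(exp(x) - 1), with x = C2*nu/T."""
    x = C2 * nu_cm / T
    dlnB_dT = (C2 * nu_cm / T**2) * math.exp(x) / math.expm1(x)
    return rel_radiance_unc / dlnB_dT

dT = brightness_temp_uncertainty(750.0, 250.0, 0.0015)
print(f"{dT:.3f} K")  # → 0.086 K, consistent with the quoted ±0.1 K
```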
Experimental and Metrological Basis for SI-Traceable Infrared Radiance Measurements From Space
NASA Astrophysics Data System (ADS)
Gero, P. J.; Dykema, J. A.; Anderson, J. G.; Leroy, S. S.
2007-12-01
In order to establish a climate benchmark record and to be useful in interdecadal climate forecast testing, satellite measurements of high spectral resolution infrared radiance must have uncertainty estimates that can be proven beyond a doubt. An uncertainty in radiance of about 1 part in 1000 is required for climate applications. This can be accomplished by appealing to the best measurement practices of the metrology community. The International System of Units (SI) is linked to fundamental physical properties of matter, and its units can be realized anywhere in the world without bias; tying a measurement to the SI therefore yields an accurate observation within a specified uncertainty. Achieving SI-traceable radiance measurements from space is a novel requirement, and requires specialized sensor design and a disciplined experimental approach. Infrared remote sensing satellite instruments typically employ blackbody calibration targets, which are tied to the SI through Planck's law and the definition of the Kelvin. The blackbody temperature and emissivity, however, must be determined accurately on-orbit in order for the blackbody emission scale to be SI-traceable. We outline a methodology of instrument design, pre-flight calibration and on-orbit diagnostics for realizing SI-traceable infrared radiance measurements. This instrument is intended as a component of the Climate Absolute Radiance and Refractivity Earth Observatory (CLARREO), a high priority recommendation of the National Research Council decadal survey. Calibration blackbodies for remote sensing differ from a perfect Planckian blackbody; thus the component uncertainties must be evaluated in order to confer traceability. We have performed traceability experiments in the laboratory to verify blackbody temperature, emissivity and the end-to-end radiance scale.
We discuss the design of the Harvard standard blackbody and an intercomparison campaign that will be conducted with the GIFTS blackbody (University of Wisconsin, Madison) and radiometric calibration facilities at NIST. The GIFTS blackbody is a high-performance space-qualified design with a new generation of on-orbit thermometer calibration via miniaturized fixed point cells. NIST facilities allow the step-by-step measurement of blackbody surface properties, thermal properties, on-axis emissivity, and end-to-end radiometric performance. These activities will lay the experimental groundwork for achieving SI-traceable infrared radiance measurements on a satellite instrument.
Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil
2016-11-17
Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends' preparation leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. This chapter presents a review of different chemometrics methods applied for the control of OO variability from metabolic and physical-chemical measured characteristics. The different chemometrics methods are illustrated by different study cases on monovarietal and blended OO originated from different countries. Chemometrics tools offer multiple ways for quantitative evaluations and qualitative control of complex chemical variability of OO in relation to several intrinsic and extrinsic factors.
Datla, R. U.; Rice, J. P.; Lykke, K. R.; Johnson, B. C.; Butler, J. J.; Xiong, X.
2011-01-01
The pre-launch characterization and calibration of remote sensing instruments should be planned and carried out in conjunction with their design and development to meet the mission requirements. The onboard calibrators such as blackbodies and the sensors such as spectral radiometers should be characterized and calibrated using SI traceable standards. In the case of earth remote sensing, this allows inter-comparison and intercalibration of different sensors in space to create global time series of climate records of high accuracy where some inevitable data gaps can be easily bridged. The recommended best practice guidelines for this pre-launch effort are presented based on experience gained at National Institute of Standards and Technology (NIST), National Aeronautics and Space Administration (NASA) and National Oceanic and Atmospheric Administration (NOAA) programs over the past two decades. The currently available radiometric standards and calibration facilities at NIST serving the remote sensing community are described. Examples of best practice calibrations and intercomparisons to build an SI (International System of Units) traceable uncertainty budget in the instrumentation used for preflight satellite sensor calibration and validation are presented. PMID:26989588
NASA Technical Reports Server (NTRS)
Kapoor, Manju M.; Mehta, Manju
2010-01-01
The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability is most beneficial when the system changes: if high-level requirements are modified, traceability identifies the low-level requirements that need to change in turn. Traceability ensures that requirements are appropriately and efficiently verified at various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
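The high-level to low-level dependency chain described above can be sketched as a minimal traceability-matrix check; the requirement IDs and link structure below are invented for illustration, not from the paper:

```python
# Hypothetical traceability matrix: high-level requirement -> derived
# low-level requirements.
trace_matrix = {
    "HLR-1": ["LLR-1.1", "LLR-1.2"],
    "HLR-2": ["LLR-2.1"],
    "HLR-3": [],  # gap: no low-level requirement traces to HLR-3
}

def untraced(matrix):
    """High-level requirements with no derived low-level requirement."""
    return sorted(hlr for hlr, llrs in matrix.items() if not llrs)

def impacted(matrix, changed_hlrs):
    """Low-level requirements to revisit when high-level requirements change."""
    return sorted({llr for hlr in changed_hlrs for llr in matrix.get(hlr, [])})

print(untraced(trace_matrix))             # ['HLR-3']
print(impacted(trace_matrix, {"HLR-1"}))  # ['LLR-1.1', 'LLR-1.2']
```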
The implementation of traceability systems.
Ammendrup, S; Barcos, L O
2006-08-01
Traceability is a tool to help countries meet their objectives of controlling, preventing and eradicating animal diseases. This article sets out the required steps in a traceability system. Before designing a system of traceability, one must identify the different characteristics that need to be traced throughout the various steps in the food production chain. The interaction between different sectors in defining the objectives and the resulting needs of a traceability system is fundamental. A clear legal framework is also indispensable. European Union (EU) legislation requires identification and registration for cattle, pigs, sheep and goats. For intra-EU trade these animals must be accompanied by a health certificate providing information on their identity and health status. The required identification is harmonised on an EU-wide basis with the aim of ensuring traceability for veterinary purposes. Furthermore EU legislation requires that the traceability of food, feed and food-producing animals be established at all stages of production.
Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.
Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro
2017-07-01
The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the test's analytical performance on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices.
Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Visualizing and Validating Metadata Traceability within the CDISC Standards.
Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine
2017-01-01
The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Interchange Standards Consortium (CDISC) data standards. Within regulated clinical research, traceability is a requirement and indicates that the analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) development of a traceability query capability for retrieval and visualization of traceability information.
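The graph-traversal idea in (2) can be sketched as a back-trace from an analysis result toward source data, flagging any artifact whose provenance dead-ends. The graph below is invented for illustration (the artifact names only loosely echo clinical-data conventions and are not the paper's actual metadata model):

```python
from collections import deque

# Hypothetical trace graph: each artifact lists the artifacts it was
# derived from. An empty list on a non-source artifact is a traceability gap.
trace_back = {
    "RESULT-T1": ["ADSL.AGE"],
    "ADSL.AGE": ["DM.AGE"],
    "DM.AGE": ["CRF.AGE"],
    "RESULT-T2": ["ADSL.SEX"],
    "ADSL.SEX": [],  # gap: no source recorded
}

SOURCES = {"CRF.AGE"}  # artifacts accepted as original source data

def trace_gaps(start, graph, sources):
    """Breadth-first traversal from an analysis result back toward source
    data; returns artifacts where the trace dead-ends before a source."""
    gaps, seen, queue = [], {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node in sources:
            continue  # reached original source data: trace complete here
        preds = graph.get(node, [])
        if not preds:
            gaps.append(node)  # dead end before a source
        for p in preds:
            if p not in seen:
                seen.add(p)
                queue.append(p)
    return gaps

print(trace_gaps("RESULT-T1", trace_back, SOURCES))  # [] -> fully traceable
print(trace_gaps("RESULT-T2", trace_back, SOURCES))  # ['ADSL.SEX'] -> gap
```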
DOE Office of Scientific and Technical Information (OSTI.GOV)
MCGREW, D.L.
2001-10-31
This Requirements Verification Report provides the traceability of how Project W-314 fulfilled the Project Development Specification requirements for the AN Farm to 200E Waste Transfer System Upgrade package.
Modeling traceability information and functionality requirement in export-oriented tilapia chain.
Zhang, Xiaoshuan; Feng, Jianying; Xu, Mark; Hu, Jinyou
2011-05-01
Tilapia has been named the 'food fish of the 21st century' and has become the most important farmed fish. China is the world leader in tilapia production and export. Identifying information and functional requirements is critical in developing an efficient traceability system because traceability has become a fundamental prerequisite for exporting aquaculture products. This paper examines the export-oriented tilapia chains and information flow in the chains, and identifies the key actors, information requirements and information-capturing points. Unified Modeling Language (UML) technology is adopted to describe the information and functionality requirements for chain traceability. The barriers to traceability system adoption are also identified. The results show that the traceability data consist of four categories that must be recorded by each link in the chain. The functionality requirements are classified into four categories, from fundamental information recording to decisive quality control. The top three barriers to traceability system adoption are: high costs of implementing the system; a lack of experienced and professional staff; and a low level of government involvement and support. Copyright © 2011 Society of Chemical Industry.
Traceability from a US perspective.
Smith, G C; Tatum, J D; Belk, K E; Scanga, J A; Grandin, T; Sofos, J N
2005-09-01
Traceability of a food consists of development of "an information trail that follows the food product's physical trail". Internationally, the US is lagging behind many countries in developing traceability systems for food in general and especially for livestock, poultry and their products. The US food industry is developing, implementing and maintaining traceability systems designed to improve food supply management, facilitate traceback for food safety and quality, and differentiate and market foods with subtle or undetectable quality attributes. Traceability, for livestock, poultry and meat, in its broadest context, can, could, or will eventually be used: (1) to ascertain origin and ownership, and to deter theft and misrepresentation, of animals and meat; (2) for surveillance, control and eradication of foreign animal diseases; (3) for biosecurity protection of the national livestock population; (4) for compliance with requirements of international customers; (5) for compliance with country-of-origin labeling requirements; (6) for improvement of supply-side management, distribution/delivery systems and inventory controls; (7) to facilitate value-based marketing; (8) to facilitate value-added marketing; (9) to isolate the source and extent of quality-control and food-safety problems; and (10) to minimize product recalls and make crisis management protocols more effective. Domestically and internationally, it has now become essential that producers, packers, processors, wholesalers, exporters and retailers assure that livestock, poultry and meat are identified, that record-keeping assures traceability through all or parts of the complete life-cycle, and that, in some cases, the source, the production-practices and/or the process of generating final products, can be verified. At issue, as the US develops traceback capabilities, will be the breadth, depth and precision of its specific traceability systems.
Towards automated traceability maintenance
Mäder, Patrick; Gotel, Orlena
2012-01-01
Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308
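The rule-directed update that Mäder and Gotel describe might be sketched as matching captured change events against predefined rules that say how impacted trace links are updated. The event shapes, rules, and link representation below are invented for illustration, not the authors' actual implementation:

```python
# Hypothetical trace links: (requirement, design-model element) pairs.
links = {("REQ-7", "Class:Order")}

def apply_event(event, links):
    """Match one captured change event against simple update rules and
    return the adjusted set of trace links."""
    kind = event["kind"]
    if kind == "rename":
        # Rule: element renamed -> retarget every link pointing at it.
        return {(src, event["new"]) if tgt == event["old"] else (src, tgt)
                for src, tgt in links}
    if kind == "delete":
        # Rule: element deleted -> drop its trace links.
        return {(src, tgt) for src, tgt in links if tgt != event["target"]}
    return links  # unrecognized event: leave links unchanged

links = apply_event(
    {"kind": "rename", "old": "Class:Order", "new": "Class:PurchaseOrder"},
    links,
)
print(links)  # {('REQ-7', 'Class:PurchaseOrder')}
```

A real tool would recognize development activities spanning several low-level events before choosing a rule; the sketch collapses that step into a single event per rule.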
The Expanding Role of Traceability in Seafood: Tools and Key Initiatives.
Lewis, Sara G; Boyle, Mariah
2017-08-01
In the last decade, a range of drivers within the seafood sector have incentivized the application of traceability to issues beyond food safety and inventory management. Some of the issues motivating the expanded use of traceability within the global seafood sector include: increased media attention on the legal and social risks within some seafood supply chains, governmental traceability requirements, private-sector sustainability commitments, and others. This article begins with an overview of these topics in the seafood industry, and why many nongovernment organizations (NGOs), companies, and government actors have turned to traceability as a tool to address them. We discuss how traceability connects to key requirements of environmental sustainability and social responsibility. Later, we review the range of traceability services, tools, software solutions, and the due diligence measures that are currently being leveraged within the seafood sector. The paper concludes with a discussion of several NGO- and industry-led traceability initiatives that are examples of seafood traceability improvements. © 2017 Institute of Food Technologists®.
An Initial Model of Requirements Traceability: An Empirical Study
1992-09-22
procedures have been used extensively in the study of human problem-solving, including such areas as general problem-solving behavior, physics problem ... been doing unless you have traceability." "Humans don't go back to the requirements enough." "Traceability should be extremely helpful with ... by constraints on its usage: ("Traceability needs to be something that humans can work with, not just a whip held over people." "Traceability should ...
Detection and traceability of genetically modified organisms in the food production chain.
Miraglia, M; Berdal, K G; Brera, C; Corbisier, P; Holst-Jensen, A; Kok, E J; Marvin, H J P; Schimmel, H; Rentsch, J; van Rie, J P P F; Zagon, J
2004-07-01
Both labelling and traceability of genetically modified organisms are current issues that are considered in trade and regulation. Currently, labelling of genetically modified foods containing detectable transgenic material is required by EU legislation. A proposed package of legislation would extend this labelling to foods without any traces of transgenics. These new legislations would also impose labelling and a traceability system based on documentation throughout the food and feed manufacture system. The regulatory issues of risk analysis and labelling are currently harmonised by Codex Alimentarius. The implementation and maintenance of the regulations necessitates sampling protocols and analytical methodologies that allow for accurate determination of the content of genetically modified organisms within a food and feed sample. Current methodologies for the analysis of genetically modified organisms are focused on one of two targets: the transgenic DNA inserted into, or the novel protein(s) expressed by, a genetically modified product. For most DNA-based detection methods, the polymerase chain reaction is employed. Items that need consideration in the use of DNA-based detection methods include the specificity, sensitivity, matrix effects, internal reference DNA, availability of external reference materials, hemizygosity versus homozygosity, extrachromosomal DNA, and international harmonisation. For most protein-based methods, enzyme-linked immunosorbent assays with antibodies binding the novel protein are employed. Consideration should be given to the selection of the antigen bound by the antibody, accuracy, validation, and matrix effects. Currently, validation of detection methods for analysis of genetically modified organisms is taking place. In addition, new methodologies are developed, including the use of microarrays, mass spectrometry, and surface plasmon resonance.
Challenges for GMO detection include the detection of transgenic material in materials with varying chromosome numbers. The existing and proposed regulatory EU requirements for traceability of genetically modified products fit within a broader tendency towards traceability of foods in general and, commercially, towards products that can be distinguished from each other. Traceability systems document the history of a product and may serve the purpose of both marketing and health protection. In this framework, segregation and identity preservation systems allow for the separation of genetically modified and non-modified products from "farm to fork". Implementation of these systems comes with specific technical requirements for each particular step of the food processing chain. In addition, the feasibility of traceability systems depends on a number of factors, including unique identifiers for each genetically modified product, detection methods, permissible levels of contamination, and financial costs. In conclusion, progress has been achieved in the field of sampling, detection, and traceability of genetically modified products, while some issues remain to be solved. For success, much will depend on the threshold level for adventitious contamination set by legislation. Copyright 2004 Elsevier Ltd.
Value-Based Requirements Traceability: Lessons Learned
NASA Astrophysics Data System (ADS)
Egyed, Alexander; Grünbacher, Paul; Heindl, Matthias; Biffl, Stefan
Traceability from requirements to code is mandated by numerous software development standards. These standards, however, are not explicit about the appropriate level of quality of trace links. From a technical perspective, trace quality should meet the needs of the intended trace utilizations. Unfortunately, long-term trace utilizations are typically unknown at the time of trace acquisition which represents a dilemma for many companies. This chapter suggests ways to balance the cost and benefits of requirements traceability. We present data from three case studies demonstrating that trace acquisition requires broad coverage but can tolerate imprecision. With this trade-off our lessons learned suggest a traceability strategy that (1) provides trace links more quickly, (2) refines trace links according to user-defined value considerations, and (3) supports the later refinement of trace links in case the initial value consideration has changed over time. The scope of our work considers the entire life cycle of traceability instead of just the creation of trace links.
Traceability of radiation protection instruments
NASA Astrophysics Data System (ADS)
Hino, Y.; Kurosawa, T.
2007-08-01
Radiation protection instruments are used in the daily measurement of dose and activity in workplaces and environments for safety management. Requirements for calibration certificates with traceability are increasing for these instruments, to ensure the consistency and reliability of measurement results. The present traceability scheme of radiation protection instruments for dose and activity measurements is described, together with related IEC/ISO requirements. Some examples of desirable future calibration systems using recent new technologies are also discussed, with the aim of establishing traceability at reasonable cost and with adequate reliability.
Design of agricultural product quality safety retrospective supervision system of Jiangsu province
NASA Astrophysics Data System (ADS)
Wang, Kun
2017-08-01
In stores and supermarkets, consumers can trace agricultural products through an electronic traceability card, querying their origin, planting, processing, packaging, testing and other important information, and thereby discovering problems. When quality and safety issues arise, responsibility for the problem can be identified. This paper designs a traceability supervision system for the quality and safety of agricultural products in Jiangsu Province. Based on an analysis of agricultural production and business processes, the construction goal of the Jiangsu agricultural product quality and safety traceability system is established; the specific functional and non-functional requirements of the traceability system are analyzed, providing a concrete target for its construction. The design of the Jiangsu quality and safety traceability system covers the overall architecture, the trace-code design and the system function modules.
Hickey, Caitlin; Bhatt, Tejas
2013-12-01
Fifty thought leaders in the area of food traceability met for a 3rd time to discuss methodologies and finalize the principles that define their vision for traceability. Participants in the summit included representatives from industry, trade associations, government, academia, consumer groups, and more. One main focus of this summit included a discussion on the current regulations and voluntary initiatives in place regarding traceability. Overall, it was recognized that the recommendations from this summit group would be more specific and stringent in comparison to these current regulations and initiatives. The participants sought to be leaders in the traceability arena, with their recommendations leading the industry to optimal traceability systems and methods. Participants agreed on many principles for their vision of traceability, emphasizing the importance of access to traceability data. They discussed having industry be asked for "basic" tracing data prior to the need for a large-scale investigation, having standards for sharing data, and having the data in electronic form. Participants foresaw the importance of capturing data electronically in the future, although they recognized that many firms do not currently do this. The group also saw a need for a transition period to implement changes, and to provide implementation training and resource aid to small businesses. Summit participants discussed specific definitions and examples for key data elements and critical tracking events that could be used by industry to capture tracing data at specific points within the supply chain. Overall, participants refined the goals of the summit group and started to identify specific ways to achieve those goals. © 2013 Institute of Food Technologists®
Morrison, Michael; Moraia, Linda Briceño; Steele, Jane C
2016-01-01
This paper describes a traceability system developed for the Stem cells for Biological Assays of Novel drugs and prediCtive toxiCology consortium. The system combines records and labels that link biological material across geographical locations and scientific processes, from sample donation to induced pluripotent stem cell line. The labeling system uses a unique identification number to link every aliquot of sample at every stage of the reprogramming pathway back to the original donor. Only staff at the clinical recruitment site can reconnect the unique identification number to the identifying details of a specific donor. This ensures the system meets ethical and legal requirements for protecting privacy while allowing full traceability of biological material. The system can be adapted to other projects and for use with different primary sample types.
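A minimal sketch of such a unique-identification labeling scheme (all identifiers, class and method names here are hypothetical illustrations, not the consortium's actual system): aliquots are linked to a donor pseudonym, while the mapping from pseudonym to real identity stays at the clinical site.

```python
import uuid

class TraceabilityRegistry:
    """Links every aliquot back to a donor pseudonym; the mapping from
    pseudonym to the donor's real identity is held only at the clinic."""

    def __init__(self):
        # aliquot_id -> (donor_pseudonym, process_step)
        self._aliquots = {}

    def register_donation(self):
        # Pseudonym generated at recruitment; real identity never leaves
        # the clinical recruitment site.
        return uuid.uuid4().hex

    def register_aliquot(self, donor_pseudonym, process_step):
        aliquot_id = uuid.uuid4().hex
        self._aliquots[aliquot_id] = (donor_pseudonym, process_step)
        return aliquot_id

    def trace(self, aliquot_id):
        # Full traceability to the pseudonym, but not to the donor's identity.
        return self._aliquots[aliquot_id]

registry = TraceabilityRegistry()
donor = registry.register_donation()
blood = registry.register_aliquot(donor, "blood sample")
ipsc = registry.register_aliquot(donor, "iPSC line")
print(registry.trace(ipsc))  # pseudonym plus process step, no personal data
```

Each downstream aliquot carries only the opaque identifier, so the privacy boundary sits entirely at the clinical site's pseudonym table.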
Metrology for hydrogen energy applications: a project to address normative requirements
NASA Astrophysics Data System (ADS)
Haloua, Frédérique; Bacquart, Thomas; Arrhenius, Karine; Delobelle, Benoît; Ent, Hugo
2018-03-01
Hydrogen represents a clean and storable energy solution that could meet worldwide energy demands and reduce greenhouse gas emissions. The joint research project (JRP) ‘Metrology for sustainable hydrogen energy applications’ addresses standardisation needs through pre- and co-normative metrology research in the fast-emerging sector of hydrogen fuel that meets the requirements of the European Directive 2014/94/EU by supplementing the revision of two ISO standards that are currently too generic to enable a sustainable implementation of hydrogen. The hydrogen purity dispensed at refueling points should comply with the technical specifications of ISO 14687-2 for fuel cell electric vehicles. The rapid progress of fuel cell technology now requires revising this standard towards less constraining limits for the 13 gaseous impurities. In parallel, optimized validated analytical methods are proposed to reduce the number of analyses. The study also aims at developing and validating traceable methods to accurately assess the hydrogen mass absorbed and stored in metal hydride tanks; this is a research axis for the revision of the ISO 16111 standard to develop this safe storage technique for hydrogen. The probability of hydrogen impurity presence affecting fuel cells and analytical techniques for traceable measurements of hydrogen impurities will be assessed, and new data on maximum concentrations of impurities based on degradation studies will be proposed. Novel validated methods for measuring the hydrogen mass absorbed in hydride tanks of the AB, AB2 and AB5 types referenced in ISO 16111 will be determined, as the methods currently available do not provide accurate results. The outputs will have a direct impact on the standardisation work for the ISO 16111 and ISO 14687-2 revisions in the relevant working groups of ISO/TC 197 ‘Hydrogen technologies’.
Traceability of Software Safety Requirements in Legacy Safety Critical Systems
NASA Technical Reports Server (NTRS)
Hill, Janice L.
2007-01-01
How can traceability of software safety requirements be created for legacy safety critical systems? Requirements in safety standards are imposed most times during contract negotiations. On the other hand, there are instances where safety standards are levied on legacy safety critical systems, some of which may be considered for reuse for new applications. Safety standards often specify that software development documentation include process-oriented and technical safety requirements, and also require that system and software safety analyses are performed supporting technical safety requirements implementation. So what can be done if the requisite documents for establishing and maintaining safety requirements traceability are not available?
Technology Infusion Challenges from a Decision Support Perspective
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Weisbin, C. R.
2009-01-01
In a restricted science budget environment with increasingly numerous required technology developments, the technology investment decisions within NASA are objectively more and more difficult to make such that the end results satisfy the technical objectives and all the organizational constraints. Under these conditions it is rationally desirable to build an investment portfolio which has the highest possible technology infusion rate. Arguably the path to infusion is subject to many influencing factors, but here only the challenges associated with the very initial stages are addressed: defining the needs and the subsequent investment decision-support process. It is conceivable that decision consistency, and possibly its quality, suffer when the decision-making process has limited or no traceability. This paper presents a structured decision-support framework aiming to provide traceable, auditable, infusion-driven recommendations towards a selection process in which these recommendations are used as reference points in further discussions among stakeholders. In this framework, addressing well-defined requirements, different measures of success can be defined based on traceability to specific selection criteria. As a direct result, even when using simplified decision models, the likelihood of infusion can be probed and consequently improved.
The specifics of dosimetry for food irradiation applications
NASA Astrophysics Data System (ADS)
Kuntz, Florent; Strasser, Alain
2016-12-01
Dose measurement applied to food irradiation is a very important and critical aspect of the process, and it is described in many standards and guides. The application of appropriate dosimetry tools, which helps to ensure the traceability of this measurement, is explained. A number of dosimeters available on the market are well studied, although their response should be characterized under routine processing conditions. When employed in low-energy radiation fields, these dosimeters may exhibit a specific response compared with the usual Cobalt-60 source irradiation; traceable calibration or assessment of a correction factor for this energy dependency is therefore mandatory. It should be noted that the absorbed dose is measured in the dosimeter itself and, unfortunately, not in or on the food product. Nevertheless, existing dosimetry systems fulfill all relevant requirements.
Schauzu, M
2004-09-01
Placing genetically modified (GM) plants and derived products on the European Union's (EU) market has been regulated by a Community Directive since 1990. This directive was complemented by a regulation specific for genetically modified and other novel foods in 1997. Specific labelling requirements have been applicable for GM foods since 1998. The law requires a pre-market safety assessment for which criteria have been elaborated and continuously adapted in accordance with the state of the art by national and international bodies and organisations. Consequently, only genetically modified products that have been demonstrated to be as safe as their conventional counterparts can be commercialized. However, the poor acceptance of genetically modified foods has led to a de facto moratorium since 1998. It is based on the lack of a qualified majority of EU member states necessary for authorization to place genetically modified plants and derived foods on the market. New Community Regulations are intended to end this moratorium by providing a harmonized and transparent safety assessment, a centralised authorization procedure, extended labelling provisions and a traceability system for genetically modified organisms (GMO) and derived food and feed.
Proceedings of a Meeting on Traceability for Ionizing Radiation Measurements
NASA Astrophysics Data System (ADS)
Heaton, H. T., II
1982-02-01
General concepts for traceability were presented from several perspectives. The national standards for radiation dosimetry, radioactivity measurements, and neutron measurements were described. Specific programs for achieving traceability to the national standards for radiation measurements in medical, occupational, and environmental applications were summarized.
Measurement uncertainty: Friend or foe?
Infusino, Ilenia; Panteghini, Mauro
2018-02-02
The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
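The combination of uncertainties along the traceability chain described above can be illustrated with a minimal sketch of the GUM-style root-sum-of-squares rule for uncorrelated contributions; the numerical values here are purely hypothetical, not from any real measuring system.

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares combination of uncorrelated standard
    uncertainties (GUM approach for a simple additive model)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical standard uncertainties (in %) for the links of a chain:
# higher-order reference material, manufacturer's calibrator,
# routine commercial measuring system.
chain = [0.5, 0.8, 1.2]
u_c = combined_uncertainty(chain)
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (approx. 95 %)
print(f"combined: {u_c:.2f} %, expanded (k=2): {U:.2f} %")
```

Because the contributions add in quadrature, the largest link dominates: reducing the routine system's 1.2 % term buys far more than polishing the 0.5 % reference, which is exactly the trade-off laboratories are asked to investigate.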
NASA Astrophysics Data System (ADS)
Leach, Richard; Haycocks, Jane; Jackson, Keith; Lewis, Andrew; Oldfield, Simon; Yacoot, Andrew
2001-03-01
The only difference between nanotechnology and many other fields of science or engineering is that of size. Control in manufacturing at the nanometre scale still requires accurate and traceable measurements whether one is attempting to machine optical quality glass or write one's company name in single atoms. A number of instruments have been developed at the National Physical Laboratory that address the measurement requirements of the nanotechnology community and provide traceability to the definition of the metre. The instruments discussed in this paper are an atomic force microscope and a surface texture measuring instrument with traceable metrology in all their operational axes, a combined optical and x-ray interferometer system that can be used to calibrate displacement transducers to subnanometre accuracy and a co-ordinate measuring machine with a working volume of (50 mm)³ and 50 nm volumetric accuracy.
This procedure includes the specifications and requirements that must be followed by gas manufacturers during the preparation of compressed cylinder gas Certified Reference Materials (CRM). A CRM is a certified gas standard prepared at a concentration that does not exceed + or - ...
Breed traceability of buffalo meat using microsatellite genotyping technique.
Kannur, Bheemashankar H; Fairoze, Md Nadeem; Girish, P S; Karabasanavar, Nagappa; Rudresh, B H
2017-02-01
Although buffalo has emerged as a major meat producing animal in Asia, research on breed traceability has so far focused mainly on cattle (beef). This research gap has impelled the development and validation of buffalo breed traceability using a set of eight microsatellite (STR) markers in seven Indian buffalo breeds (Bhadawari, Jaffaarabadi, Murrah, Mehsana, Nagpuri, Pandharpuri and Surti). The probability of two individuals sharing the same profile at a specific locus was computed considering different numbers of STRs, with allele pooling within breed and population. Match probabilities per breed were considered and the six most polymorphic loci were genotyped. Of the eight microsatellite markers studied, markers CSSMO47, DRB3 and CSSM060 were found most polymorphic. The developed technique was validated with known and unknown blood and meat samples, wherein 24 of the 25 samples tested were correctly traced genetically. The results of this study show the potential applications of the methodology and should encourage other researchers to address the problem of buffalo traceability so as to create a worldwide archive of breed-specific genotypes. This work is the first report of breed traceability of buffalo meat utilizing a microsatellite genotyping technique.
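The match-probability computation referred to above can be sketched with the standard probability-of-identity formula for one locus, multiplied across independent loci; the allele frequencies below are purely illustrative, not the published values for these breeds.

```python
from itertools import combinations

def match_probability(allele_freqs):
    """Probability that two random individuals share the same genotype at
    one locus (probability of identity), given its allele frequencies."""
    homozygote = sum(p ** 4 for p in allele_freqs)
    heterozygote = sum((2 * p * q) ** 2
                       for p, q in combinations(allele_freqs, 2))
    return homozygote + heterozygote

def combined_match_probability(loci):
    """Across independent loci the per-locus probabilities multiply,
    which is why adding STR markers sharpens traceability so quickly."""
    prob = 1.0
    for freqs in loci:
        prob *= match_probability(freqs)
    return prob

# Hypothetical allele frequencies for three STR loci:
loci = [[0.4, 0.3, 0.2, 0.1], [0.5, 0.3, 0.2], [0.6, 0.4]]
print(combined_match_probability(loci))
```

Even a modestly polymorphic panel drives the combined probability down fast, which is the rationale for selecting the most polymorphic markers.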
Traceability in healthcare: crossing boundaries.
Lovis, C
2008-01-01
This paper is a survey of the problem of traceability in healthcare. Traceability covers many different aspects and its understanding varies among different players. In supply chains and retail, traceability usually covers aspects pertaining to logistics. The challenge is to keep track of manufactured objects and their locations in production and distribution processes. In the food industry, traceability has received a lot of attention because of public health problems related to infectious diseases. For instance, in Europe, the challenge of traceability has been to track meat from the living animal to the shelf. In the health sector, traceability has mostly been involved in patient safety around human products, such as blood derivative contaminants, or implanted devices and prostheses such as mammary implants. There are growing interests involving traceability in health related to drug safety, including the problem of counterfeited drugs, and to privacy. Traceability is also increasingly seen as a means to improve the efficiency of the logistics of care and a way to better understand costs and usage of resources. This survey reviews the literature and proposes a discussion based on the real use and needs of traceability in a large teaching hospital. Traceability in healthcare is at the crossroads of numerous needs. It is therefore of particular complexity and raises many new challenges. Identification management and entity tracking, from serialization of consumer goods in the supply chains to the identification of actors, patients, care providers, locations and processes, is a huge effort, tackling economical, political, ethical and technical challenges. New requirements are needed, not usually met in the supply chain, such as serialization and persistence in time. New problems arise, such as privacy and legal frameworks.
There are growing needs to increase traceability for drug products, related to drug safety, counterfeited drugs, and privacy. Technical problems around the reliability, robustness and efficiency of carriers remain to be resolved. There is a lot at stake. Traceability is a major aspect of the future in healthcare and requires the attention of the medical informatics community.
TF4SM: A Framework for Developing Traceability Solutions in Small Manufacturing Companies
Bordel Sánchez, Borja; Alcarria, Ramón; Martín, Diego; Robles, Tomás
2015-01-01
Nowadays, manufacturing processes have become highly complex. Besides, more and more, governmental institutions require companies to implement systems to trace a product’s life (especially for foods, clinical materials or similar items). In this paper, we propose a new framework, based on cyber-physical systems, for developing traceability systems in small manufacturing companies (which because of their size cannot implement other commercial products). We propose a general theoretical framework, study the requirements of these companies in relation to traceability systems, propose a reference architecture based on both previous elements and build the first minimum functional prototype, to compare our solution to a traditional tag-based traceability system. Results show that our system reduces the number of inefficiencies and reaction time. PMID:26610509
Traceability in hardness measurements: from the definition to industry
NASA Astrophysics Data System (ADS)
Germak, Alessandro; Herrmann, Konrad; Low, Samuel
2010-04-01
The measurement of hardness has been and continues to be of significant importance to many of the world's manufacturing industries. Conventional hardness testing is the most commonly used method for acceptance testing and production quality control of metals and metallic products. Instrumented indentation is one of the few techniques available for obtaining various property values for coatings and electronic products at the micrometre and nanometre dimensional scales. For these industries to be successful, it is critical that measurements made by suppliers and customers agree within some practical limits. To help assure this measurement agreement, a traceability chain for hardness measurement, from the hardness definition to industry, has developed and evolved over the past 100 years, but its development has been complicated. A hardness measurement value not only requires traceability of force, length and time measurements but also requires traceability of the hardness values measured by the hardness machine. These multiple traceability paths are needed because a hardness measurement is affected by other influence parameters that are often difficult to identify, quantify and correct. This paper describes the current situation of hardness measurement traceability that exists for the conventional hardness methods (i.e. Rockwell, Brinell, Vickers and Knoop hardness) and for special-application hardness and indentation methods (i.e. elastomer, dynamic, portable and instrumented indentation).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gröbner, Julian; Reda, Ibrahim; Wacker, Stefan
Atmospheric longwave irradiance is currently not metrologically traceable. Traceability requires formal comparisons in the framework of the CIPM MRA. A task team on Radiation has been created by the WMO to address these issues.
King, B
2001-11-01
The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.
Maintaining Traceability in an Evolving Distributed Computing Environment
NASA Astrophysics Data System (ADS)
Collier, I.; Wartel, R.
2015-12-01
The management of risk is fundamental to the operation of any distributed computing infrastructure. Identifying the cause of incidents is essential to prevent them from re-occurring. In addition, it is a goal to contain the impact of an incident while keeping services operational. The response to incidents needs to be commensurate with the scale of the problem to be acceptable. The minimum level of traceability for distributed computing infrastructure usage is to be able to identify the source of all actions (executables, file transfers, pilot jobs, portal jobs, etc.) and the individual who initiated them. In addition, sufficiently fine-grained controls, such as blocking the originating user and monitoring to detect abnormal behaviour, are necessary for keeping services operational. It is essential to be able to understand the cause and to fix any problems before re-enabling access for the user. The aim is to be able to answer the basic questions who, what, where, and when concerning any incident. This requires retaining all relevant information, including timestamps and the digital identity of the user, sufficient to identify, for each service instance, every security event, including at least the following: connect, authenticate, authorize (including identity changes) and disconnect. In traditional grid infrastructures (WLCG, EGI, OSG, etc.) best practices and procedures for gathering and maintaining the information required to maintain traceability are well established. In particular, sites collect and store the information required to ensure traceability of events at their sites. With the increased use of virtualisation and of private and public clouds for HEP workloads, established procedures, which are unable to see 'inside' running virtual machines, no longer capture all the information required.
Maintaining traceability will at least involve a shift of responsibility from sites to Virtual Organisations (VOs) bringing with it new requirements for their logging infrastructures. VOs indeed need to fulfil a new operational role and become fully active participants in the incident response process. We present an analysis of the changing requirements to maintain traceability for virtualised and cloud based workflows with particular reference to the work of the WLCG Traceability Working Group.
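The who/what/where/when requirement described above can be sketched as a structured security-event record; the field names, identities and hostnames here are illustrative assumptions, not the WLCG logging schema.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SecurityEvent:
    """Minimum traceability record: when, who, what, where."""
    timestamp: float        # when the event occurred (epoch seconds)
    user_identity: str      # who: digital identity, e.g. a certificate DN
    action: str             # what: connect / authenticate / authorize / disconnect
    service_instance: str   # where: the service that saw the event
    details: str = ""       # e.g. identity changes during authorization

def log_event(event, sink):
    """Append a serialized, timestamped record to a log sink."""
    sink.append(json.dumps(asdict(event)))

events = []
log_event(SecurityEvent(time.time(), "CN=Jane Analyst",
                        "authenticate", "ce01.example.org"), events)
log_event(SecurityEvent(time.time(), "CN=Jane Analyst",
                        "disconnect", "ce01.example.org"), events)
```

Keeping every record timestamped and tied to a digital identity is what lets an incident responder answer who did what, where, and when, regardless of whether the logs are produced by a site or by a VO's own infrastructure.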
A strategy for selecting data mining techniques in metabolomics.
Banimustafa, Ahmed Hmaidan; Hardy, Nigel W
2012-01-01
There is a general agreement that the development of metabolomics depends not only on advances in chemical analysis techniques but also on advances in computing and data analysis methods. Metabolomics data usually requires intensive pre-processing, analysis, and mining procedures. Selecting and applying such procedures requires attention to issues including justification, traceability, and reproducibility. We describe a strategy for selecting data mining techniques which takes into consideration the goals of data mining techniques on the one hand, and the goals of metabolomics investigations and the nature of the data on the other. The strategy aims to ensure the validity and soundness of results and promote the achievement of the investigation goals.
Practical Use Of It In Traceability In Food Value Chains
NASA Astrophysics Data System (ADS)
Ratcliff, Jon; Boddington, Michael
Traceability is today considered an essential requirement for the food value chain, due to the need to provide consumers with accurate information in the event of food safety recalls, to provide assurance with regard to the source and production systems for food products, and, in certain countries, to comply with government legislation. Within an individual business, traceability can be quite simple to implement; however, in a global trading market, traceability of the entire supply chain, including logistics, is extremely complex. For this reason, IT solutions such as TraceTracker have been developed which not only provide electronic solutions for complete traceability but also allow products to be tracked at any point in the supply chain.
Portable traceability solution for ground-based calibration of optical instruments
NASA Astrophysics Data System (ADS)
El Gawhary, Omar; van Veghel, Marijn; Kenter, Pepijn; van der Leden, Natasja; Dekker, Paul; Revtova, Elena; Heemskerk, Maurice; Trarbach, André; Vink, Ramon; Doyle, Dominic
2017-11-01
We present a portable traceability solution for the ground-based optical calibration of earth observation (EO) instruments. Currently, traceability for this type of calibration is typically based on spectral irradiance sources (e.g. FEL lamps) calibrated at a national metrology institute (NMI). Disadvantages of this source-based traceability are the inflexibility in operating conditions of the source, which are limited to the settings used during calibration at the NMI, and the susceptibility to aging, which requires frequent recalibrations, and which cannot be easily checked on-site. The detector-based traceability solution presented in this work uses a portable filter radiometer to calibrate light sources onsite, immediately before and after, or even during instrument calibration. The filter radiometer itself is traceable to the primary standard of radiometry in the Netherlands. We will discuss the design and realization, calibration and performance verification.
ALT-114 and ALT-118 Alternative Approaches to NIST ...
In 2016, US EPA approved two separate alternatives (ALT 114 and ALT 118) for the preparation and certification of Hydrogen Chloride (HCl) and Mercury (Hg) cylinder reference gas standards that can serve as EPA Protocol gases where EPA Protocol gases are required but unavailable. The alternatives were necessary due to the unavailability of NIST reference materials (SRM, NTRM, CRM or RGM) or VSL reference materials (VSL PRM or VSL CRM), which are identified in EPA's Green Book as necessary to establish the traceability of EPA Protocol gases. ALT 114 and ALT 118 provide a pathway for gas vendors to prepare and certify traceable gas cylinder standards for use in certifying Hg and HCl CEMS. In this presentation, EPA will describe the mechanics and requirements of the performance-based approach, provide an update on the availability of these gas standards and also discuss the potential for producing and certifying gas standards for other compounds using this approach. This presentation discusses the importance of NIST-traceable reference gases relative to regulatory source compliance emissions monitoring. Specifically, it discusses two new approaches for making necessary reference gases available in the absence of NIST reference materials. Moreover, these approaches provide a way to rapidly make new reference gases available for additional HAPS regulatory compliance emissions measurement and monitoring.
Traceability of biotech-derived animals: application of DNA technology.
Loftus, R
2005-04-01
Traceability is increasingly becoming standard across the agri-food industry, largely driven by recent food crises and the consequent demands for transparency within the food chain. This is leading to the development of a range of traceability concepts and technologies adapted to different industry needs. Experience with genetically modified plants has shown that traceability can play a role in increasing public confidence in biotechnology, and might similarly help allay concerns relating to the development of animal biotechnology. Traceability also forms an essential component of any risk management strategy and is a key requirement for post-marketing surveillance. Given the diversity of traceability concepts and technologies available, consideration needs to be given to the scope and precision of traceability systems for animal biotechnology. Experience to date has shown that conventional tagging and labelling systems can incorporate levels of error and may not have sufficient precision for biotech-derived animals. Deoxyribonucleic acid (DNA) technology can overcome these difficulties by tracing animals and animal by-products through their DNA code rather than an associated label. This offers the possibility of tracing some by-products of animal biotechnology through the supply chain back to source animals, offering unprecedented levels of traceability. Developments in both DNA sampling and analysis technology are making large-scale applications of DNA traceability increasingly cost effective and feasible, and are likely to lead to a broader uptake of DNA traceability concepts.
Performance Prediction of a MongoDB-Based Traceability System in Smart Factory Supply Chains
Kang, Yong-Shin; Park, Il-Ha; Youm, Sekyoung
2016-01-01
In the future, with the advent of the smart factory era, manufacturing and logistics processes will become more complex, and the complexity and criticality of traceability will further increase. This research aims at developing a performance assessment method to verify scalability when implementing traceability systems based on key technologies for smart factories, such as Internet of Things (IoT) and BigData. To this end, based on existing research, we analyzed traceability requirements and an event schema for storing traceability data in MongoDB, a document-based Not Only SQL (NoSQL) database. Next, we analyzed the algorithm of the most representative traceability query and defined a query-level performance model, which is composed of response times for the components of the traceability query algorithm. Next, this performance model was solidified as a linear regression model because the response times increase linearly by a benchmark test. Finally, for a case analysis, we applied the performance model to a virtual automobile parts logistics. As a result of the case study, we verified the scalability of a MongoDB-based traceability system and predicted the point when data node servers should be expanded in this case. The traceability system performance assessment method proposed in this research can be used as a decision-making tool for hardware capacity planning during the initial stage of construction of traceability systems and during their operational phase. PMID:27983654
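The query-level performance model described above can be sketched as an ordinary least-squares line fitted to benchmark points, then solved for the data volume at which a service-level limit is reached; the benchmark numbers and the 5 s limit below are hypothetical, not the paper's measurements.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical benchmark: stored traceability events (millions) vs.
# measured response time (seconds) of the representative traceability query.
volumes = [1, 2, 4, 8]
times = [0.4, 0.7, 1.3, 2.5]
a, b = fit_line(volumes, times)

limit = 5.0                   # hypothetical acceptable response time (s)
expand_at = (limit - b) / a   # volume at which data nodes should be expanded
print(f"t = {a:.2f}*n + {b:.2f}; expand at about {expand_at:.1f} M events")
```

Because the abstract reports that response time grows linearly with data volume, extrapolating the fitted line gives a simple capacity-planning rule: schedule node expansion before the predicted volume is reached.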
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee
2015-09-01
This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.
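An RTM of the kind this plan describes is, at its core, a mapping from requirements to the artifacts that verify them; a minimal sketch (identifiers are invented, not RELAP-7's actual requirement set):

```python
# Minimal sketch of a Requirements Traceability Matrix (RTM): hypothetical
# requirement and test-case identifiers, mapping each requirement to the
# verification artifacts that cover it, plus a coverage check.

rtm = {
    "REQ-001": ["TC-01", "TC-02"],   # physics-model requirement
    "REQ-002": ["TC-03"],            # numerical-method requirement
    "REQ-003": [],                   # not yet covered
}

def uncovered(matrix):
    """Requirements with no verifying test case."""
    return sorted(r for r, tests in matrix.items() if not tests)

print(uncovered(rtm))  # ['REQ-003']
```

A testing-based SV&V process then reduces to driving this set of uncovered requirements to empty.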
How to obtain traceability on optical radiation measurements?
NASA Astrophysics Data System (ADS)
Matamoros García, Carlos H.
2006-02-01
Traceability to national standards provides confidence in measurement results, offering a guarantee when implementing governmental regulations and when demonstrating conformity with quality requirements such as ISO 9000 or ISO/IEC 17025 (and the equivalent Mexican standards). Appropriate traceability contributes to confidence in the quality of products or services. This paper presents different ways to obtain traceability in Mexico for optical radiation measurements, mentioning some applications and highlighting the necessity of traceability to the appropriate SI units. Additionally, it presents the national standards maintained by Centro Nacional de Metrologia (CENAM), the national metrology institute of Mexico, which give technical support to Mexican measurements in this field, and the international recognition that the personnel of the Optics and Radiometry Division have gained in 10 years of development.
[Computerized system validation of clinical researches].
Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel
2015-11-01
Validation is a documented process that provides a high degree of assurance that a computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition and continues through application and maintenance until system retirement and the retention of e-records required by regulatory rules. The objective is to clearly demonstrate that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies under the GCP standard, ensuring that a product meets its pre-determined specifications for quality, safety and traceability. This paper describes how to perform the validation process and identify the relevant stakeholders within an organization in light of validation SOPs. Although specific accountability in the implementation of the validation process might be outsourced, ultimate responsibility for the CSV remains on the shoulders of the business process owner, the sponsor. To show that compliance with system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. The quality of system validation should be controlled using both QC and QA means.
Crandall, Philip G; O'Bryan, Corliss A; Babu, Dinesh; Jarvis, Nathan; Davis, Mike L; Buser, Michael; Adam, Brian; Marcy, John; Ricke, Steven C
2013-10-01
Traceability through the entire food supply chain from conception to consumption is a pressing need for the food industry, consumers and government regulators. A robust, whole-chain traceability system is needed that will effectively address food quality, food safety and food defense issues by providing real-time, transparent and reliable information from beef production through slaughter and distribution to the consumer. Traceability is an expanding part of the food safety continuum that minimizes the risk of foodborne diseases, assures quality and cold-chain integrity. Traceability can be a positive competitive marketing edge for beef producers who can verify specific quality attributes such as humane production or grass fed or Certified Organic. In this review we address the benefits as well as the remaining issues for whole-chain traceability in the beef industry, with particular focus on ground beef for the markets in the United States. Copyright © 2013 Elsevier Ltd. All rights reserved.
Making traceability work across the entire food supply chain.
Bhatt, Tejas; Buckley, Greg; McEntire, Jennifer C; Lothian, Paul; Sterling, Brian; Hickey, Caitlin
2013-12-01
The Institute of Food Technologists held Traceability Research Summits on July 14, August 22, and November 1, 2011, to address how to meet the growing requirement for agriculture and food traceability. Each meeting had a group of about 50 individuals who came from food companies, trade associations, local, state, and federal governments, 3rd-party traceability solution providers, not-for-profit corporations, consultants, and consumer groups. They discussed and deliberated the objectives of traceability and the means to develop product tracing in the food system. A total of 70 people participated in the 3 summits. These individuals were invited to participate in a small workgroup responsible for considering the details related to product tracing and presenting draft concepts to the larger group on November 1, 2011, in Chicago. During this meeting, the larger assembly further refined the concepts and came to an agreement on the basic principles and overall design of the desired approach to traceability. © 2013 Institute of Food Technologists®
NASA Technical Reports Server (NTRS)
1985-01-01
Appendix A contains data that characterize the system functions in sufficient depth to determine the requirements for the Space Station Data System (SSDS). The data are in the form of: (1) a top-down traceability report; (2) a bottom-up traceability report; (3) requirements data sheets; and (4) a cross-index of requirements paragraphs of the source documents and the requirements numbers. A database user's guide is included that interested parties can use to access the requirements database and obtain up-to-date information about the functions.
Sardina, Maria Teresa; Tortorici, Lina; Mastrangelo, Salvatore; Di Gerlando, Rosalia; Tolone, Marco; Portolano, Baldassare
2015-08-01
In livestock, breed assignment may play a key role in the certification of products linked to specific breeds. Traceability of farm animals and authentication of their products can contribute to improved breed profitability and to the sustainability of animal production, with significant impact on the rural economy of particular geographic areas and on breed and biodiversity conservation. With the goal of developing a breed genetic traceability system for Girgentana dairy products, the aim of this study was to identify specific microsatellite markers able to discriminate among the most important Sicilian dairy goat breeds, in order to detect possible adulteration in Girgentana dairy products. A total of 20 microsatellite markers were analyzed on 338 individual samples from the Girgentana, Maltese, and Derivata di Siria goat breeds. Specific microsatellite markers useful for the traceability of dairy products were identified. Eight microsatellite markers showed alleles present in both Maltese and Derivata di Siria but absent in Girgentana, and these were therefore tested on DNA pools of the three breeds. Based on the electropherogram results, only the FCB20, SRCRSP5, and TGLA122 markers were tested on DNA samples extracted from cheeses of the Girgentana goat breed. These three microsatellite markers could be applied in a breed genetic traceability system for Girgentana dairy products in order to detect adulteration involving the Maltese and Derivata di Siria goat breeds. Copyright © 2015 Elsevier Ltd. All rights reserved.
An Approach to Building a Traceability Tool for Software Development
NASA Technical Reports Server (NTRS)
Delgado, Nelly; Watson, Tom
1997-01-01
It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change, and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and to communicate information to the relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations about the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue.
Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional specifications, design reports, and system code. Tracing helps 1) validate system features against the requirements specification, 2) identify error sources, and, most importantly, 3) manage change. With so many people involved in the development of the system, it becomes necessary to identify the reasons behind the design requirements or the implementation decisions. This paper is concerned with an approach that maps documents to constraints that capture properties of, and relationships between, the objects being modeled by the program. Section 2 provides the reader with background on traceability tools. Section 3 gives a brief description of the context monitoring system on which the approach suggested in this paper is based. Section 4 presents an overview of our approach to providing traceability. The last section presents our future direction of research.
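The mapping between artifact objects that such tools maintain can be sketched as a directed graph of trace links; walking the graph from a changed artifact yields the change-impact set that makes tracing useful for managing change. All artifact names below are invented for illustration:

```python
from collections import defaultdict, deque

# Sketch of artifact-to-artifact trace links (hypothetical IDs): a change to
# one artifact is propagated along the links to find everything affected.

links = [
    ("REQ-7", "DESIGN-3"),
    ("DESIGN-3", "code/motor.c"),
    ("REQ-7", "manual-sec-2.4"),
    ("code/motor.c", "TEST-12"),
]

graph = defaultdict(list)
for src, dst in links:
    graph[src].append(dst)

def impact(artifact):
    """All artifacts reachable from the changed one (change-impact set)."""
    seen, queue = set(), deque([artifact])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(impact("REQ-7"))
```

Changing REQ-7 here flags the design element, the code file, its test, and the manual section, which is exactly the error-source and change-management use the abstract describes.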
What metrology can do to improve the quality of your atmospheric ammonia measurements
NASA Astrophysics Data System (ADS)
Leuenberger, Daiana; Martin, Nicholas A.; Pascale, Céline; Guillevic, Myriam; Ackermann, Andreas; Ferracci, Valerio; Cassidy, Nathan; Hook, Josh; Battersby, Ross M.; Tang, Yuk S.; Stevens, Amy C. M.; Jones, Matthew R.; Braban, Christine F.; Gates, Linda; Hangartner, Markus; Sacco, Paolo; Pagani, Diego; Hoffnagle, John A.; Niederhauser, Bernhard
2017-04-01
Measuring ammonia in ambient air is a sensitive and priority issue due to its harmful effects on human health and ecosystems. The European Directive 2001/81/EC on "National Emission Ceilings for Certain Atmospheric Pollutants (NEC)" regulates ammonia emissions in the member states. However, there is a lack of regulation to ensure reliable ammonia measurements, namely in applicable analytical technology, maximum allowed uncertainty, quality assurance and quality control (QA/QC) procedures, and in the infrastructure needed to attain metrological traceability, i.e. to make measurement results traceable to SI units through an unbroken chain of calibrations. In the framework of the European Metrology Research Programme (EMRP) project "Metrology for Ammonia in Ambient Air" (MetNH3), European national metrology institutes (NMIs) have joined forces to generate SI-traceable reference material, i.e. reference gas mixtures containing known amount fractions of NH3. This requires special infrastructure and analytical techniques: measurements of ambient ammonia are commonly carried out with diffusive samplers or by active sampling with denuders, but such techniques have not yet been extensively validated. Improvements in metrological traceability may be achieved through the determination of NH3 diffusive sampling rates using ammonia Primary Standard Gas Mixtures (PSMs), developed by gravimetry at the National Physical Laboratory (NPL), and a controlled-atmosphere test facility combined with on-line monitoring by a cavity ring-down spectrometer. The Federal Institute of Metrology METAS has developed an infrastructure to generate SI-traceable NH3 reference gas mixtures dynamically in the amount fraction range 0.5-500 nmol/mol (atmospheric concentrations) and with uncertainties U(NH3) < 3%.
The infrastructure consists of a stationary as well as a mobile device, providing full flexibility for calibrations in the laboratory and in the field. Both devices apply the method of temperature- and pressure-dependent permeation of a pure substance through a membrane into a stream of pre-purified matrix gas, with subsequent dilution to the required amount fractions. All relevant parameters are fully traceable to SI units. Extractive optical analysers can be connected directly to both the stationary and the mobile system for calibration. Moreover, the resulting gas mixture can also be pressurised into coated cylinders by cryo-filling. The mobile system as well as these cylinders can be used for calibrations of optical instruments in other laboratories and in the field. In addition, an SI-traceable dilution system based on a cascade of critical orifices has been established to dilute NH3 mixtures on the order of μmol/mol stored in cylinders. It is planned to apply this system to calibrate and re-sample gas mixtures in cylinders owing to its very economical gas use. Here we present insights into the development of this infrastructure and results of performance tests. Moreover, we include results of a study on adsorption/desorption effects in dry as well as humidified matrix gas in the discussion of the generation of reference gas mixtures. Acknowledgement: This work was supported by the European Metrology Research Programme (EMRP). The EMRP is jointly funded by the EMRP participating countries within EURAMET and the European Union.
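The permeation principle behind both devices reduces to a molar balance: a membrane releases NH3 at a known mass rate into a known matrix-gas flow, and the amount fraction is the molar release rate over the total molar flow. A minimal sketch of that calculation (illustrative numbers, not METAS calibration data; ideal-gas molar volume at 0 °C and 101.325 kPa assumed):

```python
# Sketch of the permeation-based generation principle. All numbers below are
# illustrative assumptions, not values from the METAS facility.

M_NH3 = 17.031   # g/mol, molar mass of NH3
V_M = 22.414     # L/mol, ideal-gas molar volume at 0 degC, 101.325 kPa

def amount_fraction_nmol_mol(perm_ng_min, flow_l_min):
    """Amount fraction in nmol/mol from permeation rate and dilution flow."""
    mol_rate = perm_ng_min * 1e-9 / M_NH3   # mol/min of NH3 released
    mol_flow = flow_l_min / V_M             # mol/min of matrix gas
    return mol_rate / mol_flow * 1e9

print(amount_fraction_nmol_mol(100.0, 2.0))  # roughly 66 nmol/mol
```

Temperature and pressure control matter because the permeation rate depends on both; in the real infrastructure those parameters are themselves SI-traceable.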
DOT National Transportation Integrated Search
1999-12-01
The Traceability document consists of brief introductory material and a series of appended Trace Tables. These tables provide complete traceability of ITS User Service Requirements (USR) to elements of the National ITS Architecture. Additional Trace ...
IPG Job Manager v2.0 Design Documentation
NASA Technical Reports Server (NTRS)
Hu, Chaumin
2003-01-01
This viewgraph presentation provides a high-level design of the IPG Job Manager and satisfies its Master Requirement Specification v2.0, Revision 1.0, 01/29/2003. The presentation includes a Software Architecture/Functional Overview with the following: Job Model; Job Manager Client/Server Architecture; Job Manager Client (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Job Manager Server (Job Manager Server Class Diagram and Job Manager Server Activity Diagram); Development Environment; Project Plan; Requirement Traceability.
Traceability Through Automatic Program Generation
NASA Technical Reports Server (NTRS)
Richardson, Julian; Green, Jeff
2003-01-01
Program synthesis is a technique for automatically deriving programs from specifications of their behavior. One of the arguments made in favour of program synthesis is that it allows one to trace from the specification to the program. One way in which traceability information can be derived is to augment the program synthesis system so that manipulations and calculations it carries out during the synthesis process are annotated with information on what the manipulations and calculations were and why they were made. This information is then accumulated throughout the synthesis process, at the end of which, every artifact produced by the synthesis is annotated with a complete history relating it to every other artifact (including the source specification) which influenced its construction. This approach requires modification of the entire synthesis system - which is labor-intensive and hard to do without influencing its behavior. In this paper, we introduce a novel, lightweight technique for deriving traceability from a program specification to the corresponding synthesized code. Once a program has been successfully synthesized from a specification, small changes are systematically made to the specification and the effects on the synthesized program observed. We have partially automated the technique and applied it in an experiment to one of our program synthesis systems, AUTOFILTER, and to the GNU C compiler, GCC. The results are promising: 1. Manual inspection of the results indicates that most of the connections derived from the source (a specification in the case of AUTOFILTER, C source code in the case of GCC) to its generated target (C source code in the case of AUTOFILTER, assembly language code in the case of GCC) are correct. 2. Around half of the lines in the target can be traced to at least one line of the source. 3. Small changes in the source often induce only small changes in the target.
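The perturbation technique described above can be sketched in a few lines: mutate one source line at a time, regenerate the target, and link the source line to every target line that changed. The toy generator below stands in for a real synthesizer or compiler; it is an invented example, not AUTOFILTER or GCC:

```python
import difflib

# Sketch of lightweight perturbation-based traceability: small systematic
# changes to the source are linked to the target lines they disturb.

def generate(source_lines):
    """Toy generator: each source line expands to two target lines."""
    out = []
    for line in source_lines:
        out.append(f"load {line}")
        out.append(f"emit {line.upper()}")
    return out

def trace(source_lines):
    """Map each source line index to the target line indices it influences."""
    base = generate(source_lines)
    links = {}
    for i, line in enumerate(source_lines):
        mutated = list(source_lines)
        mutated[i] = line + "_x"  # small systematic change
        diff = difflib.SequenceMatcher(None, base, generate(mutated))
        changed = set(range(len(base)))
        for a, _, size in diff.get_matching_blocks():
            changed -= set(range(a, a + size))
        links[i] = sorted(changed)
    return links

print(trace(["alpha", "beta"]))  # {0: [0, 1], 1: [2, 3]}
```

As in the paper's experiments, the derived links are only as precise as the diff: target lines influenced by several source lines show up in several link sets.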
Jérôme, Marc; Martinsohn, Jann Thorsten; Ortega, Delphine; Carreau, Philippe; Verrez-Bagnis, Véronique; Mouchel, Olivier
2008-05-28
Traceability in the fish food sector plays an increasingly important role for consumer protection and confidence building. This is reflected by the introduction of legislation and rules covering traceability at national and international levels. Although traceability through labeling is well established and supported by the respective regulations, monitoring and enforcement of these rules are still hampered by the lack of efficient diagnostic tools. We describe protocols using a direct sequencing method based on 212-274-bp diagnostic sequences derived from species-specific mitochondrial DNA cytochrome b, 16S rRNA, and cytochrome oxidase subunit I sequences, which can efficiently be applied to unambiguously determine even closely related fish species in processed food products labeled "anchovy". Traceability of anchovy-labeled products is supported by the public online database AnchovyID (http://anchovyid.jrc.ec.europa.eu), which provides the data obtained during our study as well as tools for analytical purposes.
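Species determination from a diagnostic fragment amounts to comparing the query against reference sequences and taking the closest match; a minimal sketch (the reference sequences below are short, invented stand-ins, not real cytochrome b data):

```python
# Sketch of sequence-based species identification: compare a short diagnostic
# fragment against references and report the closest match. The sequences
# here are invented stand-ins, far shorter than the 212-274-bp fragments
# used in the study.

references = {
    "Engraulis encrasicolus": "ATGGCACTAAGCTTCCGT",
    "Engraulis ringens":      "ATGGCTCTAAGTTTCCGA",
    "Sardina pilchardus":     "ATGACACTCCGCTACCGT",
}

def identity(a, b):
    """Fraction of matching positions between equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def identify(query):
    """Reference species with the highest sequence identity to the query."""
    return max(references, key=lambda sp: identity(query, references[sp]))

print(identify("ATGGCACTAAGCTTCCGA"))
```

Real pipelines align rather than position-match and report identity thresholds, but the best-hit principle is the same one behind a database like AnchovyID.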
Rienzi, L; Bariani, F; Dalla Zorza, M; Albani, E; Benini, F; Chamayou, S; Minasi, M G; Parmegiani, L; Restelli, L; Vizziello, G; Costa, A Nanni
2017-08-01
Can traceability of gametes and embryos be ensured during IVF? The use of a simple and comprehensive traceability system that covers the most susceptible phases of the IVF process minimizes the risk of mismatches. Mismatches in IVF are very rare but unfortunately possible, with dramatic consequences for both patients and health care professionals. Traceability is thus a fundamental aspect of the treatment. A clear process of patient and cell identification involving witnessing protocols has to be in place in every unit. To identify potential failures in the traceability process and to develop strategies to mitigate the risk of mismatches, failure mode and effects analysis (FMEA) has previously been used effectively. The FMEA approach is, however, a subjective analysis, strictly related to specific protocols, and thus the results are not always widely applicable. To reduce subjectivity and to obtain a widely applicable, comprehensive protocol of traceability, a multicentre, centrally coordinated FMEA was performed. Seven representative Italian centres (three public and four private) were selected. The study had a duration of 21 months (from April 2015 to December 2016) and was centrally coordinated by a team of experts: a risk analysis specialist, an expert embryologist and a specialist in human factors. The principal investigators of each centre were first instructed in proactive risk assessment and FMEA methodology. A multidisciplinary team to perform the FMEA analysis was then formed in each centre. After mapping the traceability process, each team identified the possible causes of mistakes in their protocol. A risk priority number (RPN) was calculated for each identified potential failure mode. The results of the FMEA analyses were centrally investigated and consistent corrective measures suggested. The teams performed new FMEA analyses after the recommended implementations.
In each centre, this study involved the laboratory director, the Quality Control and Quality Assurance lead, embryologist(s), gynaecologist(s), nurse(s) and administration. The FMEA analyses were performed according to the Joint Commission International. The FMEA teams identified seven main process phases: oocyte collection, sperm collection, gamete processing, insemination, embryo culture, embryo transfer and gamete/embryo cryopreservation. A mean of 19.3 (SD ± 5.8) associated process steps and 41.9 (SD ± 12.4) possible failure modes were recognized per centre. An RPN ≥ 15 was calculated in a mean of 6.4 steps (range 2-12, SD ± 3.60). A total of 293 failure modes were centrally analysed, 45 of which were considered at medium/high risk. After implementation of consistent corrective measures and re-evaluation, a significant reduction in the RPNs was observed in all centres (RPN < 15 for all steps). A simple and comprehensive traceability system was designed as the result of the seven FMEA analyses. The validity of FMEA is in general questionable because of the subjectivity of the judgments; the design of this study has, however, minimized this risk by introducing external experts for the analysis of the FMEA results. Specific situations such as sperm/oocyte donation, import/export and pre-implantation genetic testing were not taken into consideration. Finally, this study is limited to the analysis of failure modes that may lead to mismatches; other possible procedural mistakes are not accounted for. Every single IVF centre should have a clear and reliable protocol for the identification of patients and the traceability of cells during manipulation. The results of this study can support IVF groups in better recognizing critical steps in their protocols, understanding the identification and witnessing process, and in turn enhancing safety by introducing validated corrective measures.
This study was designed by the Italian Society of Embryology Reproduction and Research (SIERR) and funded by the Italian National Transplant Centre (CNT) of the Italian National Institute of Health (ISS). The authors have no conflicts of interest. N/A. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
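The FMEA scoring used across the centres can be sketched as follows: each failure mode receives a risk priority number (conventionally the product of severity, occurrence and detectability scores), and steps at or above the study's threshold of 15 are flagged for corrective measures. The failure modes and scores below are illustrative, not the study's actual data:

```python
# Sketch of FMEA risk scoring: RPN = severity * occurrence * detectability,
# with RPN >= 15 flagged (the threshold used in the study). All failure
# modes and scores below are invented for illustration.

failure_modes = {
    "oocyte tube mislabelled":     (5, 2, 3),  # severity, occurrence, detectability
    "witness step skipped":        (5, 1, 2),
    "dish swapped during culture": (5, 3, 2),
}

def rpn(scores):
    severity, occurrence, detectability = scores
    return severity * occurrence * detectability

flagged = sorted(m for m, sc in failure_modes.items() if rpn(sc) >= 15)
print(flagged)
```

Corrective measures then aim to lower occurrence or improve detectability until every step's RPN falls below the threshold, which is the end state the seven centres reached.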
Tissue allograft coding and traceability in USM Tissue Bank, Malaysia.
Sheikh Ab Hamid, Suzina; Abd Rahman, Muhamad Nor Firdaus
2010-11-01
In Malaysia, tissue banking activities began at the Universiti Sains Malaysia (USM) Tissue Bank in the early 1990s. Since then, a few other bone banks have been set up in other government hospitals and institutions. However, these banks are not governed by a national authority, and there is no requirement set by a national regulatory authority on the coding and traceability of human tissues donated for transplantation. Hence, the USM Tissue Bank has taken the initiative to adopt a system that enables traceability between the donor, the processed tissue and the recipient, based on international standards for tissue banks. The traceability trail has been effective, and the bank is certified as compliant with the international standard ISO 9001:2008.
Framework for Architecture Trade Study Using MBSE and Performance Simulation
NASA Technical Reports Server (NTRS)
Ryan, Jessica; Sarkani, Shahram; Mazzuchim, Thomas
2012-01-01
Increasing complexity in modern systems, as well as cost and schedule constraints, requires a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters) and requirements flowdown. A recent trend toward Model Based Systems Engineering (MBSE) brings flexible architecture definition, program documentation, requirements traceability and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example including a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements and, finally, a weighted decision analysis to optimize system objectives.
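The weighted decision analysis that closes such a trade study can be sketched as a simple weighted-sum scoring of candidate architectures (criteria, weights and scores below are invented for illustration, not from the paper's example):

```python
# Sketch of a weighted decision analysis for an architecture trade study.
# Criteria weights reflect stakeholder priorities; per-criterion scores
# would come from the domain-specific simulations. All values are invented.

weights = {"cost": 0.3, "performance": 0.5, "reuse": 0.2}

candidates = {
    "ARCH-A": {"cost": 7, "performance": 6, "reuse": 9},
    "ARCH-B": {"cost": 5, "performance": 9, "reuse": 6},
}

def score(arch):
    """Weighted sum of the architecture's per-criterion scores."""
    return sum(weights[c] * candidates[arch][c] for c in weights)

best = max(candidates, key=score)
print(best, round(score(best), 2))
```

Tying each criterion back to a requirement is what keeps the decision traceable: a weight change or a new simulation result re-ranks the candidates mechanically.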
Muñoz-Colmenero, Marta; Martínez, Jose Luis; Roca, Agustín; Garcia-Vazquez, Eva
2017-01-01
The Next Generation Sequencing methodologies are considered the next step among DNA-based methods, and their applicability in different fields is being evaluated. Here, we tested the usefulness of the Ion Torrent Personal Genome Machine (PGM) in food traceability, analyzing candies as a model of highly processed foods, and compared the results with those obtained by PCR-cloning-sequencing (PCR-CS). The majority of samples exhibited consistency between methodologies, with the PGM platform yielding more information and species per product than PCR-CS. Significantly higher AT-content in sequences of the same species was also obtained from PGM. This, together with some taxonomical discrepancies between methodologies, suggests that the PGM platform is still premature for use in the food traceability of complex, highly processed products. It could be a good option for the analysis of less complex foods, saving time and cost per sample. Copyright © 2016 Elsevier Ltd. All rights reserved.
EPA and NIST have collaborated to establish the necessary procedures for establishing the required NIST traceability of commercially-provided Hg0 and HgCl2 reference generators. This presentation will discuss the approach of a joint EPA/NIST study to accurately quantify the tru...
Implementing traceability using particle randomness-based textile printed tags
NASA Astrophysics Data System (ADS)
Agrawal, T. K.; Koehl, L.; Campagne, C.
2017-10-01
This article introduces a random particle-based traceability tag for textiles. The proposed tag not only acts as a unique signature for the corresponding textile product but is also easy to manufacture and hard to copy. It seeks applications in brand authentication and traceability in the textile and clothing (T&C) supply chain. A prototype was developed by a screen-printing process in which micron-scale particles were mixed with the printing paste and printed on cotton fabrics to attain the required randomness. To encode the randomness, an image of the developed tag was taken and analyzed using image processing. The randomness of the particles acts as a product key, or unique signature, which is required to decode the tag. Finally, washing and abrasion resistance tests were conducted to check the durability of the printed tag.
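One plausible way to turn such particle randomness into a product key (the article's exact encoding is not reproduced here, so this is an assumption-laden sketch) is to quantize the particle positions extracted by image processing onto a coarse grid, which tolerates small read-out noise, and hash the result:

```python
import hashlib

# Hypothetical encoding of a particle-randomness tag: image processing is
# assumed to have yielded particle coordinates; quantizing them to a grid
# and hashing gives a compact key. Coordinates and cell size are made up.

particles = [(12.3, 48.7), (55.1, 9.8), (70.4, 33.2), (21.9, 61.0)]

def product_key(points, cell=5.0):
    """Quantize positions to a grid (tolerating small read-out noise),
    canonicalize the order, and hash to a short key."""
    cells = sorted((round(x / cell), round(y / cell)) for x, y in points)
    digest = hashlib.sha256(repr(cells).encode()).hexdigest()
    return digest[:16]

print(product_key(particles))
```

The grid step sets the trade-off: coarser cells tolerate more wash-and-abrasion drift in the read-out but make accidental collisions between tags more likely.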
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This document has been updated during the definitive design portion of the first phase of the W-314 Project to capture additional software requirements and is planned to be updated during the second phase of the W-314 Project to cover the second phase of the Project's scope. The objective is to provide requirement traceability by recording the analysis/basis for the functional descriptions of the master pump shutdown system. This document identifies the sources of the requirements and/or how these were derived. Each requirement is validated either by quoting the source or by an analysis process involving the required functionality, performance characteristics, operations input or engineering judgment.
Bio-markers: traceability in food safety issues.
Raspor, Peter
2005-01-01
Research and practice are focusing on the development, validation and harmonization of technologies and methodologies to ensure a complete traceability process throughout the food chain. The main goals are: scale-up, implementation and validation of methods in whole food chains; assurance of authenticity; validity of labelling; and application of HACCP (hazard analysis and critical control point) principles to the entire food chain. This review summarizes the scientific and technological basis for ensuring complete traceability. Tracing and tracking (traceability) of foods are complex processes because of the (bio)markers, technical solutions and differing circumstances of the technologies that produce various foods (processed, semi-processed or raw). Since food is produced for human or animal consumption, suitable markers are needed that remain stable and traceable along the entire production chain. Specific biomarkers can have a function both in technology and in nutrition. Such an approach would make this development faster and more comprehensive, and would make it possible to monitor the effect of food in the consumer with the same set of biomarkers. This would help to develop and implement food safety standards based on the real physiological function of particular food components.
Vallejo-Cordoba, Belinda; González-Córdova, Aarón F
2010-07-01
This review presents an overview of the applicability of CE in the analysis of chemical and biological contaminants involved in emerging food safety issues. Additionally, CE-based genetic analyzers' usefulness as a unique tool in food traceability verification systems was presented. First, analytical approaches for the determination of melamine and specific food allergens in different foods were discussed. Second, natural toxin analysis by CE was updated from the last review reported in 2008. Finally, the analysis of prion proteins associated with the "mad cow" crises and the application of CE-based genetic analyzers for meat traceability were summarized.
NASA Astrophysics Data System (ADS)
Liu, Ting; Li, Qi; Song, Junlin; Yu, Hong
2017-02-01
There is an increasing requirement for traceability of aquaculture products, both for consumer protection and for food safety. Conventional traceability systems that depend on physical labels have high error rates; a genetic traceability technique based on a DNA tracking system can overcome this problem. Genealogy information is essential for genetic traceability, and microsatellite DNA markers are a good choice for pedigree analysis. As the genotyping throughput of microsatellites has increased, microsatellite multiplex PCR has become a fast and cost-effective technique. As a commercially important cultured aquatic species, the Pacific oyster Crassostrea gigas has the highest global production. The objective of this study was to develop microsatellite multiplex PCR panels with a dye-labeled universal primer for pedigree analysis in C. gigas; these multiplex PCRs were validated using 12 full-sib families with known pedigrees. Here we developed six informative multiplex PCRs using 18 genomic microsatellites in C. gigas. Each multiplex panel contained a single universal primer, M13(-21), used as a tail on each locus-specific forward primer, and a single fluorophore-labeled universal primer M13(-21). The polymorphism of the markers was moderate, with an average of 10.3 alleles per locus and an average polymorphic information content of 0.740. The observed heterozygosity per locus ranged from 0.492 to 0.822. Cervus simulations revealed that the six panels would still be of great value when large numbers of families are analysed. Pedigree analysis of real offspring demonstrated that 100% of the offspring were unambiguously allocated to their parents when two multiplex PCRs were used. The six sets of multiplex PCRs can be an important tool for tracing cultured individuals, population genetic analysis, and selective breeding programs in C. gigas.
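Allocation of offspring to parents from multilocus microsatellite genotypes can be illustrated with a simple exclusion rule: a parent pair is accepted only if, at every locus, the offspring's two alleles can be drawn one from each parent. This is a hypothetical sketch (the study above uses Cervus and likelihood-based assignment); the locus names and allele sizes are invented.

```python
# Hypothetical sketch of exclusion-based parentage assignment with
# microsatellite genotypes; loci ("L1", "L2") and fragment sizes are
# illustrative, not data from the C. gigas panels.

def compatible(offspring, sire, dam):
    """A parent pair is compatible if, at every locus, the offspring's
    two alleles can be explained as one from each parent."""
    for locus in offspring:
        o1, o2 = offspring[locus]
        s, d = set(sire[locus]), set(dam[locus])
        if not ((o1 in s and o2 in d) or (o2 in s and o1 in d)):
            return False
    return True

def assign(offspring, parent_pairs):
    """Return the labels of all parent pairs compatible with the offspring."""
    return [label for label, (sire, dam) in parent_pairs.items()
            if compatible(offspring, sire, dam)]

# Toy genotypes: two loci, alleles given as fragment sizes.
sire = {"L1": (120, 124), "L2": (200, 204)}
dam  = {"L1": (122, 126), "L2": (202, 206)}
off  = {"L1": (120, 122), "L2": (204, 206)}
print(assign(off, {"F1": (sire, dam)}))  # ['F1']
```

Real pedigree software replaces this hard exclusion with likelihood scores so that occasional genotyping errors or null alleles do not wrongly exclude the true parents.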
Chao, Jingdong; Skup, Martha; Alexander, Emily; Tundia, Namita; Macaulay, Dendy; Wu, Eric; Mulani, Parvez
2015-03-01
The purpose of the present study was to investigate the traceability of adverse events (AEs) for branded and generic drugs with identical nonproprietary names and to consider potential implications for the traceability of AEs for branded and biosimilar biologics. Adverse event reports in the Food and Drug Administration AE Reporting System (FAERS) were compared with those in a commercial insurance claims database (Truven Health MarketScan(®)) for 2 drugs (levetiracetam and enoxaparin sodium) with manufacturing or prescribing considerations potentially analogous to those of some biosimilars. Monthly rates of branded- and generic-attributed AEs were estimated pre- and post-generic entry. Post-entry branded-to-generic AE relative rate ratios were calculated. In FAERS, monthly AE rate ratios during the post-generic period showed a pattern in which AE rates for the branded products were greater than for the generic products. Differences in rates of brand- and generic-attributed AEs were statistically significant for both study drugs; the AE rate for the branded products peaked at approximately 10 times that of the generic levetiracetam products and approximately 4 times that of the generic enoxaparin sodium products. In contrast, monthly ratios for the MarketScan data were relatively constant over time. Use of the same nonproprietary name for generic and branded products may contribute to poor traceability of AEs reported in the FAERS database due to the significant misattribution of AEs to branded products (when those AEs were in fact associated with patient use of generic products). To ensure accurate and robust safety surveillance and traceability for biosimilar products in the United States, improved product identification mechanisms, such as related but distinguishable nonproprietary names for biosimilars and reference biologics, should be considered.
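The branded-to-generic relative rate ratio described above reduces to dividing two exposure-adjusted monthly AE rates. A minimal sketch, with all counts and exposure denominators invented for the example:

```python
# Toy computation of a monthly branded-to-generic AE rate ratio.
# The AE counts and exposed-population sizes below are invented;
# they are not figures from FAERS or MarketScan.

def ae_rate(ae_count, exposed):
    """AEs per exposed patient in a given month."""
    return ae_count / exposed

def rate_ratio(brand_aes, brand_exposed, generic_aes, generic_exposed):
    """Ratio of the brand-attributed rate to the generic-attributed rate."""
    return ae_rate(brand_aes, brand_exposed) / ae_rate(generic_aes, generic_exposed)

# E.g. 50 brand-attributed AEs among 10,000 brand users versus
# 20 generic-attributed AEs among 40,000 generic users:
print(rate_ratio(50, 10_000, 20, 40_000))  # 10.0
```

A ratio far above 1 after generic entry, as in the FAERS analysis, is the signature of AEs from generic use being misattributed to the branded product.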
Development, characterization, and validation of an optical transfer standard for ammonia in air
NASA Astrophysics Data System (ADS)
Lüttschwager, Nils; Balslev-Harder, David; Leuenberger, Daiana; Pogány, Andrea; Werhahn, Olav; Ebert, Volker
2017-04-01
Ammonia is an atmospheric trace gas that is predominantly emitted from anthropogenic agricultural activities. Since elevated levels of ammonia can have negative effects on human health as well as on ecosystems, it is imperative to monitor and control ammonia emissions. This requires SI-traceable standards to calibrate ammonia monitoring instrumentation and to make measurements comparable. The lack of such standards became a pressing issue in recent years, and the MetNH3 project (www.metnh3.eu) was initiated to fill the gap, pursuing different strategies. The work that we present was part of these endeavours and focusses on the development and application of an optical transfer standard for amount fraction measurements of ammonia in ambient air. An optical transfer standard (OTS) offers an alternative to calibrating air monitoring instrumentation by means of reference gas mixtures. With an OTS, absolute amount fraction results are derived by evaluating absorption spectra using a spectral model and pre-measured spectral properties of the analyte. In that way, the instrument can measure independently of calibration gases ("calibration-free") and, moreover, can itself serve as a standard to calibrate air monitoring analyzers. Molecular spectral properties are the excellent, non-drifting point of reference of the OTS and form, together with traceable measurements of temperature and pressure, the basis for SI-traceable amount fraction measurements. We developed an OTS based on a commercial cavity ring-down spectrometer with a detection limit below 1 ppb (1 nmol/mol). A custom spectral data evaluation routine for absolute, calibration-free measurements, as well as measurements of the spectral properties of ammonia with a focus on measurement uncertainty and traceability [1], are the foundations of our OTS. Validation measurements were conducted using an SI-traceable ammonia reference gas generator over a period of several months.
Here, we present an evaluation of the performance of our OTS from 1 ppb to 200 ppb. We found the results obtained with the OTS to be concordant with reference gas mixtures, yielding amount fraction results with standard uncertainties of less than 3 %, for which an uncertainty budget is provided. Acknowledgement: This work was supported by the European Metrology Research Programme (EMRP). The EMRP is jointly funded by the EMRP participating countries within EURAMET and the European Union. References: 1. A. Pogány, O. Werhahn, and V. Ebert, High-Accuracy Ammonia Line Intensity Measurements at 1.5 µm, in Imaging and Applied Optics 2016, OSA Technical Digest (online) (Optical Society of America, 2016), paper JT3A.15, DOI: 10.1364/3D.2016.JT3A.15
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software, which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
A Review on Legal Traceability of GNSS Measurements in the Malaysian Cadastral Practice
NASA Astrophysics Data System (ADS)
Gill, J.; Shariff, N. S.; Omar, K. M.; Din, A. H. M.; Amin, Z. M.
2016-09-01
As the dependency on Global Navigation Satellite System (GNSS) in surveying has been growing over the years, the need for legal traceability of GNSS measurements has become a significant matter. In Malaysia, with the advent of the Malaysia Real-time Kinematic Network (MyRTKnet), GNSS surveying has revolutionised land survey and mapping. Correspondingly, the Department of Survey and Mapping Malaysia (DSMM) amended and published standard regulations and guidelines concerning cadastral survey, i.e., the Cadastral Survey Regulations 2009, to include GNSS measurements. However, these regulations and guidelines have not comprehensively incorporated legal traceability of GNSS measurements, which is a prerequisite for cadastral surveys as they require reliable and conclusive evidence for issues such as boundary disputes. The first objective of this paper is to review and discuss the legal traceability of GNSS measurements. Secondly, it highlights the current practice and issues regarding legal traceability within the present Malaysian cadastral regulations and guidelines, in relation to the prevalently adopted Network RTK (N-RTK) technique, GNSS instrument calibrations, and reference station accuracy. Lastly, a rudimentary best-practice guideline for GNSS surveying in cadastral survey for Malaysia is proposed. It is expected that this paper will contribute to the implementation of a best-practice guideline, inclusive of legal traceability of GNSS measurements, for the Malaysian cadastral practice.
Improved reliability of pH measurements.
Spitzer, Petra; Werner, Barbara
2002-11-01
Measurements of pH are performed on a large scale at laboratory level and in industry. To meet quality-control requirements and other technical specifications there is a need for traceability of measurement results. The prerequisite for the international acceptance of analytical data is reliability. To measure means to compare. Comparability entails use of recognised references to which the standard buffer solutions used for calibration of pH meter-electrode assemblies can be traced. The new recommendation on the measurement of pH, recently published as a provisional document by the International Union of Pure and Applied Chemistry (IUPAC), enables traceability of measured pH values to a conventional reference frame which is recognised world-wide. The primary method for pH is described. If analytical data are to be accepted internationally it is necessary to demonstrate the equivalence of the national traceability structures, including national measurement standards. For the first time, key comparisons for pH have been performed by the Consultative Committee for Amount of Substance (CCQM, set up by the International Bureau of Weights and Measures, BIPM) to assess the equivalence of the national measurement procedures used to determine the pH of primary standard buffer solutions. The results of the first key comparison on pH, CCQM-K9, and other international initiatives to improve the consistency of pH measurement results are reported.
Adoption of Requirements Engineering Practices in Malaysian Software Development Companies
NASA Astrophysics Data System (ADS)
Solemon, Badariah; Sahibuddin, Shamsul; Ghani, Abdul Azim Abd
This paper presents exploratory survey results on the Requirements Engineering (RE) practices of software development companies in Malaysia. The survey attempted to identify patterns in the RE practices the companies are implementing. The required information was obtained through mailed, self-administered questionnaires distributed to project managers and software developers working at software development companies operating across the country. The results showed that the overall adoption of RE practices in these companies is strong. However, the results also indicated that fewer of the surveyed companies use appropriate CASE tools or software to support their RE process and practices, define traceability policies, or maintain traceability manuals in their projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. Then, the Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.
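A requirements traceability matrix of the kind updated in this document can be sketched as a mapping from requirements to the assessment cases that cover them, with uncovered requirements flagged as gaps. The requirement IDs and case names below are illustrative, not taken from the RELAP-7 SVVP:

```python
# Minimal sketch of a requirements traceability matrix (RTM): each
# requirement maps to the assessment cases that cover it. All IDs
# and case names are hypothetical examples.

rtm = {
    "R-1 two-phase flow model": ["benchmark problem 1", "separate-effects test A"],
    "R-2 point kinetics":       ["separate-effects test B"],
    "R-3 heat structure model": [],
}

def uncovered(rtm):
    """Requirements with no assessment case are gaps in the matrix."""
    return sorted(req for req, cases in rtm.items() if not cases)

print(uncovered(rtm))  # ['R-3 heat structure model']
```

Tracking the matrix this way makes "maturity of assessment" a measurable quantity: the fraction of requirements with at least one passing assessment case.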
Traceability System for Agricultural Products Based on RFID and Mobile Technology
NASA Astrophysics Data System (ADS)
Sugahara, Koji
In agriculture, it is necessary to establish and integrate food traceability systems and risk management systems in order to improve food safety across the entire food chain. An integrated traceability system for agricultural products was developed, based on the innovative technologies of RFID and mobile computing. To identify individual products efficiently during distribution, small RFID tags with unique IDs and handheld RFID readers were applied. During the distribution process, the RFID tags are checked using the readers, and transit records of the products are stored in the database via wireless LAN. Regarding agricultural production, recent incidents of pesticide misuse have affected consumer confidence in food safety. The Navigation System for Appropriate Pesticide Use (Nouyaku-navi) was developed, which is accessible in the fields via Internet-enabled cell phones. Based on it, agricultural risk management systems have been developed. These systems collaborate with traceability systems and can be applied to process control and risk management in agriculture.
Bogani, Patrizia; Spiriti, Maria Michela; Lazzarano, Stefano; Arcangeli, Annarosa; Buiatti, Marcello; Minunni, Maria
2011-11-01
The World Anti-Doping Agency fears the use of gene doping to enhance athletic performance. Thus, a bioanalytical approach based on end-point PCR for detecting markers of transgenesis traceability was developed. A few sequences from two different vectors were selected, using an animal model, and traced in different tissues and at different times. In particular, the enhanced green fluorescent protein gene and a construct-specific new marker were targeted in the analysis. To make the developed detection approach suitable for future routine doping analysis, matrices such as urine and tears, as well as blood, were also tested. This study will have impact in evaluating the traceability of vector transgenes for the detection of a gene doping event by non-invasive sampling.
Results of Absolute Cavity Pyrgeometer and Infrared Integrating Sphere Comparisons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reda, Ibrahim M; Sengupta, Manajit; Dooraghi, Michael R
Accurate and traceable atmospheric longwave irradiance measurements are required for understanding radiative impacts on the Earth's energy budget. The standard to which pyrgeometers are traceable is the interim World Infrared Standard Group (WISG), maintained at the Physikalisch-Meteorologisches Observatorium Davos (PMOD). The WISG consists of four pyrgeometers that were calibrated using Rolf Philipona's Absolute Sky-scanning Radiometer [1]. The Atmospheric Radiation Measurement (ARM) facility has recently adopted the WISG to maintain the traceability of the calibrations of all Eppley precision infrared radiometer (PIR) pyrgeometers. Subsequently, Julian Grobner [2] developed the infrared interferometer spectrometer and radiometer (IRIS), and Ibrahim Reda [3] developed the absolute cavity pyrgeometer (ACP). The ACP and IRIS were developed to establish a world reference for calibrating pyrgeometers with traceability to the International System of Units (SI). The two radiometers are unwindowed with negligible spectral dependence, and they are traceable to SI units through the temperature scale (ITS-90). The two instruments were compared directly to the WISG three times at PMOD and twice at the Southern Great Plains (SGP) facility against WISG-traceable pyrgeometers. The ACP and IRIS agreed within +/- 1 W/m2 to +/- 3 W/m2 in all comparisons, whereas the WISG references exhibit a 2-5 W/m2 low bias compared to the ACP/IRIS average, depending on the water vapor column, as noted in Grobner et al. [4]. Consequently, a case for changing the current WISG has been made by Grobner and Reda. However, during the five comparisons the column water vapor exceeded 8 mm. Therefore, it is recommended that more ACP and IRIS comparisons be held under different environmental conditions and water vapor column contents to better establish the traceability of these instruments to SI with established uncertainty.
Concept document of the repository-based software engineering program: A constructive appraisal
NASA Technical Reports Server (NTRS)
1992-01-01
A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.
NASA Technical Reports Server (NTRS)
Killough, Brian; Stover, Shelley
2008-01-01
The Committee on Earth Observation Satellites (CEOS) provides a brief to the Goddard Institute for Space Studies (GISS) regarding the CEOS Systems Engineering Office (SEO) and current work on climate requirements and analysis. A "system framework" is provided for the Global Earth Observation System of Systems (GEOSS). SEO climate-related tasks are outlined including the assessment of essential climate variable (ECV) parameters, use of the "systems framework" to determine relevant informational products and science models and the performance of assessments and gap analyses of measurements and missions for each ECV. Climate requirements, including instruments and missions, measurements, knowledge and models, and decision makers, are also outlined. These requirements would establish traceability from instruments to products and services allowing for benefit evaluation of instruments and measurements. Additionally, traceable climate requirements would provide a better understanding of global climate models.
UML Profiles for Design Decisions and Non-Functional Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Gorton, Ian
2007-06-30
A software architecture is composed of a collection of design decisions. Each design decision helps or hinders certain non-functional requirements (NFRs). Current software architecture views focus on expressing the components and connectors in the system. Design decisions and their relationships with non-functional requirements are often captured in separate design documentation, not explicitly expressed in any views. This disassociation makes architecture comprehension and architecture evolution harder. In this paper, we propose a UML profile for modeling design decisions and an associated UML profile for modeling non-functional requirements in a generic way. The two UML profiles treat design decisions and non-functional requirements as first-class elements. Modeled design decisions always refer to existing architectural elements and thus maintain traceability between the two. We provide a mechanism for checking consistency over this traceability. An exemplar is given.
Liu, Yu; Zhang, Xufeng; Li, Ying; Wang, Haixia
2017-11-01
Geographical origin traceability is an important issue for controlling the quality of seafood and safeguarding the interests of consumers. In the present study, a new method of compound-specific isotope analysis (CSIA) of fatty acids was established to evaluate its applicability in establishing the origin traceability of Apostichopus japonicus in the coastal areas of China. Moreover, principal component analysis (PCA) and discriminant analysis (DA) were applied to distinguish between the origins of A. japonicus. The results show that the stable carbon isotope compositions of fatty acids of A. japonicus significantly differ in terms of both season and origin. They also indicate that the stable carbon isotope composition of fatty acids could effectively discriminate between the origins of A. japonicus, except between Changhai Island and Zhangzi Island in the spring of 2016, because of geographical proximity or the similarity of food sources. The fatty acids with the highest contribution to identifying the geographical origins of A. japonicus are C22:6n-3, C16:1n-7, C20:5n-3, C18:0 and C23:1n-9, when considering the fatty acid contents, the stable carbon isotope composition of fatty acids and the results of the PCA and DA. We conclude that CSIA of fatty acids, combined with multivariate statistical analysis such as PCA and DA, may be an effective tool for establishing the traceability of A. japonicus in the coastal areas of China. The relevant conclusions of the present study provide a new method for determining the traceability of seafood and other food products. © 2017 Society of Chemical Industry.
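The PCA-plus-discriminant workflow described above can be sketched on synthetic data: project centered fatty-acid δ13C profiles onto the leading principal components, then classify by nearest class centroid (a simple stand-in for the paper's discriminant analysis). All isotope values below are invented:

```python
import numpy as np

# Toy sketch: discriminate two "origins" from 3-variable fatty-acid
# d13C profiles using PCA followed by nearest-centroid classification.
# The means and spread are invented, not measured isotope data.

rng = np.random.default_rng(0)
origin_a = rng.normal([-28.0, -25.0, -22.0], 0.3, size=(10, 3))
origin_b = rng.normal([-24.0, -21.0, -19.0], 0.3, size=(10, 3))
X = np.vstack([origin_a, origin_b])
y = np.array([0] * 10 + [1] * 10)

# PCA: project centered data onto the leading right singular vectors.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T

# Nearest-centroid "discriminant" in the 2-D PCA space.
centroids = np.array([scores[y == k].mean(axis=0) for k in (0, 1)])
pred = np.argmin(np.linalg.norm(scores[:, None] - centroids, axis=2), axis=1)
print((pred == y).mean())  # 1.0 on this well-separated toy data
```

The failure mode in the paper (Changhai vs. Zhangzi Island) corresponds to the case where the class means are too close relative to the within-class spread, so no projection separates them.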
NASA Astrophysics Data System (ADS)
Kowalewski, M. G.; Janz, S. J.
2015-02-01
Methods of absolute radiometric calibration of backscatter ultraviolet (BUV) satellite instruments are compared as part of an effort to minimize pre-launch calibration uncertainties. An internally illuminated integrating sphere source has been used for the Shuttle Solar BUV, Total Ozone Mapping Spectrometer, Ozone Monitoring Instrument, and Global Ozone Monitoring Experiment 2, using standardized procedures traceable to national standards. These sphere-based spectral responsivities agree to within the derived combined standard uncertainty of 1.87% relative to calibrations performed using an external diffuser illuminated by standard irradiance sources, the customary spectral radiance responsivity calibration method for BUV instruments. The combined standard uncertainty for these calibration techniques as implemented at the NASA Goddard Space Flight Center's Radiometric Calibration and Development Laboratory is shown to be less than 2% at 250 nm when using a single traceable calibration standard.
SI-Traceable Scale for Measurements of Radiocarbon Concentration
NASA Astrophysics Data System (ADS)
Hodges, Joseph T.; Fleisher, Adam J.; Liu, Qingnan; Long, David A.
2017-06-01
Radiocarbon (^{14}C) dating of organic materials is based on measuring the ^{14}C/^{12}C atomic fraction relative to the nascent value that existed when the material was formed by photosynthetic conversion of carbon dioxide present in the atmosphere. This field of measurement has numerous applications, including source apportionment of anthropogenic and biogenic fuels and combustion emissions, carbon cycle dynamics, archaeology, and forensics. Accelerator mass spectrometry (AMS) is the most widely used method for radiocarbon detection because it can measure extremely small amounts of radiocarbon (background of nominally 1.2 parts-per-trillion) with high relative precision (0.4 %). AMS measurements of radiocarbon are typically calibrated by reference to standard oxalic-acid (C_2H_2O_4) samples of known radioactivity that are derived from plant matter. Specifically, the internationally accepted absolute dating reference for so-called "modern-equivalent" radiocarbon is 95 % of the specific radioactivity in AD 1950 of the National Bureau of Standards (NBS) oxalic acid standard reference material, normalized to δ^{13}C_{VPDB} = -19 per mil. With this definition, a "modern-equivalent" corresponds to 1.176(70) parts-per-trillion of ^{14}C relative to total carbon content. As an alternative radiocarbon scale, we propose an SI-traceable method to determine ^{14}C absolute concentration which is based on linear Beer-Lambert-law absorption measurements of selected ^{14}C^{16}O_2 ν_3-band line areas. This approach is attractive because line intensities of chosen radiocarbon dioxide transitions can be determined by ab initio calculations with relative uncertainties below 0.5 %. This assumption is justified by the excellent agreement between theoretical values of line intensities and measurements for stable isotopologues of CO_2.
In the case of cavity ring-down spectroscopy (CRDS) measurements of ^{14}C^{16}O_2 peak areas, we show that absolute, SI-traceable concentrations of radiocarbon can be determined through measurements of time, frequency, pressure and temperature. Notably, this approach will not require knowledge of the radiocarbon half-life and is expected to provide a stable scale that does not require an artifact standard. References: M. Stuiver and H. A. Polach, Radiocarbon 19 (1977) 355; O. L. Polyansky et al., Phys. Rev. Lett. 114 (2015) 243001.
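The Beer-Lambert step behind the proposed scale can be illustrated with a minimal calculation: the absorber number density follows from an integrated line area divided by the line intensity, and the amount fraction from dividing by the ideal-gas total number density at the measured pressure and temperature. The line area and intensity values below are placeholders, not measured ^{14}C^{16}O_2 data, and the unit conventions are one common spectroscopic choice:

```python
# Illustrative Beer-Lambert arithmetic for an SI-traceable amount
# fraction. The line area and intensity are invented placeholders.

K_B = 1.380649e-23  # J/K, exact defined SI value of the Boltzmann constant

def number_density(line_area, line_intensity):
    """n = A / S, with A the spectrally integrated absorption
    coefficient (cm^-2) and S the line intensity
    (cm^-1 / (molecule cm^-2)); returns molecules per cm^3."""
    return line_area / line_intensity

def mole_fraction(n, pressure_pa, temperature_k):
    """Divide by the ideal-gas total number density (molecules/cm^3)."""
    n_total = pressure_pa / (K_B * temperature_k) * 1e-6
    return n / n_total

n = number_density(1.0e-9, 1.0e-18)   # -> 1.0e9 molecules/cm^3
x = mole_fraction(n, 101325.0, 296.0)
print(f"{x:.3e}")
```

Every input here (time/frequency via the line area, pressure, temperature, and an ab initio line intensity) is SI-traceable, which is the point of the proposed scale: no artifact standard or half-life value enters the chain.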
Traceability of genetically modified organisms.
Aarts, Henk J M; van Rie, Jean-Paul P F; Kok, Esther J
2002-01-01
EU regulations stipulate the labeling of food products containing genetically modified organisms (GMOs) unless the GMO content is due to adventitious and unintended 'contamination' and does not exceed the 1% level on an ingredient basis. In addition, member states have to ensure full traceability at all stages of the placing on the market of GMOs. Both requirements ensure consumers' 'right to know', facilitate enforcement of regulatory requirements, and are of importance for environmental monitoring and postmarket surveillance. Besides administrative procedures, such as those used in quality certification systems, the significance of adequate molecular methods becomes more and more apparent. During the last decade a considerable number of molecular methods have been developed and validated that enable the detection, identification and quantification of GMO impurities. Most of them rely on PCR technology and can only detect one specific stretch of DNA. It can, however, be anticipated that in the near future the situation will become more complex. The number of GMO varieties, including 'stacked-gene' varieties, entering the European market will increase, and it is likely that these varieties will harbor more variable constructs. New tools will be necessary to keep up with these developments. One of the most promising techniques is microarray analysis, which enables screening for a large number of different GMOs within a single experiment.
NASA Astrophysics Data System (ADS)
Zhu, Banghe; Rasmussen, John C.; Litorja, Maritoni; Sevick-Muraca, Eva M.
2017-03-01
All medical devices seeking Food and Drug Administration market approval require specifications of performance based upon the International System of Units (SI), or units derived from the SI, for reasons of traceability. Recently, near-infrared fluorescence (NIRF) imaging devices of a variety of designs have emerged on the market and in investigational clinical studies. Yet the designs of the devices used in clinical studies vary widely, suggesting variable device performance. Device performance depends upon optimal excitation of NIRF imaging agents, rejection of backscattered excitation and ambient light, and selective collection of fluorescence emanating from the fluorophore. There remain no traceable working standards with SI units of radiance to enable prediction that a given molecular imaging agent can be detected in humans by a given NIRF imaging device. Furthermore, as technologies evolve and as NIRF imaging device components change, there remains no standardized means to track device improvements over time and establish clinical performance without involving clinical trials, which are often costly. In this study, we deployed a methodology to calibrate the luminescent radiance of a stable, solid phantom in SI units of mW/cm2/sr for characterizing the measurement performance, such as signal-to-noise ratio (SNR) and contrast, of ICCD- and IsCMOS-camera-based NIRF imaging devices. The methodology allowed determination of the superior SNR of the ICCD over the IsCMOS system, and of comparable contrast between the ICCD and IsCMOS depending upon binning strategies.
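The SNR and contrast figures of merit used to characterize such cameras can be sketched as follows; the pixel statistics are synthetic stand-ins for frames of a radiance-calibrated phantom, not data from the study:

```python
import numpy as np

# Minimal sketch of SNR and contrast figures of merit for an imager
# characterized against a calibrated phantom. Pixel values are
# synthetic; real characterization uses dark-corrected camera frames.

def snr(signal_roi):
    """Mean signal over its standard deviation within the ROI."""
    return signal_roi.mean() / signal_roi.std(ddof=1)

def contrast(target_roi, background_roi):
    """(T - B) / B contrast between target and background ROIs."""
    b = background_roi.mean()
    return (target_roi.mean() - b) / b

rng = np.random.default_rng(1)
target = rng.normal(1000.0, 10.0, size=10_000)      # bright phantom ROI
background = rng.normal(100.0, 10.0, size=10_000)   # dim background ROI
print(round(contrast(target, background), 1))  # ~9.0
```

With a phantom whose radiance is known in mW/cm2/sr, these dimensionless figures can be quoted at an SI-traceable input level, which is what makes cross-device and over-time comparisons meaningful.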
Decision Model for Planning and Scheduling of Seafood Product Considering Traceability
NASA Astrophysics Data System (ADS)
Agustin; Mawengkang, Herman; Mathelinea, Devy
2018-01-01
Due to global challenges, it is necessary for an industrial company to integrate production scheduling and distribution planning in order to be more efficient and to gain greater economic advantage. This paper presents production planning and scheduling for a seafood manufacturing company, located in Aceh Province, Indonesia, that simultaneously produces multiple kinds of seafood products. The perishable nature of fish highly restricts its storage duration and delivery conditions. Traceability is a tracking requirement used to check whether the quality of the product is satisfactory. The production and distribution planning problem aims to meet customer demand subject to traceability of the seafood product and other restrictions. The problem is modeled as a mixed integer linear program, which is then solved using a neighborhood search approach.
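A neighborhood search of the kind mentioned above can be sketched as best-improvement hill climbing over integer batch counts under a single capacity constraint. The products, profits, processing hours, and capacity below are invented for illustration, not data from the paper:

```python
# Toy neighborhood-search sketch for an integer production plan.
# All coefficients are hypothetical examples.

profit = {"fillet": 5, "smoked": 8, "canned": 3}   # profit per batch
hours  = {"fillet": 2, "smoked": 4, "canned": 1}   # machine-hours per batch
CAPACITY = 20                                      # available machine-hours

def value(plan):
    """Total profit, or -inf if the plan exceeds capacity."""
    if sum(hours[p] * q for p, q in plan.items()) > CAPACITY:
        return float("-inf")
    return sum(profit[p] * q for p, q in plan.items())

def neighbors(plan):
    """Plans reachable by adding or removing one batch of one product."""
    for p in plan:
        for d in (-1, 1):
            if plan[p] + d >= 0:
                n = dict(plan)
                n[p] += d
                yield n

plan = {p: 0 for p in profit}                      # start from the empty plan
while True:
    best = max(neighbors(plan), key=value)
    if value(best) <= value(plan):
        break                                      # local optimum reached
    plan = best
print(plan, value(plan))
```

On this instance the search stops at a local optimum (profit 40, all capacity spent on the highest-profit product), even though 20 batches of "canned" would earn 60; that gap is why practical schedulers combine such heuristics with richer neighborhoods or MILP bounds.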
Preliminary Analysis of Effect of Random Segment Errors on Coronagraph Performance
NASA Technical Reports Server (NTRS)
Stahl, Mark T.; Shaklan, Stuart B.; Stahl, H. Philip
2015-01-01
'Are we alone in the Universe?' is probably the most compelling science question of our generation. Answering it requires a large-aperture telescope with extreme wavefront stability. To image and characterize Earth-like planets requires the ability to block 10(exp 10) of the host star's light with 10(exp -11) stability. For an internal coronagraph, this requires correcting wavefront errors and keeping that correction stable to a few picometers rms for the duration of the science observation. This requirement places severe specifications upon the performance of the observatory, telescope, and primary mirror. A key task of the AMTD project (initiated in FY12) is to define telescope-level specifications traceable to science requirements and flow those specifications down to the primary mirror. From a systems perspective, probably the most important question is: what is the telescope wavefront stability specification? Previously, we suggested this specification should be 10 picometers per 10 minutes; considered how this specification relates to architecture, i.e., a monolithic or segmented primary mirror; and asked whether it was better to have few or many segments. This paper reviews the 10-picometers-per-10-minutes specification, provides analysis related to the application of this specification to segmented apertures, and suggests that a 3- or 4-ring segmented aperture is more sensitive to segment rigid-body motion than an aperture with fewer or more segments.
Klein, Kevin; Scholl, Joep H G; Vermeer, Niels S; Broekmans, André W; Van Puijenbroek, Eugène P; De Bruin, Marie L; Stolk, Pieter
2016-02-01
Pharmacovigilance requirements for biologics mandate that EU Member States shall ensure that any biologic that is the subject of a suspected adverse drug reaction (ADR) is identifiable by brand name and batch number. Recent studies showed that brand name identification is well established, whereas batch numbers are (still) poorly reported. We evaluated information-recording systems and practices in the Dutch hospital setting to identify determinants for brand name and batch number recording as well as success factors and bottlenecks for traceability. We surveyed Dutch hospital pharmacists with an online questionnaire on systems and practices in hospitals for recording brand names and batch numbers. Additionally, we performed an analysis of the traceability of recombinant biologics in spontaneous ADR reports (received between 2009 and 2014) from the Netherlands Pharmacovigilance Centre Lareb. The survey showed that brand names are not routinely recorded in the clinical practice of Dutch hospitals, whereas batch numbers are poorly recorded. Seventy-six percent of the 1523 ADR reports for recombinant biologics had a traceable brand name whereas 5% of these reports contained a batch number. The results suggest a possible relationship between the availability of brand and batch number information in clinical practice and the inclusion of this information in ADR reports for biologics. The limited traceability of brand names and batch numbers in ADR reports may be primarily caused by the shortcomings in the recording of information in clinical practice. We recommend efforts to improve information-recording systems as a first step to improve the traceability of biologics in ADR reporting.
Framework for Design of Traceability System on Organic Rice Certification
NASA Astrophysics Data System (ADS)
Purwandoko, P. B.; Seminar, K. B.; Sutrisno; Sugiyanta
2018-05-01
Nowadays, preference for organic products such as organic rice has increased, because people's awareness of healthy and eco-friendly food consumption has grown. It is therefore very important to ensure the organic quality of the product to be produced. Certification is a series of processes carried out to ensure that product quality meets all criteria of organic standards. Currently, no traceability information system for organic rice certification is available; the process is still conducted manually, which causes the loss of information during storage. This paper aims to develop a traceability framework for the organic rice certification process. First, the organic certification process is discussed. Second, the unified modeling language (UML) is used to model user requirements in order to develop a traceability system for all actors in the certification process. Furthermore, the model of information captured along the certification process is explained; the model shows the information flow that has to be recorded by each actor. Finally, the challenges in implementing the system are discussed.
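As an illustrative sketch only (the paper's UML model is not reproduced in the abstract; the actors and record fields below are invented), the per-actor information records such a framework describes could be modeled like this:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TraceRecord:
    """One actor's contribution to the certification trail (illustrative fields)."""
    actor: str       # e.g. "farmer", "inspector", "certification body"
    activity: str    # what was done at this step
    evidence: str    # document or observation recorded

@dataclass
class CertificationTrail:
    """Ordered records accumulated along the organic certification process."""
    lot_id: str
    records: List[TraceRecord] = field(default_factory=list)

    def add(self, actor: str, activity: str, evidence: str) -> None:
        self.records.append(TraceRecord(actor, activity, evidence))

    def actors(self) -> List[str]:
        """The sequence of actors who touched this lot, in order."""
        return [r.actor for r in self.records]

trail = CertificationTrail("RICE-2018-001")   # invented lot identifier
trail.add("farmer", "planting", "seed purchase receipt")
trail.add("inspector", "field audit", "audit report")
trail.add("certification body", "certificate issued", "organic certificate")
```

The point of the sketch is the ordered, per-actor recording: each actor appends its own evidence, so the full trail can be reconstructed for any lot.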
NASA Astrophysics Data System (ADS)
Liu, Shihong; Meng, Hong; Zheng, Huoguo; Wu, Jiangshou
The traceability system has become an important means of food safety management. The global food industry and many countries have paid increasing attention to the construction of food traceability systems, but rarely to the tracing terminal. According to the technical requirements of the quality-safety tracing process for cereal and oil products, we design and develop a GPRS-based mobile tracing terminal for agricultural product quality tracking, to enable quality supervisors and consumers to track and trace the quality of related agricultural products anytime, anywhere.
[Management of pre-analytical nonconformities].
Berkane, Z; Dhondt, J L; Drouillard, I; Flourié, F; Giannoli, J M; Houlbert, C; Surgat, P; Szymanowicz, A
2010-12-01
The main nonconformities are enumerated to facilitate consensual codification. In each case, an action is defined: refusal to perform the examination with a request for a new sample, a request for information or correction, cancellation of results, or notification of the nurse or physician. Traceability of the curative, corrective and preventive actions is needed. A methodology and indicators are then proposed to assess nonconformities and to follow quality improvements. The laboratory information system can be used instead of dedicated software. Tools for following up nonconformity scores are proposed. Finally, we propose an organization and some tools allowing the management and control of the nonconformities occurring during the pre-examination phase.
Instructions for Plastic Encapsulated Microcircuit(PEM) Selection, Screening and Qualification.
NASA Technical Reports Server (NTRS)
King, Terry; Teverovsky, Alexander; Leidecker, Henning
2002-01-01
The use of Plastic Encapsulated Microcircuits (PEMs) is permitted on NASA Goddard Space Flight Center (GSFC) spaceflight applications, provided each use is thoroughly evaluated for thermal, mechanical, and radiation implications of the specific application and found to meet mission requirements. PEMs shall be selected for their functional advantage and availability, not for cost saving; the steps necessary to ensure reliability usually negate any initial apparent cost advantage. A PEM shall not be substituted for a form, fit and functional equivalent, high reliability, hermetic device in spaceflight applications. Due to the rapid change in wafer-level designs typical of commercial parts and the unknown traceability between packaging lots and wafer lots, lot specific testing is required for PEMs, unless specifically excepted by the Mission Assurance Requirements (MAR) for the project. Lot specific qualification, screening, radiation hardness assurance analysis and/or testing, shall be consistent with the required reliability level as defined in the MAR. Developers proposing to use PEMs shall address the following items in their Performance Assurance Implementation Plan: source selection (manufacturers and distributors), storage conditions for all stages of use, packing, shipping and handling, electrostatic discharge (ESD), screening and qualification testing, derating, radiation hardness assurance, test house selection and control, data collection and retention.
[Preliminary studies on critical control point of traceability system in wolfberry].
Liu, Sai; Xu, Chang-Qing; Li, Jian-Ling; Lin, Chen; Xu, Rong; Qiao, Hai-Li; Guo, Kun; Chen, Jun
2016-07-01
As a traditional Chinese medicine, wolfberry (Lycium barbarum) has a long cultivation history and a good foundation for industrial development. With the development of wolfberry production, the expansion of the cultivation area and the increased attention of governments and consumers to food safety, higher quality and safety requirements are being demanded of wolfberry. A quality tracing and traceability system covering the entire production process is an important technological tool for protecting wolfberry safety and maintaining the sustained, healthy development of the wolfberry industry. This article therefore analyzed wolfberry quality management from the actual situation; safety hazard sources were discussed according to HACCP (hazard analysis and critical control point) and GAP (good agricultural practice for Chinese crude drugs) principles, to provide a reference for a wolfberry traceability system. Copyright© by the Chinese Pharmaceutical Association.
Theobald, P D; Esward, T J; Dowson, S P; Preston, R C
2005-03-01
Acoustic emission (AE) is a widely used technique that has been employed for the integrity testing of a range of vessels and structures for many years. The last decade has seen advances in signal processing, such that the reliability of AE technology is now being recognised by a wider range of industries. Furthermore, the need for quality control at the manufacturing stage, and the requirements of in-service testing, are encouraging the issue of traceable measurements to be addressed. Currently, no independent calibration service for acoustic emission transducers is available within Europe. The UK's National Physical Laboratory (NPL) is undertaking work to develop a measurement facility for the traceable calibration of AE sensors. Such calibrations can contribute to greater acceptance of AE techniques in general, by meeting quality system and other traceability requirements. In this paper the key issues surrounding the development of such a facility are reviewed, including the need to establish repeatable AE sources, to select suitable test blocks and to understand the limitations imposed by AE sensors themselves. To provide an absolute measurement of the displacement on the surface of a test block, laser interferometry is employed. In this way the output voltage of an AE sensor can be directly related to the displacement detected at the block surface. A possible calibration methodology is discussed and preliminary calibration results are presented for a commercially available AE sensor, showing its response to longitudinal wave modes.
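A hedged sketch of the final relationship the abstract describes, relating sensor output voltage to interferometer-measured displacement; the readings below are invented, and the fit is a simple least-squares slope through the origin rather than NPL's actual analysis:

```python
def sensitivity_v_per_nm(displacements_nm, voltages_v):
    """Least-squares slope through the origin: sensor volts per nanometre of
    surface displacement, with displacement taken from the interferometer."""
    num = sum(d * v for d, v in zip(displacements_nm, voltages_v))
    den = sum(d * d for d in displacements_nm)
    return num / den

disp = [0.5, 1.0, 2.0, 4.0]          # invented interferometer readings, nm
volt = [0.051, 0.099, 0.202, 0.398]  # invented sensor outputs, V
s = sensitivity_v_per_nm(disp, volt)  # ~0.0998 V/nm for these data
```

A real calibration would report this sensitivity as a function of frequency and wave mode, with an uncertainty budget; the sketch only shows how the voltage-to-displacement ratio is formed.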
NASA Technical Reports Server (NTRS)
Sandford, Stephen P.
2010-01-01
The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is one of four Tier 1 missions recommended by the recent NRC Decadal Survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to provide accurate, broadly acknowledged climate records that are used to enable validated long-term climate projections that become the foundation for informed decisions on mitigation and adaptation policies that address the effects of climate change on society. The CLARREO mission accomplishes this critical objective through rigorous SI traceable decadal change observations that are sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. These same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. For the first time CLARREO will make highly accurate, global, SI-traceable decadal change observations sensitive to the most critical, but least understood, climate forcings, responses, and feedbacks. The CLARREO breakthrough is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. The required accuracy levels are determined so that climate trend signals can be detected against a background of naturally occurring variability. Climate system natural variability therefore determines what level of accuracy is overkill, and what level is critical to obtain. In this sense, the CLARREO mission requirements are considered optimal from a science value perspective. The accuracy for decadal change traceability to SI standards includes uncertainties associated with instrument calibration, satellite orbit sampling, and analysis methods. 
Unlike most space missions, the CLARREO requirements are driven not by the instantaneous accuracy of the measurements, but by accuracy in the large time/space scale averages that are key to understanding decadal changes.
Flight software development for the isothermal dendritic growth experiment
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Winsa, Edward A.; Glicksman, Martin E.
1989-01-01
The Isothermal Dendritic Growth Experiment (IDGE) is a microgravity materials science experiment scheduled to fly in the cargo bay of the shuttle on the United States Microgravity Payload (USMP) carrier. The experiment will be operated by real-time control software which will not only monitor and control onboard experiment hardware, but will also communicate, via downlink data and uplink commands, with the Payload Operations Control Center (POCC) at NASA George C. Marshall Space Flight Center (MSFC). The software development approach being used to implement this system began with software functional requirements specification. This was accomplished using the Yourdon/DeMarco methodology as supplemented by the Ward/Mellor real-time extensions. The requirements specification in combination with software prototyping was then used to generate a detailed design consisting of structure charts, module prologues, and Program Design Language (PDL) specifications. This detailed design will next be used to code the software, followed finally by testing against the functional requirements. The result will be a modular real-time control software system with traceability through every phase of the development process.
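The traceability goal the abstract describes, each requirement linked through design to test, can be sketched as a toy coverage check; all identifiers below are invented, not from the IDGE project:

```python
# Toy requirements-traceability check: every requirement must be linked to a
# design module and to at least one test case.
requirements = {"R1": "monitor hardware", "R2": "downlink data", "R3": "uplink commands"}
design = {"R1": "monitor_module", "R2": "telemetry_module", "R3": "command_module"}
tests = {"R1": ["T-101"], "R2": ["T-201", "T-202"], "R3": ["T-301"]}

def untraced(requirements, design, tests):
    """Requirement IDs missing either a design link or a test link."""
    return sorted(r for r in requirements
                  if r not in design or not tests.get(r))

gaps = untraced(requirements, design, tests)  # empty when the trail is complete
```

In a development process like the one described, such a check would be run at each phase boundary so that no requirement loses its thread between specification, design and test.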
Flight software development for the isothermal dendritic growth experiment
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Winsa, Edward A.; Glicksman, M. E.
1990-01-01
The Isothermal Dendritic Growth Experiment (IDGE) is a microgravity materials science experiment scheduled to fly in the cargo bay of the shuttle on the United States Microgravity Payload (USMP) carrier. The experiment will be operated by real-time control software which will not only monitor and control onboard experiment hardware, but will also communicate, via downlink data and uplink commands, with the Payload Operations Control Center (POCC) at NASA George C. Marshall Space Flight Center (MSFC). The software development approach being used to implement this system began with software functional requirements specification. This was accomplished using the Yourdon/DeMarco methodology as supplemented by the Ward/Mellor real-time extensions. The requirements specification in combination with software prototyping was then used to generate a detailed design consisting of structure charts, module prologues, and Program Design Language (PDL) specifications. This detailed design will next be used to code the software, followed finally by testing against the functional requirements. The result will be a modular real-time control software system with traceability through every phase of the development process.
Progress toward a performance based specification for diamond grinding wheels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Piscotty, M.S.; Blaedel, K.L.
1996-11-12
This work sought to improve communication between users and makers of fine diamond grinding wheels. A promising avenue for this is to formulate a voluntary product standard that comprises performance indicators bridging the gap between specific user requirements and the details of wheel formulations. We propose a set of performance specifiers, or figures-of-merit, that might be assessed by straightforward and traceable testing methods but do not compromise proprietary information of the wheel user or wheel maker. One such performance indicator might be wheel hardness. In addition, we consider technologies that might be required to realize the benefits of optimized grinding wheels. A non-contact wheel-to-workpiece proximity sensor may provide a means of monitoring wheel wear, and thus wheel position, for wheels that exhibit high wear rates in exchange for improved surface finish.
Traceable quantum sensing and metrology relied upon a quantum electrical triangle principle
NASA Astrophysics Data System (ADS)
Fang, Yan; Wang, Hengliang; Yang, Xinju; Wei, Jingsong
2016-11-01
Hybrid quantum state engineering in quantum communication and imaging [1-2] needs traceable quantum sensing and metrology, which are especially critical to the quantum internet [3] and to precision measurements [4] that are important across all fields of science and technology. We aim to set up a mode of traceable quantum sensing and metrology. We developed a method by specially transforming an atomic force microscope (AFM) and a scanning tunneling microscope (STM) into a conducting atomic force microscope (C-AFM) with a feedback control loop, wherein quantum entanglement enabling higher precision relied upon a set-point; a visible-light laser beam-controlled interferometer with a surface standard on the z axis; diffractometers with lateral standards on the x-y axes; four-quadrant photodiode detectors; a scanner and its imaging software; a phase-locked pre-amplifier; a cantilever with a kHz Pt/Au conducting tip; a double-barrier tunneling junction model; an STM circuit using frequency modulation; and a quantum electrical triangle principle involving the single-electron tunneling effect, the quantum Hall effect and the Josephson effect [5]. The average and standard deviation of repeated measurements by C-AFM on a 1 nm-height local micro-region of a nanomedicine-crystal hybrid quantum state engineering surface, and its differential pA-level current and voltage (dI/dV) in the time domain, were converted into the International System of Units as siemens (S): an indicated value of 0.86×10^-12 S (n=6), whose relative standard uncertainty was superior to the relative standard uncertainty reference value of 2.3×10^-10 S for the 2012 CODATA quantized conductance [6]. It is concluded that traceable quantum sensing and metrology is emerging.
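As a minimal, hedged sketch of the statistics the abstract mentions (mean and relative standard uncertainty of n repeated readings; the values below are invented, not the paper's data):

```python
import math

def relative_standard_uncertainty(values):
    """Mean of n repeated measurements, and the standard uncertainty of that
    mean divided by the mean (a Type A evaluation in GUM terminology)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((x - mean) ** 2 for x in values) / (n - 1)  # sample variance
    u_mean = math.sqrt(var / n)   # standard uncertainty of the mean
    return mean, u_mean / mean

# invented repeated dI/dV readings, in picosiemens
g = [0.85, 0.87, 0.86, 0.86, 0.85, 0.87]
mean, rel_u = relative_standard_uncertainty(g)
```

Traceability then comes from expressing the mean in SI units (siemens) against a quantum-electrical reference, which is the step the paper's conductance comparison performs.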
Chen, Xiangyu; Zhao, Xin; Abeyweera, Thushara P.; Rotenberg, Susan A.
2012-01-01
A previous report (Biochemistry 46: 2364–2370, 2007) described the application of The Traceable Kinase Method to identify substrates of PKCα in non-transformed human breast MCF-10A cells. Here, a non-radioactive variation of this method compared the phospho-protein profiles of three traceable PKC isoforms (α, δ and ζ) for the purpose of identifying novel, isoform-selective substrates. Each FLAG-tagged traceable kinase was expressed and co-immunoprecipitated along with high affinity substrates. The isolated kinase and its associated substrates were subjected to an in vitro phosphorylation reaction with traceable kinase-specific N6-phenyl-ATP, and the resulting phospho-proteins were analyzed by Western blot with an antibody that recognizes the phosphorylated PKC consensus site. Phospho-protein profiles generated by PKC-α and -δ were similar and differed markedly from that of PKC-ζ. Mass spectrometry of selected bands revealed known PKC substrates and several potential substrates that included the small GTPase-associated effector protein Cdc42 effector protein-4 (CEP4). Of those potential substrates tested, only CEP4 was phosphorylated by pure PKC-α, –δ, and −ζ isoforms in vitro, and by endogenous PKC isoforms in MCF-10A cells treated with DAG-lactone, a membrane permeable PKC activator. Under these conditions, the stoichiometry of CEP4 phosphorylation was 3.2 ± 0.5 (mol phospho-CEP4/mol CEP4). Following knock-down with isoform-specific shRNA-encoding plasmids, phosphorylation of CEP4 was substantially decreased in response to silencing of each of the three isoforms (PKC–α, –δ, or –ζ), whereas testing of kinase-dead mutants supported a role for only PKC-α and –δ in CEP4 phosphorylation. These findings identify CEP4 as a novel intracellular PKC substrate that is phosphorylated by multiple PKC isoforms. PMID:22897107
System description for DART (Decision Analysis for Remediation Technologies)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nonte, J.; Bolander, T.; Nickelson, D.
1997-09-01
DART is a computer aided system populated with influence models to determine quantitative benefits derived by matching requirements and technologies. The DART database is populated with data from over 900 DOE sites from 10 Field Offices. These sites are either source terms, such as buried waste pits, or soil or groundwater contaminated plumes. The data, traceable to published documents, consists of site-specific data (contaminants, area, volume, depth, size, remedial action dates, site preferred remedial option), problems (e.g., offsite contaminant plume), and Site Technology Coordinating Group (STCG) need statements (also contained in the Ten-Year Plan). DART uses this data to calculate and derive site priorities, risk rankings, and site specific technology requirements. DART is also populated with over 900 industry and DOE SCFA technologies. Technology capabilities can be used to match technologies to waste sites based on the technology's capability to meet site requirements and constraints. Queries may be used to access, sort, roll-up, and rank site data. Data roll-ups may be graphically displayed.
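A toy sketch, not DART's actual logic, of matching technologies to a site by capability coverage (all names below are invented):

```python
def match_technologies(site_requirements, technologies):
    """A technology matches a site if its capability set covers every site
    requirement (a much-simplified stand-in for DART-style matching)."""
    matches = []
    for tech, capabilities in technologies.items():
        if site_requirements <= capabilities:   # subset test on sets
            matches.append(tech)
    return sorted(matches)

site = {"groundwater", "tritium"}               # invented site requirements
techs = {
    "pump-and-treat": {"groundwater", "tritium", "solvents"},
    "soil-vapor-extraction": {"soil", "solvents"},
    "permeable-barrier": {"groundwater", "tritium"},
}
found = match_technologies(site, techs)
```

The real system additionally weighs constraints, priorities and risk rankings; the sketch only shows the requirement-coverage idea at the core of the matching.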
Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei
2015-01-01
In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we proposed a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First of all, the transformations of traceability units were defined and analyzed throughout the cattle/beef chain. Secondly, we described in detail the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain, and explained a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. Then, the traceability system was implemented based on the Fosstrak and FreePastry software packages, and animal ear tag codes and electronic product codes (EPCs) were employed to identify traceability units. Finally, a cattle/beef supply chain that included a breeding business, a slaughter and processing business, a distribution business and a sales outlet was used as a case study to evaluate the beef supply chain traceability system. The results demonstrated that the major advantages of the traceability system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain. PMID:26431340
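As a hedged sketch (not real EPCIS syntax or the authors' Fosstrak implementation; the identifiers below are invented), the gapless event chain described above can be illustrated as:

```python
from collections import defaultdict

# Illustrative event log keyed by an EPC-like identifier.
events = defaultdict(list)

def record_event(epc, step, business, detail):
    """Append one supply-chain event for a traceability unit."""
    events[epc].append({"step": step, "business": business, "detail": detail})

def trace(epc):
    """Gapless trace: every recorded step for a unit, in order."""
    return [(e["step"], e["business"]) for e in events[epc]]

record_event("urn:epc:cattle:0001", "breeding", "Farm A", "ear tag issued")
record_event("urn:epc:cattle:0001", "slaughter", "Plant B", "carcass graded")
record_event("urn:epc:cattle:0001", "distribution", "Cold chain C", "shipped chilled")
record_event("urn:epc:cattle:0001", "sale", "Outlet D", "retail cut sold")
```

The design choice the abstract highlights, one shared identifier carried across businesses, is what makes the trace query trivial: every actor appends to the same key.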
Tracing Asian Seabass Individuals to Single Fish Farms Using Microsatellites
Yue, Gen Hua; Xia, Jun Hong; Liu, Peng; Liu, Feng; Sun, Fei; Lin, Grace
2012-01-01
Traceability through physical labels is well established, but it is not highly reliable because physical labels can be easily changed or lost. The application of DNA markers to food traceability plays an increasingly important role in consumer protection and confidence building. In this study, we tested the efficiency of 16 polymorphic microsatellites, and combinations thereof, for tracing 368 fish to the four populations from which they originated. Using maximum likelihood and Bayesian methods, the three most efficient microsatellites were required to assign over 95% of fish to the correct populations. Selection of markers based on the assignment score estimated with the software WHICHLOCI was most effective in choosing markers for individual assignment, followed by selection based on the allele number of individual markers. By combining rapid DNA extraction and high-throughput genotyping of selected microsatellites, it is possible to conduct routine genetic traceability testing with high accuracy in Asian seabass. PMID:23285169
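A minimal sketch of maximum-likelihood population assignment of the kind the abstract describes, assuming Hardy-Weinberg proportions and independent loci; the allele frequencies and genotype below are invented:

```python
import math

def assign_population(genotype, allele_freqs):
    """Pick the population whose allele frequencies give the genotype the
    highest log-likelihood (Hardy-Weinberg proportions, independent loci)."""
    best_pop, best_ll = None, float("-inf")
    for pop, freqs in allele_freqs.items():
        ll = 0.0
        for locus, (a1, a2) in genotype.items():
            p1 = freqs[locus].get(a1, 1e-6)  # small floor for unseen alleles
            p2 = freqs[locus].get(a2, 1e-6)
            geno_p = p1 * p2 * (1 if a1 == a2 else 2)  # HW genotype probability
            ll += math.log(geno_p)
        if ll > best_ll:
            best_pop, best_ll = pop, ll
    return best_pop

allele_freqs = {  # invented per-farm allele frequencies at two microsatellites
    "farmA": {"msat1": {"180": 0.8, "184": 0.2}, "msat2": {"210": 0.7, "214": 0.3}},
    "farmB": {"msat1": {"180": 0.1, "184": 0.9}, "msat2": {"210": 0.2, "214": 0.8}},
}
fish = {"msat1": ("180", "180"), "msat2": ("210", "214")}
pop = assign_population(fish, allele_freqs)
```

Tools like WHICHLOCI rank markers by how much each locus contributes to this kind of assignment score; the sketch shows only the core likelihood comparison.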
A Collection Scheme for Tracing Information of Pig Safety Production
NASA Astrophysics Data System (ADS)
Luo, Qingyao; Xiong, Benhai; Yang, Liang
This study takes one main production pattern of smallholder pig farming in Tianjin as a study prototype, analyzes in depth the characteristics of information for tracing inputs, including vaccines, feeds, veterinary drugs and supervision tests in pig farming, proposes inputs metadata, criteria for integrating input events, and interface norms for data transmission, and develops a 2D ear tag identification and traceability information collection system for pig safety production based on a mobile PDA. The system implements functions including the setting and invalidation of 2D ear tags, the collection of tracing inputs and supervision data on the mobile PDA, and the integration of tracing events (epidemic, feed, drug and supervision events) on the traceability data center (server). The PDA information collection system has been applied for demonstration in Tianjin; collection is simple, convenient and feasible, and could meet the requirements of a traceability information system for pig safety production.
Kleter, Gijs; McFarland, Sarah; Bach, Alex; Bernabucci, Umberto; Bikker, Paul; Busani, Luca; Kok, Esther; Kostov, Kaloyan; Nadal, Anna; Pla, Maria; Ronchi, Bruno; Terre, Marta; Einspanier, Ralf
2017-10-06
This review, which has been prepared within the frame of the European Union (EU)-funded project MARLON, surveys the organisation and characteristics of specific livestock and feed production chains (conventional, organic, GM-free) within the EU, with an emphasis on controls, regulations, traceability, and common production practices. Furthermore, an overview of the origin of animal feed used in the EU, as well as an examination of the use of genetically modified organisms (GMOs) in feed, is provided. The data show that livestock are traceable at the herd or individual level, depending on the species. Husbandry practices can vary widely according to geography and animal species, whilst controls and checks are in place for notifiable diseases and general health symptoms (such as mortality, disease, productive performance). For feeds, only coarse estimates, at best, can be made of the amount of GM feed ingredients to which an animal is exposed. Labeling requirements are apparently correctly followed. Provided that confounding factors are taken into account, practices such as organic agriculture that explicitly involve the use of non-GM feeds could be used for comparison to those involving the use of GM feed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.
Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher
2017-08-01
Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.
Hernández-Jover, M; Schembri, N; Toribio, J-A; Holyoake, P K
2009-10-01
To evaluate the implementation and barriers to adoption, among pig producers, of a newly introduced traceability and food safety system in Australia. Implementation of the PigPass national vendor declaration (NVD) linked to an on-farm quality assurance (QA) program was evaluated in May and December 2007 at saleyards and abattoirs in New South Wales, Victoria and Queensland. Four focus group discussions with saleyard producers were held between April and July 2007. Implementation of the PigPass system, in terms of accurate completion of the form and QA accreditation, was higher at the export abattoir than at the regional saleyard at the first audit (P < 0.01). Implementation increased at the second audit at the abattoirs, but little change over time was observed at saleyards. Approximately half of the producers at saleyards used photocopied PigPass forms, most made at least one error (>64%), and many vendors did not appear to be QA-accredited. During focus groups, producers expressed the view that PigPass implementation improved animal and product traceability. They identified the associated costs and a perceived lack of support by information providers as obstacles to adoption. Improvement in the implementation of PigPass among producers marketing pigs at export abattoirs was observed during the 8-month period of the study. There is a need for a more uniform message to producers from government agencies on the importance of the PigPass NVD and QA, and for extension and education targeted toward producers supplying pigs to saleyards and domestic abattoirs, to ensure compliance with the traceability requirements.
The Portfolio Approach Developed to Underpin the Capital Investment Program Plan Review (CIPPR)
2014-11-06
Basinger, Director, DCI, CFD Scientific Letter: The PORTFOLIO APPROACH developed to underpin the Capital Investment Program Plan Review (CIPPR). To better...prepare senior management for meetings about CIPPR in November 2014, this scientific letter has been prepared upon request [1] to clarify some of...Research and Analysis in support of CIPPR was to: 1. Provide scientific support to the development of a traceable and sustainable approach and process by
Accurate Radiometry from Space: An Essential Tool for Climate Studies
NASA Technical Reports Server (NTRS)
Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma
2011-01-01
The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a primary standard and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a "metrology laboratory in space". Keywords: climate change; Earth observation; satellites; radiometry; solar irradiance
2010-01-01
The importance of effective and timely traceability, both in the recall of substances of human origin (blood, cells, tissues and organs) implicated in infectious transmission and in the prevention of inappropriate use of substances of human origin, is now well recognised. However, traceability remains poorly understood and inadequately controlled in many cases. In particular there is: a lack of appreciation of the complexity of the traceability pathway; a fragmented approach to traceability; and an assumption that traceability data are static. The traceability path for a single tissue donor may involve dozens or even hundreds of different organizations, each responsible for one segment of the path. Whilst responsibility within each organization may be clearly defined, responsibility for maintaining the interfaces between organizations is often less clear. Traceability is seldom regarded in a holistic manner, the assumption being made that if each segment of the pathway is correctly maintained then the full path will be intact. End-to-end traceability audits are not routinely performed, and the only true test of the trail occurs when recall is required, often with inadequate results. PMID:20628821
Analysis of selected volatile organic compounds at background level in South Africa.
NASA Astrophysics Data System (ADS)
Ntsasa, Napo; Tshilongo, James; Lekoto, Goitsemang
2017-04-01
Volatile organic compounds (VOCs) are measured globally in urban air pollution monitoring and at background level at specific locations such as the Cape Point station. Urban pollution monitoring is legislated at government level; the background levels, however, are scientific outputs of the World Meteorological Organization Global Atmosphere Watch (WMO/GAW) programme. Cape Point is a key station in the Southern Hemisphere which monitors greenhouse gases and halocarbons, with data reported for over the past decade. The Cape Point station does not currently have a measurement capability for VOCs. A joint research project between the Cape Point station and the National Metrology Institute of South Africa (NMISA) aims to perform qualitative and quantitative analysis of the volatile organic compounds listed in the GAW programme. NMISA is responsible for developing, maintaining and disseminating primary reference gas mixtures which are directly traceable to the International System of Units (SI). The results for some volatile organic compounds sampled in high-pressure gas cylinders will be presented. The analysis of samples was performed by gas chromatography with flame ionisation detection and mass selective detection (GC-FID/MSD) with a dedicated cryogenic pre-concentrator system. Keywords: volatile organic compounds, gas chromatography, pre-concentrator
Automating Traceability for Generated Software Artifacts
NASA Technical Reports Server (NTRS)
Richardson, Julian; Green, Jeffrey
2004-01-01
Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works, with varying degrees of success, for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman Filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.
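The fine-grained linking idea lends itself to a tiny illustration: record, for every line the generator emits, the specification element it was derived from. A minimal sketch in Python; all names here (`generate`, the toy spec) are invented for illustration and are not the AutoFilter mechanism:

```python
# Minimal sketch: tag each generated line with the ID of the
# specification element it was derived from. Names are invented.

def generate(spec):
    """Translate a toy spec (name -> expression) into assignment
    statements, recording one trace link per emitted line."""
    code, trace = [], {}
    for line_no, (name, expr) in enumerate(spec.items(), start=1):
        code.append(f"{name} = {expr}")
        trace[line_no] = name          # generated line -> spec element
    return code, trace

spec = {"gain": "p / (p + r)", "estimate": "x + gain * (z - x)"}
code, trace = generate(spec)
assert trace[2] == "estimate"          # line 2 traces back to 'estimate'
```

The point of the sketch is only that trace links fall out of the derivation for free, which is why the technique generalizes to any artifact-from-artifact process.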
Proceedings of the August 2011 Traceability Research Summit.
Bhatt, Tejas; Buckley, Greg; McEntire, Jennifer C
2013-12-01
IFT's Traceability Improvement Initiative aims to advance work in the area of food product tracing through several means, including hosted events where thought leaders exchange knowledge and ideas. In August 2011, the Initiative, in collaboration with GS1 US, convened a group of 50 product tracing stakeholders as a follow-on to a successful event the month prior. Representatives conducting pilots or implementation studies in produce, seafood, dairy, and other industries discussed the objectives, challenges and learnings. Some of the learnings from on-going initiatives included the sense that better information management provides a return on investment; that data often exist but may not necessarily be appropriately linked through the supply chain; and that enhanced product tracing enables better accountability and quality control. Challenges identified in enabling traceability throughout the supply chain were the complexity of distribution; the need for training, communication, and collaboration; and improving the reliability, quality and security of data captured, stored and shared, as well as the importance of standards in data and interoperability of technology. Several approaches to overcoming these challenges were discussed. The first approach incrementally improves upon the current "one up/one down" system by requiring electronic records and tracking internal as well as external critical tracking events. The benefits of this approach are its similarity to existing regulatory requirements and low cost of implementation, resulting in a higher probability of adoption. The major disadvantage of this approach is the longer response time required during a trace (back or forward). The second approach is similar to a "pedigree" approach, where historical information about the food travels with it through the value chain. A major advantage of this approach is the quickest response time during a trace.
Some of the disadvantages of this approach are the potential for misuse of data, the volume of data required to be maintained at value chain end points, and data privacy concerns. The third approach requires individual nodes within the value chain to maintain electronic records for their own data and make them available for querying during a traceback for outbreak investigation. The major advantage of this approach is the protection of confidential information and the potential for quicker access during a trace. However, the primary disadvantage of this approach is the need for greater computational power and a more complex mechanism for linking the value chain through the data. As next steps, a subgroup will work on clarifying the approach to meeting the goals of traceability, better defining critical tracking events, and articulating the strategy and return on investment from a regulatory and industry perspective. This will result in improved alignment of on-going traceability pilots and initiatives as well as a more actionable guidance document for public review. © 2012 Institute of Food Technologists®
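The trade-off between the first ("one up/one down") approach and the others comes down to how many queries a traceback needs: each node records only its immediate source, so a full trace must walk the chain hop by hop. A minimal sketch with invented supply-chain data:

```python
# Each supply-chain node keeps only a "one up" record (its immediate
# source). A traceback therefore walks the chain node by node -- one
# query per hop, which is the longer response time noted above.
# The chain below is invented example data.

one_up = {                       # node -> where its product came from
    "retailer": "distributor",
    "distributor": "processor",
    "processor": "farm",
}

def trace_back(start):
    path = [start]
    while path[-1] in one_up:
        path.append(one_up[path[-1]])   # one query per hop
    return path

assert trace_back("retailer") == ["retailer", "distributor", "processor", "farm"]
```

A pedigree approach would instead carry the whole `path` with the product, trading query time for data volume at the chain's end points.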
Traceability validation of a high speed short-pulse testing method used in LED production
NASA Astrophysics Data System (ADS)
Revtova, Elena; Vuelban, Edgar Moreno; Zhao, Dongsheng; Brenkman, Jacques; Ulden, Henk
2017-12-01
Industrial processes for LED (light-emitting diode) production include testing of LED light output performance. Most of these processes are monitored and controlled by measuring LEDs optically, electrically and thermally with high-speed short-pulse measurement methods. However, these methods are not standardized, and so much of the information is proprietary that it is impossible for third parties, such as NMIs, to trace and validate them. These techniques are known to have traceability issues and metrological inadequacies. Often, as a result, the claimed performance specifications of LEDs are overstated, which leads to manufacturers experiencing customer dissatisfaction and a large percentage of failures in the daily use of LEDs. In this research, a traceable setup is developed to validate one of the high-speed testing techniques, investigate the inadequacies and address the traceability issues. A well-characterised short square pulse of 25 ms is applied to chip-on-board (CoB) LED modules to investigate the light output and colour content. We conclude that the short-pulse method is very efficient, provided that a well-defined electrical current pulse is applied and the stabilization time of the device is accurately determined a priori. No colour shift is observed. The largest contributors to the measurement uncertainty are a badly defined current pulse and an inaccurate calibration factor.
Lim, Jinsook; Song, Kyung Eun; Song, Sang Hoon; Choi, Hyun-Jung; Koo, Sun Hoe; Kwon, Gye Choel
2016-05-01
The traceability of clinical results to internationally recognized and accepted reference materials and reference measurement procedures has become increasingly important, and the establishment of traceability is now a mandatory requirement for all in vitro diagnostic devices. The objectives were to evaluate the traceability of the Abbott Architect c8000 system (Abbott Laboratories, Abbott Park, Illinois), consisting of calibrators and reagents, across 4 different chemistry analyzers, and to evaluate its general performance on the Toshiba 2000FR NEO (Toshiba Medical Systems Corporation, Otawara-shi, Tochigi-ken, Japan). For the assessment of traceability, secondary reference materials were evaluated 5 times, and bias was then calculated. Precision, linearity, and carryover were determined according to the guidelines of the Clinical and Laboratory Standards Institute (Wayne, Pennsylvania). The biases from the 4 analyzers ranged from -2.33% to 2.70% on the Toshiba 2000FR NEO, -2.33% to 5.12% on the Roche Hitachi 7600 (Roche Diagnostics International, Basel, Switzerland), -0.93% to 2.87% on the Roche Modular, and -2.16% to 2.86% on the Abbott Architect c16000. The total coefficients of variation of all analytes were less than 5%. The coefficients of determination (R²) were greater than 0.9900. The carryover rate ranged from -0.54% to 0.17%. Abbott clinical chemistry assays met the performance criteria based on desirable biological variation for precision, bias, and total error. They also showed excellent linearity and carryover. These clinical chemistry assays were therefore found to be accurate and reliable, and are readily applicable on the various platforms used in this study.
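The bias and precision figures above reduce to simple statistics over replicate measurements of a reference material. A sketch of the arithmetic with invented numbers (not the study's data):

```python
import statistics

# Percent bias of replicate measurements against the assigned value of
# a secondary reference material, plus the coefficient of variation (CV).
# All numbers are invented example data.

reference = 100.0                                   # assigned value
replicates = [101.2, 100.8, 101.5, 100.9, 101.1]    # 5 evaluations

mean = statistics.mean(replicates)
bias_pct = (mean - reference) / reference * 100      # percent bias
cv_pct = statistics.stdev(replicates) / mean * 100   # precision as CV

assert round(bias_pct, 2) == 1.1
assert cv_pct < 5          # the study's acceptance criterion for CV
```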
Establishing the traceability of a uranyl nitrate solution to a standard reference material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, C.H.; Clark, J.P.
1978-01-01
A uranyl nitrate solution for use as a Working Calibration and Test Material (WCTM) was characterized using a statistically designed procedure to document traceability to National Bureau of Standards Standard Reference Material SRM-960. A Reference Calibration and Test Material (RCTM) was prepared from SRM-960 uranium metal to approximate the acid and uranium concentration of the WCTM. This solution was used in the characterization procedure. Details of preparing, handling, and packaging these solutions are covered. Two outside laboratories, each having measurement expertise with a different analytical method, were selected to measure both solutions according to the procedure for characterizing the WCTM. Two different methods were also used for the in-house characterization work. All analytical results were tested for statistical agreement before the WCTM concentration and limit-of-error values were calculated. A concentration value was determined with a relative limit of error (RLE) of approximately 0.03%, which was better than the target RLE of 0.08%. The use of this working material eliminates the expense of using SRMs to fulfill traceability requirements for uranium measurements on this type of material. Several years' supply of uranyl nitrate solution with NBS traceability was produced. The cost of this material was less than 10% of that of an equal quantity of SRM-960 uranium metal.
Characterization Approaches to Place Invariant Sites on SI-Traceable Scales
NASA Technical Reports Server (NTRS)
Thome, Kurtis
2012-01-01
The effort to understand the Earth's climate system requires a complete integration of remote sensing imager data across time and multiple countries. Such an integration necessarily requires ensuring inter-consistency between multiple sensors to create the data sets needed to understand the climate system. Past efforts at inter-consistency have forced agreement between two sensors using sources that are viewed by both sensors at nearly the same time, and thus tend to be near polar regions over snow and ice. The current work describes a method that would provide an absolute radiometric calibration of a sensor rather than an inter-consistency of one sensor relative to another. The approach also relies on defensible error budgets that eventually provide a cross-comparison of sensors without systematic errors. The basis of the technique is a model-based, SI-traceable prediction of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The effort effectively works to characterize the sites as sources with known top-of-atmosphere radiance, allowing accurate intercomparison of sensor data without the need for coincident views. Data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), and Moderate Resolution Imaging Spectroradiometer (MODIS) are used to demonstrate the difficulties of cross calibration as applied to current sensors. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The radiance comparisons lead to significant differences created by the specific solar model used for each sensor. The paper also proposes methods to mitigate the largest error sources in future systems.
The results from these historical intercomparisons provide the basis for a set of recommendations to ensure future SI-traceable cross calibration using future missions such as CLARREO and TRUTHS. The paper describes a proposed approach that relies on model-based, SI-traceable predictions of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The basis of the method is highly accurate measurements of at-sensor radiance of sufficient quality to understand the spectral and BRDF characteristics of the site and sufficient historical data to develop an understanding of temporal effects from changing surface and atmospheric conditions.
Ma, Dong-Hong; Wang, Xi-Chang; Liu, Li-Ping; Liu, Yuan
2011-04-01
The geographical origin traceability of food, an important part of any traceability system, is effective in protecting the quality and safety of foodstuffs. Near-infrared spectroscopy (NIR), a powerful technique for geographical origin traceability, has attracted extensive attention from scientists due to its speed, freedom from pollution and simple operation. This paper presents the advantages and disadvantages of techniques that have been used for food geographical origin traceability. The basic principles of NIR and its applications to the geographical origin traceability of different foods are presented as well. Furthermore, problems in these applications are analyzed and future development trends are discussed.
Pasqualone, Antonella; Montemurro, Cinzia; di Rienzo, Valentina; Summo, Carmine; Paradiso, Vito Michele; Caponio, Francesco
2016-08-01
In recent years, an increasing number of typicality marks has been awarded to high-quality olive oils produced from local cultivars. In this case, quality control requires effective varietal checks of the starting materials. Moreover, accurate cultivar identification is essential in vegetative-propagated plants distributed by nurseries and is a pre-requisite to register new cultivars. Food genomics provides many tools for cultivar identification and traceability from tree to oil and table olives. The results of the application of different classes of DNA markers to olive with the purpose of checking cultivar identity and variability of plant material are extensively discussed in this review, with special regard to repeatability issues and polymorphism degree. The characterization of olive germplasm from all countries of the Mediterranean basin and from less studied geographical areas is described and innovative high-throughput molecular tools to manage reference collections are reviewed. Then the transferability of DNA markers to processed products - virgin olive oils and table olives - is overviewed to point out strengths and weaknesses, with special regard to (i) the influence of processing steps and storage time on the quantity and quality of residual DNA, (ii) recent advances to overcome the bottleneck of DNA extraction from processed products, (iii) factors affecting whole comparability of DNA profiles between fresh plant materials and end-products, (iv) drawbacks in the analysis of multi-cultivar versus single-cultivar end-products and (v) the potential of quantitative polymerase chain reaction (PCR)-based techniques. © 2016 Society of Chemical Industry.
[The requirements of standard and conditions of interchangeability of medical articles].
Men'shikov, V V; Lukicheva, T I
2013-11-01
The article deals with the possibility of applying specific approaches to the evaluation of the interchangeability of medical articles for laboratory analysis. In developing standardized analytical technologies for laboratory medicine and formulating standard requirements addressed to manufacturers of medical articles, clinically validated requirements are to be followed. These requirements include the sensitivity and specificity of techniques, the accuracy and precision of research results, and the stability of reagent quality under the particular conditions of their transportation and storage. The validity of requirements formulated in standards and addressed to manufacturers of medical articles can be proved using a reference system, which includes master forms and standard samples, reference techniques and reference laboratories. This approach is supported by data from the evaluation of testing systems for measuring levels of thyrotropic hormone, thyroid hormones and glycated hemoglobin HbA1c. Versions of testing systems can be considered interchangeable only if their results correspond to, and are comparable with, the results of the reference technique. In the absence of a functioning reference system, the resources of the Joint Committee for Traceability in Laboratory Medicine make it possible for manufacturers of reagent kits to apply certified reference materials in developing kits for a large list of analytes.
Method for traceable measurement of LTE signals
NASA Astrophysics Data System (ADS)
Sunder Dash, Soumya; Pythoud, Frederic; Leuchtmann, Pascal; Leuthold, Juerg
2018-04-01
This contribution presents a reference setup to measure the power of the cell-specific resource elements present in downlink long term evolution (LTE) signals in a way that the measurements are traceable to the international system of units. This setup can be used to calibrate the LTE code-selective field probes that are used to measure the radiation of base stations for mobile telephony. It can also be used to calibrate LTE signal generators and receivers. The method is based on traceable scope measurements performed directly at the output of a measuring antenna. It implements offline digital signal processing demodulation algorithms that cover digital down-conversion, timing synchronization, frequency synchronization, phase synchronization and robust LTE cell identification to produce the downlink time-frequency LTE grid. Experimental results on conducted test scenarios, in both single-input-single-output and multiple-input-multiple-output antenna configurations, are promising, confirming measurement uncertainties of the order of 0.05 dB with a coverage factor of 2.
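An uncertainty of 0.05 dB quoted with a coverage factor of 2 follows the usual GUM convention: independent standard-uncertainty components are combined in quadrature, then multiplied by k = 2 for roughly 95% coverage. A sketch of the arithmetic with invented component values (not the paper's budget):

```python
import math

# Combine independent standard uncertainty components in quadrature,
# then expand with coverage factor k = 2 (~95 % coverage).
# The component values (in dB) are invented example data.

components_db = [0.02, 0.01, 0.012]   # e.g. scope cal, antenna, DSP
u_combined = math.sqrt(sum(u**2 for u in components_db))
k = 2
U = k * u_combined                     # expanded uncertainty

assert 0.05 < U < 0.06                 # lands near the 0.05 dB scale
```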
Nuclear reference materials to meet the changing needs of the global nuclear community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, H.R.; Gradle, C.G.; Narayanan, U.I.
New Brunswick Laboratory (NBL) serves as the U.S. Government's certifying authority for nuclear reference materials and measurement calibration standards. In this role, NBL provides nuclear reference materials certified for chemical and/or isotopic compositions traceable to a nationally accepted, internationally compatible reference base. Emphasis is now changing as to the types of traceable nuclear reference materials needed as operations change within the Department of Energy complex and at nuclear facilities around the world. New challenges include: environmental and waste minimization issues; facilities and materials transitioning from processing to storage modes, with corresponding changes in the types of measurements being performed; emphasis on requirements for characterization of waste materials; and difficulties in transporting nuclear materials and international factors, including IAEA influences. During these changing times, it is critical that traceable reference materials be provided for calibration or validation of the performance of measurement systems. This paper will describe actions taken and planned to meet the changing reference material needs of the global nuclear community.
Xiao, Xinqing; Fu, Zetian; Qi, Lin; Mira, Trebar; Zhang, Xiaoshuan
2015-10-01
The main export varieties in China are brand-name, high-quality bred aquatic products. Among them, tilapia has become the most important and fastest-growing species, since extensive consumer markets in North America and Europe have evolved as a result of commodity prices, year-round availability and the quality of fresh and frozen products. As the largest tilapia farming country, China has over one-third of its tilapia production devoted to further processing to meet foreign market demand. Using tilapia fillet processing as a case, this paper introduces the efforts made in developing and evaluating ITS-TF: an intelligent traceability system integrated with statistical process control (SPC) and fault tree analysis (FTA). Observations, literature review and expert questionnaires were used for system requirements and knowledge acquisition; scenario simulation was applied to evaluate and validate ITS-TF performance. The results show that the traceability requirement has evolved from a firefighting model to a proactive model for enhancing process management capacity for food safety; ITS-TF accordingly transforms itself into an intelligent system providing early-warning and process-management functions through the integrated SPC and FTA. For further system optimization, refinement and performance improvement, a valuable suggestion emerged: automatic data acquisition and communication technology should be integrated into ITS-TF. © 2014 Society of Chemical Industry.
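The SPC side of such a system typically amounts to control-chart logic: compute limits from in-control history and raise an early warning when a reading falls outside them. A minimal sketch with invented process data (not ITS-TF's actual implementation):

```python
import statistics

# Shewhart-style individual control chart: limits from in-control
# history, then flag out-of-control points for early warning.
# Temperatures are invented example data (e.g. a chilling step, °C).

history = [4.0, 4.2, 3.9, 4.1, 4.0, 4.3, 3.8, 4.1]   # in-control runs
mean = statistics.mean(history)
sigma = statistics.stdev(history)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma          # control limits

def in_control(x):
    return lcl <= x <= ucl

assert in_control(4.2)
assert not in_control(6.0)      # would trigger an early warning
```

Fault tree analysis would then sit on top of such alarms, mapping out-of-control signals to candidate root causes.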
A hierarchical modeling methodology for the definition and selection of requirements
NASA Astrophysics Data System (ADS)
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and system functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information.
Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicle to demonstrate the origin and impact of requirements on the concept of operations and systems alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the systems alternatives phases of conceptual design.
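The Monte Carlo step can be sketched as sampling the uncertain elements of the hierarchical model (here, criterion weights) and observing how often each alternative comes out on top. The alternatives, scores and nominal weights below are invented, not the dissertation's model:

```python
import random

# Monte Carlo over uncertain criterion weights: sample weights,
# renormalize, score each alternative, and count how often each wins.
# All alternatives, scores and nominal weights are invented.

random.seed(1)
scores = {"UAV-A": [0.8, 0.4], "UAV-B": [0.5, 0.9]}   # per-criterion scores
nominal = [0.6, 0.4]                                   # nominal weights

wins = {name: 0 for name in scores}
for _ in range(10_000):
    w = [max(0.0, random.gauss(mu, 0.1)) for mu in nominal]
    total = sum(w) or 1.0
    w = [x / total for x in w]                          # renormalize
    best = max(scores, key=lambda a: sum(s * wi for s, wi in zip(scores[a], w)))
    wins[best] += 1

assert sum(wins.values()) == 10_000
```

The spread of `wins` across alternatives is one way to see how much the down-selection depends on the uncertain weights.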
Providing traceability for neuroimaging analyses.
McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran
2013-09-01
With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuance of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential, but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the findings of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets.
It also tracks the outcomes of various analyses and provides provenance traceability throughout the lifecycle of their studies. As the Provenance Service has been designed to be generic it can be applied across the medical domain as a reusable tool for supporting medical researchers thus providing communities of researchers for the first time with the necessary tools to conduct widely distributed collaborative programmes of medical analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Ouadghiri, S; Atouf, O; Brick, C; Benseffaj, N; Essakalli, M
2012-02-01
The blood transfusion and haemovigilance service of the Ibn-Sina hospital in Rabat (Morocco) was created in 1997. This unit manages pretransfusion testing, distribution of blood products, traceability and haemovigilance. The objective of this study was to analyze, over a period of 12 years, the traceability of blood products delivered in our hospital and the measures used to improve the feedback of information. This is a retrospective study conducted between 1999 and 2010. The traceability rate was calculated from the return of the traceability forms supplied with blood products (number of blood products recorded on returned traceability forms divided by the total number of delivered products). To improve the traceability rate, several actions were undertaken: one-time training, awareness campaigns and phone calls asking for the feedback of information. Between 1999 and 2010, the service delivered 173,858 blood products. The average traceability rate during this period was 13.4%. The traceability rate varied widely over time (5.2% in 1999, 15.5% in 2010), with a maximum of 27.2% in 2005. Feedback of information is lower in emergency departments than in medical and surgical services. Feedback of information about traceability in the Ibn-Sina hospital remains very poor despite the measures used. Other actions, such as continuing education courses, regulatory enforcement and computerization, should be considered. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
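The traceability rate used in the study is a simple ratio of returned traceability forms to delivered products. A sketch of the computation with invented counts (the paper's yearly figures are not reproduced here):

```python
# Traceability rate = products recorded on returned forms / products
# delivered, expressed as a percentage. Counts below are invented
# example data, not the hospital's actual figures.

def traceability_rate(forms_returned, delivered):
    return forms_returned / delivered * 100

# Hypothetical year: 2,330 traced out of 17,386 delivered -> ~13.4 %
assert round(traceability_rate(2330, 17386), 1) == 13.4
```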
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... American National Standards Institute (ANSI), the American Gas Association (AGA), the American Society of... Standards and Technology (NIST) traceable. (c) General. (1) If you measure the concentration of any CO2...
NASA Technical Reports Server (NTRS)
1973-01-01
The traffic analyses and system requirements data generated in the study resulted in the development of two traffic models; the baseline traffic model and the new traffic model. The baseline traffic model provides traceability between the numbers and types of geosynchronous missions considered in the study and the entire spectrum of missions foreseen in the total national space program. The information presented pertaining to the baseline traffic model includes: (1) definition of the baseline traffic model, including identification of specific geosynchronous missions and their payload delivery schedules through 1990; (2) Satellite location criteria, including the resulting distribution of the satellite population; (3) Geosynchronous orbit saturation analyses, including the effects of satellite physical proximity and potential electromagnetic interference; and (4) Platform system requirements analyses, including satellite and mission equipment descriptions, the options and limitations in grouping satellites, and on-orbit servicing criteria (both remotely controlled and man-attended).
NASA Astrophysics Data System (ADS)
Iacobucci, Joseph V.
The research objective of this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual phase of design of a system of systems. Rather than following current trends that place an emphasis on adding more analysis, which tends to increase the complexity of the decision making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission dependent and mission independent metrics are considered. Mission dependent metrics are determined by the performance of systems accomplishing a task, such as Probability of Success. In contrast, mission independent metrics, such as acquisition cost, are solely determined and influenced by the other systems in the portfolio. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed. Streaming algorithms and recursive architecture alternative evaluation algorithms are used to reduce computer memory requirements. Lastly, a domain specific language is created to provide a reduction in the computational time of executing the system of systems models.
A domain specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics from the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system and operational level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example it was possible to consider both system and operational level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data was developed. The notional data has similar trends and properties to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of system architecture studies of different sizes. This was necessary since system of systems may be called upon to accomplish thousands of tasks. It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. 
The SEAD capability model outputs prompted the discovery of a new issue: the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes: the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability based analysis. (Abstract shortened by UMI.)
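The two memory-saving strategies described above, streaming enumeration and top 'n' analysis, can be sketched together in a few lines. This is an illustrative reconstruction, not the RAAM implementation: the task names, system names, and scoring function are hypothetical stand-ins, and a real study would evaluate executable capability models rather than a toy metric.

```python
import heapq
import itertools

# Hypothetical portfolio: each task can be assigned to any capable system;
# one assignment of systems to tasks is an architecture alternative.
TASKS = ["detect", "track", "engage"]
SYSTEMS = {"detect": ["radar_a", "radar_b"],
           "track": ["radar_a", "uav"],
           "engage": ["jet", "uav"]}

def score(alternative):
    # Placeholder mission-dependent metric (e.g., probability of success);
    # RAAM would execute a capability model here instead.
    return sum(len(system) for system in alternative) / 10.0

def top_n_alternatives(n=3):
    """Stream over the full design space, keeping only the n best in memory."""
    heap = []  # bounded min-heap: memory is O(n), not O(|design space|)
    for alt in itertools.product(*(SYSTEMS[t] for t in TASKS)):
        item = (score(alt), alt)
        if len(heap) < n:
            heapq.heappush(heap, item)
        elif item > heap[0]:
            heapq.heapreplace(heap, item)
    return sorted(heap, reverse=True)

best = top_n_alternatives(3)
```

Because alternatives are generated lazily and only the current best n are retained, the enumeration scales to the large design spaces mentioned in the abstract without exhausting memory.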
Traceability of Plant Diet Contents in Raw Cow Milk Samples
Ponzoni, Elena; Mastromauro, Francesco; Gianì, Silvia; Breviario, Diego
2009-01-01
The use of molecular markers in the dairy sector is gaining wide acceptance as a reliable diagnostic approach for food authenticity and traceability. Using a PCR approach, the rbcL marker, a chloroplast-based gene, was selected to amplify plant DNA fragments in raw cow milk samples collected from stock farms or bought on the Italian market. rbcL-specific DNA fragments could be found in total milk, as well as in the skimmed and cream fractions. When the PCR-amplified fragments were sequenced, the nucleotide composition of the chromatogram reflected the multiple components of the polyphytic diet. PMID:22253982
EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data
NASA Astrophysics Data System (ADS)
D'Amico, Giuseppe; Amodeo, Aldo; Mattis, Ina; Freudenthaler, Volker; Pappalardo, Gelsomina
2016-02-01
In this paper we describe an automatic tool for the pre-processing of aerosol lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of ELPP, particular attention has been paid to make the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of ELPP is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of ELPP. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. ELPP has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
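Two of the steps named above, background subtraction and Monte Carlo uncertainty propagation, can be illustrated with a minimal sketch. This is not ELPP code: the synthetic signal, the far-range background window, and the assumed Gaussian detector noise are all simplifications introduced for illustration.

```python
import random
import statistics

def preprocess(raw_signal, background_bins=50):
    """Subtract the mean background estimated from the far-range tail bins."""
    bg = sum(raw_signal[-background_bins:]) / background_bins
    return [s - bg for s in raw_signal]

def monte_carlo_uncertainty(raw_signal, n_runs=200, noise_sd=2.0):
    """Propagate assumed detector noise through pre-processing by
    repeatedly perturbing the raw signal and re-running the correction."""
    first_bin = []
    for _ in range(n_runs):
        perturbed = [s + random.gauss(0.0, noise_sd) for s in raw_signal]
        first_bin.append(preprocess(perturbed)[0])
    return statistics.stdev(first_bin)

signal = [100.0 - 0.05 * i for i in range(1000)]  # synthetic decaying profile
corrected = preprocess(signal)
sigma = monte_carlo_uncertainty(signal)
```

The Monte Carlo route has the advantage that it works unchanged even when later processing steps (gluing, smoothing) make analytic error propagation awkward.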
FPGA-Based Smart Sensor for Online Displacement Measurements Using a Heterodyne Interferometer
Vera-Salas, Luis Alberto; Moreno-Tapia, Sandra Veronica; Garcia-Perez, Arturo; de Jesus Romero-Troncoso, Rene; Osornio-Rios, Roque Alfredo; Serroukh, Ibrahim; Cabal-Yepez, Eduardo
2011-01-01
The measurement of small displacements on the nanometric scale demands metrological systems of high accuracy and precision. In this context, interferometer-based displacement measurements have become the main tools used for traceable dimensional metrology. The different industrial applications in which small displacement measurements are employed require the use of online measurements, high speed processes, open architecture control systems, as well as good adaptability to specific process conditions. The main contribution of this work is the development of a smart sensor for large displacement measurement based on phase measurement which achieves high accuracy and resolution, designed to be used with a commercial heterodyne interferometer. The system is based on a low-cost Field Programmable Gate Array (FPGA) allowing the integration of several functions in a single portable device. This system is optimal for high speed applications where online measurement is needed, and the reconfigurability feature allows the addition of different modules for error compensation, as might be required by a specific application. PMID:22164040
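The phase-to-displacement conversion underlying such a sensor can be written down compactly. The sketch below assumes a double-pass Michelson geometry with a He-Ne source, so one full fringe of measured phase corresponds to half a wavelength of mirror travel; the wavelength and refractive-index values are typical assumptions, not parameters from the paper.

```python
import math

WAVELENGTH = 632.8e-9   # He-Ne laser wavelength in metres (assumed)
N_AIR = 1.000271        # assumed refractive index of air

def displacement_from_phase(accumulated_phase_rad, wavelength=WAVELENGTH, n=N_AIR):
    """Double-pass Michelson geometry: 2*pi of accumulated measured phase
    corresponds to lambda/(2n) of target displacement in the medium."""
    return (wavelength / (2.0 * n)) * (accumulated_phase_rad / (2.0 * math.pi))

# One full fringe of phase gives roughly 316 nm of displacement.
d = displacement_from_phase(2.0 * math.pi)
```

The FPGA's job in the real instrument is to accumulate this phase continuously at high speed, so that displacements much larger than one fringe remain unambiguous.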
Detector Noise Characterization and Performance of MODIS Thermal Emissive Bands
NASA Technical Reports Server (NTRS)
Xiong, X.; Wu, A.; Chen, N.; Chiang, K.; Xiong, S.; Wenny, B.; Barnes, W. L.
2007-01-01
MODIS has 16 thermal emissive bands (TEB) with a total of 160 individual detectors (10 per spectral band), located on two cold focal plane assemblies (CFPA). MODIS TEB detectors were fully characterized pre-launch in a thermal vacuum (TV) environment using a NIST-traceable blackbody calibration source (BCS) with temperatures ranging from 170 to 340 K. On-orbit, the TEB detectors are calibrated using an on-board blackbody (BB) on a scan-by-scan basis. For nominal on-orbit operation, the on-board BB temperature is typically controlled at 285 K for Aqua MODIS and 290 K for Terra MODIS. For the MODIS TEB calibration, each detector's noise equivalent temperature difference (NEdT) is often used to assess its performance, and this parameter is a major contributor to the calibration uncertainty. Because of its impact on sensor calibration and data product quality, each MODIS TEB detector's NEdT is monitored on a daily basis at a fixed BB temperature and completely characterized on a regular basis at a number of BB temperatures. In this paper, we describe MODIS on-orbit TEB NEdT characterization activities, approaches, and results. We compare both pre-launch and on-orbit performance with the sensor design specification and examine the impact of detector noise characterization on the calibration uncertainty. To date, 135 of the 160 TEB detectors in Terra MODIS (launched in December 1999) and 158 in Aqua MODIS (launched in May 2002) continue to perform with NEdT at or below their design specifications. A complete summary of all TEB noisy detectors, identified both pre-launch and on-orbit, is provided.
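A common way to estimate NEdT from blackbody views, sketched below under simplifying assumptions, is to divide the detector noise (scatter in digital counts) by the thermal responsivity (counts per kelvin) inferred from two BB temperature set points. The synthetic counts and the two-point responsivity estimate are illustrative; the MODIS calibration uses its own radiometric model.

```python
import statistics

def nedt(dn_samples_t1, dn_samples_t2, t1, t2):
    """Estimate NEdT as detector noise (counts) divided by the thermal
    responsivity (counts per kelvin) derived from two blackbody views."""
    responsivity = (statistics.mean(dn_samples_t2)
                    - statistics.mean(dn_samples_t1)) / (t2 - t1)
    noise = statistics.stdev(dn_samples_t2)
    return noise / responsivity

# Synthetic detector counts at 285 K and 290 K blackbody temperatures.
dn_285 = [1000 + d for d in (-2, -1, 0, 1, 2)]
dn_290 = [1100 + d for d in (-2, -1, 0, 1, 2)]
value = nedt(dn_285, dn_290, 285.0, 290.0)  # kelvin
```

A detector whose `value` stays below its band's design specification counts as performing nominally in the sense used in the abstract.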
Significance of ITER IWS Material Selection and Qualification
NASA Astrophysics Data System (ADS)
Mehta, Bhoomi K.; Raval, Jigar; Maheshwari, Abha; Laad, Rahul; Singh, Gurlovleen; Pathak, Haresh
2017-04-01
In-Wall Shielding (IWS) is one of the important components of the ITER Vacuum Vessel (VV); together with cooling water, it fills the space between the double walls of the VV. A Procurement Arrangement (PA) for IWS has been signed with the Indian Domestic Agency (INDA). Procurement of IWS materials, fabrication of IWS blocks and their delivery to the respective Domestic Agency (DA) and the ITER Organization (IO) are the main scope of this PA. Hence, India is the only country among all seven ITER partners contributing to the VV IWS. The main functions of the IWS are to provide neutron shielding, together with the blanket, VV shells and water, during plasma operations, and to reduce ripple of the toroidal magnetic field. To meet these functional requirements, IWS blocks are made of special materials (borated steels SS304 B4 and SS304 B7, ferritic steel SS 430, austenitic steels SS 316 L(N)-IG and XM-19, and Inconel-625) which are qualified, reliable and traceable for the design assessment. The choice of these materials has a significant influence on performance, maintainability, licensing, detailed design parameters and waste disposal. The main reasons for selecting these materials are their high mechanical strength at operating temperatures, water chemistry properties, excellent fabrication characteristics and low cost relative to other similar materials. All the materials are qualified with respect to their respective codes (ASTM/EN standards with additional requirements as described in the RCC-MR code 2007) and ITER requirements. The Agreed Notified Body (ANB) controls the conformity of material certificates with the approved material specification and the traceability procedure for Safety Important Components (SIC). The procurement strategy for all the IWS materials has been developed in close collaboration with the IO, ANB and industries as per the Product Procurement Specification (PPS). The R&D for samples, bulk material production, testing, inspection and handling as required are carried out by INDA and the IO.
At present, almost all IWS materials (∼2500 tons) have been procured by INDA, with spares, to manufacture ∼9000 IWS blocks. This paper summarizes the IWS material selection, qualification and procurement processes in detail.
Duewer, David L; Kline, Margaret C; Romsos, Erica L; Toman, Blaza
2018-05-01
The highly multiplexed polymerase chain reaction (PCR) assays used for forensic human identification perform best when used with an accurately determined quantity of input DNA. To help ensure the reliable performance of these assays, we are developing a certified reference material (CRM) for calibrating human genomic DNA working standards. To enable sharing information over time and place, CRMs must provide accurate and stable values that are metrologically traceable to a common reference. We have shown that droplet digital PCR (ddPCR) limiting dilution end-point measurements of the concentration of DNA copies per volume of sample can be traceably linked to the International System of Units (SI). Unlike values assigned using conventional relationships between ultraviolet absorbance and DNA mass concentration, entity-based ddPCR measurements are expected to be stable over time. However, the forensic community expects DNA quantity to be stated in terms of mass concentration rather than entity concentration. The transformation can be accomplished given SI-traceable values and uncertainties for the number of nucleotide bases per human haploid genome equivalent (HHGE) and the average molar mass of a nucleotide monomer in the DNA polymer. This report presents the considerations required to establish the metrological traceability of ddPCR-based mass concentration estimates of human nuclear DNA. Graphical abstract: The roots of metrological traceability for human nuclear DNA mass concentration results. Values for the factors in blue must be established experimentally. Values for the factors in red have been established from authoritative source materials. HHGE stands for "haploid human genome equivalent"; there are two HHGE per diploid human genome.
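The entity-to-mass transformation described above is a one-line calculation once the conversion factors are fixed. The sketch below uses round illustrative values for the genome size and the average molar mass per base pair; the CRM work assigns these factors with stated uncertainties, which this toy calculation omits.

```python
AVOGADRO = 6.02214076e23        # 1/mol (exact, SI)
# Illustrative assumed values; the CRM assigns these with uncertainties.
BASE_PAIRS_PER_HHGE = 3.1e9     # base pairs per haploid human genome equivalent
MOLAR_MASS_PER_BP = 618.0       # g/mol, average molar mass of a nucleotide pair

def mass_concentration_ng_per_ul(copies_per_ul):
    """Convert a ddPCR entity concentration (HHGE copies/uL) to a DNA
    mass concentration (ng/uL) via the mass of one genome equivalent."""
    grams_per_hhge = BASE_PAIRS_PER_HHGE * MOLAR_MASS_PER_BP / AVOGADRO
    return copies_per_ul * grams_per_hhge * 1e9  # g -> ng

c = mass_concentration_ng_per_ul(1.0e4)  # 10,000 copies/uL -> ~32 ng/uL
```

With these assumed factors, one HHGE weighs about 3.2 pg, which reproduces the familiar rule of thumb that roughly 300 genome copies correspond to 1 ng of human DNA.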
Andreis, Elisabeth; Küllmer, Kai
2014-01-01
Self-monitoring of blood glucose (BG) by means of handheld BG systems is a cornerstone in diabetes therapy. The aim of this article is to describe a procedure with proven traceability for calibration and evaluation of BG systems to guarantee reliable BG measurements. Isotope dilution gas chromatography mass spectrometry (ID/GC/MS) is a method that fulfills all requirements to be used in a higher-order reference measurement procedure. However, this method is not applicable for routine measurements because of the time-consuming sample preparation. A hexokinase method with perchloric acid (PCA) sample pretreatment is used in a measurement procedure for such purposes. This method is directly linked to the ID/GC/MS method by calibration with a glucose solution that has an ID/GC/MS-determined target value. BG systems are calibrated with whole blood samples. The glucose levels in such samples are analyzed by this ID/GC/MS-linked hexokinase method to establish traceability to higher-order reference material. For method comparison, the glucose concentrations in 577 whole blood samples were measured using the PCA-hexokinase method and the ID/GC/MS method; this resulted in a mean deviation of 0.1%. The mean deviation between BG levels measured in >500 valid whole blood samples with BG systems and the ID/GC/MS was 1.1%. BG systems allow reliable glucose measurement if a true reference measurement procedure with an uninterrupted traceability chain, using the ID/GC/MS-linked hexokinase method for calibration of BG systems, is implemented. Systems should be calibrated by means of a traceable and defined measurement procedure to avoid bias. PMID:24876614
75 FR 70753 - Market Test Involving Greeting Cards
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-18
... businesses will produce and distribute pre-approved envelopes according to specific design requirements which... will produce and distribute pre-approved envelopes with specific design requirements that will be... a market test beginning on or about January 1, 2011, of an experimental market dominant product...
FOREWORD: Materials metrology Materials metrology
NASA Astrophysics Data System (ADS)
Bennett, Seton; Valdés, Joaquin
2010-04-01
It seems that so much of modern life is defined by the materials we use. From aircraft to architecture, from cars to communications, from microelectronics to medicine, the development of new materials and the innovative application of existing ones have underpinned the technological advances that have transformed the way we live, work and play. Recognizing the need for a sound technical basis for drafting codes of practice and specifications for advanced materials, the governments of countries of the Economic Summit (G7) and the European Commission signed a Memorandum of Understanding in 1982 to establish the Versailles Project on Advanced Materials and Standards (VAMAS). This project supports international trade by enabling scientific collaboration as a precursor to the drafting of standards. The VAMAS participants recognized the importance of agreeing a reliable, universally accepted basis for the traceability of the measurements on which standards depend for their preparation and implementation. Seeing the need to involve the wider metrology community, VAMAS approached the Comité International des Poids et Mesures (CIPM). Following discussions with NMI Directors and a workshop at the BIPM in February 2005, the CIPM decided to establish an ad hoc Working Group on the metrology applicable to the measurement of material properties. The Working Group presented its conclusions to the CIPM in October 2007 and published its final report in 2008, leading to the signature of a Memorandum of Understanding between VAMAS and the BIPM. This MoU recognizes the work that is already going on in VAMAS as well as in the Consultative Committees of the CIPM and establishes a framework for an ongoing dialogue on issues of materials metrology. The question of what is meant by traceability in the metrology of the properties of materials is particularly vexed when the measurement results depend on a specified procedure. 
In these cases, confidence in results requires not only traceable calibration of the various instruments and standards used but also the reliable application of an accepted measurement procedure. Nowhere is this more evident than in the use of hardness scales, which are not directly traceable to the SI. This special issue of Metrologia includes a summary of the findings and conclusions of the Working Group and a further 14 papers covering the full range of properties of interest in science, engineering and standards making. It includes papers by authors at eight national measurement institutes and four other research centres. In addition to mechanical properties, there are papers addressing issues associated with the measurement of electromagnetic, acoustic and optical properties as well as those arising from the specific structural features of many new materials. As guest editors, we are extremely grateful to all the authors who have contributed to this special issue on the measurement of the properties of materials. We hope it will contribute to a wider appreciation of many of the associated issues and foster a growing understanding of the importance of ensuring that all such measurements are performed in accordance with accepted standards and procedures, with proper attention to the need to establish the traceability of the results. Only in this way can the performance, safety and fitness for purpose of products be guaranteed.
Object Based Systems Engineering
2011-10-17
practically impossible where the original SMEs are unavailable or lack perfect recall. 7. Capture the precious and transient logic behind this...complex system. References 1. FITCH, J. Exploiting Decision-to-Requirements Traceability, briefing to NDIA CMMI Conference, November, 2009 2
Requirement Development Process and Tools
NASA Technical Reports Server (NTRS)
Bayt, Robert
2017-01-01
Requirements capture the system-level capabilities in a set of complete, necessary, clear, attainable, traceable, and verifiable statements of need. Requirements should not be unduly restrictive, but should set limits that eliminate items outside the boundaries drawn, encourage competition (or alternatives), and capture the source and reason of the requirement. If it is not needed by the customer, it is not a requirement. Requirements establish the verification methods that will lead to product acceptance; these must be reproducible assessment methods.
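The quality attributes listed above lend themselves to simple automated checks. The sketch below is a hypothetical illustration of the idea, not a NASA tool: the record fields, the rule set, and the sample requirement are all invented, and real requirement quality checks are far richer than these three rules.

```python
from dataclasses import dataclass

# The four classical verification methods leading to product acceptance.
VERIFICATION_METHODS = {"inspection", "analysis", "demonstration", "test"}

@dataclass
class Requirement:
    req_id: str
    statement: str
    rationale: str          # source and reason of the requirement
    verification: str       # method that will lead to product acceptance

def check(req):
    """Flag common quality problems in a requirement record (toy rules)."""
    issues = []
    if req.verification not in VERIFICATION_METHODS:
        issues.append("no reproducible verification method")
    if not req.rationale:
        issues.append("source/reason not captured")
    if "shall" not in req.statement:
        issues.append("not phrased as a binding statement of need")
    return issues

r = Requirement("SYS-001", "The vehicle shall deliver 20 t to LEO.",
                "Customer payload manifest", "analysis")
problems = check(r)
```

Running such checks at intake keeps each requirement's verification path explicit before design work begins.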
The perceived value of dairy product traceability in modern society: An exploratory study.
Charlebois, Sylvain; Haratifar, Sanaz
2015-05-01
The current study assessed the perceived value of food traceability in modern society by young consumers. After experiencing numerous recalls and food safety-related incidents, consumers are increasingly aware of the tools available to mitigate risks. Food traceability has been associated with food safety procedures for many years, but recent high-profile cases of food fraud around the world have given traceability a different strategic purpose. Focusing solely on dairy products, our survey results offer a glimpse of consumer perceptions of traceability as a means to preserve food integrity and authenticity. This study explored the various influences that market-oriented traceability has had on dairy consumers. For example, results show that if the dairy sector could guarantee that their product is in fact organic, 53.8% of respondents who often purchase organic milk would consider always purchasing traceable organic milk. This research produced a quantitative set of information related to the perceived value of food traceability, which could be useful for the creation and development of improved guidelines and better education for consumers. We discuss limitations and suggest areas for new research. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Standards for the validation of remotely sensed albedo products
NASA Astrophysics Data System (ADS)
Adams, Jennifer
2015-04-01
Land surface albedo is an important component of the Earth's energy balance, defined as the fraction of shortwave radiation reflected by a surface, and is one of many Essential Climate Variables (ECVs) that can be retrieved from space through remote sensing. To quantify the accuracy of these products, they must be validated against in-situ measurements of albedo made with an albedometer. Whilst accepted standards exist for the calibration of albedometers, standards for in-situ measurement schemes and their use in validation procedures have yet to be developed. It is essential that we can assess the quality of remotely sensed albedo data and identify traceable sources of uncertainty in the process of providing these data. As a result of the current lack of accepted standards for in-situ albedo retrieval and validation procedures, we are not yet able to identify and quantify traceable sources of uncertainty. Establishing standard protocols for in-situ retrievals for the validation of global albedo products would allow inter-product use and comparison, in addition to product standardization. Accordingly, this study aims to assess the quality of in-situ albedo retrieval schemes and identify sources of uncertainty, specifically in vegetation environments. A 3D Monte Carlo ray tracing model will be used to simulate albedometer instruments in complex 3D vegetation canopies. To determine sources of uncertainty, factors that influence albedo measurement uncertainty were identified and will subsequently be examined: (1) time of day (solar zenith angle); (2) ecosystem type; (3) placement of the albedometer within the ecosystem; (4) height of the albedometer above the canopy; (5) clustering within the ecosystem. A variety of 3D vegetation canopies have been generated to cover the main ecosystems found globally, different seasons, and different plant distributions.
Canopies generated include birch-stand and pine-stand forests for summer and winter, savanna, shrubland, cropland and citrus orchard. All canopies were simulated over a 100 x 100 m area to best represent in-situ measurement conditions. Preliminary tests have been conducted, firstly identifying the spectral range required to estimate broadband albedo (BBA) and secondly determining the hyper-spectral intervals required to calculate BBA from spectral albedo. The final results are expected to identify, for the aforementioned factors and at a specified confidence level, when the uncertainty of an in-situ measurement falls within 3% accuracy and when it falls outside these criteria. As the uncertainty of in-situ measurements should be assessed on an individual basis, accounting for relevant factors, this study aims to document traceable uncertainty sources in in-situ albedo retrieval for a specific scenario.
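The step of calculating BBA from spectral albedo mentioned above amounts to a solar-irradiance-weighted spectral integral. The sketch below illustrates that weighting with a coarse, invented four-point spectrum (vegetation-like: dark in the visible, bright in the near-infrared); a real calculation would use the hyper-spectral intervals the study identifies.

```python
def broadband_albedo(wavelengths_nm, spectral_albedo, solar_irradiance):
    """Broadband albedo as the irradiance-weighted mean of spectral albedo:
    BBA = integral(rho * E) / integral(E), by trapezoidal integration."""
    def trapz(y):
        return sum((y[i] + y[i + 1])
                   * (wavelengths_nm[i + 1] - wavelengths_nm[i]) / 2.0
                   for i in range(len(y) - 1))
    weighted = [r * e for r, e in zip(spectral_albedo, solar_irradiance)]
    return trapz(weighted) / trapz(solar_irradiance)

# Invented vegetation-like spectrum over the shortwave range (400-2500 nm).
wl  = [400, 700, 1000, 2500]      # nm
rho = [0.05, 0.08, 0.45, 0.25]    # spectral albedo
e   = [1.2, 1.4, 0.7, 0.1]        # relative solar irradiance
bba = broadband_albedo(wl, rho, e)
```

Because the irradiance weighting concentrates on the visible and near-infrared, errors in the sampled spectral intervals there dominate the BBA uncertainty, which is why the choice of hyper-spectral intervals matters.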
Augmented halal food traceability system: analysis and design using UML
NASA Astrophysics Data System (ADS)
Usman, Y. V.; Fauzi, A. M.; Irawadi, T. T.; Djatna, T.
2018-04-01
Augmented halal food traceability expands the range of halal traceability in the food supply chain, which is currently only available for tracing from the source of raw material to the industrial warehouse, or inbound logistics. The halal traceability system must be developed in an integrated form that includes inbound and outbound logistics. The objective of this study was to develop a reliable initial model of an integrated traceability system for the halal food supply chain. The method was based on unified modeling language (UML) diagrams such as use case, sequence, and business process diagrams. A goal programming model was formulated considering two objective functions: (1) minimization of the risk of halal traceability failures potentially occurring during outbound logistics activities and (2) maximization of the quality of halal product information. The result indicates that the supply of material is the most important point to consider in minimizing the risk of failure of the halal food traceability system, whereas no risk was observed in manufacturing and distribution.
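A goal programming model of the kind described combines the two objectives through penalized deviations from aspiration levels. The sketch below is a toy weighted goal program solved by enumeration over a tiny invented decision set; the option names, risk and quality figures, goals, and weights are all hypothetical, and a real model would use an LP solver over continuous deviation variables.

```python
import itertools

# Hypothetical outbound-logistics options: (traceability failure risk, info quality)
SUPPLIERS    = {"s1": (0.10, 0.7), "s2": (0.05, 0.5)}
DISTRIBUTORS = {"d1": (0.08, 0.9), "d2": (0.15, 0.6)}

RISK_GOAL, QUALITY_GOAL = 0.15, 1.5   # aspiration levels (assumed)
W_RISK, W_QUALITY = 2.0, 1.0          # priority weights (assumed)

def goal_deviation(choice):
    """Weighted sum of the unwanted deviations from the two goals:
    total risk above its target, information quality below its target."""
    risk = SUPPLIERS[choice[0]][0] + DISTRIBUTORS[choice[1]][0]
    qual = SUPPLIERS[choice[0]][1] + DISTRIBUTORS[choice[1]][1]
    over_risk  = max(0.0, risk - RISK_GOAL)
    under_qual = max(0.0, QUALITY_GOAL - qual)
    return W_RISK * over_risk + W_QUALITY * under_qual

best = min(itertools.product(SUPPLIERS, DISTRIBUTORS), key=goal_deviation)
```

Only deviations in the unwanted direction are penalized, which is the defining feature that distinguishes goal programming from plain weighted-sum optimization.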
STS users study (study 2.2). Volume 2: STS users plan (user data requirements) study
NASA Technical Reports Server (NTRS)
Pritchard, E. I.
1975-01-01
Pre-flight scheduling and pre-flight requirements of the space transportation system are discussed. Payload safety requirements, shuttle flight manifests, and interface specifications are studied in detail.
Durante, Caterina; Baschieri, Carlo; Bertacchini, Lucia; Bertelli, Davide; Cocchi, Marina; Marchetti, Andrea; Manzini, Daniela; Papotti, Giulia; Sighinolfi, Simona
2015-04-15
Geographical origin and authenticity of food are topics of interest for both consumers and producers. Among the different indicators used for traceability studies, (87)Sr/(86)Sr isotopic ratio has provided excellent results. In this study, two analytical approaches for wine sample pre-treatment, microwave and low temperature mineralisation, were investigated to develop accurate and precise analytical method for (87)Sr/(86)Sr determination. The two procedures led to comparable results (paired t-test, with t
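The paired t-test used above to compare the microwave and low-temperature mineralisation procedures can be computed directly from matched differences. The isotope-ratio values below are synthetic stand-ins, not the study's data; they are constructed only to show the mechanics of the comparison.

```python
import math
import statistics

def paired_t(x, y):
    """Paired t statistic for matched measurements of the same samples
    under two pre-treatment procedures: mean(d) / (sd(d) / sqrt(n))."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Synthetic 87Sr/86Sr ratios for the same wines under both pre-treatments.
microwave = [0.70851, 0.70923, 0.70880, 0.70911, 0.70862]
low_temp  = [0.70853, 0.70921, 0.70884, 0.70908, 0.70860]
t_stat = paired_t(microwave, low_temp)
```

A |t| well below the critical value (2.776 for 4 degrees of freedom at the 95% level) means the two mineralisation routes give statistically comparable ratios, which is the conclusion the abstract reports.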
Communicating food safety, authenticity and consumer choice. Field experiences.
Syntesa, Heiner Lehr
2013-04-01
The paper reviews patented and non-patented technologies, methods and solutions in the area of food traceability. It pays special attention to the communication of food safety, authenticity and consumer choice. Twenty-eight recent patents are reviewed in the areas of (secure) identification, product freshness indicators, meat traceability, (secure) transport of information along the supply chain, country/region/place of origin, automated authentication, supply chain management systems, and consumer interaction systems. In addition, solutions and pilot projects are described in the areas of Halal traceability, traceability of bird's nests, cold chain management, general food traceability and other areas.
Code of Federal Regulations, 2014 CFR
2014-04-01
... QUALITY SYSTEM REGULATION Identification and Traceability § 820.65 Traceability. Each manufacturer of a device that is intended for surgical implant into the body or to support or sustain life and whose...
Code of Federal Regulations, 2013 CFR
2013-04-01
... QUALITY SYSTEM REGULATION Identification and Traceability § 820.65 Traceability. Each manufacturer of a device that is intended for surgical implant into the body or to support or sustain life and whose...
Code of Federal Regulations, 2012 CFR
2012-04-01
... QUALITY SYSTEM REGULATION Identification and Traceability § 820.65 Traceability. Each manufacturer of a device that is intended for surgical implant into the body or to support or sustain life and whose...
Code of Federal Regulations, 2010 CFR
2010-04-01
... QUALITY SYSTEM REGULATION Identification and Traceability § 820.65 Traceability. Each manufacturer of a device that is intended for surgical implant into the body or to support or sustain life and whose...
Code of Federal Regulations, 2011 CFR
2011-04-01
... QUALITY SYSTEM REGULATION Identification and Traceability § 820.65 Traceability. Each manufacturer of a device that is intended for surgical implant into the body or to support or sustain life and whose...
NASA Astrophysics Data System (ADS)
Chow, Sherman S. M.
Traceable signature scheme extends a group signature scheme with an enhanced anonymity management mechanism. The group manager can compute a tracing trapdoor which enables anyone to test if a signature is signed by a given misbehaving user, while the only way to do so for group signatures requires revealing the signer of all signatures. Nevertheless, it is not tracing in a strict sense. For all existing schemes, T tracing agents need to recollect all N' signatures ever produced and perform RN' “checks” for R revoked users. This involves a high volume of transfer and computations. Increasing T increases the degree of parallelism for tracing but also the probability of “missing” some signatures in case some of the agents are dishonest.
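The RN′ tracing workload described above can be made concrete with a stub. This is not a cryptographic implementation: the tracing test is replaced by a tag comparison purely to count the checks, whereas a real traceable signature scheme performs a pairing- or discrete-log-based test against each user's tracing trapdoor.

```python
def matches(trapdoor, signature):
    # Stand-in for the scheme's tracing test; real schemes use
    # cryptographic checks, not string comparison.
    return signature["signer_tag"] == trapdoor

def trace(revoked_trapdoors, signatures):
    """Run the tracing test for every (revoked user, signature) pair,
    counting the R * N' checks the text describes."""
    checks = 0
    traced = []
    for td in revoked_trapdoors:          # R revoked users
        for sig in signatures:            # N' signatures ever produced
            checks += 1
            if matches(td, sig):
                traced.append((td, sig["id"]))
    return traced, checks

sigs = [{"id": i, "signer_tag": f"user{i % 4}"} for i in range(12)]
traced, n_checks = trace(["user1", "user3"], sigs)
```

Even in this toy setting the cost is the full R x N′ product, which is the transfer and computation burden the abstract criticizes; splitting the signatures among T agents parallelizes the loop but, as noted, risks missed signatures if an agent is dishonest.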
RELAP-7 Code Assessment Plan and Requirement Traceability Matrix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.
2016-10-01
RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods and physical models over the last decades. Recently, INL has also been making an effort to establish the code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international/domestic reports, research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently-developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
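At its core, an RTM of the kind outlined above is a mapping from requirements to the assessment cases that exercise them, from which coverage gaps fall out directly. The identifiers below are hypothetical, invented only to show the shape of the data structure, not taken from the RELAP-7 plan.

```python
# Minimal Requirement Traceability Matrix: each software requirement maps
# to the assessment cases that exercise it. All IDs are hypothetical.
rtm = {
    "REQ-DESIGN-01": ["TEST-101", "TEST-102"],
    "REQ-UI-03":     ["TEST-201"],
    "REQ-TECH-07":   [],                 # not yet assessed
}

def uncovered(matrix):
    """Return requirement IDs with no linked assessment case."""
    return sorted(rid for rid, tests in matrix.items() if not tests)

gaps = uncovered(rtm)
```

Keeping the matrix machine-readable lets the assessment plan report, at any point in development, exactly which requirements the present V&V suite leaves unexercised.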
On-orbit Metrology and Calibration Requirements for Space Station Activities Definition Study
NASA Technical Reports Server (NTRS)
Cotty, G. M.; Ranganathan, B. N.; Sorrell, A. L.
1989-01-01
The Space Station is the focal point for the commercial development of space. The long term routine operation of the Space Station and the conduct of future commercial activities suggest the need for in-space metrology capabilities analogous, when possible, to those on Earth. The ability to perform periodic calibrations and measurements with proper traceability is imperative for the routine operation of the Space Station. An initial review, however, indicated a paucity of data related to metrology and calibration requirements for in-space operations. This condition probably exists because of the highly developmental aspect of space activities to date, their short duration, and nonroutine nature. The on-orbit metrology and calibration needs of the Space Station were examined and assessed. In order to achieve this goal, the following tasks were performed: an up-to-date literature review; identification of on-orbit calibration techniques; identification of sensor calibration requirements; identification of calibration equipment requirements; definition of traceability requirements; preparation of technology development plans; and preparation of the final report. Significant information and major highlights pertaining to each task are presented. In addition, some general (generic) conclusions/observations and recommendations that are pertinent to the overall in-space metrology and calibration activities are presented.
Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN
NASA Technical Reports Server (NTRS)
Griffis, H.
1985-01-01
Large general purpose finite element programs require users to develop large quantities of input data. General purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by application-specific pre-processors. The Automatic Dynamic Aircraft Modeler (ADAM) is one such application-specific pre-processor. General purpose pre-processors use points, lines and surfaces to describe geometric shapes. Because ADAM is used only for aircraft structures, generic structural sections, wing boxes and bodies, can be pre-defined. Hence, with only gross dimensions, thicknesses, material properties and pre-defined boundary conditions, a complete model of an aircraft can be created.
Improving integrity of on-line grammage measurement with traceable basic calibration.
Kangasrääsiö, Juha
2010-07-01
The automatic control of grammage (basis weight) in paper and board production is based upon on-line grammage measurement. Furthermore, the automatic control of other quality variables, such as moisture, ash content and coat weight, may rely on the grammage measurement. The integrity of Kr-85 based on-line grammage measurement systems was studied by performing basic calibrations with traceably calibrated plastic reference standards. The calibrations were performed according to the EN ISO/IEC 17025 standard, which is a requirement for calibration laboratories. The observed relative measurement errors were 3.3% in the first-time calibrations at the 95% confidence level. With the traceable basic calibration method, however, these errors can be reduced to under 0.5%, thus improving the integrity of on-line grammage measurements. A standardised algorithm, based on experience from the performed calibrations, is also proposed to ease the adjustment of the different grammage measurement systems. The calibration technique can, in principle, be applied to all beta-radiation based grammage measurements. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
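The kind of adjustment described above can be sketched as fitting the gauge response against traceable reference standards and inverting the fit; the grammage values below are invented for illustration and are not the paper's data.

```python
# Sketch: deriving a linear correction for an on-line grammage gauge from
# traceably calibrated reference standards. All values are illustrative.
reference = [50.0, 100.0, 200.0, 400.0]  # certified grammage, g/m^2
measured  = [51.8, 103.2, 206.5, 413.0]  # gauge readings before adjustment

# Least-squares fit: measured = a * reference + b.
n = len(reference)
sx, sy = sum(reference), sum(measured)
sxx = sum(x * x for x in reference)
sxy = sum(x * y for x, y in zip(reference, measured))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def corrected(reading):
    """Invert the fitted response to report traceable grammage."""
    return (reading - b) / a

print(round(corrected(206.5), 1))  # ~200.0 g/m^2 after correction
```

In this toy example a gauge reading about 3% high is brought back onto the reference scale, mirroring the reduction from 3.3% to under 0.5% error that the traceable basic calibration achieves.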
Force sensor characterization under sinusoidal excitations.
Medina, Nieves; de Vicente, Jesús
2014-10-06
The aim of the current work is to develop a method to characterize force sensors under sinusoidal excitations using a primary standard as the source of traceability. During this work the influence factors have been studied, a method to minimise their contributions has been developed, and the corrections to be performed under dynamic conditions have been established. These results will allow an adequate characterization of force sensors under sinusoidal excitations, which is essential for their proper use under dynamic conditions. The traceability of the sensor characterization is based on the direct definition of force as mass multiplied by acceleration. To do so, the sensor is loaded with different calibrated loads and is subjected to different sinusoidal accelerations by means of a vibration shaker system that is able to generate accelerations up to 100 m/s2 at frequencies from 5 Hz up to 2400 Hz. The acceleration is measured by means of a laser vibrometer with traceability to the units of time and length. A multiple-channel data acquisition system is also required to simultaneously acquire the electrical output signals of the instruments involved in real time.
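The traceability chain above rests on realising force directly as mass times acceleration. A minimal sketch, with an illustrative load and excitation (the 100 m/s2 amplitude is the shaker limit quoted above; the mass and frequency are assumptions):

```python
import math

# Reference force under sinusoidal excitation: F(t) = m * a(t),
# with a(t) = A * sin(2*pi*f*t). Mass and frequency are illustrative.
mass = 2.0         # calibrated load, kg
accel_amp = 100.0  # peak acceleration from the shaker, m/s^2
freq = 160.0       # excitation frequency, Hz

def force(t):
    """Instantaneous reference force F(t) = m * a(t), in newtons."""
    return mass * accel_amp * math.sin(2 * math.pi * freq * t)

peak = mass * accel_amp
print(peak)  # 200.0 N peak reference force
```

The sensor's electrical output, acquired simultaneously with the vibrometer's acceleration signal, is then compared against this computed reference force to characterize its dynamic sensitivity.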
Hg0 and HgCl2 Reference Gas Standards: NIST Traceability ...
EPA and NIST have collaborated to establish the procedures necessary for establishing the required NIST traceability of commercially provided Hg0 and HgCl2 reference generators. This presentation will discuss the approach of a joint EPA/NIST study to accurately quantify the true concentrations of Hg0 and HgCl2 reference gases produced from high-quality, NIST-traceable, commercial Hg0 and HgCl2 generators. This presentation will also discuss the availability of HCl and Hg0 compressed reference gas standards as a result of EPA's recently approved Alternative Methods 114 and 118. Gaseous elemental mercury (Hg0) and oxidized mercury (HgCl2) reference standards are integral to the use of mercury continuous emissions monitoring systems (Hg CEMS) for regulatory compliance emissions monitoring. However, a quantitative disparity of approximately 7-10% has been observed between commercial Hg0 and HgCl2 reference gases, which currently limits the use of HgCl2 reference gas standards. Resolving this disparity would enable the expanded use of HgCl2 reference gas standards for regulatory compliance purposes.
[Managing the cold chain in healthcare facilities].
Royer, Mathilde; Breton Marchand, Justine; Pons, David
2017-11-01
The storage of temperature-sensitive healthcare products requires control of the cold chain. Healthcare facilities must have the appropriate equipment at their disposal and ensure the traceability and monitoring of temperatures. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards
EPA's air monitoring regulations require the use of Protocol Gases to set air pollution monitors. This protocol balances the government's need for accuracy with the producers' need for flexibility, low cost, and minimum external oversight.
Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika
2013-01-01
The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals using interrelated projects related to the national ePrescription in Finland.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koukoulas, Triantafillos, E-mail: triantafillos.koukoulas@npl.co.uk; Piper, Ben
Since the introduction of the International System of Units (the SI system) in 1960, weights, measures, standardised approaches, procedures, and protocols have been introduced, adapted, and extensively used. A major international effort and activity concentrate on the definition and traceability of the seven base SI units in terms of fundamental constants, and consequently those units that are derived from the base units. In airborne acoustical metrology and for the audible range of frequencies up to 20 kHz, the SI unit of sound pressure, the pascal, is realised indirectly and without any knowledge or measurement of the sound field. Though the principle of reciprocity was originally formulated by Lord Rayleigh in the nineteenth century, the corresponding calibration method was devised in the 1940s and eventually became a calibration standard in the 1960s; however, it can only accommodate a limited number of acoustic sensors of specific types and dimensions. International standards determine the device sensitivity either through coupler or through free-field reciprocity but rely on the continuous availability of specific acoustical artefacts. Here, we show an optical method based on gated photon correlation spectroscopy that can measure sound pressures directly and absolutely in fully anechoic conditions, remotely, and without disturbing the propagating sound field. It neither relies on the availability or performance of any measurement artefact nor makes any assumptions about the device geometry and sound field characteristics. Most importantly, the required units of sound pressure and microphone sensitivity may now be experimentally realised, thus providing direct traceability to SI base units.
Effective Materials Property Information Management for the 21st Century
NASA Technical Reports Server (NTRS)
Ren, Weiju; Cebon, David; Arnold, Steve
2009-01-01
This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fueled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the need for consistency, quality and traceability of data, as well as control of access to sensitive information such as proprietary data. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of economy often generates great needs for sharing a single "gold source" of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with the growing user demands and evolved to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data "pedigree" traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.
Development of the management for parenteral nutrition traceability in a standard hospital.
Bernabeu Soria, Beatriz; Mateo García, Máxima; Wanden-Berghe, Carmina; Cervera Peris, Mercedes; Piñeiro Corrales, Guadalupe; Sanz-Valero, Javier
2015-11-01
The objective was to develop traceability control and hazard analysis for parenteral nutrition (PN) processes. A standardized graphical notation was generated, describing in detail each of the stages in the overall process. The presence of hazards was analysed by sequencing decisions. The existence of Control Points (CPs) or Critical Control Points (CCPs) was estimated by a Criticality Index (CI) for each hazard, taking into account the probability of occurrence and the severity of the damage. The threshold for the CI was set at 6. A specific flow chart for the management and traceability of PN was obtained, defining each of the stages as CPs (validation and transcription of the prescription, and administration) or CCPs (preparation, storage and infusion pump -flow and filter-). Stages regarding the delivery, the recovery and the recycling of the packing material of PN are not considered CPs and were therefore not included in the dashboard. PN must be dealt with within a standardized management system in order to improve patient safety and clinical relevance, maximize resource efficiency and minimize procedural issues. The proposed system provides a global management model whose steps are fully defined, allowing monitoring and verification of PN. It would be convenient to use a software application to support the monitoring of traceability management and to store the historical records in order to evaluate the system. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
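The CP/CCP decision above reduces to a simple rule: compute the Criticality Index as probability times severity and compare it with the threshold of 6. A sketch, with illustrative scoring scales (the abstract does not give the ranges used for probability and severity):

```python
# Criticality Index (CI) = probability of occurrence x severity of damage,
# with the CCP threshold at 6 as in the abstract. Scoring scales are assumed.
CI_THRESHOLD = 6

def classify(probability, severity):
    """Return ('CCP'|'CP', ci): CCP when CI reaches the threshold."""
    ci = probability * severity
    return ("CCP" if ci >= CI_THRESHOLD else "CP", ci)

print(classify(2, 4))  # ('CCP', 8), e.g. a compounding hazard
print(classify(1, 3))  # ('CP', 3), e.g. a slip caught at validation
```

Stages whose hazards all score below the threshold stay ordinary Control Points; any stage carrying a hazard at or above it, such as preparation or storage here, becomes a Critical Control Point requiring monitoring.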
NASA Astrophysics Data System (ADS)
Witt, K.; Wolf, H. U.; Heuck, C.; Kammel, M.; Kummrow, A.; Neukammer, J.
2013-10-01
Haemoglobin concentration in blood is one of the most frequently measured analytes in laboratory medicine. Reference and routine methods for the determination of the haemoglobin concentration in blood are based on the conversion of haeme, haemoglobin and haemiglobin species into uniform end products. The total haemoglobin concentration in blood is measured using the absorbance of the reaction products. Traceable absorbance measurement values on the highest metrological level are a prerequisite for the calibration and evaluation of procedures with respect to their suitability for routine measurements and their potential as reference measurement procedures. For this purpose, we describe a procedure to establish traceability of spectral absorbance measurements for the haemiglobincyanide (HiCN) method and for the alkaline haematin detergent (AHD) method. The latter is characterized by a higher stability of the reaction product. In addition, the toxic hazard of cyanide, which binds to the iron ion of the haem group and thus inhibits the oxygen transport, is avoided. Traceability is established at different wavelengths by applying total least-squares analysis to derive the conventional quantity values for the absorbance from the measured values. Extrapolation and interpolation are applied to get access to the spectral regions required to characterize the Q-absorption bands of the HiCN and AHD methods, respectively. For absorbance values between 0.3 and 1.8, the contributions of absorbance measurements to the total expanded uncertainties (95% level of confidence) range from 1% to 0.4%.
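The photometric step underlying the HiCN method is a Beer-Lambert calculation from the traceable absorbance value. A sketch with the standard textbook constants (millimolar absorptivity 11.0 L mmol^-1 cm^-1 at 540 nm, 1:251 dilution, 16.114 g per mmol haem); these values are general HiCN conventions, not taken from this abstract:

```python
# HiCN photometric calculation (Beer-Lambert law):
# c_blood = A * dilution * M_haem / (eps * path_length).
# Constants are conventional HiCN values, assumed for illustration.
EPS_HICN_540 = 11.0       # L mmol^-1 cm^-1 per haem at 540 nm
PATH_LENGTH_CM = 1.0
DILUTION = 251            # e.g. 20 uL blood in 5 mL reagent
MOLAR_MASS_HAEM = 16.114  # g per mmol haem monomer

def haemoglobin_g_per_L(absorbance_540):
    """Total haemoglobin concentration of the undiluted blood sample."""
    c_mmol_per_L = absorbance_540 / (EPS_HICN_540 * PATH_LENGTH_CM)
    return c_mmol_per_L * DILUTION * MOLAR_MASS_HAEM

print(round(haemoglobin_g_per_L(0.40), 1))  # ~147.1 g/L for A = 0.40
```

Because the concentration scales linearly with the measured absorbance, the 0.4% to 1% absorbance uncertainty quoted above propagates directly into the reported haemoglobin value, which is why traceable absorbance is the anchor of the whole chain.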
Fiala, Petra; Göhler, Daniel; Wessely, Benno; Stintz, Michael; Lazzerini, Giovanni Mattia; Yacoot, Andrew
2017-01-01
Dimensional measurements on nano-objects by atomic force microscopy (AFM) require samples of safely fixed and well-individualized particles, with a suitable surface-specific particle number, on flat and clean substrates. Several known and proven particle preparation methods, i.e., membrane filtration, drying, rinsing, dip coating as well as electrostatic and thermal precipitation, were examined by means of scanning electron microscopy for their suitability for preparing samples for dimensional AFM measurements. Different suspensions of nano-objects (varying in material, size and shape) stabilized in aqueous solutions were therefore prepared on different flat substrates. The drop-drying method was found to be the most suitable one for the analysed suspensions, because it does not require expensive dedicated equipment and led to a uniform local distribution of individualized nano-objects. Traceable AFM measurements based on Si and SiO2 coated substrates confirmed the suitability of this technique. PMID:28904839
Current Developments in Future Planetary Probe Sensors for TPS
NASA Technical Reports Server (NTRS)
Martinez, Ed; Venkatapathy, Ethiraj; Oishi, Tomo
2003-01-01
In-situ Thermal Protection System (TPS) sensors are required to provide traceability of TPS performance and sizing tools. Traceability will lead to higher fidelity design tools, which in turn will lead to lower design safety margins, and decreased heatshield mass. Decreasing TPS mass will enable certain missions that are not otherwise feasible, and directly increase science payload. NASA Ames is currently developing two flight measurements as essential to advancing the state of TPS traceability for material modeling and aerothermal simulation: heat flux and surface recession (for ablators). The heat flux gage is applicable to both ablators and non-ablators and is therefore the more generalized sensor concept of the two with wider applicability to mission scenarios. This paper describes the development of a microsensor capable of surface and in-depth temperature and heat flux measurements for TPS materials appropriate to Titan, Neptune, and Mars aerocapture, and direct entry. The thermal sensor will be a monolithic solid state device composed of a thick film platinum RTD on an alumina substrate. Choice of materials and critical dimensions are used to tailor gage response, determined during calibration activities, to specific (forebody vs. aftbody) heating environments. The current design has a maximum operating temperature of 1500 K, an allowable constant heat flux of q=28.7 watts per square centimeter, and time constants between 0.05 and 0.2 seconds. The catalytic and radiative response of these heat flux gages can also be changed through the use of appropriate coatings. By using several co-located gages with various surface coatings, data can be obtained to isolate surface heat flux components due to radiation, catalycity and convection. Selectivity to radiative heat flux is a useful feature even for an in-depth gage, as radiative transport may be a significant heat transport mechanism for porous TPS materials in Titan aerocapture.
This paper also reports on progress to adapt a previously flown surface recession sensor, based on the Jupiter probe Galileo Analog Resistance Ablation Detector (ARAD), to appropriate aerocapture conditions.
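A platinum RTD pair of the kind described above yields temperature from resistance and, from the temperature difference across a known substrate layer, a one-dimensional conducted heat flux. A sketch using the standard IEC 60751 platinum coefficient; the substrate conductivity, gage spacing, and resistance readings are illustrative assumptions, not values from the paper:

```python
# In-depth RTD pair -> temperature and conducted heat flux.
# ALPHA_PT is the standard platinum RTD coefficient; geometry and material
# properties below are illustrative assumptions.
ALPHA_PT = 3.85e-3  # 1/K, linearised platinum RTD coefficient
R0 = 100.0          # ohm at 0 degC (Pt100)

def rtd_temperature(resistance_ohm):
    """Linearised RTD conversion, adequate over a modest temperature span."""
    return (resistance_ohm / R0 - 1.0) / ALPHA_PT

def conducted_flux(t_surface, t_depth, k_w_per_mk=30.0, gap_m=1.0e-3):
    """One-dimensional Fourier estimate q = k * dT / dx, reported in W/cm^2."""
    q_w_per_m2 = k_w_per_mk * (t_surface - t_depth) / gap_m
    return q_w_per_m2 / 1.0e4

t_hot = rtd_temperature(157.3)   # ~148.8 degC
t_cold = rtd_temperature(153.6)  # ~139.2 degC
print(round(conducted_flux(t_hot, t_cold), 1))  # ~28.8 W/cm^2
```

In practice the gage response is characterized during calibration rather than computed from nominal properties, but the sketch shows why co-located sensors at known depths suffice to recover both surface temperature history and heat flux.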
Development of Solid State Thermal Sensors for Aeroshell TPS Flight Applications
NASA Technical Reports Server (NTRS)
Martinez, Ed; Oishi, Tomo; Gorbonov, Sergey
2005-01-01
In-situ Thermal Protection System (TPS) sensors are required to provide verification by traceability of TPS performance and sizing tools. Traceability will lead to higher fidelity design tools, which in turn will lead to lower design safety margins, and decreased heatshield mass. Decreasing TPS mass will enable certain missions that are not otherwise feasible, and directly increase science payload. NASA Ames is currently developing two flight measurements as essential to advancing the state of TPS traceability for material modeling and aerothermal simulation: heat flux and surface recession (for ablators). The heat flux gage is applicable to both ablators and non-ablators and is therefore the more generalized sensor concept of the two with wider applicability to mission scenarios. This paper describes the continuing development of a thermal microsensor capable of surface and in-depth temperature and heat flux measurements for TPS materials appropriate to Titan, Neptune, and Mars aerocapture, and direct entry. The thermal sensor is a monolithic solid state device composed of thick film platinum RTD on an alumina substrate. Choice of materials and critical dimensions are used to tailor gage response, determined during calibration activities, to specific (forebody vs. aftbody) heating environments. Current design has maximum operating temperature of 1500K, and allowable constant heat flux of q=28.7 W/cm(sup 2), and time constants between 0.05 and 0.2 seconds. The catalytic and radiative response of these heat flux gages can also be changed through the use of appropriate coatings. By using several co-located gages with various surface coatings, data can be obtained to isolate surface heat flux components due to radiation, catalycity and convection. Selectivity to radiative heat flux is a useful feature even for an in-depth gage, as radiative transport may be a significant heat transport mechanism for porous TPS materials in Titan aerocapture.
75 FR 24569 - Animal Traceability; Public Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
... DEPARTMENT OF AGRICULTURE Animal and Plant Health Inspection Service [Docket No. APHIS-2010-0050] Animal Traceability; Public Meetings AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION... input on the new framework being developed for animal disease traceability. Additional meetings are...
75 FR 33576 - Animal Traceability; Public Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-14
... DEPARTMENT OF AGRICULTURE Animal and Plant Health Inspection Service [Docket No. APHIS-2010-0050] Animal Traceability; Public Meetings AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION... the new framework being developed for animal disease traceability. The meetings are being organized by...
75 FR 47769 - Animal Traceability; Public Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-09
... DEPARTMENT OF AGRICULTURE Animal and Plant Health Inspection Service [Docket No. APHIS-2010-0050] Animal Traceability; Public Meetings AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION... the new framework being developed for animal disease traceability. The meetings are being organized by...
40 CFR 1065.301 - Overview and general provisions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... judgment. (d) Use NIST-traceable standards to the tolerances we specify for calibrations and verifications. Where we specify the need to use NIST-traceable standards, you may alternatively ask for our approval to use international standards that are not NIST-traceable. ...
40 CFR 1065.301 - Overview and general provisions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... judgment. (d) Use NIST-traceable standards to the tolerances we specify for calibrations and verifications. Where we specify the need to use NIST-traceable standards, you may alternatively ask for our approval to use international standards that are not NIST-traceable. ...
40 CFR 1065.301 - Overview and general provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... judgment. (d) Use NIST-traceable standards to the tolerances we specify for calibrations and verifications. Where we specify the need to use NIST-traceable standards, you may alternatively ask for our approval to use international standards that are not NIST-traceable. ...
40 CFR 1065.301 - Overview and general provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... judgment. (d) Use NIST-traceable standards to the tolerances we specify for calibrations and verifications. Where we specify the need to use NIST-traceable standards, you may alternatively ask for our approval to use international standards that are not NIST-traceable. ...
40 CFR 1065.301 - Overview and general provisions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... judgment. (d) Use NIST-traceable standards to the tolerances we specify for calibrations and verifications. Where we specify the need to use NIST-traceable standards, you may alternatively ask for our approval to use international standards that are not NIST-traceable. ...
NASA Technical Reports Server (NTRS)
Reil, Robin L.
2014-01-01
Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.
Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering
NASA Technical Reports Server (NTRS)
Reil, Robin
2014-01-01
Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.
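The completeness and consistency check that explicit dependency relationships enable can be sketched outside any modeling tool: every requirement should trace to some mission concept element, and every concept element should be covered by at least one requirement. The element and requirement names below are invented for illustration, not taken from the SporeSat model.

```python
# Two-way traceability check between requirements and concept elements.
# All names are hypothetical examples.
concept_elements = {"payload", "bus", "comms", "ground_segment"}
traces = {
    "REQ-01": {"payload"},
    "REQ-02": {"bus", "comms"},
    "REQ-03": set(),  # dangling requirement, no concept linkage
}

dangling = sorted(r for r, targets in traces.items() if not targets)
covered = set().union(*traces.values())
uncovered = sorted(concept_elements - covered)

print(dangling)    # requirements with no concept linkage
print(uncovered)   # concept elements no requirement addresses
```

In a SysML tool the same query runs over the model's dependency relationships, which is what lets an MBSE model surface gaps automatically rather than by manual document review.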
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burchard, Ross L.; Pierson, Kathleen P.; Trumbo, Derek
Tarjetas is used to generate requirements from source documents. These source documents are in a hierarchical XML format that has been produced from PDF documents processed through the "Reframe" software package. The software includes the ability to create Topics and associate text Snippets with those topics. Requirements are then generated, and text Snippets with their associated Topics are referenced to each requirement. The software maintains traceability from the requirement ultimately back to the source document that produced the snippet.
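The snippet-to-requirement traceability chain can be sketched by walking a hierarchical XML document and recording, for each generated requirement, which snippet and source document it came from. The XML schema and tag names below are invented for illustration; the real "Reframe" output format is not described here.

```python
import xml.etree.ElementTree as ET

# Hypothetical hierarchical XML from a PDF-to-XML pipeline.
doc = ET.fromstring("""
<document id="SRC-7">
  <section id="3.2">
    <snippet id="S1" topic="security">The system shall encrypt data at rest.</snippet>
    <snippet id="S2" topic="audit">All access shall be logged.</snippet>
  </section>
</document>""")

requirements = []
for sn in doc.iter("snippet"):
    requirements.append({
        "text": sn.text.strip(),
        "topic": sn.get("topic"),
        # traceability chain: requirement -> snippet -> source document
        "trace": (doc.get("id"), sn.get("id")),
    })

print(requirements[0]["trace"])  # ('SRC-7', 'S1')
```

Keeping the `(document, snippet)` pair on every requirement record is what lets a reviewer walk any requirement back to the exact passage of the source document that produced it.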
Carrazana González, J; Fernández, I M; Capote Ferrera, E; Rodríguez Castro, G
2008-11-01
Information about how the laboratory of Centro de Protección e Higiene de las Radiaciones (CPHR), Cuba establishes its traceability to the International System of Units for the measurement of radionuclides in environmental test items is presented. A comparison among different methodologies of uncertainty calculation, including an analysis of the feasibility of using the Kragten-spreadsheet approach, is shown. In the specific case of the gamma spectrometric assay, the influence of each parameter, and the identification of the major contributor, in the relative difference between the methods of uncertainty calculation (Kragten and partial derivative) is described. The reliability of the uncertainty calculation results reported by the commercial software Gamma 2000 from Silena is analyzed.
Spatial and Temporal Analysis of Bias HAST System Temperature
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeifer, Kent B.; Furrer, III, Clint T; Sandoval, Paul Anthony
2017-03-01
High-reliability components for high-consequence systems require detailed testing of operation after having undergone highly accelerated stress testing (HAST) under unusual conditions of high temperature and humidity. This paper describes the design and operation of "Wormwood", a highly multiplexed temperature measurement system designed to operate under HAST conditions and measure temperature as a function of time and position in a HAST chamber. HAST chambers have single-point temperature measurements that can be traceable to NIST standards. The objective of these "Wormwood" measurements is to verify the uniformity and stability of the remaining volume of the HAST chamber with respect to the single traceable standard.
Requirement analysis for the one-stop logistics management of fresh agricultural products
NASA Astrophysics Data System (ADS)
Li, Jun; Gao, Hongmei; Liu, Yuchuan
2017-08-01
Issues and concerns regarding food safety, agro-processing, and the environmental and ecological impact of food production have attracted much research interest. Traceability and logistics management of fresh agricultural products face technological challenges including food product labelling and identification, activity/process characterization, and information systems for the supply chain, i.e., from farm to table. The application of a one-stop logistics service, focused on whole supply chain process integration for fresh agricultural products, is studied. A collaborative research project on the supply and logistics of fresh agricultural products in Tianjin was performed. Requirements for the one-stop logistics management information system are analysed. Model-driven business transformation, an approach that uses formal models to explicitly define the structure and behavior of a business, is applied in the review and analysis process. Specific requirements for the logistics management solutions are proposed. This research is crucial for developing a one-stop logistics management information system integration platform for fresh agricultural products.
40 CFR 1065.315 - Pressure, temperature, and dewpoint calibration.
Code of Federal Regulations, 2011 CFR
2011-07-01
... quantities that are NIST-traceable within 0.5% uncertainty. (2) Temperature. We recommend digital dry-block... errors. We recommend using calibration reference quantities that are NIST-traceable within 0.5... NIST-traceable simulator that is independently calibrated and, as appropriate, cold-junction...
40 CFR 1065.315 - Pressure, temperature, and dewpoint calibration.
Code of Federal Regulations, 2010 CFR
2010-07-01
... quantities that are NIST-traceable within 0.5% uncertainty. (2) Temperature. We recommend digital dry-block... errors. We recommend using calibration reference quantities that are NIST-traceable within 0.5... NIST-traceable simulator that is independently calibrated and, as appropriate, cold-junction...
40 CFR 1065.315 - Pressure, temperature, and dewpoint calibration.
Code of Federal Regulations, 2012 CFR
2012-07-01
... quantities that are NIST-traceable within 0.5% uncertainty. (2) Temperature. We recommend digital dry-block... errors. We recommend using calibration reference quantities that are NIST-traceable within 0.5... NIST-traceable simulator that is independently calibrated and, as appropriate, cold-junction...
40 CFR 1065.315 - Pressure, temperature, and dewpoint calibration.
Code of Federal Regulations, 2014 CFR
2014-07-01
... quantities that are NIST-traceable within 0.5% uncertainty. (2) Temperature. We recommend digital dry-block... errors. We recommend using calibration reference quantities that are NIST-traceable within 0.5... NIST-traceable simulator that is independently calibrated and, as appropriate, cold-junction...
40 CFR 1065.315 - Pressure, temperature, and dewpoint calibration.
Code of Federal Regulations, 2013 CFR
2013-07-01
... quantities that are NIST-traceable within 0.5% uncertainty. (2) Temperature. We recommend digital dry-block... errors. We recommend using calibration reference quantities that are NIST-traceable within 0.5... NIST-traceable simulator that is independently calibrated and, as appropriate, cold-junction...
A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems
Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang
2016-01-01
With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach. PMID:26999141
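The backward trace that the paper formalises with Petri nets can be illustrated with a much simpler sketch over recorded transition firings (a toy illustration, not the authors' algorithm; all lot identifiers are hypothetical):

```python
from collections import defaultdict

def trace_back(firings, contaminated_item):
    """Walk firing records backwards to find every input lot that could
    have contributed to a contaminated output item.

    `firings` is a list of (inputs, outputs) pairs, one per recorded
    transition firing; items are opaque lot identifiers.
    """
    # Index: output item -> list of input sets that produced it
    produced_by = defaultdict(list)
    for inputs, outputs in firings:
        for out in outputs:
            produced_by[out].append(inputs)

    suspects, frontier = set(), [contaminated_item]
    while frontier:
        item = frontier.pop()
        for inputs in produced_by.get(item, []):
            for lot in inputs:
                if lot not in suspects:
                    suspects.add(lot)
                    frontier.append(lot)
    return suspects

# Toy bee-product chain: raw honey lots -> blending -> bottling
firings = [
    (["honey-A", "honey-B"], ["blend-1"]),
    (["blend-1", "jar-lot-7"], ["batch-42"]),
]
print(sorted(trace_back(firings, "batch-42")))
# ['blend-1', 'honey-A', 'honey-B', 'jar-lot-7']
```

The real approach additionally models markings and firing rules; this sketch keeps only the reachability idea.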
Establishment of the Co-C Eutectic Fixed-Point Cell for Thermocouple Calibrations at NIMT
NASA Astrophysics Data System (ADS)
Ongrai, O.; Elliott, C. J.
2017-08-01
In 2015, NIMT first established a Co-C eutectic temperature reference (fixed-point) cell measurement capability for thermocouple calibration to support the requirements of Thailand's heavy industries and secondary laboratories. The Co-C eutectic fixed-point cell is a facility transferred from NPL, where the design was developed through European and UK national measurement system projects. In this paper, we describe the establishment of a Co-C eutectic fixed-point cell for thermocouple calibration at NIMT. This paper demonstrates achievement of the required furnace uniformity, the Co-C plateau realization and the comparison data between NIMT and NPL Co-C cells by using the same standard Pt/Pd thermocouple, demonstrating traceability. The NIMT measurement capability for noble metal type thermocouples at the new Co-C eutectic fixed point (1324.06°C) is estimated to be within ± 0.60 K (k=2). This meets the needs of Thailand's high-temperature thermocouple users—for which previously there has been no traceable calibration facility.
External Quality Assessment Scheme for reference laboratories - review of 8 years' experience.
Kessler, Anja; Siekmann, Lothar; Weykamp, Cas; Geilenkeuser, Wolf Jochen; Dreazen, Orna; Middle, Jonathan; Schumann, Gerhard
2013-05-01
We describe an External Quality Assessment Scheme (EQAS) intended for reference (calibration) laboratories in laboratory medicine and supervised by the Scientific Division of the International Federation of Clinical Chemistry and Laboratory Medicine and the responsible Committee on Traceability in Laboratory Medicine. The official EQAS website, RELA (www.dgkl-rfb.de:81), is open to interested parties. Information on all requirements for participation and results of surveys are published annually. As an additional feature, the identity of every participant in relation to the respective results is disclosed. The results of various groups of measurands (metabolites and substrates, enzymes, electrolytes, glycated hemoglobins, proteins, hormones, thyroid hormones, therapeutic drugs) are discussed in detail. The RELA system supports reference measurement laboratories preparing for accreditation according to ISO 17025 and ISO 15195. Participation in a scheme such as RELA is one of the requirements for listing of the services of a calibration laboratory by the Joint Committee on Traceability in Laboratory Medicine.
Mottese, Antonio Francesco; Naccari, Clara; Vadalà, Rossella; Bua, Giuseppe Daniel; Bartolomeo, Giovanni; Rando, Rossana; Cicero, Nicola; Dugo, Giacomo
2018-01-01
Opuntia ficus-indica L. Miller fruits, particularly 'Ficodindia dell'Etna' of Biancavilla (PDO), 'Fico d'india tradizionale di Roccapalumba' with protected brand, and samples from an experimental field in Pezzolo (Sicily), were analyzed by inductively coupled plasma mass spectrometry in order to determine their multi-element profile. A multivariate chemometric approach, specifically principal component analysis (PCA), was applied to assess how mineral elements may serve as markers of geographic origin, which would be useful for traceability. PCA allowed us to verify that the trace element content of prickly pear fruits is significantly influenced by their geographical origin, and the results found in Biancavilla PDO samples were linked to the geological composition of this volcanic area. Two principal components accounted for 72.03% of the total variance in the data; in more detail, PC1 explains 45.51% and PC2 26.52%. This study demonstrated that PCA is an integrated tool for the traceability of food products and, at the same time, a useful method for the authentication of typical local fruits such as prickly pear. © 2017 Society of Chemical Industry.
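The variance figures quoted above come from a standard PCA decomposition; a minimal SVD-based sketch on synthetic data (not the paper's element profiles) looks like this:

```python
import numpy as np

def pca_explained_variance(X, n_components=2):
    """PCA via SVD of the mean-centred data matrix; returns the sample
    scores and the fraction of total variance carried by each component."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = S**2 / (len(X) - 1)          # variance along each component
    ratio = var / var.sum()            # explained-variance fractions
    scores = Xc @ Vt[:n_components].T  # coordinates in PC space
    return scores, ratio[:n_components]

# Hypothetical multi-element profiles (rows: fruit samples, cols: elements)
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 6))
scores, ratio = pca_explained_variance(X)
print(ratio)   # the two leading variance fractions (data-dependent)
```

In the study, the first two such fractions were 45.51% and 26.52%, summing to the quoted 72.03%.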
Stem cell banking: between traceability and identifiability
2010-01-01
Stem cell banks are increasingly seen as an essential resource of biological materials for both basic and translational research. Stem cell banks support transnational access to quality-controlled and ethically sourced stem cell lines from different origins and of varying grades. According to the Organisation for Economic Co-operation and Development, advances in regenerative medicine are leading to the development of a bioeconomy, 'a world where biotechnology contributes to a significant share of economic output'. Consequently, stem cell banks are destined to constitute a pillar of the bioeconomy in many countries. While certain ethical and legal concerns are specific to the nature of stem cells, stem cell banking could do well to examine the approaches fostered by tissue banking generally. Indeed, the past decade has seen a move to simplify and harmonize biological tissue and data banking so as to foster international interoperability. In particular, the issues of consent and of traceability illustrate not only commonalities but the opportunity for stem cell banking to appreciate the lessons learned in biobanking generally. This paper analyzes convergence and divergence in issues surrounding policy harmonization, transnational sharing, informed consent, traceability and return of results in the context of stem cell banks. PMID:20923580
The Traceable Radiometry Underpinning Terrestrial and Helio Studies (TRUTHS) mission
NASA Astrophysics Data System (ADS)
Green, Paul D.; Fox, Nigel P.; Lobb, Daniel; Friend, Jonathan
2015-10-01
TRUTHS (Traceable Radiometry Underpinning Terrestrial- and Helio-Studies) is a proposed small satellite mission to enable a space-based climate observing system capable of delivering data of the quality needed by policy makers to make robust mitigation and adaptation decisions. This is achieved by embedding trust and confidence in the data and derived information (tied to international standards), both from its own measurements and by upgrading the performance and interoperability of other EO platforms, such as the Sentinels, through in-flight reference calibration. TRUTHS would provide measurements of incoming (total and spectrally resolved) solar radiation and of global reflected solar radiation, spectrally and spatially (50 m) resolved, at the 0.3% uncertainty level. These fundamental climate data products can be convolved into the building blocks for many ECVs and EO applications, as envisaged by the 2015 ESA science strategy, in a cost-effective manner. We describe the scientific drivers for the TRUTHS mission and how the requirements for the climate benchmarking and cross-calibration reference sensor are both complementary and simply implemented, with only a small additional complexity on top of heritage calibration schemes. The calibration scheme components and the route to SI-traceable Earth-reflected solar spectral radiance and solar spectral irradiance are described.
Force Sensor Characterization Under Sinusoidal Excitations
Medina, Nieves; de Vicente, Jesús
2014-01-01
The aim of the current work is to develop a method to characterize force sensors under sinusoidal excitations using a primary standard as the source of traceability. The influence factors have been studied, and a method to minimise their contributions has been established, together with the corrections to be performed under dynamic conditions. These results will allow an adequate characterization of force sensors under sinusoidal excitations, which is essential for their proper use under dynamic conditions. The traceability of the sensor characterization is based on the direct definition of force as mass multiplied by acceleration. To do so, the sensor is loaded with different calibrated loads and is subjected to different sinusoidal accelerations by means of a vibration shaker system that is able to generate accelerations up to 100 m/s² at frequencies from 5 Hz up to 2400 Hz. The acceleration is measured by means of a laser vibrometer with traceability to the units of time and length. A multiple-channel data acquisition system is also required to simultaneously acquire, in real time, the electrical output signals of the instruments involved. PMID:25290287
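The traceability chain described, force realised as mass times acceleration with the acceleration derived from a laser vibrometer, reduces to two short formulas; a sketch with illustrative numbers (not the paper's actual loads or frequencies):

```python
import math

def sinusoidal_force_amplitude(mass_kg, accel_amplitude_ms2):
    """Peak dynamic force from F = m*a for a sinusoidally shaken load."""
    return mass_kg * accel_amplitude_ms2

def accel_from_displacement(freq_hz, disp_amplitude_m):
    """For x(t) = X*sin(2*pi*f*t), the acceleration amplitude is (2*pi*f)^2 * X,
    which is how a vibrometer's displacement reading yields acceleration."""
    return (2 * math.pi * freq_hz) ** 2 * disp_amplitude_m

# Illustrative: a 2 kg calibrated load shaken at 100 Hz with 25 um displacement
a = accel_from_displacement(100.0, 25e-6)   # ~9.87 m/s^2
print(sinusoidal_force_amplitude(2.0, a))   # ~19.7 N peak force
```

The vibrometer's time and length traceability thus carries through to the force amplitude, since mass is separately calibrated.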
Development of Traceable Phantoms for Improved Image Quantification in Positron Emission Tomography
NASA Astrophysics Data System (ADS)
Zimmerman, Brian
2014-03-01
Clinical trials for new drugs increasingly rely on imaging data to monitor patient response to the therapy being studied. In the case of radiopharmaceutical applications, imaging data are also used to estimate organ and tumor doses in order to arrive at the optimal dosage for safe and effective treatment. Positron Emission Tomography (PET) is one of the most commonly used imaging modalities for these types of applications. In large, multicenter trials it is crucial to minimize as much as possible the variability that arises due to use of different types of scanners and other instrumentation so that the biological response can be more readily evaluated. This can be achieved by ensuring that all the instruments are calibrated to a common standard and that their performance is continuously monitored throughout the trial. Maintaining links to a single standard also enables the comparability of data acquired on a heterogeneous collection of instruments in different clinical settings. As the standards laboratory for the United States, the National Institute of Standards and Technology (NIST) has been developing a suite of phantoms having traceable activity content to enable scanner calibration and performance testing. The configurations range from small solid cylindrical sources having volumes from 1 mL to 23 mL to large cylinders having a total volume of 9 L. The phantoms are constructed with 68Ge as a long-lived substitute for the more clinically useful radionuclide 18F. The contained activity values are traceable to the national standard for 68Ge and are also linked to the standard for 18F through a careful series of comparisons. The techniques that have been developed are being applied to a variety of new phantom configurations using different radionuclides. 
Image-based additive manufacturing techniques are also being investigated to create fillable phantoms having irregular shapes which can better mimic actual organs and tumors while still maintaining traceability back to primary standards for radioactivity. This talk will describe the methods used to construct, calibrate, and characterize the phantoms, focusing on the preservation of the traceability link to the primary standards of the radionuclides used. The on-going development of specialized traceable phantoms for specific organ dosimetry applications and imaging physics studies will also be discussed.
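The link from a long-lived 68Ge calibration back to the activity at measurement time rests on simple exponential decay; a minimal sketch (the half-life is a published nuclear-data value assumed here, and the activity is illustrative):

```python
import math

T_HALF_GE68_DAYS = 270.95   # published Ge-68 half-life, assumed value

def decay_corrected_activity(a0_mbq, elapsed_days, t_half_days=T_HALF_GE68_DAYS):
    """A(t) = A0 * exp(-ln2 * t / T_half): activity of a sealed phantom
    some time after its traceably calibrated reference date."""
    return a0_mbq * math.exp(-math.log(2) * elapsed_days / t_half_days)

# A phantom calibrated at 50 MBq, imaged one half-life later
print(decay_corrected_activity(50.0, 270.95))   # ~25.0 MBq
```

The same relation, with the ~110 min half-life of 18F, underlies the comparisons that link the 68Ge phantom standard to the clinically used radionuclide.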
Advanced Mirror Technology Development (AMTD) for Very Large Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip
2013-01-01
Accomplishments include: Assembled outstanding team from academia, industry and government with expertise in science and space telescope engineering. Derived engineering specifications for monolithic primary mirror from science measurement needs & implementation constraints. Pursuing long-term strategy to mature technologies necessary to enable future large aperture space telescopes. Successfully demonstrated capability to make 0.5 m deep mirror substrate and polish it to UVOIR traceable figure specification.
Wine Traceability: A Data Model and Prototype in Albanian Context
Vukatana, Kreshnik; Sevrani, Kozeta; Hoxha, Elira
2016-01-01
Wine traceability is a critical issue that has gained interest internationally. Quality control programs and schemes are mandatory in many countries, including EU members and the USA. Albania has transformed most of the EU regulations on food into laws, and in the wine sector the obligation of producers to keep traceability data is part of the legislation. The analysis of interviews conducted with Albanian winemakers shows that these data are currently recorded only in hard copy; another fact that emerges from the interviews is that only two producers have implemented the ISO (International Organization for Standardization) standards on food. The purpose of this paper is to develop an agile and automated traceability system based on these standards. We propose a data model and system prototype, described in the second and third sections of this work. The data model is an adaptation along the lines of the GS1 (Global Standards One) specifications for a wine supply chain. The proposed prototype has a key component that is mobile access to information about the wine through barcode technology; by using this mechanism the consumer obtains transparency on his expectations concerning the quality criteria. Another important component of the proposed system is a real-time notification module that works as an alert system when a risk is identified. This can help producers and authorities rapidly identify a contaminated product, which is important when recalling the product from the market or preventing it from reaching the consumer. PMID:28231105
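A GS1-style data model of the kind described can be sketched as a pair of record types (a hypothetical illustration; all field names and values are assumptions, not the paper's schema):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class TraceEvent:
    """One supply-chain event in a GS1-style record (fields are illustrative)."""
    gtin: str        # product identifier carried in the barcode
    lot: str
    step: str        # e.g. "harvest", "fermentation", "bottling"
    location: str
    timestamp: datetime

@dataclass
class WineLot:
    gtin: str
    lot: str
    events: List[TraceEvent] = field(default_factory=list)

    def history(self):
        """Chronological trace shown to a consumer scanning the barcode."""
        return sorted(self.events, key=lambda e: e.timestamp)

lot = WineLot("05901234123457", "L2016-07")
lot.events.append(TraceEvent(lot.gtin, lot.lot, "bottling", "Tirana", datetime(2016, 9, 2)))
lot.events.append(TraceEvent(lot.gtin, lot.lot, "harvest", "Berat", datetime(2016, 8, 20)))
print([e.step for e in lot.history()])   # ['harvest', 'bottling']
```

A risk-notification module would then subscribe to new events on a lot and alert when a flagged lot appears downstream.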
Cryar, Adam; Pritchard, Caroline; Burkitt, William; Walker, Michael; O'Connor, Gavin; Burns, Duncan Thorburn; Quaglia, Milena
2013-01-01
Current routine food allergen quantification methods, which are based on immunochemistry, offer high sensitivity but can suffer from issues of specificity and significant variability of results. MS approaches have been developed, but currently lack metrological traceability. A feasibility study on the application of metrologically traceable MS-based reference procedures was undertaken. A proof of concept involving proteolytic digestion and isotope dilution MS for quantification of protein allergens in a food matrix was undertaken using lysozyme in wine as a model system. A concentration of lysozyme in wine of 0.95 ± 0.03 µg/g was calculated based on the concentrations of two peptides, confirming that this type of analysis is viable at allergenically meaningful concentrations. The challenges associated with this promising method were explored; these included peptide stability, chemical modification, enzymatic digestion, and sample cleanup. The method is suitable for the production of allergen in food certified reference materials, which together with the achieved understanding of the effects of sample preparation and of the matrix on the final results, will assist in addressing the bias of the techniques routinely used and improve measurement confidence. Confirmation of the feasibility of MS methods for absolute quantification of an allergenic protein in a food matrix with results traceable to the International System of Units is a step towards meaningful comparison of results for allergen proteins among laboratories. This approach will also underpin risk assessment and risk management of allergens in the food industry, and regulatory compliance of the use of thresholds or action levels when adopted.
CCQM-K140: carbon stable isotope ratio delta values in honey
NASA Astrophysics Data System (ADS)
Dunn, P. J. H.; Goenaga-Infante, H.; Goren, A. C.; Şimşek, A.; Bilsel, M.; Ogrinc, N.; Armishaw, P.; Hai, L.
2017-01-01
As there can be small but measurable differences in isotope ratios between different sources of the same element/compound/material, isotope ratio measurements are applied in a number of different fields including archaeology, environmental science, geochemistry, forensic science and ecology. Isotope ratios for the light elements (H, C, N, O and S) are typically reported as δ-values, which are isotope ratios expressed relative to an internationally agreed standard (this standard is the zero-point on the scale), although absolute isotope ratios which are traceable to the SI have also been reported. The IAWG has been granted a traceability exception for the use of arbitrary delta scales until SI traceability can be established at the required level of uncertainty, but this goal is some years away. While the CCQM IAWG has previously organised several pilot studies on isotope ratio determination (CCQM-P75: Stable isotope delta values in methionine, 2006; CCQM-P105: Sr isotope ratios in wine, 2008; CCQM-K98: Pb isotope ratios in bronze with additional delta values in CCQM-P134, 2011), it has been a number of years since delta values of light elements have been considered and there has been no key comparison (KC). Therefore, the IAWG has included the need for a KC (CCQM-K140) based on an arbitrary delta scale in its program to support ongoing requirements to demonstrate core capabilities as well as specific claims of calibration and measurement capability (CMCs) in this area. The performance of all five of the CCQM-K140 participants was very good, illustrating their ability to obtain accurate results for carbon isotope ratios within the calibration range afforded by internationally agreed reference materials (δ13C on the VPDB-LSVEC scale between -47.32 ‰ and +535.3 ‰), with measurement uncertainties of between 0.08 ‰ and 0.28 ‰. This was despite the fact that no two participants used exactly the same approach in terms of instrumentation or data treatment.
Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
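δ-values as used in this comparison are ratios expressed relative to a scale-defining standard; a minimal sketch of the definition (the VPDB isotope ratio quoted is a commonly cited value, and the sample ratio is illustrative):

```python
def delta_permil(r_sample, r_reference):
    """delta = (R_sample / R_reference - 1) * 1000, in per mil (‰).
    The reference material defines the zero-point of the scale."""
    return (r_sample / r_reference - 1.0) * 1000.0

# Commonly cited absolute 13C/12C ratio of VPDB (assumed value here)
R_VPDB = 0.011180
r_honey = 0.011068          # illustrative sample ratio, not measured data
print(round(delta_permil(r_honey, R_VPDB), 2))   # -10.02 (‰)
```

In practice the absolute ratios are not measured directly; instruments compare sample and reference gases, which is why the arbitrary delta scale (rather than SI-traceable ratios) is still used.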
Reference Model for Project Support Environments Version 1.0
1993-02-28
relationship with the framework’s Process Support services and with the Lifecycle Process Engineering services. Examples: ORCA (Object-based...Design services. Examples: ORCA (Object-based Requirements Capture and Analysis); RETRAC (REquirements TRACeability). 4.3 Life-Cycle Process... "traditional" computer tools. Operations: Examples of audio and video processing operations include: create, modify, and delete sound and video data
78 FR 2039 - Traceability for Livestock Moving Interstate
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-09
... Inspection Service 9 CFR Parts 71, 77, 78, et al. Traceability for Livestock Moving Interstate; Final Rule..., and 86 [Docket No. APHIS-2009-0091] RIN 0579-AD24 Traceability for Livestock Moving Interstate AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Final rule. SUMMARY: We are amending the...
76 FR 50081 - Traceability for Livestock Moving Interstate
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-11
... Inspection Service 9 CFR Parts 71, 77, 78, et al. Traceability for Livestock Moving Interstate; Proposed Rule... 90 [Docket No. APHIS-2009-0091] RIN 0579-AD24 Traceability for Livestock Moving Interstate AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Proposed rule. SUMMARY: We are proposing to...
Cattle traceability system in Japan for bovine spongiform encephalopathy.
Sugiura, Katsuaki; Onodera, Takashi
2008-01-01
To promote consumer confidence in the safety of beef and to ensure the proper implementation of eradication measures against bovine spongiform encephalopathy (BSE), the Cattle Traceability Law was approved by the Diet in June 2003 and a cattle traceability system has been in operation in Japan since December 2003. The system enables tracing the cohort and offspring animals of a BSE case within 24 h of its detection. The traceability database system also provides distributors, restaurants and consumers with information on the cattle from which the beef that they sell, serve and consume, originate.
Study on Full Supply Chain Quality and Safety Traceability Systems for Cereal and Oil Products
NASA Astrophysics Data System (ADS)
Liu, Shihong; Zheng, Huoguo; Meng, Hong; Hu, Haiyan; Wu, Jiangshou; Li, Chunhua
The global food industry and governments in many countries are putting increasing emphasis on the establishment of food traceability systems; food traceability has become an effective way to manage food safety. Aimed at the major quality problems of cereal and oil products in the production, processing, warehousing, distribution and other links of the supply chain, this paper first proposes a new traceability framework that combines the information flow with critical control points and quality indicators. It then introduces the traceability database design and data access mode used to realize the framework. In practice, code design for traced goods is challenging, so this paper puts forward a code system based on the UCC/EAN-128 standard. Middleware and electronic terminal design are also briefly introduced to complete the traceability system for cereal and oil products.
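A code system along UCC/EAN-128 (GS1-128) lines concatenates Application Identifier segments and protects the item code with a mod-10 check digit; a sketch (AIs 01, 17 and 10 are standard GS1 identifiers, but the layout and values here are illustrative, not the paper's scheme):

```python
def gtin14_check_digit(body13: str) -> str:
    """GS1 mod-10 check digit: weight 3 on the rightmost body digit,
    alternating 3,1 moving left; check digit completes the sum to a
    multiple of 10."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body13)))
    return str((10 - total % 10) % 10)

def element_string(gtin14: str, batch: str, expiry_yymmdd: str) -> str:
    """Human-readable GS1 element string: (01) GTIN, (17) expiry, (10) batch."""
    return f"(01){gtin14}(17){expiry_yymmdd}(10){batch}"

body = "1234567890123"
gtin = body + gtin14_check_digit(body)
print(element_string(gtin, "L42", "181231"))
# (01)12345678901231(17)181231(10)L42
```

In actual GS1-128 barcodes the parentheses are for human-readable text only, and variable-length AIs such as (10) are placed last or terminated with a separator character.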
Yao, Sen; Li, Tao; Liu, HongGao; Li, JieQing; Wang, YuanZhong
2018-04-01
Boletaceae mushrooms are wild-grown edible mushrooms, distributed in Yunnan Province, China, that offer high nutrition, delicious flavor and large economic value. Traceability is important for the authentication and quality assessment of Boletaceae mushrooms. In this study, UV-visible and Fourier transform infrared (FTIR) spectroscopies were applied, in combination with chemometrics, to the traceability of 247 Boletaceae mushroom samples. Compared with a single spectroscopy technique, the data fusion strategy clearly improved the classification performance of partial least squares discriminant analysis (PLS-DA) and grid-search support vector machine (GS-SVM) models, for both species and geographical origin traceability. The PLS-DA and GS-SVM models provided 100.00% accuracy for species traceability, with reliable evaluation parameters. For geographical origin traceability, the prediction accuracy of the PLS-DA model based on data fusion was only 64.63%, whereas that of the GS-SVM model based on data fusion was 100.00%. The results demonstrated that the data fusion strategy of UV-visible and FTIR combined with GS-SVM provides a strong synergic effect for the traceability of Boletaceae mushrooms and has good generalization ability for the comprehensive quality control and evaluation of similar foods. © 2017 Society of Chemical Industry.
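Low-level data fusion of the kind described, combining two spectral blocks before classification, can be sketched as block-wise standardization followed by concatenation (synthetic data; the block sizes are illustrative, not the study's):

```python
import numpy as np

def autoscale(block):
    """Column-wise standardization (zero mean, unit variance) of one block,
    so that neither instrument's scale dominates the fused matrix."""
    return (block - block.mean(axis=0)) / block.std(axis=0, ddof=1)

def low_level_fusion(uv_vis, ftir):
    """Standardize each spectral block, then concatenate sample-wise into
    one feature matrix for a downstream classifier (e.g. PLS-DA or SVM)."""
    return np.hstack([autoscale(uv_vis), autoscale(ftir)])

rng = np.random.default_rng(1)
uv = rng.normal(size=(247, 300))    # 247 samples x 300 UV-vis wavelengths
ir = rng.normal(size=(247, 1800))   # 247 samples x 1800 FTIR wavenumbers
X = low_level_fusion(uv, ir)
print(X.shape)                       # (247, 2100)
```

The fused matrix then feeds the classifier of choice; the study's improvement came from the classifier seeing both spectral fingerprints at once.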
Supporting Public Administration with an Integrated BPR Environment
NASA Astrophysics Data System (ADS)
Ciaghi, Aaron; Villafiorita, Adolfo; Weldemariam, Komminist; Mattioli, Andrea; Phan, Quoc-Sang
The definition or redesign of Public Administration (PA) procedures is particularly challenging. This is due, for example, to the required cooperation of different organizational units and actors, to the different laws and procedures governing the production of several artifacts, and to the need to maintain traceability while integrating processes with new laws.
NASA Astrophysics Data System (ADS)
Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.
2018-05-01
Micrometric assembly and alignment requirements for future particle accelerators, and especially for large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable to international standards, for metre-long assemblies, in the range of tens of µm. Indeed, hundreds of these assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (the 10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and the important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m long respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat-load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level), respectively.
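Monte Carlo combination of independent error sources, in the spirit of the simulation-based budgeting described above, can be sketched as follows (the error sources and magnitudes are hypothetical, not the PACMAN budget):

```python
import random
import statistics

def mc_alignment_uncertainty(error_sources_um, n=100_000, seed=7):
    """Monte Carlo combination of independent alignment error sources:
    each trial draws every source from a zero-mean normal with the given
    standard uncertainty (in um) and sums them; the standard deviation of
    the totals approximates the combined standard uncertainty (68% level)."""
    rng = random.Random(seed)
    totals = [sum(rng.gauss(0.0, u) for u in error_sources_um)
              for _ in range(n)]
    return statistics.stdev(totals)

# Hypothetical budget: CMM error, fixturing, thermal drift (std. unc., um)
print(round(mc_alignment_uncertainty([6.0, 4.0, 5.0]), 1))
# close to the analytic root-sum-of-squares sqrt(6^2+4^2+5^2) ≈ 8.8 um
```

For Gaussian, independent sources this reproduces the root-sum-of-squares result; the Monte Carlo form pays off once correlations or non-Gaussian terms enter the budget.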
EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data
NASA Astrophysics Data System (ADS)
D'Amico, G.; Amodeo, A.; Mattis, I.; Freudenthaler, V.; Pappalardo, G.
2015-10-01
In this paper we describe an automatic tool for the pre-processing of lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. The ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, the ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. The ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of the ELPP module, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of the ELPP module is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of the ELPP module. The whole SCC has been tested with the same synthetic data sets which were used for the EARLINET algorithm inter-comparison exercise. The ELPP module has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
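Two of the corrections listed above, dead-time correction and background subtraction, can be sketched as follows (a non-paralyzable dead-time model and a synthetic profile are assumed; this is not the ELPP code):

```python
import numpy as np

def deadtime_correct(rate_mhz, tau_us):
    """Non-paralyzable dead-time correction for photon-counting signals:
    N_true = N_obs / (1 - N_obs * tau), with the observed count rate in
    MHz and the detector dead time in microseconds."""
    return rate_mhz / (1.0 - rate_mhz * tau_us)

def subtract_background(profile, bg_bins=500):
    """Estimate the (atmospheric + electronic) background from the
    far-range tail of the profile and subtract it from every bin."""
    return profile - profile[-bg_bins:].mean()

# Synthetic profile: 2 MHz background everywhere, +50 MHz near-range return
signal = np.full(4000, 2.0)
signal[:1000] += 50.0
corrected = subtract_background(deadtime_correct(signal, tau_us=0.004))
print(round(float(corrected[-1]), 6))   # far-range bins sit near 0
```

Gluing and trigger-delay correction would follow the same pattern: each is a small, configuration-driven transform applied in sequence to the raw profiles.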
Effective Materials Property Information Management for the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Cebon, David; Barabash, Oleg M
2011-01-01
This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fuelled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of economy often generates great needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands and evolved to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data pedigree traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.
Ultrasound Metrology in Mexico: a round robin test for medical diagnostics
NASA Astrophysics Data System (ADS)
Amezola Luna, R.; López Sánchez, A. L.; Elías Juárez, A. A.
2011-02-01
This paper presents preliminary statistical results from an ongoing medical ultrasound imaging study, of particular relevance to the gynecology and obstetrics areas. Its scope is twofold: first, to compile the medical ultrasound infrastructure available in the cities of Queretaro, Mexico, and second, to promote the use of traceable measurement standards as a key aspect of assuring the quality of ultrasound examinations performed by medical specialists. The experimental methodology is based on a round robin test using an ultrasound phantom for medical imaging. The physician, using his or her own ultrasound machine, couplant and facilities, measures the size and depth of a set of pre-defined reflecting and absorbing targets in the reference phantom, which simulate human illnesses. The measurements give the medical specialist objective feedback on performance characteristics of their ultrasound examination systems, such as measurement system accuracy, dead zone, axial resolution, depth of penetration and anechoic target detection. By the end of March 2010, 66 entities with medical ultrasound facilities, from both public and private institutions, had performed measurements. A network of medical ultrasound calibration laboratories in Mexico, with traceability to the International System of Units via national measurement standards, may indeed contribute to reducing measurement deviations and thus attaining better diagnostics.
Early Predictors of Transfusion and Mortality After Injury: A Review of the Data-Based Literature
2006-01-01
requiring red cell therapy and, among those with a pre-hospital index score 3, only 14% required transfusion.11 In a series published in 2002 by Starr and... fracture.12 Given the complexity of some of the pre-hospital trauma scoring systems, Franklin and colleagues evaluated pre-hospital hypotension as a... surrogate for pre-hospital scoring and found that nearly 50% of patients with survivable pre-hospital hypotension required specific therapy for
Enhanced Traceability for Bulk Processing of Sentinel-Derived Information Products
NASA Astrophysics Data System (ADS)
Lankester, Thomas; Hubbard, Steven; Knowelden, Richard
2016-08-01
The advent of widely available, systematically acquired and advanced Earth observations from the Sentinel platforms is spurring the development of a wide range of derived information products. Whilst welcome, this rapid rate of development inevitably leads to some processing instability as algorithms and production steps are required to evolve accordingly. To mitigate this instability, the provenance of EO-derived information products needs to be traceable and transparent. Airbus Defence and Space (Airbus DS) has developed the Airbus Processing Cloud (APC) as a virtualised processing farm for bulk production of EO-derived data and information products. The production control system of the APC transforms internal configuration control information into an INSPIRE metadata file containing a stepwise set of processing steps and data source elements that provide the complete and transparent provenance of each product generated.
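The stepwise provenance record described above can be sketched as follows. This is only an illustration of the idea, not the actual INSPIRE/Airbus DS schema: the element names, processing-step names and the Sentinel-style source identifier are all hypothetical.

```python
import xml.etree.ElementTree as ET

# Sketch of a stepwise provenance (lineage) record for a derived EO product.
# Element names and step/source identifiers are illustrative assumptions.
lineage = ET.Element("lineage")
for order, (step_name, source) in enumerate([
        ("atmospheric-correction", "S2A_MSIL1C_20160801.SAFE"),  # hypothetical input
        ("ndvi-computation", "corrected-reflectance"),
    ], start=1):
    step = ET.SubElement(lineage, "processStep", order=str(order))
    ET.SubElement(step, "description").text = step_name
    ET.SubElement(step, "source").text = source

# Serialise the record so it can travel with the product as metadata.
record = ET.tostring(lineage, encoding="unicode")
print(record)
```

Because each step records both its description and its input source, a reader of the metadata can walk the chain backwards from the final product to the original acquisition.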
Code of Federal Regulations, 2010 CFR
2010-10-01
... or agency); (D) Manufacturer of the lighter. For a foreign manufacturer, the U.S. agent or importer... foreign manufacturer's U.S. agent or importer. (iii) Test reports must be traceable to a specific lighter... a contrasting background and must be in letters measuring at least 12.7 mm (0.5 inch) in height. (e...
ALT-114 and ALT-118 Alternative Approaches to NIST-Traceable Reference Gases
In 2016, US EPA approved two separate alternatives (ALT 114 and ALT 118) for the preparation and certification of Hydrogen Chloride (HCl) and Mercury (Hg) cylinder reference gas standards that can serve as EPA Protocol gases where EPA Protocol gases are required but unavailable. The a...
Reference Materials for Food and Nutrition Metrology: Past, Present and Future
USDA-ARS?s Scientific Manuscript database
Establishment of a metrology-based measurement system requires the solid foundation of traceability of measurements to available, appropriate certified reference materials (CRM). In the early 1970s the first "biological" RM of Bowen's Kale, as well as Orchard Leaves and Bovine Liver SRMs, from the ...
Instructional Design: Science, Technology, Both, Neither
ERIC Educational Resources Information Center
Gropper, George L.
2017-01-01
What would it take for instructional design to qualify as a bona fide applied discipline? First and foremost, a fundamental requirement is a testable and tested theoretical base. Untested rationales until verified remain in limbo. Secondly, the discipline's applied prescriptions must be demonstrably traceable to the theoretical base once it is…
Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.
Hardt, Marah J; Flett, Keith; Howell, Colleen J
2017-08-01
Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.
Flight software requirements and design support system
NASA Technical Reports Server (NTRS)
Riddle, W. E.; Edwards, B.
1980-01-01
The desirability and feasibility of computer-augmented support for the pre-implementation activities occurring during the development of flight control software was investigated. The specific topics investigated were the capabilities to be included in a pre-implementation support system for flight control software system development, and the specification of a preliminary design for such a system. Further, the pre-implementation support system was to be characterized and specified under the constraints that it: (1) support both description and assessment of flight control software requirements definitions and design specifications; (2) account for known software description and assessment techniques; (3) be compatible with existing and planned NASA flight control software development support systems; and (4) not impose, but possibly encourage, specific development technologies. An overview of the results is given.
A primer on precision medicine informatics.
Sboner, Andrea; Elemento, Olivier
2016-01-01
In this review, we describe key components of a computational infrastructure for a precision medicine program that is based on clinical-grade genomic sequencing. Specific aspects covered in this review include software components and hardware infrastructure, reporting, integration into Electronic Health Records for routine clinical use and regulatory aspects. We emphasize informatics components related to reproducibility and reliability in genomic testing, regulatory compliance, traceability and documentation of processes, integration into clinical workflows, privacy requirements, prioritization and interpretation of results to report based on clinical needs, rapidly evolving knowledge base of genomic alterations and clinical treatments and return of results in a timely and predictable fashion. We also seek to differentiate between the use of precision medicine in germline and cancer. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
[Establishment of traceability system of Chinese medicinal materials' quality].
Qi, Yao-dong; Gao, Shi-man; Liu, Hai-tao; Li, Xi-wen; Wei, Jian-he; Zhang, Ben-gang; Sun, Xiao-bo; Xiao, Pei-gen
2015-12-01
The quality of Chinese medicinal materials bears directly on clinical efficacy and safety. In order to ensure the quality and safety of Chinese medicinal materials, a systematic and operable traceability system needs to be established. Such a system can realize whole-process quality and safety management of Chinese medicinal materials "from production to consumption" through recording and querying information and recalling defective products, which is an important direction for the future development of traditional Chinese medicine. However, it is still at the exploration and trial stage. In this paper, a framework for a Chinese medicinal materials quality and safety traceability system was established on the basis of domestic and international experience with the construction of food and agricultural product traceability systems. The relationship between the traceability system for Chinese medicinal materials' quality and GAP, GMP and GSP was analyzed, and possible problems and their corresponding solutions were discussed.
NASA Astrophysics Data System (ADS)
Neumann, Jay; Parlato, Russell; Tracy, Gregory; Randolph, Max
2015-09-01
Focal plane alignment for large format arrays and faster optical systems requires enhanced precision methodology and stability over temperature. The increase in focal plane array size continues to drive alignment capability. Depending on the optical system, focal plane flatness of less than 25 μm (.001") is required over transition temperatures from ambient to cooled operating temperatures. The focal plane flatness requirement must also be maintained in airborne or launch vibration environments. This paper addresses the challenge of detector integration into the focal plane module and housing assemblies, the methodology to reduce error terms during integration, and the evaluation of thermal effects. The driving factors influencing alignment accuracy include: datum transfers, material effects over temperature, alignment stability over test, adjustment precision and traceability to NIST standards. The FPA module design and alignment methodology reduces the error terms by minimizing measurement transfers to the housing. In the design, proper material selection requires matched coefficient-of-expansion materials, which minimizes both the physical shift over temperature and the stress induced into the detector. When required, the co-registration of focal planes and filters can achieve submicron relative positioning by applying precision equipment, interferometry and piezoelectric positioning stages. All measurements and characterizations maintain traceability to NIST standards. The metrology characterizes the equipment's accuracy, repeatability and the precision of the measurements.
Cobbaert, Christa; Smit, Nico; Gillery, Philippe
2018-05-07
In our efforts to advance the profession and practice of clinical laboratory medicine, strong coordination and collaboration are needed more than ever before. At the dawn of the 21st century, medical laboratories are facing many unmet clinical needs, a technological revolution promising a plethora of better biomarkers, financial constraints, a growing scarcity of well-trained laboratory technicians, and a sharply increasing number of International Organization for Standardization guidelines and new regulations with which medical laboratories must comply in order to guarantee the safety and effectiveness of medical test results. Although this is a global trend, medical laboratories across continents and countries are in distinct phases and experience various situations. A universal underlying requirement for the safe and global use of medical test results is the standardization and harmonization of test results. Two decades on, after a number of endeavors on the standardization/harmonization of medical tests, it is time to reflect on the effectiveness of the approaches used. To keep laboratory medicine sustainable, viable and affordable, essential prerequisites are: clarification of the promise of metrological traceability of test results for improving sick care and health care, realization of formal commitment among all stakeholders of the metrological traceability chain, and preparation of a joint and global plan for action. Policy makers and regulators should not simply overwhelm the diagnostic sector with oversight and regulation; they should also create the conditions for success by establishing a global professional forum for anchoring the metrological traceability concept in the medical test domain. Likewise, professional societies should have a strong voice in their (inter-)national governments to negotiate long-lasting public policy commitment and funds for the global standardization of medical tests.
On the traceability of gaseous reference materials
NASA Astrophysics Data System (ADS)
Brown, Richard J. C.; Brewer, Paul J.; Harris, Peter M.; Davidson, Stuart; van der Veen, Adriaan M. H.; Ent, Hugo
2017-06-01
The complex and multi-parameter nature of chemical composition measurement means that establishing traceability is a challenging task. As a result, incorrect interpretations about the origin of the metrological traceability of chemical measurement results can occur. This discussion paper examines why this is the case by scrutinising the peculiarities of the gas metrology area. It considers in particular: primary methods, the dissemination of metrological traceability, and the role of documentary standards and accreditation bodies in promulgating best practice. There is also a discussion of documentary standards relevant to the NMI and reference material producer community which need clarification, and of the impact which key stakeholders in the quality infrastructure can bring to bear on these issues.
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
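The core linkage the RVC database maintains (requirement → verification → compliance status, queryable by team members) can be sketched with a minimal relational model. This is a hypothetical illustration; the abstract does not describe the actual ISWE schema, so all table and column names here are assumptions.

```python
import sqlite3

# Minimal sketch of a requirements/verification/compliance store.
# Table and column names are illustrative, not the actual RVC schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE requirement (
    req_id TEXT PRIMARY KEY,
    text   TEXT NOT NULL,
    parent TEXT REFERENCES requirement(req_id)   -- requirements traceability
);
CREATE TABLE verification (
    ver_id TEXT PRIMARY KEY,
    req_id TEXT NOT NULL REFERENCES requirement(req_id),
    method TEXT NOT NULL,                        -- e.g. test, analysis, inspection
    status TEXT NOT NULL DEFAULT 'open'          -- compliance status
);
""")
con.executemany("INSERT INTO requirement VALUES (?,?,?)", [
    ("SYS-1",   "Weld samples in vacuum", None),
    ("SYS-1.1", "Chamber pressure below threshold", "SYS-1"),
])
con.executemany("INSERT INTO verification VALUES (?,?,?,?)", [
    ("VER-7", "SYS-1.1", "test", "closed"),
])

# Compliance report: every requirement with its verification status.
rows = con.execute("""
    SELECT r.req_id, COALESCE(v.status, 'unverified')
    FROM requirement r LEFT JOIN verification v ON v.req_id = r.req_id
    ORDER BY r.req_id
""").fetchall()
print(rows)
```

The LEFT JOIN is the key design point: requirements with no verification record still appear in the report, flagged as unverified, which is exactly the gap a systems engineer needs to see.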
Functional Requirements Document for HALE UAS Operations in the NAS: Step 1. Version 3
NASA Technical Reports Server (NTRS)
2006-01-01
The purpose of this Functional Requirements Document (FRD) is to compile the functional requirements needed to achieve the Access 5 Vision of "operating High Altitude, Long Endurance (HALE) Unmanned Aircraft Systems (UAS) routinely, safely, and reliably in the national airspace system (NAS)" for Step 1. These functional requirements could support the development of a minimum set of policies, procedures and standards by the Federal Aviation Administration (FAA) and various standards organizations. It is envisioned that this comprehensive body of work will enable the FAA to establish and approve regulations to govern safe operation of UAS in the NAS on a routine or daily "file and fly" basis. The approach used to derive the functional requirements found within this FRD was to decompose the operational requirements and objectives identified within the Access 5 Concept of Operations (CONOPS) into the functions needed to routinely and safely operate a HALE UAS in the NAS. As a result, four major functional areas evolved to enable routine and safe UAS operations on an on-demand basis in the NAS. These four major functions are: Aviate, Navigate, Communicate, and Avoid Hazards. All of the functional requirements within this document are directly traceable to one of these four major functions. Some functions, however, are traceable to several, or even all, of these four major functions. These cross-cutting functional requirements support the "Command/Control" function as well as the "Manage Contingencies" function. The requirements associated with these high-level functions and all of their supporting low-level functions are addressed in subsequent sections of this document.
USDA-ARS?s Scientific Manuscript database
Color measurements of cotton fiber and cotton textile products are important quality parameters. The Uster® High Volume Instrument (HVI) is an instrument used globally to classify cotton quality, including cotton color. Cotton color by HVI is based on two cotton-specific color parameters—Rd (diffuse...
Mpamhanga, C J; Wotton, S B
2015-09-01
This study compared normal post-Jarvis stun/kill responses and carcass quality with those occurring when crush restraint was not used pre-slaughter. The carcasses of 1065 cattle slaughtered during one week at a commercial abattoir were evaluated for quality. The post-stun/kill responses of 788 of these animals were also assessed. Additional data from the carcasses of 6061 cattle were further evaluated for quality findings. A significant reduction in post-stun/kill limb movement, muscle tone and the expression of brainstem functions was recorded when restraint was not used. Abolishing crush restraint pre-slaughter also produced a significant reduction in the incidence of blood splash. In addition, the study showed that animal identification post-slaughter could be successfully implemented with no negative consequences for food safety or traceability. It is suggested that abolishing the use of pre-slaughter crush restraint of cattle would enhance animal welfare and operator safety in plants, whether electrical or mechanical stunning was employed. Copyright © 2015. Published by Elsevier Ltd.
48 CFR 252.211-7008 - Use of Government-assigned Serial Numbers
Code of Federal Regulations, 2012 CFR
2012-10-01
... all levels of life cycle management. Major end items include aircraft; ships; boats; motorized wheeled... never changes in order to provide traceability of the item throughout its total life cycle. The term... items for use throughout the life of the major end item. The Contractor may elect, but is not required...
48 CFR 252.211-7008 - Use of Government-Assigned Serial Numbers
Code of Federal Regulations, 2011 CFR
2011-10-01
... all levels of life cycle management. Major end items include aircraft; ships; boats; motorized wheeled... never changes in order to provide traceability of the item throughout its total life cycle. The term... items for use throughout the life of the major end item. The Contractor may elect, but is not required...
48 CFR 252.211-7008 - Use of Government-assigned Serial Numbers
Code of Federal Regulations, 2013 CFR
2013-10-01
... all levels of life cycle management. Major end items include aircraft; ships; boats; motorized wheeled... never changes in order to provide traceability of the item throughout its total life cycle. The term... items for use throughout the life of the major end item. The Contractor may elect, but is not required...
48 CFR 252.211-7008 - Use of Government-assigned Serial Numbers
Code of Federal Regulations, 2014 CFR
2014-10-01
... all levels of life cycle management. Major end items include aircraft; ships; boats; motorized wheeled... never changes in order to provide traceability of the item throughout its total life cycle. The term... items for use throughout the life of the major end item. The Contractor may elect, but is not required...
USDA-ARS?s Scientific Manuscript database
Establishment of a metrology-based measurement system requires the solid foundation of traceability of measurements to available, appropriate certified reference materials (CRM). In the early 1970s the first "biological" Reference Material (RM) of Bowen's Kale, Orchard Leaves, and Bovine Liver from ...
Code of Federal Regulations, 2014 CFR
2014-07-01
... accuracy that is traceable to National Institute of Standards and Technology (NIST) standards. (ii) The... section. (i) Perform a single-point calibration using an NIST-certified buffer solution that is accurate... include a redundant pH sensor, perform a single point calibration using an NIST-certified buffer solution...
Code of Federal Regulations, 2013 CFR
2013-07-01
... accuracy that is traceable to National Institute of Standards and Technology (NIST) standards. (ii) The... section. (i) Perform a single-point calibration using an NIST-certified buffer solution that is accurate... include a redundant pH sensor, perform a single point calibration using an NIST-certified buffer solution...
High performance, accelerometer-based control of the Mini-MAST structure at Langley Research Center
NASA Technical Reports Server (NTRS)
Collins, Emmanuel G., Jr.; King, James A.; Phillips, Douglas J.; Hyland, David C.
1991-01-01
Many large space system concepts will require active vibration control to satisfy critical performance requirements such as line-of-sight pointing accuracy and constraints on rms surface roughness. In order for these concepts to become operational, it is imperative that the benefits of active vibration control be shown to be practical in ground-based experiments. The results of an experiment show the successful application of the Maximum Entropy/Optimal Projection control design methodology to active vibration control for a flexible structure. The testbed is the Mini-Mast structure at NASA Langley and has features dynamically traceable to future space systems. To maximize traceability to real flight systems, the controllers were designed and implemented using sensors (four accelerometers and one rate gyro) that are actually mounted to the structure. Ground-mounted displacement sensors that could greatly ease the control design task were available but were used only for performance evaluation. The use of the accelerometers increased the potential of destabilizing the system due to spillover effects and motivated the use of a precompensation strategy to achieve sufficient compensator roll-off.
High performance, accelerometer-based control of the Mini-MAST structure
NASA Technical Reports Server (NTRS)
Collins, Emmanuel G., Jr.; King, James A.; Phillips, Douglas J.; Hyland, David C.
1992-01-01
Many large space system concepts will require active vibration control to satisfy critical performance requirements such as line-of-sight pointing accuracy and constraints on rms surface roughness. In order for these concepts to become operational, it is imperative that the benefits of active vibration control be shown to be practical in ground-based experiments. The results of an experiment show the successful application of the Maximum Entropy/Optimal Projection control design methodology to active vibration control for a flexible structure. The testbed is the Mini-Mast structure at NASA Langley and has features dynamically traceable to future space systems. To maximize traceability to real flight systems, the controllers were designed and implemented using sensors (four accelerometers and one rate gyro) that are actually mounted to the structure. Ground-mounted displacement sensors that could greatly ease the control design task were available but were used only for performance evaluation. The use of the accelerometers increased the potential of destabilizing the system due to spillover effects and motivated the use of a precompensation strategy to achieve sufficient compensator roll-off.
Surface-specific additive manufacturing test artefacts
NASA Astrophysics Data System (ADS)
Townsend, Andrew; Racasan, Radu; Blunt, Liam
2018-06-01
Many test artefact designs have been proposed for use with additive manufacturing (AM) systems. These test artefacts have primarily been designed for the evaluation of AM form and dimensional performance. A series of surface-specific measurement test artefacts designed for use in the verification of AM manufacturing processes are proposed here. Surface-specific test artefacts can be made more compact because they do not require the large dimensions needed for accurate dimensional and form measurements. The series of three test artefacts are designed to provide comprehensive information pertaining to the manufactured surface. Measurement possibilities include deviation analysis, surface texture parameter data generation, sub-surface analysis, layer step analysis and build resolution comparison. The test artefacts are designed to provide easy access for measurement using conventional surface measurement techniques, for example, focus variation microscopy, stylus profilometry, confocal microscopy and scanning electron microscopy. Additionally, the test artefacts may be simply visually inspected as a comparative tool, giving a fast indication of process variation between builds. The three test artefacts are small enough to be included in every build and include built-in manufacturing traceability information, making them a convenient physical record of the build.
Construction of Traceability System for Quality Safety of Cereal and Oil Products
NASA Astrophysics Data System (ADS)
Zheng, Huoguo; Liu, Shihong; Meng, Hong; Hu, Haiyan
After several significant food safety incidents, the global food industry and governments in many countries are putting increasing emphasis on the establishment of food traceability systems. Food traceability has become an effective way to manage food quality and safety. A traceability system for the quality and safety of cereal and oil products was designed and implemented using HACCP and FMECA methods, encoding, information processing, and hardware R&D technology, covering the whole supply chain of cereal and oil products. Results indicated that the system provides not only management of origin, processing, circulation and consumption for enterprises, but also tracing services for customers and supervisors by means of telephone, internet, SMS, touch-screen terminals and mobile devices.
Kinumi, Tomoya; Goto, Mari; Eyama, Sakae; Kato, Megumi; Kasama, Takeshi; Takatsu, Akiko
2012-07-01
A certified reference material (CRM) is a higher-order calibration material used to enable a traceable analysis. This paper describes the development of a C-peptide CRM (NMIJ CRM 6901-a) by the National Metrology Institute of Japan using two independent methods for amino acid analysis based on isotope-dilution mass spectrometry. C-peptide is a 31-mer peptide that is utilized for the evaluation of pancreatic β-cell function in clinical testing. This CRM is a lyophilized synthetic peptide having the human C-peptide sequence, and contains deamidated and pyroglutamylated forms of C-peptide. By adding (1.00 ± 0.01) g of water to the vial containing the CRM, the C-peptide solution in 10 mM phosphate-buffered saline (pH 6.6) is reconstituted. We assigned two certified values that represent the concentrations of total C-peptide (a mixture of C-peptide, deamidated C-peptide, and pyroglutamylated C-peptide) and of C-peptide. The certified concentration of total C-peptide was determined by two amino acid analyses, using pre-column derivatization liquid chromatography-mass spectrometry and hydrophilic interaction chromatography-mass spectrometry following acid hydrolysis. The certified concentration of C-peptide was determined by multiplying the concentration of total C-peptide by the ratio of the relative area of C-peptide to that of total C-peptide measured by liquid chromatography. The certified value of C-peptide, (80.7 ± 5.0) mg/L, represents the concentration of the specific entity C-peptide; the certified value of total C-peptide, (81.7 ± 5.1) mg/L, can be used for analyses that do not differentiate deamidated and pyroglutamylated C-peptide from C-peptide itself, such as amino acid analyses and immunochemical assays.
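The relationship between the two certified values can be checked numerically. Note that the certificate's actual liquid-chromatography area ratio is not quoted in the abstract, so the ratio below is back-calculated from the two certified values purely for illustration.

```python
# Certified values from NMIJ CRM 6901-a as quoted in the abstract (mg/L).
total_c_peptide = 81.7   # C-peptide + deamidated + pyroglutamylated forms

# The C-peptide value equals the total multiplied by the relative LC peak-area
# ratio of intact C-peptide. The ratio here is back-calculated for illustration,
# not taken from the certificate.
area_ratio = 80.7 / 81.7           # ≈ 0.988, i.e. ~98.8% intact C-peptide
c_peptide = total_c_peptide * area_ratio
print(round(c_peptide, 1))  # 80.7
```

The arithmetic makes the roles of the two values concrete: the total concentration anchors the amino-acid-analysis traceability, while the area ratio apportions that total to the intact peptide entity.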
Yu, Chang-Ho; Kwon, Tae-Kyu; Park, Chan Hee; Ohta, Makoto; Kim, Sung Hoon
2015-01-01
In this paper, we investigated parameters for effectively evaluating the traceability of interventional devices as a mechanical property. In our evaluation system, a box-shaped poly(vinyl alcohol) hydrogel (PVA-H) and silicone were prepared with realistic geometry, and the measurement and evaluation of traceability were carried out on devices using hand-applied load force. The phantom models had a total of five curved pathways to reach the aneurysm sac. Traceability depends on the performance of the interventional devices in passing through the curved part of the model simulation track. The traceability of the guide wire was found to be much better than that of the balloon and stent loading catheter, as it reached the aneurysm sac in both phantom models. Observation using the video record is another advantage of our system, because the high transparency of the silicone and PVA-H materials allows visualization of the inside of an artery.
NASA Astrophysics Data System (ADS)
Qi, Man; Edgar-Nevill, Denis; Wang, Yongquan; Xu, Rongsheng
Traceability is a key to the investigation of internet crime and a cornerstone of internet research. It is impossible to prevent all internet misuse, but it may be possible to identify and trace the users, and then take appropriate action. This paper presents the value of traceability within email and news-posting utilities, the technologies being used to hide identities, the difficulties in locating traceable data and the challenges in tracking online trails.
Management traceability information system for the food supply chain
NASA Astrophysics Data System (ADS)
Bendriss, S.; Benabdelhafid, A.; Boukachour, J.
2008-06-01
For a long time, traceability was applied only for management reasons, but with the advent of new information and communication technologies increasingly used in logistics, the notion of traceability has been extended to meet new market needs for information, by ensuring accessibility of the data characterizing or related to the product throughout its life cycle. On the basis of this postulate, we raise some research questions, beginning with a presentation of the progress achieved, the assumptions and the objectives relating to traceability. We then review the principal work, showing how the scientific question has evolved; in particular, information systems integrating traceability have been developed very little in the literature. Building on this first part, we present our generic modeling approach for a communicating product, or "smart object", able to take into account the various elements essential for its traceability: the product in its various states, the operations carried out on the product, the resources used, its localization, and the interactions between the product and its environment, realized on the basis of a set of services. In order to validate our generic model, a case study representing an application in the food industry is presented.
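The "smart object" model described above, a product carrying its own states, operations, resources, localization and interactions, can be sketched as a small data structure. All class, field and sample names here are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass, field

# Sketch of the "communicating product" (smart object) traceability model.
# Names are illustrative, not the paper's actual model.
@dataclass
class TraceEvent:
    operation: str   # operation carried out on the product
    resource: str    # resource used to perform it
    location: str    # localization of the product at that time
    state: str       # state of the product after the operation

@dataclass
class SmartProduct:
    product_id: str
    history: list = field(default_factory=list)

    def record(self, operation, resource, location, state):
        """Append one life-cycle event to the product's own history."""
        self.history.append(TraceEvent(operation, resource, location, state))

    def trace(self):
        """Return the full life-cycle trail, oldest event first."""
        return [(e.operation, e.location, e.state) for e in self.history]

# Hypothetical food-industry example, echoing the paper's case study domain.
batch = SmartProduct("flour-lot-42")
batch.record("milling", "mill-3", "plant A", "milled")
batch.record("packing", "line-1", "plant A", "packed")
print(batch.trace())
```

Because each event is stored with the product rather than in a separate department's log, any actor in the supply chain holding the object can replay its entire history, which is the essence of the "communicating product" idea.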
Who decides who has won the bet? Total and Anthropogenic Warming Indices
NASA Astrophysics Data System (ADS)
Haustein, K.; Allen, M. R.; Otto, F. E. L.; Schmidt, A.; Frame, D. J.; Forster, P.; Matthews, D.
2016-12-01
An extension of the idea of betting markets as a means of revealing opinions about future climate is climate policies indexed to geophysical indicators: for example, to ensure net zero global carbon dioxide emissions by the time anthropogenic warming reaches 1.5 degrees above pre-industrial, given about 1 degree of warming already, emissions must fall, on average, by 20% of their current value for every tenth of a degree of anthropogenic warming from now on. In principle, policies conditioned on some measure of attributable warming are robust to uncertainty in the global climate response: the risk of a higher or lower response than expected is borne by those affected by climate change mitigation policy rather than those affected by climate change impacts, as is the case with emission targets for specific years based on "current understanding" of the response. To implement any indexed policy, or to agree payout terms for any bet on future climate, requires consensus on the definition of the index: how is it calculated, and who is responsible for releasing it? The global mean surface temperature of the current decade relative to pre-industrial may vary by 0.1 degree or more depending on precisely what is measured, what is defined as pre-industrial, and the treatment of regions with sparse data coverage in earlier years. Indices defined using different conventions, however, are all expected to evolve very similarly over the coming decades, so agreeing on a conservative, traceable index such as HadCRUT is more important than debating the "true" global temperature. A more important question is whether indexed policies and betting markets should focus on total warming, including natural and anthropogenic drivers and internal variability, or an Anthropogenic Warming Index (AWI) representing an unbiased estimate of warming attributable to human influence to date.
We propose a simple AWI based solely on observed temperatures and global natural and anthropogenic forcing estimates. It is much less volatile than total observed warming, which might discourage participation in betting markets, but would be a substantial advantage for indexed policies. It is also much more relevant to the UNFCCC goal of limiting anthropogenic warming to "well below" 2 degrees. The 2016 value for the AWI will be announced at AGU.
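A toy version of such an index can be sketched as a least-squares scaling of assumed forcing-response shapes against observed temperatures. Everything below, including the response shapes, noise level, and coefficients, is invented for illustration and is not the authors' actual AWI construction.

```python
import numpy as np

# Synthetic illustration: annual global-mean temperature anomalies built
# from an anthropogenic trend, a natural (solar/volcanic-like) signal,
# and internal variability. All numbers are invented.
rng = np.random.default_rng(0)
years = np.arange(1960, 2017)
f_anthro = 0.012 * (years - 1960)                 # assumed anthropogenic response shape (K)
f_natural = 0.05 * np.sin((years - 1960) / 5.0)   # assumed natural response shape (K)
temps = f_anthro + f_natural + rng.normal(0, 0.08, years.size)

# Least-squares scaling of the two response patterns to the "observations",
# in the spirit of attribution regression; the AWI is the scaled
# anthropogenic part evaluated in the final year.
X = np.column_stack([f_anthro, f_natural])
beta, *_ = np.linalg.lstsq(X, temps, rcond=None)
awi = beta[0] * f_anthro[-1]   # attributable anthropogenic warming (K)
print(f"AWI estimate for {years[-1]}: {awi:.2f} K")
```

Because the natural and noise terms average out in the fit, the index is far less volatile than the raw temperature series, which is the property the abstract highlights.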
Provenance for actionable data products and indicators in marine ecosystem assessments
NASA Astrophysics Data System (ADS)
Beaulieu, S. E.; Maffei, A. R.; Fox, P. A.; West, P.; Di Stefano, M.; Hare, J. A.; Fogarty, M.
2013-12-01
Ecosystem-based management of Large Marine Ecosystems (LMEs) involves the sharing of data and information products among a diverse set of stakeholders - from environmental and fisheries scientists to policy makers, commercial entities, nonprofits, and the public. Often the data products that are shared have resulted from a number of processing steps and may also have involved the combination of a number of data sources. The traceability from an actionable data product or indicator back to its original data source(s) is important not just for trust and understanding of each final data product, but also to compare with similar data products produced by the different stakeholder groups. For a data product to be traceable, its provenance, i.e., lineage or history, must be recorded and preferably machine-readable. We are collaborating on a use case to develop a software framework for the bi-annual Ecosystem Status Report (ESR) for the U.S. Northeast Shelf LME. The ESR presents indicators of ecosystem status including climate forcing, primary and secondary production, anthropogenic factors, and integrated ecosystem measures. Our software framework retrieves data, conducts standard analyses, provides iterative and interactive visualization, and generates final graphics for the ESR. The specific process for each data and information product is updated in a metadata template, including data source, code versioning, attribution, and related contextual information suitable for traceability, repeatability, explanation, verification, and validation. Here we present the use of standard metadata for provenance for data products in the ESR, in particular the W3C provenance (PROV) family of specifications, including the PROV-O ontology which maps the PROV data model to RDF. We are also exploring extensions to PROV-O in development (e.g., PROV-ES for Earth Science Data Systems, D-PROV for workflow structure). 
To associate data products in the ESR to domain-specific ontologies we are also exploring the Global Change Information System ontology, BCO-DMO Ocean Data Ontology, and other relevant published ontologies (e.g., Integrated Ocean Observing System ontology). We are also using the mapping of ISO 19115-2 Lineage to PROV-O and comparing both strategies for traceability of marine ecosystem indicators. The use of standard metadata for provenance for data products in the ESR will enable the transparency, and ultimately reproducibility, endorsed in the recent NOAA Information Quality Guidelines. Semantically enabling not only the provenance but also the data products will yield a better understanding of the connected web of relationships between marine ecosystem and ocean health assessments conducted by different stakeholder groups.
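The core PROV idea, tracing an indicator back through its processing steps to the source data, can be illustrated without any RDF tooling. The entity and activity names below are invented, and a real implementation would use PROV-O/RDF rather than plain dictionaries.

```python
# Minimal PROV-style lineage record for one hypothetical ESR indicator.
# Names are illustrative, not taken from the actual report.
prov = {
    "used": {                      # activity -> entities it consumed
        "regridding_v2.1": {"sst_raw"},
        "anomaly_calc_v1.4": {"sst_gridded"},
    },
    "wasGeneratedBy": {            # entity -> activity that produced it
        "sst_gridded": "regridding_v2.1",
        "sst_anomaly_indicator": "anomaly_calc_v1.4",
    },
}

def lineage(entity: str) -> list:
    """Walk wasGeneratedBy/used links back to the original source data."""
    chain = [entity]
    while entity in prov["wasGeneratedBy"]:
        activity = prov["wasGeneratedBy"][entity]
        chain.append(activity)
        # assume a single input per activity in this toy example
        entity = next(iter(prov["used"][activity]))
        chain.append(entity)
    return chain

print(lineage("sst_anomaly_indicator"))
# -> ['sst_anomaly_indicator', 'anomaly_calc_v1.4', 'sst_gridded',
#     'regridding_v2.1', 'sst_raw']
```

Machine-readable links of this kind are what make the "actionable data product back to original source" traceability described above possible.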
NASA Technical Reports Server (NTRS)
Heath, Donald F.; Georgiev, Georgi
2012-01-01
This paper describes the combination of a Mie scattering spectral BSDF and BTDF albedo standard whose calibration is traceable to the NIST SIRCUS Facility or the NIST STARR II Facility. The Space-based Calibration Transfer Spectroradiometer (SCATS) sensor uses a simple, invariant optical configuration and dedicated narrow-band spectral channel modules to provide very accurate, polarization-insensitive, stable measurements of earth albedo and lunar disk albedo. Optical degradation effects on calibration stability are eliminated through use of a common optical system for observations of the Sun, Earth, and Moon. The measurements from space would be traceable to SI units through preflight calibrations of radiance and irradiance at NIST's SIRCUS facility and the invariant optical system used in the sensor. Simultaneous measurements are made in multiple spectral channels covering the solar reflective wavelength range of 300 nm to 2.4 microns. The large dynamic range of signals is handled by use of single-element, highly linear detectors, stable discrete electronic components, and a non-imaging optical configuration. Up to 19 spectral modules can be mounted on a single-axis drive to give direct pointing at the Earth and at least one view per orbit of the Sun and Moon. By observing the Sun on every orbit, the most stringent stability requirements of the system are limited to short time periods. The invariant optical system for both radiance and irradiance measurements also gives excellent transfer-to-orbit SI traceability. Emerging instrumental requirements for remotely sensing tropospheric trace species have led some to rethink the paradigm for Système International d'Unités (SI) traceability of spectral irradiance and radiance radiometric calibrations to spectral albedo (sr⁻¹), which is not an SI unit.
In the solar reflective wavelength region, spectral albedo calibrations are often tied to either the spectral albedo of a solar diffuser or that of the Moon. This new type of Mie scattering diffuser (MSD) is capable of withstanding high temperatures and is more Lambertian than Spectralon™. It has the potential to cover the entire solar reflective wavelength region. Laboratory measurements have shown that the specular reflectance component is negligible and indicate that internal absorption by multiple scattering is small. This MSD, a true volume diffuser, exhibits a high degree of radiometric stability, which suggests that measurements at the National Institute of Standards and Technology (NIST) could provide a spectral albedo standard. Measurements have been made of its radiometric stability under a simulated space environment of high-energy gamma rays, high-energy protons, and UV radiation from ambient down to the vacuum-ultraviolet H Lyman alpha line at 121.6 nm, in view of its eventual use in space as a solar diffuser.
Using SysML for MBSE analysis of the LSST system
NASA Astrophysics Data System (ADS)
Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques
2010-07-01
The Large Synoptic Survey Telescope is a complex hardware-software system of systems, making up a highly automated observatory in the form of an 8.4 m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model-based systems engineering (MBSE) methodology to develop the overall system architecture, expressed in the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical and physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to the associated system structure and behavior. As the recursion proceeds to deeper levels we derive more detailed requirements and specifications and ensure their traceability. We also expose, define, and specify critical system interfaces and physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.
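The requirements-to-structure consistency checking that this recursive process enables can be sketched in a few lines. All requirement and component identifiers below are invented (loosely inspired by LSST) and this is not the project's actual tooling.

```python
# Toy traceability check: every requirement should be allocated to at
# least one structural component. Identifiers are invented.
requirements = {
    "REQ-001": "field of view >= 3.5 degrees",
    "REQ-002": "3.2 gigapixel focal plane",
    "REQ-003": "nightly transfer of data to the archive",
}
allocation = {                       # requirement -> structural components
    "REQ-001": ["Telescope optics", "Camera optics"],
    "REQ-002": ["Camera focal plane"],
}                                    # REQ-003 not yet allocated

# Requirements with no structural home break traceability.
unallocated = sorted(set(requirements) - set(allocation))
print("Unallocated:", unallocated)   # flags REQ-003

# Reverse map: which requirements does each component satisfy?
comp_to_reqs = {}
for req, comps in allocation.items():
    for comp in comps:
        comp_to_reqs.setdefault(comp, []).append(req)
print(comp_to_reqs["Camera optics"])
```

In practice a SysML tool performs this kind of check over «satisfy» and «allocate» relationships in the model database rather than over hand-built dictionaries.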
Resolving the problem of compliance with the ever increasing and changing regulations
NASA Astrophysics Data System (ADS)
Leigh, Harley
1992-01-01
The most common problem identified at several U.S. Department of Energy (DOE) sites is regulatory compliance. Simply put, project viability depends on identifying regulatory requirements at the beginning of a specific project to avoid possible delays and cost overruns. The Radioisotope Power Systems Facility (RPSF) is using the Regulatory Compliance System (RCS) to deal with the problem that well over 1000 regulatory documents had to be reviewed for possible compliance requirements applicable to the facility. This overwhelming number of candidate documents is typical of the DOE facilities reviewed so far using the RCS. The RCS was developed to provide control and tracking of all the regulatory and institutional requirements on a given project. WASTREN, Inc., developed the RCS through various DOE contracts and continues to enhance and update the system for existing and new contracts. The RCS provides the information to allow the technical expert to assimilate and manage accurate resource information, compile the necessary checklists, and document that the project or facility fulfills all of the appropriate regulatory requirements. The RCS provides on-line information, including status throughout the project life, thereby allowing more intelligent and proactive decision making. It also provides consistency and traceability for regulatory compliance documentation.
The European nanometrology landscape.
Leach, Richard K; Boyd, Robert; Burke, Theresa; Danzebrink, Hans-Ulrich; Dirscherl, Kai; Dziomba, Thorsten; Gee, Mark; Koenders, Ludger; Morazzani, Valérie; Pidduck, Allan; Roy, Debdulal; Unger, Wolfgang E S; Yacoot, Andrew
2011-02-11
This review paper summarizes the European nanometrology landscape from a technical perspective. Dimensional and chemical nanometrology are discussed first as they underpin many of the developments in other areas of nanometrology. Applications for the measurement of thin film parameters are followed by two of the most widely relevant families of functional properties: measurement of mechanical and electrical properties at the nanoscale. Nanostructured materials and surfaces, which are seen as key materials areas having specific metrology challenges, are covered next. The final section describes biological nanometrology, which is perhaps the most interdisciplinary applications area, and presents unique challenges. Within each area, a review is provided of current status, the capabilities and limitations of current techniques and instruments, and future directions being driven by emerging industrial measurement requirements. Issues of traceability, standardization, national and international programmes, regulation and skills development will be discussed in a future paper.
Carrera, Mónica; Gallardo, José M
2017-02-08
The determination of the geographical origin of food products is relevant to comply with the legal regulations on traceability, to avoid food fraud, and to guarantee food quality and safety to consumers. For these reasons, stable isotope ratio (SIR) analysis using an isotope ratio mass spectrometry (IRMS) instrument is one of the most useful techniques for evaluating food traceability and authenticity. The present study aimed to determine, for the first time, the geographical origin of all commercial fish species belonging to the Merlucciidae family using SIR analysis of carbon (δ13C) and nitrogen (δ15N). The results enabled their clear classification according to FAO (Food and Agriculture Organization of the United Nations) fishing areas, latitude, and geographical origin into the following six clusters: European, North African, South African, North American, South American, and Australian hake species.
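A simple nearest-centroid assignment illustrates how paired δ13C/δ15N measurements can separate such regional clusters. The centroid values below are invented placeholders, not the study's measured isotope ratios.

```python
import math

# Illustrative only: invented (d13C, d15N) cluster centroids per region;
# real values would come from the IRMS measurements.
centroids = {
    "European":      (-18.2, 11.5),
    "North African": (-17.5, 10.2),
    "South African": (-16.8, 12.9),
}

def classify(d13c: float, d15n: float) -> str:
    """Assign a sample to the nearest regional centroid (Euclidean)."""
    return min(centroids,
               key=lambda region: math.dist((d13c, d15n), centroids[region]))

print(classify(-18.0, 11.3))   # nearest to the 'European' centroid
```

Published studies typically use multivariate statistics (e.g. discriminant analysis) rather than raw centroid distance, but the geometric idea is the same.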
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... Individuals With Hearing and Speech Disabilities; E911 Requirements for IP-Enabled Service Providers AGENCY... no uniform numbering system for iTRS services; some iTRS users were reached via an IP address, while...-digit number and his IP address, making it relatively traceable (unlike conventional PSTN spoofing...
Assessment of documentation requirements under DOE 5481.1, Safety Analysis and Review System (SARS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Browne, E.T.
1981-03-01
This report assesses the requirements of DOE Order 5481.1, Safety Analysis and Review System for DOE Operations (SARS) in regard to maintaining SARS documentation. Under SARS, all pertinent details of the entire safety analysis and review process for each DOE operation are to be traceable from the initial identification of a hazard. This report is intended to provide assistance in identifying the points in the SARS cycle at which documentation is required, what type of documentation is most appropriate, and where it ultimately should be maintained.
NASA Technical Reports Server (NTRS)
Butler, James J.; Johnson, B. Carol; Rice, Joseph P.; Brown, Steven W.; Barnes, Robert A.
2007-01-01
Historically, the traceability of the laboratory calibration of Earth-observing satellite instruments to a primary radiometric reference scale (SI units) is the responsibility of each instrument builder. For the NASA Earth Observing System (EOS), a program has been developed using laboratory transfer radiometers, each with its own traceability to the primary radiance scale of a national metrology laboratory, to independently validate the radiances assigned to the laboratory sources of the instrument builders. The EOS Project Science Office also developed a validation program for the measurement of onboard diffuse reflecting plaques, which are also used as radiometric standards for Earth-observing satellite instruments. Summarized results of these validation campaigns, with an emphasis on the current state-of-the-art uncertainties in laboratory radiometric standards, will be presented. Future mission uncertainty requirements, and possible enhancements to the EOS validation program to ensure that those uncertainties can be met, will be presented.
Thermal Protection Test Bed Pathfinder Development Project
NASA Technical Reports Server (NTRS)
Snapp, Cooper
2015-01-01
In order to increase thermal protection capabilities for future reentry vehicles, a method to obtain relevant test data is required. Although arc jet testing can be used to obtain some data on materials, the best method to obtain these data is to actually expose them to an atmospheric reentry. The overprediction of the Orion EFT-1 flight data is an example of how the ground test to flight traceability is not fully understood. The RED-Data small reentry capsule developed by Terminal Velocity Aerospace is critical to understanding this traceability. In order to begin to utilize this technology, ES3 needs to be ready to build and integrate heat shields onto the RED-Data vehicle. Using a heritage Shuttle tile material for the heat shield will both allow valuable insight into the environment that the RED-Data vehicle can provide and give ES3 the knowledge and capability to build and integrate future heat shields for this vehicle.
Varietal Tracing of Virgin Olive Oils Based on Plastid DNA Variation Profiling
Pérez-Jiménez, Marga; Besnard, Guillaume; Dorado, Gabriel; Hernandez, Pilar
2013-01-01
Olive oil traceability remains a challenge nowadays. DNA analysis is the preferred approach to an effective varietal identification, without any environmental influence. Specifically, olive organelle genomics is the most promising approach for setting up a suitable set of markers as they would not interfere with the pollinator variety DNA traces. Unfortunately, plastid DNA (cpDNA) variation of the cultivated olive has been reported to be low. This feature could be a limitation for the use of cpDNA polymorphisms in forensic analyses or oil traceability, but rare cpDNA haplotypes may be useful as they can help to efficiently discriminate some varieties. Recently, the sequencing of olive plastid genomes has allowed the generation of novel markers. In this study, the performance of cpDNA markers on olive oil matrices, and their applicability on commercial Protected Designation of Origin (PDO) oils were assessed. By using a combination of nine plastid loci (including multi-state microsatellites and short indels), it is possible to fingerprint six haplotypes (in 17 Spanish olive varieties), which can discriminate high-value commercialized cultivars with PDO. In particular, a rare haplotype was detected in genotypes used to produce a regional high-value commercial oil. We conclude that plastid haplotypes can help oil traceability in commercial PDO oils and set up an experimental methodology suitable for organelle polymorphism detection in the complex olive oil matrices. PMID:23950947
NIST Stars: Absolute Spectrophotometric Calibration of Vega and Sirius
NASA Astrophysics Data System (ADS)
Deustua, Susana; Woodward, John T.; Rice, Joseph P.; Brown, Steven W.; Maxwell, Stephen E.; Alberding, Brian G.; Lykke, Keith R.
2018-01-01
Absolute flux calibration of standard stars, traceable to SI (International System of Units) standards, is essential for 21st century astrophysics. Dark energy investigations that rely on observations of Type Ia supernovae and precise photometric redshifts of weakly lensed galaxies require a minimum accuracy of 0.5% in the absolute color calibration. Studies that aim to address fundamental stellar astrophysics also benefit. In the era of large telescopes and all-sky surveys, well-calibrated standard stars that do not saturate and that are available over the whole sky are needed. Significant effort has been expended to obtain absolute measurements of the fundamental standards Vega and Sirius (and other stars) in the visible and near infrared, achieving total uncertainties between 1% and 3%, depending on wavelength, that do not meet the needed accuracy. The NIST Stars program aims to determine the top-of-the-atmosphere absolute spectral irradiance of bright stars to an uncertainty less than 1% from a ground-based observatory. NIST Stars has developed a novel, fully SI-traceable laboratory calibration strategy that will enable achieving the desired accuracy. This strategy has two key components. The first is the SI-traceable calibration of the entire instrument system, and the second is the repeated spectroscopic measurement of the target star throughout the night. We will describe our experimental strategy, present preliminary results for Vega and Sirius, and give an end-to-end uncertainty budget.
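An end-to-end uncertainty budget typically combines independent component uncertainties in quadrature (root-sum-square). A minimal sketch with invented placeholder components, not NIST's actual budget:

```python
import math

# Root-sum-square combination of independent uncertainty components.
# The component names and values below are invented placeholders.
components = {
    "lamp irradiance scale":  0.45,   # all in percent, k=1
    "telescope throughput":   0.30,
    "detector linearity":     0.10,
    "atmospheric correction": 0.60,
}

combined = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined standard uncertainty: {combined:.2f} %")
```

The quadrature rule assumes the components are uncorrelated; correlated terms must instead be summed with their covariances, which is why a careful budget documents the independence of each entry.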
Are the expected benefits of requirements reuse hampered by distance? An experiment.
Carrillo de Gea, Juan M; Nicolás, Joaquín; Fernández-Alemán, José L; Toval, Ambrosio; Idri, Ali
2016-01-01
Software development processes are often performed by distributed teams which may be separated by great distances. Global software development (GSD) has undergone a significant growth in recent years. The challenges concerning GSD are especially relevant to requirements engineering (RE). Stakeholders need to share a common ground, but there are many difficulties as regards the potentially variable interpretation of the requirements in different contexts. We posit that the application of requirements reuse techniques could alleviate this problem through the diminution of the number of requirements open to misinterpretation. This paper presents a reuse-based approach with which to address RE in GSD, with special emphasis on specification techniques, namely parameterised requirements and traceability relationships. An experiment was carried out with the participation of 29 university students enrolled on a Computer Science and Engineering course. Two main scenarios that represented co-localisation and distribution in software development were portrayed by participants from Spain and Morocco. The global teams achieved a slightly better performance than the co-located teams as regards effectiveness, which could be a result of the worse productivity of the global teams in comparison to the co-located teams. Subjective perceptions were generally more positive in the case of the distributed teams (difficulty, speed and understanding), with the exception of quality. A theoretical model has been proposed as an evaluation framework with which to analyse, from the point of view of the factor of distance, the effect of requirements specification techniques on a set of performance and perception-based variables. The experiment utilised a new internationalisation requirements catalogue. None of the differences found between co-located and distributed teams were significant according to the outcome of our statistical tests.
The well-known benefits of requirements reuse in traditional co-located projects could, therefore, also be expected in GSD projects.
Molecular traceability of beef from synthetic Mexican bovine breeds.
Rodríguez-Ramírez, R; Arana, A; Alfonso, L; González-Córdova, A F; Torrescano, G; Guerrero Legarreta, I; Vallejo-Cordoba, B
2011-10-06
Traceability ensures a link between carcass, quarters or cuts of beef and the individual animal or the group of animals from which they are derived. Meat traceability is an essential tool for successful identification and recall of contaminated products from the market during a food crisis. Meat traceability is also extremely important for protection and value enhancement of good-quality brands. Molecular meat traceability would allow verification of conventional methods used for beef tracing in synthetic Mexican bovine breeds. We evaluated a set of 11 microsatellites for their ability to identify animals belonging to these synthetic breeds, Brangus and Charolais/Brahman (78 animals). Seven microsatellite markers allowed sample discrimination with a match probability, defined as the probability of finding two individuals sharing by chance the same genotypic profile, of 10⁻⁸. The practical application of the marker set was evaluated by testing eight samples from carcasses and pieces of meat at the slaughterhouse and at the point of sale. The DNA profiles of the two samples obtained at these two different points in the production-commercialization chain always proved that they came from the same animal.
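The match probability can be computed from per-locus allele frequencies under Hardy-Weinberg assumptions: at each locus it is the probability that two unrelated individuals share a genotype by chance, and the per-locus values multiply across independent markers. The allele frequencies below are invented; the study's reported value came from its own seven-marker data.

```python
from itertools import combinations

def locus_match_probability(freqs):
    """Probability that two unrelated individuals share a genotype at one
    locus by chance, assuming Hardy-Weinberg proportions."""
    homo = sum(p**4 for p in freqs)                               # both homozygous p_i/p_i
    het = sum((2 * p * q) ** 2 for p, q in combinations(freqs, 2))  # both heterozygous p_i/p_j
    return homo + het

# Invented allele-frequency profiles for three microsatellite loci.
loci = [
    [0.4, 0.3, 0.2, 0.1],
    [0.5, 0.25, 0.25],
    [0.6, 0.2, 0.2],
]

overall = 1.0
for freqs in loci:
    overall *= locus_match_probability(freqs)
print(f"overall match probability: {overall:.2e}")
```

Adding markers multiplies in further factors below one, which is how seven markers can drive the combined probability down to the 10⁻⁸ range reported above.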
NASA Technical Reports Server (NTRS)
Corban, Robert
1993-01-01
The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
Pre-Retirement Rehearsal Project: A Healthy Retirement.
ERIC Educational Resources Information Center
Ellenberg, Donna
This fourth in a series of six packages of instructional materials developed by the Pre-Retirement Rehearsal Project contains a student's pre-retirement booklet specifically intended for adults with limited reading ability and teacher's guide, which consider these topics: dietary requirements, nutrition, facts and fallacies about health, foods and…
NASA Astrophysics Data System (ADS)
Good, Peter; Andrews, Timothy; Chadwick, Robin; Dufresne, Jean-Louis; Gregory, Jonathan M.; Lowe, Jason A.; Schaller, Nathalie; Shiogama, Hideo
2016-11-01
nonlinMIP provides experiments that account for state-dependent regional and global climate responses. The experiments have two main applications: (1) to focus understanding of responses to CO2 forcing on states relevant to specific policy or scientific questions (e.g. change under low-forcing scenarios, the benefits of mitigation, or from past cold climates to the present day), or (2) to understand the state dependence (non-linearity) of climate change - i.e. why doubling the forcing may not double the response. State dependence (non-linearity) of responses can be large at regional scales, with important implications for understanding mechanisms and for general circulation model (GCM) emulation techniques (e.g. energy balance models and pattern-scaling methods). However, these processes are hard to explore using traditional experiments, which explains why they have had so little attention in previous studies. Some single model studies have established novel analysis principles and some physical mechanisms. There is now a need to explore robustness and uncertainty in such mechanisms across a range of models (point 2 above), and, more broadly, to focus work on understanding the response to CO2 on climate states relevant to specific policy/science questions (point 1). nonlinMIP addresses this using a simple, small set of CO2-forced experiments that are able to separate linear and non-linear mechanisms cleanly, with a good signal-to-noise ratio - while being demonstrably traceable to realistic transient scenarios. The design builds on the CMIP5 (Coupled Model Intercomparison Project Phase 5) and CMIP6 DECK (Diagnostic, Evaluation and Characterization of Klima) protocols, and is centred around a suite of instantaneous atmospheric CO2 change experiments, with a ramp-up-ramp-down experiment to test traceability to gradual forcing scenarios. In all cases the models are intended to be used with CO2 concentrations rather than CO2 emissions as the input. 
The understanding gained will help interpret the spread in policy-relevant scenario projections. Here we outline the basic physical principles behind nonlinMIP, and the method of establishing traceability from abruptCO2 to gradual forcing experiments, before detailing the experimental design, and finally some analysis principles. The test of traceability from abruptCO2 to transient experiments is recommended as a standard analysis within the CMIP5 and CMIP6 DECK protocols.
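The linear/non-linear separation at the heart of the design can be sketched with the standard abrupt-CO2 arithmetic: if the response were linear in forcing, the abrupt4xCO2 response (relative to piControl) would be twice the abrupt2xCO2 response, so the residual measures the state-dependent part. The regional numbers below are invented for illustration.

```python
import numpy as np

# Toy regional warming responses relative to piControl (K); invented values.
t_2x = np.array([1.1, 1.5, 1.8])   # abrupt2xCO2 response
t_4x = np.array([2.0, 3.4, 4.1])   # abrupt4xCO2 response

# Linear prediction for 4xCO2 is double the 2xCO2 response; the residual
# is the non-linear (state-dependent) component of the regional response.
nonlinear = t_4x - 2.0 * t_2x
print(nonlinear)   # negative where the response saturates, positive where it amplifies
```

This two-experiment difference is the cleanest form of the separation; the ramp-up-ramp-down experiment then tests whether the same decomposition is traceable to gradual forcing scenarios.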
Leal, Miguel Costa; Pimentel, Tânia; Ricardo, Fernando; Rosa, Rui; Calado, Ricardo
2015-06-01
Market globalization and recurring food safety alerts have resulted in a growing consumer awareness of the need for food traceability. This is particularly relevant for seafood due to its perishable nature and importance as a key protein source for the population of the world. Here, we provide an overview of the current needs for seafood origin traceability, along with the limitations and challenges for its implementation. We focus on geochemical, biochemical, and molecular tools and how they should be optimized to be implemented globally and to address our societal needs. We suggest that seafood traceability is key to enforcing food safety regulations and fisheries control, combat fraud, and fulfill present and future expectations of conscientious producers, consumers, and authorities. Copyright © 2015 Elsevier Ltd. All rights reserved.
General Framework for Animal Food Safety Traceability Using GS1 and RFID
NASA Astrophysics Data System (ADS)
Cao, Weizhu; Zheng, Limin; Zhu, Hong; Wu, Ping
GS1 is a global traceability standard composed of an encoding system (EAN/UCC, EPC), automatic-identification data carriers (bar codes, RFID), and electronic data interchange standards (EDI, XML). RFID is a non-contact, multi-object automatic identification technique. Tracing food back to its source, standardizing RFID tags, and sharing dynamic data are urgent problems for current traceability systems. This paper designs a general framework for animal food safety traceability using GS1 and RFID. The framework uses RFID tags encoded according to EPCglobal tag data standards. Each information server has an access tier, a business tier, and a resource tier. These servers are heterogeneous and distributed, providing user access interfaces via SOAP or HTTP. To share dynamic data, a discovery service and an object name service are used to locate the dynamic, distributed information servers.
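The EPC-to-server resolution step described above can be pictured in a few lines. The following Python snippet is purely illustrative (not code from the paper): the registry contents, URLs, and function names are hypothetical, and a real deployment would query the EPCglobal Object Name Service and Discovery Service infrastructure rather than an in-memory dictionary.

```python
# Illustrative sketch: resolving an EPC identifier to the distributed
# information server responsible for it, in the spirit of an Object Name
# Service (ONS) lookup. Registry contents and URLs are hypothetical.

# Hypothetical ONS-style registry: EPC company prefix -> information server
ONS_REGISTRY = {
    "0614141": "https://farm-server.example/epcis",      # producer tier
    "0952696": "https://abattoir-server.example/epcis",  # processor tier
}

def parse_sgtin(epc_urn):
    """Split an SGTIN pure-identity URN into its components.

    Expected shape: urn:epc:id:sgtin:<companyPrefix>.<itemRef>.<serial>
    """
    prefix = "urn:epc:id:sgtin:"
    if not epc_urn.startswith(prefix):
        raise ValueError("not an SGTIN URN: %s" % epc_urn)
    company, item_ref, serial = epc_urn[len(prefix):].split(".")
    return {"company": company, "item_ref": item_ref, "serial": serial}

def locate_server(epc_urn):
    """Return the information-server URL responsible for the animal or
    product carrying this EPC tag, via the (hypothetical) ONS registry."""
    fields = parse_sgtin(epc_urn)
    return ONS_REGISTRY[fields["company"]]

tag = "urn:epc:id:sgtin:0614141.107346.2017"
print(locate_server(tag))  # -> https://farm-server.example/epcis
```

A client would then query the returned server's access tier (over SOAP or HTTP, as the framework describes) for that animal's traceability records.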
NASA Technical Reports Server (NTRS)
Thome, Kurtis; McCorkel, Joel; Hair, Jason; McAndrew, Brendan; Daw, Adrian; Jennings, Donald; Rabin, Douglas
2012-01-01
The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission addresses the need to observe high-accuracy, long-term climate change trends and to use decadal change observations as the most critical method to determine the accuracy of climate change projections. One of the major objectives of CLARREO is to advance the accuracy of SI-traceable absolute calibration at infrared and reflected solar wavelengths. This advance is required to reach the on-orbit absolute accuracy needed for climate change observations to survive data gaps while remaining sufficiently accurate to observe climate change to within the uncertainty of the limit of natural variability. While these capabilities exist at NIST in the laboratory, there is a need to demonstrate that they can move successfully from NIST to NASA and/or instrument-vendor capabilities for future spaceborne instruments. The current work describes the test plan for the Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer (SOLARIS), the calibration demonstration system (CDS) for the reflected solar portion of CLARREO. The goal of the CDS is to allow the testing and evaluation of calibration approaches, alternate design and/or implementation approaches, and components for the CLARREO mission. SOLARIS also provides a test-bed for detector technologies, non-linearity determination and uncertainties, and application of future technology developments and suggested spacecraft instrument design modifications. The end result of efforts with the SOLARIS CDS will be an SI-traceable error budget for reflectance retrieval using solar irradiance as a reference, and methods for laboratory-based absolute calibration suitable for climate-quality data collections.
Design data needs: Modular high-temperature gas-cooled reactor. Revision 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1987-03-01
The Design Data Needs (DDNs) provide, for program management, summary statements of the designer's need for experimental data to confirm or validate assumptions made in the design. These assumptions were developed using the Integrated Approach and are tabulated in the Functional Analysis Report. The assumptions were also necessary in the analyses or trade studies (A/TS) used to develop selections of hardware design or design requirements. Each DDN includes statements providing traceability to the function and to the associated assumption that gives rise to the need.
40 CFR 1065.790 - Mass standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... are certified as NIST-traceable within 0.1% uncertainty. Calibration weights may be certified by any calibration lab that maintains NIST-traceability. Make sure your highest calibration weight has no greater...
40 CFR 1065.790 - Mass standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... are certified as NIST-traceable within 0.1% uncertainty. Calibration weights may be certified by any calibration lab that maintains NIST-traceability. Make sure your highest calibration weight has no greater...
40 CFR 1065.790 - Mass standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... are certified as NIST-traceable within 0.1% uncertainty. Calibration weights may be certified by any calibration lab that maintains NIST-traceability. Make sure your highest calibration weight has no greater...
Best practice guidelines for the operation of a donor human milk bank in an Australian NICU.
Hartmann, B T; Pang, W W; Keil, A D; Hartmann, P E; Simmer, K
2007-10-01
Until the establishment of the PREM Bank (Perron Rotary Express Milk Bank), donor human milk banking had not occurred in Australia for 20 years. In re-establishing donor human milk banking in Australia, the focus of the PREM Bank has been to develop a formal and consistent approach to safety and quality in processing during the operation of the human milk bank. There is currently no legislation in Australia that specifically regulates the operation of donor human milk banks. For this reason the PREM Bank has utilised existing and internationally recognised management practices for managing hazards during food production. These tools (specifically HACCP) have been used to guide the development of Standard Operating Procedures and Good Manufacturing Practice for the screening of donors and the processing of donor human milk. Donor screening procedures are consistent with those recommended by other human milk banks operating internationally, and also with the requirements for blood and tissue donation in Australia. Controlled documentation and record-keeping requirements have also been developed that allow complete traceability from each individual donation to each individual feed dispensed to a recipient, and that maintain a record of all processing and storage conditions. These operational requirements have been developed to reduce to acceptable levels any risk associated with feeding pasteurised donor human milk to hospitalised preterm or ill infants.
NASA Astrophysics Data System (ADS)
Kumar, Anil; Kumar, Harish; Mandal, Goutam; Das, M. B.; Sharma, D. C.
The present paper discusses the establishment of traceability of reference-grade hydrometers at the National Physical Laboratory, India (NPLI). The reference-grade hydrometers are calibrated and traceable to the primary solid density standard. Calibration is performed according to a standard procedure based on Cuckow's method, and the calibrated hydrometers cover a wide range. The uncertainty of the reference-grade hydrometers has been computed, and corrections are calculated for the scale readings at which observations are taken.
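As a rough illustration of Cuckow's method: the hydrometer is weighed in air, then weighed while suspended in a reference liquid of known density, immersed exactly to the scale mark under test. The buoyancy loss gives the hydrometer's volume up to that mark, and hence the density the instrument truly indicates when floating at that mark. The sketch below uses hypothetical numbers and deliberately neglects the surface-tension and air-buoyancy corrections that a real calibration such as NPLI's would include.

```python
# Simplified sketch of Cuckow's method for hydrometer calibration.
# Hypothetical numbers; surface-tension and air-buoyancy corrections omitted.

def cuckow_density(m_air_g, m_apparent_g, rho_ref_g_per_cm3):
    """Density that the hydrometer actually indicates at the tested mark.

    m_air_g:            mass of the hydrometer weighed in air
    m_apparent_g:       apparent mass when suspended in the reference liquid,
                        immersed exactly to the scale mark under test
    rho_ref_g_per_cm3:  density of the reference liquid (itself traceable
                        to the solid density standard)
    """
    # Volume of the hydrometer up to the mark, from the buoyancy loss:
    volume_cm3 = (m_air_g - m_apparent_g) / rho_ref_g_per_cm3
    # When freely floating at that mark, the displaced liquid mass equals
    # the hydrometer mass, so the truly indicated density is:
    return m_air_g / volume_cm3

rho_true = cuckow_density(m_air_g=85.0, m_apparent_g=20.3,
                          rho_ref_g_per_cm3=0.99820)
scale_reading = 1.3090  # hypothetical nominal scale value at the mark
correction = rho_true - scale_reading
print(round(rho_true, 4), round(correction, 4))
```

The scale correction reported for each mark is simply the difference between this true indicated density and the nominal scale reading.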
40 CFR 1065.790 - Mass standards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... are certified as NIST-traceable within 0.1 % uncertainty. Calibration weights may be certified by any calibration lab that maintains NIST-traceability. Make sure your lowest calibration weight has no greater than...
40 CFR 1065.790 - Mass standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... are certified as NIST-traceable within 0.1 % uncertainty. Calibration weights may be certified by any calibration lab that maintains NIST-traceability. Make sure your lowest calibration weight has no greater than...
Science-based requirements and operations development for the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
McConnachie, Alan W.; Flagey, Nicolas; Murowinski, Rick; Szeto, Kei; Salmon, Derrick; Withington, Kanoa; Mignot, Shan
2016-07-01
MSE is a wide-field telescope (1.5 square degree field of view) with an aperture of 11.25 m. It is dedicated to multi-object spectroscopy at several different spectral resolutions in the range R = 2500-40000, over a broad wavelength range (0.36-1.8 μm). MSE enables transformational science in areas as diverse as exoplanetary host characterization; stellar monitoring campaigns; tomographic mapping of the interstellar and intergalactic media; the in-situ chemical tagging of the distant Galaxy; connecting galaxies to the large scale structure of the Universe; measuring the mass functions of cold dark matter sub-halos in galaxy and cluster-scale hosts; and reverberation mapping of supermassive black holes in quasars. Here, we summarize the Observatory and describe the development of the top level science requirements and operational concepts. Specifically, we describe the definition of the Science Requirements as the set of capabilities that allow certain high impact science programs to be conducted. We cross-reference these science cases to the science requirements to illustrate the traceability of this approach. We further discuss the operations model for MSE and describe the development of the Operations Concept Document, one of the foundational documents for the project. We also discuss the next stage in the science-based development of MSE, specifically the development of the initial Legacy Survey that will occupy a majority of time on the telescope over the first few years of operation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... until the leak check is passed. Post-test leak check ≤4% of average sampling rate After sampling ** See... the test site. The sorbent media must be obtained from a source that can demonstrate the quality...-traceable calibration gas standards and reagents shall be used for the tests and procedures required under...
Software Process Automation: Experiences from the Trenches.
1996-07-01
The report surveys tool and process integrations at several organizations, including integration of problem databases and process tools (WordPerfect, All-in-One, Oracle, FrameMaker, and configuration management systems) via process "weaver" systems used to handle change requests and problem reports. Tools mentioned include Autoplan (a project management tool), FrameMaker (a document processing system), Worldview (a document viewer), Cadre Teamwork, a requirements traceability tool, and homegrown scheduling and tool-integration utilities.
Campos, Maria Doroteia; Valadas, Vera; Campos, Catarina; Morello, Laura; Braglia, Luca; Breviario, Diego; Cardoso, Hélia G
2018-01-01
Traceability of processed food and feed products has been gaining importance due to the impact that those products can have on human and animal health, and due to the associated economic and legal concerns, often related to adulteration and fraud, as can be the case for meat and milk. Despite mandatory traceability requirements for the analysis of feed composition, few reliable and accurate methods are presently available to enforce the legislative frame and allow the authentication of animal feeds. In this study, nine sensitive and species-specific real-time PCR TaqMan MGB assays are described for plant species detection in animal feed samples. The method is based on selective real-time qPCR (RT-qPCR) amplification of target genes belonging to the alternative oxidase (AOX) gene family. The plant species selected for detection in feed samples were wheat, maize, barley, soybean, rice and sunflower as common components of feeds, and cotton, flax and peanut as possible undesirable contaminants. The obtained results were compared with end-point PCR methodology. The applicability of the AOX TaqMan assays was evaluated through the screening of commercial feed samples, and by the analysis of plant mixtures with known composition. The RT-qPCR methodology allowed the detection of the most abundant species in feeds as well as the identification of contaminant species present in lower amounts, down to 1% w/w. The AOX-based methodology provides a suitable molecular-marker approach to ascertain the plant species composition of animal feed samples, thus supporting control and enforcement in the feed and animal-production sectors.
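For readers unfamiliar with real-time PCR readouts, the toy snippet below (not the authors' assay or analysis pipeline) illustrates how a quantification cycle (Ct/Cq) is read off an amplification curve: it is the interpolated cycle at which background-corrected fluorescence first crosses a fixed threshold. The curve data and threshold are invented for illustration.

```python
# Toy illustration of reading a quantification cycle (Ct) from a
# real-time PCR amplification curve. Data and threshold are invented.

def ct_value(fluorescence, threshold):
    """Linearly interpolated cycle at which the curve crosses `threshold`.

    fluorescence[i] is the background-corrected reading at cycle i + 1.
    """
    for i in range(1, len(fluorescence)):
        lo, hi = fluorescence[i - 1], fluorescence[i]
        if lo < threshold <= hi:
            # the cycle number of `lo` is i; add the fractional crossing
            return i + (threshold - lo) / (hi - lo)
    return None  # threshold never crossed: no amplification detected

# Hypothetical sigmoid-shaped curve sampled once per cycle
curve = [0.00, 0.00, 0.01, 0.02, 0.05, 0.12, 0.30, 0.65, 0.90, 1.00]
print(round(ct_value(curve, threshold=0.2), 2))
```

In species-detection assays of this kind, a more abundant target crosses the threshold at an earlier cycle, which is what makes low-level contaminants (later, fainter crossings) distinguishable from major feed components.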
Weber, Michael; Hellriegel, Christine; Rueck, Alexander; Wuethrich, Juerg; Jenks, Peter
2014-05-01
Quantitative NMR spectroscopy (qNMR) is gaining interest across both analytical and industrial research applications and has become an essential tool for content assignment and the quantitative determination of impurities. The key benefits of using qNMR as a measurement method for the purity determination of organic molecules are discussed, with emphasis on the ability to establish traceability to the International System of Units (SI). The work describes a routine certification procedure from the point of view of a commercial producer of certified reference materials (CRMs) under ISO/IEC 17025 and ISO Guide 34 accreditation, which resulted in a set of essential references for ¹H qNMR measurements; the relevant application data for these substances are given. The overall process includes specific selection criteria, pre-tests, experimental conditions, and homogeneity and stability studies. The advantages of an accelerated stability study over the classical stability-test design are shown with respect to shelf-life determination and shipping conditions. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
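The content assignment rests on the standard qNMR purity relation, well established in the qNMR literature: the analyte's mass-fraction purity follows from the ratio of integrated signal areas against an internal calibrant of known purity, scaled by proton counts, molar masses, and weighed masses. The sketch below states that relation with hypothetical example values (the substances and all figures are illustrative, not from the paper).

```python
# Standard qNMR internal-standard purity relation, with hypothetical numbers.

def qnmr_purity(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_a, m_cal, P_cal):
    """Mass-fraction purity of the analyte.

    I: integrated signal areas        N: protons contributing to each signal
    M: molar masses (g/mol)           m: weighed masses (same unit for both)
    P_cal: known purity of the internal calibrant
    """
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_a) * P_cal

# Hypothetical example: unknown analyte against a maleic-acid-like calibrant
purity = qnmr_purity(I_a=100.0, I_cal=98.5, N_a=2, N_cal=2,
                     M_a=194.19, M_cal=116.07, m_a=10.60, m_cal=6.08,
                     P_cal=0.9997)
print(round(purity, 4))
```

Because every factor in the relation (balance readings, calibrant purity) can itself be made traceable, the resulting purity value inherits traceability to the SI, which is the point the abstract emphasizes.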
Okuno, Yukiko; McNairn, Adrian J.; den Elzen, Nicole; Pines, Jonathon; Gilbert, David M.
2001-01-01
We have examined the behavior of pre-replication complex (pre-RC) proteins in relation to key cell cycle transitions in Chinese Hamster Ovary (CHO) cells. ORC1, ORC4 and Cdc6 were stable (T1/2 >2 h) and associated with a chromatin-containing fraction throughout the cell cycle. Green fluorescent protein-tagged ORC1 associated with chromatin throughout mitosis in living cells and co-localized with ORC4 in metaphase spreads. Association of Mcm proteins with chromatin took place during telophase, ∼30 min after the destruction of geminin and cyclins A and B, and was coincident with the licensing of chromatin to replicate in geminin-supplemented Xenopus egg extracts. Neither Mcm recruitment nor licensing required protein synthesis throughout mitosis. Moreover, licensing could be uncoupled from origin specification in geminin-supplemented extracts; site-specific initiation within the dihydrofolate reductase locus required nuclei from cells that had passed through the origin decision point (ODP). These results demonstrate that mammalian pre-RC assembly takes place during telophase, mediated by post-translational modifications of pre-existing proteins, and is not sufficient to select specific origin sites. A subsequent, as yet undefined, step selects which pre-RCs will function as replication origins. PMID:11483529
Myae, Aye Chan; Goddard, Ellen; Aubeeluck, Ashwina
2011-01-01
Traceability systems are an important tool (1) for tracking, monitoring, and managing product flows through the supply chain for better efficiency and profitability of suppliers, and (2) to improve consumer confidence in the face of serious food safety incidents. After the global bovine spongiform encephalopathy (BSE) crisis affected producers, consumers, trade, and the health status of animals and humans, new systems to help confirm the status of cattle products along the supply chain from farm to fork were implemented in many countries (Trautman et al. 2008). In this study, people's overall food safety beliefs are explored with the main objective of measuring the link between those beliefs and their attitudes toward traceability. A comparison is made among English-speaking Canadians, French-speaking Canadians, and Japanese consumers. An Internet-based survey was used to collect data from nationally representative samples of the population in Canada-English (1275), Canada-French (343), and Japan (1940) in the summer of 2009. Respondents' interest in traceability systems is clearly linked to their sense that the industry is primarily responsible for any food safety outbreaks. Moreover, it is clear that certain segments of the population in all samples feel strongly about the importance of farm-to-fork traceability in beef; thus, policymakers may wish to consider extending traceability beyond the point of slaughter as a way of encouraging beef sales in Canada.
50 CFR 622.41 - Species specific limitations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... test and evaluate a new BRD design for up to 60 days without being subject to the observer requirements...) [Reserved] (i) Pre-certification. The pre-certification phase allows a person to test and evaluate a new BRD... vessel specified in the application. The RA will issue a pre-certification phase LOA if the BRD design is...
The Use of UML for Software Requirements Expression and Management
NASA Technical Reports Server (NTRS)
Murray, Alex; Clark, Ken
2015-01-01
It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straightforward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities.
We describe this approach in the context of a space-based software application currently under development at the Jet Propulsion Laboratory.
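The trace links discussed above (requirement to parent requirement, to verification activities, and to implementing design elements) can be pictured as a simple data structure. The Python sketch below is only an illustration of those link types, not JPL's actual tooling; all element names are hypothetical.

```python
# Illustrative data-structure sketch of the trace links expected of
# stereotyped UML requirement elements. All element names are hypothetical.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Requirement:
    name: str                       # e.g. a <<requirement>>-stereotyped State Machine
    parent: Optional[str] = None    # parent requirement (may be an English "shall")
    verified_by: List[str] = field(default_factory=list)    # verification scenarios
    implemented_by: List[str] = field(default_factory=list)  # design elements

model = [
    Requirement("REQ-THERMAL-01", parent="SHALL-SYS-7",
                verified_by=["TestScenario_HeaterCycling"],
                implemented_by=["HeaterControl.stateMachine"]),
    Requirement("REQ-THERMAL-02", parent="SHALL-SYS-7"),
]

def unverified(reqs):
    """Audit query: requirements with no verification trace."""
    return [r.name for r in reqs if not r.verified_by]

print(unverified(model))  # -> ['REQ-THERMAL-02']
```

The auditing and quality-control characteristics the paper lists reduce to queries of exactly this shape over the model's trace links.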
Traceable nanoscale measurement at NML-SIRIM
NASA Astrophysics Data System (ADS)
Dahlan, Ahmad M.; Abdul Hapip, A. I.
2012-06-01
The role of a national metrology institute (NMI) has always been crucial in national technology development. One of the key activities of an NMI is to provide traceable measurement in all parameters under the International System of Units (SI). Dimensional measurement, in which size and shape are two important features investigated, is one of the important areas covered by NMIs. To support national technology development, particularly in manufacturing sectors and emerging technologies such as nanotechnology, the National Metrology Laboratory, SIRIM Berhad (NML-SIRIM) has embarked on a project to equip Malaysia with a state-of-the-art nanoscale measurement facility, with the aim of providing traceability of measurement at the nanoscale. This paper looks into some results from current activities at NML-SIRIM related to measurement at the nanoscale, particularly the application of the atomic force microscope (AFM) and laser-based sensors in dimensional measurement. Step-height standards of different sizes were measured using the AFM and laser-based sensors. These probes are integrated into a long-range nanoscale measuring machine traceable to the international definition of the meter, thus ensuring their traceability. Consistency of the results obtained by the two methods will be discussed and presented. Factors affecting the measurements, as well as the related uncertainties of measurement, will also be presented.
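As a minimal illustration of how a step height is evaluated from a measured line profile (in the spirit of ISO 5436-1, although real NMI evaluations add tilt correction and a full uncertainty budget), the sketch below averages the plateaus on either side of the step edge and takes their difference. The profile data are invented.

```python
# Minimal step-height evaluation from an AFM line profile (hypothetical data).
# A real evaluation adds plateau line fits, tilt correction, and uncertainties.

from statistics import mean

def step_height(profile_nm, step_index, margin=2):
    """Mean level of the upper plateau minus the lower plateau.

    profile_nm: heights along the scan line (nm)
    step_index: index of the step edge; points within `margin` samples of
                the edge are excluded from both plateaus.
    """
    lower = profile_nm[:step_index - margin]
    upper = profile_nm[step_index + margin:]
    return mean(upper) - mean(lower)

# Hypothetical 100 nm nominal step with a little noise; index 5 is the edge
profile = [0.1, -0.2, 0.0, 0.1, -0.1, 50.0, 99.9, 100.2, 100.0, 99.8, 100.1]
print(round(step_height(profile, step_index=5), 2))
```

Comparing such evaluations between the AFM and the laser-based sensor, both mounted on the traceable long-range machine, is what allows the consistency check the abstract describes.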
Inorganic phosphate blocks binding of pre-miRNA to Dicer-2 via its PAZ domain
Fukunaga, Ryuya; Colpan, Cansu; Han, Bo W; Zamore, Phillip D
2014-01-01
In Drosophila, Dicer-1 produces microRNAs (miRNAs) from pre-miRNAs, whereas Dicer-2 generates small interfering RNAs from long double-stranded RNA (dsRNA), a process that requires ATP hydrolysis. We previously showed that inorganic phosphate inhibits Dicer-2 cleavage of pre-miRNAs, but not long dsRNAs. Here, we report that phosphate-dependent substrate discrimination by Dicer-2 reflects dsRNA substrate length. Efficient processing by Dicer-2 of short dsRNA requires a 5′ terminal phosphate and a two-nucleotide, 3′ overhang, but does not require ATP. Phosphate inhibits cleavage of such short substrates. In contrast, cleavage of longer dsRNA requires ATP but no specific end structure: phosphate does not inhibit cleavage of these substrates. Mutation of a pair of conserved arginine residues in the Dicer-2 PAZ domain blocked cleavage of short, but not long, dsRNA. We propose that inorganic phosphate occupies a PAZ domain pocket required to bind the 5′ terminal phosphate of short substrates, blocking their use and restricting pre-miRNA processing in flies to Dicer-1. Our study helps explain how a small molecule can alter the substrate specificity of a nucleic acid processing enzyme. PMID:24488111
Space Tug avionics definition study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1975-01-01
A top-down approach was used to identify, compile, and develop avionics functional requirements for all flight and ground operational phases. Requirements such as safety- and mission-critical functions and criteria, minimum redundancy levels, software memory sizing, power for tug and payload, and data transfer among payload, tug, shuttle, and ground were established. Functional requirements related to avionics support of a particular function were compiled together under that support-function heading. This approach provided both organizational efficiency and traceability back to the applicable operational phase and event. Each functional requirement was then allocated to the appropriate subsystems and its particular characteristics were quantified.
EPA’s Hg Gas Traceability Approach for Source Emissions Measurement and Monitoring
Solicited presentation (special topic) at the International Conference on Mercury as a Global Pollutant on how EPA establishes the NIST traceability of reference materials used to support regulatory mercury emissions measurements.
Recent advance in DNA-based traceability and authentication of livestock meat PDO and PGI products.
Nicoloso, Letizia; Crepaldi, Paola; Mazza, Raffaele; Ajmone-Marsan, Paolo; Negrini, Riccardo
2013-04-01
This review updates the available molecular techniques and technologies and discusses how they can be used for traceability, food control and enforcement activities. The review also provides examples on how molecular techniques succeeded to trace back unknowns to their breeds of origin, to fingerprint single individuals and to generate evidence in court cases. The examples demonstrate the potential of the DNA based traceability techniques and explore possibilities for translating the next generation genomics tools into a food and feed control and enforcement framework.
Serum albumin: accuracy and clinical use.
Infusino, Ilenia; Panteghini, Mauro
2013-04-18
Albumin is the major plasma protein and its determination is used for the prognostic assessment of several diseases. Clinical guidelines call for monitoring of serum albumin with specific target cut-offs that are independent of the assay used. This requires accurate and equivalent results among different commercially available methods (i.e., result standardization) through a consistent definition and application of a reference measurement system. This should be associated with the definition of measurement uncertainty goals based on medical relevance of serum albumin to make results reliable for patient management. In this paper, we show that, in the current situation, if one applies analytical goals for serum albumin measurement derived from its biologic variation, the uncertainty budget derived from each step of the albumin traceability chain is probably too high to fulfil established quality levels for albumin measurement and to guarantee the accuracy needed for clinical usefulness of the test. The situation is further worsened if non-specific colorimetric methods are used for albumin measurement as they represent an additional random source of uncertainty. Copyright © 2013 Elsevier B.V. All rights reserved.
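The budget argument can be made concrete: standard uncertainties contributed by each step of a traceability chain combine in quadrature, and the result is compared with the analytical goal. The figures below are hypothetical placeholders, not the paper's data; they only show how a chain can overshoot a goal derived from biologic variation.

```python
# Sketch of a measurement-uncertainty budget along a traceability chain.
# All numbers are hypothetical placeholders, not data from the paper.

import math

# Hypothetical relative standard uncertainties (%) contributed per chain step
chain = {
    "primary reference material": 0.6,
    "secondary calibrator value-assignment": 0.9,
    "manufacturer master calibrator": 0.8,
    "end-user calibrator": 0.7,
    "routine measurement (imprecision)": 1.5,
}

# Uncorrelated contributions combine in quadrature (root-sum-of-squares)
combined = math.sqrt(sum(u ** 2 for u in chain.values()))

goal = 1.6  # hypothetical goal (%) derived from biologic variation
print(round(combined, 2), "within goal:", combined <= goal)
```

Because each step only adds to the quadrature sum, even modest contributions accumulate, which is why non-specific colorimetric methods (an extra random source of uncertainty) worsen the situation further.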
Applied metrology in the production of superconducting model magnets for particle accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferradas Troitino, Jose; Bestmann, Patrick; Bourcey, Nicolas
2017-12-22
The production of superconducting magnets for particle accelerators involves high-precision assemblies and tight tolerances in order to meet the requirements for their appropriate performance. It is therefore essential to have strict control and traceability over the geometry of each component of the system, and to be able to compensate for possible inherent deviations arising from the production process.
Functions and requirements document for interim store solidified high-level and transuranic waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith-Fewell, M.A., Westinghouse Hanford
1996-05-17
The functions, requirements, interfaces, and architectures contained within the Functions and Requirements (F&R) Document are based on the information currently contained within the TWRS Functions and Requirements database. The database also documents the set of technically defensible functions and requirements associated with the solidified waste interim storage mission. The F&R Document provides a snapshot in time of the technical baseline for the project. The F&R Document is the product of functional analysis, requirements allocation, and architectural structure definition. The technical baseline described in this document is traceable to the TWRS function 4.2.4.1, Interim Store Solidified Waste, and its related requirements, architecture, and interfaces.
NASA Astrophysics Data System (ADS)
Guillevic, Myriam; Pascale, Céline; Ackermann, Andreas; Leuenberger, Daiana; Niederhauser, Bernhard
2016-04-01
In the framework of the KEY-VOCs and AtmoChem-ECV projects, we are currently developing new facilities to dynamically generate reference gas mixtures for a variety of reactive compounds, at concentrations measured in the atmosphere and in an SI-traceable way (i.e. the amount-of-substance fraction in mole per mole is traceable to SI units). Here we present the realisation of such standards for water vapour in the range 1-10 μmol/mol and for volatile organic compounds (VOCs) such as limonene, α-pinene, MVK and MEK in the nmol/mol range. The matrix gas can be nitrogen or synthetic air. Further development in gas purification techniques could make it possible to use purified atmospheric air as the carrier gas. The method is based on permeation and dynamic dilution: a permeator containing a pure substance (either water, limonene, MVK, MEK or α-pinene) is kept in a permeation chamber with a constant gas flow. The mass loss is precisely calibrated using a magnetic suspension balance. The carrier gas is purified beforehand of the compounds of interest to the required level, using commercially available purification cartridges. This primary mixture is then diluted to reach the required amount-of-substance fraction. All flows are controlled by mass flow controllers, which makes the production process flexible and easily adaptable to generate the required concentration. All parts in contact with the gas mixture are passivated using coated surfaces to reduce adsorption/desorption processes as much as possible. Two setups are currently being developed: one already built and installed in our laboratory in Bern, and a portable generator, still under construction, that could be used anywhere in the field. The permeation chamber of the portable generator has multiple individual cells, allowing the generation of mixtures of up to 5 different components if needed.
Moreover the presented technique can be adapted and applied to a large variety of molecules (e.g., NO2, BTEX, CFCs, HCFCs, HFCs and other refrigerants) and is particularly suitable for gas species and/or concentration ranges that are not stable in cylinders.
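The permeation/dynamic-dilution principle reduces to a simple molar balance: the generated amount-of-substance fraction is the molar permeation rate divided by the total molar gas flow. The sketch below uses hypothetical numbers (the permeation rate, carrier flow, and the choice of limonene are illustrative only, not the facility's operating values).

```python
# Back-of-the-envelope molar balance for permeation + dynamic dilution.
# All input values are hypothetical.

MOLAR_VOLUME_L = 22.414  # L/mol at 0 degC, 101.325 kPa (defines "std" flow here)

def amount_fraction(perm_rate_ng_per_min, molar_mass_g_per_mol,
                    carrier_flow_std_l_per_min):
    """Mole fraction of the analyte in the diluted mixture."""
    n_analyte = (perm_rate_ng_per_min * 1e-9) / molar_mass_g_per_mol  # mol/min
    n_carrier = carrier_flow_std_l_per_min / MOLAR_VOLUME_L           # mol/min
    return n_analyte / (n_analyte + n_carrier)

# Hypothetical: limonene (M = 136.23 g/mol) permeating at 250 ng/min,
# diluted into 2.0 std L/min of nitrogen
x = amount_fraction(250.0, 136.23, 2.0)
print(round(x * 1e9, 1), "nmol/mol")
```

Traceability follows because the permeation rate comes from the calibrated balance and the dilution flows from calibrated mass flow controllers, so the output fraction is computed entirely from SI-traceable quantities.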
CORSAIR-Calibrated Observations of Radiance Spectra from the Atmosphere in the Far- Infrared
NASA Astrophysics Data System (ADS)
Mlynczak, M. G.; Johnson, D.; Abedin, N.; Liu, X.; Kratz, D.; Jordan, D.; Wang, J.; Bingham, G.; Latvakoski, H.; Bowman, K.; Kaplan, S.
2008-12-01
The CORSAIR project is a new NASA Instrument Incubator Project (IIP) whose primary goal is to develop and demonstrate the necessary technologies to achieve SI-traceable, on-orbit measurements of Earth's spectral radiance in the far-infrared (far-IR). The far-IR plays a vital role in the energy balance of the Earth yet its spectrum has not been comprehensively observed from space for the purposes of climate sensing. The specific technologies being developed under CORSAIR include: passively cooled, antenna-coupled terahertz detectors for the far-IR (by Raytheon Vision Systems); accurately calibrated, SI-traceable blackbody sources for the far-IR (by Space Dynamics Laboratory); and high-performance broad bandpass beamsplitters (by ITT). These technologies complement those already developed under past Langley IIP projects (FIRST; INFLAME) in the areas of Fourier Transform Spectrometers and dedicated far-IR beamsplitters. The antenna-coupled far-IR detectors will be validated in the FIRST instrument at Langley. The SI-traceable far-IR blackbodies will be developed in conjunction with the National Institute of Standards and Technology (NIST). An overview of the CORSAIR technologies will be presented as well as their larger role in the Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission. Upon successful completion of CORSAIR these IIP efforts will provide the necessary technologies to achieve the first comprehensive, accurate, high-resolution measurements from a satellite of the far-IR spectrum of the Earth and its atmosphere, enabling major advances in our understanding of Earth's climate.
NASA Astrophysics Data System (ADS)
Buchholz, Bernhard; Ebert, Volker
2014-05-01
Airborne hygrometry is often demanded in scientific flight campaigns to provide datasets for environmental modeling or to correct for water vapor dilution or cross sensitivity effects in other gas analytical techniques. Water vapor measurements, however, are quite challenging due to the large dynamic range in the atmosphere (between 2 and 40000 ppmv) and the high spatio-temporal variability. Airborne hygrometers therefore need to combine a large measurement range with high temporal resolution to resolve - at typical airspeeds of 500 to 900 km/h - atmospheric gradients of several 1000 ppmv/s. Especially during the ascent into the upper troposphere, hygrometers need to work at high gas exchange rates to minimize water vapor adsorption effects. On the other hand, water vapor sensors are difficult to calibrate due to the strong water adsorption and the lack of bottled reference gas standards, which requires pre- or/and post-flight field calibrations. Recently in-flight calibration using an airborne H2O generator was demonstrated, which minimizes calibration drift but still imposes a lot of additional work and hardware to the experiments, since these kind of calibrations just transfer the accuracy level issues to the in-flight calibration-source. To make things worse, the low gas flow (1-5 std l/min, compared with up to 100 std l/min in flight for fast response instruments) adheres critical questions of wall absorption/desorption of the source and instrument even during the calibration process. The national metrological institutes (NMIs) maintain a global metrological water vapor scale which is defined via national primary humidity generators. These provide for calibration purposes well-defined, accurate water vapor samples of excellent comparability and stability traced back to the SI-units. The humidity calibration chain is maintained via high accuracy (but rather slow) Dew-Point-Mirror-Hygrometers as transfer standards. 
These provide a traceable performance and calibration link to any industrial or research laboratory hygrometer. Establishing metrological traceability in the field, and particularly in airborne hygrometers, is however challenging and requires fast, field-compatible, metrologically qualified transfer hygrometry standards to link the metrological and the environmental-science water vapor scales. The SEALDH (Selective Extractive Airborne Laser Diode Hygrometer) development started three years ago and aims at filling this gap by using Tunable Diode Laser Absorption Spectroscopy (TDLAS) with a special, calibration-free data evaluation [1]. Previously developed, laboratory-based TDLAS instruments, such as [2, 3], were the starting points for developing an autonomously operating, extractive water vapor sensor in a compact 19-inch, 4-HU form factor. This new airborne package and far-reaching developments [4] in hardware and software allow autonomous, low-maintenance, airborne operation. SEALDH-II can be used in a calibration-free field-sensor mode with an absolute, metrologically defined uncertainty of 4.3% ± 3 ppmv. The response time is mainly limited by the gas flow and is significantly below 1 s, with a precision down to 0.08 ppmv (1σ, 1 s) measured at 600 ppmv and 1000 hPa. The excellent long-term stability of SEALDH-II (
Resolving the problem of compliance with the ever increasing and changing regulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leigh, H.
1991-06-01
The most common problem identified at several US Department of Energy (DOE) sites is regulatory compliance. Simply put, project viability depends on identifying regulatory requirements at the beginning of a specific project to avoid possible delays and cost overruns. The Radioisotope Power Systems Facility (RFSP) is using the Regulatory Compliance System (RCS) to deal with the problem that well over 1000 regulatory documents had to be reviewed for possible compliance requirements applicable to the facility. This overwhelming number of possible documents is not atypical of all DOE facilities thus far reviewed using the RCS. The RCS was developed to provide control and tracking of all the regulatory and institutional requirements on a given project. WASTREN, Inc., developed the RCS through various DOE contracts and continues to enhance and update the system for existing and new contracts. The RCS provides the information to allow the technical expert to assimilate and manage accurate resource information, compile the checklists, and document that the project or facility fulfills all of the appropriate regulatory requirements. The RCS provides on-line information, including status throughout the project life, thereby allowing more intelligent and proactive decision making. Also, consistency and traceability are provided for regulatory compliance documentation. 1 ref., 1 fig.
Application of LogitBoost Classifier for Traceability Using SNP Chip Data
Kang, Hyunsung; Cho, Seoae; Kim, Heebal; Seo, Kang-Seok
2015-01-01
Consumer attention to food safety has increased rapidly due to animal-related diseases; therefore, it is important to identify their places of origin (POO) for safety purposes. However, only a few studies have addressed this issue and focused on machine learning-based approaches. In the present study, classification analyses were performed using a customized SNP chip for POO prediction. To accomplish this, 4,122 pigs originating from 104 farms were genotyped using the SNP chip. Several factors were considered to establish the best prediction model based on these data. We also assessed the applicability of the suggested model using a kinship coefficient-filtering approach. Our results showed that the LogitBoost-based prediction model outperformed other classifiers in terms of classification performance under most conditions. Specifically, a greater level of accuracy was observed when a higher kinship-based cutoff was employed. These results demonstrated the applicability of a machine learning-based approach using SNP chip data for practical traceability. PMID:26436917
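The boosting approach described above can be sketched in miniature. The sketch below implements plain gradient boosting with logistic loss on decision stumps, a close relative of LogitBoost, in pure Python; the SNP genotype matrix and farm labels are synthetic stand-ins, not the paper's 4,122-pig dataset.

```python
import math
import random

def fit_stump(X, r):
    """Fit a one-split regression stump to residuals r by least squares."""
    n, d = len(X), len(X[0])
    best = None  # (sse, feature, threshold, left_value, right_value)
    for j in range(d):
        for t in sorted(set(row[j] for row in X)):
            left = [r[i] for i in range(n) if X[i][j] <= t]
            right = [r[i] for i in range(n) if X[i][j] > t]
            if not left or not right:
                continue
            lv, rv = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - lv) ** 2 for v in left)
                   + sum((v - rv) ** 2 for v in right))
            if best is None or sse < best[0]:
                best = (sse, j, t, lv, rv)
    return best

def boost(X, y, rounds=20, lr=0.5):
    """Gradient boosting with logistic loss: each stump fits the
    negative gradient (y - p) of the log-loss."""
    F = [0.0] * len(X)
    stumps = []
    for _ in range(rounds):
        p = [1.0 / (1.0 + math.exp(-f)) for f in F]
        r = [yi - pi for yi, pi in zip(y, p)]
        _, j, t, lv, rv = fit_stump(X, r)
        stumps.append((j, t, lr * lv, lr * rv))
        F = [F[i] + (lr * lv if X[i][j] <= t else lr * rv)
             for i in range(len(X))]
    return stumps

def predict(stumps, x):
    f = sum(lv if x[j] <= t else rv for j, t, lv, rv in stumps)
    return 1 if f > 0 else 0

# Synthetic SNP genotypes (0/1/2 minor-allele counts); the "farm" label
# here is deliberately a simple function of SNP 0 so the model can learn it.
random.seed(1)
X = [[random.randint(0, 2) for _ in range(5)] for _ in range(80)]
y = [1 if row[0] >= 1 else 0 for row in X]
model = boost(X, y)
acc = sum(predict(model, X[i]) == y[i] for i in range(len(X))) / len(X)
```

On this separable toy data the boosted stumps recover the labels almost perfectly; the paper's kinship-coefficient filtering would act as an additional pre-selection step on which animals enter `X`.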
The Role of Geographical Indication in Supporting Food Safety: A not Taken for Granted Nexus
2014-01-01
The paper focuses on the role of geographical indication in supporting strategies of food safety. Starting from the distinction between generic and specific quality, the article analyses the main factors influencing food safety in cases of geographical-indication products, stressing the importance of traceability systems and biodiversity in securing generic and specific quality. In the second part, the paper investigates the coordination problems behind a designation of origin and the conditions needed to foster effective collective action, a prerequisite for guaranteeing food safety through geographical indications. PMID:27800417
Traceable accounts of subjective probability judgments in the IPCC and beyond
NASA Astrophysics Data System (ADS)
Baer, P. G.
2012-12-01
One of the major sources of controversy surrounding the reports of the IPCC has been the characterization of uncertainty. Although arguably the IPCC has paid more attention to the process of uncertainty analysis and communication than any comparable assessment body, its efforts to achieve consistency have produced mixed results. In particular, the extensive use of subjective probability assessment has attracted widespread criticism. Statements such as "Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years" are ubiquitous (one online database lists nearly 3000 such claims), and indeed are the primary way in which its key "findings" are reported. Much attention is drawn to the precise quantitative definition of such statements (e.g., "very likely" means >90% probability, vs. "extremely likely," which means >95% probability). But there is no process by which the choice of uncertainty level for a given finding is formally made or reported, and thus such statements are easily disputed by anyone, expert or otherwise, who disagrees with the assessment. In the "Uncertainty Guidance Paper" for the Third Assessment Report, Richard Moss and Steve Schneider defined the concept of a "traceable account," which gave exhaustive detail regarding how one ought to document such an uncertainty assessment. But the guidance, while appearing straightforward and reasonable, was in fact an unworkable recipe, which would have taken near-infinite time if used for more than a few key results, and would have required a different structuring of the text than the conventional scientific assessment. And even then it would have left a gap when it came to the actual provenance of any such specific judgments, because there simply is no formal step at which individuals turn their knowledge of the evidence on some finding into a probability judgment.
The Uncertainty Guidance Papers for the TAR and subsequent assessments have left open the possibility of using such an expert elicitation within the IPCC drafting process, but to my knowledge it has never been done. Were it in fact attempted, it would reveal the inconvenient truth that there is no uniquely correct method for aggregating probability statements; indeed, the standard practice within climate-related expert elicitations has been to report all individual estimates without aggregation. But if a report requires a single "consensus estimate," then once you have even a single divergent opinion, the question of how to aggregate becomes unavoidable. In this paper, I review in greater detail the match, or lack of it, between the vision of a "traceable account" and IPCC practice, and the public discussion of selected examples of probabilistic judgments in AR4. I propose elements of a structure based on a flexible software architecture that could facilitate the development and documentation of what I call "collective subjective probability." Using a simple prototype and a pair of sample "findings" from AR4, I demonstrate an example of how such a structure could be used by a small expert community to implement a practical model of a "traceable account." I conclude with a discussion of the prospects of using such modular elicitations in support of, or as an alternative to, conventional IPCC assessment processes.
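The aggregation problem raised above can be made concrete: two textbook pooling rules, the linear (arithmetic) pool and the logarithmic (geometric) pool, generally yield different "consensus" probabilities from the same individual judgments. The expert values below are hypothetical, not taken from AR4.

```python
import math

def linear_pool(probs, weights=None):
    """Weighted arithmetic mean of expert probabilities."""
    w = weights or [1.0 / len(probs)] * len(probs)
    return sum(wi * pi for wi, pi in zip(w, probs))

def log_pool(probs, weights=None):
    """Weighted geometric pool, renormalised over {event, not-event}."""
    w = weights or [1.0 / len(probs)] * len(probs)
    num = math.prod(pi ** wi for wi, pi in zip(w, probs))
    den = num + math.prod((1.0 - pi) ** wi for wi, pi in zip(w, probs))
    return num / den

# Three experts interpret the evidence for a finding differently.
experts = [0.90, 0.95, 0.70]
lin = linear_pool(experts)   # arithmetic consensus
log_ = log_pool(experts)     # geometric consensus; differs from lin
```

The two rules disagree (here by roughly three percentage points), which is exactly why a "traceable account" would need to record not just the individual judgments but also the aggregation rule chosen.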
An Optical Frequency Comb Tied to GPS for Laser Frequency/Wavelength Calibration
Stone, Jack A.; Egan, Patrick
2010-01-01
Optical frequency combs can be employed over a broad spectral range to calibrate laser frequency or vacuum wavelength. This article describes procedures and techniques utilized in the Precision Engineering Division of NIST (National Institute of Standards and Technology) for comb-based calibration of laser wavelength, including a discussion of ancillary measurements such as determining the mode order. The underlying purpose of these calibrations is to provide traceable standards in support of length measurement. The relative uncertainty needed to fulfill this goal is typically 10−8 and never below 10−12, very modest requirements compared to the capabilities of comb-based frequency metrology. In this accuracy range the Global Positioning System (GPS) serves as an excellent frequency reference that can provide the traceable underpinning of the measurement. This article describes techniques that can be used to completely characterize measurement errors in a GPS-based comb system and thus achieve full confidence in measurement results. PMID:27134794
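The underlying comb arithmetic is simple enough to sketch. Assuming illustrative parameter values (not NIST's actual ones), the laser frequency follows the standard comb equation f_laser = n·f_rep + f_ceo ± f_beat, and the mode order n can be recovered from any coarse frequency estimate known to better than half the repetition rate:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def laser_frequency(n, f_rep, f_ceo, f_beat, sign=+1):
    """Comb equation: f_laser = n*f_rep + f_ceo + sign*f_beat (all in Hz)."""
    return n * f_rep + f_ceo + sign * f_beat

def mode_order(f_approx, f_rep, f_ceo, f_beat, sign=+1):
    """Infer the integer mode order n from a coarse (e.g. wavemeter)
    frequency estimate. Unambiguous only if the estimate is good to
    better than f_rep/2."""
    return round((f_approx - f_ceo - sign * f_beat) / f_rep)

# Illustrative GPS-disciplined comb parameters.
f_rep, f_ceo, f_beat = 250e6, 20e6, 30e6            # Hz
f_true = laser_frequency(1_893_160, f_rep, f_ceo, f_beat)  # ~473 THz (633 nm region)
f_coarse = f_true + 40e6                             # wavemeter reading, off by 40 MHz
n = mode_order(f_coarse, f_rep, f_ceo, f_beat)       # recovers the true mode order
f_cal = laser_frequency(n, f_rep, f_ceo, f_beat)
wavelength_nm = C / f_cal * 1e9
```

Because the coarse estimate is off by only 40 MHz, far less than f_rep/2 = 125 MHz, the rounding step lands on the correct mode order and the calibrated frequency inherits the accuracy of the GPS-referenced f_rep, f_ceo, and f_beat.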
Neutron activation analysis: A primary method of measurement
NASA Astrophysics Data System (ADS)
Greenberg, Robert R.; Bode, Peter; De Nadai Fernandes, Elisabete A.
2011-03-01
Neutron activation analysis (NAA), based on the comparator method, has the potential to fulfill the requirements of a primary ratio method as defined in 1998 by the Comité Consultatif pour la Quantité de Matière — Métrologie en Chimie (CCQM, Consultative Committee on Amount of Substance — Metrology in Chemistry). This thesis is evidenced in this paper in three chapters by: demonstration that the method is fully physically and chemically understood; that a measurement equation can be written down in which the values of all parameters have dimensions in SI units and thus having the potential for metrological traceability to these units; that all contributions to uncertainty of measurement can be quantitatively evaluated, underpinning the metrological traceability; and that the performance of NAA in CCQM key-comparisons of trace elements in complex matrices between 2000 and 2007 is similar to the performance of Isotope Dilution Mass Spectrometry (IDMS), which had been formerly designated by the CCQM as a primary ratio method.
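A minimal sketch of the relative (comparator) measurement equation behind this claim: the element mass in the sample follows from the ratio of decay-corrected count rates of sample and co-irradiated standard. All numbers are hypothetical, and corrections such as decay during counting and detector efficiency are omitted for brevity.

```python
import math

def decay_corrected_rate(counts, live_time, half_life, t_decay):
    """Count rate corrected back to the end of irradiation.
    Simplified: no correction for decay during the counting interval."""
    lam = math.log(2) / half_life
    return (counts / live_time) * math.exp(lam * t_decay)

def element_mass(rate_sample, rate_standard, std_mass_ug):
    """Relative (comparator) NAA: mass ratio equals the ratio of
    decay-corrected count rates, sample vs co-irradiated standard."""
    return std_mass_ug * rate_sample / rate_standard

# Same nuclide (half-life 15 h) in sample and standard, counted at
# different delays after the end of irradiation (hypothetical data).
a_smp = decay_corrected_rate(counts=12_000, live_time=600,
                             half_life=15 * 3600, t_decay=4 * 3600)
a_std = decay_corrected_rate(counts=30_000, live_time=600,
                             half_life=15 * 3600, t_decay=2 * 3600)
mass_ug = element_mass(a_smp, a_std, std_mass_ug=10.0)
```

Every quantity in the equation (counts, times, masses) carries SI units and a quantifiable uncertainty, which is the structural property that lets NAA qualify as a primary ratio method.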
NEVADA TEST SITE WASTE ACCEPTANCE CRITERIA, JUNE 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. DEPARTMENT OF ENERGY, NATIONAL NUCLEAR SECURITY ADMINISTRATION NEVADA SITE OFFICE
This document establishes the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) waste acceptance criteria (WAC). The WAC provides the requirements, terms, and conditions under which the Nevada Test Site (NTS) will accept low-level radioactive (LLW) and mixed waste (MW) for disposal. It includes requirements for the generator waste certification program, characterization, traceability, waste form, packaging, and transfer. The criteria apply to radioactive waste received at the NTS Area 3 and Area 5 Radioactive Waste Management Complex (RWMC) for storage or disposal.
Nevada Test Site Waste Acceptance Criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
U. S. Department of Energy, National Nuclear Security Administration Nevada Site Office
This document establishes the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) waste acceptance criteria (WAC). The WAC provides the requirements, terms, and conditions under which the Nevada Test Site (NTS) will accept low-level radioactive (LLW) and mixed waste (MW) for disposal. It includes requirements for the generator waste certification program, characterization, traceability, waste form, packaging, and transfer. The criteria apply to radioactive waste received at the NTS Area 3 and Area 5 Radioactive Waste Management Complex (RWMC) for storage or disposal.
Maringer, F J; Suráň, J; Kovář, P; Chauvenet, B; Peyres, V; García-Toraño, E; Cozzella, M L; De Felice, P; Vodenik, B; Hult, M; Rosengård, U; Merimaa, M; Szücs, L; Jeffery, C; Dean, J C J; Tymiński, Z; Arnold, D; Hinca, R; Mirescu, G
2013-11-01
In 2011 the joint research project Metrology for Radioactive Waste Management (MetroRWM) (1) of the European Metrology Research Programme (EMRP) started, with a total duration of three years. Within this project, new metrological resources for the assessment of radioactive waste, including their calibration with new reference materials traceable to national standards, will be developed. This paper reviews national, European and international strategies as a basis for science-based metrological requirements in the clearance and acceptance of radioactive waste. © 2013 Elsevier Ltd. All rights reserved.
Template for updating regulations in QA manuals
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.G.; Banerjee, B.
1992-01-01
Recently, the U.S. Department of Energy (DOE) issued new quality assurance (QA) orders to reflect current policies for the conduct and operation of DOE-authorized programs and facilities. Establishing traceability from the new QA criteria and requirements to former multi-draft orders, QA manuals, and guidance documentation for DOE-funded work can be confusing, and critical considerations that have been identified must still be addressed. Most of the newly stated QA criteria can be cross-referenced, where applicable, to former QA plans and manuals. Where additional criteria occur, new procedures may be required, together with revisions to QA plans and manuals.
Cargo Movement Operations System (CMOS) Requirements Traceability Matrix, ECP. Version 2
1990-06-07
COMMENT STATUS: OPEN [ ] CLOSED [ ]. ORIGINATOR CONTROL NUMBER: RTM2-0003. DATA ITEM DISCREPANCY WORKSHEET, CDRL NUMBER: A018-02A, DATE: 06/07/90. ORIGINATOR NAME: Patrick L. Combs; OFFICE SYMBOL: SAIC; TELEPHONE NUMBER: 272-2999. SUBSTANTIVE: X. PAGE NUMBER: E-25; PARA NUMBER: SC122. COMMENT OR RECOMMENDED CHANGE: Delete SS0850 and SS0851 from SC122. RATIONALE: These requirements are not present in all the subordinate system capabilities and, therefore, should not be allocated to
Bennardello, Francesco; Fidone, Carmelo; Cabibbo, Sergio; Calabrese, Salvatore; Garozzo, Giovanni; Cassarino, Grazia; Antolino, Agostino; Tavolino, Giuseppe; Zisa, Nuccio; Falla, Cadigia; Drago, Giuseppe; Di Stefano, Giovanna; Bonomo, Pietro
2009-01-01
Background One of the most serious risks of blood transfusions is an error in ABO blood group compatibility, which can cause a haemolytic transfusion reaction and, in the most severe cases, the death of the patient. The frequency and type of errors observed suggest that these are inevitable, in that mistakes are inherent to human nature, unless significant changes, including the use of computerised instruments, are made to procedures. Methods In order to identify patients who are candidates for the transfusion of blood components and to guarantee the traceability of the transfusion, the Securblood system (BBS srl) was introduced. This system records the various stages of the transfusion process, the health care workers involved and any immediate transfusion reactions. The patients and staff are identified by fingerprinting or a bar code. The system was implemented within Ragusa hospital in 16 operative units (ordinary wards, day hospital, operating theatres). Results In the period from August 2007 to July 2008, 7282 blood components were transfused within the hospital, of which 5606 (77%) using the Securblood system. Overall, 1777 patients were transfused. In this year of experience, no transfusion errors were recorded and each blood component was transfused to the right patient. We recorded 33 blocks of the terminals (involving 0.6% of the transfused blood components) which required the intervention of staff from the Service of Immunohaematology and Transfusion Medicine (SIMT). Most of the blocks were due to procedural errors. Conclusions The Securblood system guarantees complete traceability of the transfusion process outside the SIMT and eliminates the possibility of mistaken identification of patients or blood components. 
The use of fingerprinting to identify health care staff (nurses and doctors) and patients obliges the staff to carry out the identification procedures directly in the presence of the patient and guarantees the presence of the doctor at the start of the transfusion. PMID:19657483
NASA Astrophysics Data System (ADS)
Kawamura, M.; Umeda, K.; Ohi, T.; Ishimaru, T.; Niizato, T.; Yasue, K.; Makino, H.
2007-12-01
We have developed a formal evaluation method to assess the potential impact of natural phenomena (earthquakes and faulting; volcanism; uplift, subsidence, denudation and sedimentation; climatic and sea-level changes) on a High-Level Radioactive Waste (HLW) disposal system. In 2000, we had developed perturbation scenarios in a generic and conservative sense and illustrated the potential impact on a HLW disposal system. From that work, two points were highlighted for consideration in subsequent work: improving the scenarios from the viewpoints of reality, transparency, traceability and consistency, and avoiding extreme conservatism. We have therefore developed a new procedure for describing such perturbation scenarios, based on further studies of the characteristics of these natural perturbation phenomena in Japan. The approach proceeds in five steps:
Step 1: Description of the potential processes of the phenomena and their impacts on the geological environment.
Step 2: Characterization of potential changes of the geological environment in terms of T-H-M-C (Thermal-Hydrological-Mechanical-Chemical) processes. The focus is on specific T-H-M-C parameters that influence geological barrier performance, utilizing the input from Step 1.
Step 3: Classification of potential influences, based on similarity of T-H-M-C perturbations. This leads to the development of perturbation scenarios to serve as a basis for consequence analysis.
Step 4: Establishing models and parameters for performance assessment.
Step 5: Calculation and assessment.
This study focuses on identifying the key T-H-M-C processes associated with perturbations at Step 2. The framework has two advantages. The first is assuring maintenance of traceability during the scenario construction process, facilitating the production and structuring of suitable records.
The second is providing effective elicitation and organization of information from a wide range of investigations of earth sciences within a performance assessment context. In this framework, scenario development work proceeds in a stepwise manner, to ensure clear identification of the impact of processes associated with these phenomena on a HLW disposal system. Output is organized to create credible scenarios with required transparency, consistency, traceability and adequate conservatism. In this presentation, the potential impact of natural phenomena in the viewpoint of performance assessment for HLW disposal will be discussed and modeled using the approach.
USCEA/NIST measurement assurance programs for the radiopharmaceutical and nuclear power industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golas, D.B.
1993-12-31
In cooperation with the U.S. Council for Energy Awareness (USCEA), the National Institute of Standards and Technology (NIST) supervises and administers two measurement assurance programs for radioactivity measurement traceability. One, in existence since the mid 1970s, provides traceability to suppliers of radiochemicals and radiopharmaceuticals, dose calibrators, and nuclear pharmacy services. The second program, begun in 1987, provides traceability to the nuclear power industry for utilities, source suppliers, and service laboratories. Each program is described, and the results of measurements of samples of known, but undisclosed activity, prepared at NIST and measured by the participants are presented.
Lightweight approach to model traceability in a CASE tool
NASA Astrophysics Data System (ADS)
Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita
2017-07-01
The term "model-driven" is not at all a new buzzword within the ranks of the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling discussion around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability, and model management as a whole, become indispensable activities of the model-driven system development process. The main goal of this paper is to present the conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.
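As a rough illustration of what a lightweight trace-link store might look like (the class, link types, and element names below are hypothetical; the paper's actual CASE-tool design is not reproduced), the core is a typed graph of links with forward and backward indexes, which makes impact analysis a simple reachability query:

```python
from collections import defaultdict

class TraceModel:
    """Minimal trace-link store: typed, directed links between model
    elements, indexed in both directions for navigation."""

    def __init__(self):
        self.links = defaultdict(set)   # source -> {(link_type, target)}
        self.back = defaultdict(set)    # target -> {(link_type, source)}

    def add(self, source, link_type, target):
        self.links[source].add((link_type, target))
        self.back[target].add((link_type, source))

    def impacted_by(self, element):
        """Transitively collect elements reachable from `element`,
        i.e. everything potentially impacted if it changes."""
        seen, stack = set(), [element]
        while stack:
            for _, tgt in self.links.get(stack.pop(), ()):
                if tgt not in seen:
                    seen.add(tgt)
                    stack.append(tgt)
        return seen

# Hypothetical requirement-to-test trace chain.
m = TraceModel()
m.add("REQ-1", "satisfiedBy", "UseCase-A")
m.add("UseCase-A", "realizedBy", "Class-Order")
m.add("Class-Order", "verifiedBy", "Test-12")
impact = m.impacted_by("REQ-1")   # everything downstream of REQ-1
```

The backward index (`back`) answers the converse question, e.g. which requirement a failing test ultimately traces to, which is the "lightweight" core of most traceability tooling.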
Zhao, Jie; Li, Tingting; Zhu, Chao; Jiang, Xiaoling; Zhao, Yan; Xu, Zhenzhen; Yang, Shuming; Chen, Ailiang
2018-06-01
Meat traceability based on molecular markers is exerting a great influence on food safety and will play an increasingly key role in the future. This study aimed to investigate and verify the polymorphism of 23 microsatellite markers and select the most suitable markers for individual identification and meat traceability of six swine breeds in the Chinese market. The mean polymorphism information content of these 23 loci was 0.7851, and each locus exhibited high polymorphism in the pooled population. Ten loci showed good polymorphism in each breed, namely Sw632, S0155, Sw2406, Sw830, Sw2525, Sw72, Sw2448, Sw911, Sw122 and CGA. When six highly polymorphic loci were combined, the match probability for two random individual genotypes among the pig breeds (Beijing Black, Sanyuan and Taihu) was lower than 1.151E-06. Adding loci gradually decreased the match probability and therefore enhanced traceability accuracy. Validation by tracing 18 blood samples and their corresponding meat samples based on five highly polymorphic loci (Sw2525, S0005, Sw0107, Sw911 and Sw857) was successful, with 100% confirmation probability, providing a foundation for establishing a traceability system for pork in the Chinese market.
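The match probability statistic used above can be sketched under a Hardy-Weinberg assumption: at each locus, the probability that two random individuals share a genotype is the sum of squared genotype frequencies, and independent loci multiply. The allele frequencies below are hypothetical (four equifrequent alleles per locus), not the paper's estimates.

```python
from itertools import combinations

def locus_match_probability(allele_freqs):
    """P(two random individuals share a genotype) at one locus, assuming
    Hardy-Weinberg proportions: sum over genotypes of P(genotype)^2."""
    pm = sum((p * p) ** 2 for p in allele_freqs)  # homozygotes: (p_i^2)^2
    pm += sum((2 * p * q) ** 2                    # heterozygotes: (2 p_i p_j)^2
              for p, q in combinations(allele_freqs, 2))
    return pm

def overall_match_probability(loci):
    """Multiply per-locus match probabilities (loci assumed independent)."""
    pm = 1.0
    for freqs in loci:
        pm *= locus_match_probability(freqs)
    return pm

# Six hypothetical microsatellite loci, four equifrequent alleles each.
loci = [[0.25, 0.25, 0.25, 0.25]] * 6
pm = overall_match_probability(loci)   # on the order of 1e-6
```

Even this crude model lands in the same order of magnitude as the paper's six-locus figure, and it shows why each added polymorphic locus multiplies down the chance of a coincidental match.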
Qi, Luming; Liu, Honggao; Li, Jieqing; Li, Tao; Wang, Yuanzhong
2018-01-15
Origin traceability is an important step in controlling the nutritional and pharmacological quality of food products. The Boletus edulis mushroom is a well-known food resource worldwide, and its nutritional and medicinal properties vary drastically depending on geographical origin. In this study, three sensor systems (inductively coupled plasma atomic emission spectrometry (ICP-AES), ultraviolet-visible (UV-Vis) spectroscopy and Fourier transform mid-infrared spectroscopy (FT-MIR)) were applied, in combination with chemometrics, to the origin traceability of 192 mushroom samples (caps and stipes). The difference between cap and stipe was clearly illustrated with each single-sensor technique. Feature variables from the three instruments were used for origin traceability. Two supervised classification methods, partial least squares discriminant analysis (PLS-DA) and grid-search support vector machine (GS-SVM), were applied to develop mathematical models. Two steps (internal cross-validation and external prediction for unknown samples) were used to evaluate the performance of a classification model. The results are satisfactory, with high accuracies ranging from 90.625% to 100%, and the models show excellent generalization ability with the optimal parameters. Based on the combination of the three sensor systems, our study provides a multi-sensory and comprehensive origin traceability of B. edulis mushrooms.
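The two-step protocol above (internal cross-validation to pick model parameters, then external prediction for unknown samples) can be sketched generically. Reproducing GS-SVM on real spectra is out of scope here, so the stand-in below grid-searches the k of a k-nearest-neighbour classifier on synthetic "spectra"; all data and parameter values are invented for illustration.

```python
import random
from collections import Counter

def knn_predict(train, x, k):
    """k-nearest-neighbour stand-in classifier (squared Euclidean distance)."""
    nearest = sorted(train, key=lambda s: sum((a - b) ** 2
                                              for a, b in zip(s[0], x)))
    return Counter(label for _, label in nearest[:k]).most_common(1)[0][0]

def cv_accuracy(data, k, folds=4):
    """Internal cross-validation score for one grid point."""
    correct = 0
    for f in range(folds):
        test = data[f::folds]
        train = [s for i, s in enumerate(data) if i % folds != f]
        correct += sum(knn_predict(train, x, k) == y for x, y in test)
    return correct / len(data)

# Synthetic 5-channel "spectra": two origins with shifted feature means.
random.seed(0)
data = [([random.gauss(m, 1.0) for _ in range(5)], origin)
        for origin, m in [(0, 0.0), (1, 2.0)] for _ in range(40)]
random.shuffle(data)

# Step 1: grid search over k via internal cross-validation.
best_k = max([1, 3, 5, 7], key=lambda k: cv_accuracy(data, k))
# Step 2: external prediction for a previously unseen sample.
pred = knn_predict(data, [2.0, 2.0, 2.0, 2.0, 2.0], best_k)
```

The same loop structure applies with an SVM and its (C, gamma) grid in place of k-NN and k; only the inner classifier changes.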
Qi, Luming; Liu, Honggao; Li, Jieqing; Li, Tao
2018-01-01
Origin traceability is an important step in controlling the nutritional and pharmacological quality of food products. The Boletus edulis mushroom is a well-known food resource worldwide, and its nutritional and medicinal properties vary drastically depending on geographical origin. In this study, three sensor systems (inductively coupled plasma atomic emission spectrometry (ICP-AES), ultraviolet-visible (UV-Vis) spectroscopy and Fourier transform mid-infrared spectroscopy (FT-MIR)) were applied, in combination with chemometrics, to the origin traceability of 184 mushroom samples (caps and stipes). The difference between cap and stipe was clearly illustrated with each single-sensor technique. Feature variables from the three instruments were used for origin traceability. Two supervised classification methods, partial least squares discriminant analysis (PLS-DA) and grid-search support vector machine (GS-SVM), were applied to develop mathematical models. Two steps (internal cross-validation and external prediction for unknown samples) were used to evaluate the performance of a classification model. The results are satisfactory, with high accuracies ranging from 90.625% to 100%, and the models show excellent generalization ability with the optimal parameters. Based on the combination of the three sensor systems, our study provides a multi-sensory and comprehensive origin traceability of B. edulis mushrooms. PMID:29342969
Yong, Ken-Tye; Roy, Indrajit; Swihart, Mark T.; Prasad, Paras N.
2009-01-01
The use of nanoparticles in biological applications has been rapidly advancing toward practical use in human cancer diagnosis and therapy. When linked with biomolecules, nanoparticles can be used to locate cancerous areas, as well as for traceable drug delivery with high affinity and specificity. In this review, we discuss the engineering of multifunctional nanoparticle probes and their use in bioimaging and nanomedicine. PMID:20305738
An aspect-oriented approach for designing safety-critical systems
NASA Astrophysics Data System (ADS)
Petrov, Z.; Zaykov, P. G.; Cardoso, J. P.; Coutinho, J. G. F.; Diniz, P. C.; Luk, W.
The development of avionics systems is typically a tedious and cumbersome process. In addition to the required functions, developers must consider various and often conflicting non-functional requirements such as safety, performance, and energy efficiency. Certainly, an integrated approach with a seamless design flow that is capable of requirements modelling and supporting refinement down to an actual implementation in a traceable way, may lead to a significant acceleration of development cycles. This paper presents an aspect-oriented approach supported by a tool chain that deals with functional and non-functional requirements in an integrated manner. It also discusses how the approach can be applied to development of safety-critical systems and provides experimental results.
2008-06-13
LITHIUM-ION BATTERY FOR LAUNCH VEHICLE APPLICATIONS. APPROVED FOR... REPORT DATE: 13 JUN 2008. TITLE AND SUBTITLE: SMC-S-018 (2008), Lithium-Ion Battery for... reliability lithium-ion battery for use in launch vehicles. 4.2 Identification and Traceability: All cells and batteries require an attached
Cargo Movement Operations System (CMOS) Requirements Traceability Matrix, Version 3 Increment II
1990-12-17
above SCs should be documented. CMOS PMO ACCEPTS COMMENT: YES [ ] NO [ ]; ERCI ACCEPTS COMMENT: YES [ ] NO [ ]; COMMENT DISPOSITION: ; COMMENT STATUS: OPEN ... These two documents should be in agreement with each other. ... completeness, they should be documented. ... COMMENT STATUS: OPEN
Waste Characterization Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil-Holterman, Luciana R.; Naranjo, Felicia Danielle
2016-02-02
This report discusses ways to classify waste as outlined by LANL. Waste Generators must make a waste determination and characterize regulated waste by appropriate analytical testing or use of acceptable knowledge (AK). Use of AK for characterization requires several source documents. Waste characterization documentation must be accurate, sufficient, and current (i.e., updated); relevant and traceable to the waste stream’s generation, characterization, and management; and not merely a list of information sources.
Arrhenius, Karine; Brown, Andrew S; van der Veen, Adriaan M H
2016-01-01
The traceable and accurate measurement of biogas impurities is essential in order to robustly assess compliance with the specifications for biomethane being developed by CEN/TC408. An essential part of any procedure aiming to determine the content of impurities is the sampling and the transfer of the sample to the laboratory. Key issues are the suitability of the sample container and minimising the losses of impurities during the sampling and analysis process. In this paper, we review the state-of-the-art in biogas sampling with the focus on trace impurities. Most of the vessel suitability studies reviewed focused on raw biogas. Many parameters need to be studied when assessing the suitability of vessels for sampling and storage, among them permeation through the walls, leaks through the valves or physical leaks, sorption losses and adsorption effects at the vessel walls, chemical reactions, and the expected initial concentration level. The majority of these studies looked at siloxanes, for which sampling bags, canisters, impingers and sorbents have been reported to be fit-for-purpose in most cases, albeit with some limitations. We conclude that the optimum method requires a combination of different vessels to cover the wide range of impurities commonly found in biogas, which have a wide range of boiling points, polarities, water solubilities, and reactivities. The effects from all parts of the sampling line must be considered and precautions must be taken to minimise these effects. More practical suitability tests, preferably using traceable reference gas mixtures, are needed to understand the influence of the containers and the sampling line on sample properties and to reduce the uncertainty of the measurement. Copyright © 2015 Elsevier B.V. All rights reserved.
Lowndes, Catherine M; Jayachandran, A A; Banandur, Pradeep; Ramesh, Banadakoppa M; Washington, Reynold; Sangameshwar, B M; Moses, Stephen; Blanchard, James; Alary, Michel
2012-05-01
This study compared rates of HIV-related sexual risk behaviours reported in individual face-to-face (FTFI) and group anonymous polling booth (PBS) interviews in India. In PBS, respondents grouped by gender and marital status answered yes/no questions by putting tokens with question numbers in colour-coded containers. Data were subsequently collated for each group as a whole, so responses were not traceable back to individuals. Male and female PBS participants reported substantially higher rates of pre-marital, extra-marital, commercial and anal sex than FTFI participants; e.g. 11 vs. 2% married males reported paying for sex; 6 vs. 1% unmarried males reported homosexual anal sex.
78 FR 68868 - First-Class Mail Postage Payment Option
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-15
.... The Postal Service states that participating businesses will produce and distribute pre-approved envelopes and postcards according to specific design requirements established by the Postal Service and have the option of increasing the value of the pre-approved envelopes by applying a customized Picture...
Ye, Junqiang; Beetz, Nadine; O'Keeffe, Sean; Tapia, Juan Carlos; Macpherson, Lindsey; Chen, Weisheng V; Bassel-Duby, Rhonda; Olson, Eric N; Maniatis, Tom
2015-06-09
We report that mice lacking the heterogeneous nuclear ribonucleoprotein U (hnRNP U) in the heart develop lethal dilated cardiomyopathy and display numerous defects in cardiac pre-mRNA splicing. Mutant hearts have disorganized cardiomyocytes, impaired contractility, and abnormal excitation-contraction coupling activities. RNA-seq analyses of Hnrnpu mutant hearts revealed extensive defects in alternative splicing of pre-mRNAs encoding proteins known to be critical for normal heart development and function, including Titin and calcium/calmodulin-dependent protein kinase II delta (Camk2d). Loss of hnRNP U expression in cardiomyocytes also leads to aberrant splicing of the pre-mRNA encoding the excitation-contraction coupling component Junctin. We found that the protein product of an alternatively spliced Junctin isoform is N-glycosylated at a specific asparagine site that is required for interactions with specific protein partners. Our findings provide conclusive evidence for the essential role of hnRNP U in heart development and function and in the regulation of alternative splicing.
Respiratory Network Stability and Modulatory Response to Substance P Require Nalcn.
Yeh, Szu-Ying; Huang, Wei-Hsiang; Wang, Wei; Ward, Christopher S; Chao, Eugene S; Wu, Zhenyu; Tang, Bin; Tang, Jianrong; Sun, Jenny J; Esther van der Heijden, Meike; Gray, Paul A; Xue, Mingshan; Ray, Russell S; Ren, Dejian; Zoghbi, Huda Y
2017-04-19
Respiration is a rhythmic activity as well as one that requires responsiveness to internal and external circumstances; both the rhythm and neuromodulatory responses of breathing are controlled by brainstem neurons in the preBötzinger complex (preBötC) and the retrotrapezoid nucleus (RTN), but the specific ion channels essential to these activities remain to be identified. Because deficiency of sodium leak channel, non-selective (Nalcn) causes lethal apnea in humans and mice, we investigated Nalcn function in these neuronal groups. We found that one-third of mice lacking Nalcn in excitatory preBötC neurons died soon after birth; surviving mice developed apneas in adulthood. Interestingly, in both preBötC and RTN neurons, the Nalcn current influences the resting membrane potential, contributes to maintenance of stable network activity, and mediates modulatory responses to the neuropeptide substance P. These findings reveal Nalcn's specific role in both rhythmic stability and responsiveness to neuropeptides within the respiratory network. Copyright © 2017 Elsevier Inc. All rights reserved.
ARCHITECT: The architecture-based technology evaluation and capability tradeoff method
NASA Astrophysics Data System (ADS)
Griendling, Kelly A.
The use of architectures for the design, development, and documentation of system-of-systems engineering has become a common practice in recent years. This practice became mandatory in the defense industry in 2004 when the Department of Defense Architecture Framework (DoDAF) Promulgation Memo mandated that all Department of Defense (DoD) architectures must be DoDAF compliant. Despite this mandate, there has been significant confusion and a lack of consistency in the creation and the use of the architecture products. Products are typically created as static documents used for communication and documentation purposes that are difficult to change and do not support engineering design activities and acquisition decision making. At the same time, acquisition guidance has been recently reformed to move from the bottom-up approach of the Requirements Generation System (RGS) to the top-down approach mandated by the Joint Capabilities Integration and Development System (JCIDS), which requires the use of DoDAF to support acquisition. Defense agencies have had difficulty adjusting to this new policy, and are struggling to determine how to meet new acquisition requirements. This research has developed the Architecture-based Technology Evaluation and Capability Tradeoff (ARCHITECT) Methodology to respond to these challenges and address concerns raised about the defense acquisition process, particularly the time required to implement parts of the process, the need to evaluate solutions across capability and mission areas, and the need to use a rigorous, traceable, repeatable method that utilizes modeling and simulation to better substantiate early-phase acquisition decisions.
The objective is to create a capability-based systems engineering methodology for the early phases of design and acquisition (specifically Pre-Milestone A activities) which improves agility in defense acquisition by (1) streamlining the development of key elements of JCIDS and DoDAF, (2) moving the creation of DoDAF products forward in the defense acquisition process, and (3) using DoDAF products for more than documentation by integrating them into the problem definition and analysis of alternatives phases and applying executable architecting. This research proposes and demonstrates the plausibility of a prescriptive methodology for developing executable DoDAF products which will explicitly support decision-making in the early phases of JCIDS. A set of criteria by which CBAs should be judged is proposed, and the methodology is developed with these criteria in mind. The methodology integrates existing tools and techniques for systems engineering and system of systems engineering with several new modeling and simulation tools and techniques developed as part of this research to fill gaps noted in prior CBAs. A suppression of enemy air defenses (SEAD) mission is used to demonstrate the application of ARCHITECT and to show the plausibility of the approach. For the SEAD study, metrics are derived and a gap analysis is performed. The study then identifies and quantitatively compares system and operational architecture alternatives for performing SEAD. A series of down-selections is performed to identify promising architectures, and these promising solutions are subject to further analysis where the impacts of force structure and network structure are examined. While the numerical results of the SEAD study are notional and could not be applied to an actual SEAD CBA, the example served to highlight many of the salient features of the methodology.
The SEAD study presented here enabled pre-Milestone A tradeoffs to be performed quantitatively across a large number of architectural alternatives in a traceable and repeatable manner. The alternatives considered included variations on operations, systems, organizational responsibilities (through the assignment of systems to tasks), network (or collaboration) structure, interoperability level, and force structure. All of the information used in the study is preserved in the environment, which is dynamic and allows for on-the-fly analysis. The assumptions used were consistent, which was assured through the use of a single file documenting all inputs, shared across all models. Furthermore, a model was made of the ARCHITECT methodology itself, and was used to demonstrate that even if the steps took twice as long to perform as they did in the SEAD example, the methodology would still provide the ability to conduct CBA analyses in less time than prior CBAs to date. Overall, it is shown that the ARCHITECT methodology results in an improvement over current CBAs against the criteria developed here.
Final report of the key comparison CCQM-K98: Pb isotope amount ratios in bronze
NASA Astrophysics Data System (ADS)
Vogl, Jochen; Yim, Yong-Hyeon; Lee, Kyoung-Seok; Goenaga-Infante, Heidi; Malinowskiy, Dmitriy; Ren, Tongxiang; Wang, Jun; Vocke, Robert D., Jr.; Murphy, Karen; Nonose, Naoko; Rienitz, Olaf; Noordmann, Janine; Näykki, Teemu; Sara-Aho, Timo; Ari, Betül; Cankur, Oktay
2014-01-01
Isotope amount ratios are proving useful in an ever increasing array of applications that range from studies unravelling transport processes, to pinpointing the provenance of specific samples, as well as trace element quantification using isotope dilution mass spectrometry (IDMS). These expanding applications encompass fields as diverse as archaeology, food chemistry, forensic science, geochemistry, medicine and metrology. However, to be effective tools, the isotope ratio data must be reliable and traceable to enable the comparability of measurement results. The importance of traceability and comparability in isotope ratio analysis has already been recognized by the Inorganic Analysis Working Group (IAWG) within the CCQM. While the requirements for isotope ratio accuracy and precision in the case of IDMS are generally quite modest, 'absolute' Pb isotope ratio measurements for geochemical applications as well as forensic provenance studies require Pb isotope ratio measurements of the highest quality. To support present and future CMCs on isotope ratio determinations, a key comparison was urgently needed and therefore initiated at the IAWG meeting in Paris in April 2011. The analytical task within such a comparison was decided to be the measurement of Pb isotope amount ratios in water and bronze. Measuring Pb isotope amount ratios in an aqueous Pb solution tested the ability of analysts to correct for any instrumental effects on the measured ratios, while the measurement of Pb isotope amount ratios in a metal matrix sample provided a real world test of the whole chemical and instrumental procedure. A suitable bronze material with a Pb mass fraction between 10 and 100 mg·kg−1 and a high purity solution of Pb with a mass fraction of approximately 100 mg·kg−1 were available at the pilot laboratory (BAM), both offering a natural-like Pb isotopic composition.
The mandatory measurands, the isotope amount ratios n(206Pb)/n(204Pb), n(207Pb)/n(204Pb) and n(208Pb)/n(204Pb), were selected such that they correspond with those commonly reported in Pb isotopic studies and fully describe the isotopic composition of Pb in the sample. Additionally, the isotope amount ratio n(208Pb)/n(206Pb) was added, as this isotope ratio is typically measured when performing Pb quantitation by IDMS involving a 206Pb spike. Each participant was free to use any method they deemed suitable for measuring the individual isotope ratios. However, the majority of the results were obtained using multi-collector ICP-MS or TIMS. The key requirements for all analytical procedures were a traceability statement for all results and the establishment of an uncertainty budget meeting a target uncertainty for all ratios of 0.2 %, relative (k=1). Additionally, the use of a Pb-matrix separation procedure was encouraged. The overall result obtained was excellent, demonstrating that the individual results reported by the NMIs/DIs were comparable and compatible for the determination of Pb isotope ratios. MC-ICP-MS and MC-TIMS data were consistent with each other and agree to within 0.05 %. The corresponding uncertainties can be considered realistic and mainly range from 0.02 % to 0.08 % (k=1). As stated above, isotope ratios are being increasingly used in different fields. Despite the availability and ease of use of new mass spectrometers, the metrology of unbiased isotope ratio measurements remains very challenging. Therefore, further comparisons are urgently needed, and should be designed to also engage scientists outside the NMI/DI community. Possible follow-up studies should focus on isotope ratio and delta measurements important for environmental and technical applications (e.g. B), food traceability and forensics (e.g. H, C, N, O, S and 87Sr/86Sr) or climate change issues (e.g. Li, B, Mg, Ca, Si).
The final report has been peer-reviewed and approved for publication by the CCQM.
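Correcting measured ratios for instrumental mass bias, mentioned above as the key challenge for the aqueous sample, is commonly done with the exponential law. The sketch below is a generic illustration, not the procedure of any CCQM-K98 participant; the measured ratio and fractionation exponent are invented values.

```python
# Exponential-law mass-bias correction, as commonly applied in
# MC-ICP-MS Pb isotope ratio work. Values are illustrative only.
import math

def mass_bias_corrected(r_measured, m_num, m_den, f):
    """R_true = R_meas * (m_num / m_den) ** f  (exponential law)."""
    return r_measured * (m_num / m_den) ** f

# Hypothetical measured 208Pb/206Pb ratio with a fractionation
# exponent f derived from a bracketing isotopic standard:
r = mass_bias_corrected(2.165, 207.9767, 205.9745, f=-1.8)
print(round(r, 4))
```

In practice f is determined from a certified reference (e.g. an isotopic standard measured under the same conditions), which is what ties the corrected ratio back to a traceable value.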
40 CFR Appendix H to Part 75... - [Reserved
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 17 2013-07-01 2013-07-01 false [Reserved] H Appendix H to Part 75-Revised Traceability Protocol No. 1 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Appendix H to Part 75—Revised Traceability...
EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards
In 1997, the U.S. Environmental Protection Agency (EPA) in Research Triangle Park, North Carolina, revised its 1993 version of its traceability protocol for the assay and certification of compressed gas and permeation-device calibration standards. The protocol allows producers o...
40 CFR Appendix H to Part 75... - [Reserved
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 16 2010-07-01 2010-07-01 false [Reserved] H Appendix H to Part 75-Revised Traceability Protocol No. 1 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Appendix H to Part 75—Revised Traceability...
Automatic summary generating technology of vegetable traceability for information sharing
NASA Astrophysics Data System (ADS)
Zhenxuan, Zhang; Minjing, Peng
2017-06-01
To address the excessive data entry and consequent high data-collection costs that vegetable traceability imposes on farmers, an automatic summary generating technology for sharing vegetable traceability information is proposed. The technology offers farmers an effective way to share real-time vegetable planting information on social networking platforms, strengthening their brands and attracting more customers. In this research, the factors influencing vegetable traceability for customers were analysed to establish sub-indicators and target indicators, and a computing model was proposed based on the collected parameter values of the planted vegetables and statutory food-safety standards. The proposed model involves five steps: accessing the database, establishing target indicators, establishing sub-indicators, establishing a standard reference model, and computing indicator scores. As the standards underlying food safety and traceability systems are established and refined, the proposed technology could be adopted by more and more farmers and customers.
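The five-step computing model is only outlined in the abstract. One plausible reading of the "computing scores of indicators" step is a weighted aggregation of sub-indicator scores against legal limits; everything below (function names, weights, limits, the linear scoring rule) is a hypothetical sketch, not the authors' actual model.

```python
# Hypothetical weighted-indicator scoring: each sub-indicator compares a
# measured parameter against its legal limit, and the target indicator is
# a weighted sum of the sub-indicator scores.

def score_sub_indicator(value, limit):
    """1.0 at or below the legal limit, falling linearly to 0.0 at twice it."""
    return max(0.0, min(1.0, 2.0 - value / limit))

def target_score(measurements, weights):
    """Weighted sum of sub-indicator scores (weights sum to 1)."""
    return sum(w * score_sub_indicator(v, lim)
               for (v, lim), w in zip(measurements, weights))

# (measured value, legal limit) pairs, e.g. pesticide residue,
# nitrate content, heavy-metal content -- illustrative numbers:
readings = [(0.02, 0.05), (180.0, 250.0), (0.12, 0.10)]
print(round(target_score(readings, [0.5, 0.3, 0.2]), 2))
```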
A false single nucleotide polymorphism generated by gene duplication compromises meat traceability.
Sanz, Arianne; Ordovás, Laura; Zaragoza, Pilar; Sanz, Albina; de Blas, Ignacio; Rodellar, Clementina
2012-07-01
Controlling meat traceability using SNPs is an effective method of ensuring food safety. We have analyzed several SNPs to create a panel for bovine genetic identification and traceability studies. One of these was the transversion g.329C>T (GenBank accession no. AJ496781) on the cytochrome P450 17A1 gene, which has been included in previously published panels. Using minisequencing reactions, we have tested 701 samples belonging to eight Spanish cattle breeds. Surprisingly, an excess of heterozygotes was detected, implying an extreme departure from Hardy-Weinberg equilibrium (P<0.001). By alignment analysis and sequencing, we determined that the g.329C>T SNP is a false positive polymorphism, which explains the inflated heterozygosity. We recommend that this ambiguous SNP, as well as other polymorphisms located in this region, should not be used in identification, traceability or disease association studies. Annotation of these false SNPs should improve association studies and avoid misinterpretations. Copyright © 2012 Elsevier Ltd. All rights reserved.
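The heterozygote excess reported above is detectable with a standard Hardy-Weinberg chi-square test. A minimal sketch follows; the genotype counts are illustrative, not the study's actual data.

```python
# Chi-square test for departure from Hardy-Weinberg equilibrium,
# showing how an excess of heterozygotes at a SNP stands out.
# The genotype counts below are invented for illustration.

def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Return (chi2, expected genotype counts) for a biallelic locus."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # allele frequency of A
    q = 1 - p
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return chi2, expected

# A locus where almost every individual is heterozygous, as expected when
# two duplicated gene copies are genotyped as a single SNP:
chi2, expected = hwe_chi_square(5, 690, 6)
print(round(chi2, 1))  # chi2 far above 3.84 (1 d.f.), so P << 0.05
```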
NASA Technical Reports Server (NTRS)
Plante, Jeannete
2010-01-01
GEIA-STD-0005-1 defines the objectives of, and requirements for, documenting processes that assure customers and regulatory agencies that AHP electronic systems containing lead-free solder, piece parts, and boards will satisfy the applicable requirements for performance, reliability, airworthiness, safety, and certifiability throughout the specified life of performance. It communicates requirements for a Lead-Free Control Plan (LFCP) to assist suppliers in the development of their own Plans. The Plan documents the Plan Owner's (supplier's) processes that assure their customer and all other stakeholders that the Plan Owner's products will continue to meet their requirements. The presentation reviews quality assurance requirements traceability and LFCP template instructions.
NASA Astrophysics Data System (ADS)
Wysocka, Irena; Vassileva, Emilia
2017-02-01
An analytical procedure for the determination of fourteen rare earth elements (REEs) in seawater samples has been developed and validated. The elements (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Lu) at ultra-trace level were measured by high resolution sector field inductively coupled plasma mass spectrometry (HR ICP-SFMS) after off-line analyte pre-concentration and matrix separation. The sample pre-treatment was carried out by the commercially available automated system seaFAST-pico™, a low-pressure ion chromatography technique based on solid phase extraction principles. Efficient elimination of the seawater matrix and up to 50-fold pre-concentration of REEs enabled their accurate and precise quantification at ng L−1 level. A validation approach in line with the requirements of the ISO/IEC 17025 standard and Eurachem guidelines was followed. With this in mind, selectivity, working range, linearity, recovery (from 92% to 102%), repeatability (1%-4%), intermediate precision (2%-6%), and limits of detection (0.001-0.08 ng L−1) were systematically assessed. The total uncertainty associated with each result was estimated and the main sources of uncertainty identified. All major contributions to the combined uncertainty of the obtained results were propagated together, following the ISO/GUM guidelines. The relative expanded uncertainty was estimated to range from 10.4% to 11.6% (k = 2). Demonstration of the traceability of measurement results was also presented. Due to the low limits of detection, this method enables the determination of ultra-low levels of REEs in the open seawater as well as small variations in their concentrations. The potential of the proposed analytical procedure, based on the combination of seaFAST-pico™ for sample preparation and HR ICP-SFMS, was demonstrated by direct analysis of seawater from different regions of the world.
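The ISO/GUM propagation described above combines the individual relative standard uncertainties in quadrature and multiplies by a coverage factor. A minimal sketch, with invented component values rather than the paper's actual uncertainty budget:

```python
# Combining relative standard uncertainties in quadrature and expanding
# with coverage factor k = 2, in the spirit of the ISO/GUM approach.
# The component values are illustrative placeholders, not the paper's budget.
import math

def expanded_relative_uncertainty(components_percent, k=2):
    """Root-sum-of-squares combination of relative uncertainties, times k."""
    u_c = math.sqrt(sum(u ** 2 for u in components_percent))
    return k * u_c

# e.g. repeatability, calibration, recovery, blank correction (% relative):
U = expanded_relative_uncertainty([3.0, 3.5, 2.0, 1.5])
print(round(U, 1))  # expanded relative uncertainty in %, k = 2
```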
NASA Astrophysics Data System (ADS)
Vargas Pereira, Thaiane; Beatrici, Anderson
2018-03-01
Some of the most sensitive weighing instruments available today have a repeatability close to a tenth of a microgram. OIML characterizes mass standards only down to 1 mg, so below this range there is no direct traceability to the kilogram prototype. ASTM, however, characterizes mass standards of 50, 100, 200 and 500 micrograms. This work aims to provide traceability for mass measurements at the microgram scale (nanonewton scale in force) through the fabrication and calibration of a collection of standard weights. So far, two materials, tungsten and MetGlass2705M (MetGlass), have been studied, and 12 mass standards have been produced.
High-resolution interferometric microscope for traceable dimensional nanometrology in Brazil
NASA Astrophysics Data System (ADS)
Malinovski, I.; França, R. S.; Lima, M. S.; Bessa, M. S.; Silva, C. R.; Couceiro, I. B.
2016-07-01
A double color interferometric microscope has been developed for step height standards nanometrology traceable to the meter definition via primary wavelength laser standards. The setup is based on two stabilized lasers and provides traceable measurements of the highest possible resolution, down to the physical limits of the optical instruments, in the sub-nanometer to micrometer range of heights. The wavelength reference is a He-Ne 633 nm stabilized laser; the secondary source is a blue-green 488 nm grating laser diode. The accurate fringe fraction is measured by a modulated phase-shift technique combined with imaging interferometry and Fourier processing. Self-calibrating methods have been developed to correct systematic interferometric errors.
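Phase-shift techniques of the kind mentioned above recover the fringe phase from several intensity frames taken at known phase offsets. The four-step algorithm below is a standard textbook variant shown as a generic sketch, not the instrument's actual processing chain.

```python
# Four-step phase-shifting algorithm: intensities at phase shifts
# 0, pi/2, pi, 3pi/2 yield the wrapped fringe phase, and hence height.
# This is a generic textbook sketch, not this instrument's pipeline.
import math

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four intensity frames: atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

def height_nm(phase, wavelength_nm=633.0):
    """Reflection geometry: a 2*pi phase change corresponds to lambda/2."""
    return phase * wavelength_nm / (4 * math.pi)

# Simulated fringe with known phase 0.8 rad (A = background, B = contrast):
A, B, phi = 2.0, 1.0, 0.8
frames = [A + B * math.cos(phi + d) for d in (0, math.pi/2, math.pi, 3*math.pi/2)]
print(round(four_step_phase(*frames), 3))  # recovers 0.8
```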
Metrological AFMs and its application for versatile nano-dimensional metrology tasks
NASA Astrophysics Data System (ADS)
Dai, Gaoliang; Dziomba, T.; Pohlenz, F.; Danzebrink, H.-U.; Koenders, L.
2010-08-01
Traceable calibration of various micro- and nanomeasurement devices is a crucial task for ensuring reliable measurements in micro- and nanotechnology. Today, metrological AFMs are widely used for traceable calibration of nanodimensional standards. In this paper, we introduce the development of metrological atomic force microscopes at PTB. Of the three metrological AFMs described here, one is capable of measuring in a volume of 25 mm x 25 mm x 5 mm. All instruments feature interferometers, and the three-dimensional position measurements are thus directly traceable to the metre definition. Calibration examples on, for instance, flatness standards, step height standards, and one- and two-dimensional gratings are demonstrated.
40 CFR 68.77 - Pre-startup review.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.77 Pre-startup review. (a) The... stationary sources when the modification is significant enough to require a change in the process safety... substances to a process: (1) Construction and equipment is in accordance with design specifications; (2...
Metrological traceability of holmium oxide solution
NASA Astrophysics Data System (ADS)
Gonçalves, D. E. F.; Gomes, J. F. S.; Alvarenga, A. P. D.; Borges, P. P.; Araujo, T. O.
2018-03-01
Holmium oxide solution was prepared as a candidate certified reference material for spectrophotometer wavelength scale calibration. Presented here are the necessary steps for evaluating the uncertainty and establishing the metrological traceability for the production of this material. Preliminary results from the first produced batch are shown.
Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Neal, Ken; Stuart-Williams, Hilary; Hope, Janet; Walker, G Stewart; Roux, Claude
2010-01-01
Comparability of data over time and between laboratories is a key issue for consideration in the development of global databases, and more broadly for quality assurance in general. One mechanism that can be utilized for evaluating traceability is an inter-laboratory trial. This paper addresses an inter-laboratory trial conducted across a number of Australian and New Zealand isotope ratio mass spectrometry (IRMS) laboratories. The main objective of this trial was to determine whether IRMS laboratories in these countries would record comparable values for the distributed samples. Four carbon containing and four nitrogen containing compounds were distributed to seven laboratories in Australia and one in New Zealand. The laboratories were requested to analyze the samples using their standard procedures. The data from each laboratory was evaluated collectively using International Standard ISO 13528 (Statistical methods for use in proficiency testing by inter-laboratory comparisons). "Warning signals" were raised against one participant in this trial. "Action signals" requiring corrective action were raised against four participants. These participants reviewed the data and possible sources for the discrepancies. This inter-laboratory trial was successful in providing an initial snapshot of the potential for traceability between the participating laboratories. The statistical methods described in this article could be used as a model for others needing to evaluate stable isotope results derived from multiple laboratories, e.g., inter-laboratory trials/proficiency testing. Ongoing trials will be conducted to improve traceability across the Australian and New Zealand IRMS community.
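ISO 13528 evaluates each laboratory's result with performance scores; in the common z-score form, 2 < |z| < 3 raises a warning signal and |z| >= 3 an action signal, matching the terminology above. A minimal sketch with invented per-laboratory values (the assigned value is taken as the median of participants' results, one of the options the standard allows):

```python
# z-scores in the ISO 13528 style: z_i = (x_i - x_pt) / sigma_pt,
# with the participant median as the assigned value x_pt.
# The delta-13C values and sigma_pt below are invented for illustration.
import statistics

def robust_z_scores(results, sigma_pt):
    """z-score for each result relative to the median assigned value."""
    x_pt = statistics.median(results)
    return [(x - x_pt) / sigma_pt for x in results]

results = [-26.1, -26.0, -26.2, -25.9, -26.8, -24.9]  # per-lab means (permil)
z = robust_z_scores(results, sigma_pt=0.3)
signals = ["action" if abs(v) >= 3 else "warning" if abs(v) > 2 else "ok"
           for v in z]
print(signals)
```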
NASA Astrophysics Data System (ADS)
Jones, Christopher W.; O’Connor, Daniel
2018-07-01
Dimensional surface metrology is required to enable advanced manufacturing process control for products such as large-area electronics, microfluidic structures, and light management films, where performance is determined by micrometre-scale geometry or roughness formed over metre-scale substrates. While able to perform 100% inspection at a low cost, commonly used 2D machine vision systems are insufficient to assess all of the functionally relevant critical dimensions in such 3D products on their own. While current high-resolution 3D metrology systems are able to assess these critical dimensions, they have a relatively small field of view and are thus much too slow to keep up with full production speeds. A hybrid 2D/3D inspection concept is demonstrated, combining a small field of view, high-performance 3D topography-measuring instrument with a large field of view, high-throughput 2D machine vision system. In this concept, the location of critical dimensions and defects are first registered using the 2D system, then smart routing algorithms and high dynamic range (HDR) measurement strategies are used to efficiently acquire local topography using the 3D sensor. A motion control platform with a traceable position referencing system is used to recreate various sheet-to-sheet and roll-to-roll inline metrology scenarios. We present the artefacts and procedures used to calibrate this hybrid sensor system for traceable dimensional measurement, as well as exemplar measurement of optically challenging industrial test structures.
Meier, Anja; Mehrle, Stefan; Weiss, Thomas S; Mier, Walter; Urban, Stephan
2013-07-01
Chronic infection with the human hepatitis B virus (HBV) is a global health problem and a main cause of progressive liver diseases. HBV exhibits a narrow host range, replicating primarily in hepatocytes. Both host and hepatocyte specificity presumably involve specific receptor interactions on the target cell; however, direct evidence for this hypothesis is missing. Following the observation that HBV entry is specifically blocked by L-protein-derived preS1-lipopeptides, we visualized specific HBV receptor/ligand complexes on hepatic cells and quantified the turnover kinetics. Using fluorescein isothiocyanate-labeled, myristoylated HBV preS1-peptides we demonstrate (1) the presence of a highly specific HBV receptor on the plasma membrane of HBV-susceptible primary human and tupaia hepatocytes and HepaRG cells but also on hepatocytes from the nonsusceptible species mouse, rat, rabbit and dog; (2) the requirement of a differentiated state of the hepatocyte for specific preS1-binding; (3) the lack of detectable amounts of the receptor on HepG2 and HuH7 cells; (4) a slow receptor turnover at the hepatocyte membrane; and (5) an association of the receptor with actin microfilaments. The presence of the preS1-receptor in primary hepatocytes from some non-HBV-susceptible species indicates that the lack of susceptibility of these cells is due to a postbinding step. These findings suggest that HBV hepatotropism is mediated by the highly selective expression of a yet unknown receptor on differentiated hepatocytes, while species specificity of the HBV infection requires selective downstream events, e.g., the presence of host dependency or the absence of host restriction factors. The criteria defined here will allow narrowing down reasonable receptor candidates and provide a binding assay for HBV-receptor expression screens in hepatic cells. Copyright © 2012 American Association for the Study of Liver Diseases.
TWO NEW GAS STANDARDS PROGRAMS AT THE NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
The EPA/NIST certified reference materials (CRM) program is being terminated and replaced with two new ones: the NIST Traceable Reference Materials (NTRM) and the Research Gas Mixture (RGM) programs. These new programs are being implemented to provide NIST traceability to a wider ...
Determination of NIST-Traceable Quantitative Weight Percentage Purity for G Agent Standards
Nuclear magnetic resonance (NMR) with phosphorus-31 detection is described to determine the weight percent purity of feedstock samples of agents GA...and GD in a way that is National Institute of Standards and Technology (NIST)-traceable. A Precision and Accuracy test is described.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
... Traceability; Tribal Nations Using Systems for Location Identification AGENCY: Animal and Plant Health... using systems for location identification for the animal disease traceability framework and to request....aphis.usda.gov ). FOR FURTHER INFORMATION CONTACT: For information on Tribal Nations using location...
NASA Astrophysics Data System (ADS)
Besse, S.; Benkhoff, J.; Bentley, M.; Cornet, T.; Moissl, R.; Munoz, C.; Zender, J.
2018-05-01
The BepiColombo Science Ground Segment is developing, in collaboration with the instrument teams, a targeted science traceability matrix for each instrument. These matrices are defined in such a way that they can be tracked during the observation lifecycle.
In 1997, the U.S. Environmental Protection Agency (EPA) in Research Triangle Park, North Carolina, revised the 1993 version of its traceability protocol for the assay and certification of compressed gas and permeation-device calibration standards. The protocol allows producers of...
Methodologies for Salmonella enterica subsp. enterica Subtyping: Gold Standards and Alternatives▿
Wattiau, Pierre; Boland, Cécile; Bertrand, Sophie
2011-01-01
For more than 80 years, subtyping of Salmonella enterica has been routinely performed by serotyping, a method in which surface antigens are identified based on agglutination reactions with specific antibodies. The serotyping scheme, which is continuously updated as new serovars are discovered, has generated over time a data set of the utmost significance, allowing long-term epidemiological surveillance of Salmonella in the food chain and in public health control. Conceptually, serotyping provides no information regarding the phyletic relationships inside the different Salmonella enterica subspecies. In epidemiological investigations, identification and tracking of salmonellosis outbreaks require the use of methods that can fingerprint the causative strains at a taxonomic level far more specific than the one achieved by serotyping. During the last 2 decades, alternative methods that could successfully identify the serovar of a given strain by probing its DNA have emerged, and molecular biology-based methods have been made available to address phylogeny and fingerprinting issues. At the same time, accredited diagnostics have become increasingly generalized, imposing stringent methodological requirements in terms of traceability and measurability. In these new contexts, the hand-crafted character of classical serotyping is being challenged, although it is widely accepted that classification into serovars should be maintained. This review summarizes and discusses modern typing methods, with a particular focus on those having potential as alternatives for classical serotyping or for subtyping Salmonella strains at a deeper level. PMID:21856826
Potential Fuel Savings of Specific ATC System Improvements.
1982-02-01
to today’s fuel/cost conscious airspace user. To the extent they are needed to resolve actual conflicts between aircraft competing for the use of...evolved over the years. They are rarely, if ever, traceable to an excessive number of aircraft competing for the same airspace, based on real-time...there were 24 northbound arrivals via RIC, while there were 71 potentially competing southbound overflights via J14. However, the most popular cruise
Ares I-X Launch Vehicle Modal Test Overview
NASA Technical Reports Server (NTRS)
Buehrle, Ralph D.; Bartolotta, Paul A.; Templeton, Justin D.; Reaves, Mercedes C.; Horta, Lucas G.; Gaspar, James L.; Parks, Russell A.; Lazor, Daniel R.
2010-01-01
The first test flight of NASA's Ares I crew launch vehicle, called Ares I-X, is scheduled for launch in 2009. Ares I-X will use a 4-segment reusable solid rocket booster from the Space Shuttle heritage with mass simulators for the 5th segment, upper stage, crew module and launch abort system. Flight test data will provide important information on ascent loads, vehicle control, separation, and first stage reentry dynamics. As part of hardware verification, a series of modal tests were designed to verify the dynamic finite element model (FEM) used in loads assessments and flight control evaluations. Based on flight control system studies, the critical modes were the first three free-free bending mode pairs. Since a test of the free-free vehicle is not practical within project constraints, modal tests for several configurations in the nominal integration flow were defined to calibrate the FEM. A traceability study by Aerospace Corporation was used to identify the critical modes for the tested configurations. Test configurations included two partial stacks and the full Ares I-X launch vehicle on the Mobile Launcher Platform. This paper provides an overview for companion papers in the Ares I-X Modal Test Session. The requirements flow down, pre-test analysis, constraints and overall test planning are described.
Gil, Maria I; Selma, Maria V; Suslow, Trevor; Jacxsens, Liesbeth; Uyttendaele, Mieke; Allende, Ana
2015-01-01
This review includes an overview of the most important preventive measures along the farm to fork chain to prevent microbial contamination of leafy greens. It also includes the technological and managerial interventions related to primary production, postharvest handling, processing practices, distribution, and consumer handling to eliminate pathogens in leafy greens. When the microbiological risk is already present, preventive measures to limit actual contamination events or pathogen survival are considered intervention strategies. In codes of practice the focus is mainly put on explaining preventive measures. However, it is also important to establish more focused intervention strategies. This review is centered mainly on leafy vegetables as the commodity identified as the highest priority in terms of fresh produce microbial safety from a global perspective. There is no unique preventive measure or intervention strategy that could be applied at one point of the food chain. We should encourage growers of leafy greens to establish procedures based on the HACCP principles at the level of primary production. The traceability of leafy vegetables along the chain is an essential element in ensuring food safety. Thus, in dealing with the food safety issues associated with fresh produce it is clear that a multidisciplinary farm to fork strategy is required.
NEVADA TEST SITE WASTE ACCEPTANCE CRITERIA
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. DEPARTMENT OF ENERGY, NATIONAL NUCLEAR SECURITY ADMINISTRATION, NEVADA SITE OFFICE
This document establishes the U. S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO) waste acceptance criteria (WAC). The WAC provides the requirements, terms, and conditions under which the Nevada Test Site will accept low-level radioactive and mixed waste for disposal. Mixed waste generated within the State of Nevada by NNSA/NSO activities is accepted for disposal. It includes requirements for the generator waste certification program, characterization, traceability, waste form, packaging, and transfer. The criteria apply to radioactive waste received at the Nevada Test Site Area 3 and Area 5 Radioactive Waste Management Site for storage or disposal.
Piva, Elisa; Tosato, Francesca; Plebani, Mario
2015-12-07
Most errors in laboratory medicine occur in the pre-analytical phase of the total testing process. Phlebotomy, a crucial step in the pre-analytical phase influencing laboratory results and patient outcome, calls for quality assurance procedures and automation in order to prevent errors and ensure patient safety. We compared the performance of a new small, automated device, the ProTube Inpeco, designed for use in phlebotomy with complete traceability of the process, with a centralized automated system, BC ROBO. ProTube was used for 15,010 patients undergoing phlebotomy with 48,776 tubes being labeled. The mean time and standard deviation (SD) for blood sampling was 3:03 (min:sec; SD ± 1:24) when using ProTube, against 5:40 (min:sec; SD ± 1:57) when using BC ROBO. The mean number of patients per hour managed at each phlebotomy point was 16 ± 3 with ProTube, and 10 ± 2 with BC ROBO. No tubes were labeled erroneously or incorrectly, even though process failures occurred in 2.8% of cases when ProTube was used. Thanks to its cutting-edge technology, the ProTube has many advantages over BC ROBO, above all in verifying patient identity, and in allowing a reduction in both identification error and tube mislabeling.
Gamalinda, Michael; Jakovljevic, Jelena; Babiano, Reyes; Talkish, Jason; de la Cruz, Jesús; Woolford, John L
2013-02-01
Ribosome synthesis involves the coordinated folding and processing of pre-rRNAs with assembly of ribosomal proteins. In eukaryotes, these events are facilitated by trans-acting factors that propel ribosome maturation from the nucleolus to the cytoplasm. However, there is a gap in understanding how ribosomal proteins configure pre-ribosomes in vivo to enable processing to occur. Here, we have examined the role of adjacent yeast r-proteins L17, L35 and L37 in folding and processing of pre-rRNAs, and binding of other proteins within assembling ribosomes. These three essential ribosomal proteins, which surround the polypeptide exit tunnel, are required for 60S subunit formation as a consequence of their role in removal of the ITS2 spacer from 27SB pre-rRNA. L17-, L35- and L37-depleted cells exhibit turnover of aberrant pre-60S assembly intermediates. Although the structure of ITS2 does not appear to be grossly affected in their absence, these three ribosomal proteins are necessary for efficient recruitment of factors required for 27SB pre-rRNA processing, namely, Nsa2 and Nog2, which associate with pre-60S ribosomal particles containing 27SB pre-rRNAs. Altogether, these data support that L17, L35 and L37 are specifically required for a recruiting step immediately preceding removal of ITS2.
NASA Technical Reports Server (NTRS)
1983-01-01
The space station mission requirements data base consists of 149 attached and free-flying missions each of which is documented by a set of three interrelated documents: (1) NASA LaRC Data Sheets - with three sheets comprising a set for each payload element described. These sheets contain user payload element data necessary to drive Space Station architectural options. (2) GDC-derived operations descriptions that supplement the LaRC payload element data in the operations areas such as further descriptions of crew involvement, EVA, etc. (3) Payload elements synthesis sheets used by GDC to provide requirements traceability to data sources and to provide a narrative describing the basis for formulating the payload element requirements.
Analyzing organic tea certification and traceability system within the Taiwanese tea industry.
Wang, Mao-Chang; Yang, Chin-Ying
2015-04-01
We applied game theory to the organic tea certification process and traceability system used by the Taiwanese tea industry to elucidate the strategic choices made by tea farmers and organic tea certification agencies. Thus, this paper clarifies how relevant variables affect the organic certification process and traceability system used within the tea industry. The findings indicate that farmers who generate high revenues experience failures regarding tea deliveries, cash outflow, damage compensation, and quasi-rent. An additional problem included the high costs yielded when tea farmers colluded with or switched organic tea certification agencies. Furthermore, there could be decreasing levels of personal interest in planting non-organic tea and lowering the costs of planting organic tea and the managerial accounting costs of building comprehensive traceability systems; thus, the analysis yielded strong results and a superior equilibrium. This research is unprecedented, using an innovative model and providing a novel analysis structure for use in the tea industry. These results contribute to the field of literature and should serve as a valuable reference for members of the tea industry, government, and academia. © 2014 Society of Chemical Industry.
[Study on brand traceability of vinegar based on near infrared spectroscopy technology].
Guan, Xiao; Liu, Jing; Gu, Fang-Qing; Yang, Yong-Jian
2014-09-01
In the present paper, 152 vinegar samples with four different brands were chosen as research targets, and their near infrared spectra were collected by diffusion reflection mode and transmission mode, respectively. Furthermore, the brand traceability models for edible vinegar were constructed. The effects of the collection mode and pretreatment methods of spectrum on the precision of traceability models were investigated intensively. The models constructed by PLS1-DA modeling method using spectrum data of 114 training samples were applied to predict 38 test samples, and R2, RMSEC and RMSEP of the model based on transmission mode data were 0.92, 0.113 and 0.127, respectively, with recognition rate of 76.32%, and those based on diffusion reflection mode data were 0.97, 0.102 and 0.119, with recognition rate of 86.84%. The results demonstrated that the near infrared spectrum combined with PLS1-DA can be used to establish the brand traceability models for edible vinegar, and diffuse reflection mode is more beneficial for predictive ability of the model.
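The PLS1-DA classification step can be illustrated with a minimal sketch on synthetic "spectra": a hand-rolled one-component PLS1 fit against a 0/1 brand indicator, thresholded at 0.5. The data, wavelength count and brand structure below are invented for illustration and stand in for the study's actual near infrared spectra and chemometrics software.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wl = 50  # hypothetical number of wavelength channels
# Two synthetic "brands" with slightly shifted spectral baselines
X_a = rng.normal(0.0, 1.0, (60, n_wl)) + np.linspace(0, 1, n_wl)
X_b = rng.normal(0.0, 1.0, (60, n_wl)) - np.linspace(0, 1, n_wl)
X = np.vstack([X_a, X_b])
y = np.array([0.0] * 60 + [1.0] * 60)  # class indicator for PLS1-DA

# One-component PLS1: weight vector maximizing covariance with the label
Xc = X - X.mean(axis=0)
yc = y - y.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                    # scores along the PLS direction
b = (t @ yc) / (t @ t)        # regression coefficient on the score
pred = ((t * b + y.mean()) > 0.5).astype(int)
recognition_rate = (pred == y).mean()
```

A real application would use more latent components, separate training and test sets (as the 114/38 split in the study), and a validated chemometrics package; this sketch only shows the discriminant idea.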
Beltrán, María; Sánchez-Astudillo, María; Aparicio, Ramón; García-González, Diego L
2015-02-15
The geographical traceability of virgin olive oil can be controlled by chemical species that are linked to the production area. Trace elements are among these species. The hypothesis is that the transfer of elements from the soil to the oil is subjected to minor variations and therefore this chemical information can be used for geographical traceability. In order to confirm this hypothesis, the trace elements of virgin olive oils from south-western Spain were analysed, and the same elements were determined in the corresponding olive-pomaces and soils. The differences in the concentration were studied according to cultivars and locations. Results show some coincidences in the selection of elements in soils (W, Fe, Na), olive-pomace (W, Fe, Na, Mg, Mn, Ca, Ba, Li) and olive oils (W, Fe, Mg, Mn, Ca, Ba, Li, Bi), which supports their utility in traceability. In the case of olive oils, 93% of the samples were correctly classified in their geographical origins (96% for Beas, 77% for Gibraleón, 91% for Niebla, and 100% for Sanlúcar de Guadiana). Copyright © 2014 Elsevier Ltd. All rights reserved.
Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola
2016-11-01
The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. Quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, the accuracy and the precision of existing methodologies. Within the frame of a joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L⁻¹ as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L⁻¹ as cation). The LOQ of the methodology was 0.06 ng L⁻¹ and the average measurement uncertainty at the LOQ was 36%, which agreed with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainties, as well as the interlaboratory comparison results are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.
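The quantification principle behind isotope dilution can be sketched with the generic single-IDMS equation; this is a simplification of the species-specific double-IDMS procedure the project actually used, and all numbers below are illustrative. Here R denotes the amount ratio of the reference isotope to the spike isotope, and amounts are expressed via the spike isotope.

```python
def idms_amount(n_spike: float, r_spike: float, r_sample: float, r_blend: float) -> float:
    """Amount of analyte in the sample (same unit as n_spike), from the
    standard isotope dilution mass balance:
        n_sample = n_spike * (R_spike - R_blend) / (R_blend - R_sample)
    """
    return n_spike * (r_spike - r_blend) / (r_blend - r_sample)

# Illustrative numbers: an enriched spike (R = 0.01), a natural-composition
# sample (R = 10), and a measured blend ratio of 1.
n_analyte = idms_amount(1.0, 0.01, 10.0, 1.0)  # = 0.11
```

Because only isotope ratios are measured in the blend, the result is insensitive to analyte losses after spike equilibration, which is what makes IDMS attractive as a primary method at sub-ng L⁻¹ levels.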
New NIST Photomask Linewidth Standard
NASA Astrophysics Data System (ADS)
Potzick, James E.; Pedulla, J. Marc; Stocker, Michael T.
2002-12-01
NIST is preparing to issue the next generation in its line of binary photomask linewidth standards. Called SRM 2059, it was developed for calibrating microscopes used to measure linewidths on photomasks, and consists of antireflecting chrome line and space patterns on a 6 inch quartz substrate (6 × 6 × 0.25 inches, or 15.2 × 15.2 × 0.635 cm). Certified line- and space-widths range from nominal 0.250 μm to 32 μm, and pitches from 0.5 μm to 250 μm, and are traceable to the definition of the meter. NIST's reference value, the definition of the meter, is well defined and unconditionally stable. Any replacement or duplicate NIST linewidth standard will be traceable to this same reference, and thus traceable to any other NIST length standard. Such measurement traceability can be achieved only by evaluating the measurement uncertainty (not just the repeatability) of each length comparison in the metrology chain between the definition of the meter and the NIST linewidth standard. This process results in a confidence interval about the calibration result that has a 95% probability of containing the true value. While the meter (and the μm) are well-defined, the geometrical width of a chrome line with nonrectangular cross section is not, and so the "true value" linewidth must be carefully defined to best meet users' needs. The paper and presentation will describe how these mask features are measured at NIST and how their measurement traceability is accomplished.
NASA Astrophysics Data System (ADS)
Hernández Forero, Liz Catherine; Bahamón Cortés, Nelson
2017-06-01
Around the world, there are many providers of time signals (mobile, radio or television operators, satellites of the GPS network, astronomical measurements, etc.); however, the source of legal time for a country is either the national metrology institute or another designated laboratory. This activity requires a time standard based on an atomic time scale. The International Bureau of Weights and Measures (BIPM) calculates a weighted average of the time kept in more than 60 nations and produces a single international time scale, called Coordinated Universal Time (UTC). This article presents the current time scale that generates Legal Time for the Republic of Colombia produced by the Instituto Nacional de Metrología (INM) using the time and frequency national standard, a cesium atomic oscillator. It also illustrates how important it is for the academic, scientific and industrial communities, as well as the general public, to be synchronized with this time scale, which is traceable to the International System (SI) of units, through international comparisons that are made in real time.
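The weighted-average idea behind such an ensemble time scale can be sketched as an inverse-variance weighted mean of clock offsets. The numbers and the weighting scheme below are illustrative only; BIPM's actual ALGOS algorithm for UTC uses a more elaborate, capped weighting based on long-term clock performance.

```python
def ensemble_time(offsets_ns, variances_ns2):
    """Inverse-variance weighted mean of clock offsets (ns) from a
    common reference: better-performing clocks get larger weights."""
    weights = [1.0 / v for v in variances_ns2]
    total = sum(weights)
    return sum(w * x for w, x in zip(weights, offsets_ns)) / total

# Three hypothetical clocks reporting offsets from a common reference;
# the steadiest clock (variance 1.0) pulls the ensemble toward itself.
avg = ensemble_time([12.0, 15.0, 9.0], [1.0, 4.0, 2.0])  # ≈ 11.57 ns
```

The ensemble is more stable than any single member, which is why a national laboratory steers its realized time scale against such an average rather than against one clock.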
EMPRESS: A European Project to Enhance Process Control Through Improved Temperature Measurement
NASA Astrophysics Data System (ADS)
Pearce, J. V.; Edler, F.; Elliott, C. J.; Rosso, L.; Sutton, G.; Andreu, A.; Machin, G.
2017-08-01
A new European project called EMPRESS, funded by the EURAMET program `European Metrology Program for Innovation and Research,' is described. The three-year project, which started in the summer of 2015, is intended to substantially augment the efficiency of high-value manufacturing processes by improving temperature measurement techniques at the point of use. The project consortium has 18 partners and 5 external collaborators, from the metrology sector, high-value manufacturing, sensor manufacturing, and academia. Accurate control of temperature is key to ensuring process efficiency and product consistency and is often not achieved to the level required for modern processes. Enhanced efficiency of processes may take several forms including reduced product rejection/waste; improved energy efficiency; increased intervals between sensor recalibration/maintenance; and increased sensor reliability, i.e., reduced amount of operator intervention. Traceability of temperature measurements to the International Temperature Scale of 1990 (ITS-90) is a critical factor in establishing low measurement uncertainty and reproducible, consistent process control. Introducing such traceability in situ (i.e., within the industrial process) is a theme running through this project.
Defining the measurand in radius of curvature measurements
NASA Astrophysics Data System (ADS)
Davies, Angela; Schmitz, Tony L.
2003-11-01
Traceable radius of curvature measurements are critical for precision optics manufacture. An optical bench measurement of radius is very repeatable and is the preferred method for low-uncertainty applications. On an optical bench, the displacement of the optic is measured as it is moved between the cat's eye and confocal positions, each identified using a figure measuring interferometer. Traceability requires connection to a basic unit (the meter, here) in addition to a defensible uncertainty analysis, and the identification and proper propagation of all uncertainty sources in this measurement is challenging. Recent work has focused on identifying all uncertainty contributions; measurement biases have been approximately taken into account and uncertainties combined in an RSS sense for a final measurement estimate and uncertainty. In this paper we report on a new mathematical definition of the radius measurand, which is a single function that depends on all uncertainty sources, such as error motions, alignment uncertainty, displacement gauge uncertainty, etc. The method is based on a homogeneous transformation matrix (HTM) formalism, and intrinsically defines an unbiased estimate for radius, providing a single mathematical expression for uncertainty propagation through a Taylor-series expansion.
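The conventional estimate that the paper improves upon can be sketched simply: the radius is the bench displacement between the cat's eye and confocal null positions, and the independent uncertainty contributions are combined in quadrature (RSS). This is the very combination whose approximate bias handling motivates the paper's HTM-based measurand definition; all values below are invented for illustration.

```python
import math

def radius_and_uncertainty(z_cats_eye_mm, z_confocal_mm, u_sources_mm):
    """Radius (mm) as the displacement between the two null positions,
    with a combined standard uncertainty from independent contributions
    (displacement gauge, alignment, error motions, ...) added in RSS."""
    radius = abs(z_confocal_mm - z_cats_eye_mm)
    u_combined = math.sqrt(sum(u ** 2 for u in u_sources_mm))
    return radius, u_combined

# Hypothetical bench measurement: 250.012 mm of stage travel, with three
# illustrative uncertainty contributions in mm.
r, u = radius_and_uncertainty(0.0, 250.012, [0.0005, 0.0008, 0.0003])
U95 = 2 * u  # expanded uncertainty with coverage factor k = 2 (~95 % level)
```

The RSS combination is only valid when the contributions are independent and unbiased; the HTM formalism described in the paper instead folds all error sources into one measurand function so that biases propagate correctly.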
NASA Astrophysics Data System (ADS)
Bell, S. A.; Miao, P.; Carroll, P. A.
2018-04-01
Evolved vapor coulometry is a measurement technique that selectively detects water and is used to measure water content of materials. The basis of the measurement is the quantitative electrolysis of evaporated water entrained in a carrier gas stream. Although this measurement has a fundamental principle—based on Faraday's law which directly relates electrolysis current to amount of substance electrolyzed—in practice it requires calibration. Commonly, reference materials of known water content are used, but the variety of these is limited, and they are not always available for suitable values, materials, with SI traceability, or with well-characterized uncertainty. In this paper, we report development of an alternative calibration approach using as a reference the water content of humid gas of defined dew point traceable to the SI via national humidity standards. The increased information available through this new type of calibration reveals a variation of the instrument performance across its range not visible using the conventional approach. The significance of this is discussed along with details of the calibration technique, example results, and an uncertainty evaluation.
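The Faraday's-law step can be sketched directly. The constants are standard physical values and the two-electron stoichiometry is the usual assumption for water electrolysis, not figures taken from the paper; in practice, as the paper notes, the instrument still needs calibration because collection and electrolysis are not perfectly quantitative.

```python
FARADAY = 96485.332      # C/mol, Faraday constant
M_WATER = 18.015         # g/mol, molar mass of water
ELECTRONS_PER_H2O = 2    # electrons transferred per water molecule electrolyzed

def water_mass_from_charge(charge_coulombs: float) -> float:
    """Mass of water (g) corresponding to a given integrated
    electrolysis charge (C), via Faraday's law."""
    moles = charge_coulombs / (ELECTRONS_PER_H2O * FARADAY)
    return moles * M_WATER

# Example: 10 C of integrated current corresponds to roughly 0.93 mg of water.
mass_g = water_mass_from_charge(10.0)
```

Comparing this ideal value with the response to a humid gas stream of known, SI-traceable dew point is exactly the kind of calibration check the paper develops.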
Effective Materials Property Information Management for the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Cebon, David; Arnold, Steve
2010-01-01
This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations and government agencies. In part these are fuelled by the demands for higher efficiency in material testing, product design and development and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs.
The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc; (iv) tools for searching, reporting and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as the future challenges and opportunities such as automated error checking, automated data quality assessment and characterization, identification of gaps in data, as well as functionalities and business models to keep users returning to the source: to generate user demand to fuel database growth and maintenance.
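The pedigree-traceability requirement described above can be sketched as a minimal, hypothetical record structure: every property value carries metadata tying it back to its source, and revisions are versioned rather than overwritten. All field names and example values here are invented for illustration, not taken from any specific product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: records are immutable once stored
class Pedigree:
    source: str          # testing program, publication, or supplier
    test_standard: str   # test method the value was measured under
    version: int         # version-controlled revision number

@dataclass(frozen=True)
class PropertyRecord:
    material: str
    property_name: str
    value: float
    unit: str
    pedigree: Pedigree   # every value stays traceable to its origin

# Illustrative entry: a tensile property with its pedigree attached.
record = PropertyRecord(
    material="Ti-6Al-4V",
    property_name="yield strength",
    value=880.0,
    unit="MPa",
    pedigree=Pedigree(source="internal test program",
                      test_standard="ASTM E8", version=1),
)
```

Making records immutable and versioned is one simple way to satisfy the "single gold source" requirement: corrections create a new version, so downstream CAE analyses can always cite the exact revision they consumed.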
Omar, Jone; Slowikowski, Boleslaw; Boix, Ana; von Holst, Christoph
2017-08-01
Feed additives need to be authorised to be placed on the market according to Regulation (EU) No. 1831/2003. Next to laying down the procedural requirements, the regulation creates the European Union Reference Laboratory for Feed Additives (EURL-FA) and requires that applicants send samples to the EURL-FA. Once authorised, the characteristics of the marketed feed additives should correspond to those deposited in the sample bank of the EURL-FA. For this purpose, the submitted samples were subjected to near-infrared (NIR) and Raman spectroscopy for spectral characterisation. These techniques have the valuable potential of characterising the feed additives in a non-destructive manner without any complicated sample preparation. This paper describes the capability of spectroscopy for a rapid characterisation of products to establish whether specific authorisation criteria are met. This study is based on the analysis of feed additive samples from different categories and functional groups, namely products containing (1) selenium, (2) zinc and manganese, (3) vitamins and (4) essential oils such as oregano and thyme oil. The use of chemometrics turned out to be crucial, especially in cases where the differentiation of spectra by visual inspection was very difficult.
Site systems engineering fiscal year 1999 multi-year work plan (MYWP) update for WBS 1.8.2.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
GRYGIEL, M.L.
1998-10-08
Manage the Site Systems Engineering process to provide a traceable, integrated, requirements-driven, and technically defensible baseline. Through the Site Integration Group (SIG), Systems Engineering ensures integration of technical activities across all site projects. Systems Engineering's primary interfaces are with the RL Project Managers, the Project Direction Office and with the Project Major Subcontractors, as well as with the Site Planning organization. Systems Implementation: (1) Develops, maintains, and controls the site integrated technical baseline, ensures the Systems Engineering interfaces between projects are documented, and maintains the Site Environmental Management Specification. (2) Develops and uses dynamic simulation models for verification of the baseline and analysis of alternatives. (3) Performs and documents functional and requirements analyses. (4) Works with projects, technology management, and the SIG to identify and resolve technical issues. (5) Supports technical baseline information for the planning and budgeting of the Accelerated Cleanup Plan, Multi-Year Work Plans, Project Baseline Summaries as well as performance measure reporting. (6) Works with projects to ensure the quality of data in the technical baseline. (7) Develops, maintains and implements the site configuration management system.
Pre-Professional Training for Serving Children with ASD: An Apprenticeship Model of Supervision
ERIC Educational Resources Information Center
Donaldson, Amy L.
2015-01-01
Children with autism spectrum disorder (ASD) often present with varied skill profiles and levels of severity making development and implementation of specialized school services challenging. Research indicates that school professionals require and desire additional ASD-specific professional development, both at the pre- and in-service levels.…
An integrative solution for managing, tracing and citing sensor-related information
NASA Astrophysics Data System (ADS)
Koppe, Roland; Gerchow, Peter; Macario, Ana; Schewe, Ingo; Rehmcke, Steven; Düde, Tobias
2017-04-01
In a data-driven scientific world, the need to capture information on sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. So we developed a web-based sensor management solution (https://sensor.awi.de) for describing platforms, devices and sensors as a hierarchy of systems which supports tracing changes to a system while hiding complexity. Each platform contains devices where each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data processing documentation), sensor output parameters and geo-location. In order to better understand and address real world requirements, we have closely interacted with field-going scientists in the context of the key national infrastructure project "FRontiers in Arctic marine Monitoring ocean observatory" (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists but also alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources) for example. Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II) and in cooperation with 52north we are providing near real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution.
ODIP II also aims to develop a harmonized SensorML profile for the marine community which we will be adopting in our solution as soon as available. In this presentation we will show our sensor management solution which is embedded in our data flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real world examples and challenges related to the description and traceability of sensor metadata.
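The platform/device/sensor hierarchy described in the abstract above can be sketched as a minimal data model. The class and field names below are illustrative assumptions for this sketch, not the actual sensor.awi.de schema:

```python
from dataclasses import dataclass, field

@dataclass
class Sensor:
    identifier: str                                # e.g. a persistent, citable handle
    output_parameters: list = field(default_factory=list)

@dataclass
class Device:
    name: str
    sensors: list = field(default_factory=list)
    events: list = field(default_factory=list)     # calibrations, deployments, ...

@dataclass
class Platform:
    name: str
    devices: list = field(default_factory=list)

    def all_sensors(self):
        """Flatten the hierarchy to enumerate every sensor on the platform."""
        return [s for d in self.devices for s in d.sensors]

# Hypothetical example: a mooring platform carrying a CTD device with two sensors
ctd = Device("CTD", sensors=[Sensor("hdl:temp-01", ["temperature"]),
                             Sensor("hdl:cond-01", ["conductivity"])])
mooring = Platform("FRAM-mooring", devices=[ctd])
print(len(mooring.all_sensors()))
```

Modeling the observatory as nested systems like this is what lets change events and calibration records attach at the level where they occur, while queries can still flatten the tree.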
Datla, Raju; Weinreb, Michael; Rice, Joseph; Johnson, B. Carol; Shirley, Eric; Cao, Changyong
2014-01-01
This paper traces the cooperative efforts of scientists at the National Oceanic and Atmospheric Administration (NOAA) and the National Institute of Standards and Technology (NIST) to improve the calibration of operational satellite sensors for remote sensing of the Earth's land, atmosphere and oceans. It gives a chronological perspective of the NOAA satellite program and the interactions between the two agencies' scientists to address pre-launch calibration and issues of sensor performance on orbit. The drive to improve accuracy of measurements has had a new impetus in recent years because of the need for improved weather prediction and climate monitoring. The highlights of this cooperation and strategies to achieve SI-traceability and improve accuracy for optical satellite sensor data are summarized. PMID:26601030
TESTING OF A 20-METER SOLAR SAIL SYSTEM
NASA Technical Reports Server (NTRS)
Gaspar, J. L.; Behun, V.; Mann, T.; Murphy, D.; Macy, B.
2005-01-01
This paper describes the structural dynamic tests conducted in-vacuum on the Scalable Square Solar Sail (S(sup 4)) System 20-meter test article developed by ATK Space Systems as part of a ground demonstrator system development program funded by NASA's In-Space Propulsion program. These tests were conducted for the purpose of validating analytical models that would be required by a flight test program to predict in-space performance. Specific tests included modal vibration tests on the solar sail system in a 1 Torr vacuum environment using various excitation locations and techniques including magnetic excitation at the sail quadrant corners, piezoelectric stack actuation at the mast roots, spreader bar excitation at the mast tips, and bi-morph piezoelectric patch actuation on the sail cords. The excitation methods were evaluated for their suitability to in-vacuum ground testing and their traceability to the development of on-orbit flight test techniques. The solar sail masts were also tested in ambient atmospheric conditions and these results are also discussed.
Identifying and Tracing User Needs
NASA Astrophysics Data System (ADS)
To, C.; Tauer, E.
2017-12-01
Providing adequate tools to the user community hinges on understanding the specific goals and needs behind the intended application of the tool. While the approach of leveraging user-supplied inputs and use cases to identify those goals is not new, there frequently remains the challenge of tracing those use cases through to implementation in an efficient and manageable fashion. Processes can become overcomplicated very quickly, and explicitly mapping progress towards the achievement of user demands can become overwhelming when hundreds of use cases are in play. This presentation will discuss a demonstrated use-case approach that achieved initial success with a tool redesign and deployment, the means to apply use cases in the generation of a roadmap for future releases over time, and the ability to include and adjust to new user requirements and suggestions with minimal disruption to traceability. It is hoped that the findings and lessons learned will help make use-case employment easier for others seeking to create user-targeted capabilities.
An Automated Thermocouple Calibration System
NASA Technical Reports Server (NTRS)
Bethea, Mark D.; Rosenthal, Bruce N.
1992-01-01
An Automated Thermocouple Calibration System (ATCS) was developed for the unattended calibration of type K thermocouples. This system operates from room temperature to 650 C and has been used for calibration of thermocouples in an eight-zone furnace system which may employ as many as 60 thermocouples simultaneously. It is highly efficient, allowing for the calibration of large numbers of thermocouples in significantly less time than required for manual calibrations. The system consists of a personal computer, a data acquisition/control unit, and a laboratory calibration furnace. The calibration furnace is a microprocessor-controlled multipurpose temperature calibrator with an accuracy of +/- 0.7 C. The accuracy of the calibration furnace is traceable to the National Institute of Standards and Technology (NIST). The computer software is menu-based to give the user flexibility and ease of use. The user needs no programming experience to operate the system. This system was specifically developed for use in the Microgravity Materials Science Laboratory (MMSL) at the NASA LeRC.
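The batch-calibration workflow the ATCS abstract describes (read each thermocouple against a traceable furnace setpoint, then derive a per-channel correction) can be sketched as follows. The linear correction model and all numeric readings here are illustrative assumptions, not the published ATCS procedure:

```python
def fit_linear_correction(setpoints, readings):
    """Least-squares fit of reading -> true temperature: T_true ~= a*T_read + b."""
    n = len(readings)
    mx = sum(readings) / n
    my = sum(setpoints) / n
    sxx = sum((x - mx) ** 2 for x in readings)
    sxy = sum((x - mx) * (y - my) for x, y in zip(readings, setpoints))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Traceable furnace setpoints (+/- 0.7 C) vs one thermocouple's raw readings
# (hypothetical data for illustration)
setpoints = [100.0, 300.0, 500.0, 650.0]
readings  = [101.2, 302.1, 503.0, 654.1]
a, b = fit_linear_correction(setpoints, readings)
corrected = a * 400.5 + b   # correct a new raw reading of 400.5
```

In an unattended system this fit would simply be repeated per channel as the data acquisition unit scans all thermocouples at each furnace plateau.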
Spatially restricted G protein-coupled receptor activity via divergent endocytic compartments.
Jean-Alphonse, Frederic; Bowersox, Shanna; Chen, Stanford; Beard, Gemma; Puthenveedu, Manojkumar A; Hanyaloglu, Aylin C
2014-02-14
Postendocytic sorting of G protein-coupled receptors (GPCRs) is driven by interactions between highly diverse receptor sequence motifs and their interacting proteins, such as postsynaptic density protein (PSD95)/Drosophila disc large tumor suppressor (Dlg1)/zonula occludens-1 (ZO-1) (PDZ) domain proteins. However, whether these diverse interactions provide an underlying functional specificity, in addition to driving sorting, is unknown. Here we identify GPCRs that recycle via distinct PDZ ligand/PDZ protein pairs that exploit their recycling machinery primarily for targeted endosomal localization and signaling specificity. The luteinizing hormone receptor (LHR) and β2-adrenergic receptor (B2AR), two GPCRs sorted to the regulated recycling pathway, underwent divergent trafficking to distinct endosomal compartments. Unlike B2AR, which traffics to early endosomes (EE), LHR internalizes to distinct pre-early endosomes (pre-EEs) for its recycling. Pre-EE localization required interactions of the LHR C-terminal tail with the PDZ protein GAIP-interacting protein C terminus, inhibiting its traffic to EEs. Rerouting the LHR to EEs, or EE-localized GPCRs to pre-EEs, spatially reprograms MAPK signaling. Furthermore, LHR-mediated activation of MAPK signaling requires internalization and is maintained upon loss of the EE compartment. We propose that combinatorial specificity between GPCR sorting sequences and interacting proteins dictates an unprecedented spatiotemporal control in GPCR signal activity.
NASA Astrophysics Data System (ADS)
Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat @
2014-02-01
Deterministic Safety Analysis (DSA) is one of the mandatory requirements in the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analysis under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a pre-requisite reference document for developing input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for the Integral Test Facility LOBI-MOD2 and the associated Test A1-83. Data and information from various reports and drawings were referred to in preparing the RDS. The results showed that developing the RDS made it possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information, and limited resources. Some possible improvements are suggested to overcome these challenges.
Recent developments in dimensional nanometrology using AFMs
NASA Astrophysics Data System (ADS)
Yacoot, Andrew; Koenders, Ludger
2011-12-01
Scanning probe microscopes, in particular the atomic force microscope (AFM), have developed into sophisticated instruments that, throughout the world, are no longer used just for imaging, but for quantitative measurements. A role of the national measurement institutes has been to provide traceable metrology for these instruments. This paper presents a brief overview as to how this has been achieved, highlights the future requirements for metrology to support developments in AFM technology and describes work in progress to meet this need.
Dimensional nanometrology at the National Physical Laboratory
NASA Astrophysics Data System (ADS)
Yacoot, Andrew; Leach, Richard; Hughes, Ben; Giusca, Claudiu; Jones, Christopher; Wilson, Alan
2008-10-01
The growth in nanotechnology has led to an increased requirement for traceable dimensional measurements of nanometre-sized objects and micrometre-sized objects with nanometre tolerances. To meet this challenge NPL has developed both purpose built instrumentation and added metrology to commercially available equipment. This paper describes the development and use of a selection of these instruments that include: atomic force microscopy, x-ray interferometry, a low force balance, a micro coordinate measuring machine and an areal surface texture measuring instrument.
High Sensitivity Optomechanical Reference Accelerometer over 10 kHz
2014-06-05
bandwidth of 10 kHz and is traceable. We have incorporated a Fabry-Pérot fiber-optic micro-cavity that is currently capable of measuring the test-mass...10 kHz bandwidth requires displacement detection sensitivities at levels of 10⁻¹⁶ m/√Hz. Optical detection schemes, such as Fabry-Pérot ...based micro-mirror Fabry-Pérot cavity was built to operate in reflection as the optical sensor. The mechanical oscillator ground platform and
Naumann, R; Alexander-Weber, Ch; Eberhardt, R; Giera, J; Spitzer, P
2002-11-01
Routine pH measurements are carried out with pH meter-glass electrode assemblies. In most cases the glass and reference electrodes are thereby fashioned into a single probe, the so-called 'combination electrode' or simply 'the pH electrode'. The use of these electrodes is subject to various effects, described below, producing uncertainties of unknown magnitude. Therefore, the measurement of pH of a sample requires a suitable calibration by certified standard buffer solutions (CRMs) traceable to primary pH standards. The procedures in use are based on calibrations at one point, at two points bracketing the sample pH and at a series of points, the so-called multi-point calibration. The multi-point calibration (MPC) is recommended if minimum uncertainty and maximum consistency are required over a wide range of unknown pH values. Details of uncertainty computations for the two-point and MPC procedure are given. Furthermore, the multi-point calibration is a useful tool to characterise the performance of pH electrodes. This is demonstrated with different commercial pH electrodes. Electronic supplementary material is available at http://dx.doi.org/10.1007/s00216-002-1506-5.
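A minimal sketch of the multi-point calibration (MPC) idea described above: fit the electrode response E = E0 + k·pH over several certified buffers by least squares, then invert the fitted line for an unknown sample. The buffer values and cell potentials below are hypothetical examples, not CRM data:

```python
def multipoint_calibrate(buffer_pH, emf_mV):
    """Least-squares fit of the electrode line E = E0 + k*pH over standard buffers."""
    n = len(buffer_pH)
    mx = sum(buffer_pH) / n
    my = sum(emf_mV) / n
    sxx = sum((x - mx) ** 2 for x in buffer_pH)
    sxy = sum((x - mx) * (y - my) for x, y in zip(buffer_pH, emf_mV))
    k = sxy / sxx            # practical slope, ideally near-Nernstian (~ -59 mV/pH at 25 C)
    e0 = my - k * mx
    return e0, k

def pH_from_emf(e, e0, k):
    """Invert the calibration line to get the sample pH from a measured potential."""
    return (e - e0) / k

# Hypothetical certified buffers and measured cell potentials (mV)
buffers = [4.005, 6.865, 9.180]
emfs    = [171.0, 2.5, -134.0]
e0, k = multipoint_calibrate(buffers, emfs)
sample_pH = pH_from_emf(-20.0, e0, k)
```

The fitted slope k is itself a diagnostic: a slope far from the Nernstian value flags a degraded electrode, which is one way MPC characterises electrode performance.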
Gilmore, Adam Matthew
2014-01-01
Contemporary spectrofluorimeters comprise exciting light sources, excitation and emission monochromators, and detectors that without correction yield data not conforming to an ideal spectral response. The correction of the spectral properties of the exciting and emission light paths first requires calibration of the wavelength and spectral accuracy. The exciting beam path can be corrected up to the sample position using a spectrally corrected reference detection system. The corrected reference response accounts for both the spectral intensity and drift of the exciting light source relative to emission and/or transmission detector responses. The emission detection path must also be corrected for the combined spectral bias of the sample compartment optics, emission monochromator, and detector. There are several crucial issues associated with both excitation and emission correction including the requirement to account for spectral band-pass and resolution, optical band-pass or neutral density filters, and the position and direction of polarizing elements in the light paths. In addition, secondary correction factors are described including (1) subtraction of the solvent's fluorescence background, (2) removal of Rayleigh and Raman scattering lines, as well as (3) correcting for sample concentration-dependent inner-filter effects. The importance of the National Institute of Standards and Technology (NIST) traceable calibration and correction protocols is explained in light of valid intra- and interlaboratory studies and effective spectral qualitative and quantitative analyses including multivariate spectral modeling.
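The concentration-dependent inner-filter correction mentioned in point (3) above is commonly approximated, for a standard 1 cm cuvette with centered right-angle detection, by scaling the observed fluorescence with the absorbances at the excitation and emission wavelengths. This is the textbook approximation, not necessarily the exact formula any given instrument applies:

```python
def inner_filter_correction(f_obs, a_ex, a_em):
    """Combined primary/secondary inner-filter correction:
        F_corr = F_obs * 10**((A_ex + A_em) / 2)
    Assumes a 1 cm path, centered right-angle geometry, and moderate absorbance
    (the approximation degrades above roughly A ~ 0.1-0.2 per wavelength)."""
    return f_obs * 10 ** ((a_ex + a_em) / 2.0)

# Hypothetical observed intensity and absorbances at the ex/em wavelengths
f_corr = inner_filter_correction(1000.0, 0.10, 0.04)
```

Because the correction grows exponentially with absorbance, dilution remains preferable to correction for strongly absorbing samples.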
Roberts, R. Edward; Husain, Masud
2015-01-01
Introduction: Although the pre-supplementary motor area (pre-SMA) is one of the most frequently reported areas of activation in functional imaging studies, the role of this brain region in cognition is still a matter of intense debate. Here we present a patient with a focal lesion of caudal pre-SMA who displays a selective deficit in updating a response plan to switch actions, but shows no impairment when required to withhold a response (stopping). Materials & Methods: The patient and a control group underwent three tasks designed to measure different aspects of cognitive control and executive function. Results: The pre-SMA patient displayed no impairment when responding in the face of distracting stimuli (Eriksen flanker paradigm), or when required to halt an on-going response (STOP task). However, a specific deficit was observed when she was required to rapidly switch between response plans (CHANGE task). Conclusions: These findings suggest that the caudal pre-SMA may have a particularly important role in a network of brain regions required for rapidly updating and implementing response plans. The lack of any significant impairment on other measures of cognitive control suggests that this is not likely due to a global deficit in cognitive control. We discuss the implications of these results in the context of current theories of pre-SMA function. PMID:25282056
Quality Assurance Specifications for Planetary Protection Assays
NASA Astrophysics Data System (ADS)
Baker, Amy
As the European Space Agency planetary protection (PP) activities move forward to support the ExoMars and other planetary missions, it will become necessary to increase staffing of laboratories that provide analyses for these programs. Standardization of procedures, a comprehensive quality assurance program, and unilateral training of personnel will be necessary to ensure that the planetary protection goals and schedules are met. The PP Quality Assurance/Quality Control (QAQC) program is designed to regulate and monitor procedures performed by laboratory personnel to ensure that all work meets data quality objectives through the assembly and launch process. Because personnel time is at a premium and sampling schedules are often dependent on engineering schedules, it is necessary to have flexible staffing to support all sampling requirements. The most productive approach to having a competent and flexible work force is to establish well defined laboratory procedures and training programs that clearly address the needs of the program and the work force. The quality assurance specification for planetary protection assays has to ensure that laboratories and associated personnel can demonstrate the competence to perform assays according to the applicable standard AD4. Detailed subjects included in the presentation are as follows: • field and laboratory control criteria • data reporting • personnel training requirements and certification • laboratory audit criteria. Based upon RD2 for primary and secondary validation and RD3 for data quality objectives, the QAQC will provide traceable quality assurance safeguards by providing structured laboratory requirements for guidelines and oversight including training and technical updates, standardized documentation, standardized QA/QC checks, data review and data archiving.
USDA-ARS?s Scientific Manuscript database
Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
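Although the abstract above is truncated, the sampling question it raises (how large a grain sample is needed to recover embedded tracers reliably) is commonly treated with a Poisson model when tracers are sparse and well mixed. The tracer density and sample size below are hypothetical, not values from the study:

```python
import math

def prob_at_least_one_tracer(tracers_per_kg, sample_kg):
    """Poisson approximation for well-mixed, sparse tracers:
    P(at least one tracer in the sample) = 1 - exp(-lambda),
    where lambda is the expected tracer count in the sample."""
    lam = tracers_per_kg * sample_kg
    return 1.0 - math.exp(-lam)

# Hypothetical: 0.5 tracers per kg of grain, a 10 kg probe sample
p = prob_at_least_one_tracer(0.5, 10.0)
```

Inverting the same formula gives the minimum sample mass for a target detection probability, which is the kind of science-based sampling design the abstract alludes to.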
Research and Construction of DC Energy Measurement Traceability Technology
NASA Astrophysics Data System (ADS)
Zhi, Wang; Maotao, Yang; Jing, Yang
2018-02-01
With the implementation of energy saving and emission reduction policies, DC energy metering has been widely used in many fields. In view of the lack of a DC energy measurement traceability system, and in combination with the process of downward measurement transfer in relation to DC charger-based field calibration technology and DC energy meter and shunt calibration technologies, the paper proposed DC fast charging, high DC, small DC voltage output and measuring technologies, and built a time-based plan by converting high DC voltage into low voltage and high current into low current and then into low voltage, leaving DC energy traceable to national standards in terms of voltage, current and time, and thus filling the gap in DC energy measurement traceability.
Getzmann, Stephan; Lewald, Jörg; Falkenstein, Michael
2014-01-01
Speech understanding in complex and dynamic listening environments requires (a) auditory scene analysis, namely auditory object formation and segregation, and (b) allocation of the attentional focus to the talker of interest. There is evidence that pre-information is actively used to facilitate these two aspects of the so-called "cocktail-party" problem. Here, a simulated multi-talker scenario was combined with electroencephalography to study scene analysis and allocation of attention in young and middle-aged adults. Sequences of short words (combinations of brief company names and stock-price values) from four talkers at different locations were simultaneously presented, and the detection of target names and the discrimination between critical target values were assessed. Immediately prior to speech sequences, auditory pre-information was provided via cues that either prepared auditory scene analysis or attentional focusing, or non-specific pre-information was given. While performance was generally better in younger than older participants, both age groups benefited from auditory pre-information. The analysis of the cue-related event-related potentials revealed age-specific differences in the use of pre-cues: Younger adults showed a pronounced N2 component, suggesting early inhibition of concurrent speech stimuli; older adults exhibited a stronger late P3 component, suggesting increased resource allocation to process the pre-information. In sum, the results argue for an age-specific utilization of auditory pre-information to improve listening in complex dynamic auditory environments.
Quantitative Measurements of X-ray Intensity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugh, M. J., Schneider, M.
This chapter describes the characterization of several X-ray sources and their use in calibrating different types of X-ray cameras at National Security Technologies, LLC (NSTec). The cameras are employed in experimental plasma studies at Lawrence Livermore National Laboratory (LLNL), including the National Ignition Facility (NIF). The sources provide X-rays in the energy range from several hundred eV to 110 keV. The key to this effort is measuring the X-ray beam intensity accurately and in a manner traceable to international standards. This is accomplished using photodiodes of several types that are calibrated using radioactive sources and a synchrotron source, with methods and materials that are traceable to the U.S. National Institute of Standards and Technology (NIST). The accreditation procedures are described. The chapter begins with an introduction to the fundamental concepts of X-ray physics. The types of X-ray sources that are used for device calibration are described. The next section describes the photodiode types that are used for measuring X-ray intensity: power-measuring photodiodes, energy-dispersive photodiodes, and cameras comprising photodiodes as pixel elements. Following their description, the methods used to calibrate the primary detectors (the power-measuring photodiodes and the energy-dispersive photodiodes), as well as the method used to achieve traceability to international standards, are described. The X-ray source beams can then be measured using the primary detectors. The final section describes the use of the calibrated X-ray beams to calibrate X-ray cameras. Many of the references are web sites that provide databases, explanations of the data and how it was generated, and data calculations for specific cases. Several general reference books related to the major topics are included. Papers expanding some subjects are cited.
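The chapter's central measurement, converting a calibrated photodiode's current into beam power and photon rate, reduces to two divisions once a traceable responsivity is in hand. The responsivity and current values below are hypothetical illustrations, not NSTec calibration data:

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact SI value)

def xray_power_from_current(i_amp, responsivity_a_per_w):
    """Beam power (W) from photodiode current (A) via calibrated responsivity (A/W)."""
    return i_amp / responsivity_a_per_w

def photon_rate(power_w, photon_energy_ev):
    """Photons per second, given beam power and photon energy in eV."""
    return power_w / (photon_energy_ev * E_CHARGE)

# Hypothetical: 2 nA of diode current with a responsivity of 0.25 A/W at 8 keV
p_beam = xray_power_from_current(2e-9, 0.25)     # beam power in watts
rate = photon_rate(p_beam, 8000.0)               # photons per second
```

The traceability chain lives entirely in the responsivity value: it is the quantity tied back to NIST through the radioactive-source and synchrotron calibrations the chapter describes.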
A Greenhouse-Gas Information System: Monitoring and Validating Emissions Reporting and Mitigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jonietz, Karl K.; Dimotakis, Paul E.; Rotman, Douglas A.
2011-09-26
This study and report focus on attributes of a greenhouse-gas information system (GHGIS) needed to support MRV&V needs. These needs set the function of such a system apart from scientific/research monitoring of GHGs and carbon-cycle systems, and include (not exclusively): the need for a GHGIS that is operational, as required for decision-support; the need for a system that meets specifications derived from imposed requirements; the need for rigorous calibration, verification, and validation (CV&V) standards, processes, and records for all measurement and modeling/data-inversion data; the need to develop and adopt an uncertainty-quantification (UQ) regimen for all measurement and modeling data; and the requirement that GHGIS products can be subjected to third-party questioning and scientific scrutiny. This report examines and assesses presently available capabilities that could contribute to a future GHGIS. These capabilities include sensors and measurement technologies; data analysis and data uncertainty quantification (UQ) practices and methods; and model-based data-inversion practices, methods, and their associated UQ. The report further examines the need for traceable calibration, verification, and validation processes and attached metadata; differences between present science-/research-oriented needs and those that would be required for an operational GHGIS; the development, operation, and maintenance of a GHGIS missions-operations center (GMOC); and the complex systems engineering and integration that would be required to develop, operate, and evolve a future GHGIS.
Climate Benchmark Missions: CLARREO
NASA Technical Reports Server (NTRS)
Wielicki, Bruce A.; Young, David F.
2010-01-01
CLARREO (Climate Absolute Radiance and Refractivity Observatory) is one of the four Tier 1 missions recommended by the recent NRC decadal survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to rigorously observe climate change on decade time scales and to use decadal change observations as the most critical method to determine the accuracy of climate change projections such as those used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4). A rigorously known accuracy of both decadal change observations as well as climate projections is critical in order to enable sound policy decisions. The CLARREO mission accomplishes this critical objective through highly accurate and SI traceable decadal change observations sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. The same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. The CLARREO breakthrough in decadal climate change observations is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. These accuracy levels are determined both by the projected decadal changes as well as by the background natural variability that such signals must be detected against. The accuracy for decadal change traceability to SI standards includes uncertainties of calibration, sampling, and analysis methods. Unlike most other missions, all of the CLARREO requirements are judged not by instantaneous accuracy, but instead by accuracy in large time/space scale average decadal changes. 
Given the focus on decadal climate change, the NRC Decadal Survey concluded that the single most critical issue for decadal change observations was their lack of accuracy and low confidence in observing the small but critical climate change signals. CLARREO is the recommended attack on this challenge, and builds on the last decade of climate observation advances in the Earth Observing System as well as metrological advances at NIST (National Institute of Standards and Technology) and other standards laboratories.
Dennis, Bradley M; Gondek, Stephen P; Guyer, Richard A; Hamblin, Susan E; Gunter, Oliver L; Guillamondegui, Oscar D
2017-04-01
Concerted management of the traumatic hemothorax is ill-defined. Surgical management of specific hemothoraces may be beneficial. A comprehensive strategy to delineate appropriate patients for additional procedures does not exist. We developed an evidence-based algorithm for hemothorax management. We hypothesize that the use of this algorithm will decrease additional interventions. A pre-/post-study was performed on all patients admitted to our trauma service with traumatic hemothorax from August 2010 to September 2013. An evidence-based management algorithm was initiated for the management of retained hemothoraces. Patients with length of stay (LOS) less than 24 hours or admitted during an implementation phase were excluded. Study data included age, Injury Severity Score, Abbreviated Injury Scale chest, mechanism of injury, ventilator days, intensive care unit (ICU) LOS, total hospital LOS, and interventions required. Our primary outcome was number of patients requiring more than 1 intervention. Secondary outcomes were empyema rate, number of patients requiring specific additional interventions, 28-day ventilator-free days, 28-day ICU-free days, hospital LOS, and all-cause 6-month readmission rate. Standard statistical analysis was performed for all data. Six hundred forty-two patients (326 pre and 316 post) met the study criteria. There were no demographic differences between the groups. The number of patients requiring more than 1 intervention was significantly reduced (49 pre vs. 28 post, p = 0.02). Number of patients requiring VATS decreased (27 pre vs. 10 post, p < 0.01). Number of catheters placed by interventional radiology increased (2 pre vs. 10 post, p = 0.02). Intrapleural thrombolytic use, open thoracotomy, empyema, and 6-month readmission rates were unchanged. The "post" group had more ventilator-free days (median, 23.9 vs. 22.5, p = 0.04), but ICU and hospital LOS were unchanged. 
Using an evidence-based hemothorax algorithm reduced the number of patients requiring additional interventions without increasing complication rates. Defined criteria for surgical intervention allow for more appropriate utilization of resources. Therapeutic study, level IV.
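The primary-outcome comparison above (49 of 326 pre vs. 28 of 316 post patients requiring more than one intervention, p = 0.02) can be checked from the reported counts with a standard 2x2 chi-square test. A minimal stdlib-only sketch; the specific test used here (Pearson chi-square with Yates continuity correction) is an assumption, since the abstract states only that standard statistical analysis was performed:

```python
import math

def chi2_yates(a, b, c, d):
    """Yates-corrected Pearson chi-square for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (abs(a * d - b * c) - n / 2) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

def p_value_df1(chi2):
    """Survival function of the chi-square distribution with 1 degree of freedom."""
    return math.erfc(math.sqrt(chi2 / 2))

# 49 of 326 "pre" patients vs. 28 of 316 "post" patients needed >1 intervention.
chi2 = chi2_yates(49, 326 - 49, 28, 316 - 28)
p = p_value_df1(chi2)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

The computed p-value falls in the 0.02 range, consistent with the value reported in the abstract.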
SPIDER: Next Generation Chip Scale Imaging Sensor
NASA Astrophysics Data System (ADS)
Duncan, Alan; Kendrick, Rick; Thurman, Sam; Wuchenich, Danielle; Scott, Ryan P.; Yoo, S. J. B.; Su, Tiehui; Yu, Runxiang; Ogden, Chad; Proietti, Roberto
The LM Advanced Technology Center and UC Davis are developing an Electro-Optical (EO) imaging sensor called SPIDER (Segmented Planar Imaging Detector for Electro-optical Reconnaissance) that provides a 10x to 100x size, weight, and power (SWaP) reduction alternative to the traditional bulky optical telescope and focal plane detector array. The substantial reductions in SWaP would reduce cost and/or provide higher resolution by enabling a larger aperture imager in a constrained volume. The SPIDER concept consists of thousands of direct detection white-light interferometers densely packed onto Photonic Integrated Circuits (PICs) to measure the amplitude and phase of the visibility function at spatial frequencies that span the full synthetic aperture. In other words, SPIDER would sample the object being imaged in the Fourier domain (i.e., spatial frequency domain), and then digitally reconstruct an image. The conventional approach for imaging interferometers requires complex mechanical delay lines to form the interference fringes. This results in designs that are not traceable to more than a few simultaneous spatial frequency measurements. SPIDER seeks to achieve this traceability by employing micron-scale optical waveguides and nanophotonic structures fabricated on a PIC with micron-scale packing density to form the necessary interferometers. Prior LM IRAD and DARPA/NASA CRAD-funded SPIDER risk reduction experiments, design trades, and simulations have matured the SPIDER imager concept to a TRL 3 level. Current funding under the DARPA SPIDER Zoom program is maturing the underlying PIC technology for SPIDER to the TRL 4 level. This is done by developing and fabricating a second-generation PIC that is fully traceable to the multiple layers and low-power phase modulators required for higher-dimension waveguide arrays that are needed for higher field-of-view sensors.
Our project also seeks to extend the SPIDER concept to add a zoom capability that would provide simultaneous low-resolution, large field-of-view and steerable high-resolution, narrow field-of-view imaging modes. A proof of concept demo is being designed to validate this capability. Finally, data collected by this project would be used to benchmark and increase the fidelity of our SPIDER image simulations and enhance our ability to predict the performance of existing and future SPIDER sensor design variations. These designs and their associated performance characteristics could then be evaluated as candidates for future mission opportunities to identify specific transition paths. This paper provides an overview of performance data on the first-generation PIC for SPIDER developed under DARPA SeeMe program funding. We provide a design description of the SPIDER Zoom imaging sensor and the second-generation PIC (high- and low-resolution versions) currently under development on the DARPA SPIDER Zoom effort. Results of performance simulations and design trades are presented. Unique low-cost payload applications for future SSA missions are also discussed.
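The Fourier-domain sampling described above can be illustrated numerically with the discrete Fourier transform: each lenslet-pair baseline measures one complex visibility, i.e. one spatial-frequency sample of the scene, and the image is recovered by inverting the transform. A toy 1-D sketch with NumPy; all sizes and values are illustrative, not SPIDER design parameters:

```python
import numpy as np

n = 64
scene = np.zeros(n)
scene[20:28] = 1.0   # an extended 1-D "object"
scene[40] = 2.0      # plus a bright point source

# Each interferometer baseline measures one complex visibility, i.e. one
# Fourier coefficient of the scene (van Cittert-Zernike theorem).
visibilities = np.fft.fft(scene)

# Sampling every spatial frequency lets the image be recovered exactly.
reconstruction = np.fft.ifft(visibilities).real

# A sparse baseline set (low spatial frequencies only) yields a blurred image,
# which is why dense packing of interferometers on the PIC matters.
mask = np.zeros(n, dtype=bool)
mask[:8] = True    # DC and low positive frequencies
mask[-7:] = True   # matching negative frequencies
blurred = np.fft.ifft(np.where(mask, visibilities, 0)).real

print(f"full-sampling max error: {np.abs(reconstruction - scene).max():.1e}")
print(f"sparse-sampling max error: {np.abs(blurred - scene).max():.2f}")
```

The point source in particular is smeared badly by the low-frequency-only reconstruction, mirroring the resolution loss of an interferometer limited to a few simultaneous spatial-frequency measurements.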
Santos, António J M; Nogueira, Cristina; Ortega-Bellido, Maria; Malhotra, Vivek
2016-05-09
Procollagens, pre-chylomicrons, and pre-very low-density lipoproteins (pre-VLDLs) are too big to fit into conventional COPII-coated vesicles, so how are these bulky cargoes exported from the endoplasmic reticulum (ER)? We have shown that TANGO1 located at the ER exit site is necessary for procollagen export. We report a role for TANGO1 and TANGO1-like (TALI), a chimeric protein resulting from fusion of MIA2 and cTAGE5 gene products, in the export of pre-chylomicrons and pre-VLDLs from the ER. TANGO1 binds TALI, and both interact with apolipoprotein B (ApoB) and are necessary for the recruitment of ApoB-containing lipid particles to ER exit sites for their subsequent export. Although export of ApoB requires the function of both TANGO1 and TALI, the export of procollagen XII by the same cells requires only TANGO1. These findings reveal a general role for TANGO1 in the export of bulky cargoes from the ER and identify a specific requirement for TALI in assisting TANGO1 to export bulky lipid particles. © 2016 Santos et al.
ERIC Educational Resources Information Center
Carr, Ronald L.
2013-01-01
This dissertation is a compilation of three separate works representing a wide range of issues related to pre-college engineering. Each work addresses multiple levels of concern for educators, from national policy to specific classroom intervention. Although presenting different styles of writing--due to different journals' requirements--and various…
Mendikute, Alberto; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai
2017-01-01
Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi real time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. 
The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image, up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g., 0.1 mm error in 1 m) and an RMS error below 0.2 pixels at the image plane, matching the performance reported for portable photogrammetry with precise off-process pre-calibrated cameras. PMID:28891946
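The bundle-adjustment computation described above, jointly estimating camera positions and 3D target coordinates from 2D image measurements by nonlinear least squares, can be sketched on a toy problem: 1-D pinhole cameras observing 2-D points, with two camera positions pinned to fix the gauge and scale (the role played by scale bars in the paper). All geometry and numbers are illustrative; this is a sketch of the generic technique, not the paper's in-process algorithm:

```python
import numpy as np
from scipy.optimize import least_squares

f = 1.0                                    # known focal length
true_cams = np.array([0.0, 1.0, 2.0])      # camera positions along x
true_pts = np.array([[0.5, 4.0], [1.5, 5.0], [-0.5, 6.0], [2.0, 4.5]])  # (X, Z)

def project(cam_x, pts):
    """1-D pinhole projection of 2-D points: u = f * (X - cam_x) / Z."""
    return f * (pts[:, 0] - cam_x) / pts[:, 1]

# Noise-free synthetic "image measurements", one row per camera.
obs = np.array([project(c, true_pts) for c in true_cams])

def residuals(params):
    # Cameras 0 and 1 are pinned at known positions (the gauge/scale fix,
    # like a scale bar); camera 2 and all point coordinates are free.
    cams = np.array([0.0, 1.0, params[0]])
    pts = params[1:].reshape(-1, 2)
    pred = np.array([project(c, pts) for c in cams])
    return (pred - obs).ravel()

rng = np.random.default_rng(0)
x0 = np.concatenate([[2.0], true_pts.ravel()]) + 0.1 * rng.standard_normal(9)
sol = least_squares(residuals, x0)   # jointly refines camera and points

print(f"recovered camera 2 position: {sol.x[0]:.4f}")
```

With noise-free observations the solver recovers the true camera position and point coordinates; in practice the same residual structure is minimized over thousands of targets and camera views, which is why in-process computation time is the bottleneck the paper addresses.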
Advanced UVOIR Mirror Technology Development (AMTD) for Very Large Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Smith, W. Scott; Mosier, Gary; Abplanalp, Laura; Arnold, William
2014-01-01
The ASTRO2010 Decadal Survey stated that an advanced large-aperture ultraviolet, optical, near-infrared (UVOIR) telescope is required to enable the next generation of compelling astrophysics and exoplanet science, and that present technology is not mature enough to affordably build and launch any potential UVOIR mission concept. AMTD builds on the state of the art (SOA) defined by over 30 years of monolithic and segmented ground- and space-telescope mirror technology to mature six key technologies. AMTD is deliberately pursuing multiple design paths to provide the science community with options to enable either large-aperture monolithic or segmented mirrors with clear engineering metrics traceable to science requirements.
NASA Technical Reports Server (NTRS)
Collins, Emmanuel G., Jr.; Phillips, Douglas J.; Hyland, David C.
1990-01-01
Many large space system concepts will require active vibration control to satisfy critical performance requirements such as line-of-sight accuracy. In order for these concepts to become operational it is imperative that the benefits of active vibration control be practically demonstrated in ground based experiments. The results of the experiment successfully demonstrate active vibration control for a flexible structure. The testbed is the Active Control Technique Evaluation for Spacecraft (ACES) structure at NASA Marshall Space Flight Center. The ACES structure is dynamically traceable to future space systems and especially allows the study of line-of-sight control issues.
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
USDA-ARS?s Scientific Manuscript database
Food-grade tracers were printed with two-dimensional Data Matrix (DM) barcode so that they could carry simulated identifying information about grain as part of a prospective traceability system. The key factor in evaluating the tracers was their ability to be read with a code scanner after being rem...
Food quality and safety: traceability and labeling.
Lupien, John R
2005-01-01
This article discusses food systems in general, their development over the past 120 years, and realities and problems faced by a world population of over 6 billion people. Various food and feed problems are mentioned, and the concept of "traceability" is discussed in the context of the broader and more useful approach of using "good practices" at all levels of the food chain.
Metrological Traceability in the Social Sciences: A Model from Reading Measurement
NASA Astrophysics Data System (ADS)
Stenner, A. Jackson; Fisher, William P., Jr.
2013-09-01
The central importance of reading ability in learning makes it the natural place to start in formative and summative assessments in education. The Lexile Framework for Reading constitutes a commercial metrological traceability network linking books, test results, instructional materials, and students in elementary and secondary English and Spanish language reading education in the U.S., Canada, Mexico, and Australia.
Chrysochou, Polymeros; Chryssochoidis, George; Kehagia, Olga
2009-12-01
The implementation of traceability in the food supply chain has reinforced adoption of technologies with the ability to track forward and trace back product-related information. Based on the premise that these technologies can be used as a means to provide product-related information to consumers, this paper explores the perceived benefits and drawbacks of such technologies. The aim is to identify factors that influence consumers' perceptions of such technologies, and furthermore to advise the agri-food business on issues that they should consider prior to the implementation of such technologies in their production lines. For the purposes of the study, a focus group study was conducted across 12 European countries, while a set of four different technologies used as a means to provide traceability information to consumers was the focal point of the discussions in each focus group. Results show that the amount of and confidence in the information provided, perceived levels of convenience, impact on product quality and safety, impact on consumers' health and the environment, and potential consequences on ethical and privacy liberties constitute important factors influencing consumers' perceptions of technologies that provide traceability.
Cai, Rui; Wang, Shisheng; Tang, Bo; Li, Yueqing; Zhao, Weijie
2018-01-01
Sea cucumber is the major tonic seafood worldwide, and geographical origin traceability is an important part of its quality and safety control. In this work, a non-destructive method for origin traceability of sea cucumber (Apostichopus japonicus) from northern China Sea and East China Sea using near infrared spectroscopy (NIRS) and multivariate analysis methods was proposed. Total fat contents of 189 fresh sea cucumber samples were determined and partial least-squares (PLS) regression was used to establish the quantitative NIRS model. The ordered predictor selection algorithm was performed to select feasible wavelength regions for the construction of PLS and identification models. The identification model was developed by principal component analysis combined with Mahalanobis distance and scaling to the first range algorithms. In the test set of the optimum PLS models, the root mean square error of prediction was 0.45, and correlation coefficient was 0.90. The correct classification rates of 100% were obtained in both identification calibration model and test model. The overall results indicated that NIRS method combined with chemometric analysis was a suitable tool for origin traceability and identification of fresh sea cucumber samples from nine origins in China. PMID:29410795
SEPAC software configuration control plan and procedures, revision 1
NASA Technical Reports Server (NTRS)
1981-01-01
SEPAC Software Configuration Control Plan and Procedures are presented. The objective of the software configuration control is to establish the process for maintaining configuration control of the SEPAC software, beginning with the baselining of SEPAC Flight Software Version 1 and encompassing the integration and verification tests through Spacelab Level IV Integration. They are designed to provide a simplified but complete configuration control process. The intent is to require a minimum amount of paperwork but provide total traceability of SEPAC software.
[Guidelines concerning sample reception and request recording of laboratory tests].
Bailly, P; Dhondt, J L; Drouard, L; Houlbert, C; Soubiran, P; Szymanowicz, A
2010-12-01
The process is described to help achieve the requirements of the ISO 15189 standard. The precautions to be respected for a correct recording of the request are specified, and the criteria for traceability are formalized. A flow chart illustrates the proposed courses of action when nonconformities occur. We then propose guidelines for handling uncertainties in the identification of the primary sample, and an algorithm to formalize the process and address situations involving an irreplaceable or critical sample.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon
The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.
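A requirements traceability matrix like the one referenced above maps each requirement to the test cases that verify it, which makes untested requirements easy to flag. A minimal sketch with purely illustrative IDs and descriptions, not taken from the actual RELAP-7 RTM:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    description: str
    tests: list = field(default_factory=list)   # IDs of verifying test cases

# Illustrative entries only; real RTM rows would come from the V&V plan.
rtm = {
    r.req_id: r
    for r in [
        Requirement("R-001", "7-equation model: mass conservation", ["T-101", "T-102"]),
        Requirement("R-002", "7-equation model: energy conservation", ["T-103"]),
        Requirement("R-003", "Pressure-wave propagation benchmark", []),
    ]
}

# A coverage check flags requirements not yet traced to any test.
untested = [rid for rid, r in rtm.items() if not r.tests]
print("requirements without verifying tests:", untested)  # → ['R-003']
```

Keeping the matrix as structured data (rather than a static document) lets coverage checks like this run automatically as demonstration problems are added.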
Cargo Movement Operations System (CMOS) Requirements Traceability Matrix
1990-04-29
[Comment-disposition worksheet, arranged in page-number order: each entry lists a comment number, page, and shall-statement ID (e.g., comment 1, page C-10, SS0804: "Delete this shall statement"; comment 2, page C-29, SS0810), with CMOS PMO and ERCI acceptance check-offs, a comment disposition field, and an open/closed status.]
PDSS configuration control plan and procedures
NASA Technical Reports Server (NTRS)
1983-01-01
The payload development support system (PDSS) configuration control plan and procedures are presented. These plans and procedures establish the process for maintaining configuration control of the PDSS system, especially the Spacelab experiment interface device's (SEID) RAU, HRM, and PDI interface simulations and the PDSS ECOS DEP Services simulation. The plans and procedures as specified are designed to provide a simplified but complete configuration control process. The intent is to require a minimum amount of paperwork but provide total traceability of PDSS during experiment test activities.
Cognitive learning: a machine learning approach for automatic process characterization from design
NASA Astrophysics Data System (ADS)
Foucher, J.; Baderot, J.; Martinez, S.; Dervillé, A.; Bernard, G.
2018-03-01
Cutting-edge innovation requires accurate and fast process control to obtain a fast learning rate and industry adoption. The tools currently available for such a task are mainly manual and user dependent. In this paper we present cognitive learning, a new machine-learning-based technique that facilitates and speeds up complex characterization by using the design as input, providing fast training and detection times. We focus on the machine learning framework that allows object detection, defect traceability and automatic measurement tools.
Huang, Ying; Bayfield, Mark A; Intine, Robert V; Maraia, Richard J
2006-07-01
By sequence-specific binding to 3' UUU-OH, the La protein shields precursor (pre)-RNAs from 3' end digestion and is required to protect defective pre-transfer RNAs from decay. Although La is comprised of a La motif and an RNA-recognition motif (RRM), a recent structure indicates that the RRM beta-sheet surface is not involved in UUU-OH recognition, raising questions as to its function. Progressively defective suppressor tRNAs in Schizosaccharomyces pombe reveal differential sensitivities to La and Rrp6p, a 3' exonuclease component of pre-tRNA decay. 3' end protection is compromised by mutations to the La motif but not the RRM surface. The most defective pre-tRNAs require a second activity of La, in addition to 3' protection, that requires an intact RRM surface. The two activities of La in tRNA maturation map to its two conserved RNA-binding surfaces and suggest a modular model that has implications for its other ligands.
Importance of Calibration/Validation Traceability for Multi-Sensor Imaging Spectrometry Applications
NASA Technical Reports Server (NTRS)
Thome, K.
2017-01-01
Knowledge of calibration traceability is essential for ensuring the quality of data products relying on multiple sensors, and this is especially true for imaging spectrometers. The current work discusses the expected impact that imaging spectrometers have in ensuring radiometric traceability for both multispectral and hyperspectral products. The Climate Absolute Radiance and Refractivity Observatory Pathfinder mission is used to show the role that high-accuracy imaging spectrometers can play in understanding test sites used for vicarious calibration of sensors. The associated Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer calibration demonstration system is used to illustrate recent advances in laboratory radiometric calibration approaches that will both allow the use of imaging spectrometers as calibration standards and ensure the consistency of the multiple imaging spectrometers expected to be on orbit in the next decade.
An intelligent traceability system: Efficient tool for a supply chain sustainability
NASA Astrophysics Data System (ADS)
Bougdira, Abdesselam; Ahaitouf, Abdelaziz; Akharraz, Ismail
2016-07-01
Supply chain sustainability has become a necessity for smooth, rapid and fluid economic transactions. To reach a sustainable supply chain, we propose focusing attention on products and their lifecycle, and we consider traceability a major key to success in ensuring supply chain sustainability. We therefore consider a supply chain design that uses intelligent products traced by an intelligent traceability system. This system identifies a product, restores its history and properties, and tracks it in real time. It can also bring appropriate adjustments to the product environment to prevent any risk that threatens the product's qualities. It thus helps supply chain contributors make sustainable adjustments and instantly benchmark the supply chain's sustainability.
Kong, Daochun; Coleman, Thomas R.; DePamphilis, Melvin L.
2003-01-01
Budding yeast (Saccharomyces cerevisiae) origin recognition complex (ORC) requires ATP to bind specific DNA sequences, whereas fission yeast (Schizosaccharomyces pombe) ORC binds to specific, asymmetric A:T-rich sites within replication origins, independently of ATP, and frog (Xenopus laevis) ORC seems to bind DNA non-specifically. Here we show that despite these differences, ORCs are functionally conserved. Firstly, SpOrc1, SpOrc4 and SpOrc5, like those from other eukaryotes, bound ATP and exhibited ATPase activity, suggesting that ATP is required for pre-replication complex (pre-RC) assembly rather than origin specificity. Secondly, SpOrc4, which is solely responsible for binding SpORC to DNA, inhibited up to 70% of XlORC-dependent DNA replication in Xenopus egg extract by preventing XlORC from binding to chromatin and assembling pre-RCs. Chromatin-bound SpOrc4 was located at AT-rich sequences. XlORC in egg extract bound preferentially to asymmetric A:T-sequences in either bare DNA or in sperm chromatin, and it recruited XlCdc6 and XlMcm proteins to these sequences. These results reveal that XlORC initiates DNA replication preferentially at the same or similar sites to those targeted in S.pombe. PMID:12840006
Shukla, Jayendra Nath; Palli, Subba Reddy
2014-01-01
Tribolium castaneum Transformer (TcTra) is essential for female sex determination and maintenance through the regulation of sex-specific splicing of doublesex (dsx) pre-mRNA. In females, TcTra also regulates the sex-specific splicing of its own pre-mRNA to ensure continuous production of functional Tra protein. Transformer protein is absent in males and hence dsx pre-mRNA is spliced in a default mode. The mechanisms by which males inhibit the production of functional Tra protein are not known. Here, we report on functional characterization of transformer-2 (tra-2) gene (an ortholog of Drosophila transformer-2) in T. castaneum. RNA interference-mediated knockdown in the expression of gene coding for tra-2 in female pupae or adults resulted in the production of male-specific isoform of dsx and both female and male isoforms of tra suggesting that Tra-2 is essential for the female-specific splicing of tra and dsx pre-mRNAs. Interestingly, knockdown of tra-2 in males did not affect the splicing of dsx but resulted in the production of both female and male isoforms of tra suggesting that Tra-2 suppresses female-specific splicing of tra pre-mRNA in males. This dual regulation of sex-specific splicing of tra pre-mRNA ensures a tight regulation of sex determination and maintenance. These data suggest a critical role for Tra-2 in suppression of female sex determination cascade in males. In addition, RNAi studies showed that Tra-2 is also required for successful embryonic and larval development in both sexes. PMID:24056158
2018-01-01
ECBC-TR-1506: NIST-Traceable NMR Method to Determine Quantitative Weight Percentage Purity of Mustard (HD) Feedstock Samples. David J. McGarvey (Research and Technology Directorate); William R. Creasy (Leidos, Inc., Abingdon, MD 21009-1261); Theresa R. Connell (Excet, Inc.)
Prevent: what is pre-criminal space?
Goldberg, David; Jadhav, Sushrut; Younis, Tarek
2017-01-01
Prevent is a UK-wide programme within the government's anti-terrorism strategy aimed at stopping individuals from supporting or taking part in terrorist activities. NHS England's Prevent Training and Competencies Framework requires health professionals to understand the concept of pre-criminal space. This article examines pre-criminal space, a new term which refers to a period of time during which a person is referred to a specific Prevent-related safeguarding panel, Channel. It is unclear what the concept of pre-criminal space adds to the Prevent programme. The term should be either clarified or removed from the Framework. PMID:28811915
Roadmap for In-Space Propulsion Technology
NASA Technical Reports Server (NTRS)
Meyer, Michael; Johnson, Les; Palaszewski, Bryan; Coote, David; Goebel, Dan; White, Harold
2012-01-01
NASA has created a roadmap for the development of advanced in-space propulsion technologies for the NASA Office of the Chief Technologist (OCT). This roadmap was drafted by a team of subject matter experts from within the Agency and then independently evaluated, integrated and prioritized by a National Research Council (NRC) panel. The roadmap describes a portfolio of in-space propulsion technologies that could meet future space science and exploration needs, and shows their traceability to potential future missions. Mission applications range from small satellites and robotic deep space exploration to space stations and human missions to Mars. Development of technologies within the area of in-space propulsion will result in technical solutions with improvements in thrust, specific impulse (Isp), power, specific mass (or specific power), volume, system mass, system complexity, operational complexity, commonality with other spacecraft systems, manufacturability, durability, and of course, cost. These types of improvements will yield decreased transit times, increased payload mass, safer spacecraft, and decreased costs. In some instances, development of technologies within this area will result in mission-enabling breakthroughs that will revolutionize space exploration. There is no single propulsion technology that will benefit all missions or mission types. The requirements for in-space propulsion vary widely according to their intended application. This paper provides an updated summary of the In-Space Propulsion Systems technology area roadmap incorporating the recommendations of the NRC.
Adesokan, Hezekiah K; Ocheja, Samuel E
2014-01-01
Livestock diseases and other animal health events are a threat to achieving a sustainable livestock industry. The knowledge of trace-back and the practice of providing feedback on diseases encountered in slaughtered animals from the abattoir to the farm can help limit the spread as well as manage potential future incidents of such diseases. We assessed the knowledge, attitudes and practices regarding traceability of 200 willing livestock traders at Bodija Municipal Abattoir, south-western Nigeria. The results reveal that the majority of these traders had poor knowledge (79.5 %) and practices (74.0 %) of traceability, though 89.5 % demonstrated good attitudes. While 22.9 % knew that traceability could be an effective means to control diseases, only a lower proportion (9.0 %) knew the health status of the animals being purchased. Though 29.0 % reported the diseases encountered in their animals during slaughter to the farm, only 9.5 % followed up to ensure the farmers took steps to prevent further occurrence of the reported diseases. While age (p = 0.000; 0.014) and education (p = 0.000; 0.000) were both significant for good knowledge and attitudes, frequency of condemned cases (p = 0.000) and length of years in the trade (p = 0.004) were, respectively, significant for good knowledge and attitudes, with none associated with practice. These poor levels of knowledge and practices of traceability are a threat to a sustainable livestock industry, food security and human health; hence, there is an urgent need to institute a national feedback mechanism on slaughtered animals in order to strengthen interventions against diseases at the farm level.
Vaquier, C; Legrand, D; Caldani, C
2009-05-01
Since 1985, a Council Resolution has defined a new approach to technical harmonisation and standards. The directives resulting from this new approach establish the essential safety requirements with which products placed on the market must conform, and which should therefore enjoy free movement throughout the European Union owing to a presumption of conformity. However, while the manufacturer is responsible for the conformity of its product in terms of safety, it is the end-user who must make sure that the requirements for a specific use or intended application are satisfied and that the product meets its needs. This is the objective of the validation procedure, which relies on qualification protocols intended to demonstrate the ability of a material, system, device or installation to meet specified quality and safety requirements. This concept of qualification applies to the data-processing software of hospital blood banks. Only the operational qualification stage is developed here. It involves scenarios built from test files that make it possible to check the electronic data interchanges between the hospital blood bank and the blood establishment (transmission of analysis results, transmission of traceability data), as well as the functions that assist the issuing of labile blood products.
Technology Readiness Level Assessment Process as Applied to NASA Earth Science Missions
NASA Technical Reports Server (NTRS)
Leete, Stephen J.; Romero, Raul A.; Dempsey, James A.; Carey, John P.; Cline, Helmut P.; Lively, Carey F.
2015-01-01
Technology assessments of fourteen science instruments were conducted within NASA using the NASA Technology Readiness Level (TRL) Metric. The instruments were part of three NASA Earth Science Decadal Survey missions in pre-formulation. The Earth Systematic Missions Program (ESMP) Systems Engineering Working Group (SEWG), composed of members of three NASA Centers, provided a newly modified electronic workbook to be completed, with instructions. Each instrument development team performed an internal assessment of its technology status, prepared an overview of its instrument, and completed the workbook with the results of its assessment. A team from the ESMP SEWG met with each instrument team and provided feedback. The instrument teams then reported through the Program Scientist for their respective missions to NASA's Earth Science Division (ESD) on technology readiness, taking the SEWG input into account. The instruments were found to have a range of TRL from 4 to 7. Lessons Learned are presented; however, due to the competition-sensitive nature of the assessments, the results for specific missions are not presented. The assessments were generally successful, and produced useful results for the agency. The SEWG team identified a number of potential improvements to the process. Particular focus was on ensuring traceability to guiding NASA documents, including the NASA Systems Engineering Handbook. The TRL Workbook has been substantially modified, and the revised workbook is described.
Services of the CDRH X-ray calibration laboratory and their traceability to National Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cerra, F.; Heaton, H.T.
The X-ray Calibration Laboratory (XCL) of the Center for Devices and Radiological Health (CDRH) provides calibration services for the Food and Drug Administration (FDA). The instruments calibrated are used by FDA and contract state inspectors to verify compliance with federal x-ray performance standards and for national surveys of x-ray trends. In order to provide traceability of measurements, the CDRH XCL is accredited by the National Voluntary Laboratory Accreditation Program (NVLAP) for reference, diagnostic, and x-ray survey instrument calibrations. In addition to these accredited services, the CDRH XCL also calibrates non-invasive kVp meters in single- and three-phase x-ray beams, and thermoluminescent dosimeter (TLD) chips used to measure CT beam profiles. The poster illustrates these services and shows the traceability links back to the National Standards.
Traceable measurements of small forces and local mechanical properties
NASA Astrophysics Data System (ADS)
Campbellová, Anna; Valtr, Miroslav; Zůda, Jaroslav; Klapetek, Petr
2011-09-01
Measurement of local mechanical properties is an important topic in the fields of nanoscale device fabrication, thin film deposition and composite material development. Nanoindentation instruments are commonly used to study hardness and related mechanical properties at the nanoscale. However, traceability and uncertainty aspects of the measurement process are often left aside. In this contribution, the use of a commercial nanoindentation instrument for metrology purposes will be discussed. Full instrument traceability, provided using atomic force microscope cantilevers and a mass comparator (normal force), an interferometer (depth) and an atomic force microscope (area function), is described. The uncertainty of the loading/unloading curve measurements will be analyzed, and the resulting uncertainties for quantities computed from loading curves, such as hardness or elastic modulus, are studied. For this calculation, a combination of the law of propagation of uncertainty and Monte Carlo uncertainty evaluation is used.
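The Monte Carlo side of that uncertainty evaluation can be sketched for indentation hardness H = P_max / A_c: draw the load and contact area from their assumed distributions and look at the spread of the resulting hardness. All input values and uncertainties below are assumed examples, not the instrument's actual figures.

```python
import random
import statistics

# Monte Carlo propagation of uncertainty for nanoindentation hardness
# H = P_max / A_c. Inputs are assumed example values, not measured data.
random.seed(42)

P_MAX_MN = 10.0   # maximum load, mN (assumed)
U_P_MN = 0.05     # standard uncertainty of load, mN (assumed)
A_UM2 = 0.90      # projected contact area, um^2 (assumed)
U_A_UM2 = 0.03    # standard uncertainty of area, um^2 (assumed)

N = 100_000
samples = []
for _ in range(N):
    p = random.gauss(P_MAX_MN, U_P_MN)   # draw a load value
    a = random.gauss(A_UM2, U_A_UM2)     # draw an area value
    samples.append(p / a)                # hardness; mN/um^2 equals GPa

h_mean = statistics.fmean(samples)
h_std = statistics.stdev(samples)
print(f"H = {h_mean:.2f} GPa, u(H) = {h_std:.2f} GPa")
```

For these inputs the Monte Carlo spread agrees with the first-order propagation law, u(H)/H ≈ sqrt((u_P/P)² + (u_A/A)²), as expected for a simple quotient.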
Kim, Lindsay; McGee, Lesley; Tomczyk, Sara
2016-01-01
Streptococcus pneumoniae inflicts a huge disease burden as the leading cause of community-acquired pneumonia and meningitis. Soon after antibiotics entered mainstream use, multiresistant pneumococcal clones emerged and disseminated worldwide. Resistant clones are generated through adaptation to antibiotic pressures imposed while naturally residing within the human upper respiratory tract. Here, a huge array of related commensal streptococcal strains transfers core genomic and accessory resistance determinants to the highly transformable pneumococcus. β-Lactam resistance is the hallmark of pneumococcal adaptability, requiring multiple independent recombination events that are traceable to nonpneumococcal origins and stably perpetuated in multiresistant clonal complexes. Pneumococcal strains with elevated MICs of β-lactams are most often resistant to additional antibiotics. Basic underlying mechanisms of most pneumococcal resistances have been identified, although new insights that increase our understanding are continually provided. Although all pneumococcal infections can be successfully treated with antibiotics, the available choices are limited for some strains. Invasive pneumococcal disease data compiled during 1998 to 2013 through the population-based Active Bacterial Core surveillance program (U.S. population base of 30,600,000) demonstrate that targeting prevalent capsular serotypes with conjugate vaccines (7-valent and 13-valent vaccines implemented in 2000 and 2010, respectively) is extremely effective in reducing resistant infections. Nonetheless, resistant non-vaccine-serotype clones continue to emerge and expand. PMID:27076637
Applicability of SCAR markers to food genomics: olive oil traceability.
Pafundo, Simona; Agrimonti, Caterina; Maestri, Elena; Marmiroli, Nelson
2007-07-25
DNA analysis with molecular markers has opened a shortcut toward a genomic comprehension of complex organisms. The availability of micro-DNA extraction methods, coupled with selective amplification of the smallest extracted fragments with molecular markers, could equally bring a breakthrough in food genomics: the identification of original components in food. Amplified fragment length polymorphisms (AFLPs) have been instrumental in plant genomics because they allow rapid and reliable analysis of multiple and potentially polymorphic sites. Nevertheless, their direct application to the analysis of DNA extracted from food matrixes is complicated by the low quality of the extracted DNA: its high degradation and the presence of inhibitors of enzymatic reactions. The conversion of an AFLP fragment to a robust and specific single-locus PCR-based marker, therefore, could extend the use of molecular markers to large-scale analysis of complex agro-food matrixes. The present study reports the development of sequence characterized amplified regions (SCARs) starting from AFLP profiles of monovarietal olive oils analyzed on agarose gel; one of these was used to identify differences among 56 olive cultivars. All the developed markers were deliberately amplified in olive oils in order to apply them to olive oil traceability.
Traceable measurements of the electrical parameters of solid-state lighting products
NASA Astrophysics Data System (ADS)
Zhao, D.; Rietveld, G.; Braun, J.-P.; Overney, F.; Lippert, T.; Christensen, A.
2016-12-01
In order to perform traceable measurements of the electrical parameters of solid-state lighting (SSL) products, it is necessary to define the measurement procedures adequately and to identify the relevant uncertainty sources. The presently published written standard for SSL products specifies test conditions, but it lacks an explanation of how adequate these test conditions are. More specifically, both an identification of uncertainty sources and a quantitative uncertainty analysis are absent. This paper fills that gap in the present written standard. New uncertainty sources with respect to conventional lighting sources are determined and their effects are quantified. It shows that for power measurements, the main uncertainty sources are temperature deviation, power supply voltage distortion, and instability of the SSL product. For current RMS measurements, the influences of bandwidth, the shunt resistor, power supply source impedance and ac frequency flatness are significant as well. The measurement uncertainty depends not only on the test equipment but is also a function of the characteristics of the device under test (DUT), for example, its current harmonics spectrum and input impedance. Therefore, an online calculation tool is provided to help non-electrical experts. Following our procedures, unrealistic uncertainty estimations, unnecessary procedures and expensive equipment can be avoided.
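A minimal sketch of how independent uncertainty contributions for such a power measurement might be combined under the GUM root-sum-of-squares rule; the contribution names and magnitudes are assumed examples, not values from the paper:

```python
import math

# Combining independent relative standard-uncertainty contributions for
# an SSL power measurement by root-sum-of-squares (the GUM law of
# propagation for uncorrelated inputs). Values are assumed examples.
contributions_pct = {
    "temperature deviation": 0.20,
    "supply voltage distortion": 0.15,
    "DUT instability": 0.10,
    "power analyzer accuracy": 0.05,
}

u_combined = math.sqrt(sum(u ** 2 for u in contributions_pct.values()))
U_expanded = 2.0 * u_combined  # coverage factor k = 2 (approx. 95 %)
print(f"u_c = {u_combined:.3f} %, U (k=2) = {U_expanded:.3f} %")
```

Note how the largest contribution dominates the quadrature sum, which is why the paper's ranking of uncertainty sources matters in practice.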
Using templates and linguistic patterns to define process performance indicators
NASA Astrophysics Data System (ADS)
del-Río-Ortega, Adela; Resinas, Manuel; Durán, Amador; Ruiz-Cortés, Antonio
2016-02-01
Process performance management (PPM) aims at measuring, monitoring and analysing the performance of business processes (BPs), in order to check the achievement of strategic and operational goals and to support decision-making for their optimisation. PPM is based on process performance indicators (PPIs), so having an appropriate definition of them is crucial. One of the main problems of PPIs definition is to express them in an unambiguous, complete, understandable, traceable and verifiable manner. In practice, PPIs are defined informally - usually in ad hoc, natural language, with its well-known problems - or they are defined from an implementation perspective, hardly understandable to non-technical people. In order to solve this problem, in this article we propose a novel approach to improve the definition of PPIs using templates and linguistic patterns. This approach promotes reuse, reduces both ambiguities and missing information, is understandable to all stakeholders and maintains traceability with the process model. Furthermore, it enables the automated processing of PPI definitions by its straightforward translation into the PPINOT metamodel, allowing the gathering of the required information for their computation as well as the analysis of the relationships between them and with BP elements.
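The template idea can be sketched as a small structured record: each slot of the template constrains one aspect of the PPI definition. The field names below are illustrative assumptions, not the actual PPINOT metamodel.

```python
from dataclasses import dataclass

# Hypothetical sketch of a template-based PPI definition in the spirit of
# the approach described above. Field names are illustrative assumptions.
@dataclass
class PPI:
    identifier: str
    process: str   # business process the PPI is defined over
    measure: str   # what is measured (e.g. a duration or a count)
    target: str    # target condition to check against
    scope: str     # which process instances are considered

ppi = PPI(
    identifier="PPI-001",
    process="Incident management",
    measure="average duration from 'incident registered' to 'incident closed'",
    target="must be lower than 2 working days",
    scope="instances closed in the current quarter",
)

# A filled template stays unambiguous and machine-processable while
# remaining readable by non-technical stakeholders.
print(f"{ppi.identifier}: {ppi.measure} ({ppi.target})")
```

Because every slot is explicit, such a record can be checked for completeness and translated mechanically into a computable form, which is the point of the template-and-patterns approach.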
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on proper sample preparation and calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a rigorous approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
Game and venison - meat for the modern consumer.
Hoffman, L C; Wiklund, E
2006-09-01
This review focuses on how game meat from southern Africa and venison, which are increasingly being imported into Europe and the US, address consumer issues pertaining to production (wild, free range or intensive production) and harvesting methods, healthiness (chemical composition, particularly fatty acid composition), and traceability. Although African game meat species are farmed extensively, deer species are farmed using extensive to intensive production systems. However, the increasingly intensive production of the cervids and the accompanying practices associated with it (castration, velvetting, feeding of balanced diets, etc.) may have a negative impact in the near future on the consumer's perception of these animals. These alternative meat species are all harvested in a sustainable manner using acceptable methods. All these species have very low muscle fat contents consisting predominantly of structural lipid components (phospholipid and cholesterol) that have high proportions of polyunsaturated fatty acids. This results in the meat having desirable polyunsaturated:saturated and n-6:n-3 fatty acid ratios. The South African traceability system is discussed briefly as an example of how these exporting countries are able to address the requirements pertaining to the import of meat as stipulated by the European Economic Community.
Hedoux, S; Dode, X; Pivot, C; Couray-Targe, S; Aulagner, G
2012-07-01
The best practice contract has given hospital pharmacists a new objective for reimbursement in addition to Diagnosis Related Groups' (DRG) tariffs. We built our pharmaceutical quality control for the follow-up of administration traceability with regard to DRGs and the cost of care, for two reasons: the nominative dispensing of drugs linked to the prescription validated by the pharmacist, and the high expenditure on these drugs. Our organization depends on the development level of the computerized drug circuit and minimizes the risk of financial shortfalls or undue gains, possible causes of economic penalties for our hospital. On the basis of this follow-up, we highlighted our activity and identified problems in the management and organization of the drug circuit. The quality of administration traceability directly affects the quality of medical records and the reimbursement of expensive drugs. Better knowledge of the prescription software is also required to improve the quality and security of the medical data used in hospital information systems. Drug management and the handling of patients' own treatments in and between care units also need to be improved. We must continue and improve our organization in view of the future financial model for ATU drugs and the FIDES project. Raising health personnel awareness and developing better informatics tools are also required. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
SU-F-J-06: Optimized Patient Inclusion for NaF PET Response-Based Biopsies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roth, A; Harmon, S; Perk, T
Purpose: A method to guide mid-treatment biopsies using quantitative [F-18]NaF PET/CT response is being investigated in a clinical trial. This study aims to develop methodology to identify patients amenable to mid-treatment biopsy based on pre-treatment imaging characteristics. Methods: 35 metastatic prostate cancer patients had NaF PET/CT scans taken prior to the start of treatment and 9–12 weeks into treatment. For mid-treatment biopsy targeting, lesions must be at least 1.5 cm³ and located in a clinically feasible region (lumbar/sacral spine, pelvis, humerus, or femur). Three methods were developed based on the number of lesions present prior to treatment: a feasibility-restricted method, a location-restricted method, and an unrestricted method. The feasibility-restricted method only utilizes information from lesions meeting biopsy requirements in the pre-treatment scan. The unrestricted method accounts for all lesions present in the pre-treatment scan. For each method, optimized classification cutoffs for candidate patients were determined. Results: 13 of the 35 patients had enough lesions at mid-treatment for biopsy candidacy. Of 1749 lesions identified in all 35 patients at mid-treatment, only 9.8% were amenable to biopsy. Optimizing the feasibility-restricted method required 4 lesions at pre-treatment meeting volume and region requirements for biopsy, resulting in a patient identification sensitivity of 0.8 and specificity of 0.7. Of 6 false positive patients, only one patient lacked lesions for biopsy. Restricting for location alone showed poor results (sensitivity 0.2 and specificity 0.3). The optimized unrestricted method required patients to have at least 37 lesions in the pre-treatment scan, resulting in a sensitivity of 0.8 and specificity of 0.8. There were 5 false positives; only one lacked lesions for biopsy.
Conclusion: Incorporating the overall pre-treatment number of NaF PET/CT identified lesions provided the best prediction for identifying candidate patients for mid-treatment biopsy. This study provides validity for prediction-based inclusion criteria that can be extended to various clinical trial scenarios. Funded by Prostate Cancer Foundation.
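The cutoff-optimization step can be sketched as a threshold scan that maximizes sensitivity plus specificity over candidate lesion-count cutoffs. The patient records below are fabricated for illustration, not trial data.

```python
# Classify patients as biopsy candidates when their pre-treatment lesion
# count meets a threshold, then score the rule against the (assumed) true
# mid-treatment outcome. All data are invented illustrations.
patients = [
    # (pre-treatment lesion count, truly amenable at mid-treatment?)
    (45, True), (60, True), (38, True), (12, False),
    (40, False), (8, False), (55, True), (20, False),
]

def sens_spec(threshold):
    tp = sum(1 for n, y in patients if n >= threshold and y)
    fn = sum(1 for n, y in patients if n < threshold and y)
    tn = sum(1 for n, y in patients if n < threshold and not y)
    fp = sum(1 for n, y in patients if n >= threshold and not y)
    return tp / (tp + fn), tn / (tn + fp)

# Scan thresholds and keep the one maximizing sensitivity + specificity.
best = max(range(1, 80), key=lambda t: sum(sens_spec(t)))
print(best, sens_spec(best))
```

The same scan over the real cohort is what yields cutoffs like "at least 37 pre-treatment lesions" with their associated sensitivity and specificity.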
NASA Astrophysics Data System (ADS)
1994-01-01
Summer School, 27 June to 8 July 1994, Viana do Castelo, Hotel do Parque, Portugal. Optical fibres, with their extremely low transmission loss, untapped bandwidth and controllable dispersion, dominate a broad range of technologies in which applications must respond to the increasing constraints of today's specifications as well as envisage future requirements. Optical fibres dominate communications systems. In the area of sensors, fibre optics will be fully exploited for their immunity to EMI, their high sensitivity and their large dynamic range. The maturity of single mode optical technology has led to intensive R&D of a range of components based on the advantages of transmission characteristics and signal processing. Specifications and intercompatibility requirements for the new generation of both analogue and digital fibre optical components and systems have created a demand for sophisticated measuring techniques based on unique and complex instruments. In recent years there has been a significant evolution in response to the explosion of applications and the tightening of specifications. These developments justify a concerted effort to focus on trends in optical fibre metrology and standards. Objective: The objective of this school is to provide a progressive and comprehensive presentation of current issues concerning passive and active optical fibre characterization and measurement techniques. Passive fibre components support a variety of developments in optical fibre systems and will be discussed in terms of relevance and standards. Particular attention will be paid to devices for metrological purposes such as reference fibres and calibration artefacts. The characterization and testing of optical fibre amplifiers, which have great potential in telecommunications, data distribution networks and as a system part in instrumentation, will be covered.
Methods of measurement and means of calibration with traceability will be discussed, together with the characterization requirements of the new generation of analogue and digital fibre optical systems, which require sophisticated measurement techniques employing complex instruments unique to optical measurements. The school will foster and enhance the interaction between material, devices, systems, and standards-oriented R&D communities, as well as between engineers concerned with design and manufacturers of systems and instrumentation.

Topics:
• Review of optical fibre communication technology and systems
• Measurement techniques for fibre characterization: reliability and traceability; reference fibres and calibration artefacts; ribbon fibres; mechanical and environmental testing; fibre reliability; polarimetric measurements
• Passive components characterization: splices and connectors; couplers, splitters, taps and WDMs; optical fibres and isolators
• WDM technologies and applications: WDM technologies; tunable optical filters
• Fibre amplifiers and sources: performances and characterization; design and standards; nonlinear effects
• Subsystem design and standards: design and fabrication techniques; performance degradation and reliability; evaluation of costs/performance/technology
• Sensors
• IR optical fibres
• Plastic fibres
• Instrumentation

Registration: Participation is free of charge for postgraduate students, with some grants available for travel and lodging expenses. All correspondence should be addressed to: Secretariat, Trends in Optical Fibre Metrology and Standards, a/c Prof. Olivério D D Soares, Centro de Ciências e Tecnologias Opticas, Lab. Fisica - Faculdade de Ciências, Praça Gomes Teixeira, P-4000 Porto, Portugal. Tel: 351-2-310290, 351-2-2001648; Fax: 351-2-319267.
Mennecke, B E; Townsend, A M; Hayes, D J; Lonergan, S M
2007-10-01
This study utilizes an analysis technique commonly used in marketing, the conjoint analysis method, to examine the relative utilities of a set of beef steak characteristics considered by a national sample of 1,432 US consumers, as well as additional localized samples representing undergraduate students at a business college and in an animal science department. The analyses indicate that among all respondents, region of origin is by far the most important characteristic; this is followed by animal breed, traceability, animal feed, and beef quality. Alternatively, the cost of cut, farm ownership, the use (or nonuse) of growth promoters, and whether the product is guaranteed tender were the least important factors. Results for animal science undergraduates are similar to the aggregate results, except that these students emphasized beef quality at the expense of traceability and the nonuse of growth promoters. Business students also emphasized region of origin but then emphasized traceability and cost. The ideal steak for the national sample is from a locally produced, choice Angus fed a mixture of grain and grass that is traceable to the farm of origin. If the product was not produced locally, respondents indicated that their preferred production states are, in order from most to least preferred, Iowa, Texas, Nebraska, and Kansas.
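The part-worth logic of conjoint analysis can be sketched for a balanced toy design: a level's part-worth is the mean rating of profiles carrying that level minus the grand mean, and an attribute's importance follows from the range of its part-worths. Profiles and ratings below are invented, not the study's data.

```python
from statistics import fmean

# Toy conjoint-style part-worth estimation on a balanced 2x2 design.
# Profiles and ratings are invented illustrations.
profiles = [
    ({"origin": "local", "traceable": True}, 8.0),
    ({"origin": "local", "traceable": False}, 7.0),
    ({"origin": "imported", "traceable": True}, 5.0),
    ({"origin": "imported", "traceable": False}, 4.0),
]

grand = fmean(r for _, r in profiles)

def part_worth(attr, level):
    # Mean rating of profiles with this level, relative to the grand mean.
    return fmean(r for p, r in profiles if p[attr] == level) - grand

# Attribute importance is driven by the range of its part-worths.
pw_origin = part_worth("origin", "local") - part_worth("origin", "imported")
pw_trace = part_worth("traceable", True) - part_worth("traceable", False)
print(pw_origin, pw_trace)
```

In this toy example, origin has three times the utility range of traceability, echoing the study's finding that region of origin dominates the other attributes.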
NASA Astrophysics Data System (ADS)
Wähmer, M.; Anhalt, K.; Hollandt, J.; Klein, R.; Taubert, R. D.; Thornagel, R.; Ulm, G.; Gavrilov, V.; Grigoryeva, I.; Khlevnoy, B.; Sapritsky, V.
2017-10-01
Absolute spectral radiometry is currently the only established primary thermometric method for the temperature range above 1300 K. Up to now, the ongoing improvements of high-temperature fixed points and their formal implementation into an improved temperature scale with the mise en pratique for the definition of the kelvin rely solely on single-wavelength absolute radiometry traceable to the cryogenic radiometer. Two alternative primary thermometric methods, yielding comparable or possibly even smaller uncertainties, have been proposed in the literature. They use ratios of irradiances to determine the thermodynamic temperature traceable to blackbody radiation and synchrotron radiation. At PTB, a project has been established in cooperation with VNIIOFI to use, for the first time, all three methods simultaneously for the determination of the phase-transition temperatures of high-temperature fixed points. For this, a dedicated four-wavelength ratio filter radiometer was developed. With all three thermometric methods performed independently and in parallel, we aim to compare the potential and practical limitations of all three methods, disclose possibly undetected systematic effects of each method, and thereby confirm or improve the previous measurements traceable to the cryogenic radiometer. This will give further and independent confidence in the thermodynamic temperature determination of the high-temperature fixed points' phase transitions.
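The ratio principle can be sketched from Planck's law: the ratio of spectral radiances at two wavelengths is a monotone function of temperature, so a measured ratio can be inverted numerically. The wavelengths and simulated fixed-point temperature below are assumed example values, not the radiometer's actual channels.

```python
import math

# Two-wavelength ratio thermometry: recover T from the radiance ratio
# via Planck's law and bisection. All numbers are assumed examples.
C2 = 1.438776877e-2  # second radiation constant c2, m*K

def log_radiance(lam, T):
    # log of Planck spectral radiance up to a T-independent constant,
    # written to avoid overflow of exp(c2 / (lam * T)) at low T.
    x = C2 / (lam * T)
    return -5.0 * math.log(lam) - x - math.log1p(-math.exp(-x))

LAM1, LAM2 = 650e-9, 950e-9  # filter wavelengths, m (assumed)

def log_ratio(T):
    return log_radiance(LAM1, T) - log_radiance(LAM2, T)

def temperature_from_log_ratio(target, lo=500.0, hi=5000.0):
    # log_ratio(T) increases monotonically with T when LAM1 < LAM2.
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if log_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

T_true = 1357.77  # approximate Cu freezing-point temperature, K
T_rec = temperature_from_log_ratio(log_ratio(T_true))
print(f"recovered T = {T_rec:.3f} K")
```

Because only a ratio is needed, the absolute responsivity of the detector cancels, which is what makes this route independent of single-wavelength absolute radiometry.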
Traceable calibration and demonstration of a portable dynamic force transfer standard
NASA Astrophysics Data System (ADS)
Vlajic, Nicholas; Chijioke, Ako
2017-08-01
In general, the dynamic sensitivity of a force transducer depends upon the mechanical system in which it is used. This dependence motivates the development of a dynamic force transfer standard, which can be used to calibrate an application transducer in situ. In this work, we SI-traceably calibrate a hand-held force transducer, namely an impact hammer, by using a mass suspended from a thin line which is cut to produce a known dynamic force in the form of a step function. We show that this instrument is a promising candidate as a transfer standard, since its dynamic response has small variance between different users. This calibrated transfer standard is then used to calibrate a secondary force transducer in an example application setting. The combined standard uncertainty (k = 2) in the calibration of the transfer standard was determined to be 2.1% or less, up to a bandwidth of 5 kHz. The combined standard uncertainty (k = 2) in the performed transfer calibration was less than 4%, up to 3 kHz. An advantage of the transfer calibration framework presented here is that the transfer standard can be used to transfer SI-traceable calibrations without the use of any SI-traceable voltage metrology instrumentation.
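The known step force of the suspended-mass method follows directly from F = m g, and the transducer sensitivity is the observed output step divided by that force. A minimal sketch with assumed numbers (not the paper's values):

```python
# A mass hung from a line exerts a known static force F = m * g; cutting
# the line removes that force as a near-ideal step, and the transducer's
# sensitivity follows from the observed output step. Numbers are assumed.
G_LOCAL = 9.80665  # local gravitational acceleration, m/s^2 (assumed)
MASS_KG = 2.000    # suspended mass, kg (assumed)

f_step = MASS_KG * G_LOCAL  # known force step, N

v_before, v_after = 0.1961, 0.0000        # transducer output, V (assumed)
sensitivity = (v_before - v_after) / f_step  # V/N

print(f"step = {f_step:.4f} N, sensitivity = {sensitivity * 1e3:.3f} mV/N")
```

In practice the local value of g and the mass must themselves be SI-traceable, which is how the step inherits its traceability.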
Shukla, Jayendra Nath; Palli, Subba Reddy
2013-12-01
Tribolium castaneum Transformer (TcTra) is essential for female sex determination and maintenance through the regulation of sex-specific splicing of doublesex (dsx) pre-mRNA. In females, TcTra also regulates the sex-specific splicing of its own pre-mRNA to ensure continuous production of functional Tra protein. Transformer protein is absent in males, and hence dsx pre-mRNA is spliced in a default mode. The mechanisms by which males inhibit the production of functional Tra protein are not known. Here, we report on the functional characterization of the transformer-2 (tra-2) gene (an ortholog of Drosophila transformer-2) in T. castaneum. RNA interference-mediated knockdown of the expression of the tra-2 gene in female pupae or adults resulted in the production of the male-specific isoform of dsx and both female and male isoforms of tra, suggesting that Tra-2 is essential for the female-specific splicing of tra and dsx pre-mRNAs. Interestingly, knockdown of tra-2 in males did not affect the splicing of dsx but resulted in the production of both female and male isoforms of tra, suggesting that Tra-2 suppresses female-specific splicing of tra pre-mRNA in males. This dual regulation of sex-specific splicing of tra pre-mRNA ensures a tight regulation of sex determination and maintenance. These data suggest a critical role for Tra-2 in suppression of the female sex determination cascade in males. In addition, RNAi studies showed that Tra-2 is also required for successful embryonic and larval development in both sexes. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Priestly, Kory; Smith, George L.; Thomas, Susan; Maddock, Suzanne L.
2009-01-01
Continuation of the Earth Radiation Budget (ERB) Climate Data Record (CDR) has been identified as critical in the 2007 NRC Decadal Survey, the Global Climate Observing System WCRP report, and in an assessment titled Impacts of NPOESS Nunn-McCurdy Certification on Joint NASA-NOAA Climate Goals. In response, NASA, NOAA and NPOESS agreed in early 2008 to fly the final existing CERES Flight Model (FM-5) on the NPP spacecraft for launch in 2010. Future opportunities for ERB CDR continuity consist of procuring an additional CERES sensor with modest performance upgrades for flight on the NPOESS C1 spacecraft in 2013, followed by a new CERES follow-on sensor for flight in 2018 on the NPOESS C3 spacecraft. While science goals remain unchanged for the long-term ERB Climate Data Record, it is now understood that the task of achieving these goals is more difficult for two reasons. The first is an increased understanding of the dynamics of the Earth/atmosphere system, which demonstrates that rigorous separation of natural variability from anthropogenic change on decadal time scales requires higher accuracy and stability than originally envisioned. Secondly, future implementation scenarios involve less redundancy in flight hardware (1 vs. 2 orbits and operational sensors), resulting in a higher risk of loss of continuity and a reduced number of independent observations to characterize the performance of individual sensors. Although EOS CERES CDRs realize a factor of 2 to 4 improvement in accuracy and stability over previous ERBE CDRs, future sensors will require an additional factor of 2 improvement to answer rigorously the science questions moving forward. Modest investments in onboard calibration hardware and the pre-flight calibration and test program, defined through the CERES Science Team's 30-year operational history of the EOS CERES sensors, will ensure meeting these goals while reducing the costs of re-processing scientific datasets.
The CERES FM-5 pre-flight radiometric characterization program benefited from the 30-year operational experience of the CERES EOS sensors, as well as a stronger emphasis of radiometric characterization in the Statement of Work with the sensor provider. Improvements to the pre-flight program included increased spectral, spatial, and temporal sampling under vacuum conditions as well as additional tests to characterize the primary and transfer standards in the calibration facility. Future work will include collaboration with NIST to further enhance the understanding of the radiometric performance of this equipment prior to flight. The current effort summarizes these improvements to the CERES FM-5 pre-flight sensor characterization program, as well as modifications to inflight calibration procedures and operational tasking. In addition, an estimate of the impacts to the system level accuracy and traceability is presented.
NASA Astrophysics Data System (ADS)
Hamelin, Elizabeth I.; Blake, Thomas A.; Perez, Jonas W.; Crow, Brian S.; Shaner, Rebecca L.; Coleman, Rebecca M.; Johnson, Rudolph C.
2016-05-01
Public health response to large scale chemical emergencies presents logistical challenges for sample collection, transport, and analysis. Diagnostic methods used to identify and determine exposure to chemical warfare agents, toxins, and poisons traditionally involve blood collection by phlebotomists, cold transport of biomedical samples, and costly sample preparation techniques. Dried blood spots, which consist of dried blood on an FDA-approved substrate, can increase analyte stability, decrease the infection hazard for those handling samples, greatly reduce the cost of shipping and storing samples by removing the need for refrigeration and cold-chain transportation, and can be self-prepared by potentially exposed individuals using a simple finger prick and blood-spot-compatible paper. Our laboratory has developed clinical assays to detect human exposures to nerve agents through the analysis of specific protein adducts and metabolites, for which a simple extraction from a dried blood spot is sufficient for removing matrix interferents and attaining sensitivities on par with traditional sampling methods. The use of dried blood spots can bridge the gap between the laboratory and the field, allowing for large scale sample collection with minimal impact on hospital resources while maintaining sensitivity, specificity, traceability, and quality requirements for both clinical and forensic applications.
Beyond a series of security nets: Applying STAMP & STPA to port security
Williams, Adam D.
2015-11-17
Port security is an increasing concern considering the significant role of ports in global commerce and today's increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- 'a series of security nets' based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The 'System-Theoretic Accident Model and Process (STAMP)' is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP's broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries of other nations. In addition, systematic development processes are indispensable elements of software development: they help developers increase productivity and efficiency and avoid unnecessary risks during the development process. This paper therefore presents an application of Lightweight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. The application integrates the user requirements, system design and testing stages of the software development process into a three-layer (Domain, Concept and Instance) model, expresses the model in structural Systems Modeling Language (SysML) diagrams, and converts part of the manual effort necessary for project management maintenance into computational effort, for example (semi-)automatic traceability management. The application supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document" and "system test document", and delivers a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions developing medical information systems and supporting project management to achieve the aim of patient safety.
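The (semi-)automatic traceability management mentioned above can be illustrated with a minimal sketch: given mappings from requirements to design elements and from design elements to tests, report any requirement that lacks a complete trace. All artifact IDs below are hypothetical examples, not taken from the CMRP project.

```python
# Sketch of automated traceability checking across the three artifact
# layers (requirements -> design -> tests). IDs are illustrative only.

def untraced_requirements(req_to_design, design_to_test):
    """Return requirement IDs lacking a complete trace to at least one test."""
    missing = []
    for req, designs in req_to_design.items():
        tests = [t for d in designs for t in design_to_test.get(d, [])]
        if not tests:
            missing.append(req)
    return sorted(missing)

req_to_design = {
    "REQ-01": ["DES-01"],
    "REQ-02": ["DES-02"],
    "REQ-03": [],            # no design element yet
}
design_to_test = {
    "DES-01": ["TST-01"],
    "DES-02": [],            # design exists but is untested
}

print(untraced_requirements(req_to_design, design_to_test))  # REQ-02 and REQ-03
```

A real tool would read these mappings out of the SysML model rather than hard-coded dictionaries, but the gap-finding logic is the same.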
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBAs) which engage in a collaborative problem-solving effort. The method provides a comprehensive and integrated approach to the development of this type of system, including a systematic analysis of user requirements as well as a structured approach to generating a system design that exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefit of this approach is that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
Traceable calibration of photovoltaic reference cells using natural sunlight
NASA Astrophysics Data System (ADS)
Müllejans, H.; Zaaiman, W.; Pavanello, D.; Dunlop, E. D.
2018-02-01
At the European Solar Test Installation (ESTI) photovoltaic (PV) reference cells are calibrated traceably to SI units via the World Radiometric Reference (WRR) using natural sunlight. The Direct Sunlight Method (DSM) is described in detail and the latest measurement results and an updated uncertainty budget are reported. These PV reference cells then provide a practical means for measuring the irradiance of natural or simulated sunlight during the calibration of other PV devices.
Kim, Yeong Gug; Woo, Eunju
2016-07-01
The objectives of this study are to apply the technology acceptance model (TAM), extended with perceived information, to individuals' behavioral intention to use QR codes for a food traceability system, and to determine the moderating effect of food involvement on the relationship between perceived information and perceived usefulness. Results from a survey of 420 respondents are analyzed using structural equation modeling. The study findings reveal that the extended TAM has a satisfactory fit to the data and that the underlying dimensions have a significant effect on consumers' intention to use the QR code for the food traceability system. In addition, food involvement plays a significant moderating function in the relationship between perceived information and perceived usefulness. The implications of this study for future research are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
A new simplex chemometric approach to identify olive oil blends with potentially high traceability.
Semmar, N; Laroussi-Mezghani, S; Grati-Kamoun, N; Hammami, M; Artaud, J
2016-10-01
Olive oil blends (OOBs) are complex matrices combining different cultivars at variable proportions. Although qualitative determinations of OOBs have been the subject of several chemometric studies, quantitative evaluations of their contents remain poorly developed because of traceability difficulties concerning co-occurring cultivars. Around this question, we recently published an original simplex approach that helps to develop predictive models of the proportions of co-occurring cultivars from the chemical profiles of the resulting blends (Semmar & Artaud, 2015). Beyond predictive model construction and validation, this paper presents an extension based on the analysis of prediction errors to statistically define the blends with the highest predictability among all those that can be made by mixing cultivars at different proportions. This provides an interesting way to identify, a priori, labeled commercial products with potentially high traceability, taking into account the natural chemical variability of the different constitutive cultivars. Copyright © 2016 Elsevier Ltd. All rights reserved.
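The core unmixing idea behind such simplex approaches can be sketched simply: a blend's chemical profile is modelled as a proportion-weighted mixture of the cultivar profiles, and for two cultivars the least-squares proportion has a closed form. The profiles below are illustrative numbers, not data from the paper, and the paper's actual models are richer than this two-cultivar case.

```python
# Two-cultivar unmixing sketch: estimate the proportion p such that
# blend ~= p * cult_a + (1 - p) * cult_b, by least squares.

def estimate_proportion(blend, cult_a, cult_b):
    """Least-squares estimate of cultivar A's proportion, clamped to [0, 1]."""
    diff = [a - b for a, b in zip(cult_a, cult_b)]
    num = sum((x - b) * d for x, b, d in zip(blend, cult_b, diff))
    den = sum(d * d for d in diff)
    return max(0.0, min(1.0, num / den))

cult_a = [10.0, 2.0, 5.0]   # illustrative chemical profile of cultivar A
cult_b = [4.0, 8.0, 1.0]    # illustrative profile of cultivar B
blend = [0.7 * a + 0.3 * b for a, b in zip(cult_a, cult_b)]  # exact 70/30 mix

print(round(estimate_proportion(blend, cult_a, cult_b), 3))  # recovers 0.7
```

Prediction-error analysis, as in the paper, would then study how this estimate degrades when the blend profile carries the natural chemical variability of the cultivars.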
NASA Astrophysics Data System (ADS)
Leuenberger, Daiana; Pascale, Céline; Guillevic, Myriam; Ackermann, Andreas; Niederhauser, Bernhard
2017-04-01
Ammonia (NH3) in the atmosphere is the major agent neutralising atmospheric acids and thus affects not only the long-range transport of sulphur dioxide and nitrogen oxides but also stabilises secondary particulate matter. These aerosols have negative impacts on air quality and human health. Moreover, they negatively affect terrestrial ecosystems after deposition. NH3 has been included in the air quality monitoring networks and emission reduction directives of European nations. Atmospheric concentrations are in the order of 0.5-500 nmol/mol. However, the lowest substance amount fraction of available certified reference material (CRM) is 10 μmol/mol. This is because adsorption on the walls of aluminium cylinders, and desorption as the cylinder pressure decreases, cause substantial instabilities in the amount fractions of the gas mixtures. Moreover, the analytical techniques to be calibrated are very diverse, which poses challenges for the production and application of CRMs. The Federal Institute of Metrology METAS has developed, partially in the framework of EMRP JRP ENV55 MetNH3, an infrastructure to meet these different requirements and to generate SI-traceable NH3 reference gas mixtures dynamically in the amount fraction range 0.5-500 nmol/mol with uncertainties UNH3 < 3%. The infrastructure consists of a stationary as well as a mobile device for full flexibility in application. In the stationary system, a magnetic suspension balance monitors the temperature- and pressure-dependent mass loss over time of the pure substance (here NH3) as it permeates through the membrane of a permeation tube into a constant flow of carrier gas. Subsequently, this mixture is diluted with a system of thermal mass flow controllers in one or two consecutive steps to the desired amount fractions.
The permeation tube with calibrated permeation rate (mass loss over time previously determined in the magnetic suspension balance) can be transferred into the temperature-regulated permeation chamber of a newly developed mobile reference gas generator (ReGaS1). In addition to the permeation chamber, it comprises the same dilution system as the aforementioned stationary system. All components are fully traceable to SI standards. Considerable effort has been made to minimise adsorption on the gas-wetted stainless steel surfaces, and thus to reduce stabilisation times, by applying the SilcoNert2000® coating. Analysers can be connected directly to both the stationary and the mobile system for calibration. Moreover, the resulting gas mixture can also be pressurised into coated cylinders by cryo-filling. The mobile system as well as these cylinders can be used for calibrations in other laboratories and in the field. In addition, an SI-traceable system based on a cascade of critical orifices has been established to dilute NH3 mixtures in the order of μmol/mol stored in cylinders for participation in the international key comparison CCQM-K117. It is planned to use this system to calibrate and re-sample gas cylinders because of its very economical gas use. Here we present insights into the development of this infrastructure and the results of first performance tests. Moreover, we include results of a study on adsorption/desorption effects in dry as well as humidified matrix gas in the discussion of the generation of reference gas mixtures. Acknowledgement: This work was supported by the European Metrology Research Programme (EMRP). The EMRP is jointly funded by the EMRP participating countries within EURAMET and the European Union.
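The permeation-plus-dilution principle described above can be sketched numerically: the amount fraction follows from the permeation rate (converted to a molar flow) divided by the molar flow of the carrier gas (cf. ISO 6145-10). The flow values below are illustrative assumptions, not METAS operating parameters, and the molar volume is taken at 0 °C and 101.325 kPa.

```python
# Illustrative NH3 amount-fraction calculation for a permeation source
# diluted into a carrier flow. Numbers are examples, not METAS settings.

M_NH3 = 17.031   # g/mol, molar mass of NH3
V_M = 22.414     # L/mol, ideal-gas molar volume at 0 degC, 101.325 kPa (assumption)

def amount_fraction_nmol_mol(perm_rate_ng_min, carrier_flow_l_min):
    """NH3 amount fraction in nmol/mol from permeation rate and carrier flow."""
    mol_nh3_per_min = perm_rate_ng_min * 1e-9 / M_NH3   # mol/min of NH3
    mol_carrier_per_min = carrier_flow_l_min / V_M      # mol/min of carrier gas
    return mol_nh3_per_min / mol_carrier_per_min * 1e9  # nmol/mol

# 500 ng/min permeation into 1 L/min carrier, then a 1:100 second dilution
x1 = amount_fraction_nmol_mol(500.0, 1.0)
print(round(x1, 1), round(x1 / 100, 2))
```

With these example values the first stage yields roughly 658 nmol/mol, and the second dilution step brings the mixture down to a few nmol/mol, i.e. into the 0.5-500 nmol/mol range the infrastructure targets.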
Identification of granite varieties from colour spectrum data.
Araújo, María; Martínez, Javier; Ordóñez, Celestino; Vilán, José Antonio
2010-01-01
The granite processing sector of the northwest of Spain handles many varieties of granite with specific technical and aesthetic properties that command different prices in the natural stone market. Hence, correct granite identification and classification from the outset of processing to the end-product stage optimizes the management and control of stocks of granite slabs and tiles and facilitates the operation of traceability systems. We describe a methodology for automatically identifying granite varieties by processing spectral information captured by a spectrophotometer at various stages of processing using functional machine learning techniques.
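The identification task above can be sketched with a toy nearest-centroid classifier: an unknown colour spectrum is assigned to the variety whose mean training spectrum is closest. The spectra here are short synthetic vectors, not real spectrophotometer curves, and the paper's own method uses functional machine learning techniques that this only approximates; the variety labels are illustrative.

```python
# Toy spectrum-based variety identification via nearest class centroid.
# Spectra and labels are illustrative, not data from the study.

def centroid(spectra):
    """Component-wise mean of a list of spectra."""
    return [sum(vals) / len(vals) for vals in zip(*spectra)]

def classify(spectrum, centroids):
    """Return the variety label whose centroid is closest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(spectrum, centroids[label]))

training = {
    "Rosa Porrino": [[0.80, 0.50, 0.40], [0.82, 0.48, 0.41]],
    "Gris Mondariz": [[0.50, 0.50, 0.50], [0.52, 0.49, 0.51]],
}
centroids = {label: centroid(s) for label, s in training.items()}
print(classify([0.79, 0.50, 0.42], centroids))  # nearest to Rosa Porrino
```

Real colour spectra have hundreds of wavelength samples rather than three, but the decision rule scales unchanged.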
Software for imaging phase-shift interference microscope
NASA Astrophysics Data System (ADS)
Malinovski, I.; França, R. S.; Couceiro, I. B.
2018-03-01
In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). The instrument is, by principle of operation, an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colour as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different methodology of phase evaluation is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
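For context, the standard PSI phase evaluation the abstract contrasts against can be sketched as the classic four-step algorithm: with intensity frames at phase shifts of 0, π/2, π and 3π/2, the wrapped phase follows from an arctangent. This is the textbook method, not the instrument-specific evaluation the paper describes; the pixel values below are synthetic.

```python
# Standard four-step phase-shifting evaluation (textbook method):
# I_k = A + B*cos(phi + k*pi/2), k = 0..3  =>  phi = atan2(I4 - I2, I1 - I3)
import math

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four intensity samples shifted by pi/2 each."""
    return math.atan2(i4 - i2, i1 - i3)

# Synthetic pixel: background A, modulation B, true phase 0.7 rad
A, B, phi = 2.0, 1.0, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
print(round(four_step_phase(*frames), 6))  # recovers 0.7
```

Using atan2 rather than a plain arctangent keeps the quadrant information, so phases over the full (-π, π] range are recovered correctly.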
White, Judith; Carolan-Rees, Grace
2013-01-01
A standardised terminology for describing medical devices can enable safe and unambiguous exchange of information. Proposed changes to EU-wide medical devices regulations mandate the use of such a system. This article reviews two important classification systems for medical devices in the UK. The Global Medical Device Nomenclature (GMDN) provides a classification system specifically for medical devices and diagnostics, and facilitates data exchange between manufacturers and regulators. SNOMED CT is the terminology of choice in the NHS for communicating, sharing and storing information about patients' healthcare episodes. Harmonisation of GMDN and SNOMED CT will encourage the use of a single terminology throughout the lifetime of a device: from regulatory approval through clinical use and post-marketing surveillance. Manufacturers will be required to register medical devices with a European device database (Eudamed) and to fit certain devices with a Unique Device Identifier; both are efforts to improve the transparency and traceability of medical devices. Successful implementation of these elements depends on having a consistent nomenclature for medical devices. PMID:23885299
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full-field-of-view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
A visualization framework for design and evaluation
NASA Astrophysics Data System (ADS)
Blundell, Benjamin J.; Ng, Gary; Pettifer, Steve
2006-01-01
The creation of compelling visualisation paradigms is a craft often dominated by intuition and issues of aesthetics, with relatively few models to support good design. The majority of problem cases are approached by simply applying a previously evaluated visualisation technique. A large body of work exists covering individual aspects of visualisation design, such as human cognition, visualisation methods for specific problem areas, psychology studies and so forth, yet most frameworks regarding visualisation are applied after the fact as an evaluation measure. We present an extensible framework for visualisation aimed at structuring the design process, increasing decision traceability and delineating the notions of function, aesthetics and usability. The framework can be used to derive a set of requirements for good visualisation design and to evaluate existing visualisations, suggesting possible improvements. The framework achieves this by being both broad and general, built on top of existing work, with hooks for extensions and customisations. This paper shows how existing theories of information visualisation fit into the scheme, presents our experience in applying this framework to several designs, and offers our evaluation of the framework and the designs studied.
NASA Astrophysics Data System (ADS)
Frew, Russell; Cannavan, Andrew; Zandric, Zora; Maestroni, Britt; Abrahim, Aiman
2013-04-01
Traceability systems play a key role in assuring a safe and reliable food supply. Analytical techniques harnessing the spatial patterns in the distribution of stable isotope and trace element ratios can be used to determine the provenance of food. Such techniques offer the potential to enhance global trade by providing an independent means of verifying "paper" traceability systems and can also help to prove authenticity, to combat fraudulent practices, and to control adulteration, which are important issues for economic, religious or cultural reasons. To address some of the challenges that developing countries face in attempting to implement effective food traceability systems, the IAEA, through its Joint FAO/IAEA Division on Nuclear Techniques in Food and Agriculture, has initiated a 5-year coordinated research project involving institutes in 15 developing and developed countries (Austria, Botswana, Chile, China, France, India, Lebanon, Morocco, Portugal, Singapore, Sweden, Thailand, Uganda, UK, USA). The objective is to help member state laboratories establish robust analytical techniques and databases, validated to international standards, to determine the provenance of food. Nuclear techniques such as stable isotope and multi-element analysis, along with complementary methods, will be applied for the verification of food traceability systems and claims related to food origin, production, and authenticity. This integrated and multidisciplinary approach to strengthening capacity in food traceability will contribute to the effective implementation of holistic systems for food safety and control. The project focuses mainly on the development of techniques to confirm product authenticity, with several research partners also considering food safety issues.
Research topics encompass determination of the geographical origin of a variety of commodities, including seed oils, rice, wine, olive oil, wheat, orange juice, fish, groundnuts, tea, pork, honey and coffee, the adulteration of milk with soy protein, chemical contamination of food products, and inhomogeneity in isotopic ratios in poultry and eggs as a means to determine production history. Analytical techniques include stable isotope ratio measurements (2H/1H, 13C/12C, 15N/14N, 18O/16O, 34S/32S, 87Sr/86Sr, 208Pb/207Pb/206Pb), elemental analysis, DNA fingerprinting, fatty acid and other biomolecule profiling, chromatography-mass spectrometry and near infra-red spectroscopy.
Determinants of referral practices of clients by traditional birth attendants in Ilorin, Nigeria.
Abodunrin, O L; Akande, T M; Musa, I O; Aderibigbe, S A
2010-06-01
A sizeable number of deliveries still take place with the assistance of traditional birth attendants (TBAs) in Nigeria. This study aims to identify the factors that determine the referral practices of TBAs in Ilorin for high-risk and complicated pregnancies. This descriptive study was conducted among all 162 registered TBAs in Ilorin who were traceable, using a pre-tested, semi-structured, interviewer-administered questionnaire. About 90% of those whose source of skill acquisition was inheritance did not refer their clients appropriately, compared with 48% of those whose source of skill acquisition was formal training (p<0.05). The greater the number of trainings received, the more appropriate the referral (p<0.05). Receiving supervisory visits from qualified personnel was associated with appropriate referral practices (p<0.05). Regular training and re-training of TBAs, with routine monitoring and supportive supervision, will promote prompt referral of high-risk and complicated pregnancies and deliveries.
NASA Technical Reports Server (NTRS)
Waller, Jess M.; Beeson, Harold D.; Newton, Barry E.; Fries, Joseph (Technical Monitor)
2000-01-01
The dimensional stability of polychlorotrifluoroethylene (PCTFE) valve seats used in oxygen regulator applications was determined by thermomechanical analysis (TMA). Two traceable grades of PCTFE were tested: Kel-F 81 and Neoflon CTFE M400H. For these particular resins, the effect of percent crystallinity, zero strength time (ZST) molecular weight, resin grade, and process history (compression-molded versus extruded) on the dimensional stability and annealing behavior was determined. In addition to the traceable Kel-F 81 and Neoflon CTFE M400H grades, actual PCTFE valve seats of differing geometry and design were tested by TMA. The PCTFE valve seats were of unspecified resin grade, although certain inferences about the grade could be drawn based on knowledge of the valve seat fabrication date. Results consistently revealed dimensional instability of varying magnitude at temperatures ranging from 40 to 70 degrees Celsius. Furthermore, some of the pre-1995 seats appeared to be more dimensionally stable than those fabricated after 1995. The TMA results are discussed in the context of several proposed ignition mechanisms, namely particle impact, presence of contaminant oils and fibers, and localized heating by flow friction and/or resonance. The effect of metal constraint on the dimensional stability of PCTFE is also discussed. Finally, the effect of percent crystallinity, ZST molecular weight, resin grade, and process history (compression-molded versus extruded) on the AIT, ΔHc and impact sensitivity of various types of Neoflon CTFE M400H was determined using Kel-F 81 as a control. Results show that the AIT, ΔHc and impact sensitivity were essentially independent of Neoflon CTFE process history and structure.
Kumkrong, Paramee; Thiensong, Benjaporn; Le, Phuong Mai; McRae, Garnet; Windust, Anthony; Deawtong, Suladda; Meija, Juris; Maxwell, Paulette; Yang, Lu; Mester, Zoltán
2016-11-02
Methods based on species-specific isotope dilution were developed for the accurate and SI-traceable determination of arsenobetaine (AsBet) and methylmercury (MeHg) in prawn and cuttlefish tissues by LC-MS/MS and SPME GC-ICPMS. Quantitation of AsBet and MeHg was achieved by using a ¹³C-enriched AsBet spike (NRC CRM CBET-1) and an enriched spike of Me¹⁹⁸Hg (NRC CRM EMMS-1), respectively, wherein the analyte mass fractions in the enriched spikes were determined by reverse isotope dilution using natural-abundance AsBet and MeHg primary standards. The purity of these primary standards was characterized by quantitative ¹H NMR with the use of NIST SRM 350b benzoic acid as a primary calibrator, ensuring that the final measurement results are traceable to the SI. Validation of the ID LC-MS/MS and ID SPME GC-ICPMS methods was demonstrated by analysis of several biological CRMs (DORM-4, TORT-3, DOLT-5, BCR-627 and BCR-463) with satisfactory results. The developed methods were applied to the determination of AsBet and MeHg in two new certified reference materials (CRMs), prawn (PRON-1) and cuttlefish (SQID-1), produced jointly by the Thailand Institute of Scientific and Technological Research (TISTR) and the National Research Council Canada (NRC). With additional measurements of AsBet using LC-ICPMS with standard additions calibration and external calibration at NRC and TISTR, respectively, certified values of 1.206 ± 0.058 and 13.96 ± 0.54 mg kg⁻¹ for AsBet as As (expanded uncertainty, k = 2) were obtained for the new CRMs PRON-1 and SQID-1, respectively. The reference value of 0.324 ± 0.028 mg kg⁻¹ as Hg (expanded uncertainty, k = 2) for MeHg was obtained for SQID-1 based on the results of the ID SPME GC-ICPMS method only, whereas MeHg in PRON-1 was found to be < 0.015 mg kg⁻¹. It was found that AsBet comprised 69.7% and 99.0% of total As in the prawn and cuttlefish, respectively, whereas MeHg comprised 94.5% of total Hg in the cuttlefish. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
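The isotope-dilution principle underlying the methods above can be sketched in its simplest single-isotope-pair form: the mole ratio of natural analyte to enriched spike in a blend follows from the measured blend ratio and the ratios of the two end members. This is a simplified illustration with hypothetical ratio values; the actual IDMS equations also involve sums of isotopic abundances and reverse isotope dilution of the spike, as described in the abstract.

```python
# Simplified single-pair isotope-dilution relation (illustrative only):
# n_sample / n_spike = (R_spike - R_blend) / (R_blend - R_sample)
# where R is the heavy/light isotope amount ratio in each material.

def amount_ratio(r_spike, r_blend, r_sample):
    """Mole ratio of natural analyte to enriched spike in the blend."""
    return (r_spike - r_blend) / (r_blend - r_sample)

# Hypothetical isotope amount ratios (not values from the paper):
r_sample = 0.011   # natural-abundance material
r_spike = 99.0     # highly enriched spike
r_blend = 1.5      # ratio measured in the sample/spike blend
print(round(amount_ratio(r_spike, r_blend, r_sample), 2))
```

Multiplying this mole ratio by the known amount of spike added then gives the amount of analyte in the sample, which is what makes the result traceable to the spike characterization.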
Rodríguez-Ramírez, Roberto; González-Córdova, Aarón F; Vallejo-Cordoba, Belinda
2011-01-31
This work presents an overview of the applicability of PCR-based capillary electrophoresis (CE) in food authentication and traceability of foods from animal origin. Analytical approaches for authenticating and tracing meat and meat products and fish and seafood products are discussed. Particular emphasis will be given to the usefulness of genotyping in food tracing by using CE-based genetic analyzers. Copyright © 2010 Elsevier B.V. All rights reserved.
Waste collection multi objective model with real time traceability data.
Faccio, Maurizio; Persona, Alessandro; Zanin, Giorgia
2011-12-01
Waste collection is a highly visible municipal service that involves large expenditures and difficult operational problems, and it is expensive to operate in terms of investment costs (e.g., the vehicle fleet), operational costs (e.g., fuel and maintenance) and environmental costs (e.g., emissions, noise and traffic congestion). Modern traceability devices, like volumetric sensors, RFID (Radio Frequency Identification) systems, GPRS (General Packet Radio Service) and GPS (Global Positioning System) technology, make it possible to obtain data in real time, which is fundamental to implementing an efficient and innovative waste collection routing model. The basic idea is that knowing the real-time data of each vehicle and the real-time replenishment level at each bin makes it possible to decide, as a function of the waste generation pattern, which bins should be emptied and which should not, optimizing different aspects like the total covered distance, the necessary number of vehicles and the environmental impact. This paper reviews the traceability technology available for optimizing solid waste collection, and introduces an innovative vehicle routing model integrated with real-time traceability data, with a first application in an Italian city of about 100,000 inhabitants. The model is tested and validated using simulation, and an economic feasibility study is reported at the end of the paper. Copyright © 2011 Elsevier Ltd. All rights reserved.
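The bin-selection idea at the heart of such a model can be sketched simply: with real-time fill levels and a per-bin generation rate, empty only the bins predicted to overflow before the next scheduled visit. The thresholds, rates and bin IDs below are illustrative assumptions, not parameters from the paper's model.

```python
# Sketch of real-time bin selection for waste collection routing.
# Fill levels and rates are illustrative, not data from the study.

def bins_to_empty(bins, days_to_next_visit, capacity=1.0):
    """Return IDs of bins whose predicted fill reaches capacity in time."""
    selected = []
    for bin_id, (fill, daily_rate) in bins.items():
        predicted = fill + daily_rate * days_to_next_visit
        if predicted >= capacity:
            selected.append(bin_id)
    return sorted(selected)

bins = {
    "B1": (0.85, 0.10),  # (current fill fraction, fill fraction added per day)
    "B2": (0.30, 0.05),
    "B3": (0.60, 0.25),
}
print(bins_to_empty(bins, days_to_next_visit=2))  # B1 and B3 would overflow
```

The selected subset of bins would then feed a vehicle routing step (e.g., a travelling-salesman or VRP solver) to minimize the distance covered by the fleet.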
The Joint Committee for Traceability in Laboratory Medicine (JCTLM) - its history and operation.
Jones, Graham R D; Jackson, Craig
2016-01-30
The Joint Committee for Traceability in Laboratory Medicine (JCTLM) was formed to bring together the sciences of metrology, laboratory medicine and laboratory quality management. The aim of this collaboration is to support worldwide comparability and equivalence of measurement results in clinical laboratories for the purpose of improving healthcare. The JCTLM has its origins in the activities of international metrology treaty organizations, professional societies and federations devoted to improving measurement quality in physical, chemical and medical sciences. The three founding organizations, the International Committee for Weights and Measures (CIPM), the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) and the International Laboratory Accreditation Cooperation (ILAC) are the leaders of this activity. The main service of the JCTLM is a web-based database with a list of reference materials, reference methods and reference measurement services meeting appropriate international standards. This database allows manufacturers to select references for assay traceability and provides support for suppliers of these services. As of mid 2015 the database lists 295 reference materials for 162 analytes, 170 reference measurement procedures for 79 analytes and 130 reference measurement services for 39 analytes. There remains a need for the development and implementation of metrological traceability in many areas of laboratory medicine and the JCTLM will continue to promote these activities into the future. Copyright © 2015 Elsevier B.V. All rights reserved.
SWIR calibration of Spectralon reflectance factor
NASA Astrophysics Data System (ADS)
Georgiev, Georgi T.; Butler, James J.; Cooksey, Catherine; Ding, Leibo; Thome, Kurtis J.
2011-11-01
Satellite instruments operating in the reflective solar wavelength region require accurate and precise determination of the Bidirectional Reflectance Factor (BRF) of laboratory-based diffusers used in their pre-flight and on-orbit radiometric calibrations. BRF measurements are required throughout the reflected-solar spectrum from the ultraviolet through the shortwave infrared. Spectralon diffusers are commonly used as a reflectance standard for bidirectional and hemispherical geometries. The Diffuser Calibration Laboratory (DCaL) at NASA's Goddard Space Flight Center is a secondary calibration facility with reflectance measurements traceable to those made by the Spectral Tri-function Automated Reference Reflectometer (STARR) facility at the National Institute of Standards and Technology (NIST). For more than two decades, the DCaL has provided numerous NASA projects with BRF data in the ultraviolet (UV), visible (VIS) and near-infrared (NIR) spectral regions. Presented in this paper are measurements of BRF from 1475 nm to 1625 nm obtained using an indium gallium arsenide detector and a tunable coherent light source. The sample was a 50.8 mm (2 in) diameter, 99% white Spectralon target. The BRF results are discussed and compared to empirically generated data from a model based on NIST-certified values of 6° directional-hemispherical spectral reflectance factors from 900 nm to 2500 nm. Employing a new NIST capability for measuring bidirectional reflectance using a cooled, extended InGaAs detector, BRF calibration measurements of the same sample were also made using NIST's STARR from 1475 nm to 1625 nm at an incident angle of 0° and a viewing angle of 45°. The total combined uncertainty for BRF in this ShortWave Infrared (SWIR) range is less than 1%. This measurement capability will evolve into a BRF calibration service in the SWIR region in support of NASA remote sensing missions.
Srivastava, Abneesh; Michael Verkouteren, R
2018-07-01
Isotope ratio measurements have been conducted on a series of isotopically distinct pure CO2 gas samples using the technique of dual-inlet isotope ratio mass spectrometry (DI-IRMS). The influence of instrumental parameters and data normalization schemes on the metrological traceability and uncertainty of the sample isotope composition has been characterized. Traceability to the Vienna PeeDee Belemnite (VPDB)-CO2 scale was realized using the pure CO2 isotope reference materials (iRMs) 8562, 8563, and 8564. The uncertainty analyses include contributions associated with the values of the iRMs and the repeatability and reproducibility of our measurements. Our DI-IRMS measurement system is demonstrated to have high long-term stability, approaching a precision of 0.001 parts-per-thousand for the 45/44 and 46/44 ion signal ratios. The single- and two-point normalization biases for the iRMs were found to be within their published standard uncertainty values. The values of the 13C/12C and 18O/16O isotope ratios are expressed relative to VPDB-CO2 using the δ13C and δ18O notation, respectively, in parts-per-thousand (‰ or per mil). For the samples, value assignments between (-25 to +2) ‰ and (-33 to -1) ‰ with nominal combined standard uncertainties of (0.05, 0.3) ‰ for δ13C and δ18O, respectively, were obtained. These samples are used as laboratory references to provide anchor points for the value assignment of isotope ratios (with VPDB traceability) to pure CO2 samples. Additionally, they serve as potential parent isotopic source material for the development of gravimetrically based iRMs of CO2 in CO2-free dry air in high-pressure gas cylinder packages at desired abundance levels and isotopic composition values. Graphical abstract: CO2 gas isotope ratio metrology.
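The δ notation and the two-point normalization mentioned above can be sketched as follows. The ratio values and reference-material deltas below are hypothetical illustrations, not the NIST data; only the VPDB 13C/12C ratio is a commonly cited literature value:

```python
# Minimal sketch of delta notation and two-point normalization for
# isotope ratio data (illustrative values only).

def delta(r_sample, r_reference):
    """Delta value in per mil: relative deviation of a sample isotope
    ratio from a reference ratio, multiplied by 1000."""
    return (r_sample / r_reference - 1.0) * 1000.0

def two_point_normalize(d_measured, d1_meas, d1_true, d2_meas, d2_true):
    """Map a measured delta onto the reference scale via a linear fit
    through two isotope reference materials (iRMs)."""
    slope = (d2_true - d1_true) / (d2_meas - d1_meas)
    return d1_true + slope * (d_measured - d1_meas)

# Commonly cited VPDB 13C/12C reference ratio
R_VPDB = 0.0111802

# Hypothetical sample ratio, giving a delta of roughly -25 per mil
d = delta(0.0109006, R_VPDB)

# Hypothetical measured vs. assigned deltas for two anchoring iRMs
d_norm = two_point_normalize(d, -24.8, -25.0, 1.9, 2.0)
```

The normalization maps the two iRM measurements exactly onto their assigned values and interpolates (or extrapolates) linearly for everything else, which is what makes any residual bias at the anchor points directly visible.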
Legge, Jennifer
2013-10-01
Musculoskeletal injuries account for the largest proportion of workplace injuries. In an attempt to predict, and subsequently manage, the risk of sprains and strains in the workplace, employers are turning to pre-employment screening. Functional capacity evaluations (FCEs) are increasing in popularity as a tool for pre-employment screening despite limited published evidence for their validity in healthy working populations. This narrative review will present an overview of the state of the evidence for pre-employment functional testing, propose a framework for decision-making to determine the suitability of assessment tools, and discuss the role and potential ethical challenges for physiotherapists conducting pre-employment functional testing. Much of the evidence surrounding the validity of functional testing is in the context of the injured worker and prediction of return to work. In healthy populations, FCE components, such as aerobic fitness and manual handling activities, have demonstrated predictability of workplace injury in a small number of studies. This predictability improves when workers' performance is compared with the job demands. This job-specific approach is also required to meet anti-discrimination requirements. There are a number of practical limitations to functional testing, although these are not limited to the pre-employment domain. Physiotherapists need to have a clear understanding of the legal requirements and potential ethical challenges that they may face when conducting pre-employment functional assessments (PEFAs). Further research is needed into the efficacy of pre-employment testing for workplace injury prevention. Physiotherapists and PEFAs are just one part of a holistic approach to workplace injury prevention.
c-Myb Coordinates Survival and the Expression of Genes That Are Critical for the Pre-BCR Checkpoint.
Fahl, Shawn P; Daamen, Andrea R; Crittenden, Rowena B; Bender, Timothy P
2018-05-15
The c-Myb transcription factor is required for adult hematopoiesis, yet little is known about c-Myb function during lineage-specific differentiation due to the embryonic lethality of Myb-null mutations. We previously used tissue-specific inactivation of the murine Myb locus to demonstrate that c-Myb is required for differentiation to the pro-B cell stage, survival during the pro-B cell stage, and the pro-B to pre-B cell transition during B lymphopoiesis. However, few downstream mediators of c-Myb-regulated function have been identified. We demonstrate that c-Myb regulates the intrinsic survival of CD19+ pro-B cells in the absence of IL-7 by repressing expression of the proapoptotic proteins Bmf and Bim and that levels of Bmf and Bim mRNA are further repressed by IL-7 signaling in pro-B cells. c-Myb regulates two crucial components of the IL-7 signaling pathway: the IL-7Rα-chain and the negative regulator SOCS3 in CD19+ pro-B cells. Bypassing IL-7R signaling through constitutive activation of Stat5b largely rescues survival of c-Myb-deficient pro-B cells, whereas constitutively active Akt is much less effective. However, rescue of pro-B cell survival is not sufficient to rescue proliferation of pro-B cells or the pro-B to small pre-B cell transition, and we further demonstrate that c-Myb-deficient large pre-B cells are hypoproliferative. Analysis of genes crucial for the pre-BCR checkpoint demonstrates that, in addition to IL-7Rα, the genes encoding λ5, cyclin D3, and CXCR4 are downregulated in the absence of c-Myb, and λ5 is a direct c-Myb target. Thus, c-Myb coordinates survival with the expression of genes that are required during the pre-BCR checkpoint. Copyright © 2018 by The American Association of Immunologists, Inc.
Proposed best practice for projects that involve modelling and simulation.
O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad
2017-03-01
Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation, including the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation, a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.
Development of a multiplex DNA-based traceability tool for crop plant materials.
Voorhuijzen, Marleen M; van Dijk, Jeroen P; Prins, Theo W; Van Hoef, A M Angeline; Seyfarth, Ralf; Kok, Esther J
2012-01-01
The authenticity of food is of increasing importance for producers, retailers and consumers. All groups benefit from the correct labelling of the contents of food products. Producers and retailers want to guarantee the origin of their products and check for adulteration with cheaper or inferior ingredients. Consumers are also more demanding about the origin of their food for various socioeconomic reasons. In contrast to this increasing demand, correct labelling has become much more complex because of global transportation networks of raw materials and processed food products. Within the European integrated research project 'Tracing the origin of food' (TRACE), a DNA-based multiplex detection tool was developed: the padlock probe ligation and microarray detection (PPLMD) tool. In this paper, this method is extended to a 15-plex traceability tool with a focus on products of commercial importance such as the emmer wheat Farro della Garfagnana (FdG) and Basmati rice. The specificity of 14 plant-related padlock probes was determined and initially validated in mixtures comprising seven or nine plant species/varieties. One nucleotide difference in target sequence was sufficient for the distinction between the presence or absence of a specific target. At least 5% FdG or Basmati rice was detected in mixtures with cheaper bread wheat or non-fragrant rice, respectively. The results suggested that even lower levels of (un-)intentional adulteration could be detected. PPLMD has been shown to be a useful tool for the detection of fraudulent/intentional admixtures in premium foods and is ready for the monitoring of correct labelling of premium foods worldwide.
Detection of rabbit and hare processed material in compound feeds by TaqMan real-time PCR.
Pegels, N; López-Calleja, I; García, T; Martín, R; González, I
2013-01-01
Food and feed traceability has become a priority for governments due to consumer demand for comprehensive and integrated safety policies. In the present work, a TaqMan real-time PCR assay targeting the mitochondrial 12S rRNA gene was developed for specific detection of rabbit and hare material in animal feeds and pet foods. The technique is based on the use of three species-specific primer/probe detection systems targeting three 12S rRNA gene fragments: one specific to rabbit, one specific to hare, and a third common to rabbit and hare (62, 102 and 75 bp in length, respectively). A nuclear 18S rRNA PCR system, detecting a 77-bp amplicon, was used as positive amplification control. Assay performance and sensitivity were assessed through the analysis of a batch of laboratory-scale feeds treated at 133°C and 3 bar for 20 min to reproduce the feed processing conditions dictated by European regulations. Successful detection of highly degraded rabbit and hare material was achieved at the lowest target concentration assayed (0.1%). Furthermore, the method was applied to 96 processed commercial pet food products to determine whether correct labelling had been used at the market level. The reported real-time PCR technique detected the presence of rabbit tissues in 80 of the 96 samples analysed (83.3%), indicating possible labelling fraud in some pet foods. The real-time PCR method reported may be a useful tool for traceability purposes within the framework of feed control.
Teng, Y K Onno; Bredewold, Edwin O W; Rabelink, Ton J; Huizinga, Tom W J; Eikenboom, H C Jeroen; Limper, Maarten; Fritsch-Stork, Ruth D E; Bloemenkamp, Kitty W M; Sueters, Marieke
2017-11-20
Patients with SLE are often young females of childbearing age and a pregnancy wish in this patient group is common. However, SLE patients are at high risk for adverse pregnancy outcomes that require adequate guidance. It is widely acknowledged that pre-pregnancy counselling is the pivotal first step in the management of SLE patients with a wish to become pregnant. Next, management of these patients is usually multidisciplinary and often requires specific expertise from the different physicians involved. Very recently a EULAR recommendation was published emphasizing the need for adequate preconception counselling and risk stratification. Therefore the present review specifically addresses the issue of pre-pregnancy counselling for SLE patients with an evidence-based approach. The review summarizes data retrieved from recently published, high-quality cohort studies that have contributed to a better understanding and estimation of pregnancy-related risks for SLE patients. The present review categorizes risks from a patient-oriented point of view, that is, the influence of pregnancy on SLE, of SLE on pregnancy, of SLE on the foetus/neonate and of SLE-related medication. Lastly, pre-pregnancy counselling of SLE patients with additional secondary APS is reviewed. Collectively these data can guide clinicians to formulate appropriate preventive strategies and patient-tailored monitoring plans during pre-pregnancy counselling of SLE patients. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Proceedings of the July 2011 Traceability Research Summit.
Newsome, Rosetta L; Bhatt, Tejas; McEntire, Jennifer C
2013-12-01
At a discussion-based forum of 50 leaders in the area of food product tracing, participants recognized the need for the development of a common vision for a simple, low-cost and implementable traceability approach. A key theme that emerged during the day's discussions revolved around not reinventing the wheel: there are many efforts underway, including numerous pilots, and these efforts should be collaborative. The group sought more information on current initiatives and felt that learning from the experiences of others could help form a realistic vision for the future. Although any forthcoming actions from the US FDA are unknown, industry fully expects that improvements in product tracing will be necessary, and expects that industry itself (through the "demand" side) will enact requirements that may surpass regulatory mandates. A chief concern is uniform adoption, which will require outreach to and support from the global community as well as small firms that may lack the resources and education to keep up. Ultimately, an approach that is global, economical, scalable, and inclusive of firms of all sizes handling all types of food products will have the greatest likelihood of success. While the ability to rapidly link products across the supply chain serves as an ideal goal, there are still substantial concerns to be addressed, particularly regarding confidentiality of data and who will have access to what information under what circumstances; this concern was woven into virtually every discussion topic. Who will spearhead the development of the vision remains an open question, but there was general agreement that a joint partnership which includes all stakeholders is a necessity. © 2012 Institute of Food Technologists®
Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.
2014-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires both the processing of large volumes of test data and complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both pointwise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system that enables the connections between experimental data and corresponding multiscale modeling toolsets to be made at various length scales, as required for ICME. In particular, NASA Glenn's efforts towards establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials are described.
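The pedigree-traceability idea described above (every derived result carrying a provenance trail back to its raw data) can be sketched as a minimal data model. The class and field names below are invented for illustration, not NASA Glenn's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Minimal sketch of a traceable materials-test record. Field names are
# hypothetical, not an actual materials-database schema.

@dataclass
class TestRecord:
    material_id: str
    test_type: str                       # e.g. "tensile", "creep"
    point_data: dict                     # summary (pointwise) values
    raw_curve: List[Tuple[float, float]] # full raw data curve, (x, y) pairs
    pedigree: List[str] = field(default_factory=list)  # provenance trail

    def derive(self, step: str, **updates) -> "TestRecord":
        """Create a new version of the record, appending the processing
        step to the pedigree so every derived value stays traceable to
        the raw data it came from."""
        return TestRecord(self.material_id, self.test_type,
                          {**self.point_data, **updates},
                          self.raw_curve,
                          self.pedigree + [step])
```

Because `derive` never mutates the original record, both the raw capture and every processed version coexist, which is the essence of the version-control and pedigree functions listed in the abstract.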
NASA Astrophysics Data System (ADS)
Best, Fred A.; Revercomb, Henry E.; Knuteson, Robert O.; Tobin, David C.; Ellington, Scott D.; Werner, Mark W.; Adler, Douglas P.; Garcia, Raymond K.; Taylor, Joseph K.; Ciganovich, Nick N.; Smith, William L., Sr.; Bingham, Gail E.; Elwell, John D.; Scott, Deron K.
2005-01-01
The NASA New Millennium Program's Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) instrument provides enormous advances in water vapor, wind, temperature, and trace gas profiling from geostationary orbit. The top-level instrument calibration requirement is to measure brightness temperature to better than 1 K (3 sigma) over a broad range of atmospheric brightness temperatures, with a reproducibility of +/-0.2 K. For in-flight radiometric calibration, GIFTS uses views of two on-board blackbody sources (290 K and 255 K) along with cold space, sequenced at regular programmable intervals. The blackbody references are cavities that follow the UW Atmospheric Emitted Radiance Interferometer (AERI) design, scaled to the GIFTS beam size. The cavity spectral emissivity is better than 0.998 with an absolute uncertainty of less than 0.001. Absolute blackbody temperature uncertainties are estimated at 0.07 K. This paper describes the detailed design of the GIFTS on-board calibration system that recently underwent its Critical Design Review. The blackbody cavities use ultra-stable thermistors to measure temperature, and are coated with high emissivity black paint. Monte Carlo modeling has been performed to calculate the cavity emissivity. Both absolute temperature and emissivity measurements are traceable to NIST, and detailed uncertainty budgets have been developed and used to show the overall system meets accuracy requirements. The blackbody controller is housed on a single electronics board and provides precise selectable set point temperature control, thermistor resistance measurement, and the digital interface to the GIFTS instrument. Plans for the NIST traceable ground calibration of the on-board blackbody system have also been developed and are presented in this paper.
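The two-point radiometric calibration implied by the hot/cold blackbody views can be sketched at a single wavenumber as below. The detector counts are made up, and the real GIFTS processing operates on complex interferogram spectra rather than scalar counts; only the Planck constants are standard values:

```python
import math

# Planck's law in wavenumber form: radiance in W/(m^2 sr cm^-1) for
# wavenumber in cm^-1 and temperature in K. C1 and C2 are the standard
# radiation constants expressed in cm^-1 units.
C1 = 1.191042e-8   # W / (m^2 sr cm^-4)
C2 = 1.4387769     # cm K

def planck(wavenumber, temperature):
    """Blackbody spectral radiance at a given wavenumber."""
    return C1 * wavenumber ** 3 / math.expm1(C2 * wavenumber / temperature)

def calibrate(counts, counts_hot, counts_cold, rad_hot, rad_cold):
    """Linear two-point calibration: map raw detector counts to radiance
    using views of hot and cold reference sources."""
    gain = (rad_hot - rad_cold) / (counts_hot - counts_cold)
    return rad_cold + gain * (counts - counts_cold)

# Example at 900 cm^-1 with the 290 K and 255 K reference temperatures
nu = 900.0
L_hot, L_cold = planck(nu, 290.0), planck(nu, 255.0)

# Hypothetical raw counts for the two reference views and a scene view
L_scene = calibrate(4200.0, 5000.0, 3000.0, L_hot, L_cold)
```

In practice the cold reference is often deep space rather than the 255 K blackbody, and the cavity emissivity and reflected-background terms enter the reference radiances; the sketch shows only the linear gain/offset core of the method.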
Quality data collection and management technology of aerospace complex product assembly process
NASA Astrophysics Data System (ADS)
Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo
2017-04-01
Aiming at solving problems of difficult management and poor traceability for discrete assembly process quality data, a data collection and management method is proposed which take the assembly process and BOM as the core. Data collection method base on workflow technology, data model base on BOM and quality traceability of assembly process is included in the method. Finally, assembly process quality data management system is developed and effective control and management of quality information for complex product assembly process is realized.
Study of mechanism improving target course traceability in G-Vectoring Control
NASA Astrophysics Data System (ADS)
Yamakado, Makoto; Abe, Masato; Kano, Yoshio; Umetsu, Daisuke; Yoshioka, Thoru
2018-05-01
Production-type G-Vectoring Control vehicles are now being put on the market. Customers and reviewers have praised the handling quality and course traceability of these vehicles. This paper clarifies the mechanism behind this improvement in handling quality using a simple bicycle model and driver model analysis. It focuses on the residual yaw angular acceleration when the steering speed is zero and shows that GVC reduces its value. This result provides evidence for improved handling quality in GVC vehicles.
Naito, H K
1989-03-01
We have approached the dawn of a new era in the detection, evaluation, treatment, and monitoring of individuals with elevated blood cholesterol levels who are at increased risk for CHD. The NHLBI's National Cholesterol Education Program will be the major force underlying this national awareness program, which depends on clinical laboratories providing reliable data. Precision, or reproducibility of results, is not a problem for most laboratories, but accuracy is a major concern. Both manufacturers and laboratorians need to standardize the measurement of cholesterol so that the accuracy base is traceable to the NCCLS NRS/CHOL. Manufacturers need to adopt a uniform policy that will ensure that the values assigned to calibration, quality control, and quality assurance or survey materials are accurate and traceable to the NCCLS NRS/CHOL. Since, at present, these materials have some limitations caused by matrix effects, laboratories are encouraged to use the CDC-NHLBI National Reference Laboratory Network to evaluate and monitor their ability to measure patient blood cholesterol levels accurately. Major areas of analytical problems are identified, and general as well as specific recommendations are provided to help ensure reliable measurement of cholesterol in patient specimens.
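The accuracy-versus-precision distinction drawn above corresponds to two simple statistics: bias against a reference value and the coefficient of variation of replicates. A sketch with invented numbers (not actual program data or performance limits):

```python
import statistics

# Illustrative accuracy (bias) and precision (CV) check of cholesterol
# results against a reference value, as used in laboratory
# standardization programs. All numbers are invented.

def bias_percent(mean_result, reference_value):
    """Accuracy: relative deviation of the mean result from the
    reference value, in percent."""
    return (mean_result / reference_value - 1.0) * 100.0

def cv_percent(results):
    """Precision: coefficient of variation of replicate results,
    in percent."""
    return statistics.stdev(results) / statistics.mean(results) * 100.0

results = [198, 202, 200, 204, 196]   # mg/dL, hypothetical replicates
b = bias_percent(statistics.mean(results), 210.0)  # ~ -4.8 % bias
cv = cv_percent(results)                           # ~ 1.6 % CV
```

A laboratory in this situation would be precise (low CV) yet inaccurate (large negative bias), exactly the pattern the abstract identifies as the main concern.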
NASA Astrophysics Data System (ADS)
Lambert, Jean-Christopher; Bojkov, Bojan
The Committee on Earth Observation Satellites (CEOS)/Working Group on Calibration and Validation (WGCV) is developing a global data quality strategy for the Global Earth Observation System of Systems (GEOSS). In this context, CEOS WGCV elaborated the GEOSS Quality Assurance framework for Earth Observation (QA4EO, http://qa4eo.org). QA4EO encompasses a documentary framework and a set of ten guidelines, which describe the top-level approach of QA activities and key requirements that drive the QA process. QA4EO is applicable to virtually all Earth Observation data. Calibration and validation activities are a cornerstone of the GEOSS data quality strategy. Proper uncertainty assessment of the satellite measurements and their derived data products is essential, and needs to be continuously monitored and traceable to standards. As a practical application of QA4EO, CEOS WGCV has undertaken to establish a set of best practices, methodologies and guidelines for satellite calibration and validation. The present paper reviews current developments of best practices and guidelines for the validation of atmospheric composition satellites. Aimed as a community effort, the approach is to start with current practices that could be improved with time. The present review addresses current validation capabilities, achievements, caveats, harmonization efforts, and challenges. Terminologies and general principles of validation are recalled. Going beyond elementary definitions of validation, such as the assessment of uncertainties, the specific GEOSS context also requires the validation of individual service components and validation against user requirements.
Agile Methods for Open Source Safety-Critical Software
Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-01-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545
Stability of gravimetrically prepared ammonia in nitrogen standards at 10 and 100 µmol mol-1
NASA Astrophysics Data System (ADS)
Amico di Meane, Elena; Ferracci, Valerio; Martin, Nicholas A.; Brewer, Paul J.; Worton, David R.
2017-04-01
Ammonia (NH3) is a well-known ambient pollutant that plays a key role in both atmospheric chemistry and the biogeochemical processes occurring in a variety of ecosystems. Ammonia is emitted from intensive animal farming and certain industrial processes; once in the atmosphere, it contributes to the increasing ambient levels of particulate matter observed across Europe. As legislation is implemented to curb ammonia emissions, it is crucial to achieve metrological traceability for ammonia measurements in ambient air, to allow comparability of field measurements, ensure the accuracy of emissions inventories and verify the effectiveness of emission ceiling policies. The development of stable and traceable gas standards for instrument calibration underpins all of the above. To address this requirement, a two-year stability study of gravimetrically prepared high-pressure ammonia mixtures in nitrogen was carried out for two different cylinder types at two different concentrations: 10 and 100 ppm. New standards were prepared gravimetrically every three to six months for comparison, to determine any variations due to instability. In the first type of cylinder, ammonia appears stable at 100 ppm but shows degradation of about 2% at 10 ppm over the timescale of the study; the second type of cylinder, by contrast, exhibits good stability even at the 10 ppm level.
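The comparison against freshly prepared standards amounts to computing the relative drift of an aged mixture's amount fraction from its original gravimetric value. A trivial sketch with invented numbers (not the study's data):

```python
# Illustrative stability check: compare the amount fraction of an aged
# mixture (re-measured against a freshly prepared gravimetric standard)
# with its original gravimetric value. All numbers are made up.

def relative_drift_percent(measured_now, gravimetric_value):
    """Relative change of the mixture amount fraction, in percent;
    negative values indicate degradation."""
    return (measured_now / gravimetric_value - 1.0) * 100.0

# Hypothetical 10 umol/mol ammonia mixture re-measured after two years
drift = relative_drift_percent(9.8, 10.0)   # -2.0 %, i.e. degradation of
                                            # the magnitude reported above
```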
Improving medical device regulation: the United States and Europe in perspective.
Sorenson, Corinna; Drummond, Michael
2014-03-01
Recent debates and events have brought into question the effectiveness of existing regulatory frameworks for medical devices in the United States and Europe to ensure their performance, safety, and quality. This article provides a comparative analysis of medical device regulation in the two jurisdictions, explores current reforms to improve the existing systems, and discusses additional actions that should be considered to fully meet this aim. Medical device regulation must be improved to safeguard public health and ensure that high-quality and effective technologies reach patients. We explored and analyzed medical device regulatory systems in the United States and Europe in accordance with the available gray and peer-reviewed literature and legislative documents. The two regulatory systems differ in their mandate and orientation, organization, pre- and postmarket evidence requirements, and transparency of process. Despite these differences, both jurisdictions face similar challenges for ensuring that only safe and effective devices reach the market, monitoring real-world use, and exchanging pertinent information on devices with key users such as clinicians and patients. To address these issues, reforms have recently been introduced or debated in the United States and Europe that are principally focused on strengthening regulatory processes, enhancing postmarket regulation through more robust surveillance systems, and improving the traceability and monitoring of devices. Some changes in premarket requirements for devices are being considered. Although the current reforms address some of the outstanding challenges in device regulation, additional steps are needed to improve existing policy. 
We examine a number of actions to be considered, such as requiring high-quality evidence of benefit for medium- and high-risk devices; moving toward greater centralization and coordination of regulatory approval in Europe; creating links between device identifier systems and existing data collection tools, such as electronic health records; and fostering increased and more effective use of registries to ensure safe postmarket use of new and existing devices. © 2014 Milbank Memorial Fund.
Improving Medical Device Regulation: The United States and Europe in Perspective
SORENSON, CORINNA; DRUMMOND, MICHAEL
2014-01-01
Context: Recent debates and events have brought into question the effectiveness of existing regulatory frameworks for medical devices in the United States and Europe to ensure their performance, safety, and quality. This article provides a comparative analysis of medical device regulation in the two jurisdictions, explores current reforms to improve the existing systems, and discusses additional actions that should be considered to fully meet this aim. Medical device regulation must be improved to safeguard public health and ensure that high-quality and effective technologies reach patients. Methods: We explored and analyzed medical device regulatory systems in the United States and Europe in accordance with the available gray and peer-reviewed literature and legislative documents. Findings: The two regulatory systems differ in their mandate and orientation, organization, pre-and postmarket evidence requirements, and transparency of process. Despite these differences, both jurisdictions face similar challenges for ensuring that only safe and effective devices reach the market, monitoring real-world use, and exchanging pertinent information on devices with key users such as clinicians and patients. To address these issues, reforms have recently been introduced or debated in the United States and Europe that are principally focused on strengthening regulatory processes, enhancing postmarket regulation through more robust surveillance systems, and improving the traceability and monitoring of devices. Some changes in premarket requirements for devices are being considered. Conclusions: Although the current reforms address some of the outstanding challenges in device regulation, additional steps are needed to improve existing policy. 
We examine a number of actions to be considered, such as requiring high-quality evidence of benefit for medium- and high-risk devices; moving toward greater centralization and coordination of regulatory approval in Europe; creating links between device identifier systems and existing data collection tools, such as electronic health records; and fostering increased and more effective use of registries to ensure safe postmarket use of new and existing devices. PMID:24597558
Detector Based Realisation of Illuminance Scale at NML-SIRIM
NASA Astrophysics Data System (ADS)
Abdullah, Mohd Nizam; Abidin, Mohd Nasir Zainal; Abidin, Abdul Rashid Zainal; Shaari, Sahbudin
2009-07-01
Illuminance scale is one of the fundamental quantities in the realisation of the candela in optical radiation. The realisation rests on an unbroken chain of traceability, from the primary standard disseminated to working standards and finally to the end user. There are many variations of this realisation; even though some national metrology institutes (NMIs) do not hold the primary standard, their traceability remains valid. The illuminance scale of the National Metrology Laboratory SIRIM (NML-SIRIM), Malaysia, is detector-based. The scale is traceable to the National Physical Laboratory (NPL), United Kingdom (UK), through annual calibration of photometers and a luminous intensity lamp. This paper describes the measurement method and the system set-up, which were previously crosschecked with the Korea Research Institute of Standards and Science (KRISS), Republic of Korea. The agreement between the two laboratories is within the 0.5% uncertainty maintained at NML-SIRIM. Furthermore, the basic measurement equation for the illuminance realisation is also derived.
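The detector-based realisation described above can be sketched in a few lines: illuminance follows from the photometer current and its luminous responsivity, and luminous intensity from the inverse-square law. The function names and numerical values below are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch of a detector-based illuminance realisation, assuming a
# calibrated photometer with known luminous responsivity s_v (A/lx).
# All names and values are illustrative.

def illuminance(photocurrent_a, dark_current_a, responsivity_a_per_lx):
    """Illuminance E (lx) from photometer current: E = (i - i_dark) / s_v."""
    return (photocurrent_a - dark_current_a) / responsivity_a_per_lx

def luminous_intensity(illuminance_lx, distance_m):
    """Luminous intensity I (cd) via the inverse-square law: I = E * d**2."""
    return illuminance_lx * distance_m ** 2

E = illuminance(1.05e-6, 5.0e-9, 1.045e-8)   # about 100 lx
I = luminous_intensity(E, 3.0)               # intensity of the lamp at 3 m
```

In practice the measurement equation carries further correction terms (spectral mismatch, temperature, distance offset); this sketch keeps only the two dominant relations.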
DeWerd, Larry A; Huq, M Saiful; Das, Indra J; Ibbott, Geoffrey S; Hanson, William F; Slowey, Thomas W; Williamson, Jeffrey F; Coursey, Bert M
2004-03-01
Low dose rate brachytherapy is being used extensively for the treatment of prostate cancer. As of September 2003, there are a total of thirteen 125I and seven 103Pd sources that have calibrations from the National Institute of Standards and Technology (NIST) and the Accredited Dosimetry Calibration Laboratories (ADCLs) of the American Association of Physicists in Medicine (AAPM). The dosimetry standards for these sources are traceable to the NIST wide-angle free-air chamber. Procedures have been developed by the AAPM Calibration Laboratory Accreditation Subcommittee to standardize quality assurance and calibration, and to maintain the dosimetric traceability of these sources to ensure accurate clinical dosimetry. A description of these procedures is provided to the clinical users for traceability purposes as well as to provide guidance to the manufacturers of brachytherapy sources and ADCLs with regard to these procedures.
Cai, Yong; Li, Xiwen; Li, Mei; Chen, Xiaojia; Hu, Hao; Ni, Jingyun; Wang, Yitao
2015-01-01
Chemical fingerprinting is currently a widely used tool that enables rapid and accurate quality evaluation of Traditional Chinese Medicine (TCM). However, chemical fingerprints are not amenable to information storage, recognition, and retrieval, which limits their use in Chinese medicine traceability. In this study, samples of three kinds of Chinese medicines were randomly selected and chemical fingerprints were constructed using high performance liquid chromatography. Based on the chemical data, the process of converting a TCM chemical fingerprint into a two-dimensional (2D) code is presented; preprocessing and filtering algorithms are also proposed to standardize the large amount of original raw data. To determine which type of 2D code is suitable for storing chemical fingerprint data, currently popular types of 2D codes are analyzed and compared. Results show that QR Code is suitable for recording the TCM chemical fingerprint. The fingerprint information of TCM can thus be converted into a data format that can be stored as a 2D code for traceability and quality control.
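The preprocess-filter-serialise pipeline described above can be sketched as follows. The thresholds, field layout, and sample data are assumptions for illustration; the paper's actual algorithms are not reproduced here. The resulting compact string would then be rendered as a QR symbol with any QR library.

```python
# Sketch: reduce an HPLC fingerprint (retention time, peak area) to a
# compact record suitable for QR encoding. Thresholds and format invented.

def preprocess(peaks, min_rel_area=0.01):
    """Drop peaks below 1% of the base peak; normalise areas to 0-1000."""
    max_area = max(area for _, area in peaks)
    kept = [(rt, area) for rt, area in peaks if area / max_area >= min_rel_area]
    return [(round(rt, 2), round(1000 * area / max_area)) for rt, area in kept]

def serialise(sample_id, peaks):
    """Compact record; QR alphanumeric capacity is roughly 4000 characters."""
    body = ";".join(f"{rt}:{a}" for rt, a in preprocess(peaks))
    return f"{sample_id}|{body}"

record = serialise("TCM-001", [(2.31, 15400.0), (5.02, 880.0), (7.77, 52.0)])
# record == "TCM-001|2.31:1000;5.02:57"
# A QR symbol could then be produced with a third-party package,
# e.g. qrcode.make(record).
```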
Bouslimi, D; Coatrieux, G; Cozic, M; Roux, Ch
2014-01-01
In this paper, we propose a novel crypto-watermarking system for verifying the reliability of images and tracing them, i.e. identifying the person at the origin of an illegal distribution. This system couples a common watermarking method, based on Quantization Index Modulation (QIM), with a joint watermarking-decryption (JWD) approach. At the emitter side, it allows the insertion of a watermark as a proof of reliability of the image before sending it encrypted; at reception, another watermark, a proof of traceability, is embedded during the decryption process. The proposed scheme makes this combination of watermarking approaches interoperate while taking into account the risk of interference between the embedded watermarks, allowing access to both reliability and traceability proofs. Experimental results confirm the efficiency of our system and demonstrate that it can be used to identify the physician at the origin of a disclosure even if the image has been modified.
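The QIM building block mentioned above is standard: each bit selects one of two interleaved quantization lattices, and extraction picks the nearer lattice. A minimal scalar sketch (step size and sample values are illustrative; the paper's full crypto-watermarking scheme is far richer):

```python
# Minimal scalar QIM: embed one bit per sample by quantizing onto one of
# two lattices offset by DELTA/2. Step size chosen for illustration only.
DELTA = 8  # larger step = more robust watermark, more distortion

def qim_embed(x, bit):
    """Embed a bit into sample x by snapping x to the bit's lattice."""
    offset = bit * DELTA / 2
    return DELTA * round((x - offset) / DELTA) + offset

def qim_extract(xw):
    """Recover the bit as the lattice (0 or 1) closest to the sample."""
    d0 = abs(xw - DELTA * round(xw / DELTA))
    d1 = abs(xw - (DELTA * round((xw - DELTA / 2) / DELTA) + DELTA / 2))
    return 0 if d0 <= d1 else 1

bits = [1, 0, 1, 1]
samples = [123.0, 64.2, 87.9, 200.4]
watermarked = [qim_embed(x, b) for x, b in zip(samples, bits)]
recovered = [qim_extract(x) for x in watermarked]  # recovers [1, 0, 1, 1]
```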
Ritota, Mena; Casciani, Lorena; Valentini, Massimiliano
2013-05-01
Analytical traceability of PGI and PDO foods (Protected Geographical Indication and Protected Designation of Origin, respectively) is one of the most challenging tasks of current applied research. Here we propose a metabolomic approach based on the combination of (1)H high-resolution magic angle spinning-nuclear magnetic resonance (HRMAS-NMR) spectroscopy with multivariate analysis, i.e. PLS-DA, as a reliable tool for the traceability of Italian PGI chicories (Cichorium intybus L.), i.e. Radicchio Rosso di Treviso and Radicchio Variegato di Castelfranco, also known as red and red-spotted, respectively. The metabolic profile was obtained by means of HRMAS-NMR, and multivariate data analysis allowed us to build statistical models capable of providing clear discrimination between the two varieties and classification according to geographical origin. Based on Variable Importance in Projection values, molecular markers for classifying the different types of red chicories analysed were identified, accounting for both the cultivar and the place of origin. © 2012 Society of Chemical Industry.
Traceability of 'Limone di Siracusa PGI' by a multidisciplinary analytical and chemometric approach.
Amenta, M; Fabroni, S; Costa, C; Rapisarda, P
2016-11-15
Food traceability is increasingly relevant with respect to safety, quality and typicality issues. Lemon fruits grown in a typical lemon-growing area of southern Italy (Siracusa), have been awarded the PGI (Protected Geographical Indication) recognition as 'Limone di Siracusa'. Due to its peculiarity, consumers have an increasing interest about this product. The detection of potential fraud could be improved by using the tools linking the composition of this production to its typical features. This study used a wide range of analytical techniques, including conventional techniques and analytical approaches, such as spectral (NIR spectra), multi-elemental (Fe, Zn, Mn, Cu, Li, Sr) and isotopic ((13)C/(12)C, (18)O/(16)O) marker investigations, joined with multivariate statistical analysis, such as PLS-DA (Partial Least Squares Discriminant Analysis) and LDA (Linear Discriminant Analysis), to implement a traceability system to verify the authenticity of 'Limone di Siracusa' production. The results demonstrated a very good geographical discrimination rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
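PLS-DA and LDA, as used in studies like the one above, require a chemometrics library (e.g. scikit-learn). To illustrate the underlying idea of supervised geographical classification from marker vectors, here is a minimal nearest-centroid sketch in pure Python; all class labels, marker choices, and numbers are invented, not data from the study.

```python
# Nearest-centroid classification of samples by origin, from marker
# vectors (e.g. element concentrations and isotope ratios). Illustrative only.

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Assign sample to the class whose centroid is nearest (Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Hypothetical training vectors, e.g. (Sr, Fe, delta13C) per sample
train = {
    "Siracusa": [[1.2, 0.8, -25.0], [1.1, 0.9, -24.8]],
    "other":    [[0.4, 1.5, -27.2], [0.5, 1.4, -27.0]],
}
cents = {label: centroid(vs) for label, vs in train.items()}
origin = classify([1.15, 0.85, -24.9], cents)  # -> "Siracusa"
```

PLS-DA improves on this by projecting the (often collinear) marker space onto latent variables that maximise class separation before classifying.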
Sardaro, Maria Luisa Savo; Marmiroli, Marta; Maestri, Elena; Marmiroli, Nelson
2013-01-01
Genetic diversity underlies the improvement of crops by plant breeding. Landraces of tomato (Solanum lycopersicum L.) can contain valuable alleles not common in modern germplasms. The aim was to measure the genetic diversity present in the 47 most common tomato varieties grown in Italy, of which 35 were varieties used for processing and 12 were landraces considered "salad varieties". Furthermore, we demonstrated that variety traceability can be extended through the entire production chain. Diversity was measured using 11 microsatellite markers and 94 genotypes. Among the markers used, a total of 48 alleles were detected. A dendrogram based on total microsatellite polymorphism grouped the 47 varieties into three major clusters at a 0.75 similarity coefficient, differentiating the modern varieties from the tomato landraces. The DNA markers developed confirmed that genotype identification can be supported all along the tomato production chain. The number of alleles and genotypes identified in the present work is the largest reported in the food traceability literature. PMID:24804014
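The similarity coefficients behind such dendrograms are typically simple allele-sharing measures computed pairwise between genotypes. A minimal sketch using Jaccard similarity over (marker, allele) sets; the marker names and allele sizes are hypothetical, and the paper does not specify that this exact coefficient was used.

```python
# Allele-sharing similarity between two genotypes, each represented as a
# set of (marker, allele_size) pairs across a microsatellite panel.
# Marker names and allele sizes below are invented for illustration.

def jaccard_similarity(alleles_a, alleles_b):
    """Shared alleles divided by the union of alleles (0.0 to 1.0)."""
    shared = len(alleles_a & alleles_b)
    total = len(alleles_a | alleles_b)
    return shared / total

a = {("SSR1", 180), ("SSR1", 184), ("SSR2", 210)}
b = {("SSR1", 180), ("SSR1", 184), ("SSR2", 214)}
sim = jaccard_similarity(a, b)  # 2 shared of 4 total -> 0.5
```

A full pairwise similarity matrix of such values, fed to hierarchical clustering (e.g. UPGMA), yields the dendrogram described in the abstract.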
Groves, Kate; Cryar, Adam; Walker, Michael; Quaglia, Milena
2018-01-01
Assessing the recovery of food allergens from solid processed matrixes is one of the most difficult steps that needs to be overcome to enable the accurate quantification of protein allergens by immunoassay and MS. A feasibility study is described herein applying International System of Units (SI)-traceably quantified milk protein solutions to assess recovery by an improved extraction method. Untargeted MS analysis suggests that this novel extraction method can be further developed to provide high recoveries for a broad range of food allergens. A solution of α-casein was traceably quantified to the SI for the content of α-S1 casein. Cookie dough was prepared by spiking a known amount of the SI-traceable quantified solution into a mixture of flour, sugar, and soya spread, followed by baking. A novel method for the extraction of protein food allergens from solid matrixes based on proteolytic digestion was developed, and its performance was compared with the performance of methods reported in the literature.
Hansen, Anne-Mette Sølvbjerg; Fromberg, Arvid; Frandsen, Henrik Lauritz
2014-10-22
Authenticity and traceability of vanilla flavors were investigated using gas chromatography-isotope ratio mass spectrometry (GC-IRMS). Vanilla flavors produced by chemical synthesis (n = 2), fermentation (n = 1), and extracted from two different species of the vanilla orchid (n = 79) were analyzed. The authenticity of the flavor compound vanillin was evaluated on the basis of measurements of ratios of carbon stable isotopes (δ(13)C). It was found that results of δ(13)C for vanillin extracted from Vanilla planifolia and Vanilla tahitensis were significantly different (t test) and that it was possible to differentiate these two groups of natural vanillin from vanillin produced otherwise. Vanilla flavors were also analyzed for ratios of hydrogen stable isotopes (δ(2)H). A graphic representation of δ(13)C versus δ(2)H revealed that vanillin extracted from pods grown in adjacent geographic origins grouped together. Accordingly, values of δ(13)C and δ(2)H can be used for studies of authenticity and traceability of vanilla flavors.
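The δ(13)C values underpinning such authenticity tests come from the standard delta notation: the sample's isotope ratio expressed per mil relative to the VPDB reference. A minimal sketch (the example ratio is invented; the indicative value ranges are rough literature figures, not results from this study):

```python
# Delta notation for carbon stable isotope ratios, per mil vs. the VPDB
# standard. The example ratio below is illustrative.

R_VPDB = 0.0111802  # accepted 13C/12C ratio of the VPDB reference

def delta13C(r_sample):
    """delta13C in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Roughly, vanillin from Vanilla orchids clusters near -20 per mil and
# synthetic vanillin nearer -30 per mil, which is what makes the
# discrimination in the abstract possible.
d = delta13C(0.0109566)  # about -20 per mil
```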
From soil to grape and wine: Variation of light and heavy elements isotope ratios.
Durante, Caterina; Bertacchini, Lucia; Bontempo, Luana; Camin, Federica; Manzini, Daniela; Lambertini, Paolo; Marchetti, Andrea; Paolini, Mauro
2016-11-01
In the development of a geographical traceability model, it is necessary to understand whether the value of the monitored indicators in a food is correlated with its origin or whether it is also influenced by 'external factors' such as those coming from its production. In this study, a deeper investigation of the trend of direct geographical traceability indicators along the winemaking process of two traditional oenological products was carried out. Different processes were monitored, sampling each step of their production (grape juice, intermediate products and wine). The results related to the determinations of δ(18)O, (D/H)I, (D/H)II, δ(13)C, δ(15)N and (87)Sr/(86)Sr have been reported. Furthermore, correspondence with the isotopic values from the respective soil and vine-branch samples has been investigated as well, showing the optimal traceability power of the monitored geographical tracers. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Valkiers, S.; Ding, T.; Ruße, K.; de Bièvre, P.; Taylor, P. D. P.
2005-04-01
SI-traceable ("absolute") values have been obtained for sulfur isotope amount ratios n(33S)/n(32S) and n(34S)/n(32S), in two batches of high purity SO2 gas (IRMM-2012 and IRMM-2013). The SO2 gas was converted at IMR-Beijing to Ag2S, then fluorinated to SF6 gas both at IMR-Beijing and at IRMM-Geel. Yields of different conversion methods exceeded 99%. The sulfur amount-of-substance measurements were performed by gas mass spectrometry on SF5+ ions using "IRMM's amount comparator II". These isotope amount ratios were calibrated by means of gravimetrically prepared synthetic mixtures of highly enriched sulfur isotopes (32S, 33S and 34S) in Ag2S form. The ratio values in the SO2 Secondary Measurement Standard are traceable to the SI system. They can be used in the calibration of field sulfur isotope measurements thus making these metrologically traceable to the SI.
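Calibration against gravimetrically prepared synthetic isotope mixtures, as described above, amounts to deriving a correction factor from a mixture whose ratio is known through weighing and applying it to observed ratios. A minimal sketch; all numerical values are illustrative, not the paper's measured ratios.

```python
# Sketch of calibrating observed isotope amount ratios against a synthetic
# mixture whose ratio is known gravimetrically (hence SI-traceable via
# the kilogram and the molar masses). Numbers are illustrative only.

def correction_factor(r_gravimetric, r_observed):
    """K such that r_true = K * r_observed, from the synthetic mixture."""
    return r_gravimetric / r_observed

def calibrate(r_observed_sample, k):
    """Apply the mixture-derived correction to a sample measurement."""
    return k * r_observed_sample

K = correction_factor(0.044163, 0.044035)  # hypothetical n(34S)/n(32S) pair
r_true = calibrate(0.043920, K)            # calibrated sample ratio
```

Applying the factor back to the mixture's own observed ratio reproduces the gravimetric value, which serves as a basic consistency check.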