Sample records for "require characterization information"

  1. National Transonic Facility Characterization Status

    NASA Technical Reports Server (NTRS)

    Bobbitt, C., Jr.; Everhart, J.; Foster, J.; Hill, J.; McHatton, R.; Tomek, W.

    2000-01-01

    This paper describes the current status of the characterization of the National Transonic Facility. The background and strategy for the tunnel characterization, as well as the current status of the four main areas of the characterization (tunnel calibration, flow quality characterization, data quality assurance, and support of the implementation of wall interference corrections) are presented. The target accuracy requirements for tunnel characterization measurements are given, followed by a comparison of the measured tunnel flow quality to these requirements based on current available information. The paper concludes with a summary of which requirements are being met, what areas need improvement, and what additional information is required in follow-on characterization studies.

  2. Energy technology characterizations handbook: environmental pollution and control factors. Third edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-03-01

    This Handbook deals with environmental characterization information for a range of energy-supply systems and provides supplementary information on environmental controls applicable to a select group of environmentally characterized energy systems. Environmental residuals, physical-resource requirements, and discussion of applicable standards are the principal information provided. The quantitative and qualitative data provided are useful for evaluating alternative policy and technical strategies and for assessing the environmental impact of facility siting, energy production, and environmental controls.

  3. DEVELOPMENT OF CHEMICAL METHODS TO CHARACTERIZE EXPOSURE TO EDCS IN THE NEUSE RIVER BASIN

    EPA Science Inventory

    To develop a quantitative health and environmental risk assessment of endocrine disrupting compounds (EDCs), information on exposures is essential. A full exposure assessment has complex requirements and requires preliminary information to direct further research in this area....

  4. 1998 report on Hanford Site land disposal restrictions for mixed waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, D.G.

    1998-04-10

    This report was submitted to meet the requirements of Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) Milestone M-26-01H. This milestone requires the preparation of an annual report that covers characterization, treatment, storage, minimization, and other aspects of managing land-disposal-restricted mixed waste at the Hanford Facility. The US Department of Energy, its predecessors, and contractors on the Hanford Facility were involved in the production and purification of nuclear defense materials from the early 1940s to the late 1980s. These production activities have generated large quantities of liquid and solid mixed waste. This waste is regulated under authority of both the Resource Conservation and Recovery Act of 1976 and the Atomic Energy Act of 1954. This report covers only mixed waste. The Washington State Department of Ecology, US Environmental Protection Agency, and US Department of Energy have entered into the Tri-Party Agreement to bring the Hanford Facility operations into compliance with dangerous waste regulations. The Tri-Party Agreement required development of the original land disposal restrictions (LDR) plan and its annual updates to comply with LDR requirements for mixed waste. This report is the eighth update of the plan first issued in 1990. The Tri-Party Agreement requires, and the baseline plan and annual update reports provide, the following information: (1) Waste Characterization Information -- Provides information about characterizing each LDR mixed waste stream. The sampling and analysis methods and protocols, past characterization results, and, where available, a schedule for providing the characterization information are discussed. (2) Storage Data -- Identifies and describes the mixed waste on the Hanford Facility. Storage data include the Resource Conservation and Recovery Act of 1976 dangerous waste codes, generator process knowledge needed to identify the waste and to make LDR determinations, quantities stored, generation rates, location and method of storage, an assessment of storage-unit compliance status, storage capacity, and the bases and assumptions used in making the estimates.

  5. Tank characterization report for double-shell tank 241-AW-105

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasaki, L.M.

    1997-06-05

    One of the major functions of the Tank Waste Remediation System (TWRS) is to characterize wastes in support of waste management and disposal activities at the Hanford Site. Analytical data from sampling and analysis, along with other available information about a tank, are compiled and maintained in a tank characterization report (TCR). This report and its appendices serve as the TCR for double-shell tank 241-AW-105. The objectives of this report are to use characterization data to address technical issues associated with tank 241-AW-105 waste and to provide a standard characterization of this waste in terms of a best-basis inventory estimate. The response to technical issues is summarized in Section 2.0, and the best-basis inventory estimate is presented in Section 3.0. Recommendations regarding safety status and additional sampling needs are provided in Section 4.0. Supporting data and information are contained in the appendices. This report supports the requirements of the Hanford Federal Facility Agreement and Consent Order characterization milestone. Information presented in this report originated from sample analyses and known historical sources. While only the results of a recent sampling event will be used to fulfill the requirements of the data quality objectives (DQOs), other information can be used to support or question conclusions derived from these results. Historical information for tank 241-AW-105 is provided in Appendix A, including surveillance information, records pertaining to waste transfers and tank operations, and expected tank contents derived from a process knowledge model. The recent sampling event listed, as well as pertinent sample data obtained before 1996, is summarized in Appendix B along with the sampling results. The results of the 1996 grab sampling event satisfied the data requirements specified in the sampling and analysis plan (SAP) for this tank. In addition, the tank headspace flammability was measured, which addresses one of the requirements specified in the safety screening DQO. The statistical analysis and numerical manipulation of data used in issue resolution are reported in Appendix C. Appendix D contains the evaluation to establish the best basis for the inventory estimate and the statistical analysis performed for this evaluation. A bibliography resulting from an in-depth literature search of all known information sources applicable to tank 241-AW-105 and its respective waste types is contained in Appendix E. A majority of the documents listed in Appendix E may be found in the Tank Characterization and Safety Resource Center.

  6. Algorithms for Automated Characterization of Three-Axis Stabilized GEOs using Non-Resolved Optical Observations

    DTIC Science & Technology

    Fulcoly, Daniel (AFRL Space Vehicles Directorate); Gregory, Stephen A. (Boeing Corp.)

    2012-09-01

    Non-resolved optical observations of satellites have been known...to supply researchers with valuable information about satellite status. Until recently most non-resolved analysis techniques have required an expert...rapidly characterizing satellites from non-resolved optical data of 3-axis stabilized geostationary satellites. We will present background information on...

  7. Characterizing Navigation in Interactive Learning Environments

    ERIC Educational Resources Information Center

    Liang, Hai-Ning; Sedig, Kamran

    2009-01-01

    Interactive learning environments (ILEs) are increasingly used to support and enhance instruction and learning experiences. ILEs maintain and display information, allowing learners to interact with this information. One important method of interacting with information is navigation. Often, learners are required to navigate through the information…

  8. Waste Characterization Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil-Holterman, Luciana R.; Naranjo, Felicia Danielle

    2016-02-02

    This report discusses ways to classify waste as outlined by LANL. Waste Generators must make a waste determination and characterize regulated waste by appropriate analytical testing or use of acceptable knowledge (AK). Use of AK for characterization requires several source documents. Waste characterization documentation must be accurate, sufficient, and current (i.e., updated); relevant and traceable to the waste stream’s generation, characterization, and management; and not merely a list of information sources.

  9. Airborne and Ground-Based Optical Characterization of Legacy Underground Nuclear Test Sites

    NASA Astrophysics Data System (ADS)

    Vigil, S.; Craven, J.; Anderson, D.; Dzur, R.; Schultz-Fellenz, E. S.; Sussman, A. J.

    2015-12-01

    Detecting, locating, and characterizing suspected underground nuclear test sites is a U.S. security priority. Currently, global underground nuclear explosion monitoring relies on seismic and infrasound sensor networks to provide rapid initial detection of potential underground nuclear tests. While seismic and infrasound sensing might be able to generally locate potential underground nuclear tests, additional sensing methods might be required to further pinpoint test site locations. Optical remote sensing is a robust approach for site location and characterization because it can search large areas relatively quickly, resolve surface features in fine detail, and perform these tasks non-intrusively. Optical remote sensing provides both cultural and surface geological information about a site, for example, operational infrastructure and surface fractures. Surface geological information, when combined with known or estimated subsurface geologic information, could provide clues concerning test parameters. We have characterized two legacy nuclear test sites on the Nevada National Security Site (NNSS), U20ak and U20az, using helicopter-, ground-, and unmanned aerial system-based RGB imagery and light detection and ranging (lidar) systems. The multi-faceted information garnered from these different sensing modalities has allowed us to build a knowledge base of how a nuclear test site might look when sensed remotely, and the standoff distances required to resolve important site characteristics.

  10. The Global Emergency Observation and Warning System

    NASA Technical Reports Server (NTRS)

    Bukley, Angelia P.; Mulqueen, John A.

    1994-01-01

    Based on an extensive characterization of natural hazards, and an evaluation of their impacts on humanity, a set of functional technical requirements for a global warning and relief system was developed. Since no technological breakthroughs are required to implement a global system capable of performing the functions required to provide sufficient information for prevention, preparedness, warning, and relief from natural disaster effects, a system is proposed which would combine the elements of remote sensing, data processing, information distribution, and communications support on a global scale for disaster mitigation.

  11. Software Suite to Support In-Flight Characterization of Remote Sensing Systems

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross

    2014-01-01

    A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before it can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of ground truth data, which has been used to provide reproducible characterizations on a number of commercial remote sensing systems. Overall, this characterization software suite improves the reliability of ground-truth data processing techniques that are required for remote sensing system in-flight characterizations.
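
    To make the geometric piece concrete, here is a minimal sketch of computing circular error statistics from known versus image-identified target locations. This is illustrative only, not the suite's actual code; the function name and test data are hypothetical.

```python
import numpy as np

def circular_error(known_xy, measured_xy, percentile=90.0):
    """Radial error statistics for geometric characterization.

    known_xy, measured_xy: (N, 2) arrays of target coordinates in the
    same units (e.g., meters). Returns RMSE and the empirical circular
    error at the requested percentile (e.g., CE90).
    """
    deltas = np.asarray(measured_xy, float) - np.asarray(known_xy, float)
    radial = np.hypot(deltas[:, 0], deltas[:, 1])  # per-target radial error
    rmse = np.sqrt(np.mean(radial ** 2))
    ce = np.percentile(radial, percentile)
    return rmse, ce

# Hypothetical example: five surveyed targets vs. image-identified locations
known = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], dtype=float)
measured = known + np.random.default_rng(0).normal(0, 0.5, known.shape)
print(circular_error(known, measured))
```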

  12. Next generation sequencing of the genomes of 11 international RWA biotypes

    USDA-ARS?s Scientific Manuscript database

    Scientists researching poorly characterized species struggle to gain understanding of the species they study on a sub-cellular level due to the time and investment required to build up an informative knowledge base. This becomes problematic when a poorly characterized species is a pest of a major e...

  13. Universities for a Small Planet--A Time to Reconceptualize Our Role.

    ERIC Educational Resources Information Center

    Wallin, Franklin W.

    1983-01-01

    The modern world is characterized by global interdependence and a shift in society to information as the basic resource. The amount and quality of education required to be functionally literate in an information-based society will keep growing. (MLW)

  14. Drug Information Education in Doctor of Pharmacy Programs

    PubMed Central

    Wang, Fei; Troutman, William G.; Seo, Teresa; Peak, Amy; Rosenberg, Jack M.

    2006-01-01

    Objective To characterize pharmacy program standards and trends in drug information education. Methods A questionnaire containing 34 questions addressing general demographic characteristics, organization, and content of drug information education was distributed to 86 colleges and schools of pharmacy in the United States using a Web-based survey system. Results Sixty colleges responded (73% response rate). All colleges offered a campus-based 6-year first-professional degree PharmD program. Didactic drug information was a required course in over 70% of these schools. Only 51 of the 60 colleges offered an advanced pharmacy practice experience (APPE) in drug information, and 62% of these did so only on an elective basis. Conclusion Although almost all of the PharmD programs in the US include a required course in drug information, the majority do not have a required APPE in this important area. PMID:17136172

  15. Behavioral Health and Performance (BHP) Work-Rest Cycles

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.; Whitmire, Alexandra

    2011-01-01

    BHP Program Element Goal: Identify, characterize, and prevent or reduce behavioral health and performance risks associated with space travel, exploration and return to terrestrial life. BHP Requirements: a) Characterize and assess risks (e.g., likelihood and consequences). b) Develop tools and technologies to prevent, monitor, and treat adverse outcomes. c) Inform standards. d) Develop technologies to: 1) reduce risks and human systems resource requirements (e.g., crew time, mass, volume, power) and 2) ensure effective human-system integration across exploration missions.

  16. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MULKEY, C.H.

    1999-07-02

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall under the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that are dependent upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements.

  17. Automating Network Node Behavior Characterization by Mining Communication Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, Thomas E.; Chikkagoudar, Satish; Arthur-Durett, Kristine M.

    Enterprise networks of scale are complex, dynamic computing environments that respond to evolving business objectives and requirements. Characterizing system behaviors in these environments is essential for network management and cyber security operations. Characterization of a system's communication is typical and is supported using network flow information (NetFlow). Related work has characterized behavior using theoretical graph metrics, but the results are often difficult for enterprise staff to interpret. We propose a different approach, in which flow information is mapped to sets of tags that contextualize the data in terms of network principals and enterprise concepts. Frequent patterns are then extracted and expressed as behaviors. Behaviors can be compared, identifying systems expressing similar behaviors. We evaluate the approach using flow information collected by a third party.
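
    As an illustration of the tag-and-mine approach described above, here is a minimal sketch with hypothetical tag names; a naive frequent-subset count stands in for whatever pattern-mining algorithm the authors actually used.

```python
from collections import Counter
from itertools import combinations

# Hypothetical tagged flows: each flow record mapped to a set of enterprise tags
tagged_flows = [
    {"dns", "server", "internal"},
    {"dns", "server", "internal", "high-volume"},
    {"web", "client", "external"},
    {"dns", "server", "internal"},
]

def frequent_patterns(flows, min_support=2, max_size=3):
    """Count co-occurring tag subsets; frequent ones are candidate 'behaviors'."""
    counts = Counter()
    for tags in flows:
        for k in range(1, min(max_size, len(tags)) + 1):
            for subset in combinations(sorted(tags), k):
                counts[subset] += 1
    return {pattern: c for pattern, c in counts.items() if c >= min_support}

print(frequent_patterns(tagged_flows))
```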

  18. Borehole Tool for the Comprehensive Characterization of Hydrate-bearing Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Sheng; Santamarina, J. Carlos

    Reservoir characterization and simulation require reliable parameters to anticipate hydrate deposit responses and production rates. The acquisition of the required fundamental properties currently relies on wireline logging, pressure core testing, and/or laboratory observations of synthesized specimens, which are challenged by testing capabilities and innate sampling disturbances. The project reviews hydrate-bearing sediments, their properties, and inherent sampling effects (albeit lessened by developments in pressure core technology) in order to develop robust correlations with index parameters. The resulting information is incorporated into a tool for optimal field characterization and parameter selection with uncertainty analyses. Ultimately, the project develops a borehole tool for the comprehensive characterization of hydrate-bearing sediments in situ, with a design that recognizes past developments and characterization experience and benefits from the inspiration of nature and sensor miniaturization.

  19. RH-TRU Waste Inventory Characterization by AK and Proposed WIPP RH-TRU Waste Characterization Objectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Most, W. A.; Kehrman, R.; Gist, C.

    2002-02-26

    The U.S. Department of Energy (DOE)-Carlsbad Field Office (CBFO) has developed draft documentation to present the proposed Waste Isolation Pilot Plant (WIPP) remote-handled (RH-) transuranic (TRU) waste characterization program to its regulators, the U.S. Environmental Protection Agency and the New Mexico Environment Department. Compliance with Title 40, Code of Federal Regulations, Parts 191 and 194; the WIPP Land Withdrawal Act (PL 102-579); and the WIPP Hazardous Waste Facility Permit, as well as the Certificates of Compliance for the 72-B and 10-160B Casks, requires that specific waste parameter limits be imposed on DOE sites disposing of TRU waste at WIPP. The DOE-CBFO must control the sites' compliance with the limits by specifying allowable characterization methods. As with the established WIPP contact-handled TRU waste characterization program, the DOE-CBFO has proposed a Remote-Handled TRU Waste Acceptance Criteria (RH-WAC) document consolidating the requirements from various regulatory drivers and proposing allowable characterization methods. These criteria are consistent with a recent National Academy of Sciences/National Research Council recommendation to develop an RH-TRU waste characterization approach that removes current self-imposed requirements lacking a legal or safety basis. As proposed in the draft RH-WAC and other preliminary documents, the DOE-CBFO RH-TRU waste characterization program proposes the use of acceptable knowledge (AK) as the primary method for obtaining required characterization information. The use of AK involves applying knowledge of the waste in light of the materials or processes used to generate the waste. Documentation, records, or processes providing information about various attributes of a waste stream, such as chemical, physical, and radiological properties, may be used as AK and may be applied to individual waste containers either independently or in conjunction with radiography, visual examination, assay, and other sampling and analytical data. RH-TRU waste cannot be shipped to WIPP on the basis of AK alone if documentation demonstrating that all of the prescribed limits in the RH-WAC are met is not available, if discrepancies exist among AK source documents describing the same waste stream and the most conservative assumptions regarding those documents indicate that a limit will not be met, or if all required data are not available for a given waste stream.

  20. Environmental characterization report for the Gulf Interior Region, Texas study area. [Oakwood, Palestine and Keechi salt domes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-10-01

    This report is published as a product of the National Waste Terminal Storage (NWTS) Program. The objective of this program is the development of terminal waste storage facilities in deep, stable geologic formations for high-level nuclear waste, including spent fuel elements from commercial power reactors and transuranic nuclear waste for which the federal government is responsible. The report is part of the area study phase and contains environmental information for the Texas Study Area of the Gulf Interior Region acquired from federal, state, and regional agencies. The data in this report meet the requirements of predetermined survey plans and will be used in determining locations of approximately 80 square kilometers (30 square miles) that will be further characterized. Information on surface water, atmosphere, background radiation, natural ecosystems, agricultural systems, demography, socioeconomics, land use, and transportation is presented. The environmental characterization will ensure that data on environmental values required by the National Environmental Policy Act (NEPA) of 1969 are available.

  1. Changes in Information Processing with Aging: Implications for Teaching Motor Skills.

    ERIC Educational Resources Information Center

    Anshel, Mark H.

    Although there are marked individual differences in the effect of aging on learning and performing motor skills, there is agreement that humans process information less efficiently with advanced age. Significant decrements have been found specifically with motor tasks that are characterized as externally-paced, rapid, complex, and requiring rapid…

  2. EU Regulation of Nanobiocides: Challenges in Implementing the Biocidal Product Regulation (BPR)

    PubMed Central

    Brinch, Anna; Hansen, Steffen Foss; Hartmann, Nanna B.; Baun, Anders

    2016-01-01

    The Biocidal Products Regulation (BPR) contains several provisions for nanomaterials (NMs) and is the first regulation in the European Union to require specific testing and risk assessment for the NM form of a biocidal substance as a part of the information requirements. Ecotoxicological data are one of the pillars of the information requirements in the BPR, but there are currently no standard test guidelines for the ecotoxicity testing of NMs. The overall objective of this work was to investigate the implications of the introduction of nano-specific testing requirements in the BPR and to explore how these might be fulfilled in the case of copper oxide nanoparticles. While there is information and data available in the open literature that could be used to fulfill the BPR information requirements, most of the studies do not take the Organisation for Economic Co-operation and Development’s nanospecific test guidelines into consideration. This makes it difficult for companies as well as regulators to fulfill the BPR information requirements for nanomaterials. In order to enable a nanospecific risk assessment, best practices need to be developed for stock suspension preparation and characterization, for exposure suspension preparation, and for conducting ecotoxicological tests. PMID:28344290

  3. EU Regulation of Nanobiocides: Challenges in Implementing the Biocidal Product Regulation (BPR).

    PubMed

    Brinch, Anna; Hansen, Steffen Foss; Hartmann, Nanna B; Baun, Anders

    2016-02-16

    The Biocidal Products Regulation (BPR) contains several provisions for nanomaterials (NMs) and is the first regulation in the European Union to require specific testing and risk assessment for the NM form of a biocidal substance as a part of the information requirements. Ecotoxicological data are one of the pillars of the information requirements in the BPR, but there are currently no standard test guidelines for the ecotoxicity testing of NMs. The overall objective of this work was to investigate the implications of the introduction of nano-specific testing requirements in the BPR and to explore how these might be fulfilled in the case of copper oxide nanoparticles. While there is information and data available in the open literature that could be used to fulfill the BPR information requirements, most of the studies do not take the Organisation for Economic Co-operation and Development's nanospecific test guidelines into consideration. This makes it difficult for companies as well as regulators to fulfill the BPR information requirements for nanomaterials. In order to enable a nanospecific risk assessment, best practices need to be developed for stock suspension preparation and characterization, for exposure suspension preparation, and for conducting ecotoxicological tests.

  4. Results and lessons learned from MODIS polarization sensitivity characterization

    NASA Astrophysics Data System (ADS)

    Sun, J.; Xiong, X.; Wang, X.; Qiu, S.; Xiong, S.; Waluschka, E.

    2006-08-01

    In addition to radiometric, spatial, and spectral calibration requirements, MODIS design specifications include polarization sensitivity requirements of less than 2% for all Reflective Solar Bands (RSB) except for the band centered at 412nm. To the best of our knowledge, MODIS was the first imaging radiometer that went through comprehensive system level (end-to-end) polarization characterization. MODIS polarization sensitivity was measured pre-launch at a number of sensor view angles using a laboratory Polarization Source Assembly (PSA) that consists of a rotatable source, a polarizer (Ahrens prism design), and a collimator. This paper describes MODIS polarization characterization approaches used by MODIS Characterization Support Team (MCST) at NASA/GSFC and addresses issues and concerns in the measurements. Results (polarization factor and phase angle) using different analyzing methods are discussed. Also included in this paper is a polarization characterization comparison between Terra and Aqua MODIS. Our previous and recent analysis of MODIS RSB polarization sensitivity could provide useful information for future Earth-observing sensor design, development, and characterization.
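
    For context, the polarization factor and phase angle mentioned above can be estimated from responses measured at varying polarizer angles, assuming the common two-cycle response model R(t) = c0*(1 + a*cos(2*(t - phi))). The sketch below is illustrative only, not MCST's analysis code.

```python
import numpy as np

def polarization_fit(theta_deg, response):
    """Estimate polarization factor a and phase angle phi (degrees).

    Fits R(t) = c0 + c1*cos(2t) + c2*sin(2t), which is the linear form of
    the assumed model R(t) = c0*(1 + a*cos(2*(t - phi))).
    """
    t = np.radians(theta_deg)
    A = np.column_stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)])
    c0, c1, c2 = np.linalg.lstsq(A, response, rcond=None)[0]
    a = np.hypot(c1, c2) / c0                    # polarization factor
    phi = 0.5 * np.degrees(np.arctan2(c2, c1))   # phase angle
    return a, phi

# Hypothetical measurement: 1.5% polarization sensitivity, 30-degree phase
theta = np.arange(0, 360, 15.0)
resp = 100.0 * (1 + 0.015 * np.cos(2 * np.radians(theta - 30.0)))
print(polarization_fit(theta, resp))   # ~ (0.015, 30.0)
```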

  5. PI Microgravity Services Role for International Space Station Operations

    NASA Technical Reports Server (NTRS)

    DeLombard, Richard

    1998-01-01

    During the ISS era, the NASA Lewis Research Center's Principal Investigator Microgravity Services (PIMS) project will provide to principal investigators (PIs) microgravity environment information and characterization of the accelerations to which their experiments were exposed during on orbit operations. PIMS supports PIs by providing them with microgravity environment information for experiment vehicles, carriers, and locations within the vehicle. This is done to assist the PI with their effort to evaluate the effect of acceleration on their experiments. Furthermore, PIMS responsibilities are to support the investigators in the area of acceleration data analysis and interpretation, and provide the Microgravity science community with a microgravity environment characterization of selected experiment carriers and vehicles. Also, PIMS provides expertise in the areas of microgravity experiment requirements, vibration isolation, and the implementation of requirements for different spacecraft to the microgravity community and other NASA programs.

  6. Evaluation of water-quality data and monitoring program for Lake Travis, near Austin, Texas

    USGS Publications Warehouse

    Rast, Walter; Slade, Raymond M.

    1998-01-01

    The multiple-comparison tests indicate that, for some constituents, a single sampling site for a constituent or property might adequately characterize the water quality of Lake Travis for that constituent or property. However, multiple sampling sites are required to provide information of sufficient temporal and spatial resolution to accurately evaluate other water-quality constituents for the reservoir. For example, the water-quality data from surface samples and from bottom samples indicate that nutrients (nitrogen, phosphorus) might require additional sampling sites for a more accurate characterization of their in-lake dynamics.
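
    A multiple-comparison test of the kind referenced above can be sketched as follows; the data are hypothetical, and scipy.stats.tukey_hsd is one standard choice rather than necessarily the test the authors applied.

```python
import numpy as np
from scipy.stats import tukey_hsd

# Hypothetical total-phosphorus measurements (mg/L) at three lake sites
rng = np.random.default_rng(1)
site_a = rng.normal(0.020, 0.004, 24)
site_b = rng.normal(0.021, 0.004, 24)
site_c = rng.normal(0.035, 0.004, 24)

# Pairwise comparisons: site pairs whose confidence intervals include zero
# are statistically indistinguishable, so a single site could represent both.
res = tukey_hsd(site_a, site_b, site_c)
print(res)
```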

  7. Space station data system analysis/architecture study. Task 1: Functional requirements definition, DR-5. Appendix: Requirements data base

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Appendix A contains data that characterize the system functions in sufficient depth to determine the requirements for the Space Station Data System (SSDS). These data are in the form of: (1) a top-down traceability report; (2) a bottom-up traceability report; (3) requirements data sheets; and (4) a cross index of requirements paragraphs of the source documents and the requirements numbers. A data base users guide is included that interested parties can use to access the requirements data base and get up-to-date information about the functions.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schey, Stephen; Francfort, Jim

    Task 1 includes a survey of the inventory of non-tactical fleet vehicles at Naval Air Station Whidbey Island (NASWI) to characterize the fleet. This information and characterization are used to select vehicles for monitoring that takes place during Task 2. This monitoring involves data logging of vehicle operation in order to identify the vehicle’s mission and travel requirements. Individual observations of these selected vehicles provide the basis for recommendations related to PEV adoption. It also identifies whether a battery electric vehicle or plug-in hybrid electric vehicle (collectively referred to as PEVs) can fulfill the mission requirements and provides observations related to placement of PEV charging infrastructure. This report provides the results of the assessments and observations of the current non-tactical fleet, fulfilling the Task 1 requirements.

  9. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process and the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods needed to address the estimation problems are known, but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
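
    As a sketch of the point-process estimation idea, the snippet below computes a Gaussian-kernel intensity estimate over a grid. It is illustrative only: the function and data are hypothetical, and the paper's representation additionally carries uncertainty and covariate information.

```python
import numpy as np

def kernel_intensity(points, grid_x, grid_y, bandwidth):
    """Gaussian-kernel estimate of point-process intensity lambda(x, y).

    points: (N, 2) anomaly locations. Returns the estimated intensity
    (expected counts per unit area) on the grid.
    """
    xx, yy = np.meshgrid(grid_x, grid_y)
    lam = np.zeros_like(xx)
    norm = 1.0 / (2 * np.pi * bandwidth ** 2)
    for px, py in points:
        lam += norm * np.exp(-((xx - px) ** 2 + (yy - py) ** 2)
                             / (2 * bandwidth ** 2))
    return lam

# Hypothetical detector hits clustered near a former target area
pts = np.random.default_rng(2).normal([50, 50], 5, size=(200, 2))
lam = kernel_intensity(pts, np.linspace(0, 100, 101),
                       np.linspace(0, 100, 101), 4.0)
print(lam.max())
```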

  10. Legacy sample disposition project. Volume 2: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gurley, R.N.; Shifty, K.L.

    1998-02-01

    This report describes the legacy sample disposition project at the Idaho National Engineering and Environmental Laboratory (INEEL), which assessed Site-wide facilities/areas to locate legacy samples and owner organizations and then characterized and dispositioned these samples. This project resulted from an Idaho Department of Environmental Quality inspection of selected areas of the INEEL in January 1996, which identified some samples at the Test Reactor Area and Idaho Chemical Processing Plant that had not been characterized and dispositioned according to Resource Conservation and Recovery Act (RCRA) requirements. The objective of the project was to manage legacy samples in accordance with all applicable environmental and safety requirements. A systems engineering approach was used throughout the project, which included collecting the legacy sample information and developing a system for amending and retrieving the information. All legacy samples were dispositioned by the end of 1997. Closure of the legacy sample issue was achieved through these actions.

  11. Partial information decomposition as a spatiotemporal filter.

    PubMed

    Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D

    2011-09-01

    Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
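
    For concreteness, here is a minimal sketch of the Williams-Beer redundancy measure I_min that partial information decomposition builds on, for two discrete sources and a target. Helper names are hypothetical; the paper's filters are built from quantities of this kind.

```python
import numpy as np

def specific_info(p_joint_sy):
    """I(Y=y; S) for each y, given the joint pmf p(s, y) as a 2D array."""
    p_y = p_joint_sy.sum(axis=0)                   # p(y)
    p_s = p_joint_sy.sum(axis=1, keepdims=True)    # p(s)
    p_s_given_y = p_joint_sy / p_y                 # p(s|y)
    p_y_given_s = p_joint_sy / p_s                 # p(y|s)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_s_given_y * np.log2(p_y_given_s / p_y)
    return np.nansum(terms, axis=0)                # one value per y

def i_min(p_x1x2y):
    """Williams-Beer redundancy I_min(Y; X1, X2) for a discrete joint pmf."""
    p_y = p_x1x2y.sum(axis=(0, 1))
    spec1 = specific_info(p_x1x2y.sum(axis=1))     # from marginal p(x1, y)
    spec2 = specific_info(p_x1x2y.sum(axis=0))     # from marginal p(x2, y)
    return float(np.sum(p_y * np.minimum(spec1, spec2)))

# XOR example: each source alone tells nothing about y, so redundancy is 0
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25
print(i_min(p))   # -> 0.0
```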

  12. Ecological Characterization Data for the 2004 Composite Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downs, Janelle L.; Simmons, Mary A.; Stegen, Jennifer A.

    2004-11-01

    A composite analysis is required by U.S. Department of Energy (DOE) Order 435.1 to ensure public safety through the management of active and planned low-level radioactive waste disposal facilities associated with the Hanford Site. The original Hanford Site Composite Analysis of 1998 must be revised and submitted to DOE Headquarters (DOE-HQ) in 2004 because of revisions to waste site information in the 100, 200, and 300 Areas, updated performance assessments and environmental impact statements (EIS), changes in inventory estimates for key sites and constituents, and a change in the definition of offsite receptors. Beginning in fiscal year (FY) 2003, the DOE Richland Operations Office (DOE-RL) initiated activities, including the development of data packages, to support the 2004 Composite Analysis. This report describes the data compiled in FY 2003 to support ecological site assessment modeling for the 2004 Composite Analysis. This work was conducted as part of the Characterization of Systems Task of the Groundwater Remediation Project (formerly the Groundwater Protection Program) managed by Fluor Hanford, Inc., Richland, Washington. The purpose of this report is to provide summaries of the characterization information and available spatial data on the biological resources and ecological receptors found in the upland, riparian, aquatic, and island habitats on the Hanford Site. These data constitute the reference information used to establish parameters for the ecological risk assessment module of the System Assessment Capability and other assessment activities requiring information on the presence and distribution of biota on the Hanford Site.

  13. Toxicologic Pathology: The Basic Building Block of Risk Assessment

    EPA Science Inventory

    Human health risk assessment is a critical factor in many risk management decisions. Evaluation of human health risk requires research that provides information that appropriately characterizes potential hazards from exposure. Pathology endpoints are the central response around ...

  14. Hybrid Network Architectures for the Next Generation NAS

    NASA Technical Reports Server (NTRS)

    Madubata, Christian

    2003-01-01

    To meet the needs of the 21st Century NAS, an integrated, network-centric infrastructure is essential, characterized by secure, high-bandwidth digital communication systems that support precision navigation capable of reducing position errors for all aircraft to within a few meters. This system will also require precision surveillance systems capable of accurately locating all aircraft and automatically detecting any deviations from an approved path within seconds, and it must be able to deliver high-resolution weather forecasts, critical for creating 4-dimensional (space and time) profiles for up to 6 hours for all atmospheric conditions affecting aviation, including wake vortices. The 21st Century NAS will be characterized by highly accurate digital databases depicting terrain, obstacle, and airport information no matter what visibility conditions exist. This research task will perform a high-level requirements analysis of the applications, information, and services required by the next-generation National Airspace System. The investigation and analysis are expected to lead to the development and design of several national network-centric communications architectures capable of supporting the Next Generation NAS.

  15. Robust and efficient anomaly detection using heterogeneous representations

    NASA Astrophysics Data System (ADS)

    Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou

    2015-05-01

    Various approaches have been proposed for video anomaly detection, yet these approaches typically suffer from one or more limitations: they often characterize a pattern using its internal information but ignore its external relationship, which is important for local anomaly detection. Moreover, the high dimensionality and lack of robustness of the pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and a high false alarm rate. We propose a video anomaly detection framework that relies on a heterogeneous representation to account for both the pattern's internal information and its external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with state-of-the-art approaches on widely used benchmark datasets.
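
    A toy version of the linear slow feature analysis step mentioned above is sketched below, under simplifying assumptions; real systems apply nonlinear expansions to low-level video descriptors rather than the synthetic signals used here.

```python
import numpy as np

def linear_sfa(x, n_features=2):
    """Minimal linear slow feature analysis.

    x: (T, D) time series of descriptors; assumes a well-conditioned
    covariance. Returns the n_features slowest output signals.
    """
    x = x - x.mean(axis=0)
    # Whiten the inputs so all directions have unit variance
    evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ (evecs / np.sqrt(evals))
    # Slow directions = smallest-eigenvalue directions of the derivative cov
    dz = np.diff(z, axis=0)
    _, d_evecs = np.linalg.eigh(np.cov(dz, rowvar=False))
    return z @ d_evecs[:, :n_features]  # eigh sorts ascending: slowest first

t = np.linspace(0, 4 * np.pi, 500)
x = np.column_stack([np.sin(t) + 0.1 * np.random.randn(500),
                     np.random.randn(500)])
slow = linear_sfa(x, 1)
# Recovers the slowly varying sinusoid (up to sign): correlation near +/-1
print(np.corrcoef(slow[:, 0], np.sin(t))[0, 1])
```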

  16. 49 CFR 579.22 - Reporting requirements for manufacturers of 100 or more buses, manufacturers of 500 or more...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., 14 air bags, 15 seat belts, 16 structure, 17 latch, 18 vehicle speed control, 19 tires, 20 wheels, 21 ... hydraulic or air), the information required by this subsection shall be reported by each of the two brake types. If the service brake system in a vehicle is not readily characterized as either hydraulic or air...

  17. 49 CFR 579.22 - Reporting requirements for manufacturers of 100 or more buses, manufacturers of 500 or more...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., 14 air bags, 15 seat belts, 16 structure, 17 latch, 18 vehicle speed control, 19 tires, 20 wheels, 21 ... hydraulic or air), the information required by this subsection shall be reported by each of the two brake types. If the service brake system in a vehicle is not readily characterized as either hydraulic or air...

  18. Integrating gene expression data with demographic, clinical, and environmental exposure information to reveal endotypes of childhood asthma

    EPA Science Inventory

    RATIONALE. Childhood asthma is a multifactorial disease whose pathogenesis involves complex interplay between genetic susceptibility and modulating external factors. Therefore, effectively characterizing these multiple etiological pathways, or “endotypes”, requires an integrative...

  19. Measurements Required to Understand the Lunar Dust Environment and Transport Mechanism

    NASA Technical Reports Server (NTRS)

    Spann, James F., Jr.; Abbas, Mian

    2006-01-01

    Going back to the lunar surface offers an opportunity to understand the dust environment and associated transport mechanisms. This talk will explore what measurements are required to understand and characterize the dust-plasma environment in which robotic and human activities will be conducted. The understanding gained with the measurements can be used to make informed decisions on engineering solutions and follow-on investigations. Particular focus will be placed on required measurements of the size, spatial and charge distribution of the suspended lunar regolith.

  20. The Global Invasive Species Information Network: contributing to GEO Task BI-07-01b

    NASA Astrophysics Data System (ADS)

    Graham, J.; Morisette, J. T.; Simpson, A.

    2009-12-01

    Invasive alien species (IAS) threaten biodiversity and exert a tremendous cost on society for IAS prevention and eradication. They endanger natural ecosystem functioning and seriously impact biodiversity and agricultural production. The task definition for the GEO task BI-07-01b: Invasive Species Monitoring System is to characterize, monitor, and predict changes in the distribution of invasive species. This includes characterizing the current requirements and capacity for invasive species monitoring and developing strategies for implementing cross-search functionality among existing online invasive species information systems from around the globe. The Task is being coordinated by members of the Global Invasive Species Information Network (GISIN) and their partners. Information on GISIN and a prototype of the network is available at www.gisin.org. This talk will report on the current status of GISIN and review how researchers can either contribute to or utilize data from this network.

  1. The Advancement of Public Awareness, Concerning TRU Waste Characterization, Using a Virtual Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, T. B.; Burns, T. P.; Estill, W. G.

    2002-02-28

    Building public trust and confidence through openness is a goal of the DOE Carlsbad Field Office for the Waste Isolation Pilot Plant (WIPP). The objective of the virtual document described in this paper is to give the public an overview of the waste characterization steps, an understanding of how waste characterization instrumentation works, and the type and amount of data generated from a batch of drums. The document is intended to be published on a web page and/or distributed at public meetings on CDs. Users may gain as much information as they desire regarding the transuranic (TRU) waste characterization program, starting at the highest level requirements (drivers) and progressing to more and more detail regarding how the requirements are met. Included are links to: drivers (which include laws, permits and DOE Orders); various characterization steps required for transportation and disposal under WIPP's Hazardous Waste Facility Permit; physical/chemical basis for each characterization method; types of data produced; and quality assurance process that accompanies each measurement. Examples of each type of characterization method in use across the DOE complex are included. The original skeleton of the document was constructed in a PowerPoint presentation and included descriptions of each section of the waste characterization program. This original document had a brief overview of Acceptable Knowledge, Non-Destructive Examination, Non-Destructive Assay, Small Quantity sites, and the National Certification Team. A student intern was assigned the project of converting the document to a virtual format and to discuss each subject in depth. The resulting product is a fully functional virtual document that works in a web browser and functions like a web page. All documents that were referenced, linked to, or associated are included on the virtual document's CD. WIPP has been engaged in a variety of Hazardous Waste Facility Permit modification activities. During the public meetings, discussion centered on proposed changes to the characterization program. The philosophy behind the virtual document is to show the characterization process as a whole, rather than as isolated parts. In addition to public meetings, other uses for the information might be as a training tool for new employees at the WIPP facility to show them where their activities fit into the overall scheme, as well as an employee review to help prepare for waste certification audits.

  2. Regulatory controls on the hydrogeological characterization of a mixed waste disposal site, Radioactive Waste Management Complex, Idaho National Engineering Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebelmann, K.L.

    1990-01-01

    Following the detection of chlorinated volatile organic compounds in the groundwater beneath the Subsurface Disposal Area (SDA) in the summer of 1987, hydrogeological characterization of the Radioactive Waste Management Complex (RWMC), Idaho National Engineering Laboratory (INEL), was required by the Resource Conservation and Recovery Act (RCRA). The SDA waste site is the subject of a RCRA Corrective Action Program. Regulatory requirements for the Corrective Action Program dictate a phased approach to evaluation of the SDA. In the first phase of the program, the SDA is the subject of a RCRA Facility Investigation (RFI), which will obtain information to fully characterize the physical properties of the site, determine the nature and extent of contamination, and identify pathways for migration of contaminants. If the need for corrective measures is identified during the RFI, a Corrective Measures Study (CMS) will be performed as the second phase. Information generated during the RFI will be used to aid in the selection and implementation of appropriate corrective measures to correct the release. Following the CMS, the final phase is the implementation of the selected corrective measures. 4 refs., 1 fig.

  3. Characterization and Prediction of Chemical Functions and Weight Fractions in Consumer Products

    EPA Science Inventory

    Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose fil...

  4. Improving Exposure Science and Dose Metrics for Toxicity Testing, Screening, Prioritizing, and Risk Assessment

    EPA Science Inventory

    Advance the characterization of exposure and dose metrics required to translate advances and findings in computational toxicology to information that can be directly used to support exposure and risk assessment for decision making and improved public health.

  5. Questioning the Consensus: Managing Carrier Status Results Generated by Newborn Screening

    PubMed Central

    Robert, Jason Scott; Hayeems, Robin Z.

    2009-01-01

    An apparent consensus governs the management of carrier status information generated incidentally through newborn screening: results cannot be withheld from parents. This normative stance encodes the focus on autonomy and distaste for paternalism that characterize the principles of clinical bioethics. However, newborn screening is a classic public health intervention in which paternalism may trump autonomy and through which parents are—in effect—required to receive carrier information. In truth, the disposition of carrier results generates competing moral infringements: to withhold information or require its possession. Resolving this dilemma demands consideration of a distinctive body of public health ethics to highlight the moral imperatives associated with the exercise of collective authority in the pursuit of public health benefits. PMID:19059852

  6. Information Infrastructure, Information Environments, and Long-Term Collaboration

    NASA Astrophysics Data System (ADS)

    Baker, K. S.; Pennington, D. D.

    2009-12-01

    Information infrastructure that supports collaborative science is a complex system of people, organizational arrangements, and tools that require co-management. Contemporary studies are exploring how to establish and characterize effective collaborative information environments. Collaboration depends on the flow of information across the human and technical system components through mechanisms that create linkages, both conceptual and technical. This transcends the need for requirements solicitation and usability studies, highlighting synergistic interactions between humans and technology that can lead to emergence of group level cognitive properties. We consider the ramifications of placing priority on establishing new metaphors and new types of learning environments located near-to-data-origin for the field sciences. In addition to changes in terms of participant engagement, there are implications in terms of innovative contributions to the design of information systems and data exchange. While data integration occurs in the minds of individual participants, it may be facilitated by collaborative thinking and community infrastructure. Existing learning frameworks - from Maslow’s hierarchy of needs to organizational learning - require modification and extension if effective approaches to decentralized information management and systems design are to emerge. Case studies relating to data integration include ecological community projects: development of cross-disciplinary conceptual maps and of a community unit registry.

  7. Fostering Engagement Activities To Advance Adaptation And Resiliency

    NASA Astrophysics Data System (ADS)

    Dissen, J.; Owen, T.; Brewer, M.; Hollingshead, A.; Mecray, E. L.; Werner, K.

    2015-12-01

    As the understanding of climate risks grows for public and private companies, the dissemination of meaningful climate and environmental information becomes important for improved risk management practices and innovation. In a broader effort to build capacity for adaptation and demonstrate the value of investment in resiliency, NCEI and its partners have made several shifts to showcase an improved understanding of uses and applications of climate and environmental data and information. The NOAA NCEI engagement initiative includes actively exploring ways to: 1) identify opportunities in data use and applications and 2) characterize needs and requirements from customers to help inform investment in the relevant science. This presentation will highlight: 1) NCEI's engagement initiative strategy, 2) our regional and national partnerships as agents of engagement in the region, 3) a few examples of uses of climate information with select stakeholders and 4) justification of customer engagement and requirements as a critical component in informing the science agenda.

  8. Proximal—distal pattern formation in Drosophila: cell autonomous requirement for Distal-less gene activity in limb development

    PubMed Central

    Cohen, Stephen M.; Jürgens, Gerd

    1989-01-01

    Limb development in the Drosophila embryo requires a pattern-forming system to organize positional information along the proximal–distal axis of the limb. This system must function in the context of the well characterized anterior–posterior and dorsal–ventral pattern-forming systems that are required to organize the body plan of the embryo. By genetic criteria the Distal-less gene appears to play a central role in limb development. Lack-of-function Distal-less mutations cause the deletion of a specific subset of embryonic peripheral sense organs that represent the evolutionary remnants of larval limbs. Distal-less activity is also required in the imaginal discs for the development of adult limbs. This requirement is cell autonomous and region specific within the developing limb primordium. Production of genetically mosaic imaginal discs, in which clones of cells lack Distal-less activity, indicates the existence of organized proximal–distal positional information in very young imaginal disc primordia. We suggest that this graded positional information may depend on the activity of the Distal-less gene. PMID:16453891

  9. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for the selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by application of chaotic characteristics.
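
    To make "extracting characteristics from a sensor signal" concrete, here is a sketch computing a few spectral features from a synthetic acoustic emission channel. The signal and feature choices are hypothetical, and the nonlinear and chaotic characteristics the authors also use are not shown.

```python
import numpy as np
from scipy.signal import welch

def spectral_features(signal, fs):
    """A few candidate characteristics from one sensor channel.

    Informative characteristics (e.g., band power near a chatter frequency)
    would then be selected by their correlation with the process state.
    """
    f, pxx = welch(signal, fs=fs, nperseg=1024)
    centroid = np.sum(f * pxx) / np.sum(pxx)   # spectral centroid
    peak_freq = f[np.argmax(pxx)]              # dominant spectral component
    rms = np.sqrt(np.mean(signal ** 2))        # overall signal energy
    return {"rms": rms, "centroid_hz": centroid, "peak_hz": peak_freq}

fs = 20_000.0
t = np.arange(0, 1.0, 1 / fs)
# Hypothetical AE signal: broadband noise plus an emerging chatter tone
sig = (np.random.default_rng(3).normal(0, 1, t.size)
       + 0.8 * np.sin(2 * np.pi * 3200 * t))
print(spectral_features(sig, fs))
```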

  10. Tank characterization report for single-shell tank 241-S-111

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conner, J.M.

    1997-04-28

    One of the major functions of the Tank Waste Remediation System (TWRS) is to characterize wastes in support of waste management and disposal activities at the Hanford Site. Analytical data from sampling and analysis, along with other available information about a tank, are compiled and maintained in a tank characterization report (TCR). This report and its appendices serve as the TCR for single-shell tank 241-S-111. The objectives of this report are: (1) to use characterization data to address technical issues associated with tank 241-S-111 waste; and (2) to provide a standard characterization of this waste in terms of a best-basis inventory estimate. The response to technical issues is summarized in Section 2.0, and the best-basis inventory estimate is presented in Section 3.0. Recommendations regarding safety status and additional sampling needs are provided in Section 4.0. Supporting data and information are contained in the appendices. This report also supports the requirements of Hanford Federal Facility Agreement and Consent Order (Ecology et al. 1996) milestone M-44-10.

  11. 40 CFR 194.24 - Waste characterization.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... other information and methods. (b) The Department shall submit in the compliance certification... proposed for disposal in the disposal system, WIPP complies with the numeric requirements of § 194.34 and... release. (2) Identify and describe the method(s) used to quantify the limits of waste components...

  12. Electric Vehicle Preparedness: Task 2, Identification of Vehicles for Installation of Data Loggers for Marine Corps Base Camp Lejeune

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schey, Stephen; Francfort, Jim

    2015-02-01

    In Task 1, a survey was completed of the inventory of non-tactical fleet vehicles at the Marine Corps Base Camp Lejeune (MCBCL) to characterize the fleet. This information and characterization was used to select vehicles for further monitoring, which involves data logging of vehicle movements in order to identify the vehicle’s mission and travel requirements. Individual observations of these selected vehicles provide the basis for recommendations related to PEV adoption. It also identifies whether a battery electric vehicle or plug-in hybrid electric vehicle (collectively referred to as PEVs) can fulfill the mission requirements and provides observations related to placement of PEV charging infrastructure. This report provides the list of vehicles selected by MCBCL and Intertek for further monitoring and fulfills the Task 2 requirements.

  13. Technique for handling wave propagation specific effects in biological tissue: mapping of the photon transport equation to Maxwell's equations.

    PubMed

    Handapangoda, Chintha C; Premaratne, Malin; Paganin, David M; Hendahewa, Priyantha R D S

    2008-10-27

    A novel algorithm for mapping the photon transport equation (PTE) to Maxwell's equations is presented. Owing to its accuracy, the PTE is used to model wave propagation through biological tissue. The mapping of the PTE to Maxwell's equations is required to model wave propagation through foreign structures implanted in biological tissue for sensing and characterization of tissue properties. The PTE solves for only the magnitude of the intensity, but Maxwell's equations require the phase information as well. However, it is possible to construct the phase information approximately by solving the transport of intensity equation (TIE) using the full multigrid algorithm.
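
    For reference, the transport of intensity equation mentioned here has a standard paraxial form (the abstract does not quote it, and sign conventions vary across the literature):

```latex
% Transport of intensity equation (TIE), standard paraxial form:
%   I            measured intensity
%   \phi         phase to be recovered
%   k            wavenumber, k = 2\pi/\lambda
%   \nabla_\perp gradient transverse to the propagation direction z
k \,\frac{\partial I}{\partial z}
  = -\,\nabla_{\perp} \cdot \left( I \, \nabla_{\perp} \phi \right)
```

    Given measurements of I and its axial derivative, this is an elliptic equation in the phase, which is the kind of problem that multigrid solvers such as the full multigrid algorithm handle efficiently.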

  14. Methods for the In Vitro Characterization of Nanomedicines—Biological Component Interaction

    PubMed Central

    Fornaguera, Cristina; Solans, Conxita

    2017-01-01

    The design of colloidal nanosystems intended for biomedical applications, specifically in the field of personalized medicine, has increased notably in recent years. Consequently, a variety of characterization techniques devoted to studying nanomedicine interactions with proteins and cells have been developed, since a deep characterization of nanosystems is required before starting preclinical and clinical studies. In this context, this review aims to summarize the main techniques used to assess the interaction of nanomedicines with biological systems, highlighting their advantages and disadvantages. Testing designed nanomaterials with these techniques is required in order to have more information about their behavior in a physiological environment. Moreover, techniques used to study the interaction of nanomedicines with proteins, such as albumin and fibrinogen, are summarized. These interactions are not desired, since they are usually the first signal to the body for the activation of the immune system, which leads to the clearance of the exogenous components. On the other hand, techniques for studying the cell toxicity of nanosystems are also summarized, since this information is required before starting preclinical steps. The translation of knowledge from novel designed nanosystems at a research laboratory scale to real human therapies is usually a limiting or even a final point due to the lack of systematic studies regarding these two aspects: nanoparticle interaction with biological components and nanoparticle cytotoxicity. In conclusion, this review will be a useful support for those scientists aiming to develop nanosystems for nanomedicine purposes. PMID:28134833

  15. A Parallel Stochastic Framework for Reservoir Characterization and History Matching

    DOE PAGES

    Thomas, Sunil G.; Klie, Hector M.; Rodriguez, Adolfo A.; ...

    2011-01-01

    The spatial distribution of parameters that characterize the subsurface is never known to any reasonable level of accuracy required to solve the governing PDEs of multiphase flow or species transport through porous media. This paper presents a numerically cheap, yet efficient, accurate and parallel framework to estimate reservoir parameters, for example, medium permeability, using sensor information from measurements of the solution variables such as phase pressures, phase concentrations, fluxes, and seismic and well log data. Numerical results are presented to demonstrate the method.

  16. A project management system for the X-29A flight test program

    NASA Technical Reports Server (NTRS)

    Stewart, J. F.; Bauer, C. A.

    1983-01-01

    The project-management system developed for NASA's participation in the X-29A aircraft development program is characterized from a theoretical perspective, as an example of a system appropriate to advanced, highly integrated technology projects. System-control theory is applied to the analysis of classical project-management techniques and structures, which are found to be of the closed-loop multivariable type, and the effects of increasing project complexity and integration are evaluated. The importance of information flow, sampling frequency, information holding, and delays is stressed. The X-29A system is developed in four stages: establishment of overall objectives and requirements, determination of information processes (block diagrams), definition of personnel functional roles and relationships, and development of a detailed work-breakdown structure. The resulting system is shown to require a greater information flow to management than conventional methods. Sample block diagrams are provided.

  17. WIPP waste characterization program sampling and analysis guidance manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  18. Development of a Screening Approach to Detect Thyroid Disrupting Chemicals that Inhibit the Human Sodium/Iodide Symporter (NIS)

    EPA Science Inventory

    Thyroid hormone synthesis requires active iodide uptake mediated by the sodium/iodide symporter (NIS). Monovalent anions, such as the environmental contaminant perchlorate, have been well characterized as competitive inhibitors of NIS, yet limited information exists for more stru...

  19. Plasma-edge studies using carbon resistance probes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, W.R.

    1984-01-01

    Characterization of erosion and hydrogen-recycling processes occurring at the edge of magnetically confined plasmas requires knowledge of the energy and flux of hydrogen isotopes incident on the materials. A new plasma-edge probe technique, the carbon resistance probe, has been developed to obtain this information. This technique uti...

  20. PUBLIC HEALTH AND ECOLOGICAL INTERCONNECTIVITY: A CONDITIONAL PROBABILITY APPROACH ASSOCIATING DEGRADATION OF STREAMS AND INFANT MORTALITY

    EPA Science Inventory

    Effective public health policy should not be based solely on clinical, individual-based information, but requires a broad characterization of human health conditions across large geographic areas. For the most part, the necessary monitoring of human health to ...

  1. CHARACTERIZE INTERACTIONS BETWEEN ECOSYSTEM FUNCTIONING AND CHANGES IN CLIMATE, UV, AND LAND USE

    EPA Science Inventory

    Assessments of the long-term impacts of global changes in climate, ultraviolet (UV) radiation and land use on ecosystems require scientific data, concepts and models that describe the responses of ecosystem health to stresses related to the changes as well as information and mode...

  2. Comparison of Modeling Approaches to Prioritize Chemicals Based on Estimates of Exposure and Exposure Potential

    EPA Science Inventory

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecologic...

  3. Surface characterization of nanomaterials and nanoparticles: Important needs and challenging opportunities

    PubMed Central

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.; Laskin, Julia; Lai, Jinfeng; Mueller, Karl; Munusamy, Prabhakaran; Thevuthasan, Suntharampillai; Wang, Hongfei; Washton, Nancy; Elder, Alison; Baisch, Brittany L.; Karakoti, Ajay; Kuchibhatla, Satyanarayana V. N. T.; Moon, DaeWon

    2013-01-01

    This review examines characterization challenges inherently associated with understanding nanomaterials and the roles surface and interface characterization methods can play in meeting some of the challenges. In parts of the research community, there is growing recognition that studies and published reports on the properties and behaviors of nanomaterials often have reported inadequate or incomplete characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. With the increasing importance of nanomaterials in fundamental research and technological applications, it is desirable that researchers from the wide variety of disciplines involved recognize the nature of these often unexpected challenges associated with reproducible synthesis and characterization of nanomaterials, including the difficulties of maintaining desired materials properties during handling and processing due to their dynamic nature. It is equally valuable for researchers to understand how characterization approaches (surface and otherwise) can help to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. Appropriate application of traditional surface sensitive analysis methods (including x-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy, and secondary ion mass spectroscopy) can provide information that helps address several of the analysis needs. In many circumstances, extensions of traditional data analysis can provide considerably more information than normally obtained from the data collected. Less common or evolving methods with surface selectivity (e.g., some variations of nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) can provide information about surfaces or interfaces in working environments (operando or in situ) or information not provided by more traditional methods. Although these methods may require instrumentation or expertise not generally available, they can be particularly useful in addressing specific questions, and examples of their use in nanomaterial research are presented. PMID:24482557

  4. Surface characterization of nanomaterials and nanoparticles: Important needs and challenging opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.

    2013-09-15

    This review examines characterization challenges inherently associated with understanding nanomaterials and the roles surface and interface characterization methods can play in meeting some of the challenges. In parts of the research community, there is growing recognition that studies and published reports on the properties and behaviors of nanomaterials often have reported inadequate or incomplete characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. With the increasing importance of nanomaterials in fundamental research and technological applications, it is desirable that researchers from the wide variety of disciplines involved recognize the nature of these often unexpected challenges associated with reproducible synthesis and characterization of nanomaterials, including the difficulties of maintaining desired materials properties during handling and processing due to their dynamic nature. It is equally valuable for researchers to understand how characterization approaches (surface and otherwise) can help to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. Appropriate application of traditional surface sensitive analysis methods (including x-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy, and secondary ion mass spectroscopy) can provide information that helps address several of the analysis needs. In many circumstances, extensions of traditional data analysis can provide considerably more information than normally obtained from the data collected. Less common or evolving methods with surface selectivity (e.g., some variations of nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) can provide information about surfaces or interfaces in working environments (operando or in situ) or information not provided by more traditional methods. Although these methods may require instrumentation or expertise not generally available, they can be particularly useful in addressing specific questions, and examples of their use in nanomaterial research are presented.

  5. Statistical Characterization of Environmental Error Sources Affecting Electronically Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Green, Del L.; Walker, Eric L.; Everhart, Joel L.

    2006-01-01

    Minimization of uncertainty is essential to extend the usable range of the 15-psid Electronically Scanned Pressure (ESP) transducer measurements to the low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources inducing much of this uncertainty requires a well defined and controlled calibration method. Employing such a controlled calibration system, several studies were conducted that provide quantitative information detailing the required controls needed to minimize environmental and human induced error sources. Results of temperature, environmental pressure, over-pressurization, and set point randomization studies for the 15-psid transducers are presented along with a comparison of two regression methods using data acquired with both 0.36-psid and 15-psid transducers. Together these results provide insight into procedural and environmental controls required for long term high-accuracy pressure measurements near 0.01 psia in the hypersonic testing environment using 15-psid ESP transducers.

  6. Statistical Characterization of Environmental Error Sources Affecting Electronically Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Green, Del L.; Walker, Eric L.; Everhart, Joel L.

    2006-01-01

    Minimization of uncertainty is essential to extend the usable range of the 15-psid Electronically Scanned Pressure (ESP) transducer measurements to the low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources inducing much of this uncertainty requires a well defined and controlled calibration method. Employing such a controlled calibration system, several studies were conducted that provide quantitative information detailing the required controls needed to minimize environmental and human induced error sources. Results of temperature, environmental pressure, over-pressurization, and set point randomization studies for the 15-psid transducers are presented along with a comparison of two regression methods using data acquired with both 0.36-psid and 15-psid transducers. Together these results provide insight into procedural and environmental controls required for long term high-accuracy pressure measurements near 0.01 psia in the hypersonic testing environment using 15-psid ESP transducers.

  7. From 2D to 3D Supervised Segmentation and Classification for Cultural Heritage Applications

    NASA Astrophysics Data System (ADS)

    Grilli, E.; Dininno, D.; Petrucci, G.; Remondino, F.

    2018-05-01

    The digital management of architectural heritage information is still a complex problem, as a heritage object requires an integrated representation of various types of information in order to develop appropriate restoration or conservation strategies. Currently, there is extensive research focused on automatic procedures of segmentation and classification of 3D point clouds or meshes, which can accelerate the study of a monument and integrate it with heterogeneous information and attributes, useful to characterize and describe the surveyed object. The aim of this study is to propose an optimal, repeatable and reliable procedure to manage various types of 3D surveying data and associate them with heterogeneous information and attributes to characterize and describe the surveyed object. In particular, this paper presents an approach for classifying 3D heritage models, starting from the segmentation of their textures based on supervised machine learning methods. Experimental results run on three different case studies demonstrate that the proposed approach is effective and with many further potentials.
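
    The abstract leaves the choice of features and classifier open; the following Python sketch shows the general texture-based supervised classification step, using hypothetical per-pixel texture features and a random forest (one plausible choice, not necessarily the authors').

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical stand-in data: per-pixel texture features (e.g., local mean,
# variance, edge response) with labels drawn on a small annotated region.
# The paper's actual features and classifier are not specified in the abstract.
rng = np.random.default_rng(0)
X_train = rng.random((500, 3))                       # 3 texture features per labeled pixel
y_train = rng.integers(0, 4, 500)                    # 4 hypothetical material classes

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify every pixel of a new texture map; the 2D labels can then be
# back-projected onto the 3D mesh via the texture (UV) mapping.
X_new = rng.random((10_000, 3))
labels = clf.predict(X_new)
print(np.bincount(labels))                           # pixels assigned to each class
```

    The predicted 2D labels would then be transferred from texture space onto the 3D model, which is the 2D-to-3D step the title refers to.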

  8. Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling

    PubMed Central

    Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.

    2011-01-01

    Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune responses against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3, ...), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571

  9. Characterizing the Fundamental Intellectual Steps Required in the Solution of Conceptual Problems

    NASA Astrophysics Data System (ADS)

    Stewart, John

    2010-02-01

    At some level, the performance of a science class must depend on what is taught, the information content of the materials and assignments of the course. The introductory calculus-based electricity and magnetism class at the University of Arkansas is examined using a catalog of the basic reasoning steps involved in the solution of problems assigned in the class. This catalog was developed by sampling popular physics textbooks for conceptual problems. The solution to each conceptual problem was decomposed into its fundamental reasoning steps. These fundamental steps are then used to quantify the distribution of conceptual content within the course. Using this characterization technique, an exceptionally detailed picture of the information flow and structure of the class can be produced. The intellectual structure of published conceptual inventories is compared with the information presented in the class and the dependence of conceptual performance on the details of coverage extracted.

  10. Probabilistic characterization of wind turbine blades via aeroelasticity and spinning finite element formulation

    NASA Astrophysics Data System (ADS)

    Velazquez, Antonio; Swartz, R. Andrew

    2012-04-01

    Wind energy is an increasingly important component of this nation's renewable energy portfolio; however, safe and economical wind turbine operation is critical to ensure continued adoption. Safe operation of wind turbine structures requires not only information regarding their condition, but their operational environment. Given the difficulty inherent in SHM processes for wind turbines (damage detection, location, and characterization), some uncertainty in condition assessment is expected. Furthermore, given the stochastic nature of the loading on turbine structures, a probabilistic framework is appropriate to characterize their risk of failure at a given time. Such information will be invaluable to turbine controllers, allowing them to operate the structures within acceptable risk profiles. This study explores the characterization of the turbine loading and response envelopes for critical failure modes of the turbine blade structures. A framework is presented to develop an analytical estimation of the loading environment (including loading effects) based on the dynamic behavior of the blades. This is influenced by behaviors including along and across-wind aero-elastic effects, wind shear gradient, tower shadow effects, and centrifugal stiffening effects. The proposed solution includes methods that are based on modal decomposition of the blades and require frequent updates to the estimated modal properties to account for the time-varying nature of the turbine and its environment. The estimated demand statistics are compared to a code-based resistance curve to determine a probabilistic estimate of the risk of blade failure given the loading environment.

  11. Deep Borehole Field Test Requirements and Controlled Assumptions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion.

  12. Requirement analysis for the one-stop logistics management of fresh agricultural products

    NASA Astrophysics Data System (ADS)

    Li, Jun; Gao, Hongmei; Liu, Yuchuan

    2017-08-01

    Issues and concerns about food safety, agro-processing, and the environmental and ecological impact of food production have attracted much research interest. Traceability and logistics management of fresh agricultural products faces technological challenges including food product labeling and identification, activity/process characterization, and information systems for the supply chain, i.e., from farm to table. The application of one-stop logistics services, which focus on whole-supply-chain process integration for fresh agricultural products, is studied. A collaborative research project on the supply and logistics of fresh agricultural products in Tianjin was performed. Requirement analysis for the one-stop logistics management information system is presented. Model-driven business transformation, an approach that uses formal models to explicitly define the structure and behavior of a business, is applied in the review and analysis process. Specific requirements for the logistics management solutions are proposed. This research is crucial to an integrated one-stop logistics management information system platform for fresh agricultural products.

  13. Efficient high-dimensional characterization of conductivity in a sand box using massive MRI-imaged concentration data

    NASA Astrophysics Data System (ADS)

    Lee, J. H.; Yoon, H.; Kitanidis, P. K.; Werth, C. J.; Valocchi, A. J.

    2015-12-01

    Characterizing subsurface properties, particularly hydraulic conductivity, is crucial for reliable and cost-effective groundwater supply management, contaminant remediation, and emerging deep subsurface activities such as geologic carbon storage and unconventional resource recovery. With recent advances in sensor technology, a large volume of hydro-geophysical and chemical data can be obtained to achieve high-resolution images of subsurface properties, which can be used for accurate subsurface flow and reactive transport predictions. However, subsurface characterization with a plethora of information requires high, often prohibitive, computational costs associated with "big data" processing and large-scale numerical simulations. As a result, traditional inversion techniques are not well-suited for problems that require coupled multi-physics simulation models with massive data. In this work, we apply a scalable inversion method called the Principal Component Geostatistical Approach (PCGA) for characterizing the heterogeneous hydraulic conductivity (K) distribution in a 3-D sand box. The PCGA is a Jacobian-free geostatistical inversion approach that uses the leading principal components of the prior information to reduce computational costs, sometimes dramatically, and can be easily linked with any simulation software. Sequential images of transient tracer concentrations in the sand box were obtained using the magnetic resonance imaging (MRI) technique, resulting in 6 million tracer-concentration data points [Yoon et al., 2008]. Since each individual tracer observation has little information on the K distribution, the dimension of the data was reduced using temporal moments and the discrete cosine transform (DCT). Consequently, 100,000 unknown K values consistent with the scale of the MRI data (at a scale of 0.25^3 cm^3) were estimated by matching temporal moments and DCT coefficients of the original tracer data. Estimated K fields are close to the true K field, and even small-scale variability of the sand box was captured to highlight high-K connectivity and contrasts between low- and high-K zones. A total of 1,000 MODFLOW and MT3DMS simulations were required to obtain final estimates and corresponding estimation uncertainty, showing the efficiency and effectiveness of our method.
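
    As a sketch of the data-reduction idea described above (synthetic arrays stand in for the MRI data; grid size, time axis, and units are assumed), the zeroth temporal moment collapses each voxel's breakthrough curve to a single number, and a truncated DCT compresses the result further:

```python
import numpy as np
from scipy.fft import dctn

# Synthetic stand-in for the MRI data: tracer breakthrough curves c(t) on a
# voxel grid. Grid size, time axis, and units are assumed for illustration.
rng = np.random.default_rng(1)
nt, grid = 60, (20, 20, 15)
t = np.linspace(0.0, 30.0, nt)                       # time axis (assumed units)
conc = rng.random((nt, *grid))                       # c(t) at every voxel

# Zeroth temporal moment per voxel: m0 = integral of c(t) dt, which collapses
# each 60-point breakthrough curve to a single number.
m0 = np.trapz(conc, t, axis=0)

# Further compression: keep only low-frequency 3D DCT coefficients of the
# moment field, mirroring the moment/DCT reduction described in the abstract.
coeffs = dctn(m0, norm="ortho")
compressed = coeffs[:8, :8, :8]                      # leading coefficients only
print(conc.size, "->", compressed.size, "values")
```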

  14. Characterization of the stress and refractive-index distributions in optical fibers and fiber-based devices

    NASA Astrophysics Data System (ADS)

    Hutsel, Michael R.

    2011-07-01

    Optical fiber technology continues to advance rapidly as a result of the increasing demands on communication systems and the expanding use of fiber-based sensing. New optical fiber types and fiber-based communications components are required to permit higher data rates, an increased number of channels, and more flexible installation requirements. Fiber-based sensors are continually being developed for a broad range of sensing applications, including environmental, medical, structural, industrial, and military. As optical fibers and fiber-based devices continue to advance, the need to understand their fundamental physical properties increases. The residual-stress distribution (RSD) and the refractive-index distribution (RID) play fundamental roles in the operation and performance of optical fibers. Custom RIDs are used to tailor the transmission properties of fibers used for long-distance transmission and to enable fiber-based devices such as long-period fiber gratings (LPFGs). The introduction and modification of RSDs enable specialty fibers, such as polarization-maintaining fiber, and contribute to the operation of fiber-based devices. Furthermore, the RSD and the RID are inherently linked through the photoelastic effect. Therefore, both the RSD and the RID need to be characterized because these fundamental properties are coupled and affect the fabrication, operation, and performance of fibers and fiber-based devices. To characterize effectively the physical properties of optical fibers, the RSD and the RID must be measured without perturbing or destroying the optical fiber. Furthermore, the techniques used must not be limited in detecting small variations and asymmetries in all directions through the fiber. Finally, the RSD and the RID must be characterized concurrently without moving the fiber to enable the analysis of the relationship between the RSD and the RID. Although many techniques exist for characterizing the residual stress and the refractive index in optical fibers, there is no existing methodology that meets all of these requirements. Therefore, the primary objective of the research presented in this thesis was to provide a methodology that is capable of characterizing concurrently the three-dimensional RSD and RID in optical fibers and fiber-based devices. This research represents a detailed study of the requirements for characterizing optical fibers and how these requirements are met through appropriate data analysis and experimental apparatus design and implementation. To validate the developed methodology, the secondary objective of this research was to characterize both unperturbed and modified optical fibers. The RSD and the RID were measured in a standard telecommunications-grade optical fiber, Corning SMF-28. The effects of cleaving this fiber were also analyzed and the longitudinal variations that result from cleaving were explored for the first time. The fabrication of carbon-dioxide-laser-induced (CO2 -laser-induced) LPFGs was also examined. These devices provide many of the functionalities required for fiber-based communications components as well as fiber-based sensors, and they offer relaxed fabrication requirements when compared to LPFGs fabricated by other methods. The developed methodology was used to perform the first measurements of the changes that occur in the RSD and the RID during LPFG fabrication. 
The analysis of these measurements ties together many of the existing theories of CO2-laser-induced LPFG fabrication to present a more coherent understanding of the processes that occur. In addition, new evidence provides detailed information on the functional form of the RSD and the RID in LPFGs. This information is crucial for the modeling of LPFG behavior, for the design of LPFGs for specific applications, for the tailoring of fabrication parameters to meet design requirements, and for understanding the limitations of LPFG fabrication in commercial optical fibers. Future areas of research concerning the improvement of the developed methodology, the need to characterize other fibers and fiber-based devices, and the characterization of CO2-laser-induced LPFGs are identified and discussed.

  15. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known different criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.
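
    A minimal sketch of the model-selection procedure described here fits k-component Weibull mixtures by maximum likelihood and compares AIC and BIC; the data are synthetic, and the optimizer, starting values, and parameterization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Synthetic stand-in for aggregated wind power data (the paper's data are
# not reproduced here); two regimes mimic hourly/seasonal pattern mixing.
rng = np.random.default_rng(2)
data = np.concatenate([
    weibull_min.rvs(1.8, scale=4.0, size=700, random_state=rng),
    weibull_min.rvs(6.0, scale=11.0, size=300, random_state=rng),
])

def neg_loglik(params, x, k):
    """Negative log-likelihood of a k-component Weibull mixture."""
    w = np.abs(params[:k]) + 1e-12
    w = w / w.sum()                                  # mixture weights
    shapes = np.abs(params[k:2 * k]) + 1e-6          # keep parameters positive
    scales = np.abs(params[2 * k:3 * k]) + 1e-6
    pdf = sum(wi * weibull_min.pdf(x, ci, scale=si)
              for wi, ci, si in zip(w, shapes, scales))
    return -np.sum(np.log(pdf + 1e-300))

for k in (1, 2, 3):
    x0 = np.concatenate([np.ones(k), np.full(k, 2.0), np.linspace(3.0, 12.0, k)])
    res = minimize(neg_loglik, x0, args=(data, k), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
    p = 3 * k - 1                                    # free parameters (weights sum to 1)
    aic = 2 * p + 2 * res.fun                        # AIC = 2p - 2 ln L
    bic = p * np.log(data.size) + 2 * res.fun        # BIC = p ln n - 2 ln L
    print(f"k={k}: AIC={aic:.1f}  BIC={bic:.1f}")
```

    The component count with the lowest AIC or BIC would be selected, mirroring the criterion comparison the abstract describes.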

  16. Accounting for geophysical information in geostatistical characterization of unexploded ordnance (UXO) sites.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, Hirotaka; Goovaerts, Pierre; McKenna, Sean Andrew

    2003-06-01

    Efficient and reliable unexploded ordnance (UXO) site characterization is needed for decisions regarding future land use. There are several types of data available at UXO sites and geophysical signal maps are one of the most valuable sources of information. Incorporation of such information into site characterization requires a flexible and reliable methodology. Geostatistics allows one to account for exhaustive secondary information (i.e., known at every location within the field) in many different ways. Kriging and logistic regression were combined to map the probability of occurrence of at least one geophysical anomaly of interest, such as UXO, from a limited number of indicator data. Logistic regression is used to derive the trend from a geophysical signal map, and kriged residuals are added to the trend to estimate the probabilities of the presence of UXO at unsampled locations (simple kriging with varying local means or SKlm). Each location is identified for further remedial action if the estimated probability is greater than a given threshold. The technique is illustrated using a hypothetical UXO site generated by a UXO simulator, and a corresponding geophysical signal map. Indicator data are collected along two transects located within the site. Classification performances are then assessed by computing proportions of correct classification, false positive, false negative, and Kappa statistics. Two common approaches, one of which does not take any secondary information into account (ordinary indicator kriging) and a variant of common cokriging (collocated cokriging), were used for comparison purposes. Results indicate that accounting for exhaustive secondary information improves the overall characterization of UXO sites if an appropriate methodology, SKlm in this case, is used.
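
    A compact sketch of the SKlm idea, under stated assumptions (synthetic transect data, an assumed exponential residual covariance, and hypothetical coordinates), combines a logistic-regression trend on the exhaustive signal with simple kriging of the indicator residuals:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical UXO site: a geophysical signal g is known everywhere
# (exhaustive secondary information); the 0/1 anomaly indicator is observed
# only along transects. All coordinates, values, and the covariance model
# below are illustrative assumptions, not the paper's settings.
rng = np.random.default_rng(3)
obs = rng.uniform(0.0, 100.0, size=(40, 2))          # sampled locations (m)
g_obs = rng.normal(size=40)                          # signal at those locations
y_obs = (g_obs + 0.5 * rng.normal(size=40) > 0).astype(int)

# 1) Trend: logistic regression of the indicator on the exhaustive signal.
lr = LogisticRegression().fit(g_obs.reshape(-1, 1), y_obs)
resid = y_obs - lr.predict_proba(g_obs.reshape(-1, 1))[:, 1]

# 2) Simple kriging of the residuals with an assumed exponential covariance.
def cov(h, sill=0.1, corr_len=20.0):
    return sill * np.exp(-h / corr_len)

K = cov(np.linalg.norm(obs[:, None] - obs[None], axis=-1))
K += 1e-8 * np.eye(len(obs))                         # jitter for stability

target = np.array([[50.0, 50.0]])                    # unsampled location
g_target = np.array([[0.3]])                         # exhaustive signal there
weights = np.linalg.solve(K, cov(np.linalg.norm(obs - target, axis=1)))

# SKlm estimate: locally varying mean (the trend) plus the kriged residual.
prob = lr.predict_proba(g_target)[0, 1] + weights @ resid
print(f"P(anomaly present) ~= {float(np.clip(prob, 0.0, 1.0)):.2f}")
```

    Locations where the estimated probability exceeds a chosen threshold would be flagged for further remedial action, as in the study.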

  17. Waste Characterization Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Patrick E.

    2014-11-01

    The purpose is to provide guidance to the Radiological Characterization Reviewer to complete the radiological characterization of waste items. This information is used for Department of Transportation (DOT) shipping and disposal, typically at the Nevada National Security Site (NNSS). Complete characterization ensures compliance with DOT shipping laws and NNSS Waste Acceptance Criteria (WAC). The fines for noncompliance can be extreme, to say nothing of possible bad press and endangerment to the public, employees, and the environment. A Radiological Characterization Reviewer has an important role in the organization. The scope is to outline the characterization process, not to cover every possible situation. The Radiological Characterization Reviewer position requires a strong background in Health Physics; therefore, these concepts are minimally addressed. The characterization process includes many Excel spreadsheets developed by Michael Enghauser, known as the WCT software suite. New Excel spreadsheets developed as part of this project include the Ra-226 Decider and the Density Calculator by Jesse Bland, and the MicroShield Density Calculator and Molecular Weight Calculator by Pat Lambert.

  18. Using Teradata University Network (TUN), a Free Internet Resource for Teaching and Learning

    ERIC Educational Resources Information Center

    Winter, Robert; Gericke, Anke; Bucher, Tobias

    2008-01-01

    Business intelligence and information logistics have become an important part of teaching curricula in recent years due to the increased demand for adequately trained graduates. Since these fields are characterized by a high amount of software and methodology innovations, teaching materials and teaching aids require constant updating. Teradata has…

  19. Building a Conceptual Framework for Data Literacy

    ERIC Educational Resources Information Center

    Gummer, Edith; Mandinach, Ellen

    2015-01-01

    Background: The increasing focus on education as an evidence-based practice requires that educators can effectively use data to inform their practice. At the level of classroom instructional decision making, the nature of the specific knowledge and skills teachers need to use data effectively is complex and not well characterized. Being able to…

  20. Characterization of information requirements for studies of CO/sub 2/ effects: water resources, agriculture, fisheries, forests and human health

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M R

    1985-12-01

    The report discusses how climate change and vegetative response will affect selected areas of our way of life as a result of increased carbon dioxide concentrations. Needs for future research are identified. Separate abstracts have been prepared for individual chapters. (ACR)

  1. Relevant Scatterers Characterization in SAR Images

    NASA Astrophysics Data System (ADS)

    Chaabouni, Houda; Datcu, Mihai

    2006-11-01

    Recognizing scenes in single-look, meter-resolution Synthetic Aperture Radar (SAR) images requires the capability to identify relevant signal signatures under conditions of variable image acquisition geometry and arbitrary object poses and configurations. Among the methods to detect relevant scatterers in SAR images is internal coherence. The SAR spectrum split in azimuth generates a series of images that preserve high coherence only for particular object scattering. The detection of relevant scatterers can be done by correlation studies or Independent Component Analysis (ICA) methods. The present article reviews the state of the art in SAR internal correlation analysis and proposes further extensions using elements of inference based on information theory applied to complex-valued signals. The set of azimuth-look images is analyzed using mutual information measures, and an equivalent channel capacity is derived. The localization of the "target" requires analysis in a small image window, resulting in imprecise estimation of the second-order statistics of the signal. For better precision, a Hausdorff measure is introduced. The method is applied to detect and characterize relevant objects in urban areas.
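
    The mutual-information measure between azimuth sub-looks can be estimated from joint histograms; the following Python sketch (synthetic sub-look images and an assumed bin count, not the article's data) illustrates the computation:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Histogram estimate of the mutual information of two images, in nats."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)                # marginal of image a
    py = p.sum(axis=0, keepdims=True)                # marginal of image b
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

# Hypothetical azimuth sub-look amplitudes over a small analysis window:
# a persistent scatterer keeps the looks statistically dependent, while
# pure clutter would decorrelate them.
rng = np.random.default_rng(4)
common = rng.rayleigh(size=(64, 64))                 # shared scatterer response
look1 = common + 0.3 * rng.rayleigh(size=(64, 64))
look2 = common + 0.3 * rng.rayleigh(size=(64, 64))

print(f"MI between sub-looks: {mutual_information(look1, look2):.3f} nats")
```

    High mutual information between looks in a window then flags a persistent, relevant scatterer rather than decorrelating clutter.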

  2. Ecological Processes of Isolated Wetlands: Ecosystem Services and the Significant Nexus (Invited)

    NASA Astrophysics Data System (ADS)

    Lane, C.; Autrey, B.; D'Amico, E.

    2013-12-01

    Geographically isolated wetlands occur throughout the US and are characterized by a wetland system completely surrounded by uplands. Examples include prairie potholes, woodland seasonal (i.e., vernal) pools, cypress domes, playas, and other such systems. Decisions by the US Supreme Court in 2001 and 2006 have affected the jurisdictional status of geographically isolated wetlands such that those failing to have a demonstrable 'significant nexus' to navigable waters may have no federal protection under the Clean Water Act. These systems are typically small and, as such, may be under-counted in assessments of area and abundance. Areal extent is a portion of the information required to characterize the functions associated with geographically isolated wetlands and understanding both site-specific and larger-scale processes are also required to better quantify those functions. In addition, quantifying anthropogenic effects on system processing informs our understanding of the contributions and the connectivity of geographically isolated wetlands to other waters. This presentation focuses on both efforts to quantify the contribution of geographically isolated wetlands to system-scale processes, focusing on nutrient assimilation and hydrologic storage, as well as concurrent research to identify their locations at multiple scales. Findings from this research may help elucidate the link between geographically isolated wetlands and other systems, and may inform discussions on ecosystem services provided by geographically isolated wetlands.

  3. Scalable subsurface inverse modeling of huge data sets with an application to tracer concentration breakthrough data from magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; Werth, Charles J.; Valocchi, Albert J.

    2016-07-01

    Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydrogeophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with "big data" processing and numerous large-scale numerical simulations. To tackle such difficulties, the principal component geostatistical approach (PCGA) has been proposed as a "Jacobian-free" inversion method that requires much smaller forward simulation runs for each iteration than the number of unknown parameters and measurements needed in the traditional inversion methods. PCGA can be conveniently linked to any multiphysics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zeroth temporal moment of breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Only about 2000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.

  4. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  5. Buried transuranic wastes at ORNL: Review of past estimates and reconciliation with current data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trabalka, J.R.

    1997-09-01

    Inventories of buried (generally meaning disposed of) transuranic (TRU) wastes at Oak Ridge National Laboratory (ORNL) have been estimated for site remediation and waste management planning over a period of about two decades. Estimates were required because of inadequate waste characterization and incomplete disposal records. For a variety of reasons, including changing definitions of TRU wastes, differing objectives for the estimates, and poor historical data, the published results have sometimes been in conflict. The purpose of this review was (1) to attempt to explain both the rationale for and differences among the various estimates, and (2) to update the estimates based on more recent information obtained from waste characterization and from evaluations of ORNL waste data bases and historical records. The latter included information obtained from an expert panel's review and reconciliation of inconsistencies in data identified during preparation of the ORNL input for the third revision of the Baseline Inventory Report for the Waste Isolation Pilot Plant. The results summarize current understanding of the relationship between past estimates of buried TRU wastes and provide the most up-to-date information on recorded burials thereafter. The limitations of available information on the latter and thus the need for improved waste characterization are highlighted.

  6. Methods and apparatus for non-acoustic speech characterization and recognition

    DOEpatents

    Holzrichter, John F.

    1999-01-01

    By simultaneously recording EM wave reflections and acoustic speech information, the positions and velocities of the speech organs as speech is articulated can be defined for each acoustic speech unit. Well defined time frames and feature vectors describing the speech, to the degree required, can be formed. Such feature vectors can uniquely characterize the speech unit being articulated each time frame. The onset of speech, rejection of external noise, vocalized pitch periods, articulator conditions, accurate timing, the identification of the speaker, acoustic speech unit recognition, and organ mechanical parameters can be determined.

  7. Methods and apparatus for non-acoustic speech characterization and recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzrichter, J.F.

    By simultaneously recording EM wave reflections and acoustic speech information, the positions and velocities of the speech organs as speech is articulated can be defined for each acoustic speech unit. Well defined time frames and feature vectors describing the speech, to the degree required, can be formed. Such feature vectors can uniquely characterize the speech unit being articulated each time frame. The onset of speech, rejection of external noise, vocalized pitch periods, articulator conditions, accurate timing, the identification of the speaker, acoustic speech unit recognition, and organ mechanical parameters can be determined.

  8. Cross delay line sensor characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, Israel J; Remelius, Dennis K; Tiee, Joe J

    There exists a wealth of information in the scientific literature on the physical properties and device characterization procedures for complementary metal oxide semiconductor (CMOS), charge coupled device (CCD) and avalanche photodiode (APD) format detectors. Numerous papers and books have also treated photocathode operation in the context of photomultiplier tube (PMT) operation for either non-imaging applications or limited night vision capability. However, much less information has been reported in the literature about the characterization procedures and properties of photocathode detectors with novel cross delay line (XDL) anode structures. These allow one to detect single photons and create images by recording space and time coordinate (X, Y & T) information. In this paper, we report on the physical characteristics and performance of a cross delay line anode sensor with an enhanced near infrared wavelength response photocathode and high dynamic range micro channel plate (MCP) gain (> 10^6) multiplier stage. Measurement procedures and results including the device dark event rate (DER), pulse height distribution, quantum and electronic device efficiency (QE & DQE) and spatial resolution per effective pixel region in a 25 mm sensor array are presented. The overall knowledge and information obtained from XDL sensor characterization allow us to optimize device performance and assess capability. These device performance properties and capabilities make XDL detectors ideal for remote sensing field applications that require single photon detection, imaging, sub-nanosecond timing response, high spatial resolution (tens of microns) and large effective image format.

  9. Error Characterization of Flight Trajectories Reconstructed Using Structure from Motion

    DTIC Science & Technology

    2015-03-27

    adjustment using IMU rotation information, the accuracy of the yaw, pitch, and roll is limited and numerical errors can be as high as 1e-4 depending on... due to either zero-mean Gaussian noise and/or bias in the IMU-measured yaw, pitch, and roll angles. It is possible that when errors in these... requires both the information on how the camera is mounted to the IMU/aircraft and the measured yaw, pitch, and roll at the time of the first image

  10. Topological networks for quantum communication between distant qubits

    NASA Astrophysics Data System (ADS)

    Lang, Nicolai; Büchler, Hans Peter

    2017-11-01

    Efficient communication between qubits relies on robust networks, which allow for fast and coherent transfer of quantum information. It seems natural to harvest the remarkable properties of systems characterized by topological invariants to perform this task. Here, we show that a linear network of coupled bosonic degrees of freedom, characterized by topological bands, can be employed for the efficient exchange of quantum information over large distances. Important features of our setup are that it is robust against quenched disorder, all relevant operations can be performed by global variations of parameters, and the time required for communication between distant qubits approaches linear scaling with their distance. We demonstrate that our concept can be extended to an ensemble of qubits embedded in a two-dimensional network to allow for communication between all of them.

  11. Characterizing Task-Based OpenMP Programs

    PubMed Central

    Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats

    2015-01-01

    Programmers struggle to understand performance of task-based OpenMP programs since profiling tools only report thread-based performance. Performance tuning also requires task-based performance in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing exposed task parallelism and per-task instruction profiles of benchmarks in the widely-used Barcelona OpenMP Tasks Suite. Programmers can tune performance faster and understand performance tradeoffs more effectively than existing tools by using our method to characterize task-based performance. PMID:25860023

  12. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pidatala, Venkataramana R.; Mahboubi, Amir; Mortimer, Jenny C.

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  13. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE.

    PubMed

    Pidatala, Venkataramana R; Mahboubi, Amir; Mortimer, Jenny C

    2017-10-16

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  14. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE

    DOE PAGES

    Pidatala, Venkataramana R.; Mahboubi, Amir; Mortimer, Jenny C.

    2017-10-16

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  15. Characterizing rare-event property distributions via replicate molecular dynamics simulations of proteins.

    PubMed

    Krishnan, Ranjani; Walton, Emily B; Van Vliet, Krystyn J

    2009-11-01

    As computational resources increase, molecular dynamics simulations of biomolecules are becoming an increasingly informative complement to experimental studies. In particular, it has now become feasible to use multiple initial molecular configurations to generate an ensemble of replicate production-run simulations that allows for more complete characterization of rare events such as ligand-receptor unbinding. However, there are currently no explicit guidelines for selecting an ensemble of initial configurations for replicate simulations. Here, we use clustering analysis and steered molecular dynamics simulations to demonstrate that the configurational changes accessible in molecular dynamics simulations of biomolecules do not necessarily correlate with observed rare-event properties. This informs selection of a representative set of initial configurations. We also employ statistical analysis to identify the minimum number of replicate simulations required to sufficiently sample a given biomolecular property distribution. Together, these results suggest a general procedure for generating an ensemble of replicate simulations that will maximize accurate characterization of rare-event property distributions in biomolecules.
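
    As a hedged illustration of the replicate-count analysis (not the authors' procedure), one can grow the ensemble and stop once the empirical distribution of the rare-event property stops changing, here measured with a two-sample Kolmogorov-Smirnov distance on synthetic data:

      import numpy as np
      from scipy.stats import ks_2samp

      def replicates_needed(samples, step=10, tol=0.05):
          """samples: property value (e.g., unbinding force) per replicate run."""
          for n in range(2 * step, len(samples) + 1, step):
              res = ks_2samp(samples[:n - step], samples[:n])
              if res.statistic < tol:      # distribution has stopped changing
                  return n
          return len(samples)              # tolerance never met; run more replicates

      rng = np.random.default_rng(0)
      forces = rng.lognormal(mean=5.0, sigma=0.3, size=500)  # synthetic stand-in
      print(replicates_needed(forces))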

  16. Remote sensing techniques to assess active fire characteristics and post-fire effects

    Treesearch

    Leigh B. Lentile; Zachary A. Holden; Alistair M. S. Smith; Michael J. Falkowski; Andrew T. Hudak; Penelope Morgan; Sarah A. Lewis; Paul E. Gessler; Nate C. Benson

    2006-01-01

    Space and airborne sensors have been used to map area burned, assess characteristics of active fires, and characterize post-fire ecological effects. Confusion about fire intensity, fire severity, burn severity, and related terms can result in the potential misuse of the inferred information by land managers and remote sensing practitioners who require unambiguous...

  17. Effects of Palagonitic Dust Coatings on Thermal Emission Spectra of Rocks and Minerals: Implications for Mineralogical Characterization of the Martian Surface by MGS-TES

    NASA Technical Reports Server (NTRS)

    Graff, T. G.; Morris, R.; Christensen, P.

    2001-01-01

    Thermal emission measurements on dust-coated rocks and minerals show that a 300 µm thick layer is required to mask emission from the substrate and that non-linear effects are present. Additional information is contained in the original extended abstract.

  18. Space station (modular) mission analysis. Volume 1: Mission analysis

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The mission analysis on the modular space station considers experimental requirements and options characterized by low initial cost and incremental manning. Features that affect initial development and early operating costs are identified and their impacts on the program are assessed. Considered are the areas of experiment, mission, operations, information management, and long life and safety analyses.

  19. Fast or Frugal, but Not Both: Decision Heuristics under Time Pressure

    ERIC Educational Resources Information Center

    Bobadilla-Suarez, Sebastian; Love, Bradley C.

    2018-01-01

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics…

  20. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
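
    The outcome accounting can be made concrete with a small Monte Carlo sketch in the spirit of the LLORM; the landing-condition distributions, thresholds, and classification logic below are invented placeholders, not the model's actual inputs:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000
      slope = rng.normal(4.0, 2.0, N)         # local terrain slope at target, deg
      fuel = rng.normal(120.0, 15.0, N)       # remaining fuel margin, kg
      sensor_ok = rng.random(N) > 0.01        # hazard-sensor availability

      hazard = slope > 10.0
      loc = (hazard & ~sensor_ok) | (hazard & sensor_ok & (fuel <= 50.0))
      lom = hazard & sensor_ok & (fuel > 50.0)    # abort required and successful
      success = ~hazard

      for name, mask in [("Success", success), ("LOM", lom), ("LOC", loc)]:
          print(f"{name}: {mask.mean():.4f}")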

  1. Design for interaction between humans and intelligent systems during real-time fault management

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Thronesbery, Carroll G.

    1992-01-01

    Initial results are reported to provide guidance and assistance for designers of intelligent systems and their human interfaces. The objective is to achieve more effective human-computer interaction (HCI) for real time fault management support systems. Studies of the development of intelligent fault management systems within NASA have resulted in a new perspective of the user. If the user is viewed as one of the subsystems in a heterogeneous, distributed system, system design becomes the design of a flexible architecture for accomplishing system tasks with both human and computer agents. HCI requirements and design should be distinguished from user interface (displays and controls) requirements and design. Effective HCI design for multi-agent systems requires explicit identification of activities and information that support coordination and communication between agents. The effects of HCI design on overall system design are characterized, and approaches to addressing HCI requirements in system design are identified. The results include definition of (1) guidance based on information level requirements analysis of HCI, (2) high level requirements for a design methodology that integrates the HCI perspective into system design, and (3) requirements for embedding HCI design tools into intelligent system development environments.

  2. Nanomaterial characterization: considerations and needs for hazard assessment and safety evaluation.

    PubMed

    Boverhof, Darrell R; David, Raymond M

    2010-02-01

    Nanotechnology is a rapidly emerging field of great interest and promise. As new materials are developed and commercialized, hazard information also needs to be generated to reassure regulators, workers, and consumers that these materials can be used safely. The biological properties of nanomaterials are closely tied to the physical characteristics, including size, shape, dissolution rate, agglomeration state, and surface chemistry, to name a few. Furthermore, these properties can be altered by the medium used to suspend or disperse these water-insoluble particles. However, the current toxicology literature lacks much of the characterization information that allows toxicologists and regulators to develop "rules of thumb" that could be used to assess potential hazards. To effectively develop these rules, toxicologists need to know the characteristics of the particle that interacts with the biological system. This void leaves the scientific community with no options other than to evaluate all materials for all potential hazards. Lack of characterization could also lead to different laboratories reporting discordant results on seemingly the same test material because of subtle differences in the particle or differences in the dispersion medium used that resulted in altered properties and toxicity of the particle. For these reasons, good characterization using a minimal characterization data set should accompany and be required of all scientific publications on nanomaterials.

  3. Use of bioclimatic indexes to characterize phenological phases of apple varieties in Northern Italy.

    PubMed

    Valentini, N; Me, G; Ferrero, R; Spanna, F

    2001-11-01

    The research was designed to characterize the phenological behaviour of different apple varieties and to compare different bioclimatic indexes in order to evaluate their adaptability in describing the phenological phases of fruit species. A field study on the requirement for chilling units (winter chilling requirement) and the accumulation of growing degree hours of 15 native apple cultivars was carried out in a fruit-growing area in North West Italy (Cuneo Province, Piedmont). From 1991 to 1993, climatic data were collected at meteorological stations installed in an experimental orchard (Verzuolo, Cuneo). Four methods were compared to determine the winter chilling requirement: Hutchins, Weinberger-Eggert, Utah and North Carolina. The Utah method was applied to determine the time when the chilling units accumulated become effective in meeting the rest requirements. A comparison of the different methods indicated that the Weinberger-Eggert method is the best, as it showed the lowest statistical variability during the 3 years of observations. The growing degree hour requirement (GDH) was estimated by the North Carolina method with two different base temperatures: 4.4 degrees C and 6.1 degrees C. More difficulties were met when the date of rest completion and the beginning of GDH accumulation were determined. The best base temperature for the estimation of GDH is 4.4 degrees C. Phenological and climatic characterizations are two basic tools for giving farmers and agricultural advisors important information about which varieties to choose and which are the best and most correct cultivation practices to follow.
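
    For reference, a growing-degree-hour accumulation with the 4.4 °C base temperature the study favors can be sketched as below; this simple positive-excursion sum is an assumed form, not necessarily the exact North Carolina model used:

      def growing_degree_hours(hourly_temps_c, base=4.4):
          """Sum of hourly temperature excursions above the base (degree-hours)."""
          return sum(max(t - base, 0.0) for t in hourly_temps_c)

      print(growing_degree_hours([3.0, 5.0, 8.2, 12.5]))  # 0 + 0.6 + 3.8 + 8.1 = 12.5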

  4. Scalable subsurface inverse modeling of huge data sets with an application to tracer concentration breakthrough data from magnetic resonance imaging

    DOE PAGES

    Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; ...

    2016-06-09

    Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydro-geophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with “big data” processing and numerous large-scale numerical simulations. To tackle such difficulties, the Principal Component Geostatistical Approach (PCGA) has been proposed as a “Jacobian-free” inversion method that requires far fewer forward simulation runs per iteration than the number of unknown parameters and measurements needed in traditional inversion methods. PCGA can be conveniently linked to any multi-physics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zeroth temporal moment of the breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Moreover, only about 2,000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method. This article is protected by copyright. All rights reserved.
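
    The compression step lends itself to a short sketch: each voxel's concentration time series is reduced to temporal moments, with the mean travel time conventionally given by the first moment normalized by the zeroth. Variable names are illustrative:

      import numpy as np

      def temporal_moments(t, c):
          """t, c: 1-D arrays of sample times and concentrations for one voxel."""
          dt = np.diff(t)
          c_mid = 0.5 * (c[1:] + c[:-1])       # trapezoidal-rule pieces
          t_mid = 0.5 * (t[1:] + t[:-1])
          m0 = np.sum(c_mid * dt)              # zeroth temporal moment
          m1 = np.sum(c_mid * t_mid * dt)      # first temporal moment
          return m0, m1 / m0                   # (m0, mean travel time)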

  5. Scalable subsurface inverse modeling of huge data sets with an application to tracer concentration breakthrough data from magnetic resonance imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.

    Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydro-geophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with “big data” processing and numerous large-scale numerical simulations. To tackle such difficulties, the Principal Component Geostatistical Approach (PCGA) has been proposed as a “Jacobian-free” inversion method that requires far fewer forward simulation runs per iteration than the number of unknown parameters and measurements needed in traditional inversion methods. PCGA can be conveniently linked to any multi-physics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zeroth temporal moment of the breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Moreover, only about 2,000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method. This article is protected by copyright. All rights reserved.

  6. Characterization of selected elementary motion detector cells to image primitives.

    PubMed

    Benson, Leslie A; Barrett, Steven F; Wright, Cameron H G

    2008-01-01

    Developing a visual sensing system, complete with motion processing hardware and software, would have many applications to current technology. It could be mounted on many autonomous vehicles to provide information about the navigational environment, as well as obstacle avoidance features. Incorporating the motion processing capabilities into the sensor requires a new approach to the algorithm implementation. This research, like that of many others, has turned to nature for inspiration. Elementary motion detector (EMD) cells are involved in a biological preprocessing network that provides information to the motion processing lobes of the house fly Musca domestica. This paper describes the response of the photoreceptor inputs to the EMDs. The inputs to the EMD components are tested as they are stimulated with varying image primitives. This is the first of many steps in characterizing the EMD response to image primitives.

  7. Digital Mapping and Environmental Characterization of National Wild and Scenic River Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Bosnall, Peter; Hetrick, Shelaine L

    2013-09-01

    Spatially accurate geospatial information is required to support decision-making regarding sustainable future hydropower development. Under a memorandum of understanding among several federal agencies, a pilot study was conducted to map a subset of National Wild and Scenic Rivers (WSRs) at a higher resolution and provide a consistent methodology for mapping WSRs across the United States and across agency jurisdictions. A subset of rivers (segments falling under the jurisdiction of the National Park Service) were mapped at a high resolution using the National Hydrography Dataset (NHD). The spatial extent and representation of river segments mapped at NHD scale were compared with the prevailing geospatial coverage mapped at a coarser scale. Accurately digitized river segments were linked to environmental attribution datasets housed within the Oak Ridge National Laboratory's National Hydropower Asset Assessment Program database to characterize the environmental context of WSR segments. The results suggest that both the spatial scale of hydrography datasets and the adherence to written policy descriptions are critical to accurately mapping WSRs. The environmental characterization provided information to deduce generalized trends in either the uniqueness or the commonness of environmental variables associated with WSRs. Although WSRs occur in a wide range of human-modified landscapes, environmental data layers suggest that they provide habitats important to terrestrial and aquatic organisms and recreation important to humans. Ultimately, the research findings herein suggest that there is a need for accurate, consistent mapping of the National WSRs across the agencies responsible for administering each river. Geospatial applications examining potential landscape and energy development require accurate sources of information, such as data layers that portray realistic spatial representations.

  8. Health Monitoring for Airframe Structural Characterization

    NASA Technical Reports Server (NTRS)

    Munns, Thomas E.; Kent, Renee M.; Bartolini, Antony; Gause, Charles B.; Borinski, Jason W.; Dietz, Jason; Elster, Jennifer L.; Boyd, Clark; Vicari, Larry; Ray, Asok

    2002-01-01

    This study established requirements for structural health monitoring systems, identified and characterized a prototype structural sensor system, developed sensor interpretation algorithms, and demonstrated the sensor systems on operationally realistic test articles. Fiber-optic corrosion sensors (i.e., moisture and metal ion sensors) and low-cycle fatigue sensors (i.e., strain and acoustic emission sensors) were evaluated to validate their suitability for monitoring aging degradation; characterize the sensor performance in aircraft environments; and demonstrate placement processes and multiplexing schemes. In addition, a unique micromachined multi-measurand sensor concept was developed and demonstrated. The results show that structural degradation of aircraft materials could be effectively detected and characterized using available and emerging sensors. A key component of the structural health monitoring capability is the ability to interpret the information provided by the sensor system in order to characterize the structural condition. Novel deterministic and stochastic fatigue damage development and growth models were developed for this program. These models enable real time characterization and assessment of structural fatigue damage.

  9. Waste Isolation Pilot Plant Annual Site Environmental Report for 2014. Emended

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    2015-09-01

    The purpose of the Waste Isolation Pilot Plant (WIPP) Annual Site Environmental Report for 2014 (ASER) is to provide information required by U.S. Department of Energy (DOE) Order 231.1B, Environment, Safety, and Health Reporting. Specifically, the ASER presents summary environmental data to: Characterize site environmental management performance; Summarize environmental occurrences and responses reported during the calendar year (CY); Confirm compliance with environmental standards and requirements; Highlight significant environmental accomplishments, including progress toward the DOE environmental sustainability goals made through implementation of the WIPP Environmental Management System (EMS).

  10. Instrumentation for localized superconducting cavity diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conway, Z. A.; Ge, M.; Iwashita, Y.

    2017-01-12

    Superconducting accelerator cavities are now routinely operated at levels approaching the theoretical limit of niobium. To achieve these operating levels, more information than is available from the RF excitation signal is required to characterize and determine fixes for the sources of performance limitations. This information is obtained using diagnostic techniques which complement the analysis of the RF signal. In this paper we describe the operation of, and selected results from, three of these diagnostic techniques: the use of large-scale thermometer arrays, second sound wave defect location, and high-precision cavity imaging with the Kyoto camera.

  11. Non-contact fluid characterization in containers using ultrasonic waves

    DOEpatents

    Sinha, Dipen N [Los Alamos, NM

    2012-05-15

    Apparatus and method for non-contact (stand-off) ultrasonic determination of certain characteristics of fluids in containers or pipes are described. A combination of swept frequency acoustic interferometry (SFAI), wide-bandwidth, air-coupled acoustic transducers, narrowband frequency data acquisition, and data conversion from the frequency domain to the time domain, if required, permits meaningful information to be extracted from such fluids.
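
    A hedged sketch of the frequency-to-time conversion the abstract mentions: an inverse FFT of the swept-frequency response yields an impulse-response-like trace whose peaks mark acoustic path delays. The single synthetic echo below is a placeholder, not data from the patent:

      import numpy as np

      f = np.linspace(1e5, 5e5, 4001)              # swept frequencies, Hz
      response = np.exp(2j * np.pi * f * 40e-6)    # stand-in: one 40 microsecond echo
      trace = np.fft.ifft(response)
      dt = 1.0 / (f[-1] - f[0])                    # time resolution ~ 1/bandwidth
      t_peak = np.argmax(np.abs(trace)) * dt
      print(f"dominant echo near {t_peak * 1e6:.1f} us")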

  12. Automated quantification of the synchrogram by recurrence plot analysis.

    PubMed

    Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart

    2012-04-01

    Recently, the concept of phase synchronization of two weakly coupled oscillators has raised great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by a synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
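
    The recurrence-plot construction itself is compact; a minimal sketch for phase data (the generic construction, not the paper's full pipeline) marks pairs of times whose synchrogram phases are closer than a threshold on the circle:

      import numpy as np

      def recurrence_plot(phases, eps=0.1):
          """phases: 1-D array (radians); returns boolean recurrence matrix."""
          d = np.abs(phases[:, None] - phases[None, :])
          d = np.minimum(d, 2.0 * np.pi - d)   # circular (wrapped) distance
          return d < eps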

  13. Systematic review of the incremental costs of interventions that increase immunization coverage.

    PubMed

    Ozawa, Sachiko; Yemeke, Tatenda T; Thompson, Kimberly M

    2018-05-10

    Achieving and maintaining high vaccination coverage requires investments, but the costs and effectiveness of interventions to increase coverage remain poorly characterized. We conducted a systematic review of the literature to identify peer-reviewed studies published in English that reported interventions aimed at increasing immunization coverage and the associated costs and effectiveness of the interventions. We found limited information in the literature, with many studies reporting effectiveness estimates, but not providing cost information. Using the available data, we developed a cost function to support future programmatic decisions about investments in interventions to increase immunization coverage for relatively low and high-income countries. The cost function estimates the non-vaccine cost per dose of interventions to increase absolute immunization coverage by one percent, through either campaigns or routine immunization. The cost per dose per percent increase in absolute coverage increased with higher baseline coverage, demonstrating increasing incremental costs required to reach higher coverage levels. Future studies should evaluate the performance of the cost function and add to the database of available evidence to better characterize heterogeneity in costs and generalizability of the cost function. Copyright © 2018. Published by Elsevier Ltd.
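
    As an illustration of the shape such a cost function can take (the exponential form and parameters here are placeholders, not the paper's fitted function), the incremental cost per dose grows sharply as baseline coverage rises:

      import math

      def cost_per_dose_per_percent(baseline_coverage, a=0.5, b=3.0):
          """Non-vaccine cost (USD) per dose per 1% absolute coverage gain."""
          return a * math.exp(b * baseline_coverage)   # rises toward full coverage

      for cov in (0.5, 0.8, 0.95):
          print(f"{cov:.0%}: ${cost_per_dose_per_percent(cov):.2f}")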

  14. Experimental Demonstration of Fault-Tolerant State Preparation with Superconducting Qubits.

    PubMed

    Takita, Maika; Cross, Andrew W; Córcoles, A D; Chow, Jerry M; Gambetta, Jay M

    2017-11-03

    Robust quantum computation requires encoding delicate quantum information into degrees of freedom that are hard for the environment to change. Quantum encodings have been demonstrated in many physical systems by observing and correcting storage errors, but applications require not just storing information; we must accurately compute even with faulty operations. The theory of fault-tolerant quantum computing illuminates a way forward by providing a foundation and collection of techniques for limiting the spread of errors. Here we implement one of the smallest quantum codes in a five-qubit superconducting transmon device and demonstrate fault-tolerant state preparation. We characterize the resulting code words through quantum process tomography and study the free evolution of the logical observables. Our results are consistent with fault-tolerant state preparation in a protected qubit subspace.

  15. Black hole complementarity in gravity's rainbow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gim, Yongwan; Kim, Wontae, E-mail: yongwan89@sogang.ac.kr, E-mail: wtkim@sogang.ac.kr

    2015-05-01

    To see how gravity's rainbow works for black hole complementarity, we evaluate the energy required for duplication of information in the context of black hole complementarity by calculating the critical value of the rainbow parameter in a certain class of rainbow Schwarzschild black holes. The resultant energy can be written as the well-defined limit for the vanishing rainbow parameter which characterizes the deformation of the relativistic dispersion relation in the freely falling frame. It shows that the duplication of information in quantum mechanics could not be allowed below a certain critical value of the rainbow parameter; however, it might be possible above the critical value of the rainbow parameter, so that the consistent formulation in our model requires additional constraints or any other resolutions for the latter case.

  16. Airborne Use of Traffic Intent Information in a Distributed Air-Ground Traffic Management Concept: Experiment Design and Preliminary Results

    NASA Technical Reports Server (NTRS)

    Wing, David J.; Adams, Richard J.; Duley, Jacqueline A.; Legan, Brian M.; Barmore, Bryan E.; Moses, Donald

    2001-01-01

    A predominant research focus in the free flight community has been on the type of information required on the flight deck to enable pilots to "autonomously" maintain separation from other aircraft. At issue are the relative utility and requirement for information exchange between aircraft regarding the current "state" and/or the "intent" of each aircraft. This paper presents the experimental design and some initial findings of an experimental research study designed to provide insight into the issue of intent information exchange in constrained en-route operations and its effect on pilot decision making and flight performance. Two operational modes for autonomous operations were compared in a piloted simulation. The tactical mode was characterized primarily by the use of state information for conflict detection and resolution and an open-loop means for the pilot to meet operational constraints. The strategic mode involved the combined use of state and intent information, provided the pilot an additional level of alerting, and allowed a closed-loop approach to meeting operational constraints. Potential operational benefits of both modes are illustrated through several scenario case studies. Subjective data results are presented that generally indicate pilot consensus in favor of the strategic mode.

  17. Spatially characterizing visitor use and its association with informal trails in Yosemite Valley meadows.

    PubMed

    Walden-Schreiner, Chelsey; Leung, Yu-Fai

    2013-07-01

    Ecological impacts associated with nature-based recreation and tourism can compromise park and protected area goals if left unrestricted. Protected area agencies are increasingly incorporating indicator-based management frameworks into their management plans to address visitor impacts. Development of indicators requires empirical evaluation of indicator measures and examining their ecological and social relevance. This study addresses the development of the informal trail indicator in Yosemite National Park by spatially characterizing visitor use in open landscapes and integrating use patterns with informal trail condition data to examine their spatial association. Informal trail and visitor use data were collected concurrently during July and August of 2011 in three, high-use meadows of Yosemite Valley. Visitor use was clustered at statistically significant levels in all three study meadows. Spatial data integration found no statistically significant differences between use patterns and trail condition class. However, statistically significant differences were found between the distance visitors were observed from informal trails and visitor activity type with active activities occurring closer to trail corridors. Gender was also found to be significant with male visitors observed further from trail corridors. Results highlight the utility of integrated spatial analysis in supporting indicator-based monitoring and informing management of open landscapes. Additional variables for future analysis and methodological improvements are discussed.

  18. Spatially Characterizing Visitor Use and Its Association with Informal Trails in Yosemite Valley Meadows

    NASA Astrophysics Data System (ADS)

    Walden-Schreiner, Chelsey; Leung, Yu-Fai

    2013-07-01

    Ecological impacts associated with nature-based recreation and tourism can compromise park and protected area goals if left unrestricted. Protected area agencies are increasingly incorporating indicator-based management frameworks into their management plans to address visitor impacts. Development of indicators requires empirical evaluation of indicator measures and examining their ecological and social relevance. This study addresses the development of the informal trail indicator in Yosemite National Park by spatially characterizing visitor use in open landscapes and integrating use patterns with informal trail condition data to examine their spatial association. Informal trail and visitor use data were collected concurrently during July and August of 2011 in three, high-use meadows of Yosemite Valley. Visitor use was clustered at statistically significant levels in all three study meadows. Spatial data integration found no statistically significant differences between use patterns and trail condition class. However, statistically significant differences were found between the distance visitors were observed from informal trails and visitor activity type with active activities occurring closer to trail corridors. Gender was also found to be significant with male visitors observed further from trail corridors. Results highlight the utility of integrated spatial analysis in supporting indicator-based monitoring and informing management of open landscapes. Additional variables for future analysis and methodological improvements are discussed.

  19. A Task-oriented Approach for Hydrogeological Site Characterization

    NASA Astrophysics Data System (ADS)

    Rubin, Y.; Nowak, W.; de Barros, F.

    2010-12-01

    Hydrogeological site characterization is a challenging task for several reasons: (1) the large spatial variability and scarcity of prior information render the outcome of any planned sampling campaign uncertain; (2) there are no simple tools for comparing the many alternative measurement techniques and data acquisition strategies; and (3) data acquisition is subject to physical and budgetary constraints. This paper presents several ideas on how to plan sampling campaigns in a rational manner while addressing these challenges. The first idea is to recognize that different sites and different problems require different characterization strategies. Hence the idea is to plan data acquisition according to its capability for meeting site-specific goals. For example, the characterization needs at a “research problem” site (e.g., a site intended to investigate the transport of uranium in the subsurface, such as Hanford) are different from those of a “problem” site (e.g., a contaminated site associated with a health risk to humans, such as Camp Lejeune, or determining the safe yield of an aquifer). This distinction requires planners to define the characterization goal(s) in a quantitative manner. The second idea is to define metrics that could link specific data types and data acquisition strategies with the site-specific goals in a way that would allow planners to compare strongly different alternative strategies at the design stage (even prior to data acquisition) and to modify the strategies as more data become available. To meet this goal, we developed the concept of the (comparative) information yield curve. Finally, we propose to look at site characterization from the perspective of statistical hypothesis testing, whereby data acquisition strategies could be evaluated in terms of their ability to support or refute various hypotheses made with regard to the characterization goals, and the strategies could be modified once the test is completed. Accept/reject regions for hypothesis testing can be determined based on goals set by regulations or by agreement between the stakeholders. Hypothesis-driven design could help in minimizing the chances of making a wrong decision (false positives or false negatives) with regard to the site-specific goals.

  20. Using the Saccharomyces Genome Database (SGD) for analysis of genomic information

    PubMed Central

    Skrzypek, Marek S.; Hirschman, Jodi

    2011-01-01

    Analysis of genomic data requires access to software tools that place the sequence-derived information in the context of biology. The Saccharomyces Genome Database (SGD) integrates functional information about budding yeast genes and their products with a set of analysis tools that facilitate exploring their biological details. This unit describes how the various types of functional data available at SGD can be searched, retrieved, and analyzed. Starting with the guided tour of the SGD Home page and Locus Summary page, this unit highlights how to retrieve data using YeastMine, how to visualize genomic information with GBrowse, how to explore gene expression patterns with SPELL, and how to use Gene Ontology tools to characterize large-scale datasets. PMID:21901739

  1. Thermal and Chemical Characterization of Composite Materials. MSFC Center Director's Discretionary Fund Final Report, Project No. ED36-18

    NASA Technical Reports Server (NTRS)

    Stanley, D. C.; Huff, T. L.

    2003-01-01

    The purpose of this research effort was to: (1) provide a concise and well-defined property profile of current and developing composite materials using thermal and chemical characterization techniques and (2) optimize analytical testing requirements of materials. This effort applied a diverse array of methodologies to ascertain composite material properties. Often, a single method or technique will provide useful, but nonetheless incomplete, information on material composition and/or behavior. To more completely understand and predict material properties, a broad-based analytical approach is required. By developing a database of information comprised of both thermal and chemical properties, material behavior under varying conditions may be better understood. This is even more important in the aerospace community, where new composite materials and those in the development stage have little reference data. For example, Fourier transform infrared (FTIR) spectroscopy spectral databases available for identification of vapor phase spectra, such as those generated during experiments, generally refer to well-defined chemical compounds. Because this method renders a unique thermal decomposition spectral pattern, even larger, more diverse databases, such as those found in solid and liquid phase FTIR spectroscopy libraries, cannot be used. By combining this and other available methodologies, a database specifically for new materials and materials being developed at Marshall Space Flight Center can be generated. In addition, characterizing materials using this approach will be extremely useful in the verification of materials and identification of anomalies in NASA-wide investigations.

  2. Towards precision medicine: from quantitative imaging to radiomics

    PubMed Central

    Acharya, U. Rajendra; Hagiwara, Yuki; Sudarshan, Vidya K.; Chan, Wai Yee; Ng, Kwan Hoong

    2018-01-01

    Radiology (imaging) and imaging-guided interventions, which provide multi-parametric morphologic and functional information, are playing an increasingly significant role in precision medicine. Radiologists are trained to understand the imaging phenotypes, transcribe those observations (phenotypes) to correlate with underlying diseases and to characterize the images. However, in order to understand and characterize the molecular phenotype (to obtain genomic information) of solid heterogeneous tumours, the advanced sequencing of those tissues using biopsy is required. Thus, radiologists image the tissues from various views and angles in order to have the complete image phenotypes, thereby acquiring a huge amount of data. Deriving meaningful details from all these radiological data becomes challenging and raises the big data issues. Therefore, interest in the application of radiomics has been growing in recent years as it has the potential to provide significant interpretive and predictive information for decision support. Radiomics is a combination of conventional computer-aided diagnosis, deep learning methods, and human skills, and thus can be used for quantitative characterization of tumour phenotypes. This paper discusses the overview of radiomics workflow, the results of various radiomics-based studies conducted using various radiological images such as computed tomography (CT), magnetic resonance imaging (MRI), and positron-emission tomography (PET), the challenges we are facing, and the potential contribution of radiomics towards precision medicine. PMID:29308604

  3. Temporal Characterization of Aircraft Noise Sources

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Sullivan, Brenda M.; Rizzi, Stephen A.

    2004-01-01

    Current aircraft source noise prediction tools yield time-independent frequency spectra as functions of directivity angle. Realistic evaluation and human assessment of aircraft fly-over noise require the temporal characteristics of the noise signature. The purpose of the current study is to analyze empirical data from broadband jet and tonal fan noise sources and to provide the temporal information required for prediction-based synthesis. Noise sources included a one-tenth-scale engine exhaust nozzle and a one-fifth-scale turbofan engine. A methodology was developed to characterize the low frequency fluctuations employing the Short Time Fourier Transform in a MATLAB computing environment. It was shown that a trade-off is necessary between frequency and time resolution in the acoustic spectrogram. The procedure requires careful evaluation and selection of the data analysis parameters, including the data sampling frequency, Fourier Transform window size, associated time period and frequency resolution, and time period window overlap. Low frequency fluctuations were applied to the synthesis of broadband noise with the resulting records sounding virtually indistinguishable from the measured data in initial subjective evaluations. Amplitude fluctuations of blade passage frequency (BPF) harmonics were successfully characterized for conditions equivalent to take-off and approach. Data demonstrated that the fifth harmonic of the BPF varied more in frequency than the BPF itself and exhibited larger amplitude fluctuations over the duration of the time record. Frequency fluctuations were found to be not perceptible in the current characterization of tonal components.
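
    The resolution trade-off described above follows directly from the STFT parameters: frequency resolution is roughly fs/nperseg while time resolution is nperseg/fs, so one improves only as the other degrades. A small sketch using SciPy's standard spectrogram (parameter values illustrative, not the study's settings):

      import numpy as np
      from scipy.signal import spectrogram

      fs = 44_100                                  # sampling frequency, Hz
      t = np.arange(0, 1.0, 1.0 / fs)
      x = np.sin(2 * np.pi * 2500.0 * t)           # stand-in for a BPF tone

      for nperseg in (512, 4096):
          f, tt, Sxx = spectrogram(x, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
          print(f"nperseg={nperseg}: df={fs / nperseg:7.1f} Hz, dt={nperseg / fs * 1e3:5.1f} ms")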

  4. Label-free nanoscale characterization of red blood cell structure and dynamics using single-shot transport of intensity equation

    NASA Astrophysics Data System (ADS)

    Poola, Praveen Kumar; John, Renu

    2017-10-01

    We report the results of characterization of red blood cell (RBC) structure and its dynamics with nanometric sensitivity using transport of intensity equation microscopy (TIEM). The conventional transport of intensity technique requires three intensity images and hence is not suitable for studying the real-time dynamics of live biological samples. However, assuming the sample to be homogeneous, phase retrieval using the transport of intensity equation has been demonstrated with a single defocused measurement with x-rays. We adopt this technique for quantitative phase light microscopy of homogeneous cells like RBCs. The main merits of this technique are its simplicity, cost-effectiveness, and ease of implementation on a conventional microscope. The phase information can be easily merged with regular bright-field and fluorescence images to provide multidimensional (three-dimensional spatial and temporal) information without any extra complexity in the setup. The phase measurement from the TIEM has been characterized using polymeric microbeads and the noise stability of the system has been analyzed. We explore the structure and real-time dynamics of RBCs and the subdomain membrane fluctuations using this technique.
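
    Under the homogeneous-sample assumption, the TIE reduces to a Poisson equation, k * dI/dz = -I0 * laplacian(phi), which a single defocused frame can feed. The Fourier-space solver below is a generic sketch of this approach, not the authors' code; the uniform in-focus intensity I0 is the key assumption:

      import numpy as np

      def tie_phase_single(I_defocus, dz, wavelength, pixel, I0=None):
          """Recover phase (radians) from one defocused intensity frame."""
          if I0 is None:
              I0 = I_defocus.mean()            # assume uniform in-focus intensity
          k = 2.0 * np.pi / wavelength
          dIdz = (I_defocus - I0) / dz         # single-shot axial derivative
          ny, nx = I_defocus.shape
          fx = np.fft.fftfreq(nx, d=pixel)
          fy = np.fft.fftfreq(ny, d=pixel)
          k2 = (2.0 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
          k2[0, 0] = np.inf                    # drop the undefined DC term
          rhs = np.fft.fft2(-k * dIdz / I0)    # laplacian(phi) in Fourier space
          return np.real(np.fft.ifft2(rhs / (-k2)))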

  5. A sedimentological approach to hydrologic characterization: A detailed three-dimensional study of an outcrop of the Sierra Ladrones Formation, Albuquerque basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lohmann, R.C.

    1992-01-01

    Three-dimensional geologic outcrop studies which quantitatively describe the geologic architecture of deposits of a specific depositional environment are a necessary requirement for characterization of the permeability structure of an aquifer. The objective of this study is to address this need for quantitative, three-dimensional outcrop studies. For this study, a 10,000 m^2 by 25 m high outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation located near Belen, New Mexico was mapped in detail, and the geologic architecture was quantified using geostatistical variogram analysis. In general, the information contained in this study should be useful for hydrologists working on the characterization of aquifers from similar depositional environments such as this one. However, for the permeability correlation study to be truly useful, the within-element correlation structure needs to be superimposed on the elements themselves instead of using mean log(k) values, as was done for this study. Such information is derived from outcrop permeability sampling such as the work of Davis (1990) and Goggin et al. (1988).
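
    The geostatistical quantification rests on the empirical variogram; a generic sketch (illustrative, not the study's software) bins half squared differences of a mapped attribute, such as log-permeability, by separation distance:

      import numpy as np

      def empirical_variogram(coords, values, bin_edges):
          """coords: (n, 2) outcrop positions (m); values: attribute at each point."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          g = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)      # unique pairs only
          d, g = d[iu], g[iu]
          return np.array([g[(d >= lo) & (d < hi)].mean()   # nan if a bin is empty
                           for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])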

  6. Measuring predictability in ultrasonic signals: an application to scattering material characterization.

    PubMed

    Carrión, Alicia; Miralles, Ramón; Lara, Guillermo

    2014-09-01

    In this paper, we present a novel and completely different approach to the problem of scattering material characterization: measuring the degree of predictability of the time series. Measuring predictability can provide information of the signal strength of the deterministic component of the time series in relation to the whole time series acquired. This relationship can provide information about coherent reflections in material grains with respect to the rest of incoherent noises that typically appear in non-destructive testing using ultrasonics. This is a non-parametric technique commonly used in chaos theory that does not require making any kind of assumptions about attenuation profiles. In highly scattering media (low SNR), it has been shown theoretically that the degree of predictability allows material characterization. The experimental results obtained in this work with 32 cement probes of 4 different porosities demonstrate the ability of this technique to do classification. It has also been shown that, in this particular application, the measurement of predictability can be used as an indicator of the percentages of porosity of the test samples with great accuracy. Copyright © 2014 Elsevier B.V. All rights reserved.
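
    One common non-parametric predictability estimator (a sketch of the general idea; the paper's exact estimator may differ) delay-embeds the series and predicts each point from the successor of its nearest neighbor, so a small normalized error signals a strong deterministic component such as coherent grain echoes:

      import numpy as np

      def prediction_error(x, dim=3, lag=1):
          """Normalized nearest-neighbor prediction error; << 1 means predictable."""
          n = len(x) - (dim - 1) * lag - 1
          emb = np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])
          target = x[(dim - 1) * lag + 1: (dim - 1) * lag + 1 + n]
          err = np.empty(n)
          for i in range(n):
              d = np.linalg.norm(emb - emb[i], axis=1)
              d[i] = np.inf                        # exclude the point itself
              err[i] = (target[np.argmin(d)] - target[i]) ** 2
          return np.sqrt(err.mean()) / np.std(x)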

  7. Characterization of stormwater runoff from bridges in North Carolina and the effects of bridge deck runoff on receiving streams

    USGS Publications Warehouse

    Wagner, Chad R.; Fitzgerald, Sharon A.; Sherrell, Roy D.; Harned, Douglas A.; Staub, Erik L.; Pointer, Brian H.; Wehmeyer, Loren L.

    2011-01-01

    In 2008, the North Carolina General Assembly passed House Bill 2436 that required the North Carolina Department of Transportation (NCDOT) to study the water-quality effects of bridges on receiving streams. In response, the NCDOT and the U.S. Geological Survey (USGS) collaborated on a study to provide information necessary to address the requirements of the Bill. To better understand the effects of stormwater runoff from bridges on receiving streams, the following tasks were performed: (1) characterize stormwater runoff quality and quantity from a representative selection of bridges in North Carolina; (2) measure stream water quality upstream from selected bridges to compare bridge deck stormwater concentrations and loads to stream constituent concentrations and loads; and (3) determine if the chemistry of bed sediments upstream and downstream from selected bridges differs substantially based on presence or absence of a best management practice for bridge runoff.

  8. Reprint of: Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-11-01

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. In this paper, we present relevant background on this emerging suite of techniques. We focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  9. Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-07-04

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. Here, we present relevant background on this emerging suite of techniques. Finally, we focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  10. Electric Vehicle Preparedness: Task 1, Assessment of Fleet Inventory for Marine Corps Base Camp Lejeune

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schey, Stephen; Francfort, Jim

    2015-01-01

    Several U.S. Department of Defense-based studies were conducted to identify potential U.S. Department of Defense transportation systems that are strong candidates for introduction or expansion of plug-in electric vehicles (PEVs). Task 1 included a survey of the inventory of non-tactical fleet vehicles at the Marine Corps Base Camp Lejeune (MCBCL) to characterize the fleet. This information and characterization will be used to select vehicles for monitoring that takes place during Task 2. This monitoring involves data logging of vehicle operation in order to identify the vehicle’s mission and travel requirements. Individual observations of these selected vehicles provide the basis for recommendations related to PEV adoption. It also identifies whether a battery electric vehicle or plug-in hybrid electric vehicle (collectively referred to as PEVs) can fulfill the mission requirements and provides observations related to placement of PEV charging infrastructure.

  11. Molecular characterization of multivalent bioconjugates by size-exclusion chromatography with multiangle laser light scattering.

    PubMed

    Pollock, Jacob F; Ashton, Randolph S; Rode, Nikhil A; Schaffer, David V; Healy, Kevin E

    2012-09-19

    The degree of substitution and valency of bioconjugate reaction products are often poorly judged or require multiple time- and product-consuming chemical characterization methods. These aspects become critical when analyzing and optimizing the potency of costly polyvalent bioactive conjugates. In this study, size-exclusion chromatography with multiangle laser light scattering was paired with refractive index detection and ultraviolet spectroscopy (SEC-MALS-RI-UV) to characterize the reaction efficiency, degree of substitution, and valency of the products of conjugation of either peptides or proteins to a biopolymer scaffold, i.e., hyaluronic acid (HyA). Molecular characterization was more complete compared to estimates from a protein quantification assay, and exploitation of this method led to more accurate deduction of the molecular structures of polymer bioconjugates. Information obtained using this technique can improve macromolecular engineering design principles and help to better understand multivalent macromolecular interactions in biological systems.
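
    The valency arithmetic behind such measurements is simple once the molar masses are in hand; the numbers below are hypothetical and only illustrate how SEC-MALS outputs translate into a degree of substitution:

      def mean_valency(mw_conjugate, mw_scaffold, mw_ligand):
          """Average number of conjugated ligands per scaffold (e.g., HyA) chain."""
          return (mw_conjugate - mw_scaffold) / mw_ligand

      print(mean_valency(mw_conjugate=1.25e6, mw_scaffold=1.0e6, mw_ligand=2.5e3))  # 100.0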

  12. Molecular characterization of multivalent bioconjugates by size-exclusion chromatography (SEC) with multi-angle laser light scattering (MALS)

    PubMed Central

    Pollock, Jacob F.; Ashton, Randolph S.; Rode, Nikhil A.; Schaffer, David V.; Healy, Kevin E.

    2013-01-01

    The degree of substitution and valency of bioconjugate reaction products are often poorly judged or require multiple time- and product- consuming chemical characterization methods. These aspects become critical when analyzing and optimizing the potency of costly polyvalent bioactive conjugates. In this study, size-exclusion chromatography with multi-angle laser light scattering was paired with refractive index detection and ultraviolet spectroscopy (SEC-MALS-RI-UV) to characterize the reaction efficiency, degree of substitution, and valency of the products of conjugation of either peptides or proteins to a biopolymer scaffold, i.e., hyaluronic acid (HyA). Molecular characterization was more complete compared to estimates from a protein quantification assay, and exploitation of this method led to more accurate deduction of the molecular structures of polymer bioconjugates. Information obtained using this technique can improve macromolecular engineering design principles and better understand multivalent macromolecular interactions in biological systems. PMID:22794081

  13. Dynamical estimation of neuron and network properties III: network analysis using neuron spike times.

    PubMed

    Knowlton, Chris; Meliza, C Daniel; Margoliash, Daniel; Abarbanel, Henry D I

    2014-06-01

    Estimating the behavior of a network of neurons requires accurate models of the individual neurons along with accurate characterizations of the connections among them. Whereas for a single cell, measurements of the intracellular voltage are technically feasible and sufficient to characterize a useful model of its behavior, making sufficient numbers of simultaneous intracellular measurements to characterize even small networks is infeasible. This paper builds on prior work on single neurons to explore whether knowledge of the time of spiking of neurons in a network, once the nodes (neurons) have been characterized biophysically, can provide enough information to usefully constrain the functional architecture of the network: the existence of synaptic links among neurons and their strength. Using standardized voltage and synaptic gating variable waveforms associated with a spike, we demonstrate that the functional architecture of a small network of model neurons can be established.

  14. National Water-Quality Assessment (NAWQA) area-characterization toolbox

    USGS Publications Warehouse

    Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  15. Analysis on laser plasma emission for characterization of colloids by video-based computer program

    NASA Astrophysics Data System (ADS)

    Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni

    2016-02-01

    Laser-induced breakdown detection (LIBD) is a sensitive technique for the characterization of colloids of small size and low concentration. There are two types of detection, optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique usually requires sophisticated, and often expensive, instrumentation. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis was conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. The breakdown probability provided information on colloid size and concentration. A validation experiment showed that the computer program performed well in analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
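
    A minimal sketch of the breakdown-counting idea (Python; the synthetic frames and threshold are assumptions, not the authors' MATLAB implementation):

      import numpy as np

      rng = np.random.default_rng(0)

      def breakdown_probability(frames, threshold):
          """Count frames containing a plasma flash (any pixel above threshold)
          and return the breakdown probability over the frame sequence."""
          flashes = [frame.max() > threshold for frame in frames]
          return sum(flashes) / len(flashes)

      # Synthetic 64x64 frames: dark background, occasional bright plasma spot.
      frames = []
      for _ in range(500):
          frame = rng.normal(10.0, 2.0, (64, 64))     # camera noise
          if rng.random() < 0.2:                      # 20% of laser shots break down
              y, x = rng.integers(0, 64, 2)
              frame[y, x] += 200.0                    # bright plasma emission
          frames.append(frame)

      print(f"breakdown probability: {breakdown_probability(frames, 100.0):.2f}")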

  16. Decoding thalamic afferent input using microcircuit spiking activity

    PubMed Central

    Sederberg, Audrey J.; Palmer, Stephanie E.

    2015-01-01

    A behavioral response appropriate to a sensory stimulus depends on the collective activity of thousands of interconnected neurons. The majority of cortical connections arise from neighboring neurons, and thus understanding the cortical code requires characterizing information representation at the scale of the cortical microcircuit. Using two-photon calcium imaging, we densely sampled the thalamically evoked response of hundreds of neurons spanning multiple layers and columns in thalamocortical slices of mouse somatosensory cortex. We then used a biologically plausible decoder to characterize the representation of two distinct thalamic inputs, at the level of the microcircuit, to reveal those aspects of the activity pattern that are likely relevant to downstream neurons. Our data suggest a sparse code, distributed across lamina, in which a small population of cells carries stimulus-relevant information. Furthermore, we find that, within this subset of neurons, decoder performance improves when noise correlations are taken into account. PMID:25695647

  17. Decoding thalamic afferent input using microcircuit spiking activity.

    PubMed

    Sederberg, Audrey J; Palmer, Stephanie E; MacLean, Jason N

    2015-04-01

    A behavioral response appropriate to a sensory stimulus depends on the collective activity of thousands of interconnected neurons. The majority of cortical connections arise from neighboring neurons, and thus understanding the cortical code requires characterizing information representation at the scale of the cortical microcircuit. Using two-photon calcium imaging, we densely sampled the thalamically evoked response of hundreds of neurons spanning multiple layers and columns in thalamocortical slices of mouse somatosensory cortex. We then used a biologically plausible decoder to characterize the representation of two distinct thalamic inputs, at the level of the microcircuit, to reveal those aspects of the activity pattern that are likely relevant to downstream neurons. Our data suggest a sparse code, distributed across lamina, in which a small population of cells carries stimulus-relevant information. Furthermore, we find that, within this subset of neurons, decoder performance improves when noise correlations are taken into account. Copyright © 2015 the American Physiological Society.
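
    The sketch below (Python; toy data, not the study's decoder) illustrates why accounting for noise correlations can matter: a Gaussian linear decoder fitted with the full noise covariance is compared against one that assumes independent cells:

      import numpy as np

      rng = np.random.default_rng(1)

      # Two stimuli, correlated population responses (illustrative toy data).
      n_cells, n_trials = 20, 400
      shared = rng.normal(size=(n_trials, 1))                 # shared noise source
      noise = 0.8 * shared + rng.normal(size=(n_trials, n_cells))
      mu_a, mu_b = np.zeros(n_cells), 0.5 * np.ones(n_cells)
      X = np.vstack([mu_a + noise[: n_trials // 2], mu_b + noise[n_trials // 2 :]])
      y = np.repeat([0, 1], n_trials // 2)

      def gaussian_decoder_accuracy(X, y, use_correlations):
          mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
          resid = X - np.where(y[:, None] == 0, mu0, mu1)
          cov = np.cov(resid, rowvar=False)
          if not use_correlations:
              cov = np.diag(np.diag(cov))      # ignore noise correlations
          w = np.linalg.solve(cov, mu1 - mu0)  # linear discriminant weights
          scores = (X - (mu0 + mu1) / 2) @ w
          return np.mean((scores > 0) == (y == 1))

      print("independent decoder :", gaussian_decoder_accuracy(X, y, False))
      print("correlation-aware   :", gaussian_decoder_accuracy(X, y, True))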

  18. Automatic Fault Characterization via Abnormality-Enhanced Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Laguna, I; de Supinski, B R

    Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
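
    A minimal sketch of the abnormality-enhanced idea (Python; the metrics, thresholds, and fault signatures are invented for illustration, not the paper's algorithm): score how far observed behavior deviates from a normal baseline, then classify the deviation pattern:

      import numpy as np

      rng = np.random.default_rng(2)

      # Baseline of normal application behavior (illustrative metrics per run).
      normal = rng.normal(loc=[50.0, 5.0, 1.0], scale=[5.0, 1.0, 0.2], size=(200, 3))
      mu, sigma = normal.mean(0), normal.std(0)

      def abnormality(sample):
          """Per-metric z-scores against the normal baseline."""
          return (sample - mu) / sigma

      # Hypothetical fault signatures in abnormality space (not from the paper).
      fault_signatures = {
          "cpu_hog": np.array([8.0, 0.0, 0.0]),
          "mem_leak": np.array([0.0, 9.0, 0.0]),
          "io_stall": np.array([0.0, 0.0, 7.0]),
      }

      def classify(sample, detect_threshold=4.0):
          z = abnormality(sample)
          if np.abs(z).max() < detect_threshold:
              return "normal"
          # Nearest fault signature in abnormality space.
          return min(fault_signatures, key=lambda k: np.linalg.norm(z - fault_signatures[k]))

      print(classify(np.array([52.0, 5.2, 1.1])))   # -> normal
      print(classify(np.array([95.0, 5.0, 1.0])))   # -> cpu_hog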

  19. Characterization of Contrast Agent Microbubbles for Ultrasound Imaging and Therapy Research.

    PubMed

    Mulvana, Helen; Browning, Richard J; Luan, Ying; de Jong, Nico; Tang, Meng-Xing; Eckersley, Robert J; Stride, Eleanor

    2017-01-01

    The high efficiency with which gas microbubbles can scatter ultrasound compared with the surrounding blood pool or tissues has led to their widespread employment as contrast agents in ultrasound imaging. In recent years, their applications have been extended to include super-resolution imaging and the stimulation of localized bio-effects for therapy. The growing exploitation of contrast agents in ultrasound and in particular these recent developments have amplified the need to characterize and fully understand microbubble behavior. The aim in doing so is to more fully exploit their utility for both diagnostic imaging and potential future therapeutic applications. This paper presents the key characteristics of microbubbles that determine their efficacy in diagnostic and therapeutic applications and the corresponding techniques for their measurement. In each case, we have presented information regarding the methods available and their respective strengths and limitations, with the aim of presenting information relevant to the selection of appropriate characterization methods. First, we examine methods for determining the physical properties of microbubble suspensions and then techniques for acoustic characterization of both suspensions and single microbubbles. The next section covers characterization of microbubbles as therapeutic agents, including as drug carriers for which detailed understanding of their surface characteristics and drug loading capacity is required. Finally, we discuss the attempts that have been made to allow comparison across the methods employed by various groups to characterize and describe their microbubble suspensions and promote wider discussion and comparison of microbubble behavior.

  20. Electromagnetic Characterization of Inhomogeneous Media

    DTIC Science & Technology

    2012-03-22

    Engineering and Management, Air Force Institute of Technology, Air University, Air Education and Training Command, in partial fulfillment of the requirements... found in the laboratory data; fun is the code that contains the theoretical formulation of S11, and beta0 is the initial constitutive parameter estimate...

  1. Land Cover Characterization Program

    USGS Publications Warehouse

    1997-01-01

    (2) identify sources, develop procedures, and organize partners to deliver data and information to meet user requirements. The LCCP builds on the heritage and success of previous USGS land use and land cover programs and projects. It will be compatible with current concepts of government operations, the changing needs of the land use and land cover data users, and the technological tools with which the data are applied.

  2. Method of and Apparatus for Histological Human Tissue Characterization Using Ultrasound

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H. (Inventor); Taler, George A. (Inventor)

    1999-01-01

    A method and apparatus for determining important histological characteristics of tissue, including a determination of the tissue's health. Electrical pulses are converted into meaningful numerical representations through the use of Fourier Transforms. These numerical representations are then used to determine important histological characteristics of tissue. This novel invention does not require rectification and thus provides for detailed information from the ultrasonic scan.

  3. Method of and Apparatus for Histological Human Tissue Characterization Using Ultrasound

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H. (Inventor); Taler, George A. (Inventor)

    1998-01-01

    A method and apparatus for determining important histological characteristics of tissue, including a determination of the tissue's health is discussed. Electrical pulses are converted into meaningful numerical representations through the use of Fourier Transforms. These numerical representations are then used to determine important histological characteristics of tissue. This novel invention does not require rectification and thus provides for detailed information from the ultrasonic scan.

  4. Web-GIS platform for green infrastructure in Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Sercaianu, Mihai; Petrescu, Florian; Aldea, Mihaela; Oana, Luca; Rotaru, George

    2015-06-01

    In the last decade, reducing urban pollution and improving the quality of public spaces became an increasingly important issue for public administration authorities in Romania. The paper describes the development of a web-GIS solution dedicated to monitoring the green infrastructure in Bucharest, Romania. The system allows urban residents (citizens) to collect and directly report relevant information regarding the current status of the city's green infrastructure. Consequently, the citizens become an active component of the decision-support process within the public administration. Besides the usual technical characteristics of such geo-information processing systems, the complex legal and organizational problems that arise in collecting information directly from citizens required additional analysis concerning, for example, local government involvement, environmental protection agency regulations, and public entity requirements. Designing and implementing the whole information exchange process, based on the active interaction between citizens and public administration bodies, required the use of the "citizen-sensor" concept deployed with GIS tools. The information collected and reported from the field is related to many factors, not always limited to the city level, providing the possibility to consider the green infrastructure as a whole. The "citizen-request" web-GIS solution for green infrastructure monitoring is characterized by very diverse urban information, because the green infrastructure itself is conditioned by many urban elements, such as urban infrastructures, urban infrastructure works, and construction density.

  5. Laboratory Information Management System (LIMS): A case study

    NASA Technical Reports Server (NTRS)

    Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.

    1987-01-01

    In the late 1970s, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated a custom-designed system as the best choice in order to satisfy all of the requirements. A scaled-down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware and software, provides the chemical characterization laboratory with an information data base, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.

  6. Transcriptional profiling suggests that multiple metabolic adaptations are required for effective proliferation of Pseudomonas aeruginosa in jet fuel.

    PubMed

    Gunasekera, Thusitha S; Striebich, Richard C; Mueller, Susan S; Strobel, Ellen M; Ruiz, Oscar N

    2013-01-01

    Fuel is a harsh environment for microbial growth. However, some bacteria can grow well due to their adaptive mechanisms. Our goal was to characterize the adaptations required for Pseudomonas aeruginosa proliferation in fuel. We have used DNA-microarrays and RT-PCR to characterize the transcriptional response of P. aeruginosa to fuel. Transcriptomics revealed that genes essential for medium- and long-chain n-alkane degradation including alkB1 and alkB2 were transcriptionally induced. Gas chromatography confirmed that P. aeruginosa possesses pathways to degrade different length n-alkanes, favoring the use of n-C11-18. Furthermore, a gamut of synergistic metabolic pathways, including porins, efflux pumps, biofilm formation, and iron transport, were transcriptionally regulated. Bioassays confirmed that efflux pumps and biofilm formation were required for growth in jet fuel. Furthermore, cell homeostasis appeared to be carefully maintained by the regulation of porins and efflux pumps. The Mex RND efflux pumps were required for fuel tolerance; blockage of these pumps precluded growth in fuel. This study provides a global understanding of the multiple metabolic adaptations required by bacteria for survival and proliferation in fuel-containing environments. This information can be applied to improve the fuel bioremediation properties of bacteria.

  7. Postmarket studies required by the US Food and Drug Administration for new drugs and biologics approved between 2009 and 2012: cross sectional analysis.

    PubMed

    Wallach, Joshua D; Egilman, Alexander C; Dhruva, Sanket S; McCarthy, Margaret E; Miller, Jennifer E; Woloshin, Steven; Schwartz, Lisa M; Ross, Joseph S

    2018-05-24

    To characterize postmarketing requirements for new drugs and biologics approved by the US Food and Drug Administration (FDA), and to examine rates and timeliness of registration, results reporting, and publication of required prospective cohort studies, registries, and clinical trials. Cross sectional analysis. Postmarketing requirements for all new drugs and biologics approved by the FDA between 1 January 2009 and 31 December 2012, with follow-up up to 15 November 2017. Postmarketing requirements and their characteristics known at the time of FDA approval, including FDA authority, study design, and study characteristics. Rates and timeliness of registration and results reporting on ClinicalTrials.gov and publication in peer reviewed journals of required prospective cohort studies, registries, and clinical trials. Between 2009 and 2012, the FDA approved 97 new drugs and biologics for 106 indications with at least one postmarketing requirement at the time of first approval, for a total of 437 postmarketing requirements. Postmarket study descriptions were short (median word count 44 (interquartile range 29-71)) and often lacked the information needed to determine up-to-date progress (131; 30%). 220 (50.3%) postmarketing requirements were for new animal or other studies (including pharmacokinetic studies); 134 (30.7%) were for prospective cohort studies, registries, and clinical trials; and 83 (19.0%) were for secondary analyses or follow-up studies. Of 110 clinical trials, 38 (34.5%), 44 (40.0%), 62 (56.4%), 66 (60.0%), and 98 (89.1%) did not report enough information to establish use of randomization, comparator type, allocation, outcome, and number of patients to be enrolled, respectively. Of 134 required prospective cohort studies, registries, and clinical trials, 102 (76.1%) were registered on ClinicalTrials.gov; of 50 registered and completed studies, 36 (72.0%) had reported results on ClinicalTrials.gov. Among 65 completed studies, 47 (72.3%) had either reported results or were published a median of 47 months (interquartile range 32-67) after FDA approval. 32 (68.1%) of these 47 studies did not report results publicly by the time of their original FDA report submission deadline. Postmarketing requirements for new drugs and biologics were often briefly described and did not contain enough information to characterize study designs. Approximately three quarters of postmarketing requirements for prospective cohort studies, registries, and clinical trials were registered on ClinicalTrials.gov, and nearly three quarters of completed studies reported results or were published, suggesting that at least a quarter of these required studies are not being publicly disseminated. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  8. A Risk Management Framework to Characterize Black Swan Risks: A Case Study of Lightning Effects on Insensitive High Explosives

    NASA Astrophysics Data System (ADS)

    Sanders, Gary A.

    Effective and efficient risk management processes include the use of high fidelity modeling and simulation during the concept exploration phase as part of the technology and risk assessment activities, with testing and evaluation tasks occurring in later design development phases. However, some safety requirements and design architectures may be dominated by low-probability/high-consequence "Black Swan" vulnerabilities that require very early testing to characterize and efficiently mitigate. Failure to address these unique risks has led to catastrophic system failures, including the space shuttle Challenger, Deepwater Horizon, the Fukushima nuclear reactor, and the Katrina dike failures. Discovering and addressing these risks later in the design and development process can be very costly or even lead to project cancellation. This paper examines the need for risk management processes to adopt early hazard phenomenology testing to inform the technical risk assessment, requirements definition, and conceptual design. A case study of the lightning design vulnerability of the insensitive high explosives used in the construction, mining, demolition, and defense industries is presented to examine the impact of this vulnerability testing during the concept exploration phase of the design effort. While these insensitive high explosives are far less sensitive to accidental initiation by fire, impact, friction, or even electrical stimuli, their full range of sensitivities has not been characterized, and ensuring safe engineering design and operations during events such as lightning storms requires vulnerability testing during the risk assessment phase.

  9. Paradigm Shift in Data Content and Informatics Infrastructure Required for Generalized Constitutive Modeling of Materials Behavior

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.

    2006-01-01

    Materials property information such as composition and thermophysical/mechanical properties abound in the literature. Oftentimes, however, the corresponding response curves from which these data are determined are missing or at the very least difficult to retrieve. Further, the paradigm for collecting materials property information has historically centered on (1) properties for materials comparison/selection purposes and (2) input requirements for conventional design/analysis methods. However, just as not all materials are alike or equal, neither are all constitutive models (and thus design/ analysis methods) equal; each model typically has its own specific and often unique required materials parameters, some directly measurable and others indirectly measurable. Therefore, the type and extent of materials information routinely collected is not always sufficient to meet the current, much less future, needs of the materials modeling community. Informatics has been defined as the science concerned with gathering, manipulating, storing, retrieving, and classifying recorded information. A key aspect of informatics is its focus on understanding problems and applying information technology as needed to address those problems. The primary objective of this article is to highlight the need for a paradigm shift in materials data collection, analysis, and dissemination so as to maximize the impact on both practitioners and researchers. Our hope is to identify and articulate what constitutes "sufficient" data content (i.e., quality and quantity) for developing, characterizing, and validating sophisticated nonlinear time- and history-dependent (hereditary) constitutive models. Likewise, the informatics infrastructure required for handling the potentially massive amounts of materials data will be discussed.

  10. Spatially Compact Neural Clusters in the Dorsal Striatum Encode Locomotion Relevant Information.

    PubMed

    Barbera, Giovanni; Liang, Bo; Zhang, Lifeng; Gerfen, Charles R; Culurciello, Eugenio; Chen, Rong; Li, Yun; Lin, Da-Ting

    2016-10-05

    An influential striatal model postulates that neural activities in the striatal direct and indirect pathways promote and inhibit movement, respectively. Normal behavior requires coordinated activity in the direct pathway to facilitate intended locomotion and indirect pathway to inhibit unwanted locomotion. In this striatal model, neuronal population activity is assumed to encode locomotion relevant information. Here, we propose a novel encoding mechanism for the dorsal striatum. We identified spatially compact neural clusters in both the direct and indirect pathways. Detailed characterization revealed similar cluster organization between the direct and indirect pathways, and cluster activities from both pathways were correlated with mouse locomotion velocities. Using machine-learning algorithms, cluster activities could be used to decode locomotion relevant behavioral states and locomotion velocity. We propose that neural clusters in the dorsal striatum encode locomotion relevant information and that coordinated activities of direct and indirect pathway neural clusters are required for normal striatal controlled behavior. VIDEO ABSTRACT. Published by Elsevier Inc.

  11. National Land Imaging Requirements (NLIR) Pilot Project summary report: summary of moderate resolution imaging user requirements

    USGS Publications Warehouse

    Vadnais, Carolyn; Stensaas, Gregory

    2014-01-01

    Under the National Land Imaging Requirements (NLIR) Project, the U.S. Geological Survey (USGS) is developing a functional capability to obtain, characterize, manage, maintain, and prioritize all Earth observing (EO) land remote sensing user requirements. The goal is a better understanding of community needs that can be supported with land remote sensing resources, and a means to match needs with appropriate solutions in an effective and efficient way. The NLIR Project is composed of two components. The first component is focused on the development of the Earth Observation Requirements Evaluation System (EORES) to capture, store, and analyze user requirements, whereas the second component is the mechanism and processes to elicit and document the user requirements that will populate the EORES. To develop the second component, the requirements elicitation methodology was exercised and refined through a pilot project conducted from June to September 2013. The pilot project focused specifically on applications and user requirements for moderate resolution imagery (5-120 meter resolution) as the test case for requirements development. The purpose of this summary report is to provide a high-level overview of the requirements elicitation process that was exercised through the pilot project and an early analysis of the moderate resolution imaging user requirements acquired to date to support ongoing USGS sustainable land imaging study needs. The pilot project engaged a limited set of Federal Government users from the operational and research communities, and therefore the information captured represents only a subset of all land imaging user requirements. However, based on a comparison of results, trends, and analysis, the pilot captured a strong baseline of typical application areas and user needs for moderate resolution imagery. Because these results are preliminary and represent only a sample of users and application areas, the information from this report should only be used to indicate general user needs for the applications covered. Users of the information are cautioned that use of specific numeric results may be inappropriate without additional research. Any information used or cited from this report should specifically be cited as preliminary findings.

  12. Extraction and textural characterization of above-ground areas from aerial stereo pairs: a quality assessment

    NASA Astrophysics Data System (ADS)

    Baillard, C.; Dissard, O.; Jamet, O.; Maître, H.

    Above-ground analysis is a key step in the reconstruction of urban scenes, but it is a difficult task because of the diversity of the objects involved. We propose a new method for above-ground extraction from an aerial stereo pair that does not require any assumption about object shape or nature. A Digital Surface Model is first produced by a stereoscopic matching stage that preserves discontinuities, and then processed by a region-based Markovian classification algorithm. The extracted above-ground areas are finally characterized as man-made or natural according to the grey level information. The quality of the results is assessed and discussed.

  13. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms which prevents evaluating fast phenomena. In this work, a new RTN measurement procedure, which increases the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows obtaining a more detailed and precise information about the RTN phenomenon.
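
    A sketch of the Weighted Time Lag idea (Python; synthetic trace, kernel width chosen ad hoc, not the authors' implementation): build a Gaussian-weighted density of successive sample pairs and read the discrete RTN levels off the diagonal:

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic two-level RTN trace (arbitrary current units).
      n = 4000
      state = np.zeros(n, dtype=int)
      for i in range(1, n):
          state[i] = state[i - 1] ^ int(rng.random() < 0.02)   # random level switches
      current = state + rng.normal(0.0, 0.05, n)

      # Weighted Time Lag: Gaussian-weighted density of (I(n), I(n+1)) pairs.
      grid = np.linspace(-0.3, 1.3, 121)
      xx, yy = np.meshgrid(grid, grid)
      alpha = 0.05                      # kernel width, matched to the noise level
      psi = np.zeros_like(xx)
      for a, b in zip(current[:-1], current[1:]):
          psi += np.exp(-((xx - a) ** 2 + (yy - b) ** 2) / (2 * alpha**2))
      psi /= psi.sum()

      # Discrete RTN levels appear as maxima along the diagonal of the W-LT map.
      diag = np.diag(psi)
      peaks = (diag > np.roll(diag, 1)) & (diag > np.roll(diag, -1)) & (diag > 0.5 * diag.max())
      print("detected RTN levels near:", np.round(grid[peaks], 2))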

  14. Bayesian approach to analyzing holograms of colloidal particles.

    PubMed

    Dimiduk, Thomas G; Manoharan, Vinothan N

    2016-10-17

    We demonstrate a Bayesian approach to tracking and characterizing colloidal particles from in-line digital holograms. We model the formation of the hologram using Lorenz-Mie theory. We then use a tempered Markov-chain Monte Carlo method to sample the posterior probability distributions of the model parameters: particle position, size, and refractive index. Compared to least-squares fitting, our approach allows us to more easily incorporate prior information about the parameters and to obtain more accurate uncertainties, which are critical for both particle tracking and characterization experiments. Our approach also eliminates the need to supply accurate initial guesses for the parameters, so it requires little tuning.
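
    A minimal sketch of the approach (Python): plain Metropolis sampling of particle position and a size-like parameter, with a toy fringe pattern standing in for the Lorenz-Mie forward model; the paper's tempered sampler and real scattering model are not reproduced here:

      import numpy as np

      rng = np.random.default_rng(4)

      # Toy stand-in for the Lorenz-Mie forward model: a radial fringe pattern
      # parametrized by particle position (x0, y0) and a size-like fringe scale s.
      xs, ys = np.meshgrid(np.arange(64.0), np.arange(64.0))

      def forward(x0, y0, s):
          r = np.hypot(xs - x0, ys - y0)
          return 1.0 + 0.3 * np.cos(r / s) * np.exp(-r / 30.0)

      noise = 0.05
      data = forward(31.0, 34.0, 2.5) + rng.normal(0.0, noise, xs.shape)

      def log_posterior(theta):
          x0, y0, s = theta
          if not (0 < x0 < 64 and 0 < y0 < 64 and 1.0 < s < 5.0):   # flat priors
              return -np.inf
          return -0.5 * np.sum((data - forward(x0, y0, s)) ** 2) / noise**2

      # Plain Metropolis sampling (the paper uses a tempered variant).
      theta = np.array([35.0, 30.0, 2.8])       # deliberately rough starting point
      lp = log_posterior(theta)
      samples = []
      for _ in range(5000):
          prop = theta + rng.normal(0.0, [0.2, 0.2, 0.02])
          lp_prop = log_posterior(prop)
          if np.log(rng.random()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta.copy())

      post = np.array(samples[2000:])           # discard burn-in
      print("posterior mean:", post.mean(0).round(2), "+/- std:", post.std(0).round(3))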

  15. Network Security Validation Using Game Theory

    NASA Astrophysics Data System (ADS)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFR) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validating these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity of the distributed systems that depended on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented, along with an example that demonstrates the application of the approach.
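
    As a toy illustration of casting security validation as a game (Python; the payoff matrix is invented, not from the paper), the defender's guaranteed security level is the maximin over worst-case attacker responses:

      import numpy as np

      # Illustrative zero-sum security game: rows = defender configurations,
      # cols = attacker actions, entries = fraction of assets kept secure.
      payoff = np.array([
          [0.90, 0.40, 0.60],   # firewall hardening
          [0.50, 0.85, 0.55],   # intrusion detection tuning
          [0.65, 0.60, 0.80],   # patching policy
      ])

      # Defender's security level: best worst-case (maximin) pure strategy.
      worst_case = payoff.min(axis=1)           # attacker plays best response
      best_row = int(worst_case.argmax())
      print(f"maximin defense: row {best_row}, guaranteed security {worst_case[best_row]:.2f}")

      # The attacker's minimax gives an upper bound; a gap between the two
      # means the game's value is only achieved with mixed (randomized) strategies.
      print(f"attacker minimax bound: {payoff.max(axis=0).min():.2f}")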

  16. Tank 241-AZ-102 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RASMUSSEN, J.H.

    1999-08-02

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AZ-102. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AZ-102 required to satisfy the Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank TIS An Appropriate Feed Source For High-Level Waste Feed Batch X(HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase 1: Confirm Tank TIS An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste andmore » High Level Waste Feed Data Quality Objectives (L&H DQO) (Patello et al. 1999) and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). The Tank Characterization Technical Sampling Basis document (Brown et al. 1998) indicates that these issues, except the Equipment DQO apply to tank 241-AZ-102 for this sampling event. The Equipment DQO is applied for shear strength measurements of the solids segments only. Poppiti (1999) requires additional americium-241 analyses of the sludge segments. Brown et al. (1998) also identify safety screening, regulatory issues and provision of samples to the Privatization Contractor(s) as applicable issues for this tank. However, these issues will not be addressed via this sampling event. Reynolds et al. (1999) concluded that information from previous sampling events was sufficient to satisfy the safety screening requirements for tank 241 -AZ-102. Push mode core samples will be obtained from risers 15C and 24A to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples, composite the liquids and solids, perform chemical analyses, and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AZ-102 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plan.« less

  17. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
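
    A minimal sketch of the max-min estimation idea (Python; a one-parameter toy model and made-up requirement bounds, not the framework itself): pick the parameter whose smallest compliance margin across all validation requirements is as large as possible, and bound the compliant set to characterize uncertainty:

      import numpy as np

      rng = np.random.default_rng(5)

      # Toy decay model y = exp(-a t) checked against two validation requirements.
      t = np.linspace(0.0, 5.0, 200)
      y_obs = np.exp(-0.8 * t) + rng.normal(0.0, 0.01, t.size)

      def margins(a):
          """Requirement-compliance margins: positive means satisfied."""
          y = np.exp(-a * t)
          m_peak = 0.05 - np.max(np.abs(y - y_obs))    # peak time-domain error bound
          m_final = 0.02 - abs(y[-1] - y_obs[-1])      # end-state error bound
          return np.array([m_peak, m_final])

      # Estimate: the parameter whose smallest margin is as large as possible.
      a_grid = np.linspace(0.1, 2.0, 1000)
      smallest = np.array([margins(a).min() for a in a_grid])
      a_hat = a_grid[smallest.argmax()]
      compliant = a_grid[smallest > 0]                 # set bounding the uncertainty
      print(f"estimate a = {a_hat:.3f}")
      if compliant.size:
          print(f"compliant parameter range: [{compliant.min():.3f}, {compliant.max():.3f}]")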

  18. An investigation into pilot and system response to critical in-flight events. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Griffin, W. C.

    1981-01-01

    Critical in-flight events (CIFE) that threaten the aircraft were studied. The scope of the CIFE was described and defined with emphasis on characterizing event development, detection, and assessment; pilot information requirements, sources, acquisition, and interpretation; pilot response options and decision processes; and decision implementation and event outcome. Detailed scenarios were developed for use in simulators and in paper-and-pencil testing, both for developing relationships between pilot performance and background information and for an analysis of pilot reaction, decision, and feedback processes. Statistical relationships among pilot characteristics and observed responses to CIFEs were developed.

  19. VEEP: A Vehicle Economy, Emissions, and Performance simulation program

    NASA Technical Reports Server (NTRS)

    Klose, G. J.

    1978-01-01

    The purpose of the VEEP simulation program was to: (1) predict vehicle fuel economy and relative emissions over any specified driving cycle; (2) calculate various measures of vehicle performance (acceleration, passing maneuvers, gradeability, top speed); and (3) give information on the various categories of energy dissipation (rolling friction, aerodynamics, accessories, inertial effects, component inefficiencies, etc.). The vehicle is described based on detailed subsystem information and numerical parameters characterizing the components of a wide variety of self-propelled vehicles. Conventionally arranged heat-engine-powered automobiles were emphasized, but the design also considers the requirements of other types of vehicles.
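
    A back-of-envelope version of such a drive-cycle energy balance (Python; all vehicle parameters and the cycle are illustrative assumptions, not VEEP inputs):

      import numpy as np

      dt = 1.0                                   # s, cycle time step
      v = np.concatenate([np.linspace(0, 15, 30), np.full(60, 15.0),
                          np.linspace(15, 0, 30)])          # m/s, simple cycle
      m, crr, rho, cd_a = 1500.0, 0.010, 1.2, 0.7           # kg, -, kg/m^3, m^2
      g, accessory = 9.81, 500.0                            # m/s^2, W

      a = np.gradient(v, dt)
      f_roll = crr * m * g * (v > 0)            # rolling resistance
      f_aero = 0.5 * rho * cd_a * v**2          # aerodynamic drag
      f_inertia = m * a                         # acceleration force
      p_wheel = np.maximum((f_roll + f_aero + f_inertia) * v, 0)  # no regen braking
      p_fuel = p_wheel / 0.25 + accessory       # assume 25% drivetrain+engine efficiency

      energy_mj = np.sum(p_fuel) * dt / 1e6
      distance_km = np.sum(v) * dt / 1e3
      print(f"{energy_mj:.1f} MJ over {distance_km:.2f} km "
            f"-> {energy_mj / distance_km:.2f} MJ/km")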

  20. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    NASA Astrophysics Data System (ADS)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  1. Testing, Requirements, and Metrics

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hyatt, Larry; Hammer, Theodore F.; Huffman, Lenore; Wilson, William

    1998-01-01

    The criticality of correct, complete, testable requirements is a fundamental tenet of software engineering. Also critical is complete requirements-based testing of the final product. Modern tools for managing requirements allow new metrics to be used in support of both of these critical processes. Using these tools, potential problems with the quality of the requirements and the test plan can be identified early in the life cycle. Some of these quality factors include: ambiguous or incomplete requirements, poorly designed requirements databases, excessive or insufficient test cases, and incomplete linkage of tests to requirements. This paper discusses how metrics can be used to evaluate the quality of the requirements and tests to avoid problems later. Requirements management and requirements-based testing have always been critical in the implementation of high quality software systems. Recently, automated tools have become available to support requirements management. At NASA's Goddard Space Flight Center (GSFC), automated requirements management tools are being used on several large projects. The use of these tools opens the door to innovative uses of metrics in characterizing test plan quality and assessing overall testing risks. In support of these projects, the Software Assurance Technology Center (SATC) is working to develop and apply a metrics program that utilizes the information now available through the application of requirements management tools. Metrics based on this information provide real-time insight into the testing of requirements and assist the Project Quality Office in its testing oversight role. This paper discusses three facets of the SATC's efforts to evaluate the quality of the requirements and test plan early in the life cycle, thus preventing costly errors and time delays later.
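
    A minimal sketch of such requirements-to-test linkage metrics (Python; the IDs and linkage map are invented for illustration):

      # Traceability metrics from a requirements-to-test linkage map.
      links = {
          "REQ-1": ["TC-1", "TC-2"],
          "REQ-2": [],                 # untested requirement
          "REQ-3": ["TC-3"],
          "REQ-4": ["TC-4", "TC-5", "TC-6", "TC-7", "TC-8", "TC-9"],
      }

      untested = [r for r, tests in links.items() if not tests]
      counts = [len(t) for t in links.values()]
      coverage = 1 - len(untested) / len(links)

      print(f"requirements coverage: {coverage:.0%}")
      print(f"untested requirements: {untested}")
      print(f"tests per requirement: mean {sum(counts) / len(links):.1f}, max {max(counts)}")
      # Outliers flag risk: zero tests means an untested requirement, while a
      # very high count may indicate an ambiguous or compound requirement.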

  2. Characterization of the polarization and frequency selective bolometric detector architecture

    NASA Astrophysics Data System (ADS)

    Leong, Jonathan Ryan Kyoung Ho

    2009-01-01

    The Cosmic Microwave Background (CMB) has been a wonderful probe of fundamental physics and cosmology. In the future, we look toward using the polarization information encoded in the CMB to investigate the gravity waves generated by inflation. This is a daunting task, as it requires orders-of-magnitude increases in sensitivity as well as close attention to systematic rejection and astrophysical foreground removal. We have characterized a novel detector architecture aimed at making these leaps toward gravity wave detection in the CMB. These detectors are called Polarization and Frequency Selective Bolometers (PFSBs). They attempt to use all the available photon information incident on a single pixel by selecting out the two orthogonal polarizations and multiple frequency bands into separately stacked detectors in a smooth-walled waveguide. This approach is inherently multimoded and thus solves problems with downlink and readout throughput by catching more photons per detector at the higher frequencies, where the number of detectors otherwise required is prohibitively large. We have found that the PFSB architecture requires the use of a square cross-section waveguide. A simulation we developed showed that the curved field lines of the higher order modes can be eliminated by degeneracies which exist only for a square guide and not a circular one. In the square guide configuration, the PFSBs show good band selection and polarization efficiency to a level of about 90% over the beam out to at least 20° from on-axis.
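
    The degeneracy argument follows from the rectangular-waveguide cutoff formula f_c = (c/2) sqrt((m/a)^2 + (n/b)^2); a quick check (Python; the guide dimensions are arbitrary, not the instrument's):

      import numpy as np

      c = 3.0e8   # m/s

      def f_cutoff(m, n, a, b):
          """TE(m,n) cutoff frequency of an a-by-b rectangular waveguide."""
          return 0.5 * c * np.hypot(m / a, n / b)

      a = b = 5e-3   # 5 mm square guide
      for m, n in [(1, 0), (0, 1), (1, 1), (2, 1), (1, 2)]:
          print(f"TE{m}{n}: {f_cutoff(m, n, a, b) / 1e9:6.1f} GHz")
      # TE10/TE01 and TE21/TE12 pairs share a cutoff only when a == b.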

  3. Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loper, Susan A.; Sandusky, William F.

    2010-12-31

    Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to distribution of that total by agency and maybe distribution of the total by state. However, additional characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided, including information regarding the geographical distribution of sites, building counts and percentage of total by agency, distribution of sites and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock has changed for the Department of Energy from 2000 through 2008.

  4. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, together with an XML net-based software toolset providing comprehensive functionality for POIS development.
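
    A minimal token-game sketch (Python; the ordering workflow is a made-up example and plain place/transition nets, not XML nets) showing the firing semantics that Petri-net process models build on:

      # Minimal Petri net interpreter: places, transitions, and a marking.
      pre = {"approve": {"order_received": 1}, "ship": {"approved": 1, "stock": 1}}
      post = {"approve": {"approved": 1}, "ship": {"shipped": 1}}
      marking = {"order_received": 1, "stock": 2, "approved": 0, "shipped": 0}

      def enabled(t):
          """A transition is enabled when all its input places hold enough tokens."""
          return all(marking.get(p, 0) >= n for p, n in pre[t].items())

      def fire(t):
          """Consume input tokens and produce output tokens."""
          assert enabled(t), f"{t} is not enabled"
          for p, n in pre[t].items():
              marking[p] -= n
          for p, n in post[t].items():
              marking[p] = marking.get(p, 0) + n

      fire("approve")
      fire("ship")
      print(marking)   # {'order_received': 0, 'stock': 1, 'approved': 0, 'shipped': 1}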

  5. Toward the stereochemical identification of prohibited characterizing flavors in tobacco products: the case of strawberry flavor.

    PubMed

    Paschke, Meike; Hutzler, Christoph; Henkler, Frank; Luch, Andreas

    2015-08-01

    With the revision of the European Tobacco Products Directive (2014/40/EU), characterizing flavors such as strawberry, candy, vanillin or chocolate will be prohibited in cigarettes and fine-cut tobacco. Product surveillance will therefore require analytical means to define and subsequently detect selected characterizing flavors that are formed by supplemented flavors within the complex matrix tobacco. We have analyzed strawberry-flavored tobacco products as an example for characterizing fruit-like aroma. Using this approach, we looked into aroma components to find indicative patterns or features that can be used to satisfy obligatory product information as requested by the European Directive. Accordingly, a headspace solid-phase microextraction (HS-SPME) technique was developed and coupled to subsequent gas chromatography-mass spectrometry (GC/MS) to characterize different strawberry-flavored tobacco products (cigarettes, fine-cut tobacco, liquids for electronic cigarettes, snus, shisha tobacco) for their volatile additives. The results were compared with non-flavored, blend characteristic flavored and other fruity-flavored cigarettes, as well as fresh and dried strawberries. Besides different esters and aldehydes, the terpenes linalool, α-terpineol, nerolidol and limonene as well as the lactones γ-decalactone, γ-dodecalactone and γ-undecalactone could be verified as compounds sufficient to convey some sort of strawberry flavor to tobacco. Selected flavors, i.e., limonene, linalool, α-terpineol, citronellol, carvone and γ-decalactone, were analyzed further with respect to their stereoisomeric composition by using enantioselective HS-SPME-GC/MS. These experiments confirmed that individual enantiomers that differ in taste or physiological properties can be distinguished within the tobacco matrix. By comparing the enantiomeric composition of these compounds in the tobacco with that of fresh and dried strawberries, it can be concluded that non-natural strawberry aroma is usually used to produce strawberry-flavored tobacco products. Such authenticity control can become of interest particularly when manufacturers claim that natural sources were used for flavoring of products. Although the definition of characterizing flavors by analytical means remains challenging, specific compounds or features are required to be defined for routine screening of reported information. Clarifications by sensory testing might still be necessary, but could be limited to a few preselected samples.

  6. Coherent Doppler lidar for automated space vehicle rendezvous, stationkeeping and capture

    NASA Technical Reports Server (NTRS)

    Bilbro, James A.

    1991-01-01

    The inherent spatial resolution of laser radar makes ladar or lidar an attractive candidate for Automated Rendezvous and Capture applications. Previous applications were based on incoherent lidar techniques, requiring retro-reflectors on the target vehicle. Technology improvements (reduced size, no cryogenic cooling requirement) have greatly eased the construction of coherent lidar systems. Coherent lidar permits the acquisition of non-cooperative targets at ranges that are limited by the detection capability rather than by the signal-to-noise ratio (SNR) requirements. The sensor can provide translational state information (range, velocity, and angle) by direct measurement and, when used with an array detector, can also provide attitude information by Doppler imaging techniques. Identification of the target is accomplished by scanning with a high pulse repetition frequency (dependent on the SNR). The system performance is independent of range and should not be constrained by sun angle. An initial effort to characterize a multi-element detection system has resulted in a system that is expected to work to a minimum range of 1 meter. The system size, weight, and power requirements are dependent on the operating range; a 10 km range requires a diameter of 3 centimeters with overall size of 3 x 3 x 15 to 30 cm, while a 100 km range requires a 30 cm diameter.
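
    The velocity measurement rests on the Doppler relation f_d = 2v/lambda; a one-line check (Python; the wavelength and shift are assumed values, not the system's):

      # Line-of-sight velocity from the coherent-lidar Doppler shift.
      wavelength = 2.0e-6        # m, assumed for a solid-state coherent lidar
      f_doppler = 1.0e6          # Hz, measured beat-frequency shift (assumed)
      v = f_doppler * wavelength / 2
      print(f"line-of-sight velocity: {v:.2f} m/s")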

  7. An Analysis of the Potential Impacts of Ashton Carter’s Should-Cost Memorandum on Defense Contracting

    DTIC Science & Technology

    2012-09-17

    monopolistic contractors that match distinctive government requirements. Figure 4 portrays the differences found in pure competitive commercial markets... operate in quasi-competitive environments. Such arenas are characterized as oligopolistic or monopolistic markets. Table 6. Top 10 Companies by... practice will be used to build sufficient cost knowledge of those services within that market segment. You will employ that cost knowledge to inform

  8. Characterization of the General Electric CID-17 as a Detector for Plasma Emission Spectrometry.

    DTIC Science & Technology

    1985-11-25

    ...multiwavelength discrete detectors... photomultiplier tubes. With almost 100,000 channels, true multiwavelength detection is obtained, making a new wealth of information available to the analytical... The analysis of complex mixtures by optical emission spectrometry requires sensitive simultaneous multiwavelength detection. Until the present, this has been

  9. International Space Station (ISS) 3D Printer Performance and Material Characterization Methodology

    NASA Technical Reports Server (NTRS)

    Bean, Q. A.; Cooper, K. G.; Edmunson, J. E.; Johnston, M. M.; Werkheiser, M. J.

    2015-01-01

    In order for human exploration of the Solar System to be sustainable, manufacturing of necessary items on demand in space or on planetary surfaces will be a requirement. As a first step toward this goal, the 3D Printing In Zero-G (3D Print) technology demonstration produced the first items fabricated in space, on the International Space Station. From those items, and comparable prints made on the ground, information about the microgravity effects on the printing process can be determined. Lessons learned from this technology demonstration will be applicable to other in-space manufacturing technologies and may affect the terrestrial manufacturing industry as well. The flight samples were received at the George C. Marshall Space Flight Center on 6 April 2015. These samples will undergo a series of tests designed not only to thoroughly characterize the samples, but to identify microgravity effects manifested during printing by comparing their results to those of samples printed on the ground. Samples will be visually inspected, photographed, scanned with structured light, and analyzed with scanning electron microscopy. Selected samples will be analyzed with computed tomography; some will be assessed using ASTM standard tests. These tests will provide the information required to determine the effects of microgravity on 3D printing.

  10. An update on technical and methodological aspects for cardiac PET applications.

    PubMed

    Presotto, Luca; Busnardo, Elena; Gianolli, Luigi; Bettinardi, Valentino

    2016-12-01

    Positron emission tomography (PET) is indicated for a large number of cardiac diseases: perfusion and viability studies are commonly used to evaluate coronary artery disease; PET can also be used to assess sarcoidosis and endocarditis, as well as to investigate amyloidosis. Furthermore, a hot topic for research is plaque characterization. Most of these studies are technically very challenging. High count rates and short acquisition times characterize perfusion scans, while very small targets have to be imaged in inflammation/infection and plaque examinations. Furthermore, cardiac PET suffers from respiratory and cardiac motion blur. Each type of study has specific requirements from the technical and methodological point of view, so PET systems with overall high performance are required. Furthermore, in the era of hybrid PET/computed tomography (CT) and PET/magnetic resonance imaging (MRI) systems, the combination of complementary functional and anatomical information can be used to improve diagnosis and prognosis. Moreover, PET images can be qualitatively and quantitatively improved by exploiting information from the other modality, using advanced algorithms. In this review we report the latest technological and methodological innovations for PET cardiac applications, with particular reference to the state of the art of hybrid PET/CT and PET/MRI. We also report the most recent advancements in software, from reconstruction algorithms to image processing and analysis programs.

  11. Tools for the functional interpretation of metabolomic experiments.

    PubMed

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family, and it deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow characterizing larger sets of metabolites, automatic methods for analyzing these sets to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers doing metabolomic experiments in the near future.
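
    A common building block of annotation enrichment analysis is the hypergeometric test; a minimal sketch (Python with SciPy; all counts are invented for illustration):

      from scipy.stats import hypergeom

      # M annotated metabolites in the background, K of them in pathway P,
      # N metabolites detected in the experiment, k of those fall in P.
      M, K, N, k = 2000, 60, 100, 9

      # P-value: probability of seeing k or more pathway members by chance.
      p_value = hypergeom.sf(k - 1, M, K, N)
      fold = (k / N) / (K / M)
      print(f"fold enrichment {fold:.1f}, p = {p_value:.2e}")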

  12. CHAPTER: In-Situ Characterization of Stimulating Microelectrode Arrays: Study of an Idealized Structure Based on Argus II Retinal Implants. BOOK TITLE: Implantable Neural Prostheses 2: Techniques and Engineering Approaches, D.M. Zhou and E. Greenbaum, Eds., Springer, NY 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenbaum, Elias; Sanders, Charlene A; Kandagor, Vincent

    The development of a retinal prosthesis for artificial sight includes a study of the factors affecting the structural and functional stability of chronically implanted microelectrode arrays. Although neuron depolarization and propagation of electrical signals have been studied for nearly a century, the use of multielectrode stimulation as a proposed therapy to treat blindness is a frontier area of modern ophthalmology research. Mapping and characterizing the topographic information contained in the electric field potentials and understanding how this information is transmitted and interpreted in the visual cortex is still very much a work in progress. In order to characterize the electrical field patterns generated by the device, an in vitro prototype that mimics several of the physical and chemical parameters of the in vivo visual implant device was fabricated. We carried out multiple electrical measurements in a model 'eye,' beginning with a single electrode, followed by a 9-electrode array structure, both idealized components based on the Argus II retinal implants. Correlating the information contained in the topographic features of the electric fields with psychophysical testing in patients may help reduce the time required for patients to convert the electrical patterns into graphic signals.
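
    A rough sketch of mapping array field patterns (Python): superposing point-current-source potentials, V = I/(4*pi*sigma*r), for an idealized 3x3 array; the geometry, currents, and conductivity are invented, not the study's measured configuration:

      import numpy as np

      sigma = 1.5                                  # S/m, saline-like conductivity
      electrodes = [(x, y) for x in (-0.2, 0.0, 0.2) for y in (-0.2, 0.0, 0.2)]  # mm
      currents = [1e-6] * 9                        # A, equal stimulation currents

      xs, ys = np.meshgrid(np.linspace(-1, 1, 201), np.linspace(-1, 1, 201))  # mm grid
      z = 0.05                                     # mm, plane just above the array
      v = np.zeros_like(xs)
      for (ex, ey), i in zip(electrodes, currents):
          # Distance from each grid point to the electrode, converted to meters.
          r = np.sqrt((xs - ex) ** 2 + (ys - ey) ** 2 + z**2) * 1e-3
          v += i / (4 * np.pi * sigma * r)

      print(f"peak potential: {v.max() * 1e3:.2f} mV")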

  13. Enhanced project management tool

    NASA Technical Reports Server (NTRS)

    Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)

    2012-01-01

    A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.

  14. The fluctuating ribosome: thermal molecular dynamics characterized by neutron scattering

    NASA Astrophysics Data System (ADS)

    Zaccai, Giuseppe; Natali, Francesca; Peters, Judith; Řihová, Martina; Zimmerman, Ella; Ollivier, J.; Combet, J.; Maurel, Marie-Christine; Bashan, Anat; Yonath, Ada

    2016-11-01

    Conformational changes associated with ribosome function have been identified by X-ray crystallography and cryo-electron microscopy. These methods, however, inform poorly on timescales. Neutron scattering is well adapted for direct measurements of thermal molecular dynamics, the ‘lubricant’ for the conformational fluctuations required for biological activity. The method was applied to compare water dynamics and conformational fluctuations in the 30S and 50S ribosomal subunits from Haloarcula marismortui, under high salt, stable conditions. Similar free and hydration water diffusion parameters are found for both subunits. With respect to the 50S subunit, the 30S is characterized by a softer force constant and larger mean square displacements (MSD), which would facilitate conformational adjustments required for messenger and transfer RNA binding. It has been shown previously that systems from mesophiles and extremophiles are adapted to have similar MSD under their respective physiological conditions. This suggests that the results presented are not specific to halophiles in high salt but a general property of ribosome dynamics under corresponding, active conditions. The current study opens new perspectives for neutron scattering characterization of component functional molecular dynamics within the ribosome.
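    The "softer force constant" above is conventionally extracted from the temperature dependence of the MSD. A minimal sketch, assuming Zaccai's resilience convention k' = 2*kB/(d<u^2>/dT) (prefactor conventions vary with the MSD definition) and purely illustrative numbers, not data from the paper:

    ```python
    import numpy as np

    kB = 1.380649e-23  # Boltzmann constant, J/K

    def effective_force_constant(T, msd_A2):
        # Slope of MSD (Angstrom^2) versus temperature (K), converted to m^2/K;
        # the factor of 2 is one common convention and is an assumption here.
        slope = np.polyfit(T, msd_A2, 1)[0] * 1e-20
        return 2.0 * kB / slope  # N/m

    # Illustrative values only:
    T = np.array([280.0, 290.0, 300.0, 310.0])
    msd = np.array([1.10, 1.25, 1.42, 1.60])  # Angstrom^2
    print(f"k' ~ {effective_force_constant(T, msd):.2f} N/m")
    ```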

  15. A novel optical probe for pH sensing in gastro-esophageal apparatus

    NASA Astrophysics Data System (ADS)

    Baldini, F.; Ghini, G.; Giannetti, A.; Senesi, F.; Trono, C.

    2011-03-01

    Monitoring gastric pH for long periods, usually 24 h, may be essential in analyzing the physiological pattern of acidity, in obtaining information on changes in activity during peptic ulcer disease, and in assessing the effect of antisecretory drugs. Gastro-esophageal reflux, which causes a pH decrease in the esophageal content from pH 7 even down to pH 2, can lead to esophagitis with possible strictures and Barrett's esophagus. One of the difficulties of the optical measurement of pH in the gastro-esophageal apparatus lies in the required extended working range from 1 to 8 pH units. The present paper deals with a novel optical pH sensor, using methyl red as the optical pH indicator. Unlike conventional acid-base indicators, which are characterized by working ranges limited to 2-3 pH units, methyl red, after its covalent immobilization on controlled pore glass (CPG), is characterized by a wide working range that fits the clinical requirements. The novel probe design described here is suitable for gastro-esophageal applications and allows optimization of the performance of the CPG with the immobilized indicator. This leads to a very simple configuration characterized by a very fast response time.
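    For a single indicator, the optical readout follows a Henderson-Hasselbalch relation between measured absorbance and pH. A minimal sketch, assuming single-wavelength detection and an illustrative apparent pKa (immobilization on CPG shifts it from the solution value); this shows the basic readout principle only, not the extended-range response reported for the CPG-bound methyl red:

    ```python
    import math

    def ph_from_absorbance(A, A_acid, A_base, pKa):
        # A_acid and A_base are the absorbances of the fully protonated and
        # fully deprotonated forms; their ratio gives [In-]/[HIn].
        alpha = (A - A_acid) / (A_base - A)
        return pKa + math.log10(alpha)

    # Illustrative calibration values (hypothetical, not from the paper):
    print(f"pH = {ph_from_absorbance(A=0.42, A_acid=0.10, A_base=0.80, pKa=5.0):.2f}")
    ```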

  16. AACP Strategy for Addressing the Professional Development Needs of Department Chairs

    PubMed Central

    Rodriguez, Tobias E.; Weinstein, George; Sorofman, Bernard A.; Bosso, John A.; Kerr, Robert A.; Haden, N. Karl

    2012-01-01

    Objectives. Characterize the skills and abilities required for department chairs, identify development needs, and then create AACP professional development programs for chairs. Methods. A 30-question electronic survey was sent to AACP member department chairs related to aspects of chairing an academic department. Results. The survey identified development needs in the leadership, management, and personal abilities required for effective performance as department chair. The information was used to prioritize topics for subsequent AACP development programs. Subsequent programs conducted at AACP Interim and Annual Meetings were well attended and generally received favorable reviews from participants. A list of development resources was placed on the AACP website. Conclusions. This ongoing initiative is part of an AACP strategy to identify and address the professional development needs of department chairs. Survey results may also inform faculty members and other academic leaders about the roles and responsibilities of department chairs. PMID:22919099

  17. Transfer Learning for Activity Recognition: A Survey

    PubMed Central

    Cook, Diane; Feuz, Kyle D.; Krishnan, Narayanan C.

    2013-01-01

    Many intelligent systems that focus on the needs of a human require information about the activities being performed by the human. At the core of this capability is activity recognition, which is a challenging and well-researched problem. Activity recognition algorithms require substantial amounts of labeled training data yet need to perform well under very diverse circumstances. As a result, researchers have been designing methods to identify and utilize subtle connections between activity recognition datasets, or to perform transfer-based activity recognition. In this paper we survey the literature to highlight recent advances in transfer learning for activity recognition. We characterize existing approaches to transfer-based activity recognition by sensor modality, by differences between source and target environments, by data availability, and by type of information that is transferred. Finally, we present some grand challenges for the community to consider as this field is further developed. PMID:24039326

  18. International Implications of Labeling Foods Containing Engineered Nanomaterials.

    PubMed

    Grieger, Khara D; Hansen, Steffen Foss; Mortensen, Ninell P; Cates, Sheryl; Kowalcyk, Barbara

    2016-05-01

    To provide greater transparency and comprehensive information to consumers regarding their purchase choices, the European Parliament and the Council have mandated via Regulation 1169/2011 that foods containing engineered nanomaterials (ENMs) be labeled. This review covers the main concerns related to the use of ENMs in foods and the potential impacts that this type of food labeling might have on diverse stakeholder groups, including those outside the European Union (EU), e.g., in the United States. We also provide recommendations to stakeholders for overcoming existing challenges related to labeling foods containing ENMs. The revised EU food labeling requirements will likely result in a number of positive developments and a number of challenges for stakeholders in both EU and non-EU countries. Although labeling of foods containing ENMs will likely improve transparency, provide more information to facilitate consumer decisions, and build trust among food safety authorities and consumers, critical obstacles to the successful implementation of these labeling requirements remain, including the need for (i) harmonized information requirements or regulations between countries in different regions of the world, (ii) clarification of the regulatory definitions of the ENMs to be used for food labeling, (iii) robust techniques to detect, measure, and characterize diverse ENMs in food matrices, and (iv) clarification of the list of ENMs that may be exempt from labeling requirements, such as several food additives used for decades. We recommend that food industries and food safety authorities be more proactive in communicating with the public and consumer groups regarding the potential benefits and risks of using ENMs in foods. Efforts should be made to improve harmonization of information requirements between countries to avoid potential international trade barriers.

  19. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards associated with the exposure levels. Obtaining this information requires transversal studies and a number of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent applications in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  20. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards associated with the exposure levels. Obtaining this information requires transversal studies and a number of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent applications in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  1. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards associated with the exposure levels. Obtaining this information requires transversal studies and a number of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent applications in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices. PMID:26301216

  2. Creating NDA working standards through high-fidelity spent fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E

    2012-01-01

    The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods. This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent calculations performed using SCALE and MCNP. This work is supported by the Next Generation Safeguards Initiative, Office of Nuclear Safeguards and Security, National Nuclear Security Administration.

  3. A rapid approach for characterization of thiol-conjugated antibody-drug conjugates and calculation of drug-antibody ratio by liquid chromatography mass spectrometry.

    PubMed

    Firth, David; Bell, Leonard; Squires, Martin; Estdale, Sian; McKee, Colin

    2015-09-15

    We present the demonstration of a rapid "middle-up" liquid chromatography mass spectrometry (LC-MS)-based workflow for use in the characterization of thiol-conjugated maleimidocaproyl-monomethyl auristatin F (mcMMAF) and valine-citrulline-monomethyl auristatin E (vcMMAE) antibody-drug conjugates. Deconvoluted spectra were generated following a combination of deglycosylation, IdeS (immunoglobulin-degrading enzyme from Streptococcus pyogenes) digestion, and reduction steps that provide a visual representation of the product for rapid lot-to-lot comparison: a means to quickly assess the integrity of the antibody structure and the applied conjugation chemistry by mass. The relative abundance of the detected ions also offers information regarding differences in drug conjugation levels between samples, and the average drug-antibody ratio can be calculated. The approach requires little material (<100 μg) and, thus, is amenable to small-scale process development testing or as an early component of a complete characterization project, facilitating informed decision making regarding which aspects of a molecule might need to be examined in more detail by orthogonal methodologies.
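    The average drug-antibody ratio (DAR) follows from the relative abundances as an intensity-weighted mean over drug loads. A minimal sketch with hypothetical peak intensities; in a middle-up workflow the contribution of each antibody fragment is computed and combined, whereas a whole-antibody view is shown here for simplicity:

    ```python
    def average_dar(intensities_by_load):
        # Weighted mean of drug load, weighted by deconvoluted peak intensity.
        total = sum(intensities_by_load.values())
        return sum(load * i for load, i in intensities_by_load.items()) / total

    # Illustrative relative intensities for drug loads 0, 2, 4, 6, and 8:
    peaks = {0: 5.0, 2: 20.0, 4: 40.0, 6: 25.0, 8: 10.0}
    print(f"average DAR = {average_dar(peaks):.2f}")  # -> 4.30
    ```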

  4. Efficient search, mapping, and optimization of multi-protein genetic systems in diverse bacteria

    PubMed Central

    Farasat, Iman; Kushwaha, Manish; Collens, Jason; Easterbrook, Michael; Guido, Matthew; Salis, Howard M

    2014-01-01

    Developing predictive models of multi-protein genetic systems to understand and optimize their behavior remains a combinatorial challenge, particularly when measurement throughput is limited. We developed a computational approach to build predictive models and identify optimal sequences and expression levels, while circumventing combinatorial explosion. Maximally informative genetic system variants were first designed by the RBS Library Calculator, an algorithm to design sequences for efficiently searching a multi-protein expression space across a > 10,000-fold range with tailored search parameters and well-predicted translation rates. We validated the algorithm's predictions by characterizing 646 genetic system variants, encoded in plasmids and genomes, expressed in six gram-positive and gram-negative bacterial hosts. We then combined the search algorithm with system-level kinetic modeling, requiring the construction and characterization of 73 variants to build a sequence-expression-activity map (SEAMAP) for a biosynthesis pathway. Using model predictions, we designed and characterized 47 additional pathway variants to navigate its activity space, find optimal expression regions with desired activity response curves, and relieve rate-limiting steps in metabolism. Creating sequence-expression-activity maps accelerates the optimization of many protein systems and allows previous measurements to quantitatively inform future designs. PMID:24952589

  5. Characterization of deep coral and sponge communities in the Gulf of the Farallones National Marine Sanctuary: Rittenburg Bank, Cochrane Bank and the Farallon Escarpment.

    USGS Publications Warehouse

    Etnoyer, P.; Cochrane, Guy R.; Salgado, E.; Graiff, K.; Roletto, J.; Williams, G.J.; Reyna, K.; Hyland, J.

    2014-01-01

    Benthic surveys were conducted in the Gulf of the Farallones National Marine Sanctuary (GFNMS) aboard R/V Fulmar, October 3-11, 2012, using the large observation-class remotely operated vehicle (ROV) Beagle. The purpose of the surveys was to ground-truth mapping data collected in 2011 and to characterize the seafloor biota, particularly corals and sponges, in order to support Essential Fish Habitat designations under the Magnuson-Stevens Act (MSA) and other conservation and management goals under the National Marine Sanctuaries Act (NMSA). A total area of 25,416 sq. meters of sea floor was surveyed during 34 ROV transects. The overall research priorities were: (1) to locate and characterize deep-sea coral (DSC) and sponge habitats in priority areas; (2) to collect information to help understand the value of DSCs and sponges as reservoirs of biodiversity, or habitat for associated species, including commercially important fishes and invertebrates; (3) to assess the condition of DSC/sponge assemblages in relation to potential anthropogenic or environmental disturbances; and (4) to make this information available to support fisheries and sanctuary management needs under MSA and NMSA requirements.

  6. Lunar Dust Characterization Activity at GRC

    NASA Technical Reports Server (NTRS)

    Street, Kenneth W.

    2008-01-01

    The fidelity of lunar simulants as compared to actual regolith is evaluated using Figures of Merit (FOM) which are based on four criteria: Particle Size, Particle Shape, Composition, and Density of the bulk material. In practice, equipment testing will require other information about both the physical properties (mainly of the dust fraction) and composition as a function of particle size. At Glenn Research Center (GRC) we are involved in evaluating a number of simulant properties of consequence to testing of lunar equipment in a relevant environment, in order to meet Technology Readiness Level (TRL) 6 criteria. Bulk regolith has been characterized for many decades, but surprisingly little work has been done on the dust fraction (particles less than 20 micrometers in diameter). GRC is currently addressing the information shortfall by characterizing the following physical properties: Particle Size Distribution, Adhesion, Abrasivity, Surface Energy, Magnetic Susceptibility, Tribocharging and Surface Chemistry/Reactivity. Since some of these properties are also dependent on the size of the particles we have undertaken the construction of a six stage axial cyclone particle separator to fractionate dust into discrete particle size distributions for subsequent evaluation of these properties. An introduction to this work and progress to date will be presented.

  7. APNEA/WIT system nondestructive assay capability evaluation plan for select accessibly stored INEL RWMC waste forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, G.K.

    1997-01-01

    Bio-Imaging Research Inc. (BIR) and Lockheed Martin Specialty Components (LMSC) are engaged in a Program Research and Development Agreement and a Rapid Commercialization Initiative with the Department of Energy, EM-50. The agreement required BIR and LMSC to develop a data interpretation method that merges nondestructive assay and nondestructive examination (NDA/NDE) data and information sufficient to establish compliance with applicable National TRU Program (Program) waste characterization requirements and associated quality assurance performance criteria. This effort required an objective demonstration of the BIR and LMSC waste characterization systems in their standalone and integrated configurations. The goal of the test plan is to provide a mechanism from which evidence can be derived to substantiate nondestructive assay capability and utility statements for the BIR and LMSC systems. The plan must provide for the acquisition, compilation, and reporting of performance data, thereby allowing external independent agencies a basis for an objective evaluation of the standalone BIR and LMSC measurement systems, WIT and APNEA respectively, as well as the expected performance resulting from appropriate integration of the two systems. The evaluation is to be structured such that a statement regarding select INEL RWMC waste forms can be made in terms of compliance with applicable Program requirements and criteria.

  8. Nevada Test Site Environmental Report 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cathy A. Wills

    2006-10-01

    The Nevada Test Site Environmental Report 2005 (NTSER) was prepared to meet the information needs of the public and the requirements and guidelines of the U.S. Department of Energy (DOE) for annual site environmental reports. Its purpose is to (1) report compliance status with environmental standards and requirements, (2) present results of environmental monitoring of radiological and nonradiological effluents, (3) report estimated radiological doses to the public from releases of radioactive material, (4) summarize environmental incidents of noncompliance and actions taken in response to them, (5) describe the NTS Environmental Management System and characterize its performance, and (6) highlight significant environmental programs and efforts.

  9. The evolutionary basis of human social learning

    PubMed Central

    Morgan, T. J. H.; Rendell, L. E.; Ehn, M.; Hoppitt, W.; Laland, K. N.

    2012-01-01

    Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules. PMID:21795267

  10. The evolutionary basis of human social learning.

    PubMed

    Morgan, T J H; Rendell, L E; Ehn, M; Hoppitt, W; Laland, K N

    2012-02-22

    Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules.

  11. PATIENT'S RIGHT TO INFORMED CONSENT IN REPUBLIC SRPSKA: LEGAL AND ETHICAL ASPECTS (WITH SPECIAL REFERENCE TO PHYSICAL REHABILITATION).

    PubMed

    Milinkovic, Igor; Majstorovic, Biljana

    2014-12-01

    The principle of informed consent, which requires a patient's fully informed consent prior to medical treatment, is closely connected with the value of human dignity. The realization and protection of a patient's dignity is not possible without his/her right to choose the character and scope of medical treatment. This goal cannot be adequately achieved within the traditional model of medical paternalism characterized by the physician's authoritative position. The first part of the article deals with the content and ethical significance of the informed consent doctrine. The legal framework of informed consent in Republic Srpska (RS), one of the two Bosnia and Herzegovina (BH) entities, is analyzed. Special reference is made to the relevance of the informed consent principle within the physical rehabilitation process. Although ethical aspects of physical rehabilitation are often overlooked, this medical field possesses a strong ethical dimension (including an appropriate realization of the patient's right to informed consent).

  12. Attentional bias in clinical depression and anxiety: The impact of emotional and non-emotional distracting information.

    PubMed

    Lichtenstein-Vidne, L; Okon-Singer, H; Cohen, N; Todder, D; Aue, T; Nemets, B; Henik, A

    2017-01-01

    Both anxiety and major depressive disorder (MDD) have been reported to involve a maladaptive selective attention mechanism, associated with bias toward negative stimuli. Previous studies investigated attentional bias using distractors that required processing as part of task settings, and therefore, in our view, these distractors should be regarded as task-relevant. Here, we applied a unique task that used peripheral distractors presenting emotional and spatial information simultaneously. Notably, the emotional information was not associated in any way with the task, and thus was task-irrelevant. The spatial information, however, was task-relevant, as it corresponded with the task instructions. Corroborating previous findings, anxious patients showed attentional bias toward negative information. MDD patients showed no indication of this bias. Spatial information influenced all groups similarly. These results indicate that anxiety, but not MDD, is associated with an inherent negative information bias, further illustrating that the two closely related disorders are characterized by different processing patterns.

  13. Hanford Site Environmental Report for Calendar Year 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poston, Ted M.; Hanf, Robert W.; Duncan, Joanne P.

    This report is prepared annually for DOE and provides an overview of activities at the Hanford Site. The report summarizes environmental data that characterize Hanford Site environmental management performance. The report also highlights significant environmental and public protection programs and efforts. Although this report is primarily written to meet DOE reporting requirements and guidelines, it also provides useful summary information for the public, Indian tribes, public officials, regulatory agencies, and Hanford contractors.

  14. Intelligent multi-sensor integrations

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Jain, Ramesh; Weymouth, Terry

    1989-01-01

    Growth in the intelligence of space systems requires the use and integration of data from multiple sensors. Generic tools are being developed for extracting and integrating information obtained from multiple sources. The full spectrum of issues is addressed, ranging from data acquisition, to characterization of sensor data, to adaptive systems for utilizing the data. In particular, there are three major aspects to the project: multisensor processing, an adaptive approach to object recognition, and distributed sensor system integration.

  15. Stockpile Dismantlement Database Training Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-11-01

    This document, the Stockpile Dismantlement Database (SDDB) training materials, is designed to familiarize the user with the SDDB windowing system and the data entry steps for Component Characterization for Disposition. The foundation of information required for every part is depicted using numbered graphic and text steps. The individual entering data is led step by step through generic and specific examples. These training materials are intended to be supplements to individual on-the-job training.

  16. A practical approach to evidence-based dentistry: How to search for evidence to inform clinical decisions.

    PubMed

    Brignardello-Petersen, Romina; Carrasco-Labra, Alonso; Booth, H Austin; Glick, Michael; Guyatt, Gordon H; Azarpazhooh, Amir; Agoritsas, Thomas

    2014-12-01

    Knowing how to search for evidence that can inform clinical decisions is a fundamental skill for the practice of evidence-based dentistry. There are many available types of evidence-based resources, characterized by their degrees of coverage of preappraised or summarized evidence at varying levels of processing, from primary studies to systematic reviews and clinical guidelines. The practice of evidence-based dentistry requires familiarity with these resources. In this article, the authors describe the process of searching for evidence: defining the question, identifying the question's nature and main components, and selecting the study design that best addresses the question.

  17. Standard Isotherm Fit Information for Dry CO2 on Sorbents for 4-Bed Molecular Sieve

    NASA Technical Reports Server (NTRS)

    Cmarik, G. E.; Son, K. N.; Knox, J. C.

    2017-01-01

    Onboard the ISS, one of the systems tasked with removal of metabolic carbon dioxide (CO2) is a 4-bed molecular sieve (4BMS) system. In order to enable a 4-person mission to succeed, systems for removal of metabolic CO2 must reliably operate for several years while minimizing power, mass, and volume requirements. This minimization can be achieved through system redesign and/or changes to the separation material(s). A material screening process has identified the most reliable sorbent materials for the next 4BMS. Sorbent characterization will provide the information necessary to guide system design by providing inputs for computer simulations.
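    Isotherm characterization of this kind typically reduces to fitting an equilibrium model, such as the Toth isotherm, to measured loading-versus-pressure data. A hedged sketch; the model choice, units, and data points are illustrative, not the sorbent measurements behind this work:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def toth(P, qs, b, t):
        # Toth isotherm: loading (mol/kg) as a function of pressure (kPa).
        return qs * b * P / (1.0 + (b * P) ** t) ** (1.0 / t)

    # Illustrative equilibrium data:
    P = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])   # kPa
    q = np.array([0.4, 1.1, 1.6, 2.9, 3.4, 4.2])     # mol/kg

    (qs, b, t), _ = curve_fit(toth, P, q, p0=[5.0, 1.0, 0.5], maxfev=10000)
    print(f"qs={qs:.2f} mol/kg, b={b:.3f} 1/kPa, t={t:.3f}")
    ```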

  18. Lidar Measurements for Desert Dust Characterization: An Overview

    NASA Technical Reports Server (NTRS)

    Mona, L.; Liu, Z.; Mueller, D.; Omar, A.; Papayannis, A.; Pappalardo, G.; Sugimoto, N.; Vaughan, M.

    2012-01-01

    We provide an overview of light detection and ranging (lidar) capability for describing and characterizing desert dust. This paper summarizes lidar techniques, observations, and outcomes of desert dust lidar measurements. The main objective is to provide the scientific community, including non-practitioners of lidar observations, with a reference paper on dust lidar measurements. In particular, it will fill the current gap in communication between the research-oriented lidar community and potential desert dust data users, such as air quality monitoring agencies and aviation advisory centers. The current capability of the different lidar techniques for the characterization of aerosol in general, and desert dust in particular, is presented. Technical aspects and required assumptions of these techniques are discussed, providing readers with the pros and cons of each technique. Information about desert dust collected to date using lidar techniques is reviewed. Lidar techniques for aerosol characterization have a maturity level appropriate for addressing air quality and transportation issues, as demonstrated by the first results reported in this paper.

  19. Statistical Characterization of the Chandra Source Catalog

    NASA Astrophysics Data System (ADS)

    Primini, Francis A.; Houck, John C.; Davis, John E.; Nowak, Michael A.; Evans, Ian N.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2011-06-01

    The first release of the Chandra Source Catalog (CSC) contains ~95,000 X-ray sources in a total area of 0.75% of the entire sky, using data from ~3900 separate ACIS observations of a multitude of different types of X-ray sources. In order to maximize the scientific benefit of such a large, heterogeneous data set, careful characterization of the statistical properties of the catalog, i.e., completeness, sensitivity, false source rate, and accuracy of source properties, is required. Characterization efforts of other large Chandra catalogs, such as the ChaMP Point Source Catalog or the 2 Mega-second Deep Field Surveys, while informative, cannot serve this purpose, since the CSC analysis procedures are significantly different and the range of allowable data is much less restrictive. We describe here the characterization process for the CSC. This process includes both a comparison of real CSC results with those of other, deeper Chandra catalogs of the same targets and extensive simulations of blank-sky and point-source populations.
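    Completeness characterization from simulations amounts to binning injected point sources by flux and counting the fraction recovered. A minimal sketch under that assumption, with synthetic fluxes and a toy detection probability standing in for the real detection pipeline:

    ```python
    import numpy as np

    def completeness_curve(injected_flux, detected, bins):
        # Fraction of injected sources recovered in each flux bin.
        total, _ = np.histogram(injected_flux, bins=bins)
        found, _ = np.histogram(injected_flux[detected], bins=bins)
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(total > 0, found / total, np.nan)

    rng = np.random.default_rng(0)
    flux = 10 ** rng.uniform(-15.0, -12.0, 5000)   # erg/cm^2/s, synthetic
    p_detect = np.clip((np.log10(flux) + 15.0) / 2.0, 0.0, 1.0)
    detected = rng.random(flux.size) < p_detect
    bins = np.logspace(-15, -12, 10)
    print(np.round(completeness_curve(flux, detected, bins), 2))
    ```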

  20. Characterization of NIST food-matrix Standard Reference Materials for their vitamin C content.

    PubMed

    Thomas, Jeanice B; Yen, James H; Sharpless, Katherine E

    2013-05-01

    The vitamin C concentrations in three food-matrix Standard Reference Materials (SRMs) from the National Institute of Standards and Technology (NIST) have been determined by liquid chromatography (LC) with absorbance detection. These materials (SRM 1549a Whole Milk Powder, SRM 1849a Infant/Adult Nutritional Formula, and SRM 3233 Fortified Breakfast Cereal) have been characterized to support analytical measurements made by food processors that are required to provide information about their products' vitamin C content on the labels of products distributed in the United States. The SRMs are primarily intended for use in validating analytical methods for the determination of selected vitamins, elements, fatty acids, and other nutrients in these materials and in similar matrixes. They can also be used for quality assurance in the characterization of test samples or in-house control materials, and for establishing measurement traceability. Within-day precision of the LC method used to measure vitamin C in the food-matrix SRMs characterized in this study ranged from 2.7% to 6.5%.

  1. Principal Physicochemical Methods Used to Characterize Dendrimer Molecule Complexes Used as Genetic Therapy Agents, Nanovaccines or Drug Carriers.

    PubMed

    Alberto, Rodríguez Fonseca Rolando; Joao, Rodrigues; de Los Angeles, Muñoz-Fernández María; Alberto, Martínez Muñoz; Manuel Jonathan, Fragoso Vázquez; José, Correa Basurto

    2017-08-30

    Nanomedicine is the application of nanotechnology to medicine. This field is related to the study of nanodevices and nanomaterials applied to various medical uses, such as improving the pharmacological properties of different molecules. Dendrimers are synthetic nanoparticles whose physicochemical properties vary according to their chemical structure. These molecules have been extensively investigated as drug nanocarriers to improve drug solubility and as sustained-release systems. New therapies such as gene therapy and the development of nanovaccines can be improved by the use of dendrimers. The biophysical and physicochemical characterization of nucleic acid/peptide-dendrimer complexes is crucial to identify their functional properties prior to biological evaluation. In that sense, it is necessary to first identify whether the peptide-dendrimer or nucleic acid-dendrimer complexes can be formed and whether the complex can dissociate under the appropriate conditions at the target cells. In addition, biophysical and physicochemical characterization is required to determine how long the complexes remain stable, what proportion of peptide or nucleic acid is required to form the complex or saturate the dendrimer, and the size of the complex formed. In this review, we present the latest information on characterization systems for dendrimer-nucleic acid, dendrimer-peptide and dendrimer-drug complexes with several biotechnological and pharmacological applications.

  2. Design and characterization of a linear Hencken-type burner

    NASA Astrophysics Data System (ADS)

    Campbell, M. F.; Bohlin, G. A.; Schrader, P. E.; Bambha, R. P.; Kliewer, C. J.; Johansson, K. O.; Michelsen, H. A.

    2016-11-01

    We have designed and constructed a Hencken-type burner that produces a 38-mm-long linear laminar partially premixed co-flow diffusion flame. This burner was designed to produce a linear flame for studies of soot chemistry, combining the benefit of the conventional Hencken burner's laminar flames with the advantage of the slot burner's geometry for optical measurements requiring a long interaction distance. It is suitable for measurements using optical imaging diagnostics, line-of-sight optical techniques, or off-axis optical-scattering methods requiring either a long or short path length through the flame. This paper presents details of the design and operation of this new burner. We also provide characterization information for flames produced by this burner, including relative flow-field velocities obtained using hot-wire anemometry, temperatures along the centerline extracted using direct one-dimensional coherent Raman imaging, soot volume fractions along the centerline obtained using laser-induced incandescence and laser extinction, and transmission electron microscopy images of soot thermophoretically sampled from the flame.

  3. Two-Color Nonlinear Spectroscopy for the Rapid Acquisition of Coherent Dynamics.

    PubMed

    Senlik, S Seckin; Policht, Veronica R; Ogilvie, Jennifer P

    2015-07-02

    There has been considerable recent interest in the observation of coherent dynamics in photosynthetic systems by 2D electronic spectroscopy (2DES). In particular, coherences that persist during the "waiting time" in a 2DES experiment have been attributed to electronic, vibrational, and vibronic origins in various systems. The typical method for characterizing these coherent dynamics requires the acquisition of 2DES spectra as a function of waiting time, essentially a 3DES measurement. Such experiments require lengthy data acquisition times that degrade the signal-to-noise of the recorded coherent dynamics. We present a rapid and high signal-to-noise pulse-shaping-based approach for the characterization of coherent dynamics. Using chlorophyll a, we demonstrate that this method retains much of the information content of a 3DES measurement and provides insight into the physical origin of the coherent dynamics, distinguishing between ground and excited state coherences. It also enables high resolution determination of ground and excited state frequencies.
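    Characterizing waiting-time coherences generally comes down to removing the slow population decay and Fourier transforming the residual oscillations. A minimal sketch of that generic step; the polynomial detrending and the synthetic 750 cm^-1 trace are assumptions for illustration, not the authors' pulse-shaping method:

    ```python
    import numpy as np

    def coherence_spectrum(t2_fs, signal):
        # Subtract a smooth population-decay estimate (quadratic fit), window
        # the residual, and return its amplitude spectrum in wavenumbers.
        resid = signal - np.polyval(np.polyfit(t2_fs, signal, 2), t2_fs)
        resid *= np.hanning(resid.size)
        dt_s = (t2_fs[1] - t2_fs[0]) * 1e-15
        freqs_cm1 = np.fft.rfftfreq(resid.size, dt_s) / 2.998e10
        return freqs_cm1, np.abs(np.fft.rfft(resid))

    # Synthetic trace: a 750 cm^-1 oscillation on an exponential background.
    t2 = np.arange(0.0, 2000.0, 10.0)                      # fs
    w = 2 * np.pi * 750 * 2.998e10 * 1e-15                 # rad/fs
    sig = np.exp(-t2 / 1500.0) + 0.05 * np.cos(w * t2)
    f, amp = coherence_spectrum(t2, sig)
    sel = f > 300  # skip low-frequency leakage from imperfect detrending
    print(f"peak near {f[sel][np.argmax(amp[sel])]:.0f} cm^-1")
    ```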

  4. ORNL Remedial Action Program strategy (FY 1987-FY 1992)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trabalka, J.R.; Myrick, T.E.

    1987-12-01

    Over 40 years of Oak Ridge National Laboratory (ORNL) operations have produced a diverse legacy of contaminated inactive facilities, research areas, and waste disposal areas that are potential candidates for remedial action. The ORNL Remedial Action Program (RAP) represents a comprehensive effort to meet new regulatory requirements and ensure adequate protection of on-site workers, the public, and the environment by providing appropriate corrective measures at over 130 sites contaminated historically with radioactive, hazardous chemical, or mixed wastes. A structured path of program planning, site characterization, alternatives assessment, technology development, engineering design, continued site maintenance and surveillance, interim corrective action, and eventual site closure or decommissioning is required to meet these objectives. This report documents the development of the Remedial Action Program, through its preliminary characterization, regulatory interface, and strategy development activities. It provides recommendations for a comprehensive, long-term strategy consistent with existing technical, institutional, and regulatory information, along with a six-year plan for achieving its initial objectives. 53 refs., 8 figs., 12 tabs.

  5. Rheology of corn stover slurries during fermentation to ethanol

    NASA Astrophysics Data System (ADS)

    Ghosh, Sanchari; Epps, Brenden; Lynd, Lee

    2017-11-01

    In typical processes that convert cellulosic biomass into ethanol fuel, solubilization of the biomass is carried out by saccharolytic enzymes; however, these enzymes require an expensive pretreatment step to make the biomass accessible for solubilization (and subsequent fermentation). We have proposed a potentially less expensive approach using the bacterium Clostridium thermocellum, which can initiate fermentation without pretreatment. Moreover, we have proposed a ``cotreatment'' process, in which fermentation and mechanical milling occur alternately so as to achieve the highest ethanol yield for the least milling energy input. In order to inform the energetic requirements of cotreatment, we experimentally characterized the rheological properties of corn stover slurries at various stages of fermentation. Results show that a corn stover slurry is a yield stress fluid, with shear thinning behavior well described by a power law model. Viscosity decreases dramatically upon fermentation, controlling for variables such as solids concentration and particle size distribution. To the authors' knowledge, this is the first study to characterize the changes in the physical properties of biomass during fermentation by a thermophilic bacterium.
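    A yield stress combined with power-law shear thinning is the Herschel-Bulkley form, tau = tau_y + K * gamma_dot**n. A minimal fitting sketch with illustrative stress data (not the corn stover measurements reported here):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def herschel_bulkley(gamma_dot, tau_y, K, n):
        # Shear stress (Pa) vs shear rate (1/s) for a yield-stress,
        # shear-thinning (n < 1) fluid.
        return tau_y + K * gamma_dot ** n

    # Illustrative rheometer data:
    gamma_dot = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])   # 1/s
    tau = np.array([12.5, 14.0, 15.2, 19.8, 22.6, 31.0])     # Pa

    (tau_y, K, n), _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[10.0, 2.0, 0.5])
    print(f"tau_y={tau_y:.1f} Pa, K={K:.2f} Pa.s^n, n={n:.2f}")
    ```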

  6. Investigating Bacterial-Animal Symbioses with Light Sheet Microscopy

    PubMed Central

    Taormina, Michael J.; Jemielita, Matthew; Stephens, W. Zac; Burns, Adam R.; Troll, Joshua V.; Parthasarathy, Raghuveer; Guillemin, Karen

    2014-01-01

    Microbial colonization of the digestive tract is a crucial event in vertebrate development, required for maturation of host immunity and establishment of normal digestive physiology. Advances in genomic, proteomic, and metabolomic technologies are providing a more detailed picture of the constituents of the intestinal habitat, but these approaches lack the spatial and temporal resolution needed to characterize the assembly and dynamics of microbial communities in this complex environment. We report the use of light sheet microscopy to provide high resolution imaging of bacterial colonization of the zebrafish intestine. The methodology allows us to characterize bacterial population dynamics across the entire organ and the behaviors of individual bacterial and host cells throughout the colonization process. The large four-dimensional datasets generated by these imaging approaches require new strategies for image analysis. When integrated with other “omics” datasets, information about the spatial and temporal dynamics of microbial cells within the vertebrate intestine will provide new mechanistic insights into how microbial communities assemble and function within hosts. PMID:22983029

  7. Characterization and prediction of chemical functions and weight fractions in consumer products.

    PubMed

    Isaacs, Kristin K; Goldsmith, Michael-Rock; Egeghy, Peter; Phillips, Katherine; Brooks, Raina; Hong, Tao; Wambaugh, John F

    2016-01-01

    Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.
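    As a sketch of the classification step, a generic descriptor-based classifier for one function category might look as follows; the random-forest choice, descriptor count, and synthetic data are stand-ins for illustration, not the authors' exact QSPR models:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for chemical-specific descriptors (e.g., logP,
    # molecular weight, functional-group counts) and a binary label for one
    # harmonized function category such as "plasticizer".
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 12))                 # 500 chemicals x 12 descriptors
    y = (X[:, 0] + 0.5 * X[:, 3] > 0.5).astype(int)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(f"5-fold CV accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")
    ```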

  8. Characterization and Prediction of Chemical Functions and Weight Fractions in Consumer Products

    EPA Pesticide Factsheets

    Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.

  9. Characterizing monoclonal antibody formulations in arginine glutamate solutions using 1H NMR spectroscopy

    PubMed Central

    Kheddo, Priscilla; Cliff, Matthew J.; Uddin, Shahid; van der Walle, Christopher F.; Golovanov, Alexander P.

    2016-01-01

    Assessing how excipients affect the self-association of monoclonal antibodies (mAbs) requires informative and direct in situ measurements for highly concentrated solutions, without sample dilution or perturbation. This study explores the application of solution nuclear magnetic resonance (NMR) spectroscopy for characterization of typical mAb behavior in formulations containing arginine glutamate. The data show that the analysis of signal intensities in 1D 1H NMR spectra, when compensated for changes in buffer viscosity, is invaluable for identifying conditions where protein-protein interactions are minimized. NMR-derived molecular translational diffusion rates for concentrated solutions are less useful than transverse relaxation rates as parameters defining optimal formulation. Furthermore, NMR reports on the solution viscosity and mAb aggregation during accelerated stability study assessment, generating data consistent with that acquired by size-exclusion chromatography. The methodology developed here offers NMR spectroscopy as a new tool providing complementary information useful to formulation development of mAbs and other large therapeutic proteins. PMID:27589351

  10. Assigning uncertainties in the inversion of NMR relaxation data.

    PubMed

    Parker, Robert L; Song, Yi-Qaio

    2005-06-01

    Recovering the relaxation-time density function (or distribution) from NMR decay records requires inverting a Laplace transform based on noisy data, an ill-posed inverse problem. An important objective in the face of the consequent ambiguity in the solutions is to establish what reliable information is contained in the measurements. To this end we describe how upper and lower bounds on linear functionals of the density function, and ratios of linear functionals, can be calculated using optimization theory. Those bounded quantities cover most of those commonly used in geophysical NMR, such as porosity, T2 log-mean, and bound-fluid volume fraction, and include averages over any finite interval of the density function itself. In the theory presented, statistical considerations enter to account for the presence of significant noise in the signal, but not in a prior characterization of density models. Our characterization of the uncertainties is conservative and informative; it will have wide application in geophysical NMR and elsewhere.
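    For context, this ill-posed inversion is commonly attacked as a regularized nonnegative least-squares problem over a grid of relaxation times. A minimal sketch of that standard approach (Tikhonov regularization plus NNLS); note that the paper's contribution is different, bounding linear functionals of the density rather than selecting one regularized solution:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def invert_t2(t, decay, T2_grid, alpha=0.1):
        # Solve min ||K f - d||^2 + alpha^2 ||f||^2 subject to f >= 0,
        # where K[i, j] = exp(-t[i] / T2[j]).
        K = np.exp(-t[:, None] / T2_grid[None, :])
        A = np.vstack([K, alpha * np.eye(T2_grid.size)])
        b = np.concatenate([decay, np.zeros(T2_grid.size)])
        f, _ = nnls(A, b)
        return f

    # Synthetic two-component decay with noise:
    t = np.linspace(0.001, 2.0, 200)                     # s
    decay = 0.6 * np.exp(-t / 0.05) + 0.4 * np.exp(-t / 0.5)
    decay += 0.01 * np.random.default_rng(2).normal(size=t.size)
    T2_grid = np.logspace(-3, 1, 60)
    f = invert_t2(t, decay, T2_grid)
    print(f"dominant T2 ~ {T2_grid[f.argmax()]:.3f} s")
    ```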

  11. Characterizing monoclonal antibody formulations in arginine glutamate solutions using 1H NMR spectroscopy.

    PubMed

    Kheddo, Priscilla; Cliff, Matthew J; Uddin, Shahid; van der Walle, Christopher F; Golovanov, Alexander P

    2016-10-01

    Assessing how excipients affect the self-association of monoclonal antibodies (mAbs) requires informative and direct in situ measurements for highly concentrated solutions, without sample dilution or perturbation. This study explores the application of solution nuclear magnetic resonance (NMR) spectroscopy for characterization of typical mAb behavior in formulations containing arginine glutamate. The data show that the analysis of signal intensities in 1D 1H NMR spectra, when compensated for changes in buffer viscosity, is invaluable for identifying conditions where protein-protein interactions are minimized. NMR-derived molecular translational diffusion rates for concentrated solutions are less useful than transverse relaxation rates as parameters defining optimal formulation. Furthermore, NMR reports on the solution viscosity and mAb aggregation during accelerated stability study assessment, generating data consistent with that acquired by size-exclusion chromatography. The methodology developed here offers NMR spectroscopy as a new tool providing complementary information useful to formulation development of mAbs and other large therapeutic proteins.

  12. Revealing the planar chemistry of two-dimensional heterostructures at the atomic level.

    PubMed

    Chou, Harry; Ismach, Ariel; Ghosh, Rudresh; Ruoff, Rodney S; Dolocan, Andrei

    2015-06-23

    Two-dimensional (2D) atomic crystals and their heterostructures are an intense area of study owing to their unique properties that result from structural planar confinement. Intrinsically, the performance of a planar vertical device is linked to the quality of its 2D components and their interfaces, therefore requiring characterization tools that can reveal both its planar chemistry and morphology. Here, we propose a characterization methodology combining (micro-) Raman spectroscopy, atomic force microscopy and time-of-flight secondary ion mass spectrometry to provide structural information, morphology and planar chemical composition at virtually the atomic level, aimed specifically at studying 2D vertical heterostructures. As an example system, a graphene-on-h-BN heterostructure is analysed to reveal, with an unprecedented level of detail, the subtle chemistry and interactions within its layer structure that can be assigned to specific fabrication steps. Such detailed chemical information is of crucial importance for the complete integration of 2D heterostructures into functional devices.

  13. Establishment of a reference collection of additives and an analytical handbook of reference data to support enforcement of EU regulations on food contact plastics.

    PubMed

    van Lierop, B; Castle, L; Feigenbaum, A; Ehlert, K; Boenke, A

    1998-10-01

    A collection has been made of additives that are required as analytical standards for enforcement of European Union legislation on food contact plastics. The 100 additives have been characterized by mass spectrometry, infra-red spectroscopy and proton nuclear magnetic resonance spectroscopy to provide reference spectra. Gas chromatographic retention times have been recorded to facilitate identification by retention index. This information has been further supplemented by physico-chemical data. Finally, chromatographic methods have been used to indicate the presence of any impurities in the commercial chemicals. Samples of the reference substances are available on request and the collection of spectra and other information will be made available in printed format and on-line through the Internet. This paper gives an overview of the work done to establish the reference collection and the spectral atlas, which together will assist enforcement laboratories in the characterization of plastics and the selection of analytical methods for additives that may migrate.

  14. Scaling of an information system in a public healthcare market--infrastructuring from the vendor's perspective.

    PubMed

    Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese

    2013-05-01

    The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult and slow. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  15. Pupillometric evidence for the decoupling of attention from perceptual input during offline thought.

    PubMed

    Smallwood, Jonathan; Brown, Kevin S; Tipper, Christine; Giesbrecht, Barry; Franklin, Michael S; Mrazek, Michael D; Carlson, Jean M; Schooler, Jonathan W

    2011-03-25

    Accumulating evidence suggests that the brain can efficiently process both external and internal information. The processing of internal information is a distinct "offline" cognitive mode that requires not only spontaneously generated mental activity but also, it has been hypothesized, a decoupling of attention from perception in order to separate competing streams of internal and external information. This process of decoupling is potentially adaptive because it could prevent unimportant external events from disrupting an internal train of thought. Here, we use measurements of pupil diameter (PD) to provide concrete evidence for the role of decoupling during spontaneous cognitive activity. First, during periods conducive to offline thought but not during periods of task focus, PD exhibited spontaneous activity decoupled from task events. Second, periods requiring external task focus were characterized by large task-evoked changes in PD; in contrast, encoding failures were preceded by episodes of high spontaneous baseline PD activity. Finally, high spontaneous PD activity also occurred prior to only the slowest 20% of correct responses, suggesting that high baseline PD indexes a distinct mode of cognitive functioning. Together, these data are consistent with the decoupling hypothesis, which suggests that the capacity for spontaneous cognitive activity depends upon minimizing disruptions from the external world.

  16. Construction of a groundwater-flow model for the Big Sioux Aquifer using airborne electromagnetic methods, Sioux Falls, South Dakota

    USGS Publications Warehouse

    Valder, Joshua F.; Delzer, Gregory C.; Carter, Janet M.; Smith, Bruce D.; Smith, David V.

    2016-09-28

    The city of Sioux Falls is the fastest growing community in South Dakota. In response to this continued growth and planning for future development, Sioux Falls requires a sustainable supply of municipal water. Planning and managing sustainable groundwater supplies requires a thorough understanding of local groundwater resources. The Big Sioux aquifer consists of glacial outwash sands and gravels and is hydraulically connected to the Big Sioux River, which provided about 90 percent of the city’s source-water production in 2015. Managing sustainable groundwater supplies also requires an understanding of groundwater availability. An effective mechanism to inform water management decisions is the development and utilization of a groundwater-flow model. A groundwater-flow model provides a quantitative framework for synthesizing field information and conceptualizing hydrogeologic processes. These groundwater-flow models can support decision making processes by mapping and characterizing the aquifer. Accordingly, the city of Sioux Falls partnered with the U.S. Geological Survey to construct a groundwater-flow model. Model inputs will include data from advanced geophysical techniques, specifically airborne electromagnetic methods.
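
    As a pointer to what such a model computes, here is a one-line application of Darcy's law, the constitutive relation at the core of any groundwater-flow model; the conductivity and gradient values below are generic placeholders for a glacial-outwash aquifer, not data from this study.

        # Darcy's law: specific discharge q = K * (head drop / distance), in m/d.
        def darcy_flux(conductivity_m_d, head_drop_m, distance_m):
            return conductivity_m_d * head_drop_m / distance_m

        # Hypothetical outwash sand: K ~ 30 m/d, 0.5 m head loss over 250 m.
        print(f"{darcy_flux(30.0, 0.5, 250.0):.3f} m/d")  # ~0.060 m/d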

  17. Fast or Frugal, but Not Both: Decision Heuristics Under Time Pressure

    PubMed Central

    2017-01-01

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances in which a heuristic best applies. PMID:28557503
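
    To make the contrast concrete, below is a minimal sketch of the two heuristics over binary cue profiles (cue values hypothetical, assumed pre-sorted from most to least valid). Tallying uses all cues; Take-the-Best stops at the first cue that discriminates, which is frugal in information but, as the study argues, not necessarily fast to execute under time pressure.

        # Tallying vs. Take-the-Best over binary cue vectors (illustrative only).

        def tallying(cues_a, cues_b):
            """Choose the option with more positive cues; uses all information."""
            diff = sum(cues_a) - sum(cues_b)
            return "A" if diff > 0 else "B" if diff < 0 else "tie"

        def take_the_best(cues_a, cues_b):
            """Search cues in validity order; decide on the first discriminating cue."""
            for a, b in zip(cues_a, cues_b):
                if a != b:
                    return "A" if a > b else "B"
            return "tie"

        a = [1, 0, 1, 0]  # option A's cue profile (most valid cue first)
        b = [0, 1, 1, 1]  # option B's cue profile
        print(tallying(a, b))       # B: two vs. three positive cues
        print(take_the_best(a, b))  # A: the most valid cue already discriminates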

  18. Fast or frugal, but not both: Decision heuristics under time pressure.

    PubMed

    Bobadilla-Suarez, Sebastian; Love, Bradley C

    2018-01-01

    Heuristics are simple, yet effective, strategies that people use to make decisions. Because heuristics do not require all available information, they are thought to be easy to implement and to not tax limited cognitive resources, which has led heuristics to be characterized as fast-and-frugal. We question this monolithic conception of heuristics by contrasting the cognitive demands of two popular heuristics, Tallying and Take-the-Best. We contend that heuristics that are frugal in terms of information usage may not always be fast because of the attentional control required to implement this focus in certain contexts. In support of this hypothesis, we find that Take-the-Best, while being more frugal in terms of information usage, is slower to implement and fares worse under time pressure manipulations than Tallying. This effect is then reversed when search costs for Take-the-Best are reduced by changing the format of the stimuli. These findings suggest that heuristics are heterogeneous and should be unpacked according to their cognitive demands to determine the circumstances in which a heuristic best applies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. Man as the main component of the closed ecological system of the spacecraft or planetary station.

    PubMed

    Parin, V V; Adamovich, B A

    1968-01-01

    Current life-support systems of the spacecraft provide human requirements for food, water and oxygen only. Advanced life-support systems will involve man as their main component and will ensure completely his material and energy requirements. The design of individual components of such systems will assure their entire suitability and mutual control effects. Optimization of the performance of the crew and ecological system, on the basis of the information characterizing their function, demands efficient methods of collection and treatment of the information obtained through wireless recording of physiological parameters and their automatic treatment. Peculiarities of interplanetary missions and planetary stations make it necessary to conform the schedule of physiological recordings with the work-and-rest cycle of the space crew and inertness of components of the ecological system, especially of those responsible for oxygen regeneration. It is rational to model ecological systems and their components, taking into consideration the correction effect of the information on the health conditions and performance of the crewmen. Wide application of physiological data will allow the selection of optimal designs and sharply increase reliability of ecological systems.

  20. High pressure phase transformations revisited

    NASA Astrophysics Data System (ADS)

    Levitas, Valery I.

    2018-04-01

    High pressure phase transformations play an important role in the search for new materials and material synthesis, as well as in geophysics. However, they are poorly characterized, and phase transformation pressure and pressure hysteresis vary drastically in experiments of different researchers, with different pressure transmitting media, and with different material suppliers. Here we review the current state, the challenges in studying phase transformations under high pressure, and possible ways of overcoming those challenges. This field is critically compared with the fields of phase transformations under normal pressure in steels and shape memory alloys, as well as plastic deformation of materials. The main reason for the above-mentioned discrepancy is the lack of understanding that there is a fundamental difference between pressure-induced transformations under hydrostatic conditions, stress-induced transformations under nonhydrostatic conditions below yield, and strain-induced transformations during plastic flow. Each of these types of transformations has different mechanisms and requires a completely different thermodynamic and kinetic description and experimental characterization. In comparison with other fields, the following challenges are identified for high pressure phase transformations: (a) initial and evolving microstructure is not included in the characterization of transformations; (b) continuum theory is poorly developed; (c) heterogeneous stress and strain fields in experiments are not determined, which leads to confusing material transformational properties with system behavior. Some ways to advance the field of high pressure phase transformations are suggested. The key points are: (a) to take into account plastic deformations and microstructure evolution during transformations; (b) to formulate phase transformation criteria and kinetic equations in terms of stress and plastic strain tensors (instead of pressure alone); (c) to develop multiscale continuum theories; and (d) to couple experimental, theoretical, and computational studies of the behavior of a tested sample to extract information about the fields of stress and strain tensors, the concentration of the high pressure phase, and transformation criteria and kinetics. The ideal characterization should contain the complete information required for simulation of the same experiments.
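
    The distinction drawn here between pressure and the full stress tensor can be made concrete with a small numerical example (hypothetical stress states, units of GPa): two experiments at identical hydrostatic pressure can impose very different shear, which is exactly the information lost when transformations are characterized by pressure alone.

        import numpy as np

        def decompose(sigma):
            """Split a Cauchy stress tensor into pressure and a von Mises shear measure."""
            p = -np.trace(sigma) / 3.0       # mean pressure (compression positive)
            dev = sigma + p * np.eye(3)      # deviatoric (shear) part
            von_mises = np.sqrt(1.5 * np.sum(dev * dev))
            return p, von_mises

        hydrostatic = -10.0 * np.eye(3)                 # purely hydrostatic, 10 GPa
        nonhydrostatic = np.diag([-13.0, -10.0, -7.0])  # same mean stress, sheared
        print(decompose(hydrostatic))     # (10.0, 0.0)
        print(decompose(nonhydrostatic))  # (10.0, ~5.2): same p, nonzero shear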

  1. Characterization of temporal coherence of hard X-ray free-electron laser pulses with single-shot interferograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osaka, Taito; Hirano, Takashi; Morioka, Yuki

    Temporal coherence is one of the most fundamental characteristics of light, connecting to spectral information through the Fourier transform relationship between time and frequency. Interferometers with a variable path-length difference (PLD) between the two branches have widely been employed to characterize temporal coherence properties for broad spectral regimes. Hard X-ray interferometers reported previously, however, have strict limitations in their operational photon energies, due to the specific optical layouts utilized to satisfy the stringent requirement for extreme stability of the PLD at sub-ångström scales. The work presented here characterizes the temporal coherence of hard X-ray free-electron laser (XFEL) pulses by capturing single-shot interferograms. Since the stability requirement is drastically relieved with this approach, it was possible to build a versatile hard X-ray interferometer composed of six separate optical elements to cover a wide photon energy range from 6.5 to 11.5 keV while providing a large variable delay time of up to 47 ps at 10 keV. A high visibility of up to 0.55 was observed at a photon energy of 10 keV. The visibility measurement as a function of time delay reveals a mean coherence time of 5.9 ± 0.7 fs, which agrees with that expected from the single-shot spectral information. In conclusion, this is the first result of characterizing the temporal coherence of XFEL pulses in the hard X-ray regime and is an important milestone towards ultra-high energy resolutions at micro-electronvolt levels in time-domain X-ray spectroscopy, which will open up new opportunities for revealing dynamic properties in diverse systems on timescales from femtoseconds to nanoseconds, associated with fluctuations from ångström to nanometre spatial scales.
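
    A sketch of the final analysis step reported here: fitting measured fringe visibility against delay to extract a coherence time. The Gaussian model, the 1/e-width definition of coherence time, and the synthetic data are assumptions for illustration, not the authors' fitting procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        def visibility(tau, v0, tc):
            """Gaussian visibility decay with delay tau; tc is the coherence time."""
            return v0 * np.exp(-(tau / tc) ** 2)

        delays = np.linspace(0.0, 20.0, 15)  # delay times (fs), synthetic
        measured = visibility(delays, 0.55, 5.9) + 0.02 * np.random.randn(delays.size)

        (v0, tc), _ = curve_fit(visibility, delays, measured, p0=(0.5, 5.0))
        print(f"peak visibility {v0:.2f}, coherence time {tc:.1f} fs")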

  2. Characterization of temporal coherence of hard X-ray free-electron laser pulses with single-shot interferograms

    DOE PAGES

    Osaka, Taito; Hirano, Takashi; Morioka, Yuki; ...

    2017-10-13

    Temporal coherence is one of the most fundamental characteristics of light, connecting to spectral information through the Fourier transform relationship between time and frequency. Interferometers with a variable path-length difference (PLD) between the two branches have widely been employed to characterize temporal coherence properties for broad spectral regimes. Hard X-ray interferometers reported previously, however, have strict limitations in their operational photon energies, due to the specific optical layouts utilized to satisfy the stringent requirement for extreme stability of the PLD at sub-ångström scales. The work presented here characterizes the temporal coherence of hard X-ray free-electron laser (XFEL) pulses by capturing single-shot interferograms. Since the stability requirement is drastically relieved with this approach, it was possible to build a versatile hard X-ray interferometer composed of six separate optical elements to cover a wide photon energy range from 6.5 to 11.5 keV while providing a large variable delay time of up to 47 ps at 10 keV. A high visibility of up to 0.55 was observed at a photon energy of 10 keV. The visibility measurement as a function of time delay reveals a mean coherence time of 5.9 ± 0.7 fs, which agrees with that expected from the single-shot spectral information. In conclusion, this is the first result of characterizing the temporal coherence of XFEL pulses in the hard X-ray regime and is an important milestone towards ultra-high energy resolutions at micro-electronvolt levels in time-domain X-ray spectroscopy, which will open up new opportunities for revealing dynamic properties in diverse systems on timescales from femtoseconds to nanoseconds, associated with fluctuations from ångström to nanometre spatial scales.

  3. High pressure phase transformations revisited.

    PubMed

    Levitas, Valery I

    2018-04-25

    High pressure phase transformations play an important role in the search for new materials and material synthesis, as well as in geophysics. However, they are poorly characterized, and phase transformation pressure and pressure hysteresis vary drastically in experiments of different researchers, with different pressure transmitting media, and with different material suppliers. Here we review the current state, the challenges in studying phase transformations under high pressure, and possible ways of overcoming those challenges. This field is critically compared with the fields of phase transformations under normal pressure in steels and shape memory alloys, as well as plastic deformation of materials. The main reason for the above-mentioned discrepancy is the lack of understanding that there is a fundamental difference between pressure-induced transformations under hydrostatic conditions, stress-induced transformations under nonhydrostatic conditions below yield, and strain-induced transformations during plastic flow. Each of these types of transformations has different mechanisms and requires a completely different thermodynamic and kinetic description and experimental characterization. In comparison with other fields, the following challenges are identified for high pressure phase transformations: (a) initial and evolving microstructure is not included in the characterization of transformations; (b) continuum theory is poorly developed; (c) heterogeneous stress and strain fields in experiments are not determined, which leads to confusing material transformational properties with system behavior. Some ways to advance the field of high pressure phase transformations are suggested. The key points are: (a) to take into account plastic deformations and microstructure evolution during transformations; (b) to formulate phase transformation criteria and kinetic equations in terms of stress and plastic strain tensors (instead of pressure alone); (c) to develop multiscale continuum theories; and (d) to couple experimental, theoretical, and computational studies of the behavior of a tested sample to extract information about the fields of stress and strain tensors, the concentration of the high pressure phase, and transformation criteria and kinetics. The ideal characterization should contain the complete information required for simulation of the same experiments.

  4. Definition of variables required for comprehensive description of drug dosage and clinical pharmacokinetics.

    PubMed

    Medem, Anna V; Seidling, Hanna M; Eichler, Hans-Georg; Kaltschmidt, Jens; Metzner, Michael; Hubert, Carina M; Czock, David; Haefeli, Walter E

    2017-05-01

    Electronic clinical decision support systems (CDSS) require drug information that can be processed by computers. The goal of this project was to determine and evaluate a compilation of variables that comprehensively capture the information contained in the summary of product characteristics (SmPC) and unequivocally describe the drug, its dosage options, and clinical pharmacokinetics. An expert panel defined and structured a set of variables and drafted a guideline to extract and enter information on dosage and clinical pharmacokinetics from textual SmPCs as published by the European Medicines Agency (EMA). The set of variables was iteratively revised and evaluated by data extraction and variable allocation of roughly 7% of all centrally approved drugs. The information contained in the SmPC was allocated to three information clusters consisting of 260 variables. The cluster "drug characterization" specifies the nature of the drug. The cluster "dosage" provides information on approved drug dosages and defines corresponding specific conditions. The cluster "clinical pharmacokinetics" includes pharmacokinetic parameters of relevance for dosing in clinical practice. A first evaluation demonstrated that, despite the complexity of the current free text SmPCs, dosage and pharmacokinetic information can be reliably extracted from the SmPCs and comprehensively described by a limited set of variables. By proposing a compilation of variables that describe drug dosage and clinical pharmacokinetics well, the project represents a step towards the development of a comprehensive database system serving as an information source for sophisticated CDSS.
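
    The flavor of such a machine-processable variable set can be suggested with a toy schema; the few invented fields below stand in for the 260 variables, and all names and types are hypothetical rather than taken from the paper.

        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class DrugCharacterization:      # cluster 1: what the drug is
            active_substance: str
            atc_code: Optional[str] = None

        @dataclass
        class DosageRule:                # cluster 2: one approved dosage option
            indication: str
            route: str
            dose_mg: float
            interval_hours: float
            max_daily_dose_mg: Optional[float] = None

        @dataclass
        class ClinicalPharmacokinetics:  # cluster 3: PK relevant to dosing
            half_life_hours: Optional[float] = None
            renal_elimination_fraction: Optional[float] = None

        @dataclass
        class SmpcRecord:
            drug: DrugCharacterization
            dosages: List[DosageRule] = field(default_factory=list)
            pharmacokinetics: Optional[ClinicalPharmacokinetics] = None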

  5. Characterizing the concentration of Cryptosporidium in Australian surface waters for setting health-based targets for drinking water treatment.

    PubMed

    Petterson, S; Roser, D; Deere, D

    2015-09-01

    It is proposed that the next revision of the Australian Drinking Water Guidelines will include 'health-based targets', where the required level of potable water treatment quantitatively relates to the magnitude of source water pathogen concentrations. To quantify likely Cryptosporidium concentrations in southern Australian surface source waters, the databases for 25 metropolitan water supplies with good historical records, representing a range of catchment sizes, land use and climatic regions were mined. The distributions and uncertainty intervals for Cryptosporidium concentrations were characterized for each site. Then, treatment targets were quantified applying the framework recommended in the World Health Organization Guidelines for Drinking-Water Quality 2011. Based on total oocyst concentrations, and not factoring in genotype or physiological state information as it relates to infectivity for humans, the best estimates of the required level of treatment, expressed as log10 reduction values, ranged among the study sites from 1.4 to 6.1 log10. Challenges associated with relying on historical monitoring data for defining drinking water treatment requirements were identified. In addition, the importance of quantitative microbial risk assessment input assumptions on the quantified treatment targets was investigated, highlighting the need for selection of locally appropriate values.
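
    The arithmetic behind a treatment target of this kind reduces to a log10 ratio between the source-water concentration and a tap-water concentration consistent with the health outcome target. The sketch below uses invented numbers purely to show the calculation's shape; deriving the target concentration itself requires the full QMRA dose-response and exposure assumptions.

        import math

        def required_lrv(source_conc, target_conc):
            """Log10 reduction needed to bring source concentration down to target."""
            return math.log10(source_conc / target_conc)

        # Hypothetical: 2 oocysts/L in source water, tolerable ~6.3e-5 oocysts/L at tap.
        print(f"{required_lrv(2.0, 6.3e-5):.1f} log10")  # ~4.5 log10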

  6. Three-dimensional microstructural characterization of bulk plutonium and uranium metals using focused ion beam technique

    NASA Astrophysics Data System (ADS)

    Chung, Brandon W.; Erler, Robert G.; Teslich, Nick E.

    2016-05-01

    Nuclear forensics requires accurate quantification of discriminating microstructural characteristics of the bulk nuclear material to identify its process history and provenance. Conventional metallographic preparation techniques for bulk plutonium (Pu) and uranium (U) metals are limited to providing information in two dimensions (2D) and do not allow a depth profile of the material to be obtained. In this contribution, use of dual-beam focused ion-beam/scanning electron microscopy (FIB-SEM) to investigate the internal microstructure of bulk Pu and U metals is demonstrated. Our results demonstrate that the dual-beam methodology optimally elucidates microstructural features without preparation artifacts, and that three-dimensional (3D) characterization of inner microstructures can reveal salient microstructural features that cannot be observed with conventional metallographic techniques. Examples are shown to demonstrate the benefit of FIB-SEM in improving microstructural characterization of microscopic inclusions, particularly with respect to nuclear forensics.

  7. Three-dimensional microstructural characterization of bulk plutonium and uranium metals using focused ion beam technique

    DOE PAGES

    Chung, Brandon W.; Erler, Robert G.; Teslich, Nick E.

    2016-03-03

    Nuclear forensics requires accurate quantification of discriminating microstructural characteristics of the bulk nuclear material to identify its process history and provenance. Conventional metallographic preparation techniques for bulk plutonium (Pu) and uranium (U) metals are limited to providing information in two dimensions (2D) and do not allow a depth profile of the material to be obtained. In this contribution, use of dual-beam focused ion-beam/scanning electron microscopy (FIB-SEM) to investigate the internal microstructure of bulk Pu and U metals is demonstrated. Our results demonstrate that the dual-beam methodology optimally elucidates microstructural features without preparation artifacts, and that three-dimensional (3D) characterization of inner microstructures can reveal salient microstructural features that cannot be observed with conventional metallographic techniques. As a result, examples are shown to demonstrate the benefit of FIB-SEM in improving microstructural characterization of microscopic inclusions, particularly with respect to nuclear forensics.

  8. Depth estimation of multi-layered impact damage in PMC using lateral thermography

    NASA Astrophysics Data System (ADS)

    Whitlow, Travis; Kramb, Victoria; Reibel, Rick; Dierken, Josiah

    2018-04-01

    Characterization of impact damage in polymer matrix composites (PMCs) continues to be a challenge due to the complex internal structure of the material. Nondestructive characterization approaches such as normal incident immersion ultrasound and flash thermography are sensitive to delamination damage, but do not provide information regarding damage obscured by the delaminations. Characterization of material state below a delamination requires a technique which is sensitive to in-plane damage modes such as matrix cracking and fiber breakage. Previous studies of the lateral heat flow through a composite laminate showed that the diffusion time was sensitive to the depth of the simulated damage zone. The current study will further evaluate the lateral diffusion model to provide sensitivity limits for the modeled flaw dimensions. Comparisons between the model simulations and experimental data obtained using a concentrated heat source and machined targets will also be presented.
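
    The scaling that makes lateral diffusion time depth-sensitive is the characteristic relation t ~ L^2 / alpha. A back-of-envelope sketch (generic diffusivity for a carbon-fiber laminate, not a value from this study) shows how strongly the arrival time separates length scales:

        # Characteristic thermal diffusion time t = L^2 / alpha.
        def diffusion_time(length_m, diffusivity_m2_s):
            return length_m ** 2 / diffusivity_m2_s

        alpha = 2.5e-6  # m^2/s, hypothetical in-plane diffusivity for a PMC
        for depth_mm in (0.5, 1.0, 2.0):
            t = diffusion_time(depth_mm * 1e-3, alpha)
            print(f"{depth_mm} mm -> {t:.2f} s")   # 0.10, 0.40, 1.60 s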

  9. Cartographic and geodetic methods to characterize the potential landing sites for the future Russian missions Luna-Glob and Luna-Resurs

    NASA Astrophysics Data System (ADS)

    Karachevtseva, I. P.; Kokhanov, A. A.; Konopikhin, A. A.; Nadezhdina, I. E.; Zubarev, A. E.; Patratiy, V. D.; Kozlova, N. A.; Uchaev, D. V.; Uchaev, Dm. V.; Malinnikov, V. A.; Oberst, J.

    2015-04-01

    Characterization of the potential landing sites for the planned Luna-Glob and Luna-Resurs Russian missions requires cartographic and geodetic support prepared with special methods and techniques that are briefly overviewed here. The data used in the analysis, including the digital terrain models (DTMs) and the orthoimages acquired in the surveys carried out from the Lunar Reconnaissance Orbiter and Kaguya spacecraft, are described and evaluated. By way of illustration, different regions of the lunar surface, including the subpolar regions of the Moon, are characterized with the suggested methods and GIS technologies. The development of the information support for the future lunar missions started in 2011, and it is now carried on at the MIIGAiK Extraterrestrial Laboratory (MExLab), a department of the Moscow State University of Geodesy and Cartography (MIIGAiK).

  10. Characterize Eruptive Processes at Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Krier

    2004-10-04

    The purpose of this scientific analysis report, ''Characterize Eruptive Processes at Yucca Mountain, Nevada'', is to present information about natural volcanic systems and the parameters that can be used to model their behavior. This information is used to develop parameter-value distributions appropriate for analysis of the consequences of volcanic eruptions through a repository at Yucca Mountain. This scientific analysis report provides information to four other reports: ''Number of Waste Packages Hit by Igneous Intrusion'' (BSC 2004 [DIRS 170001]); ''Atmospheric Dispersal and Deposition of Tephra from Potential Volcanic Eruption at Yucca Mountain, Nevada'' (BSC 2004 [DIRS 170026]); ''Dike/Drift Interactions'' (BSC 2004 [DIRS 170028]); ''Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV'' (BSC 2004 [DIRS 170027], Section 6.5). This report is organized into seven major sections. This section addresses the purpose of this document. Section 2 addresses quality assurance, Section 3 the use of software, Section 4 identifies the requirements that constrain this work, and Section 5 lists assumptions and their rationale. Section 6 presents the details of the scientific analysis and Section 7 summarizes the conclusions reached.

  11. Comprehensive Characterization a Tidal Energy Site (Invited)

    NASA Astrophysics Data System (ADS)

    Polagye, B. L.; Thomson, J. M.; Bassett, C. S.; Epler, J.; Northwest National Marine Renewable Energy Center

    2010-12-01

    Northern Admiralty Inlet, Puget Sound, Washington is the proposed location of a pilot tidal energy project. Site-specific characterization of the physical and biological environment is required for device engineering and environmental analysis. However, the deep water and strong currents which make the site attractive for tidal energy development also pose unique challenges to collecting comprehensive information. This talk focuses on efforts to optimally site hydrokinetic turbines and estimate their acoustic impact, based on 18 months of field data collected to date. Additional characterization efforts being undertaken by the University of Washington branch of the Northwest National Marine Renewable Energy Center and its partners include marine mammal presence and behavior, water quality, seabed geology, and biofouling potential. Because kinetic power density varies with the cube of horizontal current velocity, an accurate map of spatial current variations is required to optimally site hydrokinetic turbines. Acoustic Doppler profilers deployed on the seabed show operationally meaningful variations in flow characteristics (e.g., power density, directionality, vertical shear) and tidal harmonic constituents over length scales of less than 100m. This is, in part, attributed to the proximity of this site to a headland. Because of these variations, interpolation between stationary measurement locations introduces potentially high uncertainty. The use of shipboard acoustic Doppler profilers is shown to be an effective tool for mapping peak currents and, combined with information from seabed profilers, may be able to resolve power density variations in the project area. Because noise levels from operating turbines are expected to exceed regulatory thresholds for incidental harassment of marine mammals known to be present in the project area, an estimate of the acoustic footprint is required to permit the pilot project. This requires site-specific descriptions of pre-existing ambient noise levels and the transmission loss (or practical spreading) at frequencies of interest. Recording hydrophones deployed on the seabed are used to quantify ambient noise, but are contaminated by self-noise during periods of strong currents. An empirical estimate of transmission loss is obtained from a source of opportunity - a passenger ferry which operates for more than twelve hours each day. By comparing recorded sound pressure levels against the location of the passenger ferry and other vessels (logged by an AIS receiver), the empirical transmission loss and source level for the ferry are obtained. Measurements of current velocity and underwater noise can rely on routine oceanographic instruments and techniques. Other measurements, such as high resolution sampling of current structure upstream and downstream of an operating device tens of meters off the seabed, will be more challenging. Innovative approaches are required for cost effective characterization of tidal energy sites and monitoring of operating projects.
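
    Because kinetic power density grows with the cube of current speed, as noted above, modest spatial differences in velocity translate into large differences in available power; a two-line check (standard seawater density assumed):

        # Kinetic power density of a current: P = 0.5 * rho * U^3 (W/m^2).
        RHO_SEAWATER = 1025.0  # kg/m^3

        def power_density(u_m_s):
            return 0.5 * RHO_SEAWATER * u_m_s ** 3

        for u in (1.5, 2.0, 2.5):
            print(f"{u} m/s -> {power_density(u) / 1000:.1f} kW/m^2")  # 1.7, 4.1, 8.0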

  12. Slow Cortical Dynamics and the Accumulation of Information over Long Timescales

    PubMed Central

    Honey, Christopher J.; Thesen, Thomas; Donner, Tobias H.; Silbert, Lauren J.; Carlson, Chad E.; Devinsky, Orrin; Doyle, Werner K.; Rubin, Nava; Heeger, David J.; Hasson, Uri

    2012-01-01

    SUMMARY Making sense of the world requires us to process information over multiple timescales. We sought to identify brain regions that accumulate information over short and long timescales and to characterize the distinguishing features of their dynamics. We recorded electrocorticographic (ECoG) signals from individuals watching intact and scrambled movies. Within sensory regions, fluctuations of high-frequency (64–200 Hz) power reliably tracked instantaneous low-level properties of the intact and scrambled movies. Within higher order regions, the power fluctuations were more reliable for the intact movie than the scrambled movie, indicating that these regions accumulate information over relatively long time periods (several seconds or longer). Slow (<0.1 Hz) fluctuations of high-frequency power with time courses locked to the movies were observed throughout the cortex. Slow fluctuations were relatively larger in regions that accumulated information over longer time periods, suggesting a connection between slow neuronal population dynamics and temporally extended information processing. PMID:23083743

  13. Production of biofuels and biochemicals: in need of an ORACLE.

    PubMed

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

    The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.

  14. RH-TRU Waste Characterization by Acceptable Knowledge at the Idaho National Engineering and Environmental Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulz, C.; Givens, C.; Bhatt, R.

    2003-02-24

    Idaho National Engineering and Environmental Laboratory (INEEL) is conducting an effort to characterize approximately 620 drums of remote-handled (RH-) transuranic (TRU) waste currently in its inventory that were generated at the Argonne National Laboratory-East (ANL-E) Alpha Gamma Hot Cell Facility (AGHCF) between 1971 and 1995. The waste was generated at the AGHCF during the destructive examination of irradiated and unirradiated fuel pins, targets, and other materials from reactor programs at ANL-West (ANL-W) and other Department of Energy (DOE) reactors. In support of this effort, Shaw Environmental and Infrastructure (formerly IT Corporation) developed an acceptable knowledge (AK) collection and management programmore » based on existing contact-handled (CH)-TRU waste program requirements and proposed RH-TRU waste program requirements in effect in July 2001. Consistent with Attachments B-B6 of the Waste Isolation Pilot Plant (WIPP) Hazardous Waste Facility Permit (HWFP) and th e proposed Class 3 permit modification (Attachment R [RH-WAP] of this permit), the draft AK Summary Report prepared under the AK procedure describes the waste generating process and includes determinations in the following areas based on AK: physical form (currently identified at the Waste Matrix Code level); waste stream delineation; applicability of hazardous waste numbers for hazardous waste constituents; and prohibited items. In addition, the procedure requires and the draft summary report contains information supporting determinations in the areas of defense relationship and radiological characterization.« less

  15. Remediation of a Former USAF Radioactive Material Disposal Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, D. E.; Cushman, M; Tupyi, B.

    2003-02-25

    This paper describes the remediation of a low-level radiological waste burial site located at the former James Connally Air Force Base in Waco, Texas. Burial activities at the site occurred during the 1950's when the property was under the ownership of the United States Air Force. Included is a discussion of methods and strategies that were used to successfully exhume and characterize the wastes for proper disposal at offsite disposal facilities. Worker and environmental protection measures are also described. Information gained from this project may be used at other similar project sites. A total of nine burial tubes had beenmore » identified for excavation, characterization, and removal from the site. The disposal tubes were constructed of 4-ft lengths of concrete pipe buried upright with the upper ends flush with ground surface. Initial ground level observations of the burial tubes indicated that some weathering had occurred; however, the condition of the subsurface portions of the tubes was unknown. Soil excavation occurred in 1-foot lifts in order that the tubes could be inspected and to allow for characterization of the soils at each stage of the excavation. Due to the weight of the concrete pipe and the condition of the piping joints it was determined that special measures would be required to maintain the tubes intact during their removal. Special tube anchoring and handling methods were required to relocate the tubes from their initial positions to a staging area where they could be further characterized. Characterization of the disposal tubes was accomplished using a combination of gamma spectroscopy and activity mapping methods. Important aspects of the project included the use of specialized excavation and disposal tube reinforcement measures to maintain the disposal tubes intact during excavation, removal and subsequent characterization. The non-intrusive gamma spectroscopy and data logging methods allowed for effective characterization of the wastes while minimizing disposal costs. In addition, worker exposures were maintained ALARA as a result of the removal and characterization methods employed.« less

  16. Coming to Grips with Ambiguity: Ion Mobility-Mass Spectrometry for Protein Quaternary Structure Assignment

    NASA Astrophysics Data System (ADS)

    Eschweiler, Joseph D.; Frank, Aaron T.; Ruotolo, Brandon T.

    2017-10-01

    Multiprotein complexes are central to our understanding of cellular biology, as they play critical roles in nearly every biological process. Despite many impressive advances associated with structural characterization techniques, large and highly-dynamic protein complexes are too often refractory to analysis by conventional, high-resolution approaches. To fill this gap, ion mobility-mass spectrometry (IM-MS) methods have emerged as a promising approach for characterizing the structures of challenging assemblies due in large part to the ability of these methods to characterize the composition, connectivity, and topology of large, labile complexes. In this Critical Insight, we present a series of bioinformatics studies aimed at assessing the information content of IM-MS datasets for building models of multiprotein structure. Our computational data highlights the limits of current coarse-graining approaches, and compelled us to develop an improved workflow for multiprotein topology modeling, which we benchmark against a subset of the multiprotein complexes within the PDB. This improved workflow has allowed us to ascertain both the minimal experimental restraint sets required for generation of high-confidence multiprotein topologies, and quantify the ambiguity in models where insufficient IM-MS information is available. We conclude by projecting the future of IM-MS in the context of protein quaternary structure assignment, where we predict that a more complete knowledge of the ultimate information content and ambiguity within such models will undoubtedly lead to applications for a broader array of challenging biomolecular assemblies.

  17. Information processing capacity in psychopathy: Effects of anomalous attention.

    PubMed

    Hamilton, Rachel K B; Newman, Joseph P

    2018-03-01

    Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Mental Status Documentation: Information Quality and Data Processes

    PubMed Central

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses’ assessment, documentation, decision-making and communication regarding patients’ mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk, prevention, identification, communication and mitigation of harm. PMID:28269919

  19. An information technology emphasis in biomedical informatics education.

    PubMed

    Kane, Michael D; Brewer, Jeffrey L

    2007-02-01

    Unprecedented growth in the interdisciplinary domain of biomedical informatics reflects the recent advancements in genomic sequence availability, high-content biotechnology screening systems, as well as the expectations of computational biology to command a leading role in drug discovery and disease characterization. These forces have moved much of life sciences research almost completely into the computational domain. Importantly, educational training in biomedical informatics has been limited to students enrolled in the life sciences curricula, yet many of the skills needed to succeed in biomedical informatics involve or augment training in information technology curricula. This manuscript describes the methods and rationale for training students enrolled in information technology curricula in the field of biomedical informatics. The approach augments the existing information technology curriculum, provides training on specific subjects in biomedical informatics not emphasized in bioinformatics courses offered in life science programs, and does not require prerequisite courses in the life sciences.

  20. 'Meatball searching' - The adversarial approach to online information retrieval

    NASA Technical Reports Server (NTRS)

    Jack, R. F.

    1985-01-01

    It is proposed that the different styles of online searching can be described as either formal (highly precise) or informal, with the needs of the client dictating which is most applicable at a particular moment. The background and personality of the searcher also come into play. Particular attention is focused on meatball searching, which is a form of online searching characterized by deliberate vagueness. It requires generally comprehensive searches, often on unusual topics and with tight deadlines. It is most likely to occur in search centers serving many different disciplines and levels of client information sophistication. Various information needs are outlined, as well as the laws of meatball searching and the adversarial approach. Traits and characteristics important to successful searching include: (1) concept analysis, (2) flexibility of thinking, (3) ability to think in synonyms and (4) anticipation of variant word forms and spellings.

  1. Mental Status Documentation: Information Quality and Data Processes.

    PubMed

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk, prevention, identification, communication and mitigation of harm.

  2. Addressing unmeasured confounding in comparative observational research.

    PubMed

    Zhang, Xiang; Faries, Douglas E; Li, Hu; Stamey, James D; Imbens, Guido W

    2018-04-01

    Observational pharmacoepidemiological studies can provide valuable information on the effectiveness or safety of interventions in the real world, but one major challenge is the existence of unmeasured confounder(s). While many analytical methods have been developed for dealing with this challenge, they appear under-utilized, perhaps due to the complexity and varied requirements for implementation. Thus, there is an unmet need to improve understanding of the appropriate course of action to address unmeasured confounding under a variety of research scenarios. We implemented a stepwise search strategy to find articles discussing the assessment of unmeasured confounding in electronic literature databases. Identified publications were reviewed and characterized by the applicable research settings and the information requirements for implementing each method. We further used this information to develop a best practice recommendation to help guide the selection of appropriate analytical methods for assessing the potential impact of unmeasured confounding. Over 100 papers were reviewed, and 15 methods were identified. We used a flowchart to illustrate the best practice recommendation, which was driven by 2 critical components: (1) availability of information on the unmeasured confounders; and (2) goals of the unmeasured confounding assessment. Key factors for implementation of each method were summarized in a checklist to provide further assistance to researchers for implementing these methods. When assessing comparative effectiveness or safety in observational research, the impact of unmeasured confounding should not be ignored. Instead, we suggest quantitatively evaluating the impact of unmeasured confounding and provide a best practice recommendation for selecting appropriate analytical methods. Copyright © 2018 John Wiley & Sons, Ltd.
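
    One widely used quantitative assessment in this family of methods (named here as an illustration; it may or may not be among the 15 methods the paper catalogues) is the E-value of VanderWeele and Ding, which asks how strong an unmeasured confounder would have to be to explain away an observed association:

        import math

        # E-value: the minimum risk-ratio strength of association an unmeasured
        # confounder must have with both treatment and outcome to fully explain
        # away an observed risk ratio (VanderWeele & Ding, 2017).
        def e_value(rr):
            rr = max(rr, 1.0 / rr)  # fold protective estimates onto RR > 1
            return rr + math.sqrt(rr * (rr - 1.0))

        print(f"{e_value(1.8):.2f}")  # observed RR = 1.8 -> E-value = 3.00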

  3. Social behavior of bacteria: from physics to complex organization

    NASA Astrophysics Data System (ADS)

    Ben-Jacob, E.

    2008-10-01

    I describe how bacteria develop complex colonial patterns by utilizing intricate communication capabilities, such as quorum sensing, chemotactic signaling and exchange of genetic information (plasmids). Bacteria do not store genetically all the information required for generating the patterns for all possible environments. Instead, additional information is cooperatively generated as required for the colonial organization to proceed. Each bacterium is, by itself, a biotic autonomous system with its own internal cellular informatics capabilities (storage, processing and assessment of information). These afford the cell certain plasticity to select its response to biochemical messages it receives, including self-alteration and broadcasting messages to initiate alterations in other bacteria. Hence, new features can collectively emerge during self-organization from the intra-cellular level to the whole colony. Collectively, bacteria store information, make decisions (e.g., whether to sporulate) and even learn from past experience (e.g., exposure to antibiotics), features we begin to associate with bacterial social behavior and even rudimentary intelligence. I also take Schrödinger's "feeding on negative entropy" criterion further and propose that, in addition, organisms have to extract latent information embedded in the environment. By latent information we refer to the non-arbitrary spatio-temporal patterns of regularities and variations that characterize the environmental dynamics. In other words, bacteria must be able to sense the environment and perform internal information processing to thrive on the latent information embedded in the complexity of their environment. I then propose that, by acting together, bacteria can perform this most elementary cognitive function more efficiently, as can be illustrated by their cooperative behavior.

  4. Cost and results of information systems for health and poverty indicators in the United Republic of Tanzania.

    PubMed Central

    Rommelmann, Vanessa; Setel, Philip W.; Hemed, Yusuf; Angeles, Gustavo; Mponezya, Hamisi; Whiting, David; Boerma, Ties

    2005-01-01

    OBJECTIVE: To examine the costs of complementary information generation activities in a resource-constrained setting and compare the costs and outputs of information subsystems that generate the statistics on poverty, health and survival required for monitoring, evaluation and reporting on health programmes in the United Republic of Tanzania. METHODS: Nine systems used by four government agencies or ministries were assessed. Costs were calculated from budgets and expenditure data made available by information system managers. System coverage, quality assurance and information production were reviewed using questionnaires and interviews. Information production was characterized in terms of 38 key sociodemographic indicators required for national programme monitoring. FINDINGS: In 2002-03 approximately US$ 0.53 was spent per Tanzanian citizen on the nine information subsystems that generated information on 37 of the 38 selected indicators. The census and reporting system for routine health service statistics had the largest participating populations and highest total costs. Nationally representative household surveys and demographic surveillance systems (which are not based on nationally representative samples) produced more than half the indicators and used the most rigorous quality assurance. Five systems produced fewer than 13 indicators and had comparatively high costs per participant. CONCLUSION: Policy-makers and programme planners should be aware of the many trade-offs with respect to system costs, coverage, production, representativeness and quality control when making investment choices for monitoring and evaluation. In future, formal cost-effectiveness studies of complementary information systems would help guide investments in the monitoring, evaluation and planning needed to demonstrate the impact of poverty-reduction and health programmes. PMID:16184275

  5. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  6. Logic design for dynamic and interactive recovery.

    NASA Technical Reports Server (NTRS)

    Carter, W. C.; Jessep, D. C.; Wadia, A. B.; Schneider, P. R.; Bouricius, W. G.

    1971-01-01

    Recovery in a fault-tolerant computer means the continuation of system operation with data integrity after an error occurs. This paper delineates two parallel concepts embodied in the hardware and software functions required for recovery: detection, diagnosis, and reconfiguration for the hardware; data integrity, checkpointing, and restart for the software. The hardware relies on the recovery variable set, checking circuits, and diagnostics, and the software relies on the recovery information set, audit, and reconstruct routines, to characterize the system state and assist in recovery when required. Of particular utility is a hardware unit, the recovery control unit, which serves as an interface between error detection and software recovery programs in the supervisor and provides dynamic interactive recovery.

  7. Over-expression and purification strategies for recombinant multi-protein oligomers: a case study of Mycobacterium tuberculosis σ/anti-σ factor protein complexes.

    PubMed

    Thakur, Krishan Gopal; Jaiswal, Ravi Kumar; Shukla, Jinal K; Praveena, T; Gopal, B

    2010-12-01

The function of a protein in a cell often involves coordinated interactions with one or several regulatory partners. It is thus imperative to characterize a protein both in isolation and in the context of its complex with an interacting partner. High-resolution structural information determined by X-ray crystallography and Nuclear Magnetic Resonance offers the best route to characterize protein complexes. These techniques, however, require highly purified and homogenous protein samples at high concentration. This requirement often presents a major hurdle for structural studies. Here we present a strategy based on co-expression and co-purification to obtain recombinant multi-protein complexes in the quantity and concentration range that can enable hitherto intractable structural projects. The feasibility of this strategy was examined using the σ factor/anti-σ factor protein complexes from Mycobacterium tuberculosis. The approach was successful across a wide range of σ factors and their cognate interacting partners. It thus appears likely that the analysis of these complexes based on variations in expression constructs and procedures for the purification and characterization of these recombinant protein samples would be widely applicable for other multi-protein systems. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. 10 CFR 60.16 - Site characterization plan required.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

Title 10 (Energy), Part 60, Section 60.16: Site characterization plan required. NUCLEAR REGULATORY COMMISSION (CONTINUED); DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES; Licenses; Preapplication Review. § 60.16 Site characterization plan required. Before proceeding to...

  9. Characterizing of tissue microstructure with single-detector polarization-sensitive optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Harman, Michelle; Giattina, Susanne; Stamper, Debra L.; Demakis, Charles; Chilek, Mark; Raby, Stephanie; Brezinski, Mark E.

    2006-06-01

Assessing tissue birefringence with the imaging modality polarization-sensitive optical coherence tomography (PS-OCT) could improve the characterization of in vivo tissue pathology. Among the birefringent components, collagen may provide invaluable clinical information because of its alteration in disorders ranging from myocardial infarction to arthritis. The features required of a clinical imaging modality in these areas usually include the ability to assess the parameter of interest rapidly and without extensive data analysis, characteristics that single-detector PS-OCT demonstrates. Beyond detecting organized collagen, which has been previously demonstrated and confirmed with the appropriate histological techniques, additional information can potentially be gained with PS-OCT, including collagen type, form versus intrinsic birefringence, the collagen angle, and the presence of multiple birefringent materials. In part I, we apply the simple but powerful fast Fourier transform (FFT) to both PS-OCT mathematical modeling and in vitro bovine meniscus for improved PS-OCT data analysis. The FFT analysis yields, in a rapid, straightforward, and easily interpreted manner, information on the presence of multiple birefringent materials, distinguishing the true anatomical structure from patterns in the image resulting from alterations in the polarization state and identifying the tissue/phantom optical axes. Therefore, FFT analysis of PS-OCT data provides information on tissue composition beyond identifying the presence of organized collagen, in real time and directly from the image without extensive mathematical manipulation or data analysis. In part II, Helistat phantoms (collagen type I) are analyzed with the ultimate goal of improved tissue characterization. This study, along with the data in part I, advances the insights gained from PS-OCT images beyond simply determining the presence or absence of birefringence.
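
    As a rough illustration of the FFT-based analysis described above (a minimal sketch, not the authors' implementation; the synthetic signal, frequencies, and threshold are assumptions), the following Python snippet shows how two birefringent components appear as distinct peaks in the spectrum of a depth-resolved single-detector PS-OCT intensity trace:

      # Illustrative sketch: two birefringent materials produce intensity
      # banding at different spatial frequencies, visible as separate FFT peaks.
      import numpy as np

      depth = np.linspace(0, 1.0, 2048)                # depth axis in mm (synthetic)
      signal = (np.cos(2 * np.pi * 12 * depth) +       # material 1: 12 cycles/mm
                0.6 * np.cos(2 * np.pi * 30 * depth) + # material 2: 30 cycles/mm
                0.2 * np.random.randn(depth.size))     # detector noise

      spectrum = np.abs(np.fft.rfft(signal * np.hanning(depth.size)))
      freqs = np.fft.rfftfreq(depth.size, d=depth[1] - depth[0])

      # Peaks above a simple threshold indicate distinct banding periods,
      # i.e. more than one birefringent material along the depth scan.
      peaks = freqs[(spectrum > 0.25 * spectrum.max()) & (freqs > 0)]
      print("candidate banding frequencies (cycles/mm):", np.round(peaks, 1))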

  10. Characterization of Technetium Speciation in Cast Stone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Um, Wooyong; Jung, Hun Bok; Wang, Guohui

    2013-11-11

This report describes the results from laboratory tests performed at Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE) EM-31 Support Program (EMSP) subtask, “Production and Long-Term Performance of Low Temperature Waste Forms,” to provide additional information on technetium (Tc) speciation characterization in the Cast Stone waste form. To support the use of Cast Stone as an alternative to vitrification for solidifying low-activity waste (LAW) and as the current baseline waste form for secondary waste streams at the Hanford Site, additional understanding of Tc speciation in Cast Stone is needed to predict the long-term Tc leachability from Cast Stone and to meet the regulatory disposal-facility performance requirements for the Integrated Disposal Facility (IDF). Characterizations of the Tc speciation within the Cast Stone after leaching under various conditions provide insights into how the Tc is retained and released. The data generated by the laboratory tests described in this report provide both empirical and more scientific information to increase our understanding of Tc speciation in Cast Stone and its release mechanism under relevant leaching processes for the purpose of filling data gaps and to support the long-term risk and performance assessments of Cast Stone in the IDF at the Hanford Site.

  11. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research highlights: (1) a fuzzy logic analysis technique capable of characterizing AFM images of thin films; (2) the technique is applicable to different surfaces regardless of their densities; (3) the fuzzy logic technique does not require manual adjustment of the algorithm parameters; (4) the technique can quantitatively capture differences between surfaces; (5) the technique yields more realistic structure boundaries compared to other methods.
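
    The classification step can be pictured with a small fuzzy-membership sketch. The membership shapes, thresholds, and rule set below are invented for illustration; they are not the paper's Fuzzy Inference Engine:

      # Hypothetical sketch: label each point of an AFM line scan as
      # top / bottom / uphill / downhill from fuzzy memberships on
      # normalized height and local slope.
      import numpy as np

      def classify_profile(z):
          z = np.asarray(z, dtype=float)
          h = (z - z.min()) / (z.ptp() + 1e-12)        # height mapped to [0, 1]
          s = np.gradient(z)
          s = s / (np.abs(s).max() + 1e-12)            # slope mapped to [-1, 1]

          mu_high = np.clip((h - 0.5) / 0.5, 0, 1)     # simple edge memberships
          mu_low = np.clip((0.5 - h) / 0.5, 0, 1)
          mu_flat = np.clip(1 - np.abs(s) / 0.3, 0, 1)
          mu_up = np.clip(s / 0.3, 0, 1)
          mu_down = np.clip(-s / 0.3, 0, 1)

          rules = np.vstack([np.minimum(mu_high, mu_flat),   # top
                             np.minimum(mu_low, mu_flat),    # bottom
                             mu_up,                          # uphill
                             mu_down])                       # downhill
          labels = np.array(["top", "bottom", "uphill", "downhill"])
          return labels[rules.argmax(axis=0)]                # strongest rule wins

      line_scan = 5 * np.exp(-((np.arange(100) - 50) / 12.0) ** 2)  # one particle
      print(classify_profile(line_scan)[::10])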

  12. Employer knowledge of federal requirements for recording work-related injuries and illnesses: Implications for occupational injury surveillance data.

    PubMed

    Wuellner, Sara; Phipps, Polly

    2018-05-01

Accuracy of the Bureau of Labor Statistics Survey of Occupational Injuries and Illnesses (SOII) data is dependent on employer compliance with workplace injury and illness recordkeeping requirements. Characterization of employer recordkeeping can inform efforts to improve the data. We interviewed representative samples of SOII respondents from four states to identify common recordkeeping errors and to assess employer characteristics associated with limited knowledge of the recordkeeping requirements and noncompliant practices. Less than half of the establishments required to maintain OSHA injury and illness records reported doing so. Few establishments knew to omit cases limited to diagnostic services (22%) and to count unscheduled weekend days as missed work (27%). No single state or establishment characteristic was consistently associated with better or worse recordkeeping. Many employers possess a limited understanding of workplace injury recordkeeping requirements, potentially leading them to over-report minor incidents and under-report missed-work cases. © 2018 Wiley Periodicals, Inc.

  13. National Streamflow Information Program: Implementation Status Report

    USGS Publications Warehouse

    Norris, J. Michael

    2009-01-01

The U.S. Geological Survey (USGS) operates and maintains a nationwide network of about 7,500 streamgages designed to provide and interpret long-term, accurate, and unbiased streamflow information to meet the multiple needs of many diverse national, regional, state, and local users. The National Streamflow Information Program (NSIP) was initiated in 2003 in response to Congressional and stakeholder concerns about (1) the decrease in the number of operating streamgages, including a disproportionate loss of streamgages with a long period of record; (2) the inability of the USGS to continue operating high-priority streamgages in an environment of reduced funding through partnerships; and (3) the increasing demand for streamflow information due to emerging resource-management issues and new data-delivery capabilities. The NSIP's mission is to provide the streamflow information and understanding required to meet national, regional, state, and local needs. Most of the existing streamgages are funded through partnerships with more than 850 other Federal, state, tribal, and local agencies. Currently, about 90 percent of the streamgages send data to the World Wide Web in near-real time (some information is transmitted within 15 minutes, whereas some lags by about 4 hours). The streamflow information collected at USGS streamgages is used for many purposes:

    * in water-resource appraisals and allocations, to determine how much water is available and how it is being allocated;
    * to provide streamflow information required by interstate agreements, compacts, and court decrees;
    * for engineering design of reservoirs, bridges, roads, culverts, and treatment plants;
    * for the operation of reservoirs, the operation of locks and dams for navigation purposes, and power production;
    * to identify changes in streamflow resulting from changes in land use, water use, and climate;
    * for streamflow forecasting, flood planning, and flood forecasting;
    * to support water-quality programs by allowing determination of constituent loads and fluxes; and
    * for characterizing and evaluating instream conditions for habitat assessments, instream-flow requirements, and recreation.

  14. Study design requirements for RNA sequencing-based breast cancer diagnostics.

    PubMed

    Mer, Arvind Singh; Klevebring, Daniel; Grönberg, Henrik; Rantalainen, Mattias

    2016-02-01

    Sequencing-based molecular characterization of tumors provides information required for individualized cancer treatment. There are well-defined molecular subtypes of breast cancer that provide improved prognostication compared to routine biomarkers. However, molecular subtyping is not yet implemented in routine breast cancer care. Clinical translation is dependent on subtype prediction models providing high sensitivity and specificity. In this study we evaluate sample size and RNA-sequencing read requirements for breast cancer subtyping to facilitate rational design of translational studies. We applied subsampling to ascertain the effect of training sample size and the number of RNA sequencing reads on classification accuracy of molecular subtype and routine biomarker prediction models (unsupervised and supervised). Subtype classification accuracy improved with increasing sample size up to N = 750 (accuracy = 0.93), although with a modest improvement beyond N = 350 (accuracy = 0.92). Prediction of routine biomarkers achieved accuracy of 0.94 (ER) and 0.92 (Her2) at N = 200. Subtype classification improved with RNA-sequencing library size up to 5 million reads. Development of molecular subtyping models for cancer diagnostics requires well-designed studies. Sample size and the number of RNA sequencing reads directly influence accuracy of molecular subtyping. Results in this study provide key information for rational design of translational studies aiming to bring sequencing-based diagnostics to the clinic.
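
    The subsampling design can be emulated in a few lines. The sketch below uses synthetic expression-like data and a generic classifier, not the study's RNA-seq pipeline; the sample sizes mirror the N values quoted above:

      # Sketch of a learning curve via subsampling: accuracy vs training size N.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=1000, n_features=500,
                                 n_informative=40, n_classes=2, random_state=0)
      X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.3,
                                                        random_state=0)

      for n in [50, 100, 200, 350, 700]:
          idx = np.random.RandomState(0).choice(len(X_pool), n, replace=False)
          clf = LogisticRegression(max_iter=2000).fit(X_pool[idx], y_pool[idx])
          print(f"N={n:4d}  held-out accuracy={clf.score(X_test, y_test):.3f}")

    As in the study, accuracy typically climbs steeply at small N and flattens once the training set reaches a few hundred samples.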

  15. Hanford Site Environmental Report for Calendar Year 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poston, Ted M.; Hanf, Robert W.; Dirkes, Roger L.

This report is prepared annually to satisfy the requirements of DOE Orders. The report provides an overview of activities at the Hanford Site during 2002, demonstrates the site's compliance with applicable federal, state, and local environmental laws, regulations, executive orders, and DOE policies, and summarizes environmental data that characterize Hanford Site environmental management performance. The purpose of the report is to provide useful summary information to members of the public, public officials, regulators, Hanford contractors, and elected representatives.

  16. Experimental investigation of criteria for continuous variable entanglement.

    PubMed

    Bowen, W P; Schnabel, R; Lam, P K; Ralph, T C

    2003-01-31

We generate a pair of entangled beams from the interference of two amplitude-squeezed beams. The entanglement is quantified in terms of the EPR paradox and inseparability criteria, with both results clearly beating the standard quantum limit. We experimentally analyze the effect of decoherence on each criterion and demonstrate qualitative differences. We also characterize the number of required and excess photons present in the entangled beams and provide contour plots of the efficacy of quantum information protocols in terms of these variables.
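
    For reference, the two criteria quantified here are commonly written in the following standard forms (the notation and normalization, with vacuum-limit variances equal to 1, are assumed rather than taken from the paper):

      \[
        \mathcal{I} \;=\; \tfrac{1}{2}\sqrt{\,V\!\left(\hat{X}_A-\hat{X}_B\right)\,V\!\left(\hat{P}_A+\hat{P}_B\right)} \;<\; 1
        \qquad \text{(inseparability, Duan et al.)}
      \]
      \[
        \mathcal{E} \;=\; V\!\left(\hat{X}_A \,\middle|\, \hat{X}_B\right)\,V\!\left(\hat{P}_A \,\middle|\, \hat{P}_B\right) \;<\; 1
        \qquad \text{(EPR paradox, Reid)}
      \]

    Both products equal 1 at the standard quantum limit, which is the benchmark the measured values beat.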

  17. Analysis methods for Kevlar shield response to rotor fragments

    NASA Technical Reports Server (NTRS)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  18. Dust: a metric for use in residential and building exposure assessment and source characterization.

    PubMed Central

    Lioy, Paul J; Freeman, Natalie C G; Millette, James R

    2002-01-01

In this review, we examine house dust and residential soil and their use for identifying sources and quantifying the levels of toxicants for the estimation of exposure. We answer critical questions that focus on the selection of samples or sampling strategies for collection and discuss areas of uncertainty and gaps in knowledge. We discuss the evolution of dust sampling with a special emphasis on work conducted after the publication of the 1992 review by McArthur [Appl Occup Environ Hyg 7(9):599-606 (1992)]. The approaches to sampling dust examined include surface wipe sampling, vacuum sampling, and other sampling approaches, including attic sampling. The metrics of presentation of results for toxicants in dust, surface loading (micrograms per square centimeter) or surface concentration (micrograms per gram), are discussed. We evaluate these metrics in terms of how the information can be used in source characterization and in exposure characterization. We discuss the types of companion information on source use and household or personal activity patterns required to assess the significance of the dust exposure. The status and needs for wipe samplers, surface samplers, and vacuum samplers are summarized with some discussion on the strengths and weaknesses of each type of sampler. We also discuss needs for research and development and the current status of standardization. Case studies are provided to illustrate the use of house dust and residential soil in source characterization, forensic analyses, or human exposure assessment. PMID:12361921

  19. Dust: a metric for use in residential and building exposure assessment and source characterization.

    PubMed

    Lioy, Paul J; Freeman, Natalie C G; Millette, James R

    2002-10-01

In this review, we examine house dust and residential soil and their use for identifying sources and quantifying the levels of toxicants for the estimation of exposure. We answer critical questions that focus on the selection of samples or sampling strategies for collection and discuss areas of uncertainty and gaps in knowledge. We discuss the evolution of dust sampling with a special emphasis on work conducted after the publication of the 1992 review by McArthur [Appl Occup Environ Hyg 7(9):599-606 (1992)]. The approaches to sampling dust examined include surface wipe sampling, vacuum sampling, and other sampling approaches, including attic sampling. The metrics of presentation of results for toxicants in dust, surface loading (micrograms per square centimeter) or surface concentration (micrograms per gram), are discussed. We evaluate these metrics in terms of how the information can be used in source characterization and in exposure characterization. We discuss the types of companion information on source use and household or personal activity patterns required to assess the significance of the dust exposure. The status and needs for wipe samplers, surface samplers, and vacuum samplers are summarized with some discussion on the strengths and weaknesses of each type of sampler. We also discuss needs for research and development and the current status of standardization. Case studies are provided to illustrate the use of house dust and residential soil in source characterization, forensic analyses, or human exposure assessment.

  20. The Human Exposure Model (HEM): A Tool to Support Rapid ...

    EPA Pesticide Factsheets

The US EPA is developing an open and publicly available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposure. The use of consumer products often results in near-field exposures (exposures that occur directly from the use of a product) that are larger than environmentally mediated exposures (i.e., far-field sources)1,2. Failure to consider near-field exposures could result in biases in LCIA-based determinations of the relative sustainability of consumer products. HEM is designed to provide this information. Characterizing near-field sources of chemical exposures presents a challenge to LCIA practitioners. Unlike far-field sources, where multimedia mass balance models have been used to determine human exposure, near-field sources require product-specific models of human exposure and considerable information on product use and product composition. Such information is difficult and time-consuming to gather and curate. The HEM software will characterize the distribution of doses and product intake fractions2 across populations of product users and bystanders, allowing for differentiation by various demographic characteristics. The tool incorporates a newly developed database of the composition of more than 17,000 products, data on physical and chemical properties for more than 2,000 chemicals, and mo

  1. A global parallel model based design of experiments method to minimize model output uncertainty.

    PubMed

    Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E

    2012-03-01

    Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
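
    The design principle translates into a compact numerical experiment. In the toy sketch below, random draws from a bounded parameter space stand in for the paper's sparse grids, and the model and bounds are invented for illustration:

      # Toy design-of-experiments sketch: pick measurement times where model
      # predictions across the bounded parameter space disagree the most,
      # since data there shrink the dynamical uncertainty fastest.
      import numpy as np

      def model(t, k, a):
          # Stand-in dynamical response, e.g. a saturating activation curve.
          return a * (1 - np.exp(-k * t))

      rng = np.random.default_rng(1)
      k = rng.uniform(0.1, 2.0, 500)     # bounded uncertain rate constant
      a = rng.uniform(0.5, 3.0, 500)     # bounded uncertain amplitude

      times = np.linspace(0, 10, 101)
      ensemble = model(times[None, :], k[:, None], a[:, None])
      spread = ensemble.std(axis=0)      # predictive disagreement vs time

      design = np.sort(times[np.argsort(spread)[-3:]])
      print("suggested measurement times:", design)

    No initial parameter estimate is needed, only the bounds, which mirrors the method's stated advantage over Fisher-information-based designs.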

  2. Lidar characterizations of atmospheric aerosols and clouds

    NASA Astrophysics Data System (ADS)

    Ferrare, R. A.; Hostetler, C. A.; Hair, J. W.; Burton, S. P.

    2017-12-01

    Knowledge of the vertical profile, composition, concentration, and size distribution of aerosols is required to quantify the impacts of aerosols on human health, global and regional climate, clouds and precipitation. In particular, radiative forcing due to anthropogenic aerosols is the most uncertain part of anthropogenic radiative forcing, with aerosol-cloud interactions (ACI) as the largest source of uncertainty in current estimates of global radiative forcing. Improving aerosol transport model predictions of the vertical profile of aerosol optical and microphysical characteristics is crucial for improving assessments of aerosol radiative forcing. Understanding how aerosols and clouds interact is essential for investigating the aerosol indirect effect and ACI. Through its ability to provide vertical profiles of aerosol and cloud distributions as well as important information regarding the optical and physical properties of aerosols and clouds, lidar is a crucial tool for addressing these science questions. This presentation describes how surface, airborne, and satellite lidar measurements have been used to address these questions, and in particular how High Spectral Resolution Lidar (HSRL) measurements provide profiles of aerosol properties (backscatter, extinction, depolarization, concentration, size) important for characterizing radiative forcing. By providing a direct measurement of aerosol extinction, HSRL provides more accurate aerosol measurement profiles and more accurate constraints for models than standard retrievals from elastic backscatter lidar, which loses accuracy and precision at lower altitudes due to attenuation from overlying layers. Information regarding particle size and abundance from advanced lidar retrievals provides better proxies for cloud-condensation-nuclei (CCN), which are required for assessing aerosol-cloud interactions. When combined with data from other sensors, advanced lidar measurements can provide information on aerosol and cloud properties for addressing both direct and indirect radiative forcing.

  3. Using borehole flow logging to optimize hydraulic-test procedures in heterogeneous fractured aquifers

    USGS Publications Warehouse

    Paillet, F.L.

    1995-01-01

Hydraulic properties of heterogeneous fractured aquifers are difficult to characterize, and such characterization usually requires equipment-intensive and time-consuming applications of hydraulic testing in situ. Conventional coring and geophysical logging techniques provide useful and reliable information on the distribution of bedding planes, fractures, and solution openings along boreholes, but it is often unclear how these locally permeable features are organized into larger-scale zones of hydraulic conductivity. New borehole flow-logging equipment provides techniques designed to identify hydraulically active fractures intersecting boreholes and to indicate how these fractures might be connected to larger-scale flow paths in the surrounding aquifer. Potential complications in interpreting flowmeter logs include: 1) ambient hydraulic conditions that mask the detection of hydraulically active fractures; 2) inability to maintain quasi-steady drawdowns during aquifer tests, which causes temporal variations in flow intensity to be confused with inflows during pumping; and 3) effects of uncontrolled background variations in hydraulic head, which also complicate the interpretation of inflows during aquifer tests. Application of these techniques is illustrated by the analysis of cross-borehole flowmeter data from an array of four bedrock boreholes in granitic schist at the Mirror Lake, New Hampshire, research site. Only two days of field operations were required to unambiguously identify the few fractures or fracture zones that contribute most of the inflow to boreholes in the CO borehole array during pumping. Such information was critical in the interpretation of water-quality data. This information also permitted the setting of the available string of two packers in each borehole so as to return the aquifer as close to pre-drilling conditions as possible with the available equipment.

  4. Inquiry-Based Approach to a Carbohydrate Analysis Experiment

    NASA Astrophysics Data System (ADS)

    Senkbeil, Edward G.

    1999-01-01

    The analysis of an unknown carbohydrate in an inquiry-based learning format has proven to be a valuable and interesting undergraduate biochemistry laboratory experiment. Students are given a list of carbohydrates and a list of references for carbohydrate analysis. The references contain a variety of well-characterized wet chemistry and instrumental techniques for carbohydrate identification, but the students must develop an appropriate sequential protocol for unknown identification. The students are required to provide a list of chemicals and procedures and a flow chart for identification before the lab. During the 3-hour laboratory period, they utilize their accumulated information and knowledge to classify and identify their unknown. Advantages of the inquiry-based format are (i) students must be well prepared in advance to be successful in the laboratory, (ii) students feel a sense of accomplishment in both designing and carrying out a successful experiment, and (iii) the carbohydrate background information digested by the students significantly decreases the amount of lecture time required for this topic.

  5. Effects of reinforcement on test-enhanced learning in a large, diverse introductory college psychology course.

    PubMed

    Trumbo, Michael C; Leiting, Kari A; McDaniel, Mark A; Hodge, Gordon K

    2016-06-01

A robust finding within laboratory research is that structuring information as a test confers benefit on long-term retention, referred to as the testing effect. Although well characterized in laboratory environments, the testing effect has been explored infrequently within ecologically valid contexts. We conducted a series of 3 experiments within a very large introductory college-level course. Experiment 1 examined the impact of required versus optional frequent low-stakes testing (quizzes) on student grades, revealing that students were much more likely to take advantage of quizzing if it was a required course component. Experiment 2 implemented a method of evaluating pedagogical intervention within a single course (thereby controlling for instructor bias and student self-selection), which revealed a testing effect. Experiment 3 ruled out additional exposure to information as an explanation for the findings of Experiment 2 and suggested that students at the college level, enrolled in very large sections, accept frequent quizzing well. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. The role of acceptable knowledge in transuranic waste disposal operations - 11117

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chancellor, Christopher John; Nelson, Roger

    2010-11-08

The Acceptable Knowledge (AK) process plays a key role in the delineation of waste streams destined for the Waste Isolation Pilot Plant (WIPP). General Electric's Vallecitos Nuclear Center (GEVNC) provides an ideal case study of the application of AK in a multiple-steward environment. In this review we elucidate the pivotal role Acceptable Knowledge played in segregating Department of Energy (DOE) responsibilities from a commercial facility. The Acceptable Knowledge process is a necessary component of waste characterization that determines whether or not a waste stream may be considered for disposal at the WIPP site. This process may be thought of as an effort to gain a thorough understanding of the waste origin, chemical content, and physical form, gleaned by the collection of documentation that concerns generator/storage site history, mission, and operations, in addition to waste stream-specific information, which includes the waste generation process, the waste matrix, the quantity of waste concerned, and the radiological and chemical makeup of the waste. The collection and dissemination of relevant documentation is the fundamental requirement for the AK process to work. Acceptable Knowledge is the predominant process of characterization and, therefore, a crucial part of WIPP's transuranic waste characterization program. This characterization process, when conducted to the standards set forth in WIPP's operating permit, requires confirmation/verification by physical techniques such as Non-Destructive Examination (NDE), Visual Examination (VE), and Non-Destructive Assay (NDA). These physical characterization techniques may vary in their appropriateness for a given waste stream; however, nothing will allow the substitution or exclusion of AK. Beyond the normal scope of operations, AK may be considered, when appropriate, a surrogate for the physical characterization techniques in a procedure that appeals to concepts such as As Low As Reasonably Achievable (ALARA) and budgetary savings. This substitution is referred to as an Acceptable Knowledge Sufficiency Determination. With a Sufficiency Determination Request, AK may supplant the need for one or all of the physical analysis methods. This powerful procedure may be used on a scale as small as a single container or as large as a vast waste stream. Only under the most stringent requirements will an AK Sufficiency Determination be approved by the regulators and, to date, only six such Sufficiency Determinations have been approved. Although Acceptable Knowledge is legislated into the operational procedures of the WIPP facility, there is more to it than compliance. AK is not merely one of a long list of requirements in the characterization and verification of transuranic (TRU) waste destined for the WIPP. Acceptable Knowledge goes beyond the regulatory threshold by offering a way to reduce risk, cost, time, and uncertainty on its own merits. Therefore, AK alone can be argued superior to any other waste characterization technique.

  7. The Calculation of Fractal Dimension in the Presence of Non-Fractal Clutter

    NASA Technical Reports Server (NTRS)

    Herren, Kenneth A.; Gregory, Don A.

    1999-01-01

The area of information processing has grown dramatically over the last 50 years. In the areas of image processing and information storage, the technology requirements have far outpaced the ability of the community to meet demands. The need for faster recognition algorithms and more efficient storage of large quantities of data has forced the user to accept less-than-lossless retrieval of that data for analysis. In addition to clutter that is not the object of interest in the data set, throughput requirements often force the user to accept "noisy" data and to tolerate the clutter inherent in that data. It has been shown that some of this clutter, both the intentional clutter (clouds, trees, etc.) and the noise introduced into the data by processing requirements, can be modeled as fractal or fractal-like. Traditional methods using Fourier deconvolution on these sources of noise in frequency space lead to loss of signal and can, in many cases, completely eliminate the target of interest. The parameters that characterize fractal-like noise (predominantly the fractal dimension) have been investigated, and a technique to reduce or eliminate noise from real scenes has been developed. Examples of clutter-reduced images are presented.
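
    The key quantity, the fractal dimension, is commonly estimated by box counting; the following sketch (a generic estimator, not necessarily the authors' procedure) illustrates the idea on a binary image:

      # Box-counting estimate of fractal dimension for a 2-D binary image.
      import numpy as np

      def box_count_dimension(img):
          n = min(img.shape)
          sizes = [s for s in (2, 4, 8, 16, 32, 64) if s <= n]
          counts = []
          for s in sizes:
              h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
              blocks = img[:h, :w].reshape(h // s, s, w // s, s)
              counts.append(blocks.any(axis=(1, 3)).sum())  # occupied boxes
          # Dimension is the negative slope of log N(s) versus log s.
          slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
          return -slope

      rng = np.random.default_rng(0)
      clutter = rng.random((256, 256)) < 0.2  # dense random stand-in "clutter"
      print(f"estimated dimension: {box_count_dimension(clutter):.2f}")
      # Dense random noise is nearly space-filling, so the estimate is close
      # to 2; genuinely fractal clutter falls between 1 and 2.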

  8. Hydrologic Process-oriented Optimization of Electrical Resistivity Tomography

    NASA Astrophysics Data System (ADS)

    Hinnell, A.; Bechtold, M.; Ferre, T. A.; van der Kruk, J.

    2010-12-01

Electrical resistivity tomography (ERT) is commonly used in hydrologic investigations. Advances in joint and coupled hydrogeophysical inversion have enhanced the quantitative use of ERT to construct and condition hydrologic models (i.e., identify hydrologic structure and estimate hydrologic parameters). However, the selection of which electrical resistivity data to collect and use is often determined by a combination of data requirements for geophysical analysis, intuition on the part of the hydrogeophysicist, and logistical constraints of the laboratory or field site. One of the advantages of coupled hydrogeophysical inversion is the direct link between the hydrologic model and the individual geophysical data used to condition the model. That is, there is no requirement to collect geophysical data suitable for independent geophysical inversion. The geophysical measurements collected can be optimized for estimation of hydrologic model parameters rather than to develop a geophysical model. Using a synthetic model of drip irrigation, we evaluate the value of individual resistivity measurements for describing the soil hydraulic properties and then use this information to build a data set optimized for characterizing hydrologic processes. We then compare the information content in the optimized data set with the information content in a data set optimized using a Jacobian sensitivity analysis.
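
    The Jacobian sensitivity analysis used here as the point of comparison can be sketched with finite differences. The forward model, parameters, and design grid below are placeholders, not an actual ERT solver:

      # Sketch of Jacobian-based measurement ranking: finite-difference
      # sensitivities of a stand-in forward model, with candidate
      # measurements ranked by how strongly they respond to the parameters.
      import numpy as np

      def forward(theta, design):
          # Placeholder response, e.g. apparent resistivity vs spacing.
          return theta[0] * np.exp(-design / theta[1]) + theta[2]

      theta = np.array([100.0, 5.0, 20.0])   # illustrative parameter values
      design = np.linspace(1, 30, 30)        # candidate measurement settings

      J = np.empty((design.size, theta.size))
      for j in range(theta.size):
          d = np.zeros_like(theta)
          d[j] = 1e-4 * max(1.0, abs(theta[j]))
          J[:, j] = (forward(theta + d, design) -
                     forward(theta - d, design)) / (2 * d[j])

      score = np.linalg.norm(J, axis=1)      # overall sensitivity per measurement
      best = np.sort(design[np.argsort(score)[::-1][:5]])
      print("five most sensitive candidate measurements:", best)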

  9. Risk analysis for biological hazards: What we need to know about invasive species

    USGS Publications Warehouse

    Stohlgren, T.J.; Schnase, J.L.

    2006-01-01

Risk analysis for biological invasions is similar to other types of natural and human hazards. For example, risk analysis for chemical spills requires the evaluation of basic information on where a spill occurs; exposure level and toxicity of the chemical agent; knowledge of the physical processes involved in its rate and direction of spread; and potential impacts to the environment, economy, and human health relative to containment costs. Unlike typical chemical spills, biological invasions can have long lag times from introduction and establishment to successful invasion, they reproduce, and they can spread rapidly by physical and biological processes. We use a risk analysis framework to suggest a general strategy for risk analysis for invasive species and invaded habitats. It requires: (1) problem formulation (scoping the problem, defining assessment endpoints); (2) analysis (information on species traits, matching species traits to suitable habitats, estimating exposure, surveys of current distribution and abundance); (3) risk characterization (understanding of data completeness, estimates of the “potential” distribution and abundance, estimates of the potential rate of spread, and probable risks, impacts, and costs); and (4) risk management (containment potential, costs, and opportunity costs; legal mandates and social considerations; and information science and technology needs).

  10. Qualia: The Geometry of Integrated Information

    PubMed Central

    Balduzzi, David; Tononi, Giulio

    2009-01-01

    According to the integrated information theory, the quantity of consciousness is the amount of integrated information generated by a complex of elements, and the quality of experience is specified by the informational relationships it generates. This paper outlines a framework for characterizing the informational relationships generated by such systems. Qualia space (Q) is a space having an axis for each possible state (activity pattern) of a complex. Within Q, each submechanism specifies a point corresponding to a repertoire of system states. Arrows between repertoires in Q define informational relationships. Together, these arrows specify a quale—a shape that completely and univocally characterizes the quality of a conscious experience. Φ— the height of this shape—is the quantity of consciousness associated with the experience. Entanglement measures how irreducible informational relationships are to their component relationships, specifying concepts and modes. Several corollaries follow from these premises. The quale is determined by both the mechanism and state of the system. Thus, two different systems having identical activity patterns may generate different qualia. Conversely, the same quale may be generated by two systems that differ in both activity and connectivity. Both active and inactive elements specify a quale, but elements that are inactivated do not. Also, the activation of an element affects experience by changing the shape of the quale. The subdivision of experience into modalities and submodalities corresponds to subshapes in Q. In principle, different aspects of experience may be classified as different shapes in Q, and the similarity between experiences reduces to similarities between shapes. Finally, specific qualities, such as the “redness” of red, while generated by a local mechanism, cannot be reduced to it, but require considering the entire quale. Ultimately, the present framework may offer a principled way for translating qualitative properties of experience into mathematics. PMID:19680424

  11. Implementation and characterization of active feed-forward for deterministic linear optics quantum computing

    NASA Astrophysics Data System (ADS)

    Böhi, P.; Prevedel, R.; Jennewein, T.; Stefanov, A.; Tiefenbacher, F.; Zeilinger, A.

    2007-12-01

In general, quantum computer architectures that are based on the dynamical evolution of quantum states also require the processing of classical information obtained by measurements of the actual qubits that make up the computer. This classical processing involves fast, active adaptation of subsequent measurements and real-time error correction (feed-forward), so that quantum gates and algorithms can be executed in a deterministic and hence error-free fashion. This is also true in the linear optical regime, where the quantum information is stored in the polarization state of photons. The adaptation of the photon’s polarization can be achieved in a very fast manner by employing electro-optical modulators, which change the polarization of a trespassing photon upon application of a high voltage. In this paper we discuss techniques for implementing fast, active feed-forward at the single-photon level and we present their application in the context of photonic quantum computing. This includes the working principles and the characterization of the EOMs as well as a description of the switching logic, both of which allow quantum computation at an unprecedented speed.

  12. Concise Review: Mind the Gap: Challenges in Characterizing and Quantifying Cell- and Tissue-Based Therapies for Clinical Translation

    PubMed Central

    Rayment, Erin A; Williams, David J

    2010-01-01

    There are many challenges associated with characterizing and quantifying cells for use in cell- and tissue-based therapies. From a regulatory perspective, these advanced treatments must not only be safe and effective but also be made by high-quality manufacturing processes that allow for on-time delivery of viable products. Although sterility assays can be adapted from conventional bioprocessing, cell- and tissue-based therapies require more stringent safety assessments, especially in relation to use of animal products, immune reaction, and potential instability due to extended culture times. Furthermore, cell manufacturers who plan to use human embryonic stem cells in their therapies need to be particularly stringent in their final purification steps, due to the unrestricted growth potential of these cells. This review summarizes the current issues in characterization and quantification for cell- and tissue-based therapies, dividing these challenges into the regulatory themes of safety, potency, and manufacturing quality. It outlines current assays in use, as well as highlights the limits of many of these product release tests. Mode of action is discussed, with particular reference to in vitro surrogate assays that can be used to provide information to correlate with proposed in vivo patient efficacy. Importantly, this review highlights the requirement for basic research to improve current knowledge on the in vivo fate of these treatments; as well as an improved stakeholder negotiation process to identify the measurement requirements that will ensure the manufacture of the best possible cell- and tissue-based therapies within the shortest timeframe for the most patient benefit. PMID:20333747

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumitrescu, Eugene; Humble, Travis S.

The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.

  14. Characterizing reliability in a product/process design-assurance program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerscher, W.J. III; Booker, J.M.; Bement, T.R.

    1997-10-01

Over the years many advancing techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to their reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
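
    The updating step itself is compact. A minimal Beta-Binomial sketch of the idea, with an assumed vague prior and illustrative test batches, looks like this:

      # Minimal Beta-Binomial sketch of Bayesian reliability updating:
      # start with a vague prior and narrow it as test evidence arrives.
      from scipy import stats

      a, b = 1.0, 1.0                      # vague Beta(1, 1) prior on reliability
      for successes, trials in [(9, 10), (18, 20), (47, 50)]:
          a += successes                   # fold in fresh test information
          b += trials - successes
          post = stats.beta(a, b)
          lo, hi = post.ppf(0.05), post.ppf(0.95)
          print(f"after {trials:2d} trials: mean={post.mean():.3f}, "
                f"90% interval=({lo:.3f}, {hi:.3f})")

    Each batch of test results tightens the 90% interval, which is the "reduce the uncertainty by folding in fresh information" behavior the paper describes.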

  15. The Impact of Advanced Greenhouse Gas Measurement Science on Policy Goals and Research Strategies

    NASA Astrophysics Data System (ADS)

    Abrahams, L.; Clavin, C.; McKittrick, A.

    2016-12-01

In support of the Paris agreement, accurate characterizations of U.S. greenhouse gas (GHG) emissions estimates have been an area of increased scientific focus. Over the last several years, the scientific community has placed significant emphasis on understanding, quantifying, and reconciling measurement and modeling methods that characterize methane emissions from petroleum and natural gas sources. This work has prompted national policy discussions and led to the improvement of regional and national methane emissions estimates. Research campaigns focusing on reconciling atmospheric measurements ("top-down") and process-based emissions estimates ("bottom-up") have sought to identify where measurement technology advances could inform policy objectives. A clear next step is development and deployment of advanced detection capabilities that could aid U.S. emissions mitigation and verification goals. The breadth of policy-relevant outcomes associated with advances in GHG measurement science is demonstrated by recent improvements in the petroleum and natural gas sector emission estimates in the EPA Greenhouse Gas Inventory, ambitious efforts to apply inverse modeling results to inform or validate the national GHG inventory, and outcomes from federal GHG measurement science technology development programs. In this work, we explore the variety of policy-relevant outcomes impacted by advances in GHG measurement science, with an emphasis on improving GHG inventory estimates, identifying emissions mitigation strategies, and informing technology development requirements.

  16. Summary report of PQRI Workshop on Nanomaterial in Drug Products: current experience and management of potential risks.

    PubMed

    Bartlett, Jeremy A; Brewster, Marcus; Brown, Paul; Cabral-Lilly, Donna; Cruz, Celia N; David, Raymond; Eickhoff, W Mark; Haubenreisser, Sabine; Jacobs, Abigail; Malinoski, Frank; Morefield, Elaine; Nalubola, Ritu; Prud'homme, Robert K; Sadrieh, Nakissa; Sayes, Christie M; Shahbazian, Hripsime; Subbarao, Nanda; Tamarkin, Lawrence; Tyner, Katherine; Uppoor, Rajendra; Whittaker-Caulk, Margaret; Zamboni, William

    2015-01-01

At the Product Quality Research Institute (PQRI) Workshop held January 14-15, 2014, participants from academia, industry, and governmental agencies involved in the development and regulation of nanomedicines discussed the current state of characterization, formulation development, manufacturing, and nonclinical safety evaluation of nanomaterial-containing drug products for human use. The workshop discussions identified areas where additional understanding of material attributes, absorption, biodistribution, cellular and tissue uptake, and disposition of nanosized particles would continue to inform their safe use in drug products. Analytical techniques and methods used for in vitro characterization and stability testing of formulations containing nanomaterials were discussed, along with their advantages and limitations. Areas where additional regulatory guidance and material characterization standards would help in the development and approval of nanomedicines were explored. Representatives from the US Food and Drug Administration (USFDA), Health Canada, and European Medicines Agency (EMA) presented information about the diversity of nanomaterials in approved and newly developed drug products. USFDA, Health Canada, and EMA regulators discussed the applicability of current regulatory policies in presentations and open discussion. Information contained in several of the recent EMA reflection papers was discussed in detail, along with their scope and intent to enhance scientific understanding about disposition, efficacy, and safety of nanomaterials introduced in vivo and regulatory requirements for testing and market authorization. Opportunities for interaction with regulatory agencies during the lifecycle of nanomedicines were also addressed at the meeting. This is a summary of the workshop presentations and discussions, including considerations for future regulatory guidance on drug products containing nanomaterials.

  17. Sensor validation and fusion for gas turbine vibration monitoring

    NASA Astrophysics Data System (ADS)

    Yan, Weizhong; Goebel, Kai F.

    2003-08-01

Vibration monitoring is an important practice throughout regular operation of gas turbine power systems and, even more so, during characterization tests. Vibration monitoring relies on accurate and reliable sensor readings. To obtain accurate readings, sensors are placed such that the signal is maximized. In the case of characterization tests, strain gauges are placed at the location of vibration modes on blades inside the gas turbine. Due to the prevailing harsh environment, these sensors have a limited life and decaying accuracy, both of which impair vibration assessment. At the same time, bandwidth limitations may restrict data transmission, which in turn limits the number of sensors that can be used for assessment. Knowing the sensor status (normal or faulty), and more importantly, knowing the true vibration level of the system at all times is essential for successful gas turbine vibration monitoring. This paper investigates a dynamic sensor validation and system health reasoning scheme that addresses the issues outlined above by considering only the information required to reliably assess system health status. In particular, if abnormal system health is suspected or if the primary sensor is determined to be faulted, information from available "sibling" sensors is dynamically integrated. A confidence measure expresses the complex interactions of sensor health and system health, their reliabilities, conflicting information, and the resulting health assessment. Effectiveness of the scheme in achieving accurate and reliable vibration evaluation is then demonstrated using a combination of simulated data and a small sample of real-world data in which the vibration of compressor blades is monitored during a real-time characterization test of a new gas turbine power system.
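
    One way to picture the validation-and-fusion step (the thresholds, weights, and readings below are invented for illustration, not the paper's scheme):

      # Hypothetical sketch: flag a sensor that departs from the consensus of
      # its siblings, then fuse the remaining readings by assumed confidence.
      import numpy as np

      readings = np.array([4.9, 5.1, 5.0, 9.7])    # vibration levels; #4 drifting
      confidence = np.array([0.9, 0.8, 0.9, 0.6])  # assumed sensor reliabilities

      median = np.median(readings)
      mad = np.median(np.abs(readings - median)) + 1e-9
      valid = np.abs(readings - median) / mad < 5  # robust outlier test

      weights = confidence * valid                 # faulty sensors get zero weight
      fused = np.sum(weights * readings) / weights.sum()
      print("sensor status:", ["ok" if v else "faulty" for v in valid])
      print(f"fused vibration estimate: {fused:.2f}")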

  18. Requirements Development Issues for Advanced Life Support Systems: Solid Waste Management

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Fisher, John W.; Alazraki, Michael P.; Hogan, John A.

    2002-01-01

    Long duration missions pose substantial new challenges for solid waste management in Advanced Life Support (ALS) systems. These possibly include storing large volumes of waste material in a safe manner, rendering wastes stable or sterilized for extended periods of time, and/or processing wastes for recovery of vital resources. This is further complicated because future missions remain ill-defined with respect to waste stream quantity, composition and generation schedule. Without definitive knowledge of this information, development of requirements is hampered. Additionally, even if waste streams were well characterized, other operational and processing needs require clarification (e.g. resource recovery requirements, planetary protection constraints). Therefore, the development of solid waste management (SWM) subsystem requirements for long duration space missions is an inherently uncertain, complex and iterative process. The intent of this paper is to address some of the difficulties in writing requirements for missions that are not completely defined. This paper discusses an approach and motivation for ALS SWM requirements development, the characteristics of effective requirements, and the presence of those characteristics in requirements that are developed for uncertain missions. Associated drivers for life support system technological capability are also presented. A general means of requirements forecasting is discussed, including successive modification of requirements and the need to consider requirements integration among subsystems.

  19. SUPERFUND INNOVATIVE TECHNOLOGIES EVALUATION ...

    EPA Pesticide Factsheets

This task seeks to identify high-priority needs of the Regions and Program Offices for innovative field sampling, characterization, monitoring, and measurement technologies. When an appropriate solution to a specific problem is identified, a field demonstration is conducted to document the performance and cost of the proposed technologies. The use of field analysis almost always provides a savings in time and cost over the usual approach to site characterization and monitoring of shipping samples to a conventional laboratory for analysis. With improvements in technology and appropriate quality assurance/quality control, field analysis has been shown to provide high-quality data, useful for most environmental monitoring or characterization projects. An emphasis of the program is to seek out innovative solutions to existing problems and to provide the cost and performance data a user would require to make an informed decision regarding the adequacy of a technology to address a specific environmental problem. The objective of this program is to promote the acceptance and use of innovative field technologies by providing well-documented performance and cost data obtained from field demonstrations.

  20. First evidence of tyre debris characterization at the nanoscale by focused ion beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milani, M.; Pucillo, F.P.; Ballerini, M.

    2004-07-15

In this paper, we present a novel technique for the nanoscale characterization of the outer and inner structure of tyre debris. Tyre debris is produced by the normal wear of tyres. In previous studies, the microcharacterization and identification were performed by analytical electron microscopy. This study is a development of the characterization of the surface and microstructure of tyre debris. For the first time, tyre debris was analysed by focused ion beam (FIB), a technique with 2- to 5-nm resolution that does not require any sample preparation. We studied tyre debris produced in the laboratory. We performed electron and ion imaging of the surface of the material and, after an ionic cut, studied the internal microstructure of the same sample. The tyre debris was analysed by FIB without any sample preparation, unlike the case of scanning and transmission electron microscopy (SEM and TEM). Useful information was derived to improve detection and monitoring techniques of pollution by tyre degradation processes.

  1. Characterizing Space Environments with Long-Term Space Plasma Archive Resources

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.

    2009-01-01

A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access the archives afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on environment variability and extremes that characterize the mean and worst-case environments that may be encountered during a mission. In addition, analysis of large data sets is important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits to environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.

  2. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
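
    As a hedged illustration of the probabilistic treatment of interindividual variability described above, the following one-dimensional Monte Carlo sketch propagates assumed distributions through a simple intake-rate dose equation; the distributions and parameter values are invented for illustration and are not from the paper.

    ```python
    import numpy as np

    # Hypothetical one-dimensional Monte Carlo exposure model:
    # average daily dose = concentration * intake rate / body weight.
    # All distributions and parameters are invented for illustration.
    rng = np.random.default_rng(1)
    n = 100_000
    conc = rng.lognormal(np.log(0.5), 0.6, n)    # mg/L, interindividual variability
    intake = rng.normal(2.0, 0.4, n).clip(0.5)   # L/day of water ingested
    bw = rng.normal(70.0, 12.0, n).clip(30.0)    # kg body weight

    dose = conc * intake / bw                    # mg/kg-day
    print(f"median dose : {np.median(dose):.4f} mg/kg-day")
    print(f"95th pct    : {np.percentile(dose, 95):.4f} mg/kg-day")
    ```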

  3. An Introduction to Using Surface Geophysics to Characterize Sand and Gravel Deposits

    USGS Publications Warehouse

    Lucius, Jeffrey E.; Langer, William H.; Ellefsen, Karl J.

    2006-01-01

    This report is an introduction to surface geophysical techniques that aggregate producers can use to characterize known deposits of sand and gravel. Five well-established and well-tested geophysical methods are presented: seismic refraction and reflection, resistivity, ground penetrating radar, time-domain electromagnetism, and frequency-domain electromagnetism. Depending on site conditions and the selected method(s), geophysical surveys can provide information concerning areal extent and thickness of the deposit, thickness of overburden, depth to the water table, critical geologic contacts, and location and correlation of geologic features. In addition, geophysical surveys can be conducted prior to intensive drilling to help locate auger or drill holes, reduce the number of drill holes required, calculate stripping ratios to help manage mining costs, and provide continuity between sampling sites to upgrade the confidence of reserve calculations from probable reserves to proved reserves. Perhaps the greatest value of geophysics to aggregate producers may be the speed of data acquisition, reduced overall costs, and improved subsurface characterization.
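
    The stripping-ratio calculation mentioned above is simple arithmetic on the thickness estimates a survey provides; a minimal sketch, assuming the common definition of overburden (waste) thickness over deposit thickness:

    ```python
    # Minimal sketch: stripping ratio from survey-derived thicknesses, assuming
    # the ratio is defined as overburden (waste) thickness over deposit thickness.
    def stripping_ratio(overburden_m: float, deposit_m: float) -> float:
        """Waste-to-ore ratio for a flat-lying sand and gravel deposit."""
        if deposit_m <= 0:
            raise ValueError("deposit thickness must be positive")
        return overburden_m / deposit_m

    # Example: 3 m of overburden over a 12 m thick deposit -> 0.25 (1:4).
    print(stripping_ratio(3.0, 12.0))
    ```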

  4. An Introduction to Using Surface Geophysics to Characterize Sand and Gravel Deposits

    USGS Publications Warehouse

    Lucius, Jeffrey E.; Langer, William H.; Ellefsen, Karl J.

    2007-01-01

    This report is an introduction to surface geophysical techniques that aggregate producers can use to characterize known deposits of sand and gravel. Five well-established and well-tested geophysical methods are presented: seismic refraction and reflection, resistivity, ground penetrating radar, time-domain electromagnetism, and frequency-domain electromagnetism. Depending on site conditions and the selected method(s), geophysical surveys can provide information concerning areal extent and thickness of the deposit, thickness of overburden, depth to the water table, critical geologic contacts, and location and correlation of geologic features. In addition, geophysical surveys can be conducted prior to intensive drilling to help locate auger or drill holes, reduce the number of drill holes required, calculate stripping ratios to help manage mining costs, and provide continuity between sampling sites to upgrade the confidence of reserve calculations from probable reserves to proved reserves. Perhaps the greatest value of geophysics to aggregate producers may be the speed of data acquisition, reduced overall costs, and improved subsurface characterization.

  5. Characterizing entanglement of an artificial atom and a cavity cat state with Bell's inequality

    PubMed Central

    Vlastakis, Brian; Petrenko, Andrei; Ofek, Nissim; Sun, Luyan; Leghtas, Zaki; Sliwa, Katrina; Liu, Yehan; Hatridge, Michael; Blumoff, Jacob; Frunzio, Luigi; Mirrahimi, Mazyar; Jiang, Liang; Devoret, M. H.; Schoelkopf, R. J.

    2015-01-01

    The Schrödinger's cat thought experiment highlights the counterintuitive concept of entanglement in macroscopically distinguishable systems. The hallmark of entanglement is the detection of strong correlations between systems, most starkly demonstrated by the violation of a Bell inequality. No violation of a Bell inequality has been observed for a system entangled with a superposition of coherent states, known as a cat state. Here we use the Clauser–Horne–Shimony–Holt formulation of a Bell test to characterize entanglement between an artificial atom and a cat state, or a Bell-cat. Using superconducting circuits with high-fidelity measurements and real-time feedback, we detect correlations that surpass the classical maximum of the Bell inequality. We investigate the influence of decoherence with states up to 16 photons in size and characterize the system by introducing joint Wigner tomography. Such techniques demonstrate that information stored in superpositions of coherent states can be extracted efficiently, a crucial requirement for quantum computing with resonators. PMID:26611724

  6. Characterizing entanglement of an artificial atom and a cavity cat state with Bell's inequality.

    PubMed

    Vlastakis, Brian; Petrenko, Andrei; Ofek, Nissim; Sun, Luyan; Leghtas, Zaki; Sliwa, Katrina; Liu, Yehan; Hatridge, Michael; Blumoff, Jacob; Frunzio, Luigi; Mirrahimi, Mazyar; Jiang, Liang; Devoret, M H; Schoelkopf, R J

    2015-11-27

    The Schrödinger's cat thought experiment highlights the counterintuitive concept of entanglement in macroscopically distinguishable systems. The hallmark of entanglement is the detection of strong correlations between systems, most starkly demonstrated by the violation of a Bell inequality. No violation of a Bell inequality has been observed for a system entangled with a superposition of coherent states, known as a cat state. Here we use the Clauser-Horne-Shimony-Holt formulation of a Bell test to characterize entanglement between an artificial atom and a cat state, or a Bell-cat. Using superconducting circuits with high-fidelity measurements and real-time feedback, we detect correlations that surpass the classical maximum of the Bell inequality. We investigate the influence of decoherence with states up to 16 photons in size and characterize the system by introducing joint Wigner tomography. Such techniques demonstrate that information stored in superpositions of coherent states can be extracted efficiently, a crucial requirement for quantum computing with resonators.
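
    For readers unfamiliar with the bound being violated, a short textbook CHSH computation is sketched below; it uses the ideal two-qubit singlet correlation E(a, b) = -cos(a - b) with the standard angle choices, not the cat-state formulation of the paper.

    ```python
    import numpy as np

    # Illustrative CHSH calculation (textbook singlet form, an assumption here,
    # not the paper's cat-state variant). Classically |S| <= 2.
    def E(a: float, b: float) -> float:
        return -np.cos(a - b)  # singlet correlation at analyzer angles a, b

    a, a2 = 0.0, np.pi / 2             # Alice's two settings
    b, b2 = np.pi / 4, 3 * np.pi / 4   # Bob's two settings

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(f"|S| = {abs(S):.4f} (classical bound 2, quantum max {2*np.sqrt(2):.4f})")
    ```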

  7. Using resources for scientific-driven pharmacovigilance: from many product safety documents to one product safety master file.

    PubMed

    Furlan, Giovanni

    2012-08-01

    Current regulations require a description of the overall safety profile or the specific risks of a drug in multiple documents such as the Periodic and Development Safety Update Reports, Risk Management Plans (RMPs) and Signal Detection Reports. In a resource-constrained world, the need for preparing multiple documents reporting the same information results in shifting the focus from a thorough scientific and medical evaluation of the available data to maintaining compliance with regulatory timelines. Since the aim of drug safety is to understand and characterize product issues to take adequate risk minimization measures rather than to comply with bureaucratic requirements, there is the need to avoid redundancy. In order to identify core drug safety activities that need to be undertaken to protect patient safety and reduce the number of documents reporting the results of these activities, the author has reviewed the main topics included in the drug safety guidelines and templates. The topics and sources that need to be taken into account in the main regulatory documents have been found to greatly overlap and, in the future, as a result of the new Periodic Safety Update Report structure and requirements, in the author's opinion this overlap is likely to further increase. Many of the identified inter-document differences seemed to be substantially formal. The Development Safety Update Report, for example, requires separate presentation of the safety issues emerging from different sources followed by an overall evaluation of each safety issue. The RMP, instead, requires a detailed description of the safety issues without separate presentation of the evidence derived from each source. To some extent, however, the individual documents require an in-depth analysis of different aspects; the RMP, for example, requires an epidemiological description of the indication for which the drug is used and its risks. At the time of writing this article, this is not specifically required by other documents. The author has identified signal detection (intended not only as adverse event disproportionate reporting, but including non-clinical, laboratory, clinical analysis data and literature screening) and characterization as the basis for the preparation of all drug safety documents, which can be viewed as different ways of presenting the results of this activity. Therefore, the author proposes to merge all the aggregate reports required by current regulations into a single document - the Drug Safety Master File. This report should contain all the available information, from any source, regarding the potential and identified risks of a drug. It should be a living document updated and submitted to regulatory authorities on an ongoing basis.

  8. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.

    2015-12-03

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid’s security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there does exist a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of this interest in the context of grid modernization activities. The availability of a grid HILF risk model, integrated across multi-hazard domains which, when interrogated, can support transparent, defensible and effective decisions, is an attractive prospect among these communities. In this report, we document an integrated HILF risk framework intended to inform the development of risk models. These models would be based on the systematic and comprehensive (to within scope) characterization of hazards to the level of detail required for modeling risk, identification of the stressors associated with the hazards (i.e., the means of impacting grid and supporting infrastructure), characterization of the vulnerability of assets to these stressors and the probabilities of asset compromise, the grid’s dynamic response to the asset failures, and assessment of subsequent severities of consequence with respect to selected impact metrics, such as power outage duration and geographic reach. Specifically, the current framework is being developed to: (1) provide the conceptual and overarching technical paradigms for the development of risk models; (2) identify the classes of models required to implement the framework, providing examples of existing models and also identifying where modeling gaps exist; (3) identify the types of data required, addressing circumstances under which data are sparse and the formal elicitation of informed judgment might be required; and (4) identify means by which the resultant risk models might be interrogated to form the necessary basis for risk management.
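
    The hazard-to-consequence chain described above can be illustrated with a toy Monte Carlo model; every rate, fragility, and duration below is an invented assumption, and the sketch is far simpler than the framework the report develops.

    ```python
    import numpy as np

    # Toy HILF risk sketch: rare hazard -> asset failures -> outage consequence.
    # The event rate, fleet size, fragility, and durations are all assumptions.
    rng = np.random.default_rng(2)
    trials = 200_000
    hazard_rate = 0.01      # expected severe events per year (assumed)
    n_assets = 50           # assets exposed to the hazard (assumed)
    fragility = 0.3         # P(asset fails | event occurs) (assumed)

    events = rng.poisson(hazard_rate, trials)            # events in one simulated year
    failures = rng.binomial(n_assets, fragility, trials) * (events > 0)
    outage = failures * rng.lognormal(np.log(5), 0.5, trials)  # asset-days of outage

    print(f"P(any event)       : {(events > 0).mean():.4%}")
    print(f"mean annual outage : {outage.mean():.3f} asset-days")
    print(f"99.9th percentile  : {np.percentile(outage, 99.9):.1f} asset-days")
    ```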

  9. A sampling procedure to guide the collection of narrow-band, high-resolution spatially and spectrally representative reflectance data. [satellite imagery of earth resources

    NASA Technical Reports Server (NTRS)

    Brand, R. R.; Barker, J. L.

    1983-01-01

    A multistage sampling procedure using image processing, geographical information systems, and analytical photogrammetry is presented which can be used to guide the collection of representative, high-resolution spectra and discrete reflectance targets for future satellite sensors. The procedure is general and can be adapted to characterize areas as small as minor watersheds and as large as multistate regions. Beginning with a user-determined study area, successive reductions in size and spectral variation are performed using image analysis techniques on data from the Multispectral Scanner, orbital and simulated Thematic Mapper, low altitude photography synchronized with the simulator, and associated digital data. An integrated image-based geographical information system supports processing requirements.

  10. MMM: A toolbox for integrative structure modeling.

    PubMed

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM integrates not only various types of restraints but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  11. Remedial Investigation Report on Bear Creek Valley Operable Unit 2 (Rust Spoil Area, Spoil Area 1, and SY-200 Yard) at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee. Volume 1, Main text

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-01-01

    This report on Bear Creek Valley (BCV) Operable Unit (OU) 2 at the Y-12 Plant was prepared in accordance with requirements under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) for reporting the results of a site characterization for public review. It provides the Environmental Restoration Program with information about the results of the 1993 investigation. It includes information on risk assessments that have evaluated impacts to human health and the environment. Field activities included collection of subsurface soil samples, groundwater and surface water samples, and sediment and seep samples at the Rust Spoil Area (RSA), SY-200 Yard, and Spoil Area 1 (SA-1).

  12. Characterisation of particle assemblies by 3D cross-correlation light scattering and diffusing wave spectroscopy

    NASA Astrophysics Data System (ADS)

    Scheffold, Frank

    2014-08-01

    To characterize the structural and dynamic properties of soft materials and small particles, information on the relevant mesoscopic length scales is required. Such information is often obtained from traditional static and dynamic light scattering (SLS/DLS) experiments in the single scattering regime. In many dense systems, however, these powerful techniques frequently fail due to strong multiple scattering of light. Here I will discuss some experimental innovations that have emerged over the last decade. New methods such as 3D static and dynamic light scattering (3D LS) as well as diffusing wave spectroscopy (DWS) can cover a much wider range of experimental parameters, from dilute polymer solutions and colloidal suspensions to extremely opaque viscoelastic emulsions.
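
    In the single-scattering regime these methods reduce to familiar textbook relations; the sketch below evaluates the intensity autocorrelation expected for Brownian spheres via the Siegert relation and Stokes-Einstein diffusion. The particle size, laser wavelength, and coherence factor are assumptions for illustration, and this is not a model of the 3D cross-correlation instrument itself.

    ```python
    import numpy as np

    # Single-scattering DLS sketch using assumed textbook relations:
    # Siegert relation g2(tau) = 1 + beta*|g1(tau)|^2, with
    # g1(tau) = exp(-D q^2 tau) for Brownian spheres and Stokes-Einstein D.
    kB, T, eta = 1.380649e-23, 298.15, 0.00089          # J/K, K, Pa*s (water)
    radius = 100e-9                                     # particle radius, m (assumed)
    wavelength, n_med, theta = 633e-9, 1.33, np.pi / 2  # laser, medium index, 90 deg

    D = kB * T / (6 * np.pi * eta * radius)             # diffusion coefficient, m^2/s
    q = 4 * np.pi * n_med * np.sin(theta / 2) / wavelength
    tau = np.logspace(-6, 0, 200)                       # lag times, s
    g2 = 1 + 0.8 * np.exp(-2 * D * q**2 * tau)          # coherence factor beta = 0.8

    print(f"D = {D:.3e} m^2/s, decay rate D*q^2 = {D * q**2:.1f} 1/s")
    ```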

  13. Site Safety and Health Plan (Phase 3) for the treatability study for in situ vitrification at Seepage Pit 1 in Waste Area Grouping 7, Oak Ridge National Laboratory, Oak Ridge, TN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spalding, B.P.; Naney, M.T.

    1995-06-01

    This plan is to be implemented for Phase III ISV operations and post-operations sampling. Two previous project phases involving site characterization have been completed and required their own site-specific health and safety plans. Project activities will take place at Seepage Pit 1 in Waste Area Grouping 7 at ORNL, Oak Ridge, Tennessee. The purpose of this document is to establish standard health and safety procedures for ORNL project personnel and contractor employees in performance of this work. Site activities shall be performed in accordance with Energy Systems safety and health policies and procedures, DOE orders, Occupational Safety and Health Administration Standards 29 CFR Part 1910 and 1926; applicable United States Environmental Protection Agency requirements; and consensus standards. Where the word "shall" is used, the provisions of this plan are mandatory. Specific requirements of regulations and orders have been incorporated into this plan in accordance with applicability. Included from 29 CFR are 1910.120, Hazardous Waste Operations and Emergency Response; 1910.146, Permit Required - Confined Space; 1910.1200, Hazard Communication; DOE Orders requirements of 5480.4, Environmental Protection, Safety and Health Protection Standards; 5480.11, Radiation Protection; and N5480.6, Radiological Control Manual. In addition, guidance and policy will be followed as described in the Environmental Restoration Program Health and Safety Plan. The levels of personal protection and the procedures specified in this plan are based on the best information available from reference documents and site characterization data. Therefore, these recommendations represent the minimum health and safety requirements to be observed by all personnel engaged in this project.

  14. High resolution Physio-chemical Tissue Analysis: Towards Non-invasive In Vivo Biopsy

    NASA Astrophysics Data System (ADS)

    Xu, Guan; Meng, Zhuo-Xian; Lin, Jian-Die; Deng, Cheri X.; Carson, Paul L.; Fowlkes, J. Brian; Tao, Chao; Liu, Xiaojun; Wang, Xueding

    2016-02-01

    Conventional gold standard histopathologic diagnosis requires information on both high-resolution structural and chemical changes in tissue. Providing optical information at ultrasonic resolution, the photoacoustic (PA) technique could provide highly sensitive and highly accurate tissue characterization noninvasively in the authentic in vivo environment, offering a replacement for histopathology. A two-dimensional (2D) physio-chemical spectrogram (PCS) combining micrometer to centimeter morphology and chemical composition simultaneously can be generated for each biological sample with PA measurements at multiple optical wavelengths. This spectrogram presents a unique 2D “physio-chemical signature” for any specific type of tissue. Comprehensive analysis of PCS, termed PA physio-chemical analysis (PAPCA), can lead to very rich diagnostic information, including the contents of all relevant molecular and chemical components along with their corresponding histological microfeatures, comparable to those accessible by conventional histology. PAPCA could contribute to the diagnosis of many diseases involving diffusive patterns such as fatty liver.

  15. An Informational Algorithm as the Basis for Perception-Action Control of the Instantaneous Axes of the Knee

    PubMed Central

    Kim, Wangdo; Espanha, Margarida M.; Veloso, António P.; Araújo, Duarte; João, Filipa; Carrão, Luis; Kohles, Sean S.

    2013-01-01

    Traditional locomotion studies emphasize an optimization of the desired movement trajectories while ignoring sensory feedback. We propose an information based theory that locomotion is neither triggered nor commanded but controlled. The basis for this control is the information derived from perceiving oneself in the world. Control therefore lies in the human-environment system. In order to test this hypothesis, we derived a mathematical foundation characterizing the energy that is required to perform a rotational twist, with small amplitude, of the instantaneous axes of the knee (IAK). We have found that the joint’s perception of the ground reaction force may be replaced by the co-perception of muscle activation with appropriate intensities. This approach generated an accurate comparison with known joint forces and appears appropriate in so far as predicting the effect on the knee when it is free to twist about the IAK. PMID:24932433

  16. Classification-free threat detection based on material-science-informed clustering

    NASA Astrophysics Data System (ADS)

    Yuan, Siyang; Wolter, Scott D.; Greenberg, Joel A.

    2017-05-01

    X-ray diffraction (XRD) is well-known for yielding composition and structural information about a material. However, in some applications (such as threat detection in aviation security), the properties of a material are more relevant to the task than is a detailed material characterization. Furthermore, the requirement that one first identify a material before determining its class may be difficult or even impossible for a sufficiently large pool of potentially present materials. We therefore seek to learn relevant composition-structure-property relationships between materials to enable material-identification-free classification. We use an expert-informed, data-driven approach operating on a library of XRD spectra from a broad array of stream of commerce materials. We investigate unsupervised learning techniques in order to learn about naturally emergent groupings, and apply supervised learning techniques to determine how well XRD features can be used to separate user-specified classes in the presence of different types and degrees of signal degradation.
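
    One way to realize the naturally emergent groupings described above is simple unsupervised clustering of normalized diffraction patterns; the sketch below runs k-means on synthetic two-class spectra, with all peak positions and noise levels invented. The paper's own expert-informed pipeline is more elaborate than this.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import normalize

    # Illustrative unsupervised grouping of XRD spectra (synthetic data):
    # normalize each spectrum, then cluster; labels can be compared to
    # material classes rather than individual material identities.
    rng = np.random.default_rng(3)
    angles = np.linspace(5, 60, 500)  # 2-theta grid, degrees

    def peak(center, width=0.5):
        return np.exp(-((angles - center) / width) ** 2)

    spectra = np.vstack(
        [peak(20) + peak(35) + 0.05 * rng.normal(size=500) for _ in range(30)] +
        [peak(27) + peak(44) + 0.05 * rng.normal(size=500) for _ in range(30)]
    )
    X = normalize(spectra)  # unit-norm each spectrum
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(labels)
    ```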

  17. Coding principles of the canonical cortical microcircuit in the avian brain

    PubMed Central

    Calabrese, Ana; Woolley, Sarah M. N.

    2015-01-01

    Mammalian neocortex is characterized by a layered architecture and a common or “canonical” microcircuit governing information flow among layers. This microcircuit is thought to underlie the computations required for complex behavior. Despite the absence of a six-layered cortex, birds are capable of complex cognition and behavior. In addition, the avian auditory pallium is composed of adjacent information-processing regions with genetically identified neuron types and projections among regions comparable with those found in the neocortex. Here, we show that the avian auditory pallium exhibits the same information-processing principles that define the canonical cortical microcircuit, long thought to have evolved only in mammals. These results suggest that the canonical cortical microcircuit evolved in a common ancestor of mammals and birds and provide a physiological explanation for the evolution of neural processes that give rise to complex behavior in the absence of cortical lamination. PMID:25691736

  18. Generalized optical angular momentum sorter and its application to high-dimensional quantum cryptography.

    PubMed

    Larocque, Hugo; Gagnon-Bischoff, Jérémie; Mortimer, Dominic; Zhang, Yingwen; Bouchard, Frédéric; Upham, Jeremy; Grillo, Vincenzo; Boyd, Robert W; Karimi, Ebrahim

    2017-08-21

    The orbital angular momentum (OAM) carried by optical beams is a useful quantity for encoding information. This form of encoding has been incorporated into various works ranging from telecommunications to quantum cryptography, most of which require methods that can rapidly process the OAM content of a beam. Among current state-of-the-art schemes that can readily acquire this information are so-called OAM sorters, which consist of devices that spatially separate the OAM components of a beam. Such devices have found numerous applications in optical communications, a field that is in constant demand for additional degrees of freedom, such as polarization and wavelength, into which information can also be encoded. Here, we report the implementation of a device capable of sorting a beam based on its OAM and polarization content, which could be of use in works employing both of these degrees of freedom as information channels. After characterizing our fabricated device, we demonstrate how it can be used for quantum communications via a quantum key distribution protocol.

  19. A decision science approach for integrating social science in climate and energy solutions

    NASA Astrophysics Data System (ADS)

    Wong-Parodi, Gabrielle; Krishnamurti, Tamar; Davis, Alex; Schwartz, Daniel; Fischhoff, Baruch

    2016-06-01

    The social and behavioural sciences are critical for informing climate- and energy-related policies. We describe a decision science approach to applying those sciences. It has three stages: formal analysis of decisions, characterizing how well-informed actors should view them; descriptive research, examining how people actually behave in such circumstances; and interventions, informed by formal analysis and descriptive research, designed to create attractive options and help decision-makers choose among them. Each stage requires collaboration with technical experts (for example, climate scientists, geologists, power systems engineers and regulatory analysts), as well as continuing engagement with decision-makers. We illustrate the approach with examples from our own research in three domains related to mitigating climate change or adapting to its effects: preparing for sea-level rise, adopting smart grid technologies in homes, and investing in energy efficiency for office buildings. The decision science approach can facilitate creating climate- and energy-related policies that are behaviourally informed, realistic and respectful of the people whom they seek to aid.

  20. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  1. High-fidelity operations in microfabricated surface ion traps

    NASA Astrophysics Data System (ADS)

    Maunz, Peter

    2017-04-01

    Trapped ion systems can be used to implement quantum computation as well as quantum simulation. To scale these systems to the number of qubits required to solve interesting problems in quantum chemistry or solid state physics, the use of large multi-zone ion traps has been proposed. Microfabrication enables the realization of surface electrode ion traps with complex electrode structures. While these traps may enable the scaling of trapped ion quantum information processing (QIP), microfabricated ion traps also pose several technical challenges. Here, we present Sandia's trap fabrication capabilities and characterize trap properties and shuttling operations in our most recent high optical access trap (HOA-2). To demonstrate the viability of Sandia's microfabricated ion traps for QIP we realize robust single and two-qubit gates and characterize them using gate set tomography (GST). In this way we are able to demonstrate the first single-qubit gates with a diamond norm of less than 1.7 × 10^-4, below a rigorous fault-tolerance threshold for general noise of 6.7 × 10^-4. Furthermore, we realize Mølmer-Sørensen two-qubit gates with a process fidelity of 99.58(6)%, also characterized by GST. These results demonstrate the viability of microfabricated surface traps for state-of-the-art quantum information processing demonstrations. This research was funded, in part, by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA).

  2. An International Survey of Brain Banking Operation and Characterization Practices

    PubMed Central

    Palmer-Aronsten, Beatrix; McCrossin, Toni; Kril, Jillian

    2016-01-01

    Brain banks continue to make a major contribution to the study of neurological and psychiatric disorders. The current complexity and scope of research heighten the need for well-characterized cases and the demand for larger cohorts and necessitate strategies, such as the establishment of bank networks based in regional areas. While individual brain banks have developed protocols that meet researchers' needs within the confines of resources and funding, to further promote collaboration, standardization, and scientific validity, an understanding of the current protocols of participating banks is required. A survey was sent to brain banks, identified by an Internet search, to investigate operational protocols, case characterization, cohort management, data collection, standardization, and degree of collaboration between banks. The majority of the 24 banks that returned the survey have been established for more than 20 years, and most are affiliated with a regional network. While prospective donor programs were the primary source of donation, the data collected on donors varied. Longitudinal information assists case characterization and enhances the analysis capabilities of research. However, acquiring this information depended on the availability of qualified staff. Respondents indicated a high level of importance for standardization, but only 8 of 24 considered this occurred between banks. Standard diagnostic criteria were not achieved in the classification of controls, and some banks relied on the researcher to indicate the criteria for classification of controls. Although the capacity to collaborate with other banks was indicated by 16 of 24 banks, this occurred infrequently. Engagement of all brain banks to participate toward a consensus of diagnostic tools, especially for controls, will strengthen collaboration. PMID:27399803

  3. An International Survey of Brain Banking Operation and Characterization Practices.

    PubMed

    Palmer-Aronsten, Beatrix; Sheedy, Donna; McCrossin, Toni; Kril, Jillian

    2016-12-01

    Brain banks continue to make a major contribution to the study of neurological and psychiatric disorders. The current complexity and scope of research heighten the need for well-characterized cases and the demand for larger cohorts and necessitate strategies, such as the establishment of bank networks based in regional areas. While individual brain banks have developed protocols that meet researchers' needs within the confines of resources and funding, to further promote collaboration, standardization, and scientific validity, an understanding of the current protocols of participating banks is required. A survey was sent to brain banks, identified by an Internet search, to investigate operational protocols, case characterization, cohort management, data collection, standardization, and degree of collaboration between banks. The majority of the 24 banks that returned the survey have been established for more than 20 years, and most are affiliated with a regional network. While prospective donor programs were the primary source of donation, the data collected on donors varied. Longitudinal information assists case characterization and enhances the analysis capabilities of research. However, acquiring this information depended on the availability of qualified staff. Respondents indicated a high level of importance for standardization, but only 8 of 24 considered this occurred between banks. Standard diagnostic criteria were not achieved in the classification of controls, and some banks relied on the researcher to indicate the criteria for classification of controls. Although the capacity to collaborate with other banks was indicated by 16 of 24 banks, this occurred infrequently. Engagement of all brain banks to participate toward a consensus of diagnostic tools, especially for controls, will strengthen collaboration.

  4. A Multidisciplinary Paradigm and Approach to Protecting Human Health and the Environment, Society, and Stakeholders at Nuclear Facilities - 12244

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, Joanna; Environmental and Occupational Health Sciences Institute, Piscataway, NJ; Gochfeld, Michael

    2012-07-01

    As the Department of Energy (DOE) continues to remediate its lands, and to consider moving toward long-term stewardship and the development of energy parks on its industrial, remediated land, it is essential to adequately characterize the environment around such facilities to protect society, human health, and the environment. While DOE sites are considering several different land-use scenarios, all of them require adequate protection of the environment. Even if DOE lands are developed for energy parks sited mainly on industrialized sections that will not be remediated to residential standards, there is still the need to consider the protection of human health and the environment. We present an approach to characterization and establishment of teams that will gather the information, and integrate that information for a full range of stakeholders, from technical personnel to public policy makers and the public. Such information is needed to establish baselines, site new energy facilities in energy parks, protect existing nuclear facilities and nuclear wastes, improve the basis for emergency planning, devise suitable monitoring schemes to ensure continued protection, provide data to track local and regional response changes, and for mitigation, remediation and decommissioning planning. We suggest that there are five categories of information or data needs, including 1) geophysical, sources, fate and transport, 2) biological systems, 3) human health, 4) stakeholder and environmental justice, and 5) societal, economic, and political. These informational needs are more expansive than the traditional site characterization, but encompass a suite of physical, biological, and societal needs to protect all aspects of human health and the environment, not just physical health. We suggest a Site Committee be established that oversees technical teams for each of the major informational categories, with appropriate representation among teams and with a broad involvement of a range of governmental personnel, natural and social scientists, Native Americans, environmental justice communities, and other stakeholders. Such informational teams (and Oversight Committee) would report to a DOE-designated authority or Citizen's Advisory Board. Although designed for nuclear facilities and energy parks on DOE lands, the templates and information teams can be adapted for other hazardous facilities, such as a mercury storage facility at Oak Ridge. (authors)

  5. Scaling theory for information networks.

    PubMed

    Moses, Melanie E; Forrest, Stephanie; Davis, Alan L; Lodder, Mike A; Brown, James H

    2008-12-06

    Networks distribute energy, materials and information to the components of a variety of natural and human-engineered systems, including organisms, brains, the Internet and microprocessors. Distribution networks enable the integrated and coordinated functioning of these systems, and they also constrain their design. The similar hierarchical branching networks observed in organisms and microprocessors are striking, given that the structure of organisms has evolved via natural selection, while microprocessors are designed by engineers. Metabolic scaling theory (MST) shows that the rate at which networks deliver energy to an organism is proportional to its mass raised to the 3/4 power. We show that computational systems are also characterized by nonlinear network scaling and use MST principles to characterize how information networks scale, focusing on how MST predicts properties of clock distribution networks in microprocessors. The MST equations are modified to account for variation in the size and density of transistors and terminal wires in microprocessors. Based on the scaling of the clock distribution network, we predict a set of trade-offs and performance properties that scale with chip size and the number of transistors. However, there are systematic deviations between power requirements on microprocessors and predictions derived directly from MST. These deviations are addressed by augmenting the model to account for decentralized flow in some microprocessor networks (e.g. in logic networks). More generally, we hypothesize a set of constraints between the size, power and performance of networked information systems including transistors on chips, hosts on the Internet and neurons in the brain.
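
    The 3/4-power claim is straightforward to test on data: fit a line in log-log space and read the exponent off the slope. A minimal sketch on synthetic rates; the noise model and sample values are invented for illustration.

    ```python
    import numpy as np

    # Illustrative power-law check of the MST prediction rate ~ mass^(3/4):
    # fit the exponent by least squares in log-log space on synthetic data.
    rng = np.random.default_rng(4)
    mass = np.logspace(0, 6, 50)                     # arbitrary mass units
    rate = mass ** 0.75 * rng.lognormal(0, 0.1, 50)  # synthetic delivery rates

    slope, intercept = np.polyfit(np.log(mass), np.log(rate), 1)
    print(f"fitted exponent: {slope:.3f} (MST predicts 0.75)")
    ```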

  6. High-Resolution Protein Structure Determination by Serial Femtosecond Crystallography

    PubMed Central

    Boutet, Sébastien; Lomb, Lukas; Williams, Garth J.; Barends, Thomas R. M.; Aquila, Andrew; Doak, R. Bruce; Weierstall, Uwe; DePonte, Daniel P.; Steinbrener, Jan; Shoeman, Robert L.; Messerschmidt, Marc; Barty, Anton; White, Thomas A.; Kassemeyer, Stephan; Kirian, Richard A.; Seibert, M. Marvin; Montanez, Paul A.; Kenney, Chris; Herbst, Ryan; Hart, Philip; Pines, Jack; Haller, Gunther; Gruner, Sol M.; Philipp, Hugh T.; Tate, Mark W.; Hromalik, Marianne; Koerner, Lucas J.; van Bakel, Niels; Morse, John; Ghonsalves, Wilfred; Arnlund, David; Bogan, Michael J.; Caleman, Carl; Fromme, Raimund; Hampton, Christina Y.; Hunter, Mark S.; Johansson, Linda C.; Katona, Gergely; Kupitz, Christopher; Liang, Mengning; Martin, Andrew V.; Nass, Karol; Redecke, Lars; Stellato, Francesco; Timneanu, Nicusor; Wang, Dingjie; Zatsepin, Nadia A.; Schafer, Donald; Defever, James; Neutze, Richard; Fromme, Petra; Spence, John C. H.; Chapman, Henry N.; Schlichting, Ilme

    2013-01-01

    Structure determination of proteins and other macromolecules has historically required the growth of high-quality crystals sufficiently large to diffract x-rays efficiently while withstanding radiation damage. We applied serial femtosecond crystallography (SFX) using an x-ray free-electron laser (XFEL) to obtain high-resolution structural information from microcrystals (less than 1 micrometer by 1 micrometer by 3 micrometers) of the well-characterized model protein lysozyme. The agreement with synchrotron data demonstrates the immediate relevance of SFX for analyzing the structure of the large group of difficult-to-crystallize molecules. PMID:22653729

  7. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize these accelerations in statistical terms. Statistics of spacecraft accelerations are summarized.
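
    Exceedance statistics of this kind are typically reported as the probability that an acceleration magnitude exceeds a threshold, scaled by how often thrusters fire. The sketch below does this on synthetic magnitudes; the distribution, thresholds, and firing rate are all assumptions, not mission data.

    ```python
    import numpy as np

    # Sketch of exceedance statistics for per-firing acceleration magnitudes
    # (synthetic): empirical P(|a| > threshold) and an hourly exceedance rate.
    rng = np.random.default_rng(5)
    accel = rng.rayleigh(scale=1e-4, size=50_000)  # g-levels per firing, synthetic

    thresholds = [1e-4, 2e-4, 5e-4]
    firings_per_hour = 120                         # assumed firing rate
    for t in thresholds:
        p = (accel > t).mean()
        print(f"P(|a| > {t:.0e} g) = {p:.4f} -> ~{p * firings_per_hour:.1f} exceedances/hr")
    ```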

  8. Hanford Site Environmental Report for Calendar Year 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poston, Ted M.; Duncan, Joanne P.; Dirkes, Roger L.

    The Hanford Site environmental report is prepared annually for the U.S. Department of Energy (DOE) in accordance with regulatory requirements. The report provides an overview of activities at the Hanford Site; demonstrates the status of the site’s compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and DOE policies and directives; and summarizes environmental data that characterize Hanford Site environmental management performance. The report also highlights significant environmental and public protection programs and efforts. Some historical and early 2009 information is included where appropriate.

  9. Hanford Site Environmental Report for Calendar Year 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poston, Ted M.; Duncan, Joanne P.; Dirkes, Roger L.

    The Hanford Site environmental report is prepared annually for the U.S. Department of Energy (DOE) in accordance with regulatory requirements. The report provides an overview of activities at the Hanford Site; demonstrates the status of the site’s compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and DOE policies and directives; and summarizes environmental data that characterize Hanford Site environmental management performance. The report also highlights significant environmental and public protection programs and efforts. Some historical and early 2010 information is included where appropriate.

  10. Hanford Site Environmental Report for Calendar Year 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poston, Ted M.; Duncan, Joanne P.; Dirkes, Roger L.

    The Hanford Site environmental report is prepared annually for the U.S. Department of Energy (DOE) in accordance with regulatory requirements. The report provides an overview of activities at the site; demonstrates the status of the site’s compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and DOE policies and directives; and summarizes environmental data that characterize Hanford Site environmental management performance. The report also highlights significant environmental and public protection programs and efforts. Some historical and early 2008 information is included where appropriate.

  11. Hanford Site Environmental Report for Calendar Year 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poston, Ted M.; Duncan, Joanne P.; Dirkes, Roger L.

    The Hanford Site environmental report is prepared annually for the U.S. Department of Energy (DOE) in accordance with regulatory requirements. The report provides an overview of activities at the Hanford Site; demonstrates the status of the site's compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and DOE policies and directives; and summarizes environmental data that characterize Hanford Site environmental management performance. The report also highlights significant environmental and public protection programs and efforts. Some historical and early 2011 information is included where appropriate.

  12. Design of diversity and focused combinatorial libraries in drug discovery.

    PubMed

    Young, S Stanley; Ge, Nanxiang

    2004-05-01

    Using well-characterized chemical reactions and readily available monomers, chemists are able to create sets of compounds, termed libraries, which are useful in drug discovery processes. The design of combinatorial chemical libraries can be complex and there has been much information recently published offering suggestions on how the design process can be carried out. This review focuses on literature with the goal of organizing current thinking. At this point in time, it is clear that benchmarking of current suggested methods is required as opposed to further new methods.

  13. Applications of aerospace technology in industry: A technology transfer profile. Visual display systems

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The growth of common as well as emerging visual display technologies is surveyed. The major inference is that contemporary society is rapidly growing ever more reliant on visual display for a variety of purposes. Because of its unique mission requirements, the National Aeronautics and Space Administration has contributed in an important and specific way to the growth of visual display technology. These contributions are characterized by the use of computer-driven visual displays to provide an enormous amount of information concisely, rapidly and accurately.

  14. An attribute-driven statistics generator for use in a G.I.S. environment

    NASA Technical Reports Server (NTRS)

    Thomas, R. W.; Ritter, P. R.; Kaugars, A.

    1984-01-01

    When performing research using digital geographic information it is often useful to produce quantitative characterizations of the data, usually within some constraints. In the research environment the different combinations of required data and constraints can often become quite complex. This paper describes a technique that gives the researcher a powerful and flexible way to set up many possible combinations of data and constraints without having to perform numerous intermediate steps or create temporary data bands. This method provides an efficient way to produce descriptive statistics in such situations.
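
    The core of such an attribute-driven generator is a boolean mask built from constraint bands and applied to a target band, without writing intermediate or temporary data bands. A minimal sketch with invented bands and constraints:

    ```python
    import numpy as np

    # Attribute-constrained statistics over co-registered raster bands:
    # build a boolean mask from the constraint bands, then summarize the
    # target band under that mask. All bands and thresholds are synthetic.
    rng = np.random.default_rng(6)
    elevation = rng.uniform(0, 500, (100, 100))  # constraint band (synthetic)
    landcover = rng.integers(1, 5, (100, 100))   # constraint band (synthetic)
    ndvi = rng.uniform(-0.2, 0.9, (100, 100))    # target band (synthetic)

    mask = (elevation > 200) & (landcover == 3)  # attribute constraints
    values = ndvi[mask]
    print(f"n={values.size}  mean={values.mean():.3f}  std={values.std():.3f}")
    ```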

  15. Offroad vehicle riders in Big Cypress National Preserve: Results from a survey of permit holders

    USGS Publications Warehouse

    Farrell, T.; Kendra, A.; Roggenbuck, J.; Hall, T.; Marion, J.L.

    1999-01-01

    A survey of 800 offroad vehicle (ORV) owners at Big Cypress National Preserve, Florida, was conducted to obtain information on visitor characteristics and management preferences. This report characterizes survey results for riders of all-terrain vehicles, swamp buggies, standard 4-wheeled street vehicles, and airboats. Riders tended to feel satisfied with their ORV experiences and Preserve conditions. Riders were strongly opposed to management approaches that would restrict use or require certain behaviors. More favored were management actions to encourage low-impact use practices.

  16. Method for non-referential defect characterization using fractal encoding and active contours

    DOEpatents

    Gleason, Shaun S [Knoxville, TN; Sari-Sarraf, Hamed [Lubbock, TX

    2007-05-15

    A method for identification of anomalous structures, such as defects, includes the steps of providing a digital image and applying fractal encoding to identify a location of at least one anomalous portion of the image. The method does not require a reference image to identify the location of the anomalous portion. The method can further include the step of initializing an active contour based on the location information obtained from the fractal encoding step and deforming an active contour to enhance the boundary delineation of the anomalous portion.

  17. Characterization of transgenic mice--a comparison of protocols for welfare evaluation and phenotype characterization of mice with a suggestion on a future certificate of instruction.

    PubMed

    Jegstrup, I; Thon, R; Hansen, A K; Hoitinga, M Ritskes

    2003-01-01

    A thorough welfare evaluation performed as part of a general phenotype characterization for both transgenic and traditional mouse strains could not only contribute to the improvement of the welfare of laboratory animals, but could also be of benefit to scientists, laboratory veterinarians and the inspecting authorities. A literature review has been performed to identify and critically evaluate already existing protocols for phenotype and welfare characterization. There are several relevant schemes available, among others the SHIRPA method, the modified score sheet of Morton and Griffiths, the FRIMORFO phenotype characterization scheme and the behavioural phenotype schemes as described by Crawley. These protocols have been evaluated according to four goals: their ability (1) to reveal any special needs or problems with a transgenic strain, (2) to cover the informational needs of the purchaser/user of the strain, (3) to refine the welfare of the transgenic animal model by identifying relevant humane endpoints, (4) to prevent the duplication of animal models that have already been developed. The protocols described are useful for characterizing the phenotype and judging welfare disturbances; however, the total amount of information and the degree of detail vary considerably from one scheme to another. We present a proposal regarding the practical application of the various schemes that will secure proper treatment and the identification of humane endpoints. It is advocated that with every purchase of a particular strain, an instruction document should accompany the strain. This document needs to give detailed descriptions of the typical characteristics of the strain, as well as necessary actions concerning relevant treatment and humane endpoints. At the moment no such documents are required. The introduction of these types of documents will contribute to improvements in animal welfare as well as experimental results in laboratory animal experimentation.

  18. Enhanced and diminished visuo-spatial information processing in autism depends on stimulus complexity.

    PubMed

    Bertone, Armando; Mottron, Laurent; Jelenic, Patricia; Faubert, Jocelyn

    2005-10-01

    Visuo-perceptual processing in autism is characterized by intact or enhanced performance on static spatial tasks and inferior performance on dynamic tasks, suggesting a deficit of dorsal visual stream processing in autism. However, previous findings by Bertone et al. indicate that neuro-integrative mechanisms used to detect complex motion, rather than motion perception per se, may be impaired in autism. We present here the first demonstration of concurrent enhanced and decreased performance in autism on the same visuo-spatial static task, wherein the only factor dichotomizing performance was the neural complexity required to discriminate grating orientation. The ability of persons with autism was found to be superior for identifying the orientation of simple, luminance-defined (or first-order) gratings but inferior for complex, texture-defined (or second-order) gratings. Using a flicker contrast sensitivity task, we demonstrated that this finding is probably not due to abnormal information processing at a sub-cortical level (magnocellular and parvocellular functioning). Together, these findings are interpreted as a clear indication of altered low-level perceptual information processing in autism, and confirm that the deficits and assets observed in autistic visual perception are contingent on the complexity of the neural network required to process a given type of visual stimulus. We suggest that atypical neural connectivity, resulting in enhanced lateral inhibition, may account for both enhanced and decreased low-level information processing in autism.

  19. OGRO: The Overview of functionally characterized Genes in Rice online database.

    PubMed

    Yamamoto, Eiji; Yonemaru, Jun-Ichi; Yamamoto, Toshio; Yano, Masahiro

    2012-12-01

    The high-quality sequence information and rich bioinformatics tools available for rice have contributed to remarkable advances in functional genomics. To facilitate the application of gene function information to the study of natural variation in rice, we comprehensively searched for articles related to rice functional genomics and extracted information on functionally characterized genes. As of 31 March 2012, 702 functionally characterized genes were annotated. This number represents about 1.6% of the predicted loci in the Rice Annotation Project Database. The compiled gene information is organized to facilitate direct comparisons with quantitative trait locus (QTL) information in the Q-TARO database. Comparison of genomic locations between functionally characterized genes and the QTLs revealed that QTL clusters were often co-localized with high-density gene regions, and that the genes associated with the QTLs in these clusters were different genes, suggesting that these QTL clusters are likely to be explained by tightly linked but distinct genes. Information on the functionally characterized genes compiled during this study is now available in the Overview of functionally characterized Genes in Rice Online database (OGRO) on the Q-TARO website (http://qtaro.abr.affrc.go.jp/ogro). The database has two interfaces: a table containing gene information, and a genome viewer that allows users to compare the locations of QTLs and functionally characterized genes. OGRO on Q-TARO will facilitate a candidate-gene approach to identifying the genes responsible for QTLs. Because the QTL descriptions in Q-TARO contain information on agronomic traits, such comparisons will also facilitate the annotation of functionally characterized genes in terms of their effects on traits important for rice breeding. The increasing amount of information on rice gene function being generated from mutant panels and other types of studies will make the OGRO database even more valuable in the future.
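
    The candidate-gene comparison OGRO enables amounts to an interval-overlap query between QTL regions and gene positions on the same chromosome. A minimal sketch with invented coordinates; the real database drives this from its table and genome-viewer interfaces.

    ```python
    # Interval-overlap sketch: find characterized genes inside QTL intervals
    # on the same chromosome. All names and coordinates are invented.
    qtls = [("qtl-A", "chr1", 1_200_000, 1_900_000),
            ("qtl-B", "chr3", 4_000_000, 4_500_000)]
    genes = [("OsGene1", "chr1", 1_450_000),
             ("OsGene2", "chr1", 2_100_000),
             ("OsGene3", "chr3", 4_200_000)]

    for qname, qchr, start, end in qtls:
        hits = [g for g, gchr, pos in genes if gchr == qchr and start <= pos <= end]
        print(f"{qname}: candidate genes {hits or 'none'}")
    ```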

  20. Effective Materials Property Information Management for the 21st Century

    NASA Technical Reports Server (NTRS)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2009-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fueled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the need for consistency, quality and traceability of data, as well as control of access to sensitive information such as proprietary data. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of the economy often generates great need for sharing a single "gold source" of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with the growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data "pedigree" traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.
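
    The requirements listed above (traceability, versioning, access control) map naturally onto a record structure carrying a pedigree trail. The sketch below is a hypothetical schema invented for illustration, not any particular product's or the paper's data model; all field names are assumptions.

    ```python
    from dataclasses import dataclass, field
    from datetime import date

    # Hypothetical record illustrating pedigree, versioning, and a crude
    # access-control flag; field names are assumptions, not a published schema.
    @dataclass
    class MaterialPropertyRecord:
        material: str
        prop: str
        value: float
        units: str
        test_standard: str            # e.g. an ASTM method, for traceability
        source: str                   # lab/report forming the data pedigree
        version: int = 1
        access: str = "internal"      # access-control flag (assumed values)
        history: list = field(default_factory=list)

        def revise(self, new_value: float, reason: str) -> None:
            """Record a new value while preserving the prior version."""
            self.history.append((self.version, self.value, reason, date.today()))
            self.version += 1
            self.value = new_value

    rec = MaterialPropertyRecord("Ti-6Al-4V", "yield strength", 880.0, "MPa",
                                 "ASTM E8", "Lab A report 42")
    rec.revise(885.0, "retest with larger sample set")
    print(rec.version, rec.history)
    ```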

  1. Research report: learning styles of biomedical engineering students.

    PubMed

    Dee, Kay C; Nauman, Eric A; Livesay, Glen A; Rice, Janet

    2002-09-01

    Examining students' learning styles can yield information useful to the design of learning activities, courses, and curricula. A variety of measures have been used to characterize learning styles, but the literature contains little information specific to biomedical engineering (BMEN) students. We therefore utilized Felder's Index of Learning Styles to investigate the learning style preferences of BMEN students at Tulane University. Tulane BMEN students preferred to receive information visually (preferred by 88% of the student sample) rather than verbally, focus on sensory information (55%) instead of intuitive information, process information actively (66%) instead of reflectively, and understand information globally (59%) rather than sequentially. These preferences varied between cohorts (freshman, sophomore, etc.), and a significantly higher percentage of female students preferred active and sensing learning styles. Compared to other engineering student populations, our sample of Tulane BMEN students contained the highest percentage of students preferring the global learning style. Whether this is a general trend for all BMEN students or a trait specific to Tulane engineers requires further investigation. Regardless, this study confirms the existence of a range of learning styles among biomedical engineering students, and provides motivation for instructors to consider how well their teaching style engages multiple learning styles.

  2. Discrimination of correlated and entangling quantum channels with selective process tomography

    DOE PAGES

    Dumitrescu, Eugene; Humble, Travis S.

    2016-10-10

    The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show that selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.

  3. Technical approaches to characterizing and cleaning up iron and steel mill sites under the brownfields initiative. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    EPA has developed this guide to provide decision-makers, such as city planners, private sector developers, and others involved in redeveloping brownfields, with a better understanding of the technical issues involved in assessing and cleaning up iron and steel mill sites so they can make the most informed decisions possible. This overview of the technical process involved in assessing and cleaning up brownfields sites can assist planners in making decisions at various stages of the project. An understanding of land use and industrial processes conducted in the past at a site can help the planner to conceptualize the site and identify likely areas of contamination that may require cleanup. Numerous resources are suggested to facilitate characterization of the site and consideration of cleanup technologies.

  4. Human innate lymphoid cells.

    PubMed

    Montaldo, Elisa; Vacca, Paola; Vitale, Chiara; Moretta, Francesca; Locatelli, Franco; Mingari, Maria Cristina; Moretta, Lorenzo

    2016-11-01

    The interest in innate lymphoid cells (ILC) has rapidly grown during the last decade. ILC include distinct cell types that are collectively involved in host protection against pathogens and tumor cells and in the regulation of tissue homeostasis. Studies in mice enabled a broad characterization of ILC function and of their developmental requirements. In humans all mature ILC subsets have been characterized, and their role in the pathogenesis of certain diseases is emerging. Nonetheless, still limited information is available on human ILC development. Indeed, only the cell precursors committed toward NK cells or ILC3 have been described. Here, we review the most recent findings on human mature ILC, discussing their tissue localization and function. Moreover, we summarize the available data regarding human ILC development. Copyright © 2016 European Federation of Immunological Societies. Published by Elsevier B.V. All rights reserved.

  5. Modified social ecological model: a tool to guide the assessment of the risks and risk contexts of HIV epidemics.

    PubMed

    Baral, Stefan; Logie, Carmen H; Grosso, Ashley; Wirtz, Andrea L; Beyrer, Chris

    2013-05-17

    Social and structural factors are now well accepted as determinants of HIV vulnerabilities. These factors are representative of social, economic, organizational and political inequities. Associated with an improved understanding of multiple levels of HIV risk has been the recognition of the need to implement multi-level HIV prevention strategies. Prevention sciences research and programming aiming to decrease HIV incidence require epidemiologic studies to collect data on multiple levels of risk to inform combination HIV prevention packages. Proximal individual-level risks, such as sharing injection devices and unprotected penile-vaginal or penile-anal sex, are necessary in mediating HIV acquisition and transmission. However, higher order social and structural-level risks can facilitate or reduce HIV transmission on population levels. Data characterizing these risks are often far more actionable than data characterizing individual-level risks. We propose a modified social ecological model (MSEM) to help visualize multi-level domains of HIV infection risks and guide the development of epidemiologic HIV studies. Such a model may inform research in epidemiology and prevention sciences, particularly for key populations including men who have sex with men (MSM), people who inject drugs (PID), and sex workers. The MSEM builds on existing frameworks by examining multi-level risk contexts for HIV infection and situating individual HIV infection risks within wider network, community, and public policy contexts as well as epidemic stage. The utility of the MSEM is demonstrated with case studies of HIV risk among PID and MSM. The MSEM is a flexible model for guiding epidemiologic studies among key populations at risk for HIV in diverse sociocultural contexts. Successful HIV prevention strategies for key populations require effective integration of evidence-based biomedical, behavioral, and structural interventions. While the focus of epidemiologic studies has traditionally been on describing individual-level risk factors, the future necessitates comprehensive epidemiologic data characterizing multiple levels of HIV risk.

  6. Perceptual Decisions in the Presence of Relevant and Irrelevant Sensory Evidence

    PubMed Central

    Anders, Ursula M.; McLean, Charlotte S.; Ouyang, Bowen; Ditterich, Jochen

    2017-01-01

    Perceptual decisions in the presence of decision-irrelevant sensory information require a selection of decision-relevant sensory evidence. To characterize the mechanism that is responsible for separating decision-relevant from irrelevant sensory information we asked human subjects to make judgments about one of two simultaneously present motion components in a random dot stimulus. Subjects were able to ignore the decision-irrelevant component to a large degree, but their decisions were still influenced by the irrelevant sensory information. Computational modeling revealed that this influence was not simply the consequence of subjects forgetting at times which stimulus component they had been instructed to base their decision on. Instead, residual irrelevant information always seems to be leaking through, and the decision process is captured by a net sensory evidence signal being accumulated to a decision threshold. This net sensory evidence is a linear combination of decision-relevant and irrelevant sensory information. The selection process is therefore well-described by a strong linear gain modulation, which, in our experiment, resulted in the relevant sensory evidence having at least 10 times more impact on the decision than the irrelevant evidence. PMID:29176941
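
    As an illustration of the accumulation-to-threshold account above, the sketch below simulates a two-choice decision in which a net evidence signal, a linear combination of relevant and irrelevant sensory evidence, drifts toward a bound. The gain values (1.0 versus 0.1, giving the relevant stream ten times the impact), the noise level, and the threshold are illustrative assumptions, not the authors' fitted parameters.

    ```python
    import numpy as np

    def simulate_decision(mu_rel, mu_irr, w_rel=1.0, w_irr=0.1,
                          noise=1.0, threshold=30.0, dt=1.0,
                          max_steps=10_000, rng=None):
        """Accumulate a net evidence signal (a linear combination of relevant
        and irrelevant sensory evidence) to one of two decision thresholds.
        Gains are illustrative: relevant evidence has 10x the impact."""
        rng = rng or np.random.default_rng()
        x = 0.0
        for step in range(max_steps):
            drift = w_rel * mu_rel + w_irr * mu_irr   # net sensory evidence
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            if abs(x) >= threshold:
                return np.sign(x), step * dt          # choice and decision time
        return 0, max_steps * dt                      # no decision reached

    # Example: rightward relevant motion with opposing irrelevant motion.
    choice, rt = simulate_decision(mu_rel=+0.5, mu_irr=-0.5)
    ```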

  7. Perceptual Decisions in the Presence of Relevant and Irrelevant Sensory Evidence.

    PubMed

    Anders, Ursula M; McLean, Charlotte S; Ouyang, Bowen; Ditterich, Jochen

    2017-01-01

    Perceptual decisions in the presence of decision-irrelevant sensory information require a selection of decision-relevant sensory evidence. To characterize the mechanism that is responsible for separating decision-relevant from irrelevant sensory information we asked human subjects to make judgments about one of two simultaneously present motion components in a random dot stimulus. Subjects were able to ignore the decision-irrelevant component to a large degree, but their decisions were still influenced by the irrelevant sensory information. Computational modeling revealed that this influence was not simply the consequence of subjects forgetting at times which stimulus component they had been instructed to base their decision on. Instead, residual irrelevant information always seems to be leaking through, and the decision process is captured by a net sensory evidence signal being accumulated to a decision threshold. This net sensory evidence is a linear combination of decision-relevant and irrelevant sensory information. The selection process is therefore well-described by a strong linear gain modulation, which, in our experiment, resulted in the relevant sensory evidence having at least 10 times more impact on the decision than the irrelevant evidence.

  8. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most of the variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
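
    The value-of-information reasoning invoked above can be made concrete with a small decision-analytic example. The sketch below computes the expected value of perfect information (EVPI) for a two-state, two-action problem; the prior probabilities and utilities are hypothetical placeholders, not values from the paper.

    ```python
    import numpy as np

    # Two states of the world ("hazard present"/"absent") with prior beliefs,
    # and two actions ("restrict"/"allow"); all numbers are hypothetical.
    prior = np.array([0.2, 0.8])              # P(hazard), P(no hazard)
    utility = np.array([[ -10,  -2],          # restrict: cost under each state
                        [-100,   0]])         # allow: severe loss if hazard

    # Best action under current information: maximize expected utility.
    eu = utility @ prior
    best_now = eu.max()

    # EVPI: expected utility of acting optimally once the true state is
    # known, minus the best achievable expected utility right now.
    eu_perfect = (utility.max(axis=0) * prior).sum()
    evpi = eu_perfect - best_now
    print(f"EV now = {best_now:.2f}, perfect info = {eu_perfect:.2f}, EVPI = {evpi:.2f}")
    ```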

  9. Hot streak characterization in serpentine exhaust nozzles

    NASA Astrophysics Data System (ADS)

    Crowe, Darrell S.

    Modern aircraft of the United States Air Force face increasingly demanding cost, weight, and survivability requirements. Serpentine exhaust nozzles within an embedded engine allow a weapon system to fulfill mission survivability requirements by providing denial of direct line-of-sight into the high-temperature components of the engine. Recently, aircraft have experienced material degradation and failure along the aft deck due to extreme thermal loading. Failure has occurred in specific regions along the aft deck where concentrations of hot gas have come in contact with the surface, causing hot streaks. The prevention of these failures will be aided by the accurate prediction of hot streaks. Additionally, hot streak prediction will improve future designs by identifying areas of the nozzle and aft deck surfaces that require thermal management. To this end, the goal of this research is to observe and characterize the underlying flow physics of hot streak phenomena. The goal is accomplished by applying computational fluid dynamics to determine how hot streak phenomena are affected by changes in nozzle geometry. The present research first validates the computational methods using serpentine inlet experimental and computational studies. A design methodology is then established for creating the six serpentine exhaust nozzles investigated in this research. A grid independent solution is obtained on a nozzle using several figures of merit and the grid-convergence index method. An investigation into the application of a second-order closure turbulence model is accomplished. Simulations are performed for all serpentine nozzles at two flow conditions. The research introduces a set of characterization and performance parameters based on the temperature distribution and flow conditions at the nozzle throat and exit. Examination of the temperature distribution on the upper and lower nozzle surfaces reveals critical information concerning changes in hot streak phenomena due to changes in nozzle geometry.

  10. A critical review on the carrier dynamics in 2D layered materials investigated using THz spectroscopy

    NASA Astrophysics Data System (ADS)

    Lu, Junpeng; Liu, Hongwei

    2018-01-01

    Accurately illustrating the photocarrier dynamics and photoelectrical properties of two dimensional (2D) materials is crucial in the development of 2D material-based optoelectronic devices. Considering this requirement, terahertz (THz) spectroscopy has emerged as a befitting characterization tool to provide deep insights into the carrier dynamics and measurements of the electrical/photoelectrical conductivity of 2D materials. THz spectroscopic measurements provide information on the transient behavior of carriers with high accuracy in a nondestructive and noncontact manner. In this article, we present a comprehensive review of recent research efforts investigating the 2D materials graphene and transition metal dichalcogenides (TMDs) using THz spectroscopy. A brief introduction of THz time-domain spectroscopy (THz-TDS) and optical pump-THz probe spectroscopy (OPTP) is provided. The characterization of electron transport in graphene at the equilibrium state and its transient behavior at the non-equilibrium state is reviewed. We also review the characterization of TMDs including MoS2 and WSe2. Finally, we summarize recent reports and offer a perspective on how THz characterization can guide the design and optimization of 2D material-based optoelectronic devices.

  11. Characterizing 3D Vegetation Structure from Space: Mission Requirements

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G.; Bergen, Kathleen; Blair, James B.; Dubayah, Ralph; Houghton, Richard; Hurtt, George; Kellndorfer, Josef; Lefsky, Michael; Ranson, Jon; Saatchi, Sasan

    2012-01-01

    Human and natural forces are rapidly modifying the global distribution and structure of terrestrial ecosystems on which all of life depends, altering the global carbon cycle, affecting our climate now and for the foreseeable future, causing steep reductions in species diversity, and endangering Earth's sustainability. To understand changes and trends in terrestrial ecosystems and their functioning as carbon sources and sinks, and to characterize the impact of their changes on climate, habitat and biodiversity, new space assets are urgently needed to produce high spatial resolution global maps of the three-dimensional (3D) structure of vegetation, its biomass above ground, the carbon stored within and the implications for atmospheric greenhouse gas concentrations and climate. These needs were articulated in a 2007 National Research Council (NRC) report (NRC, 2007) recommending a new satellite mission, DESDynI, carrying an L-band Polarized Synthetic Aperture Radar (Pol-SAR) and a multi-beam lidar (light detection and ranging) operating at 1064 nm. The objectives of this paper are to articulate the importance of these new, multi-year, 3D vegetation structure and biomass measurements, to briefly review the feasibility of radar and lidar remote sensing technology to meet these requirements, to define the data products and measurement requirements, and to consider implications of mission durations. The paper addresses these objectives by synthesizing research results and other input from a broad community of terrestrial ecology, carbon cycle, and remote sensing scientists and working groups. We conclude that: (1) current global biomass and 3-D vegetation structure information is unsuitable for science, management, and policy. The only existing global datasets of biomass are approximations based on combining land cover type and representative carbon values, instead of measurements of actual biomass. Current measurement attempts based on radar and multispectral data have low explanatory power outside low biomass areas. There is no current capability for repeatable disturbance and regrowth estimates. (2) The science and policy needs for information on vegetation 3D structure can be successfully addressed by a mission capable of producing (i) a first global inventory of forest biomass with a spatial resolution of 1 km or finer and unprecedented accuracy, (ii) annual global disturbance maps at a spatial resolution of 1 ha with subsequent biomass accumulation rates at resolutions of 1 km or finer, and (iii) transects of vertical and horizontal forest structure with 30 m along-transect measurements globally at 25 m spatial resolution, essential for habitat characterization. We also show from the literature that lidar profile samples together with wall-to-wall L-band quad-pol-SAR imagery and ecosystem dynamics models can work together to satisfy these vegetation 3D structure and biomass measurement requirements. Finally, we argue that the technology readiness levels of combined pol-SAR and lidar instruments are adequate for space flight. Remaining to be worked out are the particulars of a lidar/pol-SAR mission design that is feasible and at a minimum satisfies the information and measurement requirements articulated herein.

  12. Study site characterization. Chapter 2

    Treesearch

    Chris Potter; Richard Birdsey

    2008-01-01

    This chapter is an overview of the main site characterization requirements at landscape-scale sampling locations. The overview is organized according to multiple "Site Attribute" headings that require descriptions throughout a given study site area, leading ultimately to a sufficient overall site characterization. Guidance is provided to describe the major...

  13. Revision strategies of deaf student writers.

    PubMed

    Livingston, S

    1989-03-01

    Deaf high school students at different schools shared second drafts of their own narratives via an electronic bulletin board after conferencing with their respective teachers. This article characterizes the kinds of questions teachers asked during the conferences and the kinds of revisions the students made between first and second drafts. Results indicate that teachers most often ask questions that require students to provide more information; yet these questions do not affect revision as much as questions which require students to rephrase specific language. Students typically either added or substituted words or phrases that showed both similarities to and differences from the revision patterns of inexperienced writers with normal hearing. In the majority of cases, trained readers rated the deaf students' revised drafts better than their first attempts, signifying the central role revision plays in the composition process.

  14. Potential of Breastmilk Analysis to Inform Early Events in Breast Carcinogenesis: Rationale and Considerations

    PubMed Central

    Murphy, Jeanne; Sherman, Mark E.; Browne, Eva P.; Caballero, Ana I.; Punska, Elizabeth C.; Pfeiffer, Ruth M.; Yang, Hannah P.; Lee, Maxwell; Yang, Howard; Gierach, Gretchen L.; Arcaro, Kathleen F.

    2016-01-01

    This review summarizes methods related to the study of human breastmilk in etiologic and biomarkers research. Despite the importance of reproductive factors in breast carcinogenesis, factors that act early in life are difficult to study because young women rarely require breast imaging or biopsy, and analysis of critical circulating factors (e.g. hormones) is often complicated by the requirement to accurately account for menstrual cycle date. Accordingly, novel approaches are needed to understand how events such as pregnancy, breastfeeding, weaning, and post-weaning breast remodeling influence breast cancer risk. Analysis of breastmilk offers opportunities to understand mechanisms related to carcinogenesis in the breast, and to identify risk markers that may inform efforts to identify high-risk women early in the carcinogenic process. In addition, analysis of breastmilk could have value in early detection or diagnosis of breast cancer. In this article we describe the potential for using breastmilk to characterize the microenvironment of the lactating breast with the goal of advancing research on risk assessment, prevention, and detection of breast cancer. PMID:27107568

  15. Real-Time On-Board Processing Validation of MSPI Ground Camera Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

    2010-01-01

    The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA, including its PowerPC440 processors, we have implemented a least-squares fitting algorithm that extracts intensity and polarimetric parameters in real time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
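
    The on-board least-squares fit can be sketched as a per-pixel linear inversion for Stokes parameters from modulated samples. The modulation waveforms and sample count below are assumed placeholders, not the actual MSPI channel model.

    ```python
    import numpy as np

    # Schematic per-pixel fit: modulated samples are modeled as a linear
    # combination of Stokes parameters I, Q, U with known waveforms. The
    # waveforms here are placeholders, not the real MSPI modulation.
    n = 16                                       # samples per readout (assumed)
    t = np.linspace(0.0, 1.0, n)
    A = np.column_stack([np.ones(n),             # intensity term
                         np.cos(4 * np.pi * t),  # Q modulation (assumed form)
                         np.sin(4 * np.pi * t)]) # U modulation (assumed form)

    def fit_stokes(samples):
        """Least-squares estimate of (I, Q, U) from one pixel's samples."""
        coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
        return coeffs

    true = np.array([100.0, 8.0, -3.0])
    samples = A @ true + np.random.default_rng(0).normal(0, 0.5, n)
    I, Q, U = fit_stokes(samples)
    dolp = np.hypot(Q, U) / I                    # degree of linear polarization
    ```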

  16. Users Manual for the Geospatial Stream Flow Model (GeoSFM)

    USGS Publications Warehouse

    Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James

    2008-01-01

    The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.

  17. Source modeling and inversion with near real-time GPS: a GITEWS perspective for Indonesia

    NASA Astrophysics Data System (ADS)

    Babeyko, A. Y.; Hoechner, A.; Sobolev, S. V.

    2010-07-01

    We present the GITEWS approach to source modeling for tsunami early warning in Indonesia. Near-field tsunami implies special requirements on both warning time and details of source characterization. To meet these requirements, we employ geophysical and geological information to predefine a maximum number of rupture parameters. We discretize the tsunamigenic Sunda plate interface into an ordered grid of patches (150×25) and employ the concept of Green's functions for forward and inverse rupture modeling. Rupture Generator, a forward modeling tool, additionally employs different scaling laws and slip shape functions to construct physically reasonable source models using basic seismic information only (magnitude and epicenter location). GITEWS runs a library of semi- and fully-synthetic scenarios to be extensively employed for system testing as well as for teaching and training warning center personnel. Near real-time GPS observations are a very valuable complement to the local tsunami warning system. Their inversion provides quick (within a few minutes of an event) estimation of the earthquake magnitude, rupture position and, in case of sufficient station coverage, details of the slip distribution.
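
    A minimal sketch of the Green's-function inversion step described above: station displacements are modeled as d = G s, where each column of G holds precomputed unit-slip responses, and slip s is recovered by damped least squares. Patch and station counts, noise levels, and the regularization weight are illustrative, not the GITEWS configuration.

    ```python
    import numpy as np

    # Toy setup: surface displacements d at GPS stations are d = G @ s, with
    # column j of G the (precomputed) response to unit slip on patch j.
    rng = np.random.default_rng(1)
    n_patches, n_stations = 15, 30                    # illustrative sizes
    G = rng.normal(size=(3 * n_stations, n_patches))  # 3 components per station

    s_true = np.zeros(n_patches)
    s_true[5:9] = 2.0                                 # a compact slipping region
    d_obs = G @ s_true + rng.normal(0, 0.05, 3 * n_stations)

    # Damped least squares (Tikhonov regularization stabilizes the inversion).
    lam = 0.1
    s_est = np.linalg.solve(G.T @ G + lam * np.eye(n_patches), G.T @ d_obs)
    moment_proxy = s_est.sum()                        # proxy for magnitude estimate
    ```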

  18. Nucleic Acids for Ultra-Sensitive Protein Detection

    PubMed Central

    Janssen, Kris P. F.; Knez, Karel; Spasic, Dragana; Lammertyn, Jeroen

    2013-01-01

    Major advancements in molecular biology and clinical diagnostics cannot be brought about strictly through the use of genomics-based methods. Improved methods for protein detection and proteomic screening are an absolute necessity to complement the wealth of information offered by novel, high-throughput sequencing technologies. Only then will it be possible to advance insights into clinical processes and to characterize the importance of specific protein biomarkers for disease detection or the realization of "personalized medicine". Currently, however, large-scale proteomic information is still not as easily obtained as its genomic counterpart, mainly because traditional antibody-based technologies struggle to meet the stringent sensitivity and throughput requirements, while mass-spectrometry-based methods can be burdened by significant costs. However, recent years have seen the development of new biodetection strategies linking nucleic acids with existing antibody technology or replacing antibodies with oligonucleotide recognition elements altogether. These advancements have unlocked many new strategies to lower detection limits and dramatically increase the throughput of protein detection assays. In this review, an overview of these new strategies will be given. PMID:23337338

  19. Hybrid ontology for semantic information retrieval model using keyword matching indexing system.

    PubMed

    Uthayan, K R; Mala, G S Anandha

    2015-01-01

    Ontology is the process of growth and elucidation of the concepts of an information domain common to a group of users. Incorporating ontology into information retrieval is an established method for improving the retrieval of the relevant information users require. Matching keywords against a historical or information domain is significant in recent systems for finding the best match for specific input queries. This research presents a better querying mechanism for information retrieval which integrates ontology queries with keyword search. The ontology-based query is translated into a first-order predicate logic query, which is used for routing the query to the appropriate servers. Matching algorithms are an active area of research in computer science and artificial intelligence. In text matching, it is more reliable to model semantics and to evaluate queries under conditions of semantic matching. This research develops semantic matching between input queries and information in the ontology field. The contributed algorithm is a hybrid method based on matching instances extracted from the queries and the information field. The queries and the information domain are matched semantically to discover the best match and to improve the execution process. In conclusion, the hybrid ontology in the semantic web is sufficient to retrieve the documents when compared to standard ontology.

  20. Hybrid Ontology for Semantic Information Retrieval Model Using Keyword Matching Indexing System

    PubMed Central

    Uthayan, K. R.; Anandha Mala, G. S.

    2015-01-01

    Ontology is the process of growth and elucidation of the concepts of an information domain common to a group of users. Incorporating ontology into information retrieval is an established method for improving the retrieval of the relevant information users require. Matching keywords against a historical or information domain is significant in recent systems for finding the best match for specific input queries. This research presents a better querying mechanism for information retrieval which integrates ontology queries with keyword search. The ontology-based query is translated into a first-order predicate logic query, which is used for routing the query to the appropriate servers. Matching algorithms are an active area of research in computer science and artificial intelligence. In text matching, it is more reliable to model semantics and to evaluate queries under conditions of semantic matching. This research develops semantic matching between input queries and information in the ontology field. The contributed algorithm is a hybrid method based on matching instances extracted from the queries and the information field. The queries and the information domain are matched semantically to discover the best match and to improve the execution process. In conclusion, the hybrid ontology in the semantic web is sufficient to retrieve the documents when compared to standard ontology. PMID:25922851

  1. Origin of life. Primordial genetics: Information transfer in a pre-RNA world based on self-replicating beta-sheet amyloid conformers.

    PubMed

    Maury, Carl Peter J

    2015-10-07

    The question of the origin of life on Earth can largely be reduced to the question of what was the first molecular replicator system that was able to replicate and evolve under the presumably very harsh conditions on the early Earth. It is unlikely that a functional RNA could have existed under such conditions, and it is generally assumed that some other kind of information system preceded the RNA world. Here, I present an informational molecular system that is stable, self-replicative, environmentally responsive, and evolvable under conditions characterized by high temperatures, ultraviolet and cosmic radiation. This postulated pregenetic system is based on the amyloid fold, a functionally unique polypeptide fold characterized by a cross beta-sheet structure in which the beta strands are arranged perpendicular to the fiber axis. Besides an extraordinary structural robustness, the amyloid fold possesses a unique ability to transmit information by a three-dimensional templating mechanism. In amyloidogenesis, short peptide monomers are added one by one to the growing end of the fiber. From the same monomeric subunits several structural variants of amyloid may be formed. Then, in a self-replicative mode, a specific amyloid conformer can act as a template and confer its spatially encoded information to daughter molecular entities in a repetitive way. In this process, the specific conformational information, the spatially encoded organization, is transmitted; the coding element is the steric zipper structure, and recognition occurs by amino acid side chain complementarity. The amyloid information system fulfills several basic requirements of a primordial evolvable replicator system: (i) it is stable under the presumed primitive Earth conditions, (ii) the monomeric building blocks of the informational polymer can be formed from available prebiotic compounds, (iii) the system is self-assembling and self-replicative and (iv) it is adaptive to changes in the environment and evolvable. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  2. Characterization of available automated external defibrillators in the market based on the product manuals in 2014

    PubMed Central

    Ho, Chik Leung; Cheng, Ka Wai; Ma, Tze Hang; Wong, Yau Hang; Cheng, Ka Lok; Kam, Chak Wah

    2016-01-01

    BACKGROUND: To popularize the widespread use of automated external defibrillators (AEDs) to save lives in sudden cardiac arrest, we compared the strengths and weaknesses of different types of AEDs to enable a sound selection based on regional requirements. METHODS: This was a retrospective descriptive study. Different types of AEDs were compared according to the information in the manuals and brochures provided by the manufacturers. Fifteen types of AEDs were divided into 3 groups: basic, intermediate and advanced. RESULTS: Lifeline™ AUTO AED had the best performance in price, portability and user-friendliness among AEDs of the basic level. It required less time for shock charging. The Samaritan PAD defibrillator was superior in price, portability, durability and features among AEDs of the intermediate level. It had the longest warranty and the highest protection against water and dust. Lifeline™ PRO AED had the best performance in most of the criteria among AEDs of the advanced level and offered a CPR video and a manual mode for laypersons and clinicians, respectively. CONCLUSION: Lifeline™ AUTO AED, the Samaritan PAD defibrillator and Lifeline™ PRO AED are superior among AEDs of the basic, intermediate and advanced levels, respectively. A feasible AED may be chosen by users according to regional requirements and current information about the best available products. PMID:27313810

  3. Dynamic speech representations in the human temporal lobe.

    PubMed

    Leonard, Matthew K; Chang, Edward F

    2014-09-01

    Speech perception requires rapid integration of acoustic input with context-dependent knowledge. Recent methodological advances have allowed researchers to identify underlying information representations in primary and secondary auditory cortex and to examine how context modulates these representations. We review recent studies that focus on contextual modulations of neural activity in the superior temporal gyrus (STG), a major hub for spectrotemporal encoding. Recent findings suggest a highly interactive flow of information processing through the auditory ventral stream, including influences of higher-level linguistic and metalinguistic knowledge, even within individual areas. Such mechanisms may give rise to more abstract representations, such as those for words. We discuss the importance of characterizing representations of context-dependent and dynamic patterns of neural activity in the approach to speech perception research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Heterogeneous propellant internal ballistics: criticism and regeneration

    NASA Astrophysics Data System (ADS)

    Glick, R. L.

    2011-10-01

    Although heterogeneous propellant, with its innately nondeterministic, chemically discrete morphology, dominates applications, ballistic characterization by deterministic time-mean burning rate and acoustic admittance measures lacks explicit nondeterministic information and presumes homogeneous propellant with a smooth, uniformly regressing burning surface: inadequate boundary conditions for heterogeneous-propellant-grained applications. The past age overcame this dichotomy with one-dimensional (1D) models and empirical knowledge from numerous, adequately supported motor developments and supplementary experiments. However, current cost and risk constraints inhibit this approach. Moreover, its fundamental science approach is more sensitive to incomplete boundary condition information (garbage-in still equals garbage-out) and more is expected. This work critiques this situation and sketches a path forward based on enhanced ballistic and motor characterizations in the workplace and approximate model and apparatus developments mentored by CSAR DNS capabilities (or equivalent).

  5. Improved Measures of Integrated Information

    PubMed Central

    Tegmark, Max

    2016-01-01

    Although there is growing interest in measuring integrated information in computational and cognitive systems, current methods for doing so in practice are computationally unfeasible. Existing and novel integration measures are investigated and classified by various desirable properties. A simple taxonomy of Φ-measures is presented where they are each characterized by their choice of factorization method (5 options), choice of probability distributions to compare (3 × 4 options) and choice of measure for comparing probability distributions (7 options). When requiring the Φ-measures to satisfy a minimum of attractive properties, these hundreds of options reduce to a mere handful, some of which turn out to be identical. Useful exact and approximate formulas are derived that can be applied to real-world data from laboratory experiments without posing unreasonable computational demands. PMID:27870846
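
    One cell of the taxonomy sketched above can be written down directly: taking a bipartition as the factorization, the product of marginals as the comparison distribution, and KL divergence as the comparison measure yields the mutual information across the cut. The sketch below implements that single illustrative Φ-like measure, not the paper's full family.

    ```python
    import numpy as np
    from itertools import product

    def integration(p_joint):
        """One simple Phi-like measure: the KL divergence between the joint
        distribution of two subsystems and the product of its marginals,
        i.e., their mutual information. One taxonomy cell, not 'the' Phi."""
        pA = p_joint.sum(axis=1)
        pB = p_joint.sum(axis=0)
        phi = 0.0
        for i, j in product(range(p_joint.shape[0]), range(p_joint.shape[1])):
            if p_joint[i, j] > 0:
                phi += p_joint[i, j] * np.log2(p_joint[i, j] / (pA[i] * pB[j]))
        return phi

    # Perfectly correlated binary subsystems carry 1 bit of integration.
    p = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
    print(integration(p))   # -> 1.0
    ```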

  6. Database of Industrial Technological Information in Kanagawa : Networks for Technology Activities

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Shindo, Tadashi

    This system is a database that requires participation by its members and whose premise is that all data in it are open. Aiming at free technological cooperation and exchange among industries, it was constructed by Kanagawa Prefecture in collaboration with enterprises located in it. The input data comprise 36 items, such as major product, special and advantageous technology, technology wanted for cooperation, and facility and equipment, which technologically characterize each enterprise. They are expressed in 2,000 characters and written in natural language, including Kanji, except for some coded items. 24 search items are accessible in natural language, so that, in addition to interactive search procedures including menu-type ones, extensive searching is possible. The information service started in Oct. 1986, covering data from 2,000 enterprises.

  7. Characterization of double continuum formulations of transport through pore-scale information

    NASA Astrophysics Data System (ADS)

    Porta, G.; Ceriotti, G.; Bijeljic, B.

    2016-12-01

    Information on pore-scale characteristics is becoming increasingly available at unprecedented levels of detail from modern visualization/data-acquisition techniques. These advancements are not completely matched by corresponding developments of operational procedures according to which we can engineer theoretical findings aiming at improving our ability to reduce the uncertainty associated with the outputs of continuum-scale models to be employed at large scales. We present here a modeling approach which rests on pore-scale information to achieve a complete characterization of a double continuum model of transport and fluid-fluid reactive processes. Our model makes full use of pore-scale velocity distributions to identify mobile and immobile regions. We do so on the basis of a pointwise (in the pore space) evaluation of the relative strength of advection and diffusion time scales, as rendered by spatially variable values of local Péclet numbers. After mobile and immobile regions are demarcated, we build a simplified unit cell which is employed as a representative proxy of the real porous domain. This model geometry is then employed to simplify the computation of the effective parameters embedded in the double continuum transport model, while retaining relevant information from the pore-scale characterization of the geometry and velocity field. We document results which illustrate the applicability of the methodology to predict transport of a passive tracer within two- and three-dimensional media upon comparison with direct pore-scale numerical simulation of transport in the same geometrical settings. We also show preliminary results about the extension of this model to fluid-fluid reactive transport processes. In this context, we focus on results obtained in two-dimensional porous systems. We discuss the impact of critical quantities required as input to our modeling approach to obtain continuum-scale outputs. We identify the key limitations of the proposed methodology and discuss its capability also in comparison with alternative approaches grounded, e.g., on nonlocal and particle-based approximations.
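
    The mobile/immobile demarcation step described above can be sketched as a pointwise threshold on a local Péclet number built from the pore-scale velocity field. The characteristic length, diffusivity, and unit threshold below are assumptions for illustration.

    ```python
    import numpy as np

    def demarcate(velocity, pore_length, diffusivity, pe_threshold=1.0):
        """Pointwise mobile/immobile demarcation in the spirit of the approach
        above: compare advection and diffusion time scales via a local Peclet
        number. The threshold value of 1.0 is an assumption."""
        pe_local = np.linalg.norm(velocity, axis=-1) * pore_length / diffusivity
        return pe_local > pe_threshold          # True -> mobile region

    # Toy pore-scale velocity field on a 2-D grid (m/s), illustrative values.
    v = np.random.default_rng(2).normal(0, 1e-4, size=(64, 64, 2))
    mobile = demarcate(v, pore_length=1e-4, diffusivity=1e-9)
    print(f"mobile fraction: {mobile.mean():.2f}")
    ```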

  8. Health systems and noncommunicable diseases in the Asia-Pacific region: a review of the published literature.

    PubMed

    Mannava, Priya; Abdullah, Asnawi; James, Chris; Dodd, Rebecca; Annear, Peter Leslie

    2015-03-01

    Addressing the growing burden of noncommunicable diseases (NCDs) in countries of the Asia-Pacific region requires well-functioning health systems. In low- and middle-income countries (LMICs), however, health systems are generally characterized by inadequate financial and human resources, unsuitable service delivery models, and weak information systems. The aims of this review were to identify (a) health systems interventions being implemented to deliver NCD programs and services and their outcomes and (b) the health systems bottlenecks impeding access to or delivery of these programs and services in LMICs of the Asia-Pacific region. A search of 4 databases for literature published between 1990 and 2010 retrieved 36 relevant studies. For each study, information on basic characteristics, type of health systems bottleneck/intervention, and outcome was extracted, and methodological quality appraised. Health systems interventions and bottlenecks were classified as per the World Health Organization health systems building blocks framework. The review identified interventions and bottlenecks in the building blocks of service delivery, health workforce, financing, health information systems, and medical products, vaccines, and technologies. Studies, however, were heterogeneous in methodologies used, and the overall quality was generally low. There are several gaps in the evidence base around NCDs in the Asia-Pacific region that require further investigation. © 2013 APJPH.

  9. Alternative Test Methods for Developmental Neurotoxicity: A ...

    EPA Pesticide Factsheets

    Exposure to environmental contaminants is well documented to adversely impact the development of the nervous system. However, the time-, animal- and resource-intensive EPA and OECD testing guideline methods for developmental neurotoxicity (DNT) are not a viable solution to characterizing potential chemical hazards for the thousands of untested chemicals currently in commerce. Thus, research efforts over the past decade have endeavored to develop cost-effective alternative DNT testing methods. These efforts have begun to generate data that can inform regulatory decisions. Yet there are major challenges to both the acceptance and use of these data. Major scientific challenges for DNT include development of new methods and models that are "fit for purpose", development of a decision-use framework, and regulatory acceptance of the methods. It is critical to understand that use of data from these methods will be driven mainly by the regulatory problems being addressed. Some problems may be addressed with limited datasets, while others may require data for large numbers of chemicals, or require the development and use of new biological and computational models. For example, mechanistic information derived from in vitro DNT assays can be used to inform weight-of-evidence (WoE) or integrated approaches to testing and assessment (IATA) for chemical-specific assessments. Alternatively, in vitro data can be used to prioritize (for further testing) the thousands of untested chemicals currently in commerce.

  10. High-throughput cocrystal slurry screening by use of in situ Raman microscopy and multi-well plate.

    PubMed

    Kojima, Takashi; Tsutsumi, Shunichirou; Yamamoto, Katsuhiko; Ikeda, Yukihiro; Moriwaki, Toshiya

    2010-10-31

    Cocrystals have attracted much attention as a means to improve poor physicochemical properties, since cocrystal formers crystallize with ionic as well as nonionic drugs. Cocrystal screening has usually been conducted by crystallization, slurry and co-grinding techniques; however, the sensitivity, cost and time of screening were limited by issues such as dissociation of the cocrystal during crystallization and the cost and time required for slurry and co-grinding methods. To overcome these issues, a novel high-throughput cocrystal slurry screening method was developed using an in situ Raman microscope and a multi-well plate. Cocrystal screening of indomethacin was conducted with 46 cocrystal formers, and potential cocrystals were prepared on a large scale for characterization by powder X-ray diffractometry, thermal analysis, Raman microscopy and (1)H NMR spectroscopy. Compared with the characterization of the scaled-up cocrystals, the cocrystal screening indicated that indomethacin formed novel cocrystals with D/L-mandelic acid, nicotinamide, lactamide and benzamide, which were not obtained in the previously reported screening using the crystallization technique. In addition, the screening provided not only information on cocrystal formation within a day but also information on the equilibrium of cocrystal formation and polymorphic transformation in one screening. The information obtained in this screening allows effective solid form selection, saving cost and time in development. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Automatic registration of ICG images using mutual information and perfusion analysis

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Jong-Mo; Lee, June-goo; Kim, Jong Hyo; Park, Kwangsuk; Yu, Hyeong-Gon; Yu, Young Suk; Chung, Hum

    2005-04-01

    Introduction: Indocyanine green fundus angiography (ICGA) of the eye is a useful method for detecting and characterizing choroidal neovascularization (CNV), the major cause of blindness over 65 years of age. To enable quantitative analysis of blood flow on ICGA, a systematic approach for automatic registration using mutual information and a quantitative analysis was developed. Methods: Intermittent sequential images of indocyanine green angiography were acquired by a Heidelberg retinal angiograph, which uses a laser scanning system for image acquisition. Misalignment of each image, generated by minute eye movements of the patients, was corrected by the mutual information method, because the distribution of the contrast medium in the image changes throughout the time sequence. Several regions of interest (ROIs) were selected by a physician and the intensities of the selected regions were plotted over the time sequence. Results: Registration of ICGA time-sequential images requires not only a translational but also a rotational transform. Signal intensities varied according to a gamma-variate function depending on the ROI, and capillary vessels showed more variance in signal intensity than major vessels. CNV showed intermediate variance in signal intensity and a prolonged transit time. Conclusion: The resulting registered images can be used not only for quantitative analysis but also for perfusion analysis. Various investigative approaches to CNV using this method will be helpful in characterizing the lesion and in follow-up.
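
    A minimal sketch of the mutual-information similarity measure underlying such registration, computed from the joint histogram of two images, together with a brute-force translation search. The bin count and search range are arbitrary choices, and a real implementation would also search over rotations, as the results above indicate.

    ```python
    import numpy as np

    def mutual_information(img_a, img_b, bins=32):
        """Mutual information between two images via their joint histogram;
        this is the similarity measure maximized during registration."""
        joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        p = joint / joint.sum()
        pa, pb = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        return (p[nz] * np.log(p[nz] / (pa[:, None] * pb[None, :])[nz])).sum()

    def best_shift(fixed, moving, max_shift=5):
        """Brute-force search for the integer translation maximizing MI."""
        shifts = [(dy, dx) for dy in range(-max_shift, max_shift + 1)
                           for dx in range(-max_shift, max_shift + 1)]
        return max(shifts, key=lambda s: mutual_information(
            fixed, np.roll(np.roll(moving, s[0], axis=0), s[1], axis=1)))
    ```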

  12. EVA Health and Human Performance Benchmarking Study

    NASA Technical Reports Server (NTRS)

    Abercromby, A. F.; Norcross, J.; Jarvis, S. L.

    2016-01-01

    Multiple HRP Risks and Gaps require detailed characterization of human health and performance during exploration extravehicular activity (EVA) tasks; however, a rigorous and comprehensive methodology for characterizing and comparing the health and human performance implications of current and future EVA spacesuit designs does not exist. This study will identify and implement functional tasks and metrics, both objective and subjective, that are relevant to health and human performance, such as metabolic expenditure, suit fit, discomfort, suited postural stability, cognitive performance, and potentially biochemical responses for humans working inside different EVA suits doing functional tasks under the appropriate simulated reduced gravity environments. This study will provide health and human performance benchmark data for humans working in current EVA suits (EMU, Mark III, and Z2) as well as shirtsleeves using a standard set of tasks and metrics with quantified reliability. Results and methodologies developed during this test will provide benchmark data against which future EVA suits, and different suit configurations (e.g., varied pressure, mass, CG) may be reliably compared in subsequent tests. Results will also inform fitness for duty standards as well as design requirements and operations concepts for future EVA suits and other exploration systems.

  13. High-yield in vitro recordings from neurons functionally characterized in vivo.

    PubMed

    Weiler, Simon; Bauer, Joel; Hübener, Mark; Bonhoeffer, Tobias; Rose, Tobias; Scheuss, Volker

    2018-06-01

    In vivo two-photon calcium imaging provides detailed information about the activity and response properties of individual neurons. However, in vitro methods are often required to study the underlying neuronal connectivity and physiology at the cellular and synaptic levels at high resolution. This protocol provides a fast and reliable workflow for combining the two approaches by characterizing the response properties of individual neurons in mice in vivo using genetically encoded calcium indicators (GECIs), followed by retrieval of the same neurons in brain slices for further analysis in vitro (e.g., circuit mapping). In this approach, a reference frame is provided by fluorescent-bead tracks and sparsely transduced neurons expressing a structural marker in order to re-identify the same neurons. The use of GECIs provides a substantial advancement over previous approaches by allowing for repeated in vivo imaging. This opens the possibility of directly correlating experience-dependent changes in neuronal activity and feature selectivity with changes in neuronal connectivity and physiology. This protocol requires expertise both in in vivo two-photon calcium imaging and in vitro electrophysiology. It takes 3 weeks or more to complete, depending on the time allotted for repeated in vivo imaging of neuronal activity.

  14. Leaf-FISH: Microscale Imaging of Bacterial Taxa on Phyllosphere

    PubMed Central

    Peredo, Elena L.; Simmons, Sheri L.

    2018-01-01

    Molecular methods for microbial community characterization have uncovered environmental and plant-associated factors shaping phyllosphere communities. Variables undetectable using bulk methods can play an important role in shaping plant-microbe interactions. Microscale analysis of bacterial dynamics in the phyllosphere requires imaging techniques specially adapted to the high autofluorescence and 3-D structure of the leaf surface. We present an easily transferable method (Leaf-FISH) to generate high-resolution three-dimensional images of leaf surfaces that allows simultaneous visualization of multiple bacterial taxa in a structurally informed context, using taxon-specific fluorescently labeled oligonucleotide probes. Using a combination of leaf pretreatments coupled with spectral imaging confocal microscopy, we demonstrate the successful imaging of bacterial taxa at the genus level on cuticular and subcuticular leaf areas. Our results confirm that different bacterial species, including closely related isolates, colonize distinct microhabitats in the leaf. We demonstrate that highly related Methylobacterium species have distinct colonization patterns that could not be predicted by shared physiological traits, such as carbon source requirements or phytohormone production. High-resolution characterization of microbial colonization patterns is critical for an accurate understanding of microbe-microbe and microbe-plant interactions, and for the development of foliar bacteria as plant-protective agents. PMID:29375531

  15. A burnout prediction model based around char morphology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tao Wu; Edward Lester; Michael Cloke

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300°C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300°C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between the ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.
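
    For orientation, the sketch below implements a generic first-order char-oxidation burnout expression evaluated at the drop-tube conditions quoted above. The Arrhenius parameters are hypothetical placeholders; the ChB model itself additionally weights burnout by char morphology type.

    ```python
    import numpy as np

    # Generic first-order char oxidation, not the ChB model itself. The rate
    # constants A and Ea below are hypothetical placeholders.
    R = 8.314                                  # gas constant, J/(mol K)

    def burnout(t_ms, T=1573.0, p_O2=0.05, A=2.0e5, Ea=1.1e5):
        """Fraction of char carbon consumed after t_ms milliseconds at
        temperature T (K) and oxygen partial pressure p_O2 (atm)."""
        k = A * p_O2 * np.exp(-Ea / (R * T))   # effective rate constant, 1/s
        return 1.0 - np.exp(-k * t_ms / 1000.0)

    for t in (200, 400, 600):                  # the residence times used above
        print(t, f"{burnout(t):.2f}")
    ```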

  16. Discovery, characterization and expression of a novel zebrafish gene, znfr, important for notochord formation.

    PubMed

    Xu, Yan; Zou, Peng; Liu, Yao; Deng, Fengjiao

    2010-06-01

    Genes specifically expressed in the notochord may be crucial for proper notochord development. Using the digital differential display program offered by the National Center for Biotechnology Information, we identified a novel EST sequence from a zebrafish ovary library (No. XM_701450). The full-length cDNA of this transcript was cloned by performing 3' and 5'-RACE and was further confirmed by PCR and sequencing. The resulting 614 bp gene was found to encode a novel 94 amino acid protein that did not share significant homology with any other known protein. Characterization of the genomic sequence revealed that the gene spanned 4.9 kb and was composed of four exons and three introns. RT-PCR gene expression analysis revealed that our gene of interest was expressed in ovary, kidney, brain, mature oocytes and during the early stages of embryogenesis. During embryonic development, znfr mRNA was found to be expressed in the embryonic shield, chordamesoderm and the vacuolated notochord cells by in situ hybridization. Based on this information, we hypothesize that this novel gene is an important maternal factor required for zebrafish notochord formation during early embryonic development. We have thus named this gene znfr (zebrafish notochord formation related).

  17. Characterization and classification of vegetation canopy structure and distribution within the Great Smoky Mountains National Park using LiDAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Jitendra; Hargrove Jr., William Walter; Norman, Steven P

    Vegetation canopy structure is a critically important habitat characteristic for many threatened and endangered birds and other animal species, and it is key information needed by forest and wildlife managers for monitoring and managing forest resources, conservation planning, and fostering biodiversity. Advances in Light Detection and Ranging (LiDAR) technologies have enabled remote sensing-based studies of vegetation canopies by capturing three-dimensional structures, yielding information not available in two-dimensional images of the landscape provided by traditional multi-spectral remote sensing platforms. However, the large-volume data sets produced by airborne LiDAR instruments pose a significant computational challenge, requiring algorithms to identify and analyze patterns of interest buried within LiDAR point clouds in a computationally efficient manner, utilizing state-of-the-art computing infrastructure. We developed and applied a computationally efficient approach to analyze a large volume of LiDAR data and to characterize and map the vegetation canopy structures for 139,859 hectares (540 sq. miles) in the Great Smoky Mountains National Park. This study helps improve our understanding of the distribution of vegetation and animal habitats in this extremely diverse ecosystem.

  18. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    NASA Astrophysics Data System (ADS)

    Newman, L.; Hejduk, M.; Frigm, R.; Duncan, M.

    2014-09-01

    On-orbit collisions pose a significant mission risk to satellites operating in the space environment. Recognizing the likelihood and consequence of on-orbit collisions, NASA has taken several proactive measures to mitigate the risk of both a catastrophic loss of mission and an increase in the space debris population. In fall 2004, NASA GSFC established an Agency-wide, institutionalized process and service for identifying and reacting to predicted close approaches. The team responsible for executing this mission is the NASA Robotic Conjunction Assessment Risk Analysis (CARA) team. By fall 2005, this process had resulted in the execution of the first collision avoidance maneuver by an unmanned NASA satellite. In February 2008, NASA adopted a policy, documented in NASA Procedural Requirement 8715.6a, Process for Limiting Orbital Debris, that directed maneuverable satellites to have such an on-orbit collision mitigation process. In 2009, NASA extended the requirement to all operational satellites. By January 2014, the CARA team had processed nearly 500,000 close approach messages from the Joint Space Operations Center (JSpOC) and had assisted mission customers with planning and executing over 75 collision avoidance maneuvers for unmanned satellites in LEO, GEO, and HEO orbital regimes. With the increase in the number of operational missions supported; the growth in the orbital debris environment due to events such as the intentional destruction of the Fengyun 1-C satellite in 2007 and the collision between Iridium-33 and Cosmos-2251; and improvements in the ability of the United States Space Surveillance Network (SSN) to track, catalog, and screen against small debris objects, the demands on the CARA process have required the CARA Concept of Operations (CONOPS) to evolve. This evolution is centered on the ability to effectively and efficiently manage JSpOC, CARA, and Mission Operations resources, applying operational and analytical effort to conjunction events that pose significant collision risk and rapidly discarding conjunction events that do not. While the overall CARA methodology is largely unaffected, this CONOPS evolution manifests itself in several aspects of the CARA process: the required data and information, the communication of those data and information, and the courses of action based on those data and information. The changes affect all relevant stakeholders, including the CARA team at NASA GSFC, GSFC-dedicated Orbital Safety Analysts at the JSpOC, and Mission Operations flight teams and management. In each step of the CARA process, the CONOPS ensures that necessary (whether situational or actionable) information is sent to stakeholders to facilitate effective and efficient management of resources and appropriate protection of data. The most significant paradigm shift is the movement to risk-based reporting. Since the consequence of an on-orbit collision can be catastrophic, the CARA risk-based framework hinges on the collision probability, Pc, as the encapsulation of collision risk. The CONOPS characterizes collision risk as Red (high collision risk), Yellow (potential for becoming a high collision risk), or Green (low collision risk) based on the operationally computed Pc. Using this risk characterization schema, the amount and content of conjunction information and analyses are determined and communicated to mission stakeholders.
Major technical analyses conducted in support of this CONOPS include defining risk-based thresholds for the Red, Yellow, and Green criteria; determining when conjunction-related information may not be mature enough to be actionable; and accounting for uncertainties in all inputs to the process so that a nuanced assessment of risk can be made. This paper summarizes the analyses executed and decisions rendered during the implementation of this evolved CONOPS. Historical conjunction events of note are used as example scenarios for each risk characterization.
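
    To make the risk-based reporting schema concrete, the following minimal sketch maps an operationally computed collision probability (Pc) to the Red/Yellow/Green categories described above. The threshold values and function name are illustrative assumptions; the abstract does not state the operational CARA cutoffs.

        # Minimal sketch of Pc-based risk classification. Thresholds are
        # illustrative only, not the operational CARA values.
        def classify_conjunction(pc, red_threshold=1e-4, yellow_threshold=1e-7):
            """Map an operationally computed Pc to a risk color."""
            if pc >= red_threshold:
                return "RED"     # high collision risk: plan an avoidance maneuver
            if pc >= yellow_threshold:
                return "YELLOW"  # potential to become high risk: keep analyzing
            return "GREEN"       # low collision risk: discard the event

        for pc in (3e-4, 5e-6, 1e-9):
            print(pc, classify_conjunction(pc))  # RED, YELLOW, GREEN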

  19. Assessing the value of diagnostic imaging: the role of perception

    NASA Astrophysics Data System (ADS)

    Potchen, E. J.; Cooper, Thomas G.

    2000-04-01

    The value of diagnostic radiology rests in its ability to provide information. Information is defined as a reduction in randomness. Quality improvement in any system requires a reduction in the variation of its performance. The major variation in the performance of the system of diagnostic radiology occurs in observer performance and in the communication of information from the observer to someone who will apply that information to the benefit of the patient. The ability to provide information can be determined by observer performance studies using receiver operating characteristic (ROC) curve analysis. The amount of information provided by each observer can be measured in terms of the uncertainty they reduce. The difference in the value added by different observers can be measured by taking a set of standardized radiographs, some normal and some abnormal, sorting them randomly, and then asking each observer to redistribute them according to their probability of normality. By applying this observer performance measure, we have been able to characterize individual radiologists, groups of radiologists, and regions of the United States in terms of their ability to add value in chest radiology. The use of these technologies in health care may improve the contribution of diagnostic imaging.
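
    As a worked illustration of measuring observer performance, the sketch below estimates an observer's ROC area (AUC) from hypothetical probability-of-abnormality scores assigned to a standardized film set, using the rank-based (Mann-Whitney) interpretation of AUC. This is a generic estimate, not the specific procedure used in the study.

        # AUC as the probability that a randomly chosen abnormal case is
        # scored higher than a randomly chosen normal case (ties count 0.5).
        def observer_auc(scores_abnormal, scores_normal):
            wins = sum((a > n) + 0.5 * (a == n)
                       for a in scores_abnormal for n in scores_normal)
            return wins / (len(scores_abnormal) * len(scores_normal))

        print(observer_auc([0.9, 0.7, 0.6], [0.2, 0.4, 0.65]))  # ~0.89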

  20. Activity Recognition on Streaming Sensor Data.

    PubMed

    Krishnan, Narayanan C; Cook, Diane J

    2014-02-01

    Many real-world applications that focus on addressing the needs of a human require information about the activities being performed by the human in real time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far been evaluated only on scripted or pre-segmented sequences of sensor events related to activities. In this paper we propose and evaluate a sliding-window-based approach that performs activity recognition in an online or streaming fashion, recognizing activities as and when new sensor events are recorded. To account for the fact that different activities are best characterized by different window lengths of sensor events, we incorporate time-decay and mutual-information-based weighting of sensor events within a window. Additional contextual information, in the form of the previous activity and the activity of the previous window, is also appended to the feature describing a sensor window. Experiments conducted to evaluate these techniques on real-world smart home datasets suggest that combining mutual-information-based weighting of sensor events and past contextual information in the feature leads to the best performance for streaming activity recognition.
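
    The time-decay weighting idea can be sketched as follows: events near the end of a sliding window contribute more to the window's feature vector than older ones. The decay rate and the count-based feature are illustrative assumptions, not the paper's exact formulation.

        import math

        def weighted_sensor_counts(window, decay=0.5):
            """window: list of (timestamp, sensor_id) pairs, oldest first.
            Returns {sensor_id: weight} with exponential time decay
            relative to the most recent event."""
            t_last = window[-1][0]
            counts = {}
            for t, sensor in window:
                w = math.exp(-decay * (t_last - t))
                counts[sensor] = counts.get(sensor, 0.0) + w
            return counts

        print(weighted_sensor_counts([(0, "M01"), (2, "M01"), (3, "M07")]))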

  1. Quantitative analysis of nano-pore geomaterials and representative sampling for digital rock physics

    NASA Astrophysics Data System (ADS)

    Yoon, H.; Dewers, T. A.

    2014-12-01

    Geomaterials containing nano-pores (e.g., shales and carbonate rocks) have become increasingly important for emerging problems such as unconventional gas and oil resources, enhanced oil recovery, and geologic storage of CO2. Accurate prediction of coupled geophysical and chemical processes at the pore scale requires realistic representation of pore structure and topology. This is especially true for chalk materials, where pore networks are small and complex and require characterization at the sub-micron scale. In this work, we apply laser scanning confocal microscopy to characterize pore structures and microlithofacies at micron and greater scales, and dual focused ion beam-scanning electron microscopy (FIB-SEM) for 3D imaging of nanometer-to-micron scale microcracks and pore distributions. With imaging techniques advanced for nano-pore characterization, a problem of scale with FIB-SEM images is how to take nanometer-scale information and apply it at the thin-section or larger scale. Several texture characterization techniques, including graph-based spectral segmentation, support vector machines, and principal component analysis, are applied to segment clusters, with each cluster represented by one or two FIB-SEM samples. Geometric and topological properties are analyzed, and the lattice-Boltzmann method (LBM) is used to obtain permeability at several different scales. Upscaling of permeability to the Darcy scale (e.g., the thin-section scale) using the image dataset will be discussed, with emphasis on understanding microfracture-matrix interaction, representative volume for FIB-SEM sampling, and multiphase flow and reactive transport. Funding from the DOE Basic Energy Sciences Geosciences Program is gratefully acknowledged. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
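
    As a toy illustration of the upscaling step, the sketch below combines fine-scale permeabilities into effective values using harmonic averaging (flow across layers in series) and arithmetic averaging (layers in parallel). Real upscaling from FIB-SEM/LBM results is far more involved; the values are hypothetical.

        def harmonic_mean(ks):
            return len(ks) / sum(1.0 / k for k in ks)

        def arithmetic_mean(ks):
            return sum(ks) / len(ks)

        k_fine = [1e-18, 5e-18, 2e-17]  # m^2, hypothetical fine-scale values
        print(harmonic_mean(k_fine))    # series (lower-bound style) estimate
        print(arithmetic_mean(k_fine))  # parallel (upper-bound style) estimate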

  2. One- and two-dimensional dopant/carrier profiling for ULSI

    NASA Astrophysics Data System (ADS)

    Vandervorst, W.; Clarysse, T.; De Wolf, P.; Trenkler, T.; Hantschel, T.; Stephenson, R.; Janssens, T.

    1998-11-01

    Dopant/carrier profiles constitute the basis of semiconductor device operation and thus play a decisive role in transistor performance. They are subject to the same scaling laws as the other constituents of a modern semiconductor device and continuously evolve towards shallower and more complex configurations. This evolution has increased the demands on profiling techniques, in particular in terms of resolution and quantification, such that constant reevaluation and improvement of the tools is required. As no single technique provides all the necessary information (dopant distribution, electrical activation, etc.) with the requested spatial and depth resolution, the present paper attempts to provide an assessment of those tools which can be considered the main metrology technologies for ULSI applications. For 1D dopant profiling, secondary ion mass spectrometry (SIMS) has progressed towards a generally accepted tool meeting the requirements. For 1D carrier profiling, spreading resistance profiling and microwave surface impedance profiling are envisaged as the best choices, but extra development is required to promote them to routinely applicable methods. As no main metrology tool exists for 2D dopant profiling, the main emphasis is on 2D carrier profiling tools based on scanning probe microscopy. Scanning spreading resistance microscopy (SSRM) and scanning capacitance microscopy (SCM) are the preferred methods, although neither of them yet meets all the requirements. Complementary information can be extracted from nanopotentiometry, which samples device operation in more detail. Concurrent use of carrier profiling tools, nanopotentiometry, analysis of device characteristics, and simulations is required to provide a complete characterization of deep submicron devices.

  3. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Proteins in olive fruit and oil.

    PubMed

    Montealegre, Cristina; Esteve, Clara; García, Maria Concepción; García-Ruiz, Carmen; Marina, Maria Luisa

    2014-01-01

    This paper is a comprehensive review grouping the information on the extraction, characterization, and quantitation of olive and olive oil proteins and providing a practical guide to these proteins. Most characterized olive proteins are located in the fruit, mainly in the seed, where different oleosins and storage proteins have been found. Unlike the seed, the olive pulp has a lower protein content, in which a 4.6 kDa polypeptide and a thaumatin-like protein have been described. Other important proteins studied in olive fruits are enzymes, which could play important roles in olive characteristics. Some of these proteins are transferred from the fruit to the oil during the manufacturing process of olive oil. In fact, the same 4.6 kDa polypeptide found in the pulp has been described in olive oil, and the presence of other proteins and enzymes has also been described. Protein profiles have recently been proposed as an interesting strategy for the varietal classification of olive fruits and oils. Nevertheless, much remains unexplored, requiring new studies focused on the determination and characterization of these proteins.

  5. Toward economic flood loss characterization via hazard simulation

    NASA Astrophysics Data System (ADS)

    Czajkowski, Jeffrey; Cunha, Luciana K.; Michel-Kerjan, Erwann; Smith, James A.

    2016-08-01

    Among all natural disasters, floods have historically been the primary cause of human and economic losses around the world. Improving flood risk management requires a multi-scale characterization of the hazard and associated losses—the flood loss footprint—but this is typically not yet available in a precise and timely manner. To overcome this challenge, we propose a novel and multidisciplinary approach which relies on a computationally efficient hydrological model that simulates streamflow for scales ranging from small creeks to large rivers. We adopt a normalized index, the flood peak ratio (FPR), to characterize flood magnitude across multiple spatial scales. The simulated FPR is then shown to be a key statistical driver of associated economic flood losses, represented by the number of insurance claims. Importantly, because it is based on a simulation procedure that utilizes generally available physically-based data, our flood simulation approach has the potential to be broadly applied, even for ungauged and poorly gauged basins, thus providing the necessary information for public and private sector actors to effectively reduce flood losses and save lives.
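
    A minimal sketch of a flood-peak-ratio style index follows: an event's simulated peak discharge normalized by a reference peak for the same location, which makes magnitudes comparable across basin scales. The choice of reference peak is an assumption for illustration; the abstract does not specify it.

        def flood_peak_ratio(event_peak_cms, reference_peak_cms):
            """Dimensionless flood magnitude index."""
            return event_peak_cms / reference_peak_cms

        print(flood_peak_ratio(850.0, 400.0))  # large river reach -> ~2.1
        print(flood_peak_ratio(12.0, 4.0))     # small creek -> 3.0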

  6. Landscape Hazards in Yukon Communities: Geological Mapping for Climate Change Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Kennedy, K.; Kinnear, L.

    2010-12-01

    Climate change is considered to be a significant challenge for northern communities, where the effects of increased temperature and climate variability are beginning to be felt in infrastructure and livelihoods (Arctic Climate Impact Assessment, 2004). Planning for and adapting to ongoing and future changes in climate will require the identification and characterization of social, economic, cultural, political, and biophysical vulnerabilities. This pilot project addresses physical landscape vulnerabilities in two communities in the Yukon Territory through community-scale landscape hazard mapping and focused investigations of community permafrost conditions. Landscape hazards are identified by combining pre-existing data from public utilities and private-sector consultants with new geophysical techniques (ground penetrating radar and electrical resistivity), shallow drilling, surficial geological mapping, and permafrost characterization. Existing landscape vulnerabilities are evaluated based on their potential for hazard (low, medium, or high) under current climate conditions, as well as under future climate scenarios. Detailed hazard maps and landscape characterizations for both communities will contribute to overall adaptation plans and allow for informed development, planning, and mitigation of potentially threatening hazards in and around the communities.

  7. Raman, mid-infrared, near-infrared and ultraviolet-visible spectroscopy of PDMS silicone rubber for characterization of polymer optical waveguide materials

    NASA Astrophysics Data System (ADS)

    Cai, Dengke; Neyer, Andreas; Kuckuk, Rüdiger; Heise, H. Michael

    2010-07-01

    Special siloxane polymers have been produced by thermal curing via an addition reaction of commercially available two-component materials. Polydimethylsiloxane (PDMS) based polymers have already been used in the optical communication field, where passive polymer multimode waveguides are required for short-distance datacom applications. For such purposes, materials with low intrinsic absorption losses within the spectral region of 600-900 nm are essential. For vibrational absorption band assignments, especially in the visible and short-wave near-infrared region, the mid-infrared and Raman spectra of the fundamental vibrations of the siloxane materials were investigated, shedding light on the chemistry before and after polymerization. Within the near-infrared and long-wave visible spectral range, vibrational C–H stretching overtone and combination bands dominate the spectra, enabling an optical characterization of core and cladding materials. Such knowledge also provides information for the synthesis and optical characterization of, e.g., deuterated derivatives with lower intrinsic absorption losses from molecular vibrations compared with the siloxane materials studied.

  8. Characterizing plant cell wall derived oligosaccharides using hydrophilic interaction chromatography with mass spectrometry detection.

    PubMed

    Leijdekkers, A G M; Sanders, M G; Schols, H A; Gruppen, H

    2011-12-23

    Analysis of complex mixtures of plant cell wall derived oligosaccharides is still challenging, and multiple analytical techniques are often required for the separation and characterization of these mixtures. In this work it is demonstrated that hydrophilic interaction chromatography coupled with evaporative light scattering and mass spectrometry detection (HILIC-ELSD-MS(n)) is a valuable tool for identification of a wide range of neutral and acidic cell wall derived oligosaccharides. The separation potential observed with HILIC for acidic oligosaccharides is much better than that of other existing techniques, such as capillary electrophoresis, reversed-phase, and porous graphitized carbon chromatography. Important structural information, such as the presence of methyl esters and acetyl groups, is retained during analysis. Separation of acidic oligosaccharides with equal charge but different degrees of polymerization can be obtained. The efficient coupling of HILIC with ELSD and MS(n) detection enables characterization and quantification of the many different oligosaccharide structures present in complex mixtures. This makes HILIC-ELSD-MS(n) a versatile and powerful additional technique in plant cell wall analysis. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Beyond Metrics? The Role of Hydrologic Baseline Archetypes in Environmental Water Management.

    PubMed

    Lane, Belize A; Sandoval-Solis, Samuel; Stein, Eric D; Yarnell, Sarah M; Pasternack, Gregory B; Dahlke, Helen E

    2018-06-22

    Balancing ecological and human water needs often requires characterizing key aspects of the natural flow regime and then predicting ecological response to flow alterations. Flow metrics are generally relied upon to characterize the long-term average statistical properties of the natural flow regime (hydrologic baseline conditions). However, some key aspects of hydrologic baseline conditions may be better understood through more complete consideration of continuous patterns of daily, seasonal, and inter-annual variability than through summary metrics. Here we propose the additional use of high-resolution dimensionless archetypes of regional stream classes to improve understanding of baseline hydrologic conditions and inform regional environmental flow assessments. In an application to California, we describe the development and analysis of hydrologic baseline archetypes to characterize patterns of flow variability within and between stream classes. We then assess the utility of archetypes to provide context for common flow metrics and to improve understanding of linkages between aquatic patterns and processes and their hydrologic controls. Results indicate that these archetypes may offer a distinct and complementary tool for researching mechanistic flow-ecology relationships, assessing regional patterns for streamflow management, or understanding impacts of a changing climate.
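
    One simple way to build a dimensionless archetype, sketched below under assumptions not taken from the paper, is to normalize each year's daily hydrograph by its mean flow and average across years, yielding a continuous daily pattern in units of mean annual flow.

        def dimensionless_archetype(daily_flows_by_year):
            """daily_flows_by_year: list of equal-length yearly flow series."""
            n_days = len(daily_flows_by_year[0])
            archetype = [0.0] * n_days
            for year in daily_flows_by_year:
                mean_q = sum(year) / len(year)
                for d, q in enumerate(year):
                    archetype[d] += (q / mean_q) / len(daily_flows_by_year)
            return archetype  # daily values in units of mean annual flow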

  10. Teaching color measurement in graphic arts

    NASA Astrophysics Data System (ADS)

    Ingram, Samuel T.; Simon, Frederick T.

    1997-04-01

    The production of color images has grown in recent years due to the impact of digital technology. Access and equipment affordability are now bringing a new generation of color producers into the marketplace. Many traditional questions concerning color attributes are repeatedly asked: color fidelity, quality, measurement, and device characterization pose daily dilemmas. Curriculum components should be offered in an educational environment that builds the color foundations required of knowledgeable managers, researchers, and technicians. The printing industry is adding many of the new digital color technologies to its vocabulary of color production. This paper presents current efforts to integrate color knowledge into a four-year program of undergraduate study. Specific topics include color reproduction, device characterization, material characterization, and the role of measurement as a linking attribute. The paper also details efforts to integrate the color specification/measurement and analysis procedures used by students and their subsequent application in color image production. A discussion of measurement devices used in the learning environment is also presented. The investigation involves descriptive data on colorants typically used in printing inks.

  11. Real-time holographic deconvolution techniques for one-way image transmission through an aberrating medium: characterization, modeling, and measurements.

    PubMed

    Haji-Saeed, B; Sengupta, S K; Testorf, M; Goodhue, W; Khoury, J; Woods, C L; Kierstead, J

    2006-05-10

    We propose and demonstrate a new photorefractive real-time holographic deconvolution technique for adaptive one-way image transmission through aberrating media by means of four-wave mixing. In contrast with earlier methods, which typically required various codings of the exact phase or two-way image transmission for correcting phase distortion, our technique relies on one-way image transmission through the use of exact phase information. Our technique can simultaneously correct both amplitude and phase distortions. We include several forms of image degradation, various test cases, and experimental results. We characterize the performance as a function of the input beam ratios for four metrics: signal-to-noise ratio, normalized root-mean-square error, edge restoration, and peak-to-total energy ratio. In our characterization we use false-color graphic images to display the best beam-intensity ratio two-dimensional region(s) for each of these metrics. Test cases are simulated at the optimal values of the beam-intensity ratios. We demonstrate our results through both experiment and computer simulation.
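
    Of the four metrics named above, the normalized root-mean-square error is the most direct to state; a generic sketch is given below (flat nested-list images, range normalization), which may differ in detail from the authors' definition.

        import math

        def nrmse(restored, reference):
            """Root-mean-square error normalized by the reference range."""
            flat_r = [p for row in restored for p in row]
            flat_f = [p for row in reference for p in row]
            mse = sum((a - b) ** 2 for a, b in zip(flat_r, flat_f)) / len(flat_f)
            return math.sqrt(mse) / (max(flat_f) - min(flat_f))

        print(nrmse([[0.1, 0.5]], [[0.0, 0.6]]))  # ~0.17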

  12. Directed Incremental Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz

    2011-01-01

    The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiency of static analysis techniques for computing program difference information with the precision of symbolic execution for exploring program execution paths and generating path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.

  13. Invited Review Small is beautiful: The analysis of nanogram-sized astromaterials

    NASA Astrophysics Data System (ADS)

    Zolensky, M. E.; Pieters, C.; Clark, B.; Papike, J. J.

    2000-01-01

    The capability of modern methods to characterize ultra-small samples is well established from analysis of interplanetary dust particles (IDPs), interstellar grains recovered from meteorites, and other materials requiring ultra-sensitive analytical capabilities. Powerful analytical techniques are available that require, under favorable circumstances, single particles of only a few nanograms for entire suites of fairly comprehensive characterizations. A returned sample of >1,000 particles with a total mass of just one microgram permits comprehensive quantitative geochemical measurements that are impractical to carry out in situ by flight instruments. The main goal of this paper is to describe the state of the art in microanalysis of astromaterials. Given that we can analyze fantastically small quantities of asteroids, comets, etc., we have to ask ourselves how representative microscopic samples are of bodies that measure a few to many km across. With the Galileo flybys of Gaspra and Ida, it is now recognized that even very small airless bodies have developed a particulate regolith. Acquiring a sample of the bulk regolith, a simple sampling strategy, provides two critical pieces of information about the body. Regolith samples are excellent bulk samples since they normally contain all the key components of the local environment, albeit in particulate form. Furthermore, since this fine fraction dominates remote measurements, regolith samples also provide information about surface alteration processes and are a key link to remote sensing of other bodies. Studies indicate that a statistically significant number of nanogram-sized particles should be able to characterize the regolith of a primitive asteroid, although the presence of larger components within even primitive meteorites (e.g., Murchison), such as chondrules, CAIs, and large crystal fragments, points out the limitations of using data obtained from nanogram-sized samples to characterize entire primitive asteroids. However, most important asteroidal geological processes have left their mark on the matrix, since this is the finest-grained portion and therefore most sensitive to chemical and physical changes. Thus, the following information can be learned from this fine grain-size fraction alone: (1) mineral paragenesis; (2) regolith processes; (3) bulk composition; (4) conditions of thermal and aqueous alteration (if any); (5) relationships to planets, comets, and meteorites (via isotopic analyses, including oxygen); (6) abundance of water and hydrated material; (7) abundance of organics; (8) history of volatile mobility; (9) presence and origin of presolar and/or interstellar material. Most of this information can even be obtained from dust samples from bodies for which nanogram-sized samples are not truly representative. Future advances in sensitivity and accuracy of laboratory analytical techniques can be expected to enhance the science value of nano- to microgram-sized samples even further. This highlights a key advantage of sample returns: the most advanced analysis techniques can always be applied in the laboratory, and well-preserved samples are available for future investigations.

  14. Optical characterization of ultra-sensitive TES bolometers for SAFARI

    NASA Astrophysics Data System (ADS)

    Audley, Michael D.; de Lange, Gerhard; Gao, Jian-Rong; Khosropanah, Pourya; Mauskopf, Philip D.; Morozov, Dmitry; Trappe, Neil A.; Doherty, Stephen; Withington, Stafford

    2014-07-01

    We have characterized the optical response of prototype detectors for SAFARI, the far-infrared imaging spectrometer for the SPICA satellite. SAFARI's three bolometer arrays will image a 2'×2' field of view with spectral information over the wavelength range 34-210 μm. SAFARI requires extremely sensitive detectors (goal NEP ~ 0.2 aW/√Hz), with correspondingly low saturation powers (~5 fW), to take advantage of SPICA's cooled optics. We have constructed an ultra-low-background optical test facility containing an internal cold black-body illuminator and have recently added an internal hot black-body source and a light pipe for external illumination. We illustrate the performance of the test facility with results including spectral response measurements. Based on an improved understanding of the optical throughput of the test facility, we find an optical efficiency of 60% for prototype SAFARI detectors.

  15. Self-organization in a distributed coordination game through heuristic rules

    NASA Astrophysics Data System (ADS)

    Agarwal, Shubham; Ghosh, Diptesh; Chakrabarti, Anindya S.

    2016-12-01

    In this paper, we consider a distributed coordination game played by a large number of agents with finite information sets, which characterizes the emergence of a single dominant attribute out of a large number of competitors. Formally, N agents repeatedly play a coordination game that has exactly N pure strategy Nash equilibria, all of which are equally preferred by the agents. The problem is to select one equilibrium out of the N possible equilibria in the least number of attempts. We propose a number of heuristic rules based on reinforcement learning to solve the coordination problem. We see that the agents self-organize into clusters with varying intensities depending on the heuristic rule applied, although in most cases all clusters but one are transitory. Finally, we characterize a trade-off between the time required to achieve a degree of stability in strategies and the efficiency of the resulting solution.
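
    The flavor of reinforcement-based equilibrium selection can be sketched as follows: each agent keeps a score per strategy and reinforces whichever strategy the round's majority chose. The update rule is an illustrative heuristic, not one of the paper's exact rules.

        import random

        def coordinate(n_agents=50, n_strategies=50, rounds=200, seed=1):
            random.seed(seed)
            scores = [[0.0] * n_strategies for _ in range(n_agents)]
            for _ in range(rounds):
                # Each agent picks its highest-scoring strategy, with random
                # noise acting as a tie-breaker early on.
                choices = [max(range(n_strategies),
                               key=lambda s: scores[a][s] + random.random())
                           for a in range(n_agents)]
                winner = max(set(choices), key=choices.count)
                for a, c in enumerate(choices):
                    if c == winner:
                        scores[a][winner] += 1.0
            return winner, choices.count(winner) / n_agents

        print(coordinate())  # (dominant strategy index, fraction coordinated)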

  16. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank foam material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided as background. A review of mechanics-based analysis methods from the open literature is used to assess the state of the art in material modeling of closed-cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  17. High flow and riparian vegetation along the San Miguel River, Colorado

    USGS Publications Warehouse

    Friedman, J.M.; Auble, G.T.

    2000-01-01

    Riparian ecosystems are characterized by an abundance of water and frequent flow-related disturbance. River regulation typically decreases peak flows, reducing the amount of disturbance and altering the vegetation. The San Miguel River is one of the last relatively unregulated rivers remaining in the Colorado River watershed. One goal of major landowners along the San Miguel, including the Bureau of Land Management and The Nature Conservancy, is to maintain their lands in a natural condition. Conservation of an entire river corridor requires an integrated understanding of the variability in ecosystems and external influences along the river. Therefore, the Bureau of Land Management and others have fostered a series of studies designed to catalogue that variability and to characterize the processes that maintain the river as a whole. In addition to providing information useful to managers, these studies present a rare opportunity to investigate how a Colorado river operates in the absence of regulation.

  18. Stereo Image Ranging For An Autonomous Robot Vision System

    NASA Astrophysics Data System (ADS)

    Holten, James R.; Rogers, Steven K.; Kabrisky, Matthew; Cross, Steven

    1985-12-01

    The principles of stereo vision for three-dimensional data acquisition are well known and can be applied to the problem of an autonomous robot vehicle. Coincidental points in the two images are located, and then the location of each point in three-dimensional space can be calculated using the offset of the points and knowledge of the camera positions and geometry. This research investigates the application of artificial intelligence knowledge representation techniques as a means of applying heuristics to relieve the computational intensity of the low-level image processing tasks. Specifically, a new technique for image feature extraction is presented. This technique, the Queen Victoria Algorithm, uses formal language productions to process the image and characterize its features. These characterized features are then used for stereo image feature registration to obtain the required ranging information. The results can be used by an autonomous robot vision system for environmental modeling and path finding.
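
    For parallel cameras, the ranging geometry reduces to a single relation, sketched below: depth Z = f·B/d for focal length f (in pixels), baseline B, and disparity d (the pixel offset of the matched point between the two images). The numbers are illustrative.

        def stereo_depth(focal_px, baseline_m, disparity_px):
            """Depth of a matched feature for a parallel stereo rig."""
            return focal_px * baseline_m / disparity_px

        # A feature offset by 12 px, cameras 0.3 m apart, f = 800 px:
        print(stereo_depth(800, 0.3, 12))  # -> 20.0 m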

  19. Characterization of Microgravity Effects on Bone Structure and Strength Using Fractal Analysis

    NASA Technical Reports Server (NTRS)

    Acharya, Raj S.; Shackelford, Linda

    1996-01-01

    Protecting humans against extreme environmental conditions requires a thorough understanding of the pathophysiological changes resulting from exposure to those conditions. Knowledge of the degree of medical risk associated with the exposure is of paramount importance in the design of effective prophylactic and therapeutic measures for space exploration. Major health hazards involving the musculoskeletal system include the signs and symptoms of hypercalciuria, lengthy recovery of lost bone tissue after flight, the possibility of irreversible trabecular bone loss, the possible effect of calcification in the soft tissues, and a possible increase in fracture potential. In this research, we characterize the trabecular structure with the aid of fractal analysis. Our research relating local trabecular structural information to microgravity conditions is an important initial step in understanding the effect of microgravity and countermeasures on bone condition and strength. The proposed research is also closely linked with osteoporosis and will benefit the general population.
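
    A common way to quantify trabecular texture with fractal analysis is box counting on a binarized image: count occupied boxes at several box sizes and take the slope of log(count) against log(1/size). The sketch below is a generic implementation on a nested-list image, not the study's specific pipeline.

        import math

        def box_count(image, box):
            """Number of box x box cells containing a foreground pixel."""
            n = 0
            for i in range(0, len(image), box):
                for j in range(0, len(image[0]), box):
                    if any(image[x][y]
                           for x in range(i, min(i + box, len(image)))
                           for y in range(j, min(j + box, len(image[0])))):
                        n += 1
            return n

        def fractal_dimension(image, boxes=(1, 2, 4, 8)):
            # Least-squares slope of log(count) vs log(1/box).
            pts = [(math.log(1.0 / b), math.log(box_count(image, b)))
                   for b in boxes]
            mx = sum(x for x, _ in pts) / len(pts)
            my = sum(y for _, y in pts) / len(pts)
            num = sum((x - mx) * (y - my) for x, y in pts)
            den = sum((x - mx) ** 2 for x, _ in pts)
            return num / den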

  20. Enabling Interactive Measurements from Large Coverage Microscopy

    PubMed Central

    Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary

    2017-01-01

    Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. With cells changing state over time and being several orders of magnitude smaller than cell products, modern microscopes are already capable of imaging large spatial areas, repeating imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of the data size, the required computations, and the lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600

  1. Ultrasonic non invasive techniques for microbiological instrumentation

    NASA Astrophysics Data System (ADS)

    Elvira, L.; Sierra, C.; Galán, B.; Resa, P.

    2010-01-01

    Non-invasive techniques based on ultrasound have advantageous features for studying, characterizing, and monitoring microbiological and enzymatic reactions. These processes may change the sound speed, viscosity, or particle size distribution of the medium where they take place, which makes their analysis possible using ultrasonic techniques. In this work, two ultrasound-based systems for the analysis of microbiological liquid media are presented. First, an industrial application based on an ultrasonic monitoring technique for detecting microbiological growth in milk is shown. Such a system may improve quality control strategies in food production factories, being able to decrease the time required to detect possible contamination in packaged products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of ultrasonic non-invasive characterization techniques in combination with other conventional measurements, such as optical density, provides complementary information about the metabolism of these bacteria.
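
    The measurement behind such monitoring is essentially time-of-flight: sound speed is path length divided by transit time, and drifts in the speed flag changes in the medium. The sketch below uses illustrative values, not data from the study.

        def sound_speed(path_m, transit_s):
            return path_m / transit_s

        c0 = sound_speed(0.05, 3.36e-5)  # baseline medium, ~1488 m/s
        c1 = sound_speed(0.05, 3.30e-5)  # later reading, ~1515 m/s
        print(c1 - c0)  # a shift in sound speed suggesting a change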

  2. Preparation and biophysical characterization of recombinant Pseudomonas aeruginosa phosphorylcholine phosphatase.

    PubMed

    Beassoni, Paola R; Berti, Federico Pérez de; Otero, Lisandro H; Risso, Valeria A; Ferreyra, Raul G; Lisa, Angela T; Domenech, Carlos E; Ermácora, Mario R

    2010-06-01

    Pseudomonas aeruginosa infections constitute a widespread health problem with high economic and social impact, and the phosphorylcholine phosphatase (PchP) of this bacterium is a potential target for antimicrobial treatment. However, drug design requires high-resolution structural information and detailed biophysical knowledge not available for PchP. An obstacle in the study of PchP is that current methods for its expression and purification are suboptimal and have allowed only a preliminary kinetic characterization of the enzyme. Herein, we describe a new procedure for the efficient preparation of recombinant PchP overexpressed in Escherichia coli. The enzyme is purified from urea-solubilized inclusion bodies and refolded by dialysis. The product of PchP refolding is a mixture of native PchP and a kinetically trapped, alternatively folded aggregate that is very slowly converted into the native state. The properly folded and fully active enzyme is isolated from the refolding mixture by size-exclusion chromatography. PchP prepared by the new procedure was subjected to chemical and biophysical characterization, and its basic optical, hydrodynamic, metal-binding, and catalytic properties are reported. The unfolding of the enzyme was also investigated, and its thermal stability was determined. The information obtained should help in comparing PchP with other phosphatases and in gaining a better understanding of its catalytic mechanism. In addition, preliminary trials showed that PchP prepared by the new protocol is suitable for crystallization, opening the way for high-resolution studies of the enzyme structure.

  3. Acceptable knowledge document for INEEL stored transuranic waste -- Rocky Flats Plant waste. Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-01-23

    This document and supporting documentation provide a consistent, defensible, and auditable record of acceptable knowledge for waste generated at the Rocky Flats Plant which is currently in the accessible storage inventory at the Idaho National Engineering and Environmental Laboratory. The inventory consists of transuranic (TRU) waste generated from 1972 through 1989. Regulations authorize waste generators and treatment, storage, and disposal facilities to use acceptable knowledge in appropriate circumstances to make hazardous waste determinations. Acceptable knowledge includes information relating to plant history, process operations, and waste management, in addition to waste-specific data generated prior to the effective date of the RCRA regulations. This document is organized to provide the reader a comprehensive presentation of the TRU waste inventory, ranging from descriptions of the historical plant operations that generated and managed the waste to specific information about the composition of each waste group. Section 2 lists the requirements that dictate and direct TRU waste characterization and authorize the use of the acceptable knowledge approach. In addition to defining the TRU waste inventory, Section 3 summarizes the historical operations, waste management, characterization, and certification activities associated with the inventory. Sections 5.0 through 26.0 describe the waste groups in the inventory, including waste generation, waste packaging, and waste characterization. This document includes an expanded discussion for each waste group of potential radionuclide contaminants, in addition to other physical properties and interferences that could potentially impact radioassay systems.

  4. Next generation of global land cover characterization, mapping, and monitoring

    USGS Publications Warehouse

    Giri, Chandra; Pengra, Bruce; Long, J.; Loveland, Thomas R.

    2013-01-01

    Land cover change is increasingly affecting the biophysics, biogeochemistry, and biogeography of the Earth's surface and the atmosphere, with far-reaching consequences for human well-being. However, our scientific understanding of the distribution and dynamics of land cover and land cover change (LCLCC) is limited. Previous global land cover assessments, performed using coarse spatial resolution (300 m–1 km) satellite data, did not provide enough thematic detail or change information for global change studies and resource management. High resolution (~30 m) land cover characterization and monitoring is needed that permits detection of land change at the scale of most human activity and offers the increased flexibility of environmental model parameterization needed for global change studies. However, there are a number of challenges to overcome, including the unavailability of consistent global coverage of satellite data, the sheer volume of data, the unavailability of timely and accurate training and validation data, difficulties in preparing image mosaics, and high-performance computing requirements. Integration of remote sensing and information technology is needed for process automation and high-performance computing needs. Recent developments in these areas have created an opportunity for operational high resolution land cover mapping and monitoring of the world. Here, we report and discuss these advancements and opportunities in producing the next generation of global land cover characterization, mapping, and monitoring at 30 m spatial resolution, primarily in the context of the United States Group on Earth Observations Global 30 m land cover initiative (UGLC).

  5. Satellite SAR interferometric techniques applied to emergency mapping

    NASA Astrophysics Data System (ADS)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

    This paper aims to investigate the capabilities of currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency mapping can be defined as the "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Permanent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for emergency management activities. Depending on the emergency management phase considered, a distinction may be drawn between rapid mapping, i.e. fast provision of geospatial data regarding the affected area for immediate emergency response, and monitoring mapping, i.e. detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for rapid and monitoring mapping, the following main factors have been taken into account: crisis information extracted, input data required, processing time, and expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate response requirements. The main limiting factor of interferometry is the availability of a suitable SAR acquisition immediately after the event (e.g. the Sentinel-1 mission, characterized by a 6-day revisit time, may not always satisfy the immediate emergency request). PSI and SBAS techniques are suitable for producing monitoring maps for risk prevention and mitigation purposes. Nevertheless, multi-temporal techniques require large SAR temporal datasets, i.e. 20 or more images. As the Sentinel-1 missions have been operational only since April 2014, multi-mission SAR datasets should therefore be exploited to carry out historical analyses.
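
    The core DInSAR relation behind deformation mapping can be stated compactly: line-of-sight displacement d = -(λ/4π)·Δφ for radar wavelength λ and interferometric phase difference Δφ. The sketch below assumes the approximate Sentinel-1 C-band wavelength and an illustrative phase value.

        import math

        WAVELENGTH_M = 0.0555  # Sentinel-1 C-band, approximate

        def los_displacement(delta_phase_rad):
            """Line-of-sight displacement from differential phase."""
            return -(WAVELENGTH_M / (4 * math.pi)) * delta_phase_rad

        print(los_displacement(2 * math.pi))  # one fringe ~ -2.8 cm LOS motion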

  6. Data Integration and Mining for Synthetic Biology Design.

    PubMed

    Mısırlı, Göksel; Hallinan, Jennifer; Pocock, Matthew; Lord, Phillip; McLaughlin, James Alastair; Sauro, Herbert; Wipat, Anil

    2016-10-21

    One aim of synthetic biologists is to create novel and predictable biological systems from simpler modular parts. This approach is currently hampered by a lack of well-defined and characterized parts and devices. There is, however, a wealth of existing biological information in the literature and in numerous biological databases that can be used to identify and characterize biological parts and their design constraints. This information is spread among these sources in many different formats. New computational approaches are required to make this information available in an integrated format that is more amenable to data mining. A tried and tested approach to this problem is to map disparate data sources into a single data set, with common syntax and semantics, to produce a data warehouse or knowledge base. Ontologies have been used extensively in the life sciences, providing this common syntax and semantics as a model for a given biological domain, in a fashion that is amenable to computational analysis and reasoning. Here, we present an ontology for applications in synthetic biology design, SyBiOnt, which facilitates the modeling of information about biological parts and their relationships. SyBiOnt was used to create the SyBiOntKB knowledge base, incorporating and building upon existing life sciences ontologies and standards. The reasoning capabilities of ontologies were then applied to automate the mining of biological parts from this knowledge base. We propose that this approach will be useful in speeding up synthetic biology design and ultimately in helping to automate the biological engineering life cycle.

  7. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Barabash, Oleg M

    2011-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations, fuelled in part by demands for higher efficiency in material testing, product design, and engineering analysis. Equally important, organizations are driven by needs for consistency, quality, and traceability of data, as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic, and multi-scale engineering analyses requires both the processing of large volumes of test data for the development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. Finally, the globalization of the economy often generates great needs for sharing a single gold source of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export, and analysis capabilities; (iii) data pedigree traceability mechanisms; (iv) data searching, reporting, and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems and future challenges and opportunities, such as automated error checking, data quality characterization, identification of gaps in datasets, and functionalities and business models to fuel database growth and maintenance, are discussed.

  8. Scaling-up health information systems to improve HIV treatment: An assessment of initial patient monitoring systems in Mozambique.

    PubMed

    Hochgesang, Mindy; Zamudio-Haas, Sophia; Moran, Lissa; Nhampossa, Leopoldo; Packel, Laura; Leslie, Hannah; Richards, Janise; Shade, Starley B

    2017-01-01

    The rapid scale-up of HIV care and treatment in resource-limited countries requires concurrent, rapid development of health information systems to support quality service delivery. Mozambique, a country with an 11.5% prevalence of HIV, has developed nation-wide patient monitoring systems (PMS) with standardized reporting tools, utilized by all HIV treatment providers in paper or electronic form. Evaluation of the initial implementation of PMS can inform and strengthen future development as the country moves towards a harmonized, sustainable health information system. This assessment was conducted in order to 1) characterize data collection and reporting processes and PMS resources available and 2) provide evidence-based recommendations for harmonization and sustainability of PMS. This baseline assessment of PMS was conducted with eight non-governmental organizations that supported the Ministry of Health to provide 90% of HIV care and treatment in Mozambique. The study team conducted structured and semi-structured surveys at 18 health facilities located in all 11 provinces. Seventy-nine staff were interviewed. Deductive a priori analytic categories guided analysis. Health facilities have implemented paper and electronic monitoring systems with varying success. Where in use, robust electronic PMS facilitate facility-level reporting of required indicators; improve ability to identify patients lost to follow-up; and support facility and patient management. Challenges to implementation of monitoring systems include a lack of national guidelines and norms for patient level HIS, variable system implementation and functionality, and limited human and infrastructure resources to maximize system functionality and information use. This initial assessment supports the need for national guidelines to harmonize, expand, and strengthen HIV-related health information systems. Recommendations may benefit other countries with similar epidemiologic and resource-constrained environments seeking to improve PMS implementation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. The source of display rules and their effects on primary health care professionals' well-being.

    PubMed

    Martínez-Iñigo, David; Totterdell, Peter; Alcover, Carlos Maria; Holman, David

    2009-11-01

    Employees' perceptions of the emotional requirements of their work role are considered a necessary antecedent of emotion work. The impact of these requirements on the emotions employees display, their well-being, and their clients' satisfaction has been explored in previous research. Emotional requirements have been characterized as organizationally-based expectations (e.g., Brotheridge & Lee, 2003), formal and informal organizational rules (e.g., Cropanzano, Weiss & Elias, 2004), occupational norms (e.g., Rafaeli & Sutton, 1987; Smith & Kleinman, 1989) and job-based demands (Brotheridge & Lee, 2002). Although all these definitions assume some kind of shared source for perceptions of emotional requirements, it remains unclear to what extent these different sources contribute and to what extent the requirements are shared by different units, teams and individuals in the organization. The present study analyses the perception of emotional requirements from a survey of ninety-seven Primary Health Care teams composed of general practitioners, nurses and administrative staff (N = 1057). The relative contribution of different sources of variance (team, organizational, and occupational) to perceived emotional requirements and the effects on employees' job satisfaction and well-being are examined. Results confirm the relevance of the source and show the contribution of emotional demands to prediction of emotional exhaustion and job satisfaction levels.

  10. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127

  11. Probing the Dipolar Coupling in a Heterospin Endohedral Fullerene-Phthalocyanine Dyad.

    PubMed

    Zhou, Shen; Yamamoto, Masanori; Briggs, G Andrew D; Imahori, Hiroshi; Porfyrakis, Kyriakos

    2016-02-03

    Paramagnetic endohedral fullerenes and phthalocyanine (Pc) complexes are promising building blocks for molecular quantum information processing, for which tunable dipolar coupling is required. We have linked these two spin qubit candidates together and characterized the resulting electron paramagnetic resonance properties, including the spin dipolar coupling between the fullerene spin and the copper spin. Having interpreted the distance-dependent coupling strength quantitatively and further discussed the antiferromagnetic aggregation effect of the CuPc moieties, we demonstrate two ways of tuning the dipolar coupling in such dyad systems: changing the spacer group and adjusting the solution concentration.
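
    For orientation, the point-dipole approximation commonly used to interpret such distance-dependent couplings (a textbook expression, not quoted from the paper) gives, for two S = 1/2 centers with g factors near 2, separated by a distance r at angle theta to the applied field,

    \nu_{\mathrm{dd}}(r,\theta)
      = \frac{\mu_0\, g_1 g_2\, \mu_B^2}{4\pi h\, r^3}\,(1 - 3\cos^2\theta)
      \approx \frac{52\ \mathrm{MHz}}{(r/\mathrm{nm})^{3}}\,(1 - 3\cos^2\theta),

    so halving the spacer length increases the coupling roughly eightfold, which is what makes the spacer group an effective tuning handle.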

  12. Work-Facilitating Information Visualization Techniques for Complex Wastewater Systems

    NASA Astrophysics Data System (ADS)

    Ebert, Achim; Einsfeld, Katja

    The design and the operation of urban drainage systems and wastewater treatment plants (WWTP) have become increasingly complex. This complexity is due to increased requirements concerning process technology as well as technical, environmental, economic, and occupational safety aspects. The plant operator has access not only to some timeworn files and measured parameters but also to numerous on-line and off-line parameters that characterize the current state of the plant in detail. Moreover, expert databases and specific support pages of plant manufacturers are accessible through the World Wide Web. Thus, the operator is overwhelmed with predominantly unstructured data.

  13. Fibromyalgia syndrome: considerations for dental hygienists.

    PubMed

    Walters, Amber; Tolle, Susan L; McCombs, Gayle M

    2015-04-01

    Fibromyalgia syndrome (FMS) is a neurosensory disorder characterized by widespread musculoskeletal pain. Typically, persistent fatigue, depression, limb stiffness, non-refreshing sleep and cognitive deficiencies are also experienced. Oral symptoms and pain are common, requiring adaptations in patient management strategies and treatment interventions. Appropriate dental hygiene care of patients suffering with this disorder is contingent upon an understanding of disease epidemiology, pathophysiology, clinical characteristics, oral signs and symptoms, as well as treatment approaches. With this information dental hygienists will be better prepared to provide appropriate and effective treatment to patients with FMS. Copyright © 2015 The American Dental Hygienists’ Association.

  14. Increase Productivity Through Knowledge Management

    NASA Astrophysics Data System (ADS)

    Gavrikova, N. A.; Dolgih, I. N.; Dyrina, E. N.

    2016-04-01

    An increased level of competition requires companies to improve the efficiency of workforce use, which is characterized by labor productivity. Professional knowledge and staff experience play the key role in this. The results of an analysis of the Extrusion Line operator’s working time are presented in this article. The analysis revealed that the reasons for ineffective use of working time are connected with inadequate information exchange and knowledge management in the company. The authors suggest a way to solve this problem: the main sources of knowledge in an engineering enterprise have been defined, and the conditions for success and the stages of knowledge management control have been stated.

  15. Equicontrollability and its application to model-following and decoupling.

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1971-01-01

    Discussion of 'model following,' a term used to describe a class of problems characterized by two dynamic systems, generically known as the 'plant' and the 'model,' in which it is required to find a controller to attach to the plant so as to make the resultant compensated system behave, in an input/output sense, in the same way as the model. The approach presented to the problem takes a structural point of view. The result is a complex but informative definition which solves the problem as posed. The application of both the algorithm and its basis, equicontrollability, to the decoupling problem is considered.

  16. Real-time characterization of partially observed epidemics using surrogate models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safta, Cosmin; Ray, Jaideep; Lefantzi, Sophia

    We present a statistical method, predicated on the use of surrogate models, for the 'real-time' characterization of partially observed epidemics. Observations consist of counts of symptomatic patients, diagnosed with the disease, that may be available in the early epoch of an ongoing outbreak. Characterization, in this context, refers to estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as to provide gross information on the dynamics of the etiologic agent in the affected population, e.g., the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem, and epidemiological parameters are estimated as distributions using a Markov chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. In some cases, the inverse problem can be computationally expensive, primarily due to the epidemic simulator used inside the inversion algorithm. We present a method, based on replacing the epidemiological model with computationally inexpensive surrogates, that can reduce the computational time to minutes, without a significant loss of accuracy. The surrogates are created by projecting the output of an epidemiological model on a set of polynomial chaos bases; thereafter, computations involving the surrogate model reduce to evaluations of a polynomial. We find that the epidemic characterizations obtained with the surrogate models are very close to those obtained with the original model. We also find that the number of projections required to construct a surrogate model is O(10)-O(10^2) less than the number of samples required by the MCMC to construct a stationary posterior distribution; thus, depending upon the epidemiological models in question, it may be possible to omit the offline creation and caching of surrogate models prior to their use in an inverse problem. The technique is demonstrated on synthetic data as well as observations from the 1918 influenza pandemic collected at Camp Custer, Michigan.
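
    The offline/online split described above can be sketched as follows in Python; the one-parameter "simulator", the Legendre basis, and all numbers are invented stand-ins, not the paper's epidemiological model or its polynomial chaos construction.

    import numpy as np

    t = np.arange(40.0)

    def expensive_model(beta):
        # Hypothetical stand-in for an expensive epidemic simulator:
        # daily case counts as a function of an infection-rate parameter beta.
        return 100.0 * np.exp(beta * t) / (1.0 + np.exp(beta * (t - 30.0)))

    # Offline stage: run the simulator at a few training points and project the
    # output onto a low-order polynomial basis in beta.
    train_betas = np.linspace(0.05, 0.5, 12)
    scale = lambda b: 2 * (b - 0.05) / 0.45 - 1           # map beta to [-1, 1]
    coeffs = np.polynomial.legendre.legfit(
        scale(train_betas), np.array([expensive_model(b) for b in train_betas]), 4)

    def surrogate(beta):                                  # cheap polynomial evaluation
        return np.polynomial.legendre.legval(scale(beta), coeffs)

    # Online stage: Metropolis sampling of beta given noisy observed counts,
    # with the surrogate supplying the likelihood.
    rng = np.random.default_rng(0)
    obs = expensive_model(0.2) + rng.normal(0.0, 5.0, t.size)

    def log_post(beta):
        if not 0.05 <= beta <= 0.5:
            return -np.inf
        return -0.5 * np.sum((obs - surrogate(beta)) ** 2) / 25.0

    beta, chain = 0.1, []
    for _ in range(5000):
        prop = beta + rng.normal(0.0, 0.01)
        if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
            beta = prop
        chain.append(beta)
    print("posterior mean beta:", np.mean(chain[1000:]))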

  17. Assessment of Spacecraft Operational Status Using Electro-Optical Predictive Techniques

    DTIC Science & Technology

    2010-09-01

    panel appendages, may require enhanced preflight characterization processes to support monitoring by passive, remote, nonimaging optical sensors ... observing and characterizing key spacecraft features. The simulation results are based on electro-optical signatures apparent to nonimaging sensors, along ... and communication equipment, may require enhanced preflight characterization processes to support monitoring by passive, remote, nonimaging optical

  18. On-line characterization of monoclonal antibody variants by liquid chromatography-mass spectrometry operating in a two-dimensional format.

    PubMed

    Alvarez, Melissa; Tremintin, Guillaume; Wang, Jennifer; Eng, Marian; Kao, Yung-Hsiang; Jeong, Justin; Ling, Victor T; Borisov, Oleg V

    2011-12-01

    Recombinant monoclonal antibodies (MAbs) have become one of the most rapidly growing classes of biotherapeutics in the treatment of human disease. MAbs are highly heterogeneous proteins, thereby requiring a battery of analytical technologies for their characterization. However, incompatibility between separation and subsequent detection is often encountered. Here we demonstrate the utility of a generic on-line liquid chromatography-mass spectrometry (LC-MS) method operated in a two-dimensional format toward the rapid characterization of MAb charge and size variants. Using a single chromatographic system capable of running two independent gradients, up to six fractions of interest from an ion exchange (IEC) or size exclusion (SEC) separation can be identified by trapping and desalting the fractions onto a series of reversed phase trap cartridges with subsequent on-line analysis by mass spectrometry. Analysis of poorly resolved and low-level peaks in the IEC or SEC profile was facilitated by preconcentrating fractions on the traps using multiple injections. An on-line disulfide reduction step was successfully incorporated into the workflow, allowing more detailed characterization of modified MAbs by providing chain-specific information. The system is fully automated, thereby enabling high-throughput analysis with minimal sample handling. This technology provides rapid data turnaround time, a much needed feature during product characterization and development of multiple biotherapeutic proteins. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Mechanistic and Technical Challenges in Studying the Human Microbiome and Cancer Epidemiology.

    PubMed

    Verma, Mukesh

    2017-04-01

    This article reviews the significance of the microbiome in cancer epidemiology, mechanistic and technical challenges in the field, and characterization of the microbiome in different tumor types to identify biomarkers of risk, progression, and prognosis. Publications on the microbiome and cancer epidemiology were reviewed to analyze sample collection and processing, microbiome taxa characterization by 16S ribosomal RNA sequencing, and microbiome metabolite characterization (metabotyping) by nuclear magnetic resonance and mass spectrometry. The analysis identified methodology types, research design, sample types, and issues in integrating data from different platforms. Aerodigestive cancer epidemiology studies conducted by different groups demonstrated the significance of microbiome information in developing approaches to improve health. Challenges exist in sample preparation and processing (eg, standardization of methods for collection and analysis). These challenges relate to technology, data integration from "omics" studies, inherent bias in primer selection during 16S ribosomal RNA sequencing, the need for large consortia with well-characterized biospecimens, cause and effect issues, resilience of microbiota to exposure events (requires longitudinal studies), and expanding studies for fungal and viral diversity (most studies used bacterial 16S ribosomal RNA sequencing for microbiota characterization). Despite these challenges, microbiome and cancer epidemiology studies are significant and may facilitate cancer risk assessment, diagnosis, and prognosis. In the future, clinical trials likely will use microbiota modifications to improve the efficacy of existing treatments.

  20. Mechanistic and Technical Challenges in Studying the Human Microbiome and Cancer Epidemiology

    PubMed Central

    2016-01-01

    This article reviews the significance of the microbiome in cancer epidemiology, mechanistic and technical challenges in the field, and characterization of the microbiome in different tumor types to identify biomarkers of risk, progression, and prognosis. Publications on the microbiome and cancer epidemiology were reviewed to analyze sample collection and processing, microbiome taxa characterization by 16S ribosomal RNA sequencing, and microbiome metabolite characterization (metabotyping) by nuclear magnetic resonance and mass spectrometry. The analysis identified methodology types, research design, sample types, and issues in integrating data from different platforms. Aerodigestive cancer epidemiology studies conducted by different groups demonstrated the significance of microbiome information in developing approaches to improve health. Challenges exist in sample preparation and processing (eg, standardization of methods for collection and analysis). These challenges relate to technology, data integration from “omics” studies, inherent bias in primer selection during 16S ribosomal RNA sequencing, the need for large consortia with well-characterized biospecimens, cause and effect issues, resilience of microbiota to exposure events (requires longitudinal studies), and expanding studies for fungal and viral diversity (most studies used bacterial 16S ribosomal RNA sequencing for microbiota characterization). Despite these challenges, microbiome and cancer epidemiology studies are significant and may facilitate cancer risk assessment, diagnosis, and prognosis. In the future, clinical trials likely will use microbiota modifications to improve the efficacy of existing treatments. PMID:27121074

  1. 10 CFR 60.7 - License not required for certain preliminary activities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... WASTES IN GEOLOGIC REPOSITORIES General Provisions § 60.7 License not required for certain preliminary... repository: (a) For purposes of site characterization; or (b) For use, during site characterization or...

  2. Waste Isolation Pilot Plant Site Environmental Report for 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hooda, Balwan S.; Allen, Vivian L.

    This 1998 annual Site Environmental Report (SER) was prepared in accordance with U.S. Department of Energy (DOE) Order 5400.1, ''General Environmental Protection Program''; DOE Order 231.1, ''Environmental Safety and Health Reporting''; the ''Environmental Regulatory Guide for Radiological Effluent Monitoring and Environmental Surveillance'' (DOE/EH-0173T); and the Environmental Protection Implementation Plan (DOE/WIPP 96-2199). The above orders and guidance documents require that DOE facilities submit an SER to DOE Headquarters, Office of the Assistant Secretary for Environment, Safety, and Health. The purpose of the SER is to provide a comprehensive description of operational environmental monitoring activities and an abstract of environmental activities conducted to characterize site environmental management performance, to confirm compliance with environmental standards and requirements, and to highlight significant programs and efforts of environmental merit at WIPP during calendar year (CY) 1998. The content of this SER is not restricted to a synopsis of the required data. Information pertaining to new and continued monitoring and compliance activities during CY 1998 is also included.

  3. Experimental characterization of the effects of pneumatic tubing on unsteady pressure measurements

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Lindsey, William T.; Curry, Robert E.; Gilyard, Glenn B.

    1990-01-01

    Advances in aircraft control system designs have, with increasing frequency, required that air data be used as flight control feedback. This condition requires that these data be measured with accuracy and high fidelity. Most air data information is provided by pneumatic pressure measuring sensors. Typically, unsteady pressure data provided by pneumatic sensing systems are distorted at high frequencies. The distortion is a result of the pressure being transmitted to the pressure sensor through a length of connective tubing. The pressure is distorted by frictional damping and wave reflection. As a result, air data provided by all-flush, pneumatically sensed air data systems may not meet the frequency response requirements necessary for flight control augmentation. Both lab and flight tests were performed at NASA-Ames to investigate the effects of this high-frequency distortion in remotely located pressure measurement systems. Good qualitative agreement between lab and flight data is demonstrated. Results from these tests are used to describe the effects of pneumatic distortion in terms of a simple parametric model.

  4. Spatiotemporal control to eliminate cardiac alternans using isostable reduction

    NASA Astrophysics Data System (ADS)

    Wilson, Dan; Moehlis, Jeff

    2017-03-01

    Cardiac alternans, an arrhythmia characterized by a beat-to-beat alternation of cardiac action potential durations, is widely believed to facilitate the transition from normal cardiac function to ventricular fibrillation and sudden cardiac death. Alternans arises due to an instability of a healthy period-1 rhythm, and most dynamical control strategies either require extensive knowledge of the cardiac system, making experimental validation difficult, or are model independent and sacrifice important information about the specific system under study. Isostable reduction provides an alternative approach, in which the response of a system to external perturbations can be used to reduce the complexity of a cardiac system, making it easier to work with from an analytical perspective while retaining many of its important features. Here, we use isostable reduction strategies to reduce the complexity of partial differential equation models of cardiac systems in order to develop energy optimal strategies for the elimination of alternans. Resulting control strategies require significantly less energy to terminate alternans than comparable strategies and do not require continuous state feedback.
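
    For context, an isostable-reduced model has the generic form (standard in this reduction framework; the notation below is not quoted from the paper)

    \dot{\psi}_j \;=\; \kappa_j\,\psi_j \;+\; \frac{\partial \psi_j}{\partial \mathbf{x}}\cdot\mathbf{u}(t),

    where the psi_j are isostable coordinates of the underlying rhythm, the kappa_j are their decay exponents, and u(t) is the control input; eliminating alternans can then be posed as driving the unstable psi mode to zero while minimizing the control energy.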

  5. Intelligent Sampling of Hazardous Particle Populations in Resource-Constrained Environments

    NASA Astrophysics Data System (ADS)

    McCollough, J. P.; Quinn, J. M.; Starks, M. J.; Johnston, W. R.

    2017-10-01

    Sampling of anomaly-causing space environment drivers is necessary for both real-time operations and satellite design efforts, and optimizing measurement sampling helps minimize resource demands. Relating these measurements to spacecraft anomalies requires the ability to resolve spatial and temporal variability in the energetic charged particle hazard of interest. Here we describe a method for sampling particle fluxes informed by magnetospheric phenomenology so that, along a given trajectory, the variations from both temporal dynamics and spatial structure are adequately captured while minimizing oversampling. We describe the coordinates, sampling method, and specific regions and parameters employed. We compare resulting sampling cadences with data from spacecraft spanning the regions of interest during a geomagnetically active period, showing that the algorithm retains the gross features necessary to characterize environmental impacts on space systems in diverse orbital regimes while greatly reducing the amount of sampling required. This enables sufficient environmental specification within a resource-constrained context, such as limited telemetry bandwidth, processing requirements, and timeliness.
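
    The region-dependent cadence idea can be sketched as a lookup keyed on magnetic coordinates; the region boundaries and cadences below are invented for illustration and differ from the algorithm's actual regions and parameters.

    def sample_interval_seconds(l_shell: float, mlt_hours: float) -> float:
        """Return the sampling interval for the current magnetic coordinates."""
        if l_shell < 2.0:                        # inner belt: comparatively stable
            return 60.0
        if l_shell < 7.0:                        # outer belt: dynamic, sample faster
            return 10.0
        if mlt_hours >= 21.0 or mlt_hours < 3.0:
            return 5.0                           # nightside at high L: injections
        return 30.0                              # remaining high-L regions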

  6. A Novel Information-Theoretic Approach for Variable Clustering and Predictive Modeling Using Dirichlet Process Mixtures

    PubMed Central

    Chen, Yun; Yang, Hui

    2016-01-01

    In the era of big data, there is increasing interest in clustering variables for the minimization of data redundancy and the maximization of variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges on the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information-theoretic perspective that does not require the assumption of data structure for the identification of nonlinear interdependence among variables. Specifically, we propose the use of mutual information to characterize and measure nonlinear correlation structures among variables. Further, we develop Dirichlet process (DP) models to cluster variables based on the mutual-information measures among variables. Finally, orthonormalized variables in each cluster are integrated with a group elastic-net model to improve the performance of predictive modeling. Both simulation and real-world case studies showed that the proposed methodology not only effectively reveals the nonlinear interdependence structures among variables but also outperforms traditional variable clustering algorithms such as hierarchical clustering. PMID:27966581

  7. A Novel Information-Theoretic Approach for Variable Clustering and Predictive Modeling Using Dirichlet Process Mixtures.

    PubMed

    Chen, Yun; Yang, Hui

    2016-12-14

    In the era of big data, there is increasing interest in clustering variables for the minimization of data redundancy and the maximization of variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges on the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information-theoretic perspective that does not require the assumption of data structure for the identification of nonlinear interdependence among variables. Specifically, we propose the use of mutual information to characterize and measure nonlinear correlation structures among variables. Further, we develop Dirichlet process (DP) models to cluster variables based on the mutual-information measures among variables. Finally, orthonormalized variables in each cluster are integrated with a group elastic-net model to improve the performance of predictive modeling. Both simulation and real-world case studies showed that the proposed methodology not only effectively reveals the nonlinear interdependence structures among variables but also outperforms traditional variable clustering algorithms such as hierarchical clustering.
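
    A minimal sketch of the mutual-information measure that drives the clustering; the plug-in histogram estimator and the toy data below are illustrative choices, not the estimator specified in the paper.

    import numpy as np

    def mutual_information(x, y, bins=16):
        """Histogram (plug-in) estimate of I(X;Y) in nats."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)      # marginal of X
        py = pxy.sum(axis=0, keepdims=True)      # marginal of Y
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    # A nonlinear dependence that a Pearson correlation largely misses:
    rng = np.random.default_rng(1)
    x = rng.normal(size=5000)
    y = x ** 2 + 0.1 * rng.normal(size=5000)
    print(mutual_information(x, y))              # clearly positive
    print(np.corrcoef(x, y)[0, 1])               # near zero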

  8. CHARACTERIZATION OF DATA VARIABILITY AND UNCERTAINTY: HEALTH EFFECTS ASSESSMENTS IN THE INTEGRATED RISK INFORMATION SYSTEM (IRIS)

    EPA Science Inventory

    In response to a Congressional directive contained in HR 106-379 regarding EPA's appropriations for FY2000, EPA has undertaken an evaluation of the characterization of data variability and uncertainty in its Integrated Risk Information System (IRIS) health effects information dat...

  9. Evolution and Convergence of State Laws Governing Controlled Substance Prescription Monitoring Programs, 1998-2011

    PubMed Central

    Pierce, Matthew; Dasgupta, Nabarun

    2014-01-01

    Objectives. We sought to collect and characterize all laws governing the operation of prescription monitoring programs (PMPs), state-level databases that collect patient-specific prescription information, which have been suggested as a tool for reducing prescription drug overdose fatalities. Methods. We utilized a structured legal research protocol to systematically identify, review, and code all PMP statutes and regulations effective from 1998 through 2011. These laws were then abstracted along eleven domains, including reporting provisions, data sharing, and data access. Results. PMP characteristics vary greatly among states and across time. We observed an increase in the types and frequency of data required to be reported, the types of individuals permitted to access PMP data, and the percentage of PMPs authorized to proactively identify outlier prescribers and patients. As of 2011, 10 states required PMPs to report suspicious activity to law enforcement, while only 3 required reporting to the patient’s physician. None required linkage to drug treatment or required all prescribers to review PMP data before prescribing. Few explicitly address data retention. Conclusions. State PMP laws are heterogeneous and evolving. Future studies of PMP effectiveness should take these variations into account. PMID:24922132

  10. Comparability: manufacturing, characterization and controls, report of a UK Regenerative Medicine Platform Pluripotent Stem Cell Platform Workshop, Trinity Hall, Cambridge, 14–15 September 2015

    PubMed Central

    Williams, David J; Archer, Richard; Archibald, Peter; Bantounas, Ioannis; Baptista, Ricardo; Barker, Roger; Barry, Jacqueline; Bietrix, Florence; Blair, Nicholas; Braybrook, Julian; Campbell, Jonathan; Canham, Maurice; Chandra, Amit; Foldes, Gabor; Gilmanshin, Rudy; Girard, Mathilde; Gorjup, Erwin; Hewitt, Zöe; Hourd, Paul; Hyllner, Johan; Jesson, Helen; Kee, Jasmin; Kerby, Julie; Kotsopoulou, Nina; Kowalski, Stanley; Leidel, Chris; Marshall, Damian; Masi, Louis; McCall, Mark; McCann, Conor; Medcalf, Nicholas; Moore, Harry; Ozawa, Hiroki; Pan, David; Parmar, Malin; Plant, Anne L; Reinwald, Yvonne; Sebastian, Sujith; Stacey, Glyn; Thomas, Robert J; Thomas, Dave; Thurman-Newell, Jamie; Turner, Marc; Vitillo, Loriana; Wall, Ivan; Wilson, Alison; Wolfrum, Jacqueline; Yang, Ying; Zimmerman, Heiko

    2016-01-01

    This paper summarizes the proceedings of a workshop held at Trinity Hall, Cambridge to discuss comparability and includes additional information and references to related information added subsequently to the workshop. Comparability is the need to demonstrate equivalence of product after a process change; a recent publication states that this ‘may be difficult for cell-based medicinal products’. Therefore a well-managed change process is required which needs access to good science and regulatory advice and developers are encouraged to seek help early. The workshop shared current thinking and best practice and allowed the definition of key research questions. The intent of this report is to summarize the key issues and the consensus reached on each of these by the expert delegates. PMID:27404768

  11. CRAVE: a database, middleware and visualization system for phenotype ontologies.

    PubMed

    Gkoutos, Georgios V; Green, Eain C J; Greenaway, Simon; Blake, Andrew; Mallon, Ann-Marie; Hancock, John M

    2005-04-01

    A major challenge in modern biology is to link genome sequence information to organismal function. In many organisms this is being done by characterizing phenotypes resulting from mutations. Efficiently expressing phenotypic information requires combinatorial use of ontologies. However tools are not currently available to visualize combinations of ontologies. Here we describe CRAVE (Concept Relation Assay Value Explorer), a package allowing storage, active updating and visualization of multiple ontologies. CRAVE is a web-accessible JAVA application that accesses an underlying MySQL database of ontologies via a JAVA persistent middleware layer (Chameleon). This maps the database tables into discrete JAVA classes and creates memory resident, interlinked objects corresponding to the ontology data. These JAVA objects are accessed via calls through the middleware's application programming interface. CRAVE allows simultaneous display and linking of multiple ontologies and searching using Boolean and advanced searches.

  12. Using Grid Benchmarks for Dynamic Scheduling of Grid Applications

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert

    2003-01-01

    Navigation, or dynamic scheduling, of applications on computational grids can be improved through the use of an application-specific characterization of grid resources. Current grid information systems provide a description of the resources, but do not contain any application-specific information. We define a GridScape as the dynamic state of the grid resources. We measure the dynamic performance of these resources using the grid benchmarks. Then we use the GridScape for automatic assignment of the tasks of a grid application to grid resources. The scalability of the system is achieved by limiting the navigation overhead to a few percent of the application resource requirements. Our task submission and assignment protocol guarantees that the navigation system does not cause grid congestion. On a synthetic data mining application we demonstrate that GridScape-based task assignment reduces the application turnaround time.
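
    The benchmark-informed assignment can be sketched as a greedy rule: for each task, pick the resource whose measured benchmark rate gives the shortest estimated runtime among those with free capacity. The data layout and scoring rule are illustrative, not the paper's protocol.

    def assign_tasks(tasks, resources):
        """tasks: (task_id, work_units) pairs; resources: name -> {'rate':
        benchmark-measured work units/s, 'free_slots': available task slots}."""
        assignment = {}
        for task_id, work in sorted(tasks, key=lambda t: -t[1]):   # big tasks first
            best = min((r for r in resources if resources[r]["free_slots"] > 0),
                       key=lambda r: work / resources[r]["rate"])  # est. runtime
            resources[best]["free_slots"] -= 1
            assignment[task_id] = best
        return assignment

    print(assign_tasks([("t1", 100), ("t2", 40)],
                       {"nodeA": {"rate": 50.0, "free_slots": 1},
                        "nodeB": {"rate": 20.0, "free_slots": 2}}))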

  13. Comparability: manufacturing, characterization and controls, report of a UK Regenerative Medicine Platform Pluripotent Stem Cell Platform Workshop, Trinity Hall, Cambridge, 14-15 September 2015.

    PubMed

    Williams, David J; Archer, Richard; Archibald, Peter; Bantounas, Ioannis; Baptista, Ricardo; Barker, Roger; Barry, Jacqueline; Bietrix, Florence; Blair, Nicholas; Braybrook, Julian; Campbell, Jonathan; Canham, Maurice; Chandra, Amit; Foldes, Gabor; Gilmanshin, Rudy; Girard, Mathilde; Gorjup, Erwin; Hewitt, Zöe; Hourd, Paul; Hyllner, Johan; Jesson, Helen; Kee, Jasmin; Kerby, Julie; Kotsopoulou, Nina; Kowalski, Stanley; Leidel, Chris; Marshall, Damian; Masi, Louis; McCall, Mark; McCann, Conor; Medcalf, Nicholas; Moore, Harry; Ozawa, Hiroki; Pan, David; Parmar, Malin; Plant, Anne L; Reinwald, Yvonne; Sebastian, Sujith; Stacey, Glyn; Thomas, Robert J; Thomas, Dave; Thurman-Newell, Jamie; Turner, Marc; Vitillo, Loriana; Wall, Ivan; Wilson, Alison; Wolfrum, Jacqueline; Yang, Ying; Zimmerman, Heiko

    2016-07-01

    This paper summarizes the proceedings of a workshop held at Trinity Hall, Cambridge to discuss comparability and includes additional information and references to related information added subsequently to the workshop. Comparability is the need to demonstrate equivalence of product after a process change; a recent publication states that this 'may be difficult for cell-based medicinal products'. Therefore a well-managed change process is required which needs access to good science and regulatory advice and developers are encouraged to seek help early. The workshop shared current thinking and best practice and allowed the definition of key research questions. The intent of this report is to summarize the key issues and the consensus reached on each of these by the expert delegates.

  14. Unsupervised, Robust Estimation-based Clustering for Multispectral Images

    NASA Technical Reports Server (NTRS)

    Netanyahu, Nathan S.

    1997-01-01

    To prepare for the challenge of handling the archiving and querying of terabyte-sized scientific spatial databases, the NASA Goddard Space Flight Center's Applied Information Sciences Branch (AISB, Code 935) developed a number of characterization algorithms that rely on supervised clustering techniques. The research reported upon here has been aimed at continuing the evolution of some of these supervised techniques, namely the neural network and decision tree-based classifiers, plus extending the approach to incorporate unsupervised clustering algorithms, such as those based on robust estimation (RE) techniques. The algorithms developed under this task should be suited for use by the Intelligent Information Fusion System (IIFS) metadata extraction modules, and as such these algorithms must be fast, robust, and anytime in nature. Finally, so that the planner/scheduler module of the IIFS can oversee the use and execution of these algorithms, all information required by the planner/scheduler must be provided to the IIFS development team to ensure the timely integration of these algorithms into the overall system.

  15. Higher-Order Hurst Signatures: Dynamical Information in Time Series

    NASA Astrophysics Data System (ADS)

    Ferenbaugh, Willis

    2005-10-01

    Understanding and comparing time series from different systems requires characteristic measures of the dynamics embedded in the series. The Hurst exponent is a second-order dynamical measure of a time series which grew up within the blossoming fractal world of Mandelbrot. This characteristic measure is directly related to the behavior of the autocorrelation, the power-spectrum, and other second-order things. And as with these other measures, the Hurst exponent captures and quantifies some but not all of the intrinsic nature of a series. The more elusive characteristics live in the phase spectrum and the higher-order spectra. This research is a continuing quest to (more) fully characterize the dynamical information in time series produced by plasma experiments or models. The goal is to supplement the series information which can be represented by a Hurst exponent, and we would like to develop supplemental techniques in analogy with Hurst's original R/S analysis. These techniques should be another way to plumb the higher-order dynamics.

  16. Sampling plans, selective insecticides and sustainability: the case for IPM as 'informed pest management'.

    PubMed

    Castle, Steven; Naranjo, Steven E

    2009-12-01

    Integrated Pest Management (IPM) is considered the central paradigm of insect pest management and is often characterized as a comprehensive use of multiple control tactics to reduce pest status while minimizing economic and environmental costs. As the principal precursor of IPM, the integrated control concept formulated the economic theory behind pest management decisions and specified an applied methodology for carrying out pest control. Sampling, economic thresholds and selective insecticides were three of the critical elements of that methodology and are now considered indispensable to the goals of IPM. We examine each of these elements in the context of contemporaneous information as well as accumulated experience and knowledge required for their skillful implementation in an IPM program. We conclude that while IPM is principally about integrating control tactics into an effective and sustainable approach to pest control, this overarching goal can only be achieved through well-trained practitioners, knowledgeable of the tenets conceived in the integrated control concept that ultimately yield informed pest management. (c) 2009 Society of Chemical Industry.

  17. Characterizing new compositions of [001]C relaxor ferroelectric single crystals using a work-energy model

    NASA Astrophysics Data System (ADS)

    Gallagher, John A.

    2016-04-01

    The desired operating range of ferroelectric materials with compositions near the morphotropic phase boundary is limited by field induced phase transformations. In [001]C cut and poled relaxor ferroelectric single crystals the mechanically driven ferroelectric rhombohedral to ferroelectric orthorhombic phase transformation is hindered by antagonistic electrical loading. Instability around the phase transformation makes the current experimental technique for characterization of the large field behavior very time consuming. Characterization requires specialized equipment and involves an extensive set of measurements under combined electrical, mechanical, and thermal loads. In this work a mechanism-based model is combined with a more limited set of experiments to obtain the same results. The model utilizes a work-energy criterion that calculates the mechanical work required to induce the transformation and the required electrical work that is removed to reverse the transformation. This is done by defining energy barriers to the transformation. The results of the combined experiment and modeling approach are compared to the fully experimental approach and error is discussed. The model shows excellent predictive capability and is used to substantially reduce the total number of experiments required for characterization. This decreases the time and resources required for characterization of new compositions.

  18. Autonomous Object Characterization with Large Datasets

    DTIC Science & Technology

    2015-10-18

    desk, where a substantial amount of effort is required to transform raw photometry into a data product, minimizing the amount of time the analyst has ... were used to explore concepts in satellite characterization and satellite state change. The first algorithm provides real-time stability estimation ... Timely and effective space object (SO) characterization is a challenge, and requires advanced data processing techniques. Detection and identification

  19. Muscle categorization using PDF estimation and Naive Bayes classification.

    PubMed

    Adel, Tameem M; Smith, Benn E; Stashuk, Daniel W

    2012-01-01

    The structure of motor unit potentials (MUPs) and their times of occurrence provide information about the motor units (MUs) that created them. As such, electromyographic (EMG) data can be used to categorize muscles as normal or suffering from a neuromuscular disease. Using pattern discovery (PD) allows clinicians to understand the rationale underlying a certain muscle characterization; i.e., it is transparent. Discretization is required in PD, which leads to some loss in accuracy. In this work, characterization techniques that are based on estimating probability density functions (PDFs) for each muscle category are implemented. Characterization probabilities of each motor unit potential train (MUPT) are obtained from these PDFs and then Bayes rule is used to aggregate the MUPT characterization probabilities to calculate muscle-level probabilities. Even though this technique is not as transparent as PD, its accuracy is higher than the discrete PD. Ultimately, the goal is to use a technique that is based on both PDFs and PD and make it as transparent and as efficient as possible, but first it was necessary to thoroughly assess how accurate a fully continuous approach can be. Using Gaussian PDF estimation achieved improvements in muscle categorization accuracy over PD, and further improvements resulted from using feature value histograms to choose more representative PDFs; for instance, using a log-normal distribution to represent skewed histograms.
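
    The Bayes-rule aggregation step can be sketched as follows; the per-train likelihoods are invented numbers, standing in for values read off the estimated category-conditional PDFs.

    import numpy as np

    def muscle_posterior(mupt_likelihoods, prior=(0.5, 0.5)):
        """Naive-Bayes aggregation of per-train likelihoods
        (P(features | normal), P(features | disordered)) into a muscle-level
        posterior over the two categories."""
        log_post = np.log(np.asarray(prior, dtype=float))
        for lik in mupt_likelihoods:
            log_post += np.log(np.asarray(lik, dtype=float))
        log_post -= log_post.max()               # numerical stability
        post = np.exp(log_post)
        return post / post.sum()

    # Three trains, each mildly favouring the disordered category:
    print(muscle_posterior([(0.4, 0.6), (0.3, 0.7), (0.45, 0.55)]))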

  20. Multiscale dispersion-state characterization of nanocomposites using optical coherence tomography

    PubMed Central

    Schneider, Simon; Eppler, Florian; Weber, Marco; Olowojoba, Ganiu; Weiss, Patrick; Hübner, Christof; Mikonsaari, Irma; Freude, Wolfgang; Koos, Christian

    2016-01-01

    Nanocomposite materials represent a success story of nanotechnology. However, development of nanomaterial fabrication still suffers from the lack of adequate analysis tools. In particular, achieving and maintaining well-dispersed particle distributions is a key challenge, both in material development and industrial production. Conventional methods like optical or electron microscopy need laborious, costly sample preparation and do not permit fast extraction of nanoscale structural information from statistically relevant sample volumes. Here we show that optical coherence tomography (OCT) represents a versatile tool for nanomaterial characterization, both in a laboratory and in a production environment. The technique does not require sample preparation and is applicable to a wide range of solid and liquid material systems. Large particle agglomerates can be directly found by OCT imaging, whereas dispersed nanoparticles are detected by model-based analysis of depth-dependent backscattering. Using a model system of polystyrene nanoparticles, we demonstrate nanoparticle sizing with high accuracy. We further prove the viability of the approach by characterizing highly relevant material systems based on nanoclays or carbon nanotubes. The technique is perfectly suited for in-line metrology in a production environment, which is demonstrated using a state-of-the-art compounding extruder. These experiments represent the first demonstration of multiscale nanomaterial characterization using OCT. PMID:27557544

  1. Multiscale dispersion-state characterization of nanocomposites using optical coherence tomography.

    PubMed

    Schneider, Simon; Eppler, Florian; Weber, Marco; Olowojoba, Ganiu; Weiss, Patrick; Hübner, Christof; Mikonsaari, Irma; Freude, Wolfgang; Koos, Christian

    2016-08-25

    Nanocomposite materials represent a success story of nanotechnology. However, development of nanomaterial fabrication still suffers from the lack of adequate analysis tools. In particular, achieving and maintaining well-dispersed particle distributions is a key challenge, both in material development and industrial production. Conventional methods like optical or electron microscopy need laborious, costly sample preparation and do not permit fast extraction of nanoscale structural information from statistically relevant sample volumes. Here we show that optical coherence tomography (OCT) represents a versatile tool for nanomaterial characterization, both in a laboratory and in a production environment. The technique does not require sample preparation and is applicable to a wide range of solid and liquid material systems. Large particle agglomerates can be directly found by OCT imaging, whereas dispersed nanoparticles are detected by model-based analysis of depth-dependent backscattering. Using a model system of polystyrene nanoparticles, we demonstrate nanoparticle sizing with high accuracy. We further prove the viability of the approach by characterizing highly relevant material systems based on nanoclays or carbon nanotubes. The technique is perfectly suited for in-line metrology in a production environment, which is demonstrated using a state-of-the-art compounding extruder. These experiments represent the first demonstration of multiscale nanomaterial characterization using OCT.
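
    A minimal sketch of the model-based analysis of depth-dependent backscattering, under a single-scattering assumption: the A-scan decays as I(z) = I0·exp(-2·mu_t·z), and the fitted attenuation coefficient mu_t is what a model inversion would relate to particle size and concentration. The model form and all numbers are illustrative, not the paper's calibration.

    import numpy as np

    z = np.linspace(0.0, 1.0e-3, 200)                # depth into the sample (m)
    rng = np.random.default_rng(0)
    mu_t_true = 3000.0                               # attenuation coefficient (1/m)
    a_scan = np.exp(-2.0 * mu_t_true * z) * rng.lognormal(0.0, 0.05, z.size)

    slope, _ = np.polyfit(z, np.log(a_scan), 1)      # linear fit of the log-signal
    print("fitted mu_t [1/m]:", -slope / 2.0)        # recovers ~3000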

  2. The sound manifesto

    NASA Astrophysics Data System (ADS)

    O'Donnell, Michael J.; Bisnovatyi, Ilia

    2000-11-01

    Computing practice today depends on visual output to drive almost all user interaction. Other senses, such as audition, may be totally neglected, or used tangentially, or used in highly restricted specialized ways. We have excellent audio rendering through D-A conversion, but we lack rich general facilities for modeling and manipulating sound comparable in quality and flexibility to graphics. We need coordinated research in several disciplines to improve the use of sound as an interactive information channel. Incremental and separate improvements in synthesis, analysis, speech processing, audiology, acoustics, music, etc. will not alone produce the radical progress that we seek in sonic practice. We also need to create a new central topic of study in digital audio research. The new topic will assimilate the contributions of different disciplines on a common foundation. The key central concept that we lack is sound as a general-purpose information channel. We must investigate the structure of this information channel, which is driven by the cooperative development of auditory perception and physical sound production. Particular audible encodings, such as speech and music, illuminate sonic information by example, but they are no more sufficient for a characterization than typography is sufficient for characterization of visual information. To develop this new conceptual topic of sonic information structure, we need to integrate insights from a number of different disciplines that deal with sound. In particular, we need to coordinate central and foundational studies of the representational models of sound with specific applications that illuminate the good and bad qualities of these models. Each natural or artificial process that generates informative sound, and each perceptual mechanism that derives information from sound, will teach us something about the right structure to attribute to the sound itself. The new Sound topic will combine the work of computer scientists with that of numerical mathematicians studying sonification, psychologists, linguists, bioacousticians, and musicians to illuminate the structure of sound from different angles. Each of these disciplines deals with the use of sound to carry a different sort of information, under different requirements and constraints. By combining their insights, we can learn to understand the structure of sound in general.

  3. Performance of quantitative vegetation sampling methods across gradients of cover in Great Basin plant communities

    USGS Publications Warehouse

    Pilliod, David S.; Arkle, Robert S.

    2013-01-01

    Resource managers and scientists need efficient, reliable methods for quantifying vegetation to conduct basic research, evaluate land management actions, and monitor trends in habitat conditions. We examined three methods for quantifying vegetation in 1-ha plots among different plant communities in the northern Great Basin: photography-based grid-point intercept (GPI), line-point intercept (LPI), and point-quarter (PQ). We also evaluated each method for within-plot subsampling adequacy and effort requirements relative to information gain. We found that, for most functional groups, percent cover measurements collected with the use of LPI, GPI, and PQ methods were strongly correlated. These correlations were even stronger when we used data from the upper canopy only (i.e., top “hit” of pin flags) in LPI to estimate cover. PQ was best at quantifying cover of sparse plants such as shrubs in early successional habitats. As cover of a given functional group decreased within plots, the variance of the cover estimate increased substantially, which required more subsamples per plot (i.e., transect lines, quadrats) to achieve reliable precision. For GPI, we found that that six–nine quadrats per hectare were sufficient to characterize the vegetation in most of the plant communities sampled. All three methods reasonably characterized the vegetation in our plots, and each has advantages depending on characteristics of the vegetation, such as cover or heterogeneity, study goals, precision of measurements required, and efficiency needed.
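
    For concreteness, the top-hit cover estimate used in the LPI comparison reduces to a simple proportion of pins; this sketch simplifies the field protocol, which records multiple canopy layers per pin.

    def percent_cover(top_hits, group):
        """top_hits: the uppermost canopy hit recorded at each pin drop."""
        return 100.0 * sum(1 for h in top_hits if h == group) / len(top_hits)

    pins = ["shrub", "grass", "bare", "shrub", "litter", "shrub", "grass", "bare"]
    print(percent_cover(pins, "shrub"))   # 37.5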

  4. Autonomous frequency domain identification: Theory and experiment

    NASA Technical Reports Server (NTRS)

    Yam, Yeung; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.; Scheid, R. E.

    1989-01-01

    The analysis, design, and on-orbit tuning of robust controllers require more information about the plant than simply a nominal estimate of the plant transfer function. Information is also required concerning the uncertainty in the nominal estimate, or more generally, the identification of a model set within which the true plant is known to lie. The identification methodology that was developed and experimentally demonstrated makes use of a simple but useful characterization of the model uncertainty based on the output error. This is a characterization of the additive uncertainty in the plant model, which has found considerable use in many robust control analysis and synthesis techniques. The identification process is initiated by a stochastic input u which is applied to the plant p, giving rise to the output y. The spectral estimate ĥ = P_uy/P_uu is used as an estimate of p, and the model order is estimated using the product moment matrix (PMM) method. A parametric model p̂ is then determined by curve fitting the spectral estimate to a rational transfer function. The additive uncertainty δ_m = p − p̂ is then estimated by the cross-spectral estimate δ̂ = P_ue/P_uu, where e = y − ŷ is the output error and ŷ = p̂u is the computed output of the parametric model subjected to the actual input u. The experimental results demonstrate that the curve fitting algorithm produces the reduced-order plant model which minimizes the additive uncertainty. The nominal transfer function estimate p̂ and the estimate δ̂ of the additive uncertainty δ_m are subsequently available to be used for optimization of robust controller performance and stability.
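
    The nonparametric stage of this procedure can be sketched with standard spectral-estimation routines; the second-order "plant" below is invented so the example runs end to end.

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(0)
    fs = 100.0
    u = rng.normal(size=20000)                       # stochastic input
    b, a = signal.butter(2, 5.0, fs=fs)              # stand-in plant dynamics
    y = signal.lfilter(b, a, u) + 0.01 * rng.normal(size=u.size)

    f, p_uu = signal.csd(u, u, fs=fs, nperseg=1024)  # input auto-spectrum
    _, p_uy = signal.csd(u, y, fs=fs, nperseg=1024)  # input/output cross-spectrum
    h_hat = p_uy / p_uu                              # nonparametric estimate of p

    # The additive-uncertainty estimate follows the same pattern, with the
    # output error e = y - y_hat (y_hat from the fitted parametric model)
    # replacing y in the cross-spectrum.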

  5. Context-Dependent Modulation of Functional Connectivity: Secondary Somatosensory Cortex to Prefrontal Cortex Connections in Two-Stimulus-Interval Discrimination Tasks

    PubMed Central

    Chow, Stephanie S.; Romo, Ranulfo; Brody, Carlos D.

    2010-01-01

    In a complex world, a sensory cue may prompt different actions in different contexts. A laboratory example of context-dependent sensory processing is the two-stimulus-interval discrimination task. In each trial, a first stimulus (f1) must be stored in short-term memory and later compared with a second stimulus (f2), for the animal to come to a binary decision. Prefrontal cortex (PFC) neurons need to interpret the f1 information in one way (perhaps with a positive weight) and the f2 information in an opposite way (perhaps with a negative weight), although they come from the very same secondary somatosensory cortex (S2) neurons; therefore, a functional sign inversion is required. This task thus provides a clear example of context-dependent processing. Here we develop a biologically plausible model of a context-dependent signal transformation of the stimulus encoding from S2 to PFC. To ground our model in experimental neurophysiology, we use neurophysiological data recorded by R. Romo’s laboratory from both cortical area S2 and PFC in monkeys performing the task. Our main goal is to use experimentally observed context-dependent modulations of firing rates in cortical area S2 as the basis for a model that achieves a context-dependent inversion of the sign of S2 to PFC connections. This is done without requiring any changes in connectivity (Salinas, 2004b). We (1) characterize the experimentally observed context-dependent firing rate modulation in area S2, (2) construct a model that results in the sign transformation, and (3) characterize the robustness and consequent biological plausibility of the model. PMID:19494146

  6. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
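
    A toy Monte Carlo rendering of the probabilistic idea: propagate lognormal uncertainty in an animal point of departure and in two adjustment factors to a confidence interval on the target human dose. All distributions and numbers are invented for illustration; the paper's framework specifies these choices in detail.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    pod = rng.lognormal(np.log(10.0), 0.3, n)    # animal point of departure, mg/kg-d
    inter = rng.lognormal(np.log(3.0), 0.4, n)   # animal-to-human adjustment
    intra = rng.lognormal(np.log(3.0), 0.5, n)   # human variability adjustment
    hd = pod / (inter * intra)                   # sampled target human dose

    lo, med, hi = np.percentile(hd, [5, 50, 95])
    print(f"90% CI: {lo:.2f}-{hi:.2f} mg/kg-d (median {med:.2f})")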

  7. Sub-component modeling for face image reconstruction in video communications

    NASA Astrophysics Data System (ADS)

    Shiell, Derek J.; Xiao, Jing; Katsaggelos, Aggelos K.

    2008-08-01

    Emerging communications trends point to streaming video as a new form of content delivery. These systems are implemented over wired systems, such as cable or ethernet, and wireless networks, cell phones, and portable game systems. These communications systems require sophisticated methods of compression and error-resilience encoding to enable communications across band-limited and noisy delivery channels. Additionally, the transmitted video data must be of high enough quality to ensure a satisfactory end-user experience. Traditionally, video compression makes use of temporal and spatial coherence to reduce the information required to represent an image. In many communications systems, the communications channel is characterized by a probabilistic model which describes the capacity or fidelity of the channel. The implication is that information is lost or distorted in the channel, and requires concealment on the receiving end. We demonstrate a generative-model-based transmission scheme to compress human face images in video, which has the advantages of a potentially higher compression ratio, while maintaining robustness to errors and data corruption. This is accomplished by training an offline face model and using the model to reconstruct face images on the receiving end. We propose a sub-component active appearance model (AAM) that models the appearance of sub-facial components individually, and show face reconstruction results under different types of video degradation using weighted and non-weighted versions of the sub-component AAM.

  8. Uncovering the requirements of cognitive work.

    PubMed

    Roth, Emilie M

    2008-06-01

    In this article, the author provides an overview of cognitive analysis methods and how they can be used to inform system analysis and design. Human factors has seen a shift toward modeling and support of cognitively intensive work (e.g., military command and control, medical planning and decision making, supervisory control of automated systems). Cognitive task analysis and cognitive work analysis methods extend traditional task analysis techniques to uncover the knowledge and thought processes that underlie performance in cognitively complex settings. The author reviews the multidisciplinary roots of cognitive analysis and the variety of cognitive task analysis and cognitive work analysis methods that have emerged. Cognitive analysis methods have been used successfully to guide system design, as well as development of function allocation, team structure, and training, so as to enhance performance and reduce the potential for error. A comprehensive characterization of cognitive work requires two mutually informing analyses: (a) examination of domain characteristics and constraints that define cognitive requirements and challenges and (b) examination of practitioner knowledge and strategies that underlie both expert and error-vulnerable performance. A variety of specific methods can be adapted to achieve these aims within the pragmatic constraints of particular projects. Cognitive analysis methods can be used effectively to anticipate cognitive performance problems and specify ways to improve individual and team cognitive performance (be it through new forms of training, user interfaces, or decision aids).

  9. The Multispectral Microscopic Imager: Integrating Microimaging with Spectroscopy for the In-Situ Exploration of the Moon

    NASA Technical Reports Server (NTRS)

    Nunez, J. I.; Farmer, J. D.; Sellar, R. G.; Allen, Carlton C.

    2010-01-01

    To maximize scientific return, future robotic and human missions to the Moon will need in-situ capabilities that enable the selection of the highest-value samples for return to Earth, or to a lunar base, for analysis. To accomplish this task efficiently, samples will need to be characterized using a suite of robotic instruments that can provide crucial information about elemental composition, mineralogy, volatiles, and ices. Such spatially correlated data sets, which place mineralogy into a microtextural context, are considered crucial for correct petrogenetic interpretations. Combining microscopic imaging with visible/near-infrared reflectance spectroscopy provides a powerful in-situ approach for obtaining mineralogy within a microtextural context. The approach is non-destructive and requires minimal mechanical sample preparation. It provides data sets comparable to what geologists routinely acquire in the field using a hand lens and in the lab using thin-section petrography, and it provides essential information for interpreting the primary formational processes in rocks and soils as well as the effects of secondary (diagenetic) alteration processes. Such observations lay a foundation for inferring geologic histories and provide "ground truth" for similar instruments on orbiting satellites; they support astronaut EVA activities, provide basic information about the physical properties of soils required for assessing associated health risks, and are basic tools in the exploration for in-situ resources to support human exploration of the Moon.

  10. Characterization of Athabasca lean oil sands and mixed surficial materials: Comparison of capillary electrophoresis/low-resolution mass spectrometry and high-resolution mass spectrometry.

    PubMed

    MacLennan, Matthew S; Peru, Kerry M; Swyngedouw, Chris; Fleming, Ian; Chen, David D Y; Headley, John V

    2018-05-15

    Oil sands mining in Alberta, Canada, requires removal and stockpiling of considerable volumes of near-surface overburden material. This overburden includes lean oil sands (LOS) which cannot be processed economically but contain sparingly soluble petroleum hydrocarbons and naphthenic acids, which can leach into environmental waters. In order to measure and track the leaching of dissolved constituents and distinguish industrially derived organics from naturally occurring organics in local waters, practical methods were developed for characterizing multiple sources of contaminated water leakage. Capillary electrophoresis/positive-ion electrospray ionization low-resolution time-of-flight mass spectrometry (CE/LRMS), high-resolution negative-ion electrospray ionization Orbitrap mass spectrometry (HRMS) and conventional gas chromatography/flame ionization detection (GC/FID) were used to characterize porewater samples collected from within Athabasca LOS and mixed surficial materials. GC/FID was used to measure total petroleum hydrocarbons and HRMS was used to measure total naphthenic acid fraction components (NAFCs). HRMS and CE/LRMS were used to characterize samples according to source. The amounts of total petroleum hydrocarbons in each sample as measured by GC/FID ranged from 0.1 to 15.1 mg/L while the amounts of NAFCs as measured by HRMS ranged from 5.3 to 82.3 mg/L. Factor analysis (FA) on HRMS data visually demonstrated clustering according to sample source and was correlated with molecular formula. LRMS coupled to capillary electrophoresis separation (CE/LRMS) provides important information on NAFC isomers by adding analyte migration time data to m/z and peak intensity. Differences in measured amounts of total petroleum hydrocarbons by GC/FID and NAFCs by HRMS indicate that the two methods provide complementary information about the nature of dissolved organic species in soil or water leachate samples. The NAFC molecular class OxSy is a possible tracer for LOS seepage. CE/LRMS provides complementary information and is a feasible and practical option for source evaluation of NAFCs in water. Copyright © 2018 John Wiley & Sons, Ltd.

  11. Advanced applications of scatterometry based optical metrology

    NASA Astrophysics Data System (ADS)

    Dixit, Dhairya; Keller, Nick; Kagalwala, Taher; Recchia, Fiona; Lifshitz, Yevgeny; Elia, Alexander; Todi, Vinit; Fronheiser, Jody; Vaid, Alok

    2017-03-01

    The semiconductor industry continues to drive patterning solutions that enable devices with higher memory storage capacity, faster computing performance, and lower cost per transistor. These developments in the field of semiconductor manufacturing, along with the overall minimization of transistor size, require continuous development of the metrology tools used for characterization of complex 3D device architectures. Optical scatterometry, or optical critical dimension (OCD) metrology, is one of the most prevalent inline metrology techniques in semiconductor manufacturing because it is a quick, precise and non-destructive technique. However, at present OCD is predominantly used to measure feature dimensions such as line-width, height, side-wall angle, etc. of patterned nanostructures. Use of optical scatterometry for characterizing defects such as pitch-walking, overlay, line edge roughness, etc. is fairly limited. Inspection of process-induced abnormalities is a fundamental part of process yield improvement: it provides process engineers with important information about process errors, and consequently helps optimize materials and process parameters. Scatterometry is an averaging technique, and extending it to measure the position of local process-induced defectivity and feature-to-feature variation is extremely challenging. This report is an overview of the applications and benefits of using optical scatterometry for characterizing defects such as pitch-walking, overlay and fin bending for advanced technology nodes beyond 7 nm. Currently, optical scatterometry is based on conventional spectroscopic ellipsometry and spectroscopic reflectometry measurements, but generalized ellipsometry, or Mueller matrix spectroscopic ellipsometry, provides important additional information about complex structures that exhibit anisotropy and depolarization effects. In addition, the symmetry-antisymmetry properties associated with Mueller matrix (MM) elements provide an excellent means of measuring asymmetry present in the structure. Here, this additional information and the symmetry-antisymmetry properties of MM elements are used to characterize fin bending and overlay defects, and design improvements in the OCD test structures are used to boost OCD's sensitivity to pitch-walking. In addition, the validity of the OCD-based results is established by comparing them to top-down critical dimension scanning electron microscope (CD-SEM) and cross-sectional transmission electron microscope (TEM) images.
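
    The symmetry argument can be made concrete: for a structure with mirror symmetry about the plane of incidence, the off-diagonal 2x2 blocks of the 4x4 Mueller matrix vanish, so their residual magnitude is a simple asymmetry indicator. The sketch below assumes that textbook property; the index and example matrices are illustrative, not a vendor algorithm.

      import numpy as np

      def asymmetry_index(mm: np.ndarray) -> float:
          """Sum of |elements| in the off-diagonal 2x2 blocks of a 4x4 Mueller
          matrix (m13, m14, m23, m24, m31, m32, m41, m42). These vanish for a
          structure with mirror symmetry about the plane of incidence, so a
          nonzero index flags asymmetry such as fin bending or pitch-walking."""
          return float(np.abs(mm[0:2, 2:4]).sum() + np.abs(mm[2:4, 0:2]).sum())

      symmetric = np.diag([1.0, 0.98, 0.95, 0.94])  # idealized symmetric sample
      bent = symmetric.copy()
      bent[0, 2] = 0.02                             # small asymmetric response
      print(asymmetry_index(symmetric), asymmetry_index(bent))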

  12. Light Water Reactor Sustainability Program, U.S. Efforts in Support of Examinations at Fukushima Daiichi-2017 Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, Mitchell T.

    Although the accident signatures from each unit at the Fukushima Daiichi Nuclear Power Station (NPS) [Daiichi] differ, much is not known about the end-state of core materials within these units. Some of this uncertainty can be attributed to a lack of information related to cooling system operation and cooling water injection. There is also uncertainty in our understanding of phenomena affecting: a) in-vessel core damage progression during severe accidents in boiling water reactors (BWRs), and b) accident progression after vessel failure (ex-vessel progression) for BWRs and Pressurized Water Reactors (PWRs). These uncertainties arise due to limited full-scale prototypic data. Similar to what occurred after the accident at Three Mile Island Unit 2, these Daiichi units offer the international community a means to reduce such uncertainties by obtaining prototypic data from multiple full-scale BWR severe accidents. Information obtained from Daiichi is required to inform Decontamination and Decommissioning (D&D) activities, improving the ability of the Tokyo Electric Power Company Holdings, Incorporated (TEPCO Holdings) to characterize potential hazards and to ensure the safety of workers involved with cleanup activities. This document, which has been updated to include FY2017 information, summarizes results from U.S. efforts to use information obtained by TEPCO Holdings to enhance the safety of existing and future nuclear power plant designs. This effort, which was initiated in 2014 by the Reactor Safety Technologies Pathway of the Department of Energy Office of Nuclear Energy Light Water Reactor (LWR) Sustainability Program, consists of a group of U.S. experts in LWR safety and plant operations who have identified examination needs and are evaluating TEPCO Holdings information from Daiichi that addresses these needs. Each year, annual reports include examples demonstrating that significant safety insights are being obtained in the areas of component performance, fission product release and transport, debris end-state location, and combustible gas generation and transport. In addition to reducing uncertainties related to severe accident progression modeling, these insights are being used to update guidance for severe accident prevention, mitigation, and emergency planning. Furthermore, reduced uncertainties in modeling the events at Daiichi will improve the realism of reactor safety evaluations and inform future D&D activities by improving the capability for characterizing potential hazards to workers involved with cleanup activities. Highlights in this FY2017 report include new insights with respect to the forces required to produce the observed Daiichi Unit 1 (1F1) shield plug end-state, the observed leakage from 1F1 components, and the amount of combustible gas generation required to produce the observed explosions in Daiichi Units 3 and 4 (1F3 and 1F4). This report contains an appendix with a list of examination needs that was updated after U.S. experts reviewed recently obtained information from examinations at Daiichi. Additional details for higher-priority, near-term examination activities are also provided. The report also includes an appendix describing an updated website that has been reformatted to better assist U.S. experts by providing information in an archived, retrievable location, as well as an appendix summarizing U.S. forensics activities in hosting the TMI-2 Knowledge Transfer and Relevance to Fukushima Meeting held in Idaho Falls, ID, on October 10-14, 2016.

  13. Informal settlements and a relational view of health in Nairobi, Kenya: sanitation, gender and dignity.

    PubMed

    Corburn, Jason; Karanja, Irene

    2016-06-01

    On an urban planet, slums or informal settlements present an increasing challenge for health promotion. The living conditions in complex informal settlements interact with how people navigate through their daily lives and political institutions to shape health inequities. In this article, we suggest that only a relational place-based characterization of informal settlements can accurately capture the forces contributing to existing urban health inequities and inform appropriate and effective health promotion interventions. We explore our relational framework using household survey, spatial mapping and qualitative focus group data gathered in partnership with residents and non-governmental organizations in the Mathare informal settlement in Nairobi, Kenya. All data interpretation included participation with local residents and organizations. We focus on the inter-relationships between inadequate sanitation and disease, social, economic and human rights for women and girls, whom we show to be most vulnerable to poor slum infrastructure. We suggest that this collaborative process results in co-produced insights about the meanings and relationships between infrastructure, security, resilience and health. We conclude that complex informal settlements require relational and context-specific data gathering and analyses to understand the multiple determinants of health and to inform appropriate and effective healthy city interventions. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Guide RNA selection for CRISPR-Cas9 transfections in Plasmodium falciparum.

    PubMed

    Ribeiro, Jose M; Garriga, Meera; Potchen, Nicole; Crater, Anna K; Gupta, Ankit; Ito, Daisuke; Desai, Sanjay A

    2018-06-12

    CRISPR-Cas9 mediated genome editing is addressing key limitations in the transfection of malaria parasites. While this method has already simplified the needed molecular cloning and reduced the time required to generate mutants in the human pathogen Plasmodium falciparum, optimal selection of required guide RNAs and guidelines for successful transfections have not been well characterized, leading workers to use time-consuming trial and error approaches. We used a genome-wide computational approach to create a comprehensive and publicly accessible database of possible guide RNA sequences in the P. falciparum genome. For each guide, we report on-target efficiency and specificity scores as well as information about the genomic site relevant to optimal design of CRISPR-Cas9 transfections to modify, disrupt, or conditionally knockdown any gene. As many antimalarial drug and vaccine targets are encoded by multigene families, we also developed a new paralog specificity score that should facilitate modification of either a single family member of interest or multiple paralogs that serve overlapping roles. Finally, we tabulated features of successful transfections in our laboratory, providing broadly useful guidelines for parasite transfections. Molecular studies aimed at understanding parasite biology or characterizing drug and vaccine targets in P. falciparum should be facilitated by this comprehensive database. Published by Elsevier Ltd.
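
    As a minimal illustration of guide selection, not the authors' scored pipeline, the following scans a sequence for SpCas9 NGG PAM sites and reports each 20-nt protospacer with its GC content; the demo sequence is made up.

      import re

      def find_guides(seq: str):
          """Scan a DNA sequence for SpCas9 targets: a 20-nt protospacer
          immediately 5' of an NGG PAM. Returns (position, guide, GC fraction).
          A minimal illustration, not an on-target/off-target scoring pipeline."""
          seq = seq.upper()
          guides = []
          # Lookahead so overlapping candidate sites are all reported.
          for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq):
              g = m.group(1)
              gc = (g.count("G") + g.count("C")) / len(g)
              guides.append((m.start(), g, round(gc, 2)))
          return guides

      demo = "ATGCGTACCGTTAGCATGCAGTACGGATCGATCGGTACGTAGCTAGCTAGGCGGTAC"
      for pos, guide, gc in find_guides(demo):
          print(pos, guide, "GC:", gc)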

  15. An innovative approach to the safety evaluation of natural products: cranberry (Vaccinium macrocarpon Aiton) leaf aqueous extract as a case study.

    PubMed

    Booth, Nancy L; Kruger, Claire L; Wallace Hayes, A; Clemens, Roger

    2012-09-01

    Assessment of safety for a food or dietary ingredient requires determination of a safe level of ingestion compared to the estimated daily intake from its proposed uses. The nature of the assessment may require the use of different approaches, determined on a case-by-case basis. Natural products are chemically complex and challenging to characterize for the purpose of carrying out a safety evaluation. For example, a botanical extract contains numerous compounds, many of which vary across batches due to changes in environmental conditions and handling. Key components integral to the safety evaluation must be identified and their variability established to assure that specifications are representative of a commercial product over time and protective of the consumer; one can then extrapolate the results of safety studies on a single batch of product to other batches that are produced under similar conditions. Safety of a well-characterized extract may be established based on the safety of its various components. When sufficient information is available from the public literature, additional toxicology testing is not necessary for a safety determination on the food or dietary ingredient. This approach is demonstrated in a case study of an aqueous extract of cranberry (Vaccinium macrocarpon Aiton) leaves. Copyright © 2012. Published by Elsevier Ltd.

  16. Brain activity related to working memory for temporal order and object information.

    PubMed

    Roberts, Brooke M; Libby, Laura A; Inhoff, Marika C; Ranganath, Charan

    2017-06-08

    Maintaining items in an appropriate sequence is important for many daily activities; however, remarkably little is known about the neural basis of human temporal working memory. Prior work suggests that the prefrontal cortex (PFC) and medial temporal lobe (MTL), including the hippocampus, play a role in representing information about temporal order. The involvement of these areas in successful temporal working memory, however, is less clear. Additionally, it is unknown whether regions in the PFC and MTL support temporal working memory across different timescales, or at coarse or fine levels of temporal detail. To address these questions, participants were scanned while completing 3 working memory task conditions (Group, Position and Item) that were matched in terms of difficulty and the number of items to be actively maintained. Group and Position trials probed temporal working memory processes, requiring the maintenance of hierarchically organized coarse and fine temporal information, respectively. To isolate activation related to temporal working memory, Group and Position trials were contrasted against Item trials, which required detailed working memory maintenance of visual objects. Results revealed that working memory encoding and maintenance of temporal information relative to visual information was associated with increased activation in dorsolateral PFC (DLPFC), and perirhinal cortex (PRC). In contrast, maintenance of visual details relative to temporal information was characterized by greater activation of parahippocampal cortex (PHC), medial and anterior PFC, and retrosplenial cortex. In the hippocampus, a dissociation along the longitudinal axis was observed such that the anterior hippocampus was more active for working memory encoding and maintenance of visual detail information relative to temporal information, whereas the posterior hippocampus displayed the opposite effect. Posterior parietal cortex was the only region to show sensitivity to temporal working memory across timescales, and was particularly involved in the encoding and maintenance of fine temporal information relative to maintenance of temporal information at more coarse timescales. Collectively, these results highlight the involvement of PFC and MTL in temporal working memory processes, and suggest a dissociation in the type of working memory information represented along the longitudinal axis of the hippocampus. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Characterizing the Undergraduate Neuroscience Major in the U.S.: An Examination of Course Requirements and Institution-Program Associations

    PubMed Central

    Pinard-Welyczko, Kira M.; Garrison, Anna C. S.; Ramos, Raddy L.; Carter, Bradley S.

    2017-01-01

    Neuroscience is a rapidly expanding field, and many colleges and universities throughout the country are implementing new neuroscience degree programs. Despite the field’s growth and popularity, little data exists on the structural character of current undergraduate neuroscience programs. We collected and examined comprehensive data on existing undergraduate neuroscience programs, including academic major requirements and institution characteristics such as size, financial resources, and research opportunities. Thirty-one variables covering information about course requirements, department characteristics, financial resources, and institution characteristics were collected from 118 colleges and universities in the United States that offer a major titled “neuroscience” or “neural sciences.” Data was collected from publicly available sources (online databases, institutions’ neuroscience program websites) and then analyzed to define the average curriculum and identify associations between institution and program characteristics. Our results suggest that the average undergraduate neuroscience major requires 3 chemistry, 3 biology, 3 laboratory, 2–3 neuroscience, 1 physics, 1 math, and 2 psychology courses, suggesting that most neuroscience programs emphasize the natural sciences over the social sciences. Additionally, while 98% of institutions in our database offer research opportunities, only 31% required majors to perform research. Of note, 70% of institutions offering a neuroscience major do not have a neuroscience department, suggesting that most institutions offer neuroscience as an interdisciplinary major spanning several departments. Finally, smaller liberal arts colleges account for the majority of institutions offering a neuroscience major. Overall, these findings may be useful for informing groups interested in undergraduate neuroscience training, including institutions looking to improve or establish programs, students wanting to major in neuroscience and employers hiring neuroscience graduates. PMID:29371843

  18. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-12

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  19. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-05-19

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy ''Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO)'' (Nguyen 1999a), ''Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO)'' (Nguyen 1999b), ''Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO)'' (Patello et al. 1999), and ''Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO)'' (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide sub-samples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  20. Automaticity in Anxiety Disorders and Major Depressive Disorder

    PubMed Central

    Teachman, Bethany A.; Joormann, Jutta; Steinman, Shari; Gotlib, Ian H.

    2012-01-01

    In this paper we examine the nature of automatic cognitive processing in anxiety disorders and Major Depressive Disorder (MDD). Rather than viewing automaticity as a unitary construct, we follow a social cognition perspective (Bargh, 1994) that argues for four theoretically independent features of automaticity: unconscious (processing of emotional stimuli occurs outside awareness), efficient (processing emotional meaning uses minimal attentional resources), unintentional (no goal is needed to engage in processing emotional meaning), and uncontrollable (limited ability to avoid, alter or terminate processing emotional stimuli). Our review of the literature suggests that most anxiety disorders are characterized by uncontrollable, and likely also unconscious and unintentional, biased processing of threat-relevant information. In contrast, MDD is most clearly typified by uncontrollable, but not unconscious or unintentional, processing of negative information. For the anxiety disorders and for MDD, there is not sufficient evidence to draw firm conclusions about efficiency of processing, though early indications are that neither anxiety disorders nor MDD are characterized by this feature. Clinical and theoretical implications of these findings are discussed and directions for future research are offered. In particular, it is clear that paradigms that more directly delineate the different features of automaticity are required to gain a more comprehensive and systematic understanding of the importance of automatic processing in emotion dysregulation. PMID:22858684

  1. Raman spectroscopy in biomedicine – non-invasive in vitro analysis of cells and extracellular matrix components in tissues

    PubMed Central

    Brauchle, Eva; Schenke-Layland, Katja

    2013-01-01

    Raman spectroscopy is an established laser-based technology for the quality assurance of pharmaceutical products. Over the past few years, Raman spectroscopy has become a powerful diagnostic tool in the life sciences. Raman spectra allow assessment of the overall molecular constitution of biological samples, based on specific signals from proteins, nucleic acids, lipids, carbohydrates, and inorganic crystals. Measurements are non-invasive and do not require sample processing, making Raman spectroscopy a reliable and robust method with numerous applications in biomedicine. Moreover, Raman spectroscopy allows the highly sensitive discrimination of bacteria. Raman spectra retain information on continuous metabolic processes and kinetics such as lipid storage and recombinant protein production. Raman spectra are specific for each cell type and provide additional information on cell viability, differentiation status, and tumorigenicity. In tissues, Raman spectroscopy can detect major extracellular matrix components and their secondary structures. Furthermore, the non-invasive characterization of healthy and pathological tissues as well as quality control and process monitoring of in vitro-engineered matrix is possible. This review provides comprehensive insight into the current progress in expanding the applicability of Raman spectroscopy for the characterization of living cells and tissues, and serves as a good reference point for those starting in the field. PMID:23161832

  2. How do patients access the private sector in Chennai, India? An evaluation of delays in tuberculosis diagnosis.

    PubMed

    Bronner Murrison, L; Ananthakrishnan, R; Swaminathan, A; Auguesteen, S; Krishnan, N; Pai, M; Dowdy, D W

    2016-04-01

    The diagnosis and treatment of tuberculosis (TB) in India are characterized by heavy private-sector involvement. Delays in treatment remain poorly characterized among patients seeking care in the Indian private sector. To assess delays in TB diagnosis and treatment initiation among patients diagnosed in the private sector, and pathways to care in an urban setting. Cross-sectional survey of 289 consecutive patients diagnosed with TB in the private sector and referred for anti-tuberculosis treatment through a public-private mix program in Chennai from January 2014 to February 2015. Among 212 patients with pulmonary TB, 90% first contacted a formal private provider, and 78% were diagnosed by the first or second provider seen after a median of three visits per provider. Median total delay was 51 days (mean 68). Consulting an informal (rather than formally trained) provider first was associated with significant increases in total delay (absolute increase 22.8 days, 95%CI 6.2-39.5) and in the risk of prolonged delay >90 days (aRR 2.4, 95%CI 1.3-4.4). Even among patients seeking care in the formal (vs. informal) private sector in Chennai, diagnostic delays are substantial. Novel strategies are required to engage private providers, who often serve as the first point of contact.

  3. Application of reflectance micro-Fourier Transform infrared analysis to the study of coal macerals: An example from the Late Jurassic to Early Cretaceous coals of the Mist Mountain Formation, British Columbia, Canada

    USGS Publications Warehouse

    Mastalerz, Maria; Bustin, R.M.

    1996-01-01

    The applicability of the reflectance micro-Fourier Transform infrared spectroscopy (FTIR) technique for analyzing the distribution of functional groups in coal macerals is discussed. The high quality of the spectra, comparable to those obtained using other FTIR techniques (KBr pellet and transmission micro-FTIR), indicates that this technique can be applied to characterizing functional groups under most conditions. The ease of sample preparation, the potential to analyze large intact samples, and the ability to characterize organic matter in areas as small as 20 μm are the main advantages of reflectance micro-FTIR. The quantitative aspects of reflectance micro-FTIR require further study. The examples from the coal seams of the Mist Mountain Formation, British Columbia show that at high volatile bituminous rank, reflectance micro-FTIR provides valuable information on the character of aliphatic chains of vitrinite and liptinite macerals. Because the character of aliphatic chains influences bond disassociation energies, such information is useful from a hydrocarbon generation viewpoint. In medium volatile bituminous coal, liptinite macerals are usually not detectable, but this technique can be used to study the degree of oxidation and reactivity of vitrinite and semifusinite.

  4. A simple-harmonic model for depicting the annual cycle of seasonal temperatures of streams

    USGS Publications Warehouse

    Steele, Timothy Doak

    1978-01-01

    Due to economic or operational constraints, stream-temperature records cannot always be collected at all sites where information is desired or at frequencies dictated by continuous or near-continuous surveillance requirements. For streams where only periodic measurements are made during the year, and that are not appreciably affected by regulation or by thermal loading, a simple harmonic function may adequately depict the annual seasonal cycle of stream temperature at any given site. Resultant harmonic coefficients obtained from available stream-temperature records may be used in the following ways: (1) to interpolate between discrete measurements by solving the harmonic function at specified times, thereby filling in estimates of stream-temperature values; (2) to characterize areal or regional patterns of natural stream-temperature conditions; and (3) to detect and assess any significant changes at a site brought about by streamflow regulation or basin development. Moreover, less-than-daily sampling frequencies at a given site may give estimates of annual variation of stream temperatures that are statistically comparable to estimates obtained from a daily or continuous sampling scheme. The latter procedure may result in potential savings of resources in network operations, with negligible loss of information on annual stream-temperature variations. (Woodard-USGS)
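
    The described harmonic fit reduces to linear least squares once the sinusoid is expanded into sine and cosine terms. A sketch with hypothetical day-of-year temperature measurements:

      import numpy as np

      # Hypothetical periodic measurements: day of year and water temperature (°C).
      t = np.array([15, 46, 74, 105, 135, 166, 196, 227, 258, 288, 319, 349])
      temp = np.array([2.1, 2.8, 5.9, 10.4, 15.2, 19.0, 21.3, 20.6, 16.8, 11.5, 6.3, 3.0])

      # Simple harmonic model: T(t) = a0 + a1*sin(w*t) + a2*cos(w*t), w = 2*pi/365.
      w = 2 * np.pi / 365.0
      X = np.column_stack([np.ones_like(t, dtype=float), np.sin(w * t), np.cos(w * t)])
      a0, a1, a2 = np.linalg.lstsq(X, temp, rcond=None)[0]

      amplitude = np.hypot(a1, a2)          # half-range of the annual cycle
      phase = np.arctan2(a2, a1)            # phase shift in radians
      print(f"mean {a0:.1f} °C, amplitude {amplitude:.1f} °C, phase {phase:.2f} rad")

      # Interpolate: estimated temperature on any day of the year.
      day = 200
      print(f"day {day}: {a0 + a1 * np.sin(w * day) + a2 * np.cos(w * day):.1f} °C")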

  5. Microspectroscopic Analysis of Anthropogenic- and Biogenic-Influenced Aerosol Particles during the SOAS Field Campaign

    NASA Astrophysics Data System (ADS)

    Ault, A. P.; Bondy, A. L.; Nhliziyo, M. V.; Bertman, S. B.; Pratt, K.; Shepson, P. B.

    2013-12-01

    During the summer, the southeastern United States experiences a cooling haze due to the interaction of anthropogenic and biogenic aerosol sources. An objective of the summer 2013 Southern Oxidant and Aerosol Study (SOAS) was to improve our understanding of how trace gases and aerosols contribute to this relative cooling through light scattering and absorption. Improving understanding of biogenic-anthropogenic interactions through secondary organic aerosol (SOA) formation on primary aerosol cores requires detailed physicochemical characterization of the particles after uptake and processing. Our measurements focus on single-particle analysis of aerosols in the accumulation mode (300-1000 nm) collected using a micro-orifice uniform deposit impactor (MOUDI) at the Centreville, Alabama SEARCH site. Particles were characterized using an array of microscopic and spectroscopic techniques, including scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray analysis (EDX), and Raman microspectroscopy. These analyses provide detailed information on particle size, morphology, elemental composition, and functional groups. This information is combined with mapping capabilities to explore individual-particle spatial patterns and how they impact structural characteristics. The improved understanding will be used to explore how sources and processing (such as SOA coating of soot) change particle structure (i.e., core-shell) and how the altered optical properties impact air quality and climate effects on a regional scale.

  6. Construction and Resource Utilization Explorer (CRUX): Implementing Instrument Suite Data Fusion to Characterize Regolith Hydrogen Resources

    NASA Technical Reports Server (NTRS)

    Haldemann, Albert F. C.; Johnson, Jerome B.; Elphic, Richard C.; Boynton, William V.; Wetzel, John

    2006-01-01

    CRUX is a modular suite of geophysical and borehole instruments combined with display and decision support system (MapperDSS) tools to characterize regolith resources, surface conditions, and geotechnical properties. CRUX is a NASA-funded Technology Maturation Program effort to provide enabling technology for Lunar and Planetary Surface Operations (LPSO). The MapperDSS uses data fusion methods with CRUX instruments, and other available data and models, to provide regolith properties information needed for LPSO that cannot be determined otherwise. We demonstrate the data fusion method by showing how it might be applied to characterize the distribution and form of hydrogen using a selection of CRUX instruments: Borehole Neutron Probe and Thermal Evolved Gas Analyzer data as a function of depth help interpret Surface Neutron Probe data to generate 3D information. Secondary information from other instruments along with physical models improves the hydrogen distribution characterization, enabling information products for operational decision-making.

  7. Risk Informed Margins Management as part of Risk Informed Safety Margin Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith

    2014-06-01

    The ability to better characterize and quantify safety margin is important to improved decision making about Light Water Reactor (LWR) design, operation, and plant life extension. A systematic approach to the characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the Risk Informed Safety Margin Characterization (RISMC) Pathway provides methods and tools that enable mitigation options known as risk informed margins management (RIMM) strategies.

  8. Tank 241-AX-104 upper vadose zone cone penetrometer demonstration sampling and analysis plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FIELD, J.G.

    1999-02-02

    This sampling and analysis plan (SAP) is the primary document describing field and laboratory activities and requirements for the tank 241-AX-104 upper vadose zone cone penetrometer (CP) demonstration. It is written in accordance with Hanford Tank Initiative Tank 241-AX-104 Upper Vadose Zone Demonstration Data Quality Objective (Banning 1999). This technology demonstration, to be conducted at tank 241-AX-104, is being performed by the Hanford Tanks Initiative (HTI) Project as a part of the Tank Waste Remediation System (TWRS) Retrieval Program (EM-30) and the Office of Science and Technology (EM-50) Tanks Focus Area. Sample results obtained as part of this demonstration will provide additional information for subsequent revisions to the Retrieval Performance Evaluation (RPE) report (Jacobs 1998). The RPE report is the result of an evaluation of a single tank farm (AX Tank Farm) used as the basis for demonstrating a methodology for developing the data and analyses necessary to support making tank waste retrieval decisions within the context of tank farm closure requirements. The RPE includes a study of vadose zone contaminant transport mechanisms, including analysis of projected tank leak characteristics, hydrogeologic characteristics of tank farm soils, and the observed distribution of contaminants in the vadose zone in the tank farms. With limited characterization information available, large uncertainties exist as to the nature and extent of contaminants that may exist in the upper vadose zone in the AX Tank Farm. Traditionally, data have been collected from soils in the vadose zone through the installation of boreholes and wells. Soil samples are collected as the borehole is advanced, and samples are screened on site and/or sent to a laboratory for analysis. Some in-situ geophysical methods of contaminant analysis can be used to evaluate radionuclide levels in the soils adjacent to an existing borehole. However, geophysical methods require compensation for well casing interference and soil moisture content and may not be successful in some conditions. In some cases the level of interference must be estimated due to uncertainties regarding the materials used in well construction and soil conditions. Well casing deployment used for many in-situ geophysical methods is relatively expensive, and geophysical methods do not generally provide real-time values for contaminants. In addition, some of these methods are not practical within the boundaries of the tank farm due to physical constraints, such as underground piping and other hardware. The CP technologies could facilitate future characterization of vadose zone soils by providing vadose zone data in near real time, reducing the number of soil samples and boreholes required, and reducing characterization costs.

  9. Field emission scanning electron microscopy (FE-SEM) as an approach for nanoparticle detection inside cells.

    PubMed

    Havrdova, M; Polakova, K; Skopalik, J; Vujtek, M; Mokdad, A; Homolkova, M; Tucek, J; Nebesarova, J; Zboril, R

    2014-12-01

    When developing new nanoparticles for bio-applications, it is important to fully characterize the nanoparticle's behavior in biological systems. The most common techniques employed for mapping nanoparticles inside cells include transmission electron microscopy (TEM) and scanning transmission electron microscopy (STEM). These techniques entail passing an electron beam through a thin specimen. STEM or TEM imaging is often used for the detection of nanoparticles inside cellular organelles. However, lengthy sample preparation is required (i.e., fixation, dehydration, drying, resin embedding, and cutting). In the present work, a new matrix (FTO glass) for biological samples was used and characterized by field emission scanning electron microscopy (FE-SEM) to generate images comparable to those obtained by TEM. Using FE-SEM, nanoparticle images were acquired inside endo/lysosomes without disruption of the cellular shape. Furthermore, the initial steps of nanoparticle incorporation into the cells were captured. In addition, the conductive FTO glass endowed the sample with high stability under the required accelerating voltage. Owing to these features of the sample, further analyses could be performed (material contrast and energy-dispersive X-ray spectroscopy (EDS)), which confirmed the presence of nanoparticles inside the cells. The results showed that FE-SEM can enable detailed characterization of nanoparticles in endosomes without the need for contrast staining or metal coating of the sample. Images showing the intracellular distribution of nanoparticles together with cellular morphology can give important information on the biocompatibility and demonstrate the potential of nanoparticle utilization in medicine. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. 30 CFR 229.10 - Information collection requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Information collection requirements. 229.10... MANAGEMENT DELEGATION TO STATES General Provisions § 229.10 Information collection requirements. The information collection requirements contained in this part do not require approval by the Office of Management...

  11. Waste Generation Overview, Course 23263

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, Lewis Edward

    This course, Waste Generation Overview Live (COURSE 23263), provides an overview of federal and state waste management regulations, as well as Los Alamos National Laboratory (LANL) policies and procedures for waste management operations. The course covers the activities involved in the cradle-to-grave waste management process and focuses on waste characterization, waste compatibility determinations and classification, and the storage requirements for temporary waste accumulation areas at LANL. When you have completed this course, you will be able to recognize federal, state, and LANL environmental requirements and their impact on waste operations; recognize the importance of the cradle-to-grave waste management process; identify the roles and responsibilities of key LANL waste management personnel (e.g., Waste Generator, Waste Management Coordinator, Waste Stream Profile approver, and Waste Certification Official); characterize a waste stream to determine whether it meets the definition of a hazardous waste, as well as characterize the use and minimum requirements for use of acceptable knowledge (AK) for waste characterization and waste compatibility documentation requirements; and identify the requirements for setting up and managing temporary waste accumulation areas.

  12. EO-1 analysis applicable to coastal characterization

    NASA Astrophysics Data System (ADS)

    Burke, Hsiao-hua K.; Misra, Bijoy; Hsu, Su May; Griffin, Michael K.; Upham, Carolyn; Farrar, Kris

    2003-09-01

    The EO-1 satellite is part of NASA's New Millennium Program (NMP). It carries three imaging sensors: the multi-spectral Advanced Land Imager (ALI), Hyperion, and the Atmospheric Corrector. Hyperion is a high-resolution hyperspectral imager capable of resolving 220 spectral bands (from 0.4 to 2.5 micron) at 30 m resolution, imaging a 7.5 km by 100 km land area per image. Hyperion has been the only space-borne HSI data source since the launch of EO-1 in late 2000. The discussion begins with the unique capabilities of hyperspectral sensing for coastal characterization: (1) most ocean-feature algorithms are semi-empirical retrievals, and HSI has all the spectral bands needed to provide legacy with previous sensors and to explore new information; (2) coastal features are more complex than those of the deep ocean, so coupled effects are best resolved with HSI; and (3) with contiguous spectral coverage, atmospheric compensation can be done with more accuracy and confidence, especially since atmospheric aerosol effects are most pronounced in the visible region where coastal features lie. EO-1 data from Chesapeake Bay from 19 February 2002 are analyzed. It is first illustrated that hyperspectral data inherently provide more information for feature extraction than multispectral data, even though Hyperion has a lower SNR than ALI. Chlorophyll retrievals are also shown; the results compare favorably with data from other sources. The analysis illustrates the potential value of Hyperion (and HSI in general) data for coastal characterization. Future measurement requirements (airborne and space-borne) are also discussed.
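
    Semi-empirical chlorophyll retrievals of the kind referenced here are typically polynomials in a log blue-green band ratio. The sketch below shows that generic form only; the coefficients are placeholders, not a calibrated ocean-color algorithm.

      import numpy as np

      # Illustrative band-ratio form: log10(chl) as a polynomial in
      # R = log10(max blue Rrs / green Rrs). Coefficients are placeholders.
      A = [0.34, -3.0, 2.0, 0.6, -1.5]

      def chlorophyll(rrs443, rrs490, rrs555):
          r = np.log10(np.maximum(rrs443, rrs490) / rrs555)
          log_chl = A[0] + A[1] * r + A[2] * r**2 + A[3] * r**3 + A[4] * r**4
          return 10.0 ** log_chl          # mg/m^3

      # Hypothetical remote-sensing reflectances after atmospheric compensation.
      print(f"{chlorophyll(0.004, 0.005, 0.003):.2f} mg/m^3")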

  13. Characterizing spatial structure of sediment E. coli populations to inform sampling design.

    PubMed

    Piorkowski, Gregory S; Jamieson, Rob C; Hansen, Lisbeth Truelstrup; Bezanson, Greg S; Yost, Chris K

    2014-01-01

    Escherichia coli can persist in streambed sediments and influence water quality monitoring programs through their resuspension into overlying waters. This study examined the spatial patterns in E. coli concentration and population structure within streambed morphological features during baseflow and following stormflow to inform sampling strategies for representative characterization of E. coli populations within a stream reach. E. coli concentrations in bed sediments were significantly different (p = 0.002) among monitoring sites during baseflow, and significant interactive effects (p = 0.002) occurred among monitoring sites and morphological features following stormflow. Least absolute shrinkage and selection operator (LASSO) regression revealed that water velocity and effective particle size (D10) explained E. coli concentration during baseflow, whereas sediment organic carbon, water velocity and median particle diameter (D50) were important explanatory variables following stormflow. Principal Coordinate Analysis illustrated the site-scale differences in sediment E. coli populations between disconnected stream segments. Also, E. coli populations were similar among depositional features within a reach, but differed in relation to high-velocity features (e.g., riffles). Canonical correspondence analysis resolved that E. coli population structure was explained primarily by spatial variables (26.9–31.7%) over environmental variables (9.2–13.1%). Spatial autocorrelation existed among monitoring sites and morphological features for both sampling events, and gradients in mean particle diameter and water velocity influenced E. coli population structure for the baseflow and stormflow sampling events, respectively. Representative characterization of streambed E. coli requires sampling of both depositional and high-velocity environments to accommodate strain selectivity among these features owing to sediment and water velocity heterogeneity.
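
    A sketch of the LASSO step described, using scikit-learn on hypothetical site-level predictors; the variable names and simulated data are illustrative only.

      import numpy as np
      from sklearn.linear_model import LassoCV
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)

      # Hypothetical site-level predictors: water velocity (m/s), D10 (mm),
      # D50 (mm), organic carbon (%); response: log E. coli concentration.
      X = rng.normal(size=(40, 4))
      beta = np.array([1.2, -0.8, 0.0, 0.5])          # D50 irrelevant by design
      y = X @ beta + rng.normal(scale=0.3, size=40)

      # LASSO with cross-validated penalty shrinks irrelevant coefficients to zero.
      model = LassoCV(cv=5).fit(StandardScaler().fit_transform(X), y)
      for name, coef in zip(["velocity", "D10", "D50", "organic_C"], model.coef_):
          print(f"{name:10s} {coef:+.3f}")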

  14. Modern Workflows for Fracture Rock Hydrogeology

    NASA Astrophysics Data System (ADS)

    Doe, T.

    2015-12-01

    Discrete Fracture Network (DFN) simulation is a numerical approach that represents a conducting fracture network using geologically realistic geometries and single-conductor hydraulic and transport properties. In terms of diffusion analogues, equivalent porous media derive from heat conduction in continuous media, while DFN simulation is more similar to electrical flow and diffusion in circuits with discrete pathways. DFN modeling grew out of the pioneering work of David Snow in the late 1960s, with additional impetus in the 1970s from the development of stochastic approaches for describing fracture geometric and hydrologic properties. Research in underground test facilities for radioactive waste disposal developed the necessary linkages between characterization technologies and simulation, as well as bringing about a hybrid deterministic-stochastic approach. Over the past 40 years, DFN simulation and characterization methods have moved from the research environment into practical, commercial application. The key geologic, geophysical, and hydrologic tools provide the required DFN inputs of conductive fracture intensity, orientation, and transmissivity. Flow logging, either using downhole tools or by detailed packer testing, identifies the locations of conducting features in boreholes, and image logging provides information on the geology and geometry of the conducting features. Multi-zone monitoring systems isolate the individual conductors and, with subsequent drilling and characterization perturbations, help to recognize connectivity and compartmentalization in the fracture network. Tracer tests and core analysis provide critical information on transport properties, especially matrix diffusion and unidentified conducting pathways. Well test analyses incorporating flow dimension and boundary effects provide further constraint on the conducting geometry of the fracture network.
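
    The stochastic description at the heart of a DFN can be sketched in a few lines: fracture centers, sizes, orientations, and transmissivities drawn from standard distributions. All parameter values below are illustrative assumptions, not recommendations.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 500                                  # fractures in a 100 m cube (illustrative)

      # Fracture centers uniform in the domain.
      centers = rng.uniform(0.0, 100.0, size=(n, 3))

      # Power-law (Pareto) fracture radii, r_min = 1 m.
      radii = 1.0 * (1.0 + rng.pareto(1.5, size=n))

      # Fisher-distributed fracture poles about the vertical, concentration kappa.
      kappa = 10.0
      u = rng.uniform(size=n)
      cos_theta = 1.0 + np.log(1.0 - u * (1.0 - np.exp(-2.0 * kappa))) / kappa
      phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
      sin_theta = np.sqrt(1.0 - cos_theta**2)
      poles = np.column_stack([sin_theta * np.cos(phi),
                               sin_theta * np.sin(phi),
                               cos_theta])

      # Lognormal single-fracture transmissivity (m^2/s).
      transmissivity = rng.lognormal(mean=np.log(1e-7), sigma=1.5, size=n)

      print(centers.shape, poles.shape, radii.mean(), transmissivity.max())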

  15. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.
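
    A sketch of the PCA step used for sample discrimination, run on a simulated spectrum-by-m/z intensity matrix; the data and class labels are placeholders, not the study's measurements.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)

      # Hypothetical DART-TOFMS data: 12 replicate spectra from 4 clear coats,
      # binned to 300 m/z channels (intensities are simulated, not real data).
      base = rng.random((4, 300))
      spectra = np.vstack([b + rng.normal(scale=0.05, size=(3, 300)) for b in base])
      labels = np.repeat(["car_A", "car_B", "car_C", "car_D"], 3)

      # Replicates from the same vehicle should cluster in the score plot.
      scores = PCA(n_components=2).fit_transform(spectra)
      for lab, (pc1, pc2) in zip(labels, scores):
          print(f"{lab}: PC1={pc1:+.2f} PC2={pc2:+.2f}")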

  16. Proteomic Workflows for Biomarker Identification Using Mass Spectrometry — Technical and Statistical Considerations during Initial Discovery

    PubMed Central

    Orton, Dennis J.; Doucette, Alan A.

    2013-01-01

    Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high-throughput sample characterization by mass spectrometry (MS), which is capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, and improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degrees of biological and technical replication is required for confident biomarker identification. PMID:28250400
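
    One concrete piece of the "proper statistical analysis" concern is multiple-testing control when thousands of proteins are tested at once. A minimal Benjamini-Hochberg false discovery rate procedure, with made-up p-values (the review does not prescribe this exact recipe):

      import numpy as np

      def benjamini_hochberg(pvals, q=0.05):
          """Return a boolean mask of discoveries at FDR level q."""
          p = np.asarray(pvals, dtype=float)
          order = np.argsort(p)
          ranked = p[order]
          m = len(p)
          thresh = q * (np.arange(1, m + 1) / m)
          below = ranked <= thresh
          keep = np.zeros(m, dtype=bool)
          if below.any():
              k = np.nonzero(below)[0].max()   # largest i with p_(i) <= q*i/m
              keep[order[: k + 1]] = True
          return keep

      # Hypothetical per-protein p-values from a discovery experiment.
      pvals = [0.0002, 0.009, 0.013, 0.04, 0.21, 0.5, 0.76]
      print(benjamini_hochberg(pvals, q=0.05))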

  17. Performance, emissions, and physical characteristics of a rotating combustion aircraft engine

    NASA Technical Reports Server (NTRS)

    Berkowitz, M.; Hermes, W. L.; Mount, R. E.; Myers, D.

    1976-01-01

    The RC2-75, a liquid-cooled two-chamber rotary combustion engine (Wankel type) designed for aircraft use, was tested and representative baseline (212 kW, 285 BHP) performance and emissions characteristics were established. The testing included running fuel/air mixture control curves and varying ignition timing to permit selection of desirable and practical settings for running wide-open-throttle curves, propeller load curves, variable manifold pressure curves covering cruise conditions, and EPA cycle operating points. Performance and emissions data were recorded for all of the points run. In addition to the test data, information required to characterize the engine and evaluate its performance in aircraft use is provided over a range from one half to twice its present power. The exhaust emissions results are compared to the 1980 EPA requirements. Standard-day take-off brake specific fuel consumption is 356 g/(kW·h) (0.585 lb/(BHP·h)) for the configuration tested.
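
    The two quoted take-off BSFC figures can be checked against each other with standard unit conversions:

      # Check the quoted take-off BSFC: 0.585 lb/(BHP*h) expressed in SI units.
      G_PER_LB = 453.592          # grams per pound
      KW_PER_HP = 0.745700        # kilowatts per mechanical horsepower

      bsfc_lb_per_hp_h = 0.585
      bsfc_g_per_kwh = bsfc_lb_per_hp_h * G_PER_LB / KW_PER_HP
      print(f"{bsfc_g_per_kwh:.0f} g/(kW*h)")   # ~356, matching the reported value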

  18. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-03-06

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications.

  19. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-31

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications.

  20. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-04-10

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications.

  1. 29 CFR 100.605 - Information collection requirements: OMB approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 2 2010-07-01 2010-07-01 false Information collection requirements: OMB approval. 100.605... REGULATIONS Debt Collection Procedures § 100.605 Information collection requirements: OMB approval. This part contains no information collection requirements, and, therefore, is not subject to the requirements of the...

  2. 10 CFR 2.8 - Information collection requirements: OMB approval.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Information collection requirements: OMB approval. 2.8... ISSUANCE OF ORDERS § 2.8 Information collection requirements: OMB approval. This part contains no information collection requirements and therefore is not subject to requirements of the Paperwork Reduction...

  3. Weak Long-Range Correlated Motions in a Surface Patch of Ubiquitin Involved in Molecular Recognition

    PubMed Central

    2011-01-01

    Long-range correlated motions in proteins are candidate mechanisms for processes that require information transfer across protein structures, such as allostery and signal transduction. However, the observation of backbone correlations between distant residues has remained elusive, and only local correlations have been revealed using residual dipolar couplings measured by NMR spectroscopy. In this work, we experimentally identified and characterized collective motions spanning four β-strands separated by up to 15 Å in ubiquitin. The observed correlations link molecular recognition sites and result from concerted conformational changes that are in part mediated by the hydrogen-bonding network. PMID:21634390

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gharibyan, N.

    In order to fully characterize the NIF neutron spectrum, the SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a Sparcstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through Lawrence Livermore National Laboratory's High Performance Computing in order to access different FORTRAN compilers (e.g., pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.

  5. Development and characterization of a snapshot Mueller matrix polarimeter for the determination of cervical cancer risk in the low resource setting

    NASA Astrophysics Data System (ADS)

    Ramella-Roman, Jessica C.; Gonzalez, Mariacarla; Chue-Sang, Joseph; Montejo, Karla; Krup, Karl; Srinivas, Vijaya; DeHoog, Edward; Madhivanan, Purnima

    2018-04-01

    Mueller Matrix polarimetry can provide useful information about the function and structure of the extracellular matrix. Mueller Matrix systems are sophisticated and costly optical tools that have been used primarily in laboratory or hospital settings. Here we introduce a low-cost snapshot Mueller Matrix polarimeter that does not require external power, has no moving parts, and can acquire a full Mueller Matrix in less than 50 milliseconds. We utilized this technology in the study of cervical cancer in Mysore, India, yet the system could be translated to multiple diagnostic applications.
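
    For context, recovering a Mueller matrix from polarized intensity measurements is a linear inversion: each measurement is I_k = sA_k^T M sP_k, linear in the 16 entries of M. The sketch below is a generic least-squares formulation with random probe states and a stand-in identity sample; it is not the snapshot instrument's multiplexing scheme.

    ```python
    import numpy as np

    # Generic least-squares Mueller-matrix recovery (illustrative only).
    rng = np.random.default_rng(3)
    M_true = np.eye(4)                               # stand-in sample
    sP = rng.uniform(-1, 1, (20, 4)); sP[:, 0] = 1   # generator Stokes states
    sA = rng.uniform(-1, 1, (20, 4)); sA[:, 0] = 1   # analyzer Stokes states

    # Each row of W maps vec(M) (row-major) to one intensity sA^T M sP.
    W = np.stack([np.kron(a, p) for a, p in zip(sA, sP)])  # shape (20, 16)
    I_meas = W @ M_true.ravel()                            # simulated data
    M_est = np.linalg.lstsq(W, I_meas, rcond=None)[0].reshape(4, 4)
    print(np.allclose(M_est, M_true))                      # True
    ```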

  6. Equipment Only - Solar Resources Measurements at the University of Texas at Austin, TX: Cooperative Research and Development Final Report, CRADA Number CRD-07-222

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoffel, T.

    Faculty and staff at the University of Texas at Austin collected solar resource measurements at their campus using equipment on loan from the National Renewable Energy Laboratory. The equipment was used to train students on the operation and maintenance of solar radiometers and was returned to NREL's Solar Radiation Research Laboratory upon completion of the CRADA. The resulting data augment the solar resource climatology information required for solar resource characterizations in the U.S. The cooperative agreement was also consistent with NREL's goal of developing an educated workforce to advance renewable energy technologies.

  7. Dependence of defect introduction on temperature and resistivity and some long-term annealing effects

    NASA Technical Reports Server (NTRS)

    Brucker, G. J.

    1971-01-01

    The effort reported here presents data on lithium properties in bulk-silicon samples before and after irradiation, providing the analytical information required to characterize the interactions of lithium with radiation-induced defects in silicon. A model of the damage and recovery mechanisms in irradiated lithium-containing solar cells is developed, based on measurements of the Hall coefficient and resistivity of samples irradiated by 1-MeV electrons. Experiments on bulk samples included Hall coefficient and resistivity measurements taken as a function of: (1) bombardment temperature, (2) resistivity, (3) fluence, (4) oxygen concentration, and (5) annealing time at temperatures from 300 to 373 K.

  8. Mathematical biodescriptors of proteomics maps: background and applications.

    PubMed

    Basak, Subhash C; Gute, Brian D

    2008-05-01

    This article reviews recent developments in the formulation and application of biodescriptors to characterize proteomics maps. Such biodescriptors can be derived by applying techniques from discrete mathematics (graph theory, linear algebra and information theory). This review focuses on the development of biodescriptors for proteomics maps derived from 2D gel electrophoresis. Preliminary results demonstrated that such descriptors have a reasonable ability to differentiate between proteomics patterns that result from exposure to closely related individual chemicals and complex mixtures, such as the jet fuel JP-8. Further research is required to evaluate the utility of these proteomics-based biodescriptors for drug discovery and predictive toxicology.
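
    As one concrete instance of such a map descriptor, a 2D gel pattern can be reduced to a single graph/matrix invariant. The sketch below builds a spot-distance matrix and takes its leading eigenvalue; the spot data are made up, and this is only one of several descriptor families the review covers.

    ```python
    import numpy as np

    # Leading-eigenvalue descriptor of a spot-distance matrix (made-up spots:
    # columns are x, y, abundance; abundance unused in this simplest variant).
    spots = np.array([[1.2, 3.4, 10.0],
                      [2.8, 1.1,  4.5],
                      [0.5, 2.2,  7.2],
                      [3.9, 3.0,  2.1]])

    xy = spots[:, :2]
    D = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    descriptor = float(np.linalg.eigvalsh(D)[-1])  # leading eigenvalue
    print(descriptor)
    ```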

  9. Simple performance evaluation of pulsed spontaneous parametric down-conversion sources for quantum communications.

    PubMed

    Smirr, Jean-Loup; Guilbaud, Sylvain; Ghalbouni, Joe; Frey, Robert; Diamanti, Eleni; Alléaume, Romain; Zaquine, Isabelle

    2011-01-17

    Fast characterization of pulsed spontaneous parametric down conversion (SPDC) sources is important for applications in quantum information processing and communications. We propose a simple method to perform this task, which only requires measuring the counts on the two output channels and the coincidences between them, as well as modeling the filter used to reduce the source bandwidth. The proposed method is experimentally tested and used for a complete evaluation of SPDC sources (pair emission probability, total losses, and fidelity) of various bandwidths. This method can find applications in the setting up of SPDC sources and in the continuous verification of the quality of quantum communication links.
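
    The counts-and-coincidences idea reduces to closed-form estimators under common simplifying assumptions (Poissonian pair emission, negligible dark counts). The sketch below is a hedged illustration of that arithmetic with made-up count rates; it is not the authors' code and omits the filter-modeling step.

    ```python
    # Closed-form estimators from singles (S1, S2) and coincidences (C) at
    # pulse rate f, assuming S1 = f*mu*eta1, S2 = f*mu*eta2, C = f*mu*eta1*eta2.
    def characterize_spdc(S1, S2, C, f):
        eta1 = C / S2            # heralding (Klyshko) efficiency, channel 1
        eta2 = C / S1            # heralding efficiency, channel 2
        mu = S1 * S2 / (C * f)   # mean pair number per pulse
        return mu, eta1, eta2

    mu, eta1, eta2 = characterize_spdc(S1=40e3, S2=50e3, C=2e3, f=76e6)
    print(f"mu={mu:.2e}, eta1={eta1:.2f}, eta2={eta2:.2f}")
    ```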

  10. Human-Robot Site Survey and Sampling for Space Exploration

    NASA Technical Reports Server (NTRS)

    Fong, Terrence; Bualat, Maria; Edwards, Laurence; Flueckiger, Lorenzo; Kunz, Clayton; Lee, Susan Y.; Park, Eric; To, Vinh; Utz, Hans; Ackner, Nir

    2006-01-01

    NASA is planning to send humans and robots back to the Moon before 2020. In order for extended missions to be productive, high quality maps of lunar terrain and resources are required. Although orbital images can provide much information, many features (local topography, resources, etc) will have to be characterized directly on the surface. To address this need, we are developing a system to perform site survey and sampling. The system includes multiple robots and humans operating in a variety of team configurations, coordinated via peer-to-peer human-robot interaction. In this paper, we present our system design and describe planned field tests.

  11. Canonical Visual Size for Real-World Objects

    PubMed Central

    Konkle, Talia; Oliva, Aude

    2012-01-01

    Real-world objects can be viewed at a range of distances and thus can be experienced at a range of visual angles within the visual field. Given the large amount of visual size variation possible when observing objects, we examined how internal object representations represent visual size information. In a series of experiments which required observers to access existing object knowledge, we observed that real-world objects have a consistent visual size at which they are drawn, imagined, and preferentially viewed. Importantly, this visual size is proportional to the logarithm of the assumed size of the object in the world, and is best characterized not as a fixed visual angle, but by the ratio of the object and the frame of space around it. Akin to the previous literature on canonical perspective, we term this consistent visual size information the canonical visual size. PMID:20822298
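
    To make the reported log-linear relationship concrete, the sketch below fits canonical object/frame ratios against log assumed size using made-up numbers (the paper's data are not reproduced here):

    ```python
    import numpy as np

    # Illustrative fit: canonical object/frame ratio vs. log assumed size.
    assumed_size_cm = np.array([5, 20, 90, 400, 1800])   # e.g. key ... car
    canonical_ratio = np.array([0.18, 0.30, 0.42, 0.55, 0.65])

    slope, intercept = np.polyfit(np.log10(assumed_size_cm), canonical_ratio, 1)
    print(f"ratio ~= {slope:.2f} * log10(size_cm) + {intercept:.2f}")
    ```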

  12. A sparsity-based simplification method for segmentation of spectral-domain optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Meiniel, William; Gan, Yu; Olivo-Marin, Jean-Christophe; Angelini, Elsa

    2017-08-01

    Optical coherence tomography (OCT) has emerged as a promising image modality to characterize biological tissues. With axio-lateral resolutions at the micron-level, OCT images provide detailed morphological information and enable applications such as optical biopsy and virtual histology for clinical needs. Image enhancement is typically required for morphological segmentation, to improve boundary localization, rather than enrich detailed tissue information. We propose to formulate image enhancement as an image simplification task such that tissue layers are smoothed while contours are enhanced. For this purpose, we exploit a Total Variation sparsity-based image reconstruction, inspired by the Compressed Sensing (CS) theory, but specialized for images with structures arranged in layers. We demonstrate the potential of our approach on OCT human heart and retinal images for layers segmentation. We also compare our image enhancement capabilities to the state-of-the-art denoising techniques.
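
    A generic total-variation smoothing illustrates the intended smooth-the-layers, keep-the-contours behavior; note that the paper's method is a specialized CS-inspired TV reconstruction for layered structures, not this off-the-shelf call.

    ```python
    import numpy as np
    from skimage.restoration import denoise_tv_chambolle

    # Synthetic "layered tissue" image: piecewise-constant bands plus noise.
    rng = np.random.default_rng(1)
    layers = np.repeat(np.array([0.2, 0.6, 0.4, 0.8]), 32)[None, :] * np.ones((64, 1))
    noisy = layers + 0.15 * rng.standard_normal(layers.shape)

    # TV regularization flattens the layers while preserving their boundaries;
    # a larger weight gives stronger simplification.
    simplified = denoise_tv_chambolle(noisy, weight=0.2)
    ```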

  13. The role of flight planning in aircrew decision performance

    NASA Technical Reports Server (NTRS)

    Pepitone, Dave; King, Teresa; Murphy, Miles

    1989-01-01

    The role of flight planning in increasing the safety and decision-making performance of the air transport crews was investigated in a study that involved 48 rated airline crewmembers on a B720 simulator with a model-board-based visual scene and motion cues with three degrees of freedom. The safety performance of the crews was evaluated using videotaped replays of the flight. Based on these evaluations, the crews could be divided into high- and low-safety groups. It was found that, while collecting information before flights, the high-safety crews were more concerned with information about alternative airports, especially the fuel required to get there, and were characterized by making rapid and appropriate decisions during the emergency part of the flight scenario, allowing these crews to make an early diversion to other airports. These results suggest that contingency planning that takes into account alternative courses of action enhances rapid and accurate decision-making under time pressure.

  14. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Frigm, Ryan C.; Duncan, Matthew G.; Hejduk, Matthew D.

    2014-01-01

    Reacting to potential on-orbit collision risk in an operational environment requires timely and accurate communication and exchange of data, information, and analysis to ensure informed decision-making for safety of flight and responsible use of the shared space environment. To accomplish this mission, it is imperative that all stakeholders effectively manage resources: devoting necessary and potentially intensive resource commitment to responding to high-risk conjunction events and preventing unnecessary expenditure of resources on events of low collision risk. After 10 years of operational experience, the NASA Robotic Conjunction Assessment Risk Analysis (CARA) is modifying its Concept of Operations (CONOPS) to ensure this alignment of collision risk and resource management. This evolution manifests itself in the approach to characterizing, reporting, and refining of collision risk. Implementation of this updated CONOPS is expected to have a demonstrated improvement on the efficacy of JSpOC, CARA, and owner/operator resources.

  15. Structural characterization of Staphylococcus aureus biotin protein ligase and interaction partners: an antibiotic target.

    PubMed

    Pendini, Nicole R; Yap, Min Y; Traore, D A K; Polyak, Steven W; Cowieson, Nathan P; Abell, Andrew; Booker, Grant W; Wallace, John C; Wilce, Jacqueline A; Wilce, Matthew C J

    2013-06-01

    The essential metabolic enzyme biotin protein ligase (BPL) is a potential target for the development of new antibiotics required to combat drug-resistant pathogens. Staphylococcus aureus BPL (SaBPL) is a bifunctional protein, possessing both biotin ligase and transcription repressor activities. This positions BPL as a key regulator of several important metabolic pathways. Here, we report the structural analysis of both holo- and apo-forms of SaBPL using X-ray crystallography. We also present small-angle X-ray scattering data of SaBPL in complex with its biotin-carboxyl carrier protein substrate as well as the SaBPL:DNA complex that underlies repression. This has revealed the molecular basis of ligand (biotinyl-5'-AMP) binding and conformational changes associated with catalysis and repressor function. These data provide new information to better understand the bifunctional activities of SaBPL and to inform future strategies for antibiotic discovery. © 2013 The Protein Society.

  16. Structural characterization of Staphylococcus aureus biotin protein ligase and interaction partners: An antibiotic target

    PubMed Central

    Pendini, Nicole R; Yap, Min Y; Polyak, Steven W; Cowieson, Nathan P; Abell, Andrew; Booker, Grant W; Wallace, John C; Wilce, Jacqueline A; Wilce, Matthew C J

    2013-01-01

    The essential metabolic enzyme biotin protein ligase (BPL) is a potential target for the development of new antibiotics required to combat drug-resistant pathogens. Staphylococcus aureus BPL (SaBPL) is a bifunctional protein, possessing both biotin ligase and transcription repressor activities. This positions BPL as a key regulator of several important metabolic pathways. Here, we report the structural analysis of both holo- and apo-forms of SaBPL using X-ray crystallography. We also present small-angle X-ray scattering data of SaBPL in complex with its biotin-carboxyl carrier protein substrate as well as the SaBPL:DNA complex that underlies repression. This has revealed the molecular basis of ligand (biotinyl-5′-AMP) binding and conformational changes associated with catalysis and repressor function. These data provide new information to better understand the bifunctional activities of SaBPL and to inform future strategies for antibiotic discovery. PMID:23559560

  17. Methods for understanding microbial community structures and functions in microbial fuel cells: a review.

    PubMed

    Zhi, Wei; Ge, Zheng; He, Zhen; Zhang, Husen

    2014-11-01

    Microbial fuel cells (MFCs) employ microorganisms to recover electric energy from organic matter. However, fundamental knowledge of electrochemically active bacteria is still required to maximize MFCs power output for practical applications. This review presents microbiological and electrochemical techniques to help researchers choose the appropriate methods for the MFCs study. Pre-genomic and genomic techniques such as 16S rRNA based phylogeny and metagenomics have provided important information in the structure and genetic potential of electrode-colonizing microbial communities. Post-genomic techniques such as metatranscriptomics allow functional characterizations of electrode biofilm communities by quantifying gene expression levels. Isotope-assisted phylogenetic analysis can further link taxonomic information to microbial metabolisms. A combination of electrochemical, phylogenetic, metagenomic, and post-metagenomic techniques offers opportunities to a better understanding of the extracellular electron transfer process, which in turn can lead to process optimization for power output. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. A method for work modeling at complex systems: towards applying information systems in family health care units.

    PubMed

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices to support activities in environments where work is complex, characterized by interdependence among a large number of variables, understanding how work is done not only takes on even greater importance but also becomes a more difficult task. Therefore, this study presents a method for modeling work in complex systems that improves knowledge about the way activities are performed in settings where they do not simply consist of executing procedures. Uniting techniques of Cognitive Task Analysis with the concept of the Work Process, this work seeks to provide a method capable of giving a detailed and accurate view of how people perform their tasks, in order to apply information systems to support work in organizations.

  19. Direct neural pathways convey distinct visual information to Drosophila mushroom bodies

    PubMed Central

    Vogt, Katrin; Aso, Yoshinori; Hige, Toshihide; Knapek, Stephan; Ichinose, Toshiharu; Friedrich, Anja B; Turner, Glenn C; Rubin, Gerald M; Tanimoto, Hiromu

    2016-01-01

    Previously, we demonstrated that visual and olfactory associative memories of Drosophila share mushroom body (MB) circuits (Vogt et al., 2014). Unlike for odor representation, the MB circuit for visual information has not been characterized. Here, we show that a small subset of MB Kenyon cells (KCs) selectively responds to visual but not olfactory stimulation. The dendrites of these atypical KCs form a ventral accessory calyx (vAC), distinct from the main calyx that receives olfactory input. We identified two types of visual projection neurons (VPNs) directly connecting the optic lobes and the vAC. Strikingly, these VPNs are differentially required for visual memories of color and brightness. The segregation of visual and olfactory domains in the MB allows independent processing of distinct sensory memories and may be a conserved form of sensory representations among insects. DOI: http://dx.doi.org/10.7554/eLife.14009.001 PMID:27083044

  20. The electrophotonic silicon biosensor

    NASA Astrophysics Data System (ADS)

    Juan-Colás, José; Parkin, Alison; Dunn, Katherine E.; Scullion, Mark G.; Krauss, Thomas F.; Johnson, Steven D.

    2016-09-01

    The emergence of personalized and stratified medicine requires label-free, low-cost diagnostic technology capable of monitoring multiple disease biomarkers in parallel. Silicon photonic biosensors combine high-sensitivity analysis with scalable, low-cost manufacturing, but they tend to measure only a single biomarker and provide no information about their (bio)chemical activity. Here we introduce an electrochemical silicon photonic sensor capable of highly sensitive and multiparameter profiling of biomarkers. Our electrophotonic technology consists of microring resonators optimally n-doped to support high Q resonances alongside electrochemical processes in situ. The inclusion of electrochemical control enables site-selective immobilization of different biomolecules on individual microrings within a sensor array. The combination of photonic and electrochemical characterization also provides additional quantitative information and unique insight into chemical reactivity that is unavailable with photonic detection alone. By exploiting both the photonic and the electrical properties of silicon, the sensor opens new modalities for sensing on the microscale.

  1. Absolute colorimetric characterization of a DSLR camera

    NASA Astrophysics Data System (ADS)

    Guarnera, Giuseppe Claudio; Bianco, Simone; Schettini, Raimondo

    2014-03-01

    A simple but effective technique for absolute colorimetric camera characterization is proposed. It offers a large dynamic range, requiring just a single off-the-shelf target and a commonly available controllable light source for the characterization. The characterization task is broken down into two modules, respectively devoted to absolute luminance estimation and to colorimetric characterization matrix estimation. The characterized camera can be effectively used as a tele-colorimeter, giving an absolute estimation of the XYZ data in cd/m2. The user is only required to vary the f-number of the camera lens or the exposure time t to better exploit the sensor dynamic range. The estimated absolute tristimulus values closely match the values measured by a professional spectro-radiometer.
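
    The second module, colorimetric characterization matrix estimation, is in its simplest form a least-squares fit of a 3x3 matrix over target patches. The sketch below uses illustrative data shapes and a stand-in ground-truth mapping; it is not the authors' implementation.

    ```python
    import numpy as np

    # Fit a 3x3 matrix M such that rgb @ M ~= xyz over target patches.
    rng = np.random.default_rng(0)
    rgb = rng.random((24, 3))                  # linearized camera responses
    truth = np.array([[0.49, 0.31, 0.20],
                      [0.18, 0.81, 0.01],
                      [0.00, 0.01, 0.99]])     # stand-in ground-truth mapping
    xyz = rgb @ truth.T                        # "measured" tristimulus values

    M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
    xyz_est = rgb @ M                          # colorimetric estimates
    # Absolute output in cd/m^2 then follows from the separately estimated
    # luminance scale (module 1) via the exposure settings (f-number, t).
    ```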

  2. A clinical nutritional information system with personalized nutrition assessment.

    PubMed

    Kuo, Su-E; Lai, Hui-San; Hsu, Jen-Ming; Yu, Yao-Chang; Zheng, Dong-Zhe; Hou, Ting-Wei

    2018-03-01

    Traditional nutrition evaluations not only require the use of numerous tables and lists to provide sufficient recommendations for patients' diets but are also very time-consuming due to cross-referencing and calculations. To personalize patient assessments, this study implemented a Clinical Nutritional Information System (CNIS) to help hospital dietitians perform their daily work more effectively in terms of time management and paperwork. The CNIS mainly targets in-patients who require cancer-nutrition counselling. The development of the CNIS occurred in three phases. Phase 1 included system design and implementation based on the Nutrition Care Process and Model (NCPM) and the Patient Nutrition Care Process. Phase 2 involved a survey to characterize the efficiency, quality and accuracy of the CNIS. In Phase 3, a second survey was conducted to determine how well dietitians had adapted to the system and the extent of improvement in efficiency after the CNIS had been available online for three years. Work time requirements decreased by approximately 58% with the assistance of the CNIS. Of the dietitians who used the CNIS, 95% reported satisfaction, with 91.66% indicating that the CNIS was genuinely helpful in their work. However, some shortcomings were also evident from the results. Dietitians favoured the standardization of nutritional intervention and monitoring. The CNIS meets the needs of dietitians by increasing the quality of nutritional interventions, providing accurate calculations and cross-referencing of information regarding patients' conditions, and reducing the time spent on tasks such as handwritten documentation. In addition, the CNIS helps dietitians statistically analyse each patient's personal nutritional needs to achieve nutritional improvement. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Understanding the nature of information seeking behavior in critical care: implications for the design of health information technology.

    PubMed

    Kannampallil, Thomas G; Franklin, Amy; Mishra, Rashmi; Almoosa, Khalid F; Cohen, Trevor; Patel, Vimla L

    2013-01-01

    Information in critical care environments is distributed across multiple sources, such as paper charts, electronic records, and support personnel. For decision-making tasks, physicians have to seek, gather, filter and organize information from various sources in a timely manner. The objective of this research is to characterize the nature of physicians' information seeking process, and the content and structure of clinical information retrieved during this process. Eight medical intensive care unit physicians provided a verbal think-aloud as they performed a clinical diagnosis task. Verbal descriptions of physicians' activities, sources of information they used, time spent on each information source, and interactions with other clinicians were captured for analysis. The data were analyzed using qualitative and quantitative approaches. We found that the information seeking process was exploratory and iterative and driven by the contextual organization of information. While there were no significant differences in the overall time spent on paper versus electronic records, there was marginally greater relative information gain (i.e., more unique information retrieved per unit time) from electronic records (t(6)=1.89, p=0.1). Additionally, information retrieved from electronic records was at a higher level (i.e., observations and findings) in the knowledge structure than from paper records, reflecting differences in the nature of knowledge utilization across resources. A process of local optimization drove the information seeking process: physicians utilized information that maximized their information gain even when it required significantly more cognitive effort. Implications for the design of health information technology solutions that seamlessly integrate information seeking activities within the workflow, such as enriching the clinical information space and supporting efficient clinical reasoning and decision-making, are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
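
    The relative-information-gain comparison amounts to a paired test of per-physician rates. The sketch below uses illustrative numbers (seven pairs, matching the reported t(6)); it is not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # Unique information items retrieved per minute, per physician
    # (illustrative numbers; seven pairs to match the reported t(6)).
    gain_paper = np.array([1.1, 0.9, 1.4, 1.0, 1.2, 0.8, 1.3])
    gain_ehr   = np.array([1.4, 1.1, 1.5, 1.3, 1.2, 1.0, 1.6])

    t, p = stats.ttest_rel(gain_ehr, gain_paper)  # paired: same physicians
    print(f"t({len(gain_ehr) - 1}) = {t:.2f}, p = {p:.3f}")
    ```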

  4. Characterization of spacecraft humidity condensate

    NASA Technical Reports Server (NTRS)

    Muckle, Susan; Schultz, John R.; Sauer, Richard L.

    1994-01-01

    When construction of Space Station Freedom reaches the Permanent Manned Capability (PMC) stage, the Water Recovery and Management Subsystem will be fully operational such that (distilled) urine, spent hygiene water, and humidity condensate will be reclaimed to provide water of potable quality. The reclamation technologies currently baselined to process these waste waters include adsorption, ion exchange, catalytic oxidation, and disinfection. To ensure that the baseline technologies will be able to effectively remove those compounds presenting a health risk to the crew, the National Research Council has recommended that additional information be gathered on specific contaminants in waste waters representative of those to be encountered on the Space Station. With the application of new analytical methods and the analysis of waste water samples more representative of the Space Station environment, advances in the identification of the specific contaminants continue to be made. Efforts by the Water and Food Analytical Laboratory at JSC were successful in enlarging the database of contaminants in humidity condensate. These efforts have not only included the chemical characterization of condensate generated during ground-based studies, but most significantly the characterization of cabin and Spacelab condensate generated during Shuttle missions. The analytical results presented in this paper will be used to show how the composition of condensate varies amongst enclosed environments and thus the importance of collecting condensate from an environment close to that of the proposed Space Station. Although advances were made in the characterization of space condensate, complete characterization, particularly of the organics, requires further development of analytical methods.

  5. Characterization of CCN and IN activity of bacterial isolates collected in Atlanta, GA

    NASA Astrophysics Data System (ADS)

    Purdue, Sara; Waters, Samantha; Karthikeyan, Smruthi; Konstantinidis, Kostas; Nenes, Athanasios

    2016-04-01

    Characterization of the CCN activity of bacteria other than a few select types, such as Pseudomonas syringae, is limited, especially when examined in conjunction with the corresponding IN activity. The link between these two properties is especially important for bacteria, as those with high CCN activity are likely to form the aqueous phase required for immersion freezing. Given the high ice nucleation temperature of bacterial cells, especially in immersion mode, it is important to characterize the CCN and IN activity of many different bacterial strains. To this end, we developed a droplet freezing assay (DFA), which consists of an aluminum cold plate cooled by a continuous flow of an ethylene glycol-water mixture, in order to observe immersion freezing of the collected bacteria. Here, we present initial results on the CCN and IN activities of bacterial samples we have collected in Atlanta, GA. Bacterial strains were collected and isolated from rainwater samples taken from different storms throughout the year. We then characterized the CCN activity of each strain using a DMT Continuous Flow Streamwise Thermal Gradient CCN Counter by exposing the aerosolized bacteria to supersaturations ranging from 0.05% to 0.6%. Additionally, using our new DFA, we characterized the IN activity of each bacterial strain at temperatures ranging from -20 °C to 0 °C. The combined CCN and IN activity gives us valuable information on how some uncharacterized bacteria contribute to warm and mixed-phase cloud formation in the atmosphere.

  6. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.
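
    A simplified version of the subband signature computation can be sketched with PyWavelets; the paper's directional-context selection of predominant coefficients is approximated here by plain magnitude thresholding, and the spatial dispersivity signature is omitted.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def palm_signatures(img, wavelet="db4", level=3, thr_scale=2.0):
        """Per-subband statistical signatures of a palmprint image.
        Predominant-coefficient selection is approximated by magnitude
        thresholding; the spatial dispersivity signature is omitted."""
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        sigs = []
        for cH, cV, cD in coeffs[1:]:          # detail subbands per level
            for band in (cH, cV, cD):
                mag = np.abs(band)
                strong = mag > thr_scale * mag.mean()
                ys, xs = np.nonzero(strong)
                sigs.append({
                    "energy": float((band[strong] ** 2).sum()),
                    "density": float(strong.mean()),
                    "gravity_center": (float(ys.mean()), float(xs.mean()))
                                      if ys.size else (np.nan, np.nan),
                })
        return sigs
    ```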

  7. Design of a CMOS integrated on-chip oscilloscope for spin wave characterization

    NASA Astrophysics Data System (ADS)

    Egel, Eugen; Meier, Christian; Csaba, György; Breitkreutz-von Gamm, Stephan

    2017-05-01

    Spin waves can perform some optically-inspired computing algorithms, e.g. the Fourier transform, more directly than is possible with CMOS logic. This article describes a new approach for on-chip characterization of spin-wave-based devices. The readout circuitry for the spin waves is simulated with 65-nm CMOS technology models. Commonly used circuits for Radio Frequency (RF) receivers are implemented to detect a sinusoidal ultra-wideband (5-50 GHz) signal with an amplitude of at least 15 μV picked up by a loop antenna. First, the RF signal is amplified by a Low Noise Amplifier (LNA). Then, it is down-converted by a mixer to an Intermediate Frequency (IF). Finally, an Operational Amplifier (OpAmp) brings the IF signal to higher voltages (50-300 mV). The estimated power consumption and required area of the readout circuit are approximately 55.5 mW and 0.168 mm2, respectively. The proposed On-Chip Oscilloscope (OCO) is highly suitable for on-chip spin wave characterization regarding frequency, amplitude change, and phase information. It offers an integrated low-power alternative to current spin wave detection systems.

  8. The development of radioactive sample surrogates for training and exercises

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martha Finck; Bevin Brush; Dick Jansen

    2012-03-01

    Source term information is required to reconstruct a device used in a radiological dispersal event. Simulating a radioactive environment in which to train and exercise sampling and sample characterization methods with suitable sample materials is a continuing challenge. The Idaho National Laboratory has developed and permitted a Radioactive Response Training Range (RRTR), an 800-acre test range approved for open-air dispersal of activated KBr, for training first responders in the entry and exit from radioactively contaminated areas and for testing protocols for environmental sampling and field characterization. Members from the Department of Defense, law enforcement, and the Department of Energy participated in the first contamination exercise conducted at the RRTR in July 2011. The range was contaminated using the short-lived radioactive isotope Br-82 (activated KBr). Soil samples contaminated with KBr (dispersed as a solution) and glass particles containing activated potassium bromide that emulated dispersed radioactive materials (such as ceramic-based sealed source materials) were collected to assess environmental sampling and characterization techniques. This presentation summarizes the performance of a radioactive materials surrogate for use as a training aid for nuclear forensics.

  9. The role of geomatics in supporting sustainable development policy-making

    NASA Astrophysics Data System (ADS)

    Zhang, Aining

    Sustainable development has been on national policy agendas since 1992 when Agenda 21, an international agreement on sustainable development, was signed by over 150 countries. A key to sustainable development policy-making is information. Spatial information is an integral part of this information pool given the spatial nature of sustainable development. Geomatics, a technology dealing specifically with spatial information, can play a major role in support of the policy-making process. This thesis is aimed at advancing this role. The thesis starts with a discussion of theories and methodologies for sustainable development. The policy process for sustainable development is characterized, followed by an analysis of the requirements of sustainable development policy-making for geomatics support. The current status of geomatics in meeting these requirements is then examined, and the challenges and potential for geomatics to further address the needs are identified. To deal with these challenges, an integrated solution, namely the development of an on-line national policy atlas for sustainable development, is proposed, with a focus to support policy action formulation. The thesis further addresses one of the major research topics required for the implementation of the proposed solution, namely the exploration of the feasibility of a spatial statistics approach to predictive modelling in support of policy scenario assessments. The study is based on the case of national climate change policy formulation, with a focus on the development of new light duty vehicle sales mix models in support of transportation fuel efficiency policy-making aimed at greenhouse gas reductions. The conceptual framework and methodology for the case study are followed by the presentation of outcomes including models and policy scenario forecasts. The case study has illustrated that a spatial statistics approach is not only feasible for the development of predictive models in support of policy-making, but also provides several unique advantages that could potentially improve sustainable development policymaking.

  10. Hazardous Waste Clean-Up Information (CLU-IN) On-line Characterization and Remediation Databases Fact Sheet

    EPA Pesticide Factsheets

    This fact sheet provides an overview of the 10 on-line characterization and remediation databases available on the Hazardous Waste Clean-Up Information (CLU-IN) website sponsored by the U.S. Environmental Protection Agency.

  11. Characterizing the Leaching Behavior of Coal Combustion Residues using the Leaching Environmental Assessment Framework (LEAF) to Inform Future Management Decisions

    EPA Science Inventory

    Abstract for presentation on Characterizing the Leaching Behavior of Coal Combustion Residues using the Leaching Environmental Assessment Framework (LEAF) to Inform Future Management Decisions. The abstract is attached.

  12. Expression, purification, and characterization of almond (Prunus dulcis) allergen Pru du 4

    USDA-ARS?s Scientific Manuscript database

    Biochemical characterizations of food allergens are required for understanding the allergenicity of food allergens. Such studies require a relatively large amount of highly purified allergens. Profilins from numerous species are known to be allergens, including food allergens, such as almond (Prunus...

  13. How case characteristics differ across four types of elder maltreatment: implications for tailoring interventions to increase victim safety.

    PubMed

    Jackson, Shelly L; Hafemeister, Thomas L

    2014-12-01

    The purpose of this study was to determine whether case characteristics are differentially associated with four forms of elder maltreatment. Triangulated interviews were conducted with 71 APS caseworkers, 55 victims of substantiated abuse whose cases they managed, and 35 third party persons. Pure financial exploitation (PFE) was characterized by victim unawareness of financial exploitation and living alone. Physical abuse (PA) was characterized by victim's desire to protect the abusive individual. Neglect was characterized by isolation and victim's residing with the abusive individual. Hybrid financial exploitation (HFE) was characterized by mutual dependency. These differences indicate the need for tailoring interventions to increase victim safety. PFE requires victims to maintain financial security and independence. PA requires services to meet the needs of abusive individuals. Neglect requires greater monitoring when elderly persons reside with another person. HFE requires the provision of services to both members of the dyad. © The Author(s) 2012.

  14. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2010-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations and government agencies. In part these are fuelled by the demands for higher efficiency in material testing, product design and development and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc; (iv) tools for searching, reporting and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as the future challenges and opportunities such as automated error checking, automated data quality assessment and characterization, identification of gaps in data, as well as functionalities and business models to keep users returning to the source: to generate user demand to fuel database growth and maintenance.

  15. A flexible data fusion architecture for persistent surveillance using ultra-low-power wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hanson, Jeffrey A.; McLaughlin, Keith L.; Sereno, Thomas J.

    2011-06-01

    We have developed a flexible, target-driven, multi-modal, physics-based fusion architecture that efficiently searches sensor detections for targets and rejects clutter while controlling the combinatoric problems that commonly arise in data-driven fusion systems. The informational constraints imposed by long lifetime requirements make systems vulnerable to false alarms. We demonstrate that our data fusion system significantly reduces false alarms while maintaining high sensitivity to threats. In addition, mission goals can vary substantially in terms of targets-of-interest, required characterization, acceptable latency, and false alarm rates. Our fusion architecture provides the flexibility to match these trade-offs with mission requirements, unlike many conventional systems that require significant modifications for each new mission. We illustrate our data fusion performance with case studies that span many of the potential mission scenarios, including border surveillance, base security, and infrastructure protection. In these studies, we deployed multi-modal sensor nodes - including geophones, magnetometers, accelerometers and PIR sensors - with low-power processing algorithms and low-bandwidth wireless mesh networking to create networks capable of multi-year operation. The results show our data fusion architecture maintains high sensitivity while suppressing most false alarms for a variety of environments and targets.
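
    One simple way to see how fusion suppresses single-sensor false alarms is a multi-modality agreement gate within a time window. The rule below is an illustrative stand-in, not the deployed physics-based algorithm.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Detection:
        t: float        # seconds
        modality: str   # "seismic", "magnetic", "pir", ...

    def fuse(detections, window_s=5.0, min_modalities=2):
        """Alarm only when >= min_modalities agree within window_s."""
        dets = sorted(detections, key=lambda d: d.t)
        alarms = []
        for i, d in enumerate(dets):
            mods = {e.modality for e in dets[i:] if e.t - d.t <= window_s}
            if len(mods) >= min_modalities:
                alarms.append(d.t)
        return alarms

    print(fuse([Detection(0.0, "seismic"), Detection(2.1, "pir"),
                Detection(40.0, "seismic")]))  # [0.0]; lone event rejected
    ```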

  16. Metrological reliability of optical coherence tomography in biomedical applications

    NASA Astrophysics Data System (ADS)

    Goloni, C. M.; Temporão, G. P.; Monteiro, E. C.

    2013-09-01

    Optical coherence tomography (OCT) has proven to be an efficient diagnostic technique for imaging tissues in vivo, an optical biopsy with important prospects as a diagnostic tool for quantitative characterization of tissue structures. Despite its established clinical use, there is no international standard that addresses the specific requirements for basic safety and essential performance of OCT devices for biomedical imaging. The present work studies the parameters necessary for conformity assessment of optoelectronic equipment used in biomedical applications, such as lasers, Intense Pulsed Light (IPL), and OCT, aiming to identify the potential requirements to be considered in the event of future development of a particular standard for OCT equipment. In addition to some of the particular requirements standards for laser and IPL equipment, also applicable to the metrological reliability analysis of OCT equipment, specific parameters for OCT evaluation have been identified, considering its biomedical application. For each parameter identified, its inclusion in the accompanying documents and/or its measurement has been recommended. Among the parameters for which the measurement requirement was recommended, including uncertainty evaluation, the following are highlighted: optical radiation output, axial and transverse resolution, pulse duration and interval, and beam divergence.
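
    For the axial resolution parameter highlighted above, the textbook free-space relation for a Gaussian source spectrum indicates what such a measurement is typically checked against (a standard formula quoted for context, not taken from this paper):

    ```latex
    % Free-space axial resolution of OCT for a Gaussian source spectrum,
    % with center wavelength \lambda_0 and FWHM bandwidth \Delta\lambda:
    \delta z \;=\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda}
    ```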

  17. A framework for characterizing drug information sources.

    PubMed

    Sharp, Mark; Bodenreider, Olivier; Wacholder, Nina

    2008-11-06

    Drug information is complex, voluminous, heterogeneous, and dynamic. Multiple sources are available, each providing some elements of information about drugs (usually for a given purpose), but there exists no integrated view or directory that could be used to locate sources appropriate to a given purpose. We examined 23 sources that provide drug information in the pharmacy, chemistry, biology, and clinical medicine domains. Their drug information content could be categorized with 39 dimensions. We propose this list of dimensions as a framework for characterizing drug information sources. As an evaluation, we show that this framework is useful for comparing drug information sources and selecting sources most relevant to a given use case.

  18. Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leggett, Richard Wayne; Eckerman, Keith F; Meck, Robert A.

    2008-10-01

    This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information on that form available to the dose analyst. (3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently studied radionuclides. (4) The biokinetics of a radionuclide in the human body typically represents the greatest source of uncertainty or variability in dose per unit intake. (5) Characterization of uncertainty in dose per unit exposure is generally a more straightforward problem for external exposure than for intake of a radionuclide. (6) For many radionuclides the most important outcome of a large-scale critical evaluation of databases and biokinetic models for radionuclides is expected to be the improvement of current models. Many of the current models do not fully or accurately reflect available radiobiological or physiological information, either because the models are outdated or because they were based on selective or uncritical use of data or inadequate model structures. In such cases the models should be replaced with physiologically realistic models that incorporate a wider spectrum of information.
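
    The report's distinction between uncertainty (knowledge of a population central value) and variability (differences between individuals) can be made concrete with a toy Monte Carlo; all numbers and the lognormal shapes below are hypothetical conveniences, not the report's models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # "Uncertainty": spread of belief about the population-central dose
    # coefficient (Sv per Bq intake); hypothetical lognormal.
    central = rng.lognormal(mean=np.log(1e-8), sigma=np.log(2.0), size=10_000)

    # "Variability": individual biokinetics scattering around a known
    # central value; hypothetical, narrower lognormal.
    individual = 1e-8 * rng.lognormal(mean=0.0, sigma=np.log(1.5), size=10_000)

    for name, x in [("uncertainty", central), ("variability", individual)]:
        print(name, np.percentile(x, [2.5, 97.5]))  # 95% intervals
    ```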

  19. 78 FR 31769 - Accessible Emergency Information; Apparatus Requirements for Emergency Information and Video...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-24

    ... Accessible Emergency Information; Apparatus Requirements for Emergency Information and Video Description... manufacturers of devices that display video programming to ensure that certain apparatus are able to make...

  20. 36 CFR 801.7 - Information requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Information requirements. 801... HISTORIC PRESERVATION REQUIREMENTS OF THE URBAN DEVELOPMENT ACTION GRANT PROGRAM § 801.7 Information requirements. (a) Information To Be Retained by Applicants Determining No Effect. (1) Recommended Documentation...
