Sample records for information characterizing methods

  1. Experimental preparation and characterization of four-dimensional quantum states using polarization and time-bin modes of a single photon

    NASA Astrophysics Data System (ADS)

    Yoo, Jinwon; Choi, Yujun; Cho, Young-Wook; Han, Sang-Wook; Lee, Sang-Yun; Moon, Sung; Oh, Kyunghwan; Kim, Yong-Su

    2018-07-01

    We present a detailed method to prepare and characterize four-dimensional pure quantum states, or ququarts, using polarization and time-bin modes of a single photon. In particular, we provide a simple method to generate an arbitrary pure ququart and fully characterize the state with quantum state tomography. We also verify the reliability of the recipe by showing experimental preparation and characterization of 20 ququart states in mutually unbiased bases. As qudits provide superior properties over qubits in many fundamental tests of quantum physics and applications in quantum information processing, the presented method will be useful for photonic quantum information science.
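
    A minimal numerical sketch (not the authors' code) of the state space involved: a ququart spans the 4-dimensional tensor product of polarization (H/V) and time-bin (early/late) modes, and a tomographic reconstruction is typically scored by its fidelity against the target pure state. The basis ordering, example amplitudes, and function names below are illustrative assumptions.

    ```python
    import numpy as np

    # Basis order assumed here: |H,early>, |H,late>, |V,early>, |V,late>.
    H, V = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    early, late = np.array([1.0, 0.0]), np.array([0.0, 1.0])

    def ququart(a, b, c, d):
        """Arbitrary pure ququart a|H,e> + b|H,l> + c|V,e> + d|V,l>, normalized."""
        psi = (a * np.kron(H, early) + b * np.kron(H, late)
               + c * np.kron(V, early) + d * np.kron(V, late))
        return psi / np.linalg.norm(psi)

    def fidelity(psi, rho):
        """<psi|rho|psi>: overlap of a target pure state with a reconstructed
        density matrix, the usual figure of merit for state tomography."""
        return float(np.real(psi.conj() @ rho @ psi))

    psi = ququart(1, 1j, 0, 1)           # an example target state
    rho = np.outer(psi, psi.conj())      # ideal, noise-free "reconstruction"
    print(fidelity(psi, rho))            # -> 1.0
    ```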

  2. Construction and Resource Utilization Explorer (CRUX): Implementing Instrument Suite Data Fusion to Characterize Regolith Hydrogen Resources

    NASA Technical Reports Server (NTRS)

    Haldemann, Albert F. C.; Johnson, Jerome B.; Elphic, Richard C.; Boynton, William V.; Wetzel, John

    2006-01-01

    CRUX is a modular suite of geophysical and borehole instruments combined with display and decision support system (MapperDSS) tools to characterize regolith resources, surface conditions, and geotechnical properties. CRUX is a NASA-funded Technology Maturation Program effort to provide enabling technology for Lunar and Planetary Surface Operations (LPSO). The MapperDSS uses data fusion methods with CRUX instruments, and other available data and models, to provide regolith properties information needed for LPSO that cannot be determined otherwise. We demonstrate the data fusion method by showing how it might be applied to characterize the distribution and form of hydrogen using a selection of CRUX instruments: Borehole Neutron Probe and Thermal Evolved Gas Analyzer data as a function of depth help interpret Surface Neutron Probe data to generate 3D information. Secondary information from other instruments along with physical models improves the hydrogen distribution characterization, enabling information products for operational decision-making.

  3. Surface characterization of nanomaterials and nanoparticles: Important needs and challenging opportunities

    PubMed Central

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.; Laskin, Julia; Lai, Jinfeng; Mueller, Karl; Munusamy, Prabhakaran; Thevuthasan, Suntharampillai; Wang, Hongfei; Washton, Nancy; Elder, Alison; Baisch, Brittany L.; Karakoti, Ajay; Kuchibhatla, Satyanarayana V. N. T.; Moon, DaeWon

    2013-01-01

    This review examines characterization challenges inherently associated with understanding nanomaterials and the roles surface and interface characterization methods can play in meeting some of the challenges. In parts of the research community, there is growing recognition that studies and published reports on the properties and behaviors of nanomaterials often have reported inadequate or incomplete characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. With the increasing importance of nanomaterials in fundamental research and technological applications, it is desirable that researchers from the wide variety of disciplines involved recognize the nature of these often unexpected challenges associated with reproducible synthesis and characterization of nanomaterials, including the difficulties of maintaining desired materials properties during handling and processing due to their dynamic nature. It is equally valuable for researchers to understand how characterization approaches (surface and otherwise) can help to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. Appropriate application of traditional surface sensitive analysis methods (including x-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy, and secondary ion mass spectroscopy) can provide information that helps address several of the analysis needs. In many circumstances, extensions of traditional data analysis can provide considerably more information than normally obtained from the data collected. Less common or evolving methods with surface selectivity (e.g., some variations of nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) can provide information about surfaces or interfaces in working environments (operando or in situ) or information not provided by more traditional methods. Although these methods may require instrumentation or expertise not generally available, they can be particularly useful in addressing specific questions, and examples of their use in nanomaterial research are presented. PMID:24482557

  4. Surface characterization of nanomaterials and nanoparticles: Important needs and challenging opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.

    2013-09-15

    This review examines characterization challenges inherently associated with understanding nanomaterials and the roles surface and interface characterization methods can play in meeting some of the challenges. In parts of the research community, there is growing recognition that studies and published reports on the properties and behaviors of nanomaterials often have reported inadequate or incomplete characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. With the increasing importance of nanomaterials in fundamental research and technological applications, it is desirable that researchers from the wide variety of disciplines involved recognize the nature of these often unexpected challenges associated with reproducible synthesis and characterization of nanomaterials, including the difficulties of maintaining desired materials properties during handling and processing due to their dynamic nature. It is equally valuable for researchers to understand how characterization approaches (surface and otherwise) can help to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. Appropriate application of traditional surface sensitive analysis methods (including x-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy, and secondary ion mass spectroscopy) can provide information that helps address several of the analysis needs. In many circumstances, extensions of traditional data analysis can provide considerably more information than normally obtained from the data collected. Less common or evolving methods with surface selectivity (e.g., some variations of nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) can provide information about surfaces or interfaces in working environments (operando or in situ) or information not provided by more traditional methods. Although these methods may require instrumentation or expertise not generally available, they can be particularly useful in addressing specific questions, and examples of their use in nanomaterial research are presented.

  5. The Strategic Direction for Army Science and Technology

    DTIC Science & Technology

    2013-02-01

    methods to characterize the nature of trust (e.g., trust in information, trust in a network node or link), and to take measures to manage the trust... Science and Technology Executive, Dr. Thomas Killion, requested a study of peer review methods in use at Army laboratories. The paper discusses... sensors... Characterization of network dynamics and quality of information important to tactical decision-making... Work that should be supported...

  6. Morphometric information to reduce the semantic gap in the characterization of microscopic images of thyroid nodules.

    PubMed

    Macedo, Alessandra A; Pessotti, Hugo C; Almansa, Luciana F; Felipe, Joaquim C; Kimura, Edna T

    2016-07-01

    Many systems for medical-image processing support the extraction of image attributes, but they do not capture all of the information that characterizes images. For example, morphometry can be applied to find new information about the visual content of an image. The extension of information may result in knowledge. Subsequently, results of mappings can be applied to recognize exam patterns, thus improving the accuracy of image retrieval and allowing a better interpretation of exam results. Although successfully applied in breast lesion images, the morphometric approach is still poorly explored in thyroid lesions due to the high subjectivity of thyroid examinations. This paper presents a theoretical-practical study, considering Computer Aided Diagnosis (CAD) and morphometry, to reduce the semantic discontinuity between medical image features and human interpretation of image content. The proposed method aggregates the content of microscopic images characterized by morphometric information and other image attributes extracted by traditional object extraction algorithms. This method carries out segmentation, feature extraction, image labeling and classification. Morphometric analysis was included as an object extraction method in order to verify the improvement of its accuracy for automatic classification of microscopic images. To validate this proposal and verify the utility of morphometric information to characterize thyroid images, a CAD system was created to classify real thyroid image exams into Papillary Cancer, Goiter and Non-Cancer. Results showed that morphometric information can improve the accuracy and precision of image retrieval and the interpretation of results in computer-aided diagnosis. For example, in the scenario where all the extractors are combined with the morphometric information, the CAD system had its best performance (70% precision in Papillary cases). Results indicated that morphometric information from images can help reduce the semantic discontinuity between human interpretation and image characterization.

  7. High-Density Signal Interface Electromagnetic Radiation Prediction for Electromagnetic Compatibility Evaluation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halligan, Matthew

    Radiated power calculation approaches for practical scenarios of incomplete high-density interface characterization information and incomplete incident power information are presented. The suggested approaches build upon a method that characterizes power losses through the definition of power loss constant matrices. Potential radiated power estimates include using total power loss information, partial radiated power loss information, worst case analysis, and statistical bounding analysis. A method is also proposed to calculate radiated power when incident power information is not fully known for non-periodic signals at the interface. Incident data signals are modeled from a two-state Markov chain from which bit state probabilities are derived. The total spectrum for windowed signals is postulated as the superposition of spectra from individual pulses in a data sequence. Statistical bounding methods are proposed as a basis for the radiated power calculation due to the complexity of statistically calculating a radiated power probability density function.
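
    As a hedged illustration of two ingredients named above, the sketch below computes the stationary bit-state probabilities of a two-state Markov chain and forms the total spectrum of a windowed signal as the superposition of the spectra of its individual pulses. The transition probabilities, bit period, and rectangular pulse shape are assumptions for illustration, not values from the report.

    ```python
    import numpy as np

    # (1) Stationary bit-state probabilities of a two-state Markov chain.
    p01, p10 = 0.3, 0.4              # assumed transition probabilities 0->1, 1->0
    pi1 = p01 / (p01 + p10)          # stationary probability of bit state 1
    pi0 = 1.0 - pi1

    # (2) Total spectrum of a windowed signal as a superposition of pulse spectra.
    rng = np.random.default_rng(0)
    bits = rng.random(64) < pi1      # bit sequence drawn with the stationary law
    Tb, n = 1e-9, 4096               # assumed bit period and number of samples
    t = np.arange(n) * (64 * Tb / n)
    total = np.zeros(n, dtype=complex)
    for k, b in enumerate(bits):
        if b:                        # rectangular pulse occupying bit slot k
            pulse = ((t >= k * Tb) & (t < (k + 1) * Tb)).astype(float)
            total += np.fft.fft(pulse)   # superposition holds since the FFT is linear
    power_spectrum = np.abs(total) ** 2
    print(pi0, pi1, power_spectrum.max())
    ```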

  8. Combining Land Use Information and Small Stream Sampling with PCR-Based Methods for Better Characterization of Diffuse Sources of Human Fecal Pollution

    EPA Science Inventory

    Diffuse sources of human fecal pollution allow for the direct discharge of waste into receiving waters with minimal or no treatment. Traditional culture-based methods are commonly used to characterize fecal pollution in ambient waters; however, these methods do not discern between...

  9. Bioforensics: Characterization of biological weapons agents by NanoSIMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, P K; Ghosal, S; Leighton, T J

    2007-02-26

    The anthrax attacks of Fall 2001 highlighted the need to develop forensic methods based on multiple identifiers to determine the origin of biological weapons agents. Genetic typing methods (i.e., DNA and RNA-based) provide one attribution technology, but genetic information alone is not usually sufficient to determine the provenance of the material. Non-genetic identifiers, including elemental and isotopic signatures, provide complementary information that can be used to identify the means, geographic location and date of production. Under LDRD funding, we have successfully developed the techniques necessary to perform bioforensic characterization with the NanoSIMS at the individual spore level. We have developed methods for elemental and isotopic characterization at the single spore scale. We have developed methods for analyzing spore sections to map elemental abundance within spores. We have developed rapid focused ion beam (FIB) sectioning techniques for spores to preserve elemental and structural integrity. And we have developed a high-resolution depth profiling method to characterize the elemental distribution in individual spores without sectioning. We used these newly developed methods to study the controls on elemental abundances in spores, to characterize the elemental distribution in spores, and to study elemental uptake by spores. Our work under this LDRD project attracted FBI and DHS funding for applied purposes.

  10. Pharmaceutical cocrystals, salts and polymorphs: Advanced characterization techniques.

    PubMed

    Pindelska, Edyta; Sokal, Agnieszka; Kolodziejski, Waclaw

    2017-08-01

    The main goal of novel drug development is to obtain a drug with optimal physicochemical, pharmaceutical, and biological properties. Pharmaceutical companies and scientists modify active pharmaceutical ingredients (APIs), which often are cocrystals, salts or carefully selected polymorphs, to improve the properties of a parent drug. To find the best form of a drug, various advanced characterization methods should be used. In this review, we have described such analytical methods, dedicated to solid drug forms. Thus, diffraction, spectroscopic, thermal and also pharmaceutical characterization methods are discussed. They all are necessary to study a solid API in its intrinsic complexity from bulk down to the molecular level, gain information on its structure, properties, purity and possible transformations, and make the characterization efficient, comprehensive and complete. Furthermore, these methods can be used to monitor and investigate physical processes, involved in the drug development, in situ and in real time. The main aim of this paper is to gather information on the current advancements in the analytical methods and highlight their pharmaceutical relevance.

  11. Vibration sensing method and apparatus

    DOEpatents

    Barna, B.A.

    1989-04-25

    A method and apparatus for nondestructive evaluation of a structure are disclosed. Resonant audio frequency vibrations are excited in the structure to be evaluated and the vibrations are measured and characterized to obtain information about the structure. The vibrations are measured and characterized by reflecting a laser beam from the vibrating structure and directing a substantial portion of the reflected beam back into the laser device used to produce the beam which device is capable of producing an electric signal containing information about the vibration. 4 figs.

  12. Vibration sensing method and apparatus

    DOEpatents

    Barna, B.A.

    1987-07-07

    A method and apparatus for nondestructive evaluation of a structure is disclosed. Resonant audio frequency vibrations are excited in the structure to be evaluated and the vibrations are measured and characterized to obtain information about the structure. The vibrations are measured and characterized by reflecting a laser beam from the vibrating structure and directing a substantial portion of the reflected beam back into the laser device used to produce the beam which device is capable of producing an electric signal containing information about the vibration. 4 figs.

  13. Vibration sensing method and apparatus

    DOEpatents

    Barna, Basil A.

    1989-04-25

    A method and apparatus for nondestructive evaluation of a structure is disclosed. Resonant audio frequency vibrations are excited in the structure to be evaluated and the vibrations are measured and characterized to obtain information about the structure. The vibrations are measured and characterized by reflecting a laser beam from the vibrating structure and directing a substantial portion of the reflected beam back into the laser device used to produce the beam which device is capable of producing an electric signal containing information about the vibration.

  14. Informatics and Standards for Nanomedicine Technology

    PubMed Central

    Thomas, Dennis G.; Klaessig, Fred; Harper, Stacey L.; Fritts, Martin; Hoover, Mark D.; Gaheen, Sharon; Stokes, Todd H.; Reznik-Zellen, Rebecca; Freund, Elaine T.; Klemm, Juli D.; Paik, David S.; Baker, Nathan A.

    2011-01-01

    There are several issues to be addressed concerning the management and effective use of information (or data), generated from nanotechnology studies in biomedical research and medicine. These data are large in volume, diverse in content, and are beset with gaps and ambiguities in the description and characterization of nanomaterials. In this work, we have reviewed three areas of nanomedicine informatics: information resources; taxonomies, controlled vocabularies, and ontologies; and information standards. Informatics methods and standards in each of these areas are critical for enabling collaboration, data sharing, unambiguous representation and interpretation of data, semantic (meaningful) search and integration of data; and for ensuring data quality, reliability, and reproducibility. In particular, we have considered four types of information standards in this review, which are standard characterization protocols, common terminology standards, minimum information standards, and standard data communication (exchange) formats. Currently, due to gaps and ambiguities in the data, it is also difficult to apply computational methods and machine learning techniques to analyze, interpret and recognize patterns in data that are high dimensional in nature, and also to relate variations in nanomaterial properties to variations in their chemical composition, synthesis, characterization protocols, etc. Progress towards resolving the issues of information management in nanomedicine using informatics methods and standards discussed in this review will be essential to the rapidly growing field of nanomedicine informatics. PMID:21721140

  15. VNIR hyperspectral background characterization methods in adverse weather conditions

    NASA Astrophysics Data System (ADS)

    Romano, João M.; Rosario, Dalton; Roth, Luz

    2009-05-01

    Hyperspectral technology is currently being used by the military to detect regions of interest where potential targets may be located. Weather variability, however, may affect an algorithm's ability to discriminate possible targets from background clutter. Nonetheless, different background characterization approaches may facilitate an algorithm's ability to discriminate potential targets over a variety of weather conditions. In a previous paper, we introduced a new autonomous, target-size-invariant background characterization process, the Autonomous Background Characterization (ABC), also known as the Parallel Random Sampling (PRS) method. It features a random sampling stage; a parallel process to mitigate the inclusion, by chance, of target samples into clutter background classes during random sampling; and a fusion of results at the end. In this paper, we demonstrate how different background characterization approaches are able to improve the performance of algorithms over a variety of challenging weather conditions. Using the Mahalanobis distance as the standard algorithm for this study, we compare the performance of different characterization methods, such as global information, two-stage global information, and our proposed method, ABC, on data collected under a variety of adverse weather conditions. For this study, we used ARDEC's Hyperspectral VNIR Adverse Weather data collection, comprising heavy, light, and transitional fog; light and heavy rain; and low-light conditions.
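
    The Mahalanobis distance used as the baseline detector above can be sketched as follows: estimate a mean and covariance from samples of a characterized background class, then score every pixel spectrum by its squared Mahalanobis distance to that class. The synthetic data, array shapes, and percentile threshold are illustrative assumptions.

    ```python
    import numpy as np

    def mahalanobis_scores(pixels, background):
        """pixels: (N, bands); background: (M, bands) samples of one clutter class.
        Returns squared Mahalanobis distances of each pixel to the background."""
        mu = background.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(background, rowvar=False))  # pinv for stability
        d = pixels - mu
        return np.einsum('ij,jk,ik->i', d, cov_inv, d)

    rng = np.random.default_rng(1)
    bg = rng.normal(size=(500, 30))          # synthetic background spectra (30 bands)
    scene = rng.normal(size=(1000, 30))      # synthetic scene pixels
    scores = mahalanobis_scores(scene, bg)
    detections = scores > np.percentile(scores, 99)   # assumed thresholding rule
    print(detections.sum(), "candidate target pixels")
    ```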

  16. Combining land use information and small stream sampling with PCR-based methods for better characterization of diffuse sources of human fecal pollution.

    PubMed

    Peed, Lindsay A; Nietch, Christopher T; Kelty, Catherine A; Meckes, Mark; Mooney, Thomas; Sivaganesan, Mano; Shanks, Orin C

    2011-07-01

    Diffuse sources of human fecal pollution allow for the direct discharge of waste into receiving waters with minimal or no treatment. Traditional culture-based methods are commonly used to characterize fecal pollution in ambient waters; however, these methods do not discern between human and other animal sources of fecal pollution, making it difficult to identify diffuse pollution sources. Human-associated quantitative real-time PCR (qPCR) methods, in combination with low-order headwatershed sampling, precipitation information, and high-resolution geographic information system land use data, can be useful for identifying diffuse sources of human fecal pollution in receiving waters. To test this assertion, this study monitored, over a two-year period, nine headwatersheds potentially impacted by faulty septic systems and leaky sanitary sewer lines. Human fecal pollution was measured using three different human-associated qPCR methods, and a significant positive correlation was observed between the abundance of human-associated genetic markers and septic systems following wet weather events. In contrast, a negative correlation was observed with sanitary sewer line densities, suggesting that septic systems are the predominant diffuse source of human fecal pollution in the study area. These results demonstrate the advantages of combining water sampling, climate information, land-use computer-based modeling, and molecular biology disciplines to better characterize diffuse sources of human fecal pollution in environmental waters.
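
    A minimal sketch of the correlation step described above, with invented numbers: relate human-associated qPCR marker abundance to septic-system density across headwatersheds using a rank correlation.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical per-watershed values (NOT the study's data).
    marker_abundance = np.array([2.1, 3.4, 1.2, 4.8, 3.9, 0.8, 2.7, 4.1, 3.0])  # log10 copies/100 mL
    septic_density   = np.array([12,  20,   5,  33,  25,   3,  15,  28,  18])   # systems per km^2

    rho, p_value = spearmanr(marker_abundance, septic_density)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")  # a positive rho echoes the reported association
    ```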

  17. Geophysical investigations of well fields to characterize fractured-bedrock aquifers in southern New Hampshire

    USGS Publications Warehouse

    Degnan, James R.; Moore, Richard Bridge; Mack, Thomas J.

    2001-01-01

    Bedrock-fracture zones near high-yield bedrock wells in southern New Hampshire well fields were located and characterized using seven surface and six borehole geophysical survey methods. Detailed surveys of six sites with various methods provide an opportunity to integrate and compare survey results. Borehole geophysical surveys were conducted at three of the sites to confirm subsurface features. Hydrogeologic settings, including a variety of bedrock and surface geologic materials, were sought to gain an insight into the usefulness of the methods in varied terrains. Results from 15 survey lines, 8 arrays, and 3 boreholes were processed and interpreted from the 6 sites. The surface geophysical methods used provided physical properties of fractured bedrock. Seismic refraction and ground-penetrating radar (GPR) primarily were used to characterize the overburden materials, but in a few cases indicated bedrock-fracture zones. Magnetometer surveys were used to obtain background information about the bedrock to compare with other results, and to search for magnetic lows, which may result from weathered fractured rock. Electromagnetic terrain conductivity surveys (EM) and very-low-frequency electromagnetic surveys (VLF) were used as rapid reconnaissance techniques with the primary purpose of identifying electrical anomalies, indicating potential fracture zones in bedrock. Direct-current (dc) resistivity methods were used to gather detailed subsurface information about fracture depth and orientation. Two-dimensional (2-D) dc-resistivity surveys using dipole-dipole and Schlumberger arrays located and characterized the overburden, bedrock, and bedrock-fracture zones through analysis of data inversions. Azimuthal square array dc-resistivity survey results indicated orientations of conductive steep-dipping bedrock-fracture zones that were located and characterized by previously applied geophysical methods. Various available data sets were used for site selection, characterizations, and interpretations. Lineament data, developed as a part of a statewide and regional scale investigation of the bedrock aquifer, were available to identify potential near-vertical fracture zones. Geophysical surveys indicated fracture zones coincident with lineaments at 4 of the sites. Geologic data collected as a part of the regional scale investigation provided outcrop fracture measurements, ductile fabric, and contact information. Dominant fracture trends correspond to the trends of geophysical anomalies at 4 of the sites. Water-well drillers' logs from water supply and environmental data sets also were used where available to characterize sites. Regional overburden information was compiled from stratified-drift aquifer maps and surficial-geological maps.

  18. General Characterization Methods for Photoelectrochemical Cells for Solar Water Splitting.

    PubMed

    Shi, Xinjian; Cai, Lili; Ma, Ming; Zheng, Xiaolin; Park, Jong Hyeok

    2015-10-12

    Photoelectrochemical (PEC) water splitting is a very promising technology that converts water into clean hydrogen fuel and oxygen by using solar light. However, the characterization methods for PEC cells are diverse and a systematic introduction to characterization methods for PEC cells has rarely been attempted. Unlike most other review articles that focus mainly on the material used for the working electrodes of PEC cells, this review introduces general characterization methods for PEC cells, including their basic configurations and methods for characterizing their performance under various conditions, regardless of the materials used. Detailed experimental operation procedures with theoretical information are provided for each characterization method. The PEC research area is rapidly expanding and more researchers are beginning to devote themselves to related work. Therefore, the content of this Minireview can provide entry-level knowledge to beginners in the area of PEC, which might accelerate progress in this area.

  19. 78 FR 56749 - Site Characteristics and Site Parameters for Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ..., "Geologic Characterization Information" (currently titled "Basic Geologic and Seismic Information"); Section 2.5.2, "Vibratory Ground Motion"; Section 2.5.3, "Surface Deformation" (currently titled as... the following methods (unless this document describes a different method for submitting comments on a...

  20. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
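
    The load-versus-capacity comparison and the "how many simulations are enough" question can be sketched as follows, with load and capacity distributions assumed purely for illustration: estimate the failure probability by Monte Carlo and attach a normal-approximation (Wald) confidence interval, whose width shrinks as 1/sqrt(n).

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000                                 # number of simulations
    load = rng.normal(480.0, 40.0, n)          # assumed load distribution
    capacity = rng.normal(600.0, 30.0, n)      # assumed capacity distribution

    p_hat = np.mean(load >= capacity)          # estimated failure probability

    # Wald 95% confidence interval; halving the width requires 4x the simulations.
    half_width = 1.96 * np.sqrt(p_hat * (1.0 - p_hat) / n)
    print(f"P_f = {p_hat:.4f} +/- {half_width:.4f}")
    ```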

  1. Practices and Methods for Actualization of the Scientific Information in Art Excursions (Excursions and Cultural Heritage in the Contemporary World)

    ERIC Educational Resources Information Center

    Portnova, Tatiana V.

    2016-01-01

    The paper deals with various practices and methods for actualization of the scientific information in art excursions. Modern society is characterized by a commitment to information richness. The range of cultural and historical materials used as the basis for art excursions is immense. However, if one considers the number of excursions with…

  2. Waste Characterization Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil-Holterman, Luciana R.; Naranjo, Felicia Danielle

    2016-02-02

    This report discusses ways to classify waste as outlined by LANL. Waste Generators must make a waste determination and characterize regulated waste by appropriate analytical testing or use of acceptable knowledge (AK). Use of AK for characterization requires several source documents. Waste characterization documentation must be accurate, sufficient, and current (i.e., updated); relevant and traceable to the waste stream’s generation, characterization, and management; and not merely a list of information sources.

  3. Risk Informed Margins Management as part of Risk Informed Safety Margin Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith

    2014-06-01

    The ability to better characterize and quantify safety margin is important to improved decision making about Light Water Reactor (LWR) design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant systems, structures, and components (SSCs), needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the Risk Informed Safety Margin Characterization (RISMC) Pathway provides methods and tools that enable mitigation options known as risk informed margins management (RIMM) strategies.

  4. Advanced eddy current test signal analysis for steam generator tube defect classification and characterization

    NASA Astrophysics Data System (ADS)

    McClanahan, James Patrick

    Eddy Current Testing (ECT) is a Non-Destructive Examination (NDE) technique that is widely used in power generating plants (both nuclear and fossil) to test the integrity of heat exchanger (HX) and steam generator (SG) tubing. Specifically for this research, laboratory-generated, flawed tubing data were examined. The purpose of this dissertation is to develop and implement an automated method for the classification and advanced characterization of defects in HX and SG tubing. These two improvements enhanced the robustness of characterization as compared to traditional bobbin-coil ECT data analysis methods. A more robust classification and characterization of the tube flaw in situ (while the SG is on-line but not when the plant is operating) should provide valuable information to the power industry. The following are the conclusions reached from this research. A feature extraction program acquiring relevant information from both the mixed, absolute and differential data was successfully implemented. The continuous wavelet transform (CWT) was utilized to extract more information from the mixed, complex differential data. Image processing techniques, used to extract the information contained in the generated CWT, classified the data with a high success rate. The data were accurately classified, utilizing the compressed feature vector and using a Bayes classification system. An estimation of the upper bound for the probability of error, using the Bhattacharyya distance, was successfully applied to the Bayesian classification. The classified data were separated according to flaw type (classification) to enhance characterization. The characterization routine used dedicated, flaw-type-specific artificial neural networks (ANNs) that made the characterization of the tube flaw more robust. The inclusion of outliers may help complete the feature space so that classification accuracy is increased. Given that the eddy current test signals appear very similar, there may not be sufficient information to make an extremely accurate (>95%) classification or an advanced characterization using this system. It is necessary to have a larger database for more accurate system learning.
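
    A minimal sketch of the Bhattacharyya bound mentioned above, for two Gaussian feature classes with invented statistics: the Bayes error is bounded above by sqrt(P1*P2)*exp(-B), where B is the Bhattacharyya distance between the classes.

    ```python
    import numpy as np

    def bhattacharyya_distance(mu1, cov1, mu2, cov2):
        """Bhattacharyya distance between two Gaussian class densities."""
        cov = 0.5 * (cov1 + cov2)
        dmu = mu2 - mu1
        term1 = 0.125 * dmu @ np.linalg.solve(cov, dmu)
        term2 = 0.5 * np.log(np.linalg.det(cov) /
                             np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
        return term1 + term2

    # Invented class statistics (e.g., two flaw-type feature clusters).
    mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
    cov1 = np.eye(2)
    cov2 = np.array([[1.5, 0.2], [0.2, 0.8]])
    B = bhattacharyya_distance(mu1, cov1, mu2, cov2)
    P1 = P2 = 0.5                                    # assumed equal priors
    print("Bayes error <=", np.sqrt(P1 * P2) * np.exp(-B))
    ```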

  5. Characterizing Navigation in Interactive Learning Environments

    ERIC Educational Resources Information Center

    Liang, Hai-Ning; Sedig, Kamran

    2009-01-01

    Interactive learning environments (ILEs) are increasingly used to support and enhance instruction and learning experiences. ILEs maintain and display information, allowing learners to interact with this information. One important method of interacting with information is navigation. Often, learners are required to navigate through the information…

  6. Application of information-retrieval methods to the classification of physical data

    NASA Technical Reports Server (NTRS)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  7. A comparison of simple shear characterization methods for composite laminates

    NASA Technical Reports Server (NTRS)

    Yeow, Y. T.; Brinson, H. F.

    1978-01-01

    Various methods for the shear stress/strain characterization of composite laminates are examined and their advantages and limitations are briefly discussed. Experimental results and the necessary accompanying analysis are then presented and compared for three simple shear characterization procedures. These are the off-axis tensile test method, the (+/- 45 deg)s tensile test method and the (0/90 deg)s symmetric rail shear test method. It is shown that the first technique indicates the shear properties of the graphite/epoxy laminates investigated are fundamentally brittle in nature while the latter two methods tend to indicate that these laminates are fundamentally ductile in nature. Finally, predictions of incrementally determined tensile stress/strain curves utilizing the various shear behaviour methods as input information are presented and discussed.

  8. A comparison of simple shear characterization methods for composite laminates

    NASA Technical Reports Server (NTRS)

    Yeow, Y. T.; Brinson, H. F.

    1977-01-01

    Various methods for the shear stress-strain characterization of composite laminates are examined, and their advantages and limitations are briefly discussed. Experimental results and the necessary accompanying analysis are then presented and compared for three simple shear characterization procedures. These are the off-axis tensile test method, the + or - 45 degs tensile test method and the 0 deg/90 degs symmetric rail shear test method. It is shown that the first technique indicates that the shear properties of the G/E laminates investigated are fundamentally brittle in nature while the latter two methods tend to indicate that the G/E laminates are fundamentally ductile in nature. Finally, predictions of incrementally determined tensile stress-strain curves utilizing the various shear behavior methods as input information are presented and discussed.

  9. Using pre-screening methods for an effective and reliable site characterization at megasites.

    PubMed

    Algreen, Mette; Kalisz, Mariusz; Stalder, Marcel; Martac, Eugeniu; Krupanek, Janusz; Trapp, Stefan; Bartke, Stephan

    2015-10-01

    This paper illustrates the usefulness of pre-screening methods for an effective characterization of polluted sites. We applied a sequence of site characterization methods to a former Soviet military airbase with likely fuel and benzene, toluene, ethylbenzene, and xylene (BTEX) contamination in shallow groundwater and subsoil. The methods were (i) phytoscreening with tree cores; (ii) soil gas measurements for CH4, O2, and photoionization detector (PID); (iii) direct-push with membrane interface probe (MIP) and laser-induced fluorescence (LIF) sensors; (iv) direct-push sampling; and (v) sampling from soil and from groundwater monitoring wells. Phytoscreening and soil gas measurements are rapid and inexpensive pre-screening methods. Both indicated subsurface pollution and hot spots successfully. The direct-push sensors yielded 3D information about the extension and the volume of the subsurface plume. This study also expanded the applicability of tree coring to BTEX compounds and tested the use of high-resolution direct-push sensors for light hydrocarbons. Comparison of screening results with results from conventional soil and groundwater sampling yielded high rank correlations in most cases and confirmed the findings. The large-scale application of non- or low-invasive pre-screening can be of help in directing and focusing the subsequent, more expensive investigation methods. The rapid pre-screening methods also yielded useful information about potential remediation methods. Overall, we see several benefits of a stepwise screening and site characterization scheme, which we propose in conclusion.

  10. Characterization of Class A low-level radioactive waste 1986--1990. Volume 6: Appendices G--J

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehmel, J.C.; Loomis, D.; Mauro, J.

    1994-01-01

    Under contract to the US Nuclear Regulatory Commission, Office of Nuclear Regulatory Research, the firms of S. Cohen & Associates, Inc. (SC&A) and Eastern Research Group (ERG) have compiled a report that describes the physical, chemical, and radiological properties of Class-A low-level radioactive waste. The report also presents information characterizing various methods and facilities used to treat and dispose of non-radioactive waste. A database management program was developed for use in accessing, sorting, analyzing, and displaying the electronic data provided by EG&G. The program was used to present and aggregate data characterizing the radiological, physical, and chemical properties of the waste from descriptions contained in shipping manifests. The data thus retrieved are summarized in tables, histograms, and cumulative distribution curves presenting radionuclide concentration distributions in Class-A waste as a function of waste streams, by category of waste generators, and regions of the United States. The report also provides information characterizing methods and facilities used to treat and dispose of non-radioactive waste, including industrial, municipal, and hazardous waste regulated under Subparts C and D of the Resource Conservation and Recovery Act (RCRA). The information includes a list of disposal options, the geographical locations of the processing and disposal facilities, and a description of the characteristics of such processing and disposal facilities. Volume 1 contains the Executive Summary, Volume 2 presents the Class-A waste database, Volume 3 presents the information characterizing non-radioactive waste management practices and facilities, and Volumes 4 through 7 contain Appendices A through P with supporting information.

  11. Characterization of Contrast Agent Microbubbles for Ultrasound Imaging and Therapy Research.

    PubMed

    Mulvana, Helen; Browning, Richard J; Luan, Ying; de Jong, Nico; Tang, Meng-Xing; Eckersley, Robert J; Stride, Eleanor

    2017-01-01

    The high efficiency with which gas microbubbles can scatter ultrasound compared with the surrounding blood pool or tissues has led to their widespread employment as contrast agents in ultrasound imaging. In recent years, their applications have been extended to include super-resolution imaging and the stimulation of localized bio-effects for therapy. The growing exploitation of contrast agents in ultrasound and in particular these recent developments have amplified the need to characterize and fully understand microbubble behavior. The aim in doing so is to more fully exploit their utility for both diagnostic imaging and potential future therapeutic applications. This paper presents the key characteristics of microbubbles that determine their efficacy in diagnostic and therapeutic applications and the corresponding techniques for their measurement. In each case, we have presented information regarding the methods available and their respective strengths and limitations, with the aim of presenting information relevant to the selection of appropriate characterization methods. First, we examine methods for determining the physical properties of microbubble suspensions and then techniques for acoustic characterization of both suspensions and single microbubbles. The next section covers characterization of microbubbles as therapeutic agents, including as drug carriers for which detailed understanding of their surface characteristics and drug loading capacity is required. Finally, we discuss the attempts that have been made to allow comparison across the methods employed by various groups to characterize and describe their microbubble suspensions and promote wider discussion and comparison of microbubble behavior.

  12. Thermal Analysis by Structural Characterization as a Method for Assessing Heterogeneity in Complex Solid Pharmaceutical Dosage Forms.

    PubMed

    Alhijjaj, Muqdad; Reading, Mike; Belton, Peter; Qi, Sheng

    2015-11-03

    Characterizing inter- and intrasample heterogeneity of solid and semisolid pharmaceutical products is important both for rational design of dosage forms and subsequent quality control during manufacture; however, most pharmaceutical products are multicomponent formulations that are challenging in this regard. Thermal analysis, in particular differential scanning calorimetry, is commonly used to obtain structural information, such as degree of crystallinity, or identify the presence of a particular polymorph, but the results are an average over the whole sample; it cannot directly provide information about the spatial distribution of phases. This study demonstrates the use of a new thermo-optical technique, thermal analysis by structural characterization (TASC), that can provide spatially resolved information on thermal transitions by applying a novel algorithm to images acquired by hot stage microscopy. We determined that TASC can be a low cost, relatively rapid method of characterizing heterogeneity and other aspects of structure. In the examples studied, it was found that high heating rates enabled screening times of 3-5 min per sample. In addition, this study demonstrated the higher sensitivity of TASC for detecting the metastable form of polyethylene glycol (PEG) compared to conventional differential scanning calorimetry (DSC). This preliminary work suggests that TASC will be a worthwhile additional tool for characterizing a broad range of materials.

  13. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to the use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, and focusing on important sources of variability and uncertainty, enables characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336
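
    As a hedged illustration of the probabilistic view advocated above, the sketch below treats exposure as a lognormal distribution (a common occupational-hygiene assumption) and reports the probability of exceeding an exposure limit rather than a binary verdict. All parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    gm, gsd = 0.04, 2.5        # assumed geometric mean (mg/m^3) and geometric SD
    oel = 0.1                  # assumed occupational exposure limit (mg/m^3)

    # Lognormal exposures: the underlying normal has mean ln(gm) and SD ln(gsd).
    exposures = rng.lognormal(np.log(gm), np.log(gsd), 100_000)
    exceedance = (exposures > oel).mean()
    print(f"P(exposure > OEL) = {exceedance:.2%}")   # a probability, not a yes/no verdict
    ```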

  14. Bayesian methods for characterizing unknown parameters of material models

    DOE PAGES

    Emery, J. M.; Grigoriu, M. D.; Field Jr., R. V.

    2016-02-04

    A Bayesian framework is developed for characterizing the unknown parameters of probabilistic models for material properties. In this framework, the unknown parameters are viewed as random and described by their posterior distributions obtained from prior information and measurements of quantities of interest that are observable and depend on the unknown parameters. The proposed Bayesian method is applied to characterize an unknown spatial correlation of the conductivity field in the definition of a stochastic transport equation and to solve this equation by Monte Carlo simulation and stochastic reduced order models (SROMs). As a result, the Bayesian method is also employed to characterize unknown parameters of material properties for laser welds from measurements of peak forces sustained by these welds.
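
    A minimal sketch of the Bayesian framework described above, under an invented toy model (the paper itself couples this with Monte Carlo transport solves and SROMs): treat the unknown parameter as random, combine a prior with measurements of an observable that depends on it, and sample the posterior with a basic Metropolis random walk.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.normal(5.0, 1.0, 20)            # measurements of the observable

    def log_prior(theta):                      # vague normal prior on theta
        return -0.5 * (theta / 10.0) ** 2

    def log_likelihood(theta):                 # assumed model: observable ~ N(theta, 1)
        return -0.5 * np.sum((data - theta) ** 2)

    def log_post(theta):
        return log_prior(theta) + log_likelihood(theta)

    samples, theta = [], 0.0
    for _ in range(5000):                      # Metropolis random walk
        proposal = theta + rng.normal(0.0, 0.5)
        if np.log(rng.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        samples.append(theta)

    posterior = np.array(samples[1000:])       # discard burn-in
    print(posterior.mean(), posterior.std())   # posterior summary of theta
    ```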

  15. Bayesian methods for characterizing unknown parameters of material models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, J. M.; Grigoriu, M. D.; Field Jr., R. V.

    A Bayesian framework is developed for characterizing the unknown parameters of probabilistic models for material properties. In this framework, the unknown parameters are viewed as random and described by their posterior distributions obtained from prior information and measurements of quantities of interest that are observable and depend on the unknown parameters. The proposed Bayesian method is applied to characterize an unknown spatial correlation of the conductivity field in the definition of a stochastic transport equation and to solve this equation by Monte Carlo simulation and stochastic reduced order models (SROMs). As a result, the Bayesian method is also employed to characterize unknown parameters of material properties for laser welds from measurements of peak forces sustained by these welds.

  16. Statistical methods for analysis of radiation effects with tumor and dose location-specific information with application to the WECARE study of asynchronous contralateral breast cancer

    PubMed Central

    Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.

    2009-01-01

    Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that use only cases with precise tumor location information and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood-based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location and subject-level covariate effects. The underlying assumptions are characterized and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297

  17. Rubber.

    ERIC Educational Resources Information Center

    Krishen, Anoop

    1989-01-01

    This review covers methods for identification, characterization, and determination of rubber and materials in rubber. Topics include: general information, nuclear magnetic resonance spectroscopy, infrared spectroscopy, thermal methods, gel permeation chromatography, size exclusion chromatography, analysis related to safety and health, and…

  18. Methods and systems for detecting abnormal digital traffic

    DOEpatents

    Goranson, Craig A [Kennewick, WA; Burnette, John R [Kennewick, WA

    2011-03-22

    Aspects of the present invention encompass methods and systems for detecting abnormal digital traffic by assigning characterizations of network behaviors according to knowledge nodes and calculating a confidence value based on the characterizations from at least one knowledge node and on weighting factors associated with the knowledge nodes. The knowledge nodes include a characterization model based on prior network information. At least one of the knowledge nodes should not be based on fixed thresholds or signatures. The confidence value includes a quantification of the degree of confidence that the network behaviors constitute abnormal network traffic.
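
    One hedged reading of this combination rule as code (the patent does not give a functional form): each knowledge node emits a characterization score in [0, 1] for observed traffic, and the scores are combined with per-node weights into a single confidence value. The node names, feature models, weights, and threshold below are all invented.

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class KnowledgeNode:
        name: str
        weight: float
        characterize: Callable[[Dict[str, float]], float]  # features -> score in [0, 1]

    nodes: List[KnowledgeNode] = [
        KnowledgeNode("flow-volume model", 0.5, lambda f: min(f["bytes_per_s"] / 1e7, 1.0)),
        KnowledgeNode("port-usage model",  0.3, lambda f: f["rare_port_ratio"]),
        KnowledgeNode("timing model",      0.2, lambda f: f["burstiness"]),
    ]

    def confidence(features: Dict[str, float]) -> float:
        """Weighted combination of node characterizations (assumed form)."""
        total = sum(n.weight for n in nodes)
        return sum(n.weight * n.characterize(features) for n in nodes) / total

    traffic = {"bytes_per_s": 4e6, "rare_port_ratio": 0.7, "burstiness": 0.9}
    print(confidence(traffic))   # flag as abnormal if above an assumed threshold
    ```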

  19. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pidatala, Venkataramana R.; Mahboubi, Amir; Mortimer, Jenny C.

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  20. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE.

    PubMed

    Pidatala, Venkataramana R; Mahboubi, Amir; Mortimer, Jenny C

    2017-10-16

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  1. Structural Characterization of Mannan Cell Wall Polysaccharides in Plants Using PACE

    DOE PAGES

    Pidatala, Venkataramana R.; Mahboubi, Amir; Mortimer, Jenny C.

    2017-10-16

    Plant cell wall polysaccharides are notoriously difficult to analyze, and most methods require expensive equipment, skilled operators, and large amounts of purified material. Here, we describe a simple method for gaining detailed polysaccharide structural information, including resolution of structural isomers. For polysaccharide analysis by gel electrophoresis (PACE), plant cell wall material is hydrolyzed with glycosyl hydrolases specific to the polysaccharide of interest (e.g., mannanases for mannan). Large format polyacrylamide gels are then used to separate the released oligosaccharides, which have been fluorescently labeled. Gels can be visualized with a modified gel imaging system (see Table of Materials). The resulting oligosaccharide fingerprint can either be compared qualitatively or, with replication, quantitatively. Linkage and branching information can be established using additional glycosyl hydrolases (e.g., mannosidases and galactosidases). Whilst this protocol describes a method for analyzing glucomannan structure, it can be applied to any polysaccharide for which characterized glycosyl hydrolases exist. Alternatively, it can be used to characterize novel glycosyl hydrolases using defined polysaccharide substrates.

  2. THE ONTARIO HYDRO METHOD FOR SPECIATED MERCURY MEASUREMENTS: ISSUES AND CONSIDERATIONS

    EPA Science Inventory

    The Ontario Hydro (OH) method has been developed for the measurement of total and speciated mercury emissions from coal-fired combustion sources. The OH method was initially developed to support EPA's information collection request to characterize and inventory mercury emissions ...

  3. Characterization of Class A low-level radioactive waste 1986--1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehmel, J.C.; Loomis, D.; Mauro, J.

    Under contract to the US Nuclear Regulatory Commission, Office of Nuclear Regulatory Research, the firms of S. Cohen & Associates, Inc. (SC&A) and Eastern Research Group (ERG) have compiled a report that describes the physical, chemical, and radiological properties of Class-A low-level radioactive waste. The report also presents information characterizing various methods and facilities used to treat and dispose of non-radioactive waste. A database management program was developed for use in accessing, sorting, analyzing, and displaying the electronic data provided by EG&G. The program was used to present and aggregate data characterizing the radiological, physical, and chemical properties of the waste from descriptions contained in shipping manifests. The data thus retrieved are summarized in tables, histograms, and cumulative distribution curves presenting radionuclide concentration distributions in Class-A waste as a function of waste streams, by category of waste generators, and regions of the United States. The report also provides information characterizing methods and facilities used to treat and dispose of non-radioactive waste, including industrial, municipal, and hazardous waste regulated under Subparts C and D of the Resource Conservation and Recovery Act (RCRA). The information includes a list of disposal options, the geographical locations of the processing and disposal facilities, and a description of the characteristics of such processing and disposal facilities. Volume 1 contains the Executive Summary, Volume 2 presents the Class-A waste database, Volume 3 presents the information characterizing non-radioactive waste management practices and facilities, and Volumes 4 through 7 contain Appendices A through P with supporting information.

  4. IMPROVED RISK CHARACTERIZATION METHODS FOR DEVELOPING AQUATIC LIFE CRITERIA FOR NON-BIOACCUMULATIVE TOXICANTS

    EPA Science Inventory

    This project will use existing and developing information to evaluate and demonstrate procedures for more fully characterizing risks of non-bioaccumulative toxicants to aquatic organisms, and for incorporating these risks into aquatic life criteria. These efforts will address a v...

  5. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by combining characteristics from multiple sensors and by application of chaotic characteristics.
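
    The kind of spectral characteristics this record refers to can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical example (the sampling rate, band edges, and synthetic test signal are assumptions, not taken from the article): it computes a Welch power spectrum and reduces it to a spectral centroid and relative band powers of the sort a monitoring system might feed to an empirical modeler.

```python
# Minimal sketch: reducing a sensor record to a few spectral characteristics.
# All parameters (fs, band edges, test signal) are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def spectral_characteristics(signal, fs=100_000):
    """Spectral centroid and relative band powers of one sensor record."""
    freqs, psd = welch(signal, fs=fs, nperseg=1024)
    total = np.trapz(psd, freqs)
    centroid = np.trapz(freqs * psd, freqs) / total   # "center of mass" of the spectrum
    bands = [(0, 10e3), (10e3, 30e3), (30e3, fs / 2)] # arbitrary example bands
    band_power = []
    for lo, hi in bands:
        m = (freqs >= lo) & (freqs < hi)
        band_power.append(np.trapz(psd[m], freqs[m]) / total)
    return centroid, band_power

# A noisy test tone standing in for a turning-process recording
rng = np.random.default_rng(0)
t = np.arange(0, 0.1, 1 / 100_000)
sig = np.sin(2 * np.pi * 12e3 * t) + 0.5 * rng.standard_normal(t.size)
print(spectral_characteristics(sig))
```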

  6. Pattern Activity Clustering and Evaluation (PACE)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Banas, Christopher; Paul, Michael; Bussjager, Becky; Seetharaman, Guna

    2012-06-01

    With the vast amount of network information available on the activities of people (e.g., motions, transportation routes, and site visits), there is a need to explore the salient properties of data that detect and discriminate the behavior of individuals. Recent machine learning approaches include methods of data mining, statistical analysis, clustering, and estimation that support activity-based intelligence. We seek to explore contemporary methods in activity analysis using machine learning techniques that discover and characterize behaviors that enable grouping, anomaly detection, and adversarial intent prediction. To evaluate these methods, we describe the mathematics and potential information theory metrics to characterize behavior. A scenario is presented to demonstrate the concept and metrics that could be useful for layered sensing behavior pattern learning and analysis. We leverage work on group tracking, learning, and clustering approaches, as well as information-theoretic metrics for classification, behavioral and event pattern recognition, and activity and entity analysis. The performance evaluation of activity analysis supports high-level information fusion of user alerts, data queries, and sensor management for data extraction, relations discovery, and situation analysis of existing data.
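
    One concrete instance of the information theory metrics mentioned above is the mutual information between an activity label and an observed feature. The sketch below, with entirely hypothetical count data, shows how such a metric could be computed; it illustrates the general idea rather than the paper's own metric set.

```python
# Sketch: mutual information (bits) between a behavior class and a
# discretized observable, from a joint count table. Data are hypothetical.
import numpy as np

def mutual_information(counts):
    """Mutual information I(X;Y) in bits from a joint count table."""
    p = counts / counts.sum()
    px = p.sum(axis=1, keepdims=True)       # marginal over rows
    py = p.sum(axis=0, keepdims=True)       # marginal over columns
    nz = p > 0                              # avoid log(0)
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# Rows: behavior class; columns: discretized route/site feature
counts = np.array([[30, 5, 2],
                   [4, 25, 6],
                   [1, 3, 24]])
print(f"I(X;Y) = {mutual_information(counts):.3f} bits")
```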

  7. Assessment of multiple geophysical techniques for the characterization of municipal waste deposit sites

    NASA Astrophysics Data System (ADS)

    Gaël, Dumont; Tanguy, Robert; Nicolas, Marck; Frédéric, Nguyen

    2017-10-01

    In this study, we tested the ability of geophysical methods to characterize a large technical landfill installed in a former sand quarry. The geophysical surveys specifically aimed to delimit the horizontal extent of the deposit site, estimate its thickness, and characterize the waste material composition (the moisture content in the present case). The site delimitation was conducted with electromagnetic (in-phase and out-of-phase) and magnetic (vertical gradient and total field) methods that clearly showed the transition between the waste deposit and the host formation. Regarding waste deposit thickness evaluation, electrical resistivity tomography (ERT) appeared inefficient on this particularly thick deposit site. Thus, we propose a combination of horizontal-to-vertical noise spectral ratio (HVNSR) and multichannel analysis of surface waves (MASW), which successfully determined the approximate waste deposit thickness in our test landfill. However, ERT appeared to be an appropriate tool to characterize the moisture content of the waste, which is key prior information for the organic waste biodegradation process. The global multi-scale and multi-method geophysical survey offers valuable information for site rehabilitation studies, water content mitigation processes for enhanced biodegradation, and landfill mining operation planning.
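
    For readers unfamiliar with HVNSR, the core computation is simple: the ratio of horizontal- to vertical-component spectra peaks near the site resonance frequency f0, and a quarter-wavelength relation h = Vs/(4·f0) converts that peak into a thickness estimate. The sketch below uses synthetic records and an assumed shear-wave velocity; both are illustrative, not the study's values.

```python
# Sketch: H/V spectral ratio and quarter-wavelength thickness estimate.
# Synthetic data and the assumed shear-wave velocity are illustrative.
import numpy as np
from scipy.signal import welch

def hv_ratio(horizontal, vertical, fs):
    """Amplitude ratio of horizontal to vertical power spectra."""
    f, ph = welch(horizontal, fs=fs, nperseg=4096)
    _, pv = welch(vertical, fs=fs, nperseg=4096)
    return f, np.sqrt(ph / pv)

fs = 200.0                                  # Hz, assumed sampling rate
t = np.arange(120_000) / fs
rng = np.random.default_rng(1)
v = rng.standard_normal(t.size)                                   # vertical
h = rng.standard_normal(t.size) + 3 * np.sin(2 * np.pi * 2.0 * t) # ~2 Hz site peak

f, hv = hv_ratio(h, v, fs)
mask = f > 0.5                              # ignore very low frequencies
f0 = f[mask][np.argmax(hv[mask])]           # naive resonance pick
vs = 400.0                                  # m/s, assumed waste shear velocity
print(f"f0 = {f0:.2f} Hz -> thickness h = Vs/(4*f0) = {vs / (4 * f0):.0f} m")
```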

  8. Case retrieval in medical databases by fusing heterogeneous information.

    PubMed

    Quellec, Gwénolé; Lamard, Mathieu; Cazuguel, Guy; Roux, Christian; Cochener, Béatrice

    2011-01-01

    A novel content-based heterogeneous information retrieval framework, particularly well suited to browsing medical databases and supporting new-generation computer-aided diagnosis (CADx) systems, is presented in this paper. It was designed to retrieve possibly incomplete documents, consisting of several images and semantic information, from a database; more complex data types such as videos can also be included in the framework. The proposed retrieval method relies on image processing, in order to characterize each individual image in a document by its digital content, and on information fusion. Once the available images in a query document are characterized, a degree of match, between the query document and each reference document stored in the database, is defined for each attribute (an image feature or a metadata item). A Bayesian network is used to recover missing information if need be. Finally, two novel information fusion methods are proposed to combine these degrees of match, in order to rank the reference documents by decreasing relevance for the query. In the first method, the degrees of match are fused by the Bayesian network itself. In the second method, they are fused by the Dezert-Smarandache theory: the second approach lets us model our confidence in each source of information (i.e., each attribute) and take it into account in the fusion process for better retrieval performance. The proposed methods were applied to two heterogeneous medical databases, a diabetic retinopathy database and a mammography screening database, for computer-aided diagnosis. Precision-at-five values of 0.809 ± 0.158 and 0.821 ± 0.177, respectively, were obtained for these two databases, which is very promising.
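
    A minimal sketch of attribute-level fusion follows. The paper fuses degrees of match with a Bayesian network or Dezert-Smarandache theory; the simplified stand-in below uses confidence-weighted averaging and simply skips missing attributes, just to make the ranking mechanics concrete. All names and values are hypothetical.

```python
# Sketch: ranking reference documents by fusing per-attribute degrees of
# match. Confidence-weighted averaging is a simplified stand-in for the
# paper's Bayesian-network / Dezert-Smarandache fusion.
from typing import Optional

def fuse(degrees: list[Optional[float]], confidences: list[float]) -> float:
    """Weighted mean of available degrees of match (None = missing)."""
    pairs = [(d, c) for d, c in zip(degrees, confidences) if d is not None]
    if not pairs:
        return 0.0
    return sum(d * c for d, c in pairs) / sum(c for _, c in pairs)

confidences = [0.9, 0.6, 0.3]          # trust in each information source
references = {
    "doc_A": [0.8, 0.7, None],         # metadata attribute missing for doc_A
    "doc_B": [0.4, 0.9, 0.9],
}
ranking = sorted(references, key=lambda k: fuse(references[k], confidences),
                 reverse=True)
print(ranking)                         # most relevant reference first
```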

  9. Mass spectrometry for the biophysical characterization of therapeutic monoclonal antibodies.

    PubMed

    Zhang, Hao; Cui, Weidong; Gross, Michael L

    2014-01-21

    Monoclonal antibodies (mAbs) are powerful therapeutics, and their characterization has drawn considerable attention and urgency. Unlike small-molecule drugs (150-600 Da) that have rigid structures, mAbs (∼150 kDa) are engineered proteins that undergo complicated folding and can exist in a number of low-energy structures, posing a challenge for traditional methods in structural biology. Mass spectrometry (MS)-based biophysical characterization approaches can provide structural information, bringing high sensitivity, fast turnaround, and small sample consumption. This review outlines various MS-based strategies for protein biophysical characterization and then reviews how these strategies provide structural information of mAbs at the protein level (intact or top-down approaches), peptide, and residue level (bottom-up approaches), affording information on higher order structure, aggregation, and the nature of antibody complexes. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  10. 40 CFR 194.24 - Waste characterization.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... other information and methods. (b) The Department shall submit in the compliance certification... proposed for disposal in the disposal system, WIPP complies with the numeric requirements of § 194.34 and... release. (2) Identify and describe the method(s) used to quantify the limits of waste components...

  11. 78 FR 68076 - Request for Information on Alternative Skin Sensitization Test Methods and Testing Strategies and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... INFORMATION: Background: Allergic contact dermatitis (ACD), a skin reaction characterized by localized redness, swelling, blistering, or itching after direct contact with a skin allergen, is an important public health.... Web site: http://ntp.niehs.nih.gov/go/niceatm . FOR FURTHER INFORMATION CONTACT: Dr. Warren S. Casey...

  12. Characterizing Task-Based OpenMP Programs

    PubMed Central

    Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats

    2015-01-01

    Programmers struggle to understand performance of task-based OpenMP programs since profiling tools only report thread-based performance. Performance tuning also requires task-based performance in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing exposed task parallelism and per-task instruction profiles of benchmarks in the widely-used Barcelona OpenMP Tasks Suite. Programmers can tune performance faster and understand performance tradeoffs more effectively than existing tools by using our method to characterize task-based performance. PMID:25860023

  13. Information-Processing Theory and Perspectives on Development: A Look at Concepts and Methods--The View of a Developmental Ethologist.

    ERIC Educational Resources Information Center

    Jesness, Bradley

    This paper examines concepts in information-processing theory which are likely to be relevant to development and characterizes the methods and data upon which the concepts are based. Among the concepts examined are those which have slight empirical grounds. Other concepts examined are those which seem to have empirical bases but which are…

  14. System and method for 100% moisture and basis weight measurement of moving paper

    DOEpatents

    Hernandez, Jose E.; Koo, Jackson C.

    2002-01-01

    A system for characterizing a set of properties for a moving substance is disclosed. The system includes: a first near-infrared linear array; a second near-infrared linear array; a first filter transparent to a first absorption wavelength emitted by the moving substance and juxtaposed between the substance and the first array; a second filter blocking the first absorption wavelength emitted by the moving substance and juxtaposed between the substance and the second array; and a computational device for characterizing data from the arrays into information on a property of the substance. The method includes the steps of: filtering out a first absorption wavelength emitted by a substance; monitoring the first absorption wavelength with a first near-infrared linear array; blocking the first wavelength from reaching a second near-infrared linear array; and characterizing data from the arrays into information on a property of the substance.
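
    The two filtered arrays effectively provide an absorption-band reading and a reference reading, whose ratio tracks moisture. A minimal sketch of that reduction, with assumed counts and calibration constants (not the patent's values), follows:

```python
# Sketch: converting the two filtered array readings into a moisture-related
# quantity. Counts and calibration constants are illustrative assumptions.
import numpy as np

def moisture_index(absorbing_counts, reference_counts, a=1.0, b=0.0):
    """Log-ratio of the water-absorption channel to the reference channel."""
    ratio = np.asarray(absorbing_counts) / np.asarray(reference_counts)
    absorbance = -np.log(ratio)          # Beer-Lambert-style attenuation
    return a * absorbance + b            # linear calibration to moisture units

first_array = np.array([850.0, 860.0, 840.0])     # filter passes the water band
second_array = np.array([1000.0, 1005.0, 998.0])  # filter blocks the water band
print(moisture_index(first_array, second_array))
```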

  15. OMICS DATA IN THE QUALITATIVE AND QUANTITATIVE CHARACTERIZATION OF THE MODE OF ACTION IN SUPPORT OF IRIS ASSESSMENTS

    EPA Science Inventory

    Knowledge and information generated using new tools/methods collectively called "Omics" technologies could have a profound effect on qualitative and quantitative characterizations of human health risk assessments.

    The suffix "Omics" is a descriptor used for a series of e...

  16. SUMMARY OF TECHNIQUES AND UNIQUE USES FOR DIRECT PUSH METHODS IN SITE CHARACTERIZATION ON CONTAMINATED FIELD SITES

    EPA Science Inventory

    Site characterization of subsurface contaminant transport is often hampered by a lack of knowledge of site heterogeneity and temporal variations in hydrogeochemistry. Two case studies are reviewed to illustrate the utility of macro-scale mapping information along with spatially-...

  17. DEVELOPMENT OF CHEMICAL METHODS TO CHARACTERIZE EXPOSURE TO EDCS IN THE NEUSE RIVER BASIN

    EPA Science Inventory

    To develop a quantitative health and environmental risk assessment of endocrine disrupting compounds (EDCs), information on exposures is essential. A full exposure assessment has complex requirements that require preliminary information to direct further research in this area....

  18. System and method for characterizing, synthesizing, and/or canceling out acoustic signals from inanimate sound sources

    DOEpatents

    Holzrichter, John F.; Burnett, Greg C.; Ng, Lawrence C.

    2003-01-01

    A system and method for characterizing, synthesizing, and/or canceling out acoustic signals from inanimate sound sources is disclosed. Propagating wave electromagnetic sensors monitor excitation sources in sound producing systems, such as machines, musical instruments, and various other structures. Acoustical output from these sound producing systems is also monitored. From such information, a transfer function characterizing the sound producing system is generated. From the transfer function, acoustical output from the sound producing system may be synthesized or canceled. The methods disclosed enable accurate calculation of matched transfer functions relating specific excitations to specific acoustical outputs. Knowledge of such signals and functions can be used to effect various sound replication, sound source identification, and sound cancellation applications.
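
    The transfer function at the heart of this patent family can be estimated from simultaneous excitation and acoustic recordings. The sketch below uses the standard H1 estimator, H(f) = Sxy(f)/Sxx(f), on synthetic signals; the estimator choice and all signal parameters are illustrative assumptions, not the patent's procedure.

```python
# Sketch: H1 transfer-function estimate between a monitored excitation and
# the acoustical output. Signals and filter are synthetic stand-ins.
import numpy as np
from scipy.signal import csd, welch, lfilter

fs = 8000
rng = np.random.default_rng(2)
excitation = rng.standard_normal(fs * 4)                      # EM-sensed source motion
acoustic = lfilter([0.5, 0.3, 0.2], [1.0, -0.4], excitation)  # "unknown" system

f, sxy = csd(excitation, acoustic, fs=fs, nperseg=1024)  # cross-spectrum Sxy
_, sxx = welch(excitation, fs=fs, nperseg=1024)          # auto-spectrum Sxx
H = sxy / sxx                                            # transfer function estimate

# With H in hand, output can be synthesized from a new excitation, or an
# anti-phase signal can be formed for cancellation.
print(abs(H[:5]))
```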

  19. System and method for characterizing, synthesizing, and/or canceling out acoustic signals from inanimate sound sources

    DOEpatents

    Holzrichter, John F; Burnett, Greg C; Ng, Lawrence C

    2013-05-21

    A system and method for characterizing, synthesizing, and/or canceling out acoustic signals from inanimate sound sources is disclosed. Propagating wave electromagnetic sensors monitor excitation sources in sound producing systems, such as machines, musical instruments, and various other structures. Acoustical output from these sound producing systems is also monitored. From such information, a transfer function characterizing the sound producing system is generated. From the transfer function, acoustical output from the sound producing system may be synthesized or canceled. The methods disclosed enable accurate calculation of matched transfer functions relating specific excitations to specific acoustical outputs. Knowledge of such signals and functions can be used to effect various sound replication, sound source identification, and sound cancellation applications.

  20. System and method for characterizing, synthesizing, and/or canceling out acoustic signals from inanimate sound sources

    DOEpatents

    Holzrichter, John F.; Burnett, Greg C.; Ng, Lawrence C.

    2007-10-16

    A system and method for characterizing, synthesizing, and/or canceling out acoustic signals from inanimate sound sources is disclosed. Propagating wave electromagnetic sensors monitor excitation sources in sound producing systems, such as machines, musical instruments, and various other structures. Acoustical output from these sound producing systems is also monitored. From such information, a transfer function characterizing the sound producing system is generated. From the transfer function, acoustical output from the sound producing system may be synthesized or canceled. The methods disclosed enable accurate calculation of matched transfer functions relating specific excitations to specific acoustical outputs. Knowledge of such signals and functions can be used to effect various sound replication, sound source identification, and sound cancellation applications.

  1. Super-resolved calibration-free flow cytometric characterization of platelets and cell-derived microparticles in platelet-rich plasma.

    PubMed

    Konokhova, Anastasiya I; Chernova, Darya N; Moskalensky, Alexander E; Strokotov, Dmitry I; Yurkin, Maxim A; Chernyshev, Andrei V; Maltsev, Valeri P

    2016-02-01

    The importance of microparticles (MPs), also regarded as extracellular vesicles, in many physiological processes and clinical conditions motivates the use of the most informative and precise methods for their characterization. Methods based on individual particle analysis provide statistically reliable distributions of the MP population over its characteristics. Although flow cytometry is one of the most powerful technologies of this type, the standard forward-versus-side-scattering plots of MPs and platelets (PLTs) overlap considerably because of the similarity of their morphological characteristics. Moreover, ordinary flow cytometry is not capable of measuring the size and refractive index (RI) of MPs. In this study, we (1) employed the potential of the scanning flow cytometer (SFC) for identification and characterization of MPs from light scattering; (2) suggested a reference method to characterize MP morphology (size and RI) with high precision; and (3) determined the smallest size of an MP that can be characterized from light scattering with the SFC. We equipped the SFC with 405 and 488 nm lasers to measure the light-scattering profiles and side scattering from MPs, respectively. The developed two-stage method allowed accurate separation of PLTs and MPs in platelet-rich plasma. We used two optical models for MPs, a sphere and a bisphere, in the solution of the inverse light-scattering problem. This solution provides unprecedented precision in the determination of size and RI of individual spherical MPs: median uncertainties (standard deviations) were 6 nm and 0.003, respectively. The developed method provides instrument-independent quantitative information on MPs, which can be used in studies of various factors affecting MP populations. © 2015 International Society for Advancement of Cytometry.

  2. Trip optimization system and method for a train

    DOEpatents

    Kumar, Ajith Kuttannair; Shaffer, Glenn Robert; Houpt, Paul Kenneth; Movsichoff, Bernardo Adrian; Chan, David So Keung

    2017-08-15

    A system for operating a train having one or more locomotive consists with each locomotive consist comprising one or more locomotives, the system including a locator element to determine a location of the train, a track characterization element to provide information about a track, a sensor for measuring an operating condition of the locomotive consist, a processor operable to receive information from the locator element, the track characterizing element, and the sensor, and an algorithm embodied within the processor having access to the information to create a trip plan that optimizes performance of the locomotive consist in accordance with one or more operational criteria for the train.

  3. Informant Disagreement for Separation Anxiety Disorder

    ERIC Educational Resources Information Center

    Foley, Debra; Rutter, Michael; Pickles, Andrew; Angold, Adrian; Maes, Hermine; Silberg, Judy; Eaves, Lindon

    2004-01-01

    Objective: To characterize informant disagreement for separation anxiety disorder (SAD). Method: The sample comprised 2,779 8- to 17-year-old twins from a community-based registry. Children and their parents completed a personal interview about the child's psychiatric history. Parents completed a personal interview about their own psychiatric…

  4. Characterizing the Networks of Digital Information that Support Collaborative Adaptive Forest Management in Sierra Nevada Forests.

    PubMed

    Lei, Shufei; Iles, Alastair; Kelly, Maggi

    2015-07-01

    Some of the factors that can contribute to the success of collaborative adaptive management--such as social learning, open communication, and trust--are built upon a foundation of the open exchange of information about science and management between participants and the public. Despite the importance of information transparency, the use and flow of information in collaborative adaptive management has not been characterized in detail in the literature, and opportunities currently exist to develop strategies for increasing the exchange of information, as well as to track information flow in such contexts. As digital information channels and networks have expanded over the last decade, powerful new information monitoring tools have also evolved, allowing for the complete characterization of information products through their production, transport, use, and monitoring. This study uses these tools to investigate the use of various science and management information products in a case study--the Sierra Nevada Adaptive Management Project--using a mixed-method (citation analysis, web analytics, and content analysis) research approach borrowed from the information processing and management field. The results from our case study show that information technologies greatly facilitate the flow and use of digital information, leading to multiparty collaborations such as knowledge transfer and public participation in science research. We conclude with recommendations for expanding information exchange in collaborative adaptive management by taking advantage of available information technologies and networks.

  5. Characterizing the Networks of Digital Information that Support Collaborative Adaptive Forest Management in Sierra Nevada Forests

    NASA Astrophysics Data System (ADS)

    Lei, Shufei; Iles, Alastair; Kelly, Maggi

    2015-07-01

    Some of the factors that can contribute to the success of collaborative adaptive management—such as social learning, open communication, and trust—are built upon a foundation of the open exchange of information about science and management between participants and the public. Despite the importance of information transparency, the use and flow of information in collaborative adaptive management has not been characterized in detail in the literature, and opportunities currently exist to develop strategies for increasing the exchange of information, as well as to track information flow in such contexts. As digital information channels and networks have expanded over the last decade, powerful new information monitoring tools have also evolved, allowing for the complete characterization of information products through their production, transport, use, and monitoring. This study uses these tools to investigate the use of various science and management information products in a case study—the Sierra Nevada Adaptive Management Project—using a mixed-method (citation analysis, web analytics, and content analysis) research approach borrowed from the information processing and management field. The results from our case study show that information technologies greatly facilitate the flow and use of digital information, leading to multiparty collaborations such as knowledge transfer and public participation in science research. We conclude with recommendations for expanding information exchange in collaborative adaptive management by taking advantage of available information technologies and networks.

  6. Infrared non-destructive evaluation method and apparatus

    DOEpatents

    Baleine, Erwan; Erwan, James F; Lee, Ching-Pang; Stinelli, Stephanie

    2014-10-21

    A method of nondestructive evaluation and related system. The method includes arranging a test piece (14) having an internal passage (18) and an external surface (15) and a thermal calibrator (12) within a field of view (42) of an infrared sensor (44); generating a flow (16) of fluid characterized by a fluid temperature; exposing the test piece internal passage (18) and the thermal calibrator (12) to fluid from the flow (16); capturing infrared emission information of the test piece external surface (15) and of the thermal calibrator (12) simultaneously using the infrared sensor (44), wherein the test piece infrared emission information includes emission intensity information, and wherein the thermal calibrator infrared emission information includes a reference emission intensity associated with the fluid temperature; and normalizing the test piece emission intensity information against the reference emission intensity.
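
    The normalization step can be pictured as dividing each infrared frame by the reference intensity read off the calibrator pixels captured in the same frame. A toy sketch, with assumed array shapes and an assumed calibrator region, follows:

```python
# Sketch: normalizing test-piece emission intensities against the thermal
# calibrator's reference emission in the same infrared frame. The frame
# contents and calibrator region are hypothetical.
import numpy as np

frame = np.random.default_rng(3).uniform(900, 1100, size=(240, 320))
calibrator_region = frame[:20, :20]      # pixels imaging the calibrator
reference = calibrator_region.mean()     # emission at the known fluid temperature

normalized = frame / reference           # dimensionless emission map
print(normalized.mean(), normalized.std())
```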

  7. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.

  8. Analytical Methods for Biomass Characterization during Pretreatment and Bioconversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pu, Yunqiao; Meng, Xianzhi; Yoo, Chang Geun

    2016-01-01

    Lignocellulosic biomass has been introduced as a promising resource for alternative fuels and chemicals because of its abundance and its role as a complement to petroleum resources. Biomass is a complex biopolymer, and its compositional and structural characteristics vary widely depending on its species as well as its growth environment. Because of the complexity and variety of biomass, understanding its physicochemical characteristics is key for effective biomass utilization. Characterization of biomass not only provides critical information about biomass during pretreatment and bioconversion, but also gives valuable insights on how to utilize the biomass. For a better understanding of biomass characteristics, a good grasp and proper selection of analytical methods are necessary. This chapter introduces existing analytical approaches that are widely employed for biomass characterization during biomass pretreatment and conversion processes. Diverse analytical methods using Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), and nuclear magnetic resonance (NMR) spectroscopy for biomass characterization are reviewed. In addition, methods for assessing biomass accessibility by analyzing surface properties of biomass are also summarized in this chapter.

  9. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  10. Two-photon Microscopy and Polarimetry for Assessment of Myocardial Tissue Organization

    NASA Astrophysics Data System (ADS)

    Archambault-Wallenburg, Marika

    Optical methods can provide useful tissue characterization tools. For this project, two-photon microscopy and polarized light examinations (polarimetry) were used to assess the organizational state of myocardium in healthy, infarcted, and stem-cell regenerated states. Two-photon microscopy visualizes collagen through second-harmonic generation and myocytes through two-photon excitation autofluorescence, providing information on the composition and structure/organization of the tissue. Polarimetry measurements yield a value of linear retardance that can serve as an indicator of tissue anisotropy, and with a dual-projection method, information about the anisotropy axis orientation can also be extracted. Two-photon microscopy results reveal that stem-cell treated tissue retains more myocytes and structure than infarcted myocardium, while polarimetry findings suggest that the injury caused by temporary ligation of a coronary artery is less severe and more diffuse than that caused by a permanent ligation. Both these methods show potential for tissue characterization.

  11. Field Demonstrations of Five Geophysical Methods that Could Be Used to Characterize Deposits of Alluvial Aggregate

    USGS Publications Warehouse

    Ellefsen, K.J.; Burton, B.L.; Lucius, J.E.; Haines, S.S.; Fitterman, D.V.; Witty, J.A.; Carlson, D.; Milburn, B.; Langer, W.H.

    2007-01-01

    Personnel from the U.S. Geological Survey and Martin Marietta Aggregates, Inc., conducted field demonstrations of five different geophysical methods to show how these methods could be used to characterize deposits of alluvial aggregate. The methods were time-domain electromagnetic sounding, electrical resistivity profiling, S-wave reflection profiling, S-wave refraction profiling, and P-wave refraction profiling. All demonstrations were conducted at one site within a river valley in central Indiana, where the stratigraphy consisted of 1 to 2 meters of clay-rich soil, 20 to 35 meters of alluvial sand and gravel, 1 to 6 meters of clay, and multiple layers of limestone and dolomite bedrock. All geophysical methods, except time-domain electromagnetic sounding, provided information about the alluvial aggregate that was consistent with the known geology. Although time-domain electromagnetic sounding did not work well at this site, it has worked well at other sites with different geology. All of these geophysical methods complement traditional methods of geologic characterization such as drilling.

  12. Electromigration and the structure of metallic nanocontacts

    NASA Astrophysics Data System (ADS)

    Hoffmann-Vogel, R.

    2017-09-01

    This article reviews efforts to structurally characterize metallic nanocontacts. While the electronic characterization of such junctions is relatively straightforward, it is usually technically challenging to study the nanocontact's structure at small length scales. However, the structure is the basis for understanding the electronic properties of the nanocontact; for example, explaining the electronic properties requires calculations based on structural models. Besides using a gate electrode, controlling the structure is an important way of understanding how the electronic transport properties can be influenced. A key to making structural information directly accessible is to choose a fabrication method that is adapted to the structural characterization method. Special emphasis is given to transmission electron microscopy fabrication and to thermally assisted electromigration methods due to their potential for obtaining information on both electrodes of the forming nanocontact. Controlled electromigration aims at studying the contact at constant contact temperature during electromigration, in contrast to earlier studies conducted at constant environment temperature. We review efforts to calculate electromigration forces. We describe how hot spots are formed during electromigration. We summarize implications for the structure obtained from studies of the ballistic transport regime, tunneling, and Coulomb blockade. We review the structure of nanocontacts known from direct structural characterization. Single-crystalline wires allow suppression of grain-boundary electromigration. In thin films, the substrate plays an important role in influencing the defect and temperature distribution. Hot-spot formation and recrystallization are observed. We add information on the local temperature and current density and on alloys important for microelectronic interconnects.

  13. Characterization of Nanopipettes.

    PubMed

    Perry, David; Momotenko, Dmitry; Lazenby, Robert A; Kang, Minkyung; Unwin, Patrick R

    2016-05-17

    Nanopipettes are widely used in electrochemical and analytical techniques as tools for sizing, sequencing, sensing, delivery, and imaging. For all of these applications, the response of a nanopipette is strongly affected by its geometry and surface chemistry. As the size of nanopipettes becomes smaller, precise geometric characterization is increasingly important, especially if nanopipette probes are to be used for quantitative studies and analysis. This contribution highlights the combination of data from voltage-scanning ion conductivity experiments, transmission electron microscopy and finite element method simulations to fully characterize nanopipette geometry and surface charge characteristics, with an accuracy not achievable using existing approaches. Indeed, it is shown that presently used methods for characterization can lead to highly erroneous information on nanopipettes. The new approach to characterization further facilitates high-level quantification of the behavior of nanopipettes in electrochemical systems, as demonstrated herein for a scanning ion conductance microscope setup.

  14. Authorship attribution based on Life-Like Network Automata.

    PubMed

    Machicao, Jeaneth; Corrêa, Edilson A; Miranda, Gisele H B; Amancio, Diego R; Bruno, Odemir M

    2018-01-01

    Authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful to identify informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. Thus, we proposed a novel method to characterize text networks, by considering both topological and dynamical aspects of networks. Using concepts and methods from cellular automata theory, we devised a strategy to grasp informative spatio-temporal patterns from this model. Our experiments revealed an outperformance over structural analysis relying only on topological measurements, such as clustering coefficient, betweenness, and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks.
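
    The topological baseline that the proposed method reportedly outperforms can be reproduced in a few lines. The sketch below builds a word co-adjacency network from a toy text (the windowing scheme and text are assumptions, not the paper's corpus) and extracts the three measurements named in the abstract as a per-author feature vector:

```python
# Sketch: topological features of a word co-adjacency network, the baseline
# the abstract compares against. Toy text and window size are assumptions.
import networkx as nx

def text_network(text, window=2):
    """Link words that co-occur within a sliding window."""
    words = text.lower().split()
    g = nx.Graph()
    for i in range(len(words)):
        for j in range(i + 1, min(i + window + 1, len(words))):
            g.add_edge(words[i], words[j])
    return g

g = text_network("the quick brown fox jumps over the lazy dog the fox sleeps")
features = {
    "clustering": nx.average_clustering(g),
    "betweenness": sum(nx.betweenness_centrality(g).values()) / g.number_of_nodes(),
    "avg_shortest_path": nx.average_shortest_path_length(g),
}
print(features)   # per-author feature vector for a downstream classifier
```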

  15. Retrieval of tropospheric carbon monoxide for the MOPITT experiment

    NASA Astrophysics Data System (ADS)

    Pan, Liwen; Gille, John C.; Edwards, David P.; Bailey, Paul L.; Rodgers, Clive D.

    1998-12-01

    A retrieval method for deriving the tropospheric carbon monoxide (CO) profile and column amount under clear sky conditions has been developed for the Measurements of Pollution In The Troposphere (MOPITT) instrument, scheduled for launch in 1998 onboard the EOS-AM1 satellite. This paper presents a description of the method along with analyses of retrieval information content. These analyses characterize the forward measurement sensitivity, the contribution of a priori information, and the retrieval vertical resolution. Ensembles of tropospheric CO profiles were compiled both from aircraft in situ measurements and from chemical model results and were used in retrieval experiments to characterize the method and to study the sensitivity to different parameters. Linear error analyses were carried out in parallel with the ensemble experiments. Results of these experiments and analyses indicate that MOPITT CO column measurements will have better than 10% precision, and CO profile measurement will have approximately three pieces of independent information that will resolve 3-5 tropospheric layers to approximately 10% precision. These analyses are important for understanding MOPITT data, both for application of data in tropospheric chemistry studies and for comparison with in situ measurements.
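
    The "pieces of independent information" quoted above are conventionally counted with the optimal-estimation formalism. The expressions below are the standard linear forms of that formalism, included for orientation rather than reproduced from the paper:

```latex
% Maximum a posteriori retrieval for a linearized forward model
% y = Kx + \epsilon, with a priori state x_a, a priori covariance S_a,
% and measurement-noise covariance S_\epsilon:
\hat{x} = x_a + \left(K^{T} S_{\epsilon}^{-1} K + S_a^{-1}\right)^{-1}
          K^{T} S_{\epsilon}^{-1}\,\left(y - K x_a\right)
% Averaging kernel and degrees of freedom for signal:
A = \frac{\partial \hat{x}}{\partial x}, \qquad d_s = \operatorname{tr}(A)
% "Approximately three pieces of independent information" corresponds to
% d_s \approx 3, with the rows of A describing the 3-5 resolvable layers.
```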

  16. Characterization and elimination of undesirable protein residues in plant cell walls for enhancing lignin analysis by solution-state 2D gel-NMR methods

    USDA-ARS?s Scientific Manuscript database

    Proteins exist in every plant cell wall. Certain protein residues interfere with lignin characterization and quantification. The current solution-state 2D-NMR technique (gel-NMR) for whole plant cell wall structural profiling provides detailed information regarding cell walls and proteins. However, ...

  17. Charge Transfer Dissociation of Complex Oligosaccharides: Comparison with Collision-Induced Dissociation and Extreme Ultraviolet Dissociative Photoionization

    NASA Astrophysics Data System (ADS)

    Ropartz, David; Li, Pengfei; Fanuel, Mathieu; Giuliani, Alexandre; Rogniaux, Hélène; Jackson, Glen P.

    2016-10-01

    The structural characterization of oligosaccharides still challenges the field of analytical chemistry. Tandem mass spectrometry offers many advantages toward this aim, although the generic fragmentation method (low-energy collision-induced dissociation) shows clear limitations and is often insufficient to retrieve some essential structural information on these molecules. In this work, we present the first application of helium charge transfer dissociation (He-CTD) to characterize the structure of complex oligosaccharides. We compare this method with low-energy collision-induced dissociation and extreme-ultraviolet dissociative photoionization (XUV-DPI), which was shown previously to ensure the successful characterization of complex glycans. Similarly to what could be obtained by XUV-DPI, He-CTD provides a complete description of the investigated structures by producing many informative cross-ring fragments and no ambiguous fragmentation. Unlike XUV-DPI, which is performed at a synchrotron source, He-CTD has the undeniable advantage of being implementable in a conventional benchtop ion trap in a conventional laboratory setting.

  18. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: a single computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers and submits serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster shows how MAD# reduces the execution time of random field characterization using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the single-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (10 million would require approximately 1,200 hours). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a processing time of 60 hours, non-continuous, over five days. HTCondor thus reduced the processing time for uncertainty characterization by a factor of 20 (from roughly 1,200 hours to 60 hours).

  19. A Framework for Characterizing eHealth Literacy Demands and Barriers

    PubMed Central

    Chan, Connie V

    2011-01-01

    Background Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. Objective We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. Methods We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. Results The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. Conclusions The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum. PMID:22094891

  20. Characterization of benign thyroid nodules with HyperSPACE (Hyper Spectral Analysis for Characterization in Echography) before and after percutaneous laser ablation: a pilot study.

    PubMed

    Granchi, Simona; Vannacci, Enrico; Biagi, Elena

    2017-04-22

    To evaluate the capability of the HyperSPACE (Hyper SPectral Analysis for Characterization in Echography) method in tissue characterization, in order to provide information for the laser treatment of benign thyroid nodules beyond that available from conventional B-mode images and elastography. The method, based on spectral analysis of the raw radiofrequency ultrasonic signal, was applied to characterize nodules before and after laser treatment. Thirty patients (25 females and 5 males, aged between 37 and 81 years) with benign thyroid nodules at cytology (Thyr 2) were evaluated by conventional ultrasonography, elastography, and HyperSPACE, before and after laser ablation. The images processed by HyperSPACE exhibit different color distributions that correspond to different tissue features. By calculating the percentages of color coverage, the analyzed nodules were subdivided into 3 groups. Each nodule belonging to the same group experienced, on average, similar necrosis extension. The nodules exhibit different configuration (color) distributions that could be indicative of the response of nodular tissue to the laser treatment. Conclusions: HyperSPACE can characterize benign nodules by providing additional information beyond conventional ultrasound and elastography, which is useful for supporting the laser treatment of nodules in order to increase the probability of success.

  1. Gate frequency sweep: An effective method to evaluate the dynamic performance of AlGaN/GaN power heterojunction field effect transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santi, C. de; Meneghini, M., E-mail: matteo.meneghini@dei.unipd.it; Meneghesso, G.

    2014-08-18

    With this paper we propose a test method for evaluating the dynamic performance of GaN-based transistors, namely, gate-frequency sweep measurements: the effectiveness of the method is verified by characterizing the dynamic performance of Gate Injection Transistors. We demonstrate that this method can provide an effective description of the impact of traps on the transient performance of Heterojunction Field Effect Transistors, and information on the properties (activation energy and cross section) of the related defects. Moreover, we discuss the relation between the results obtained by gate-frequency sweep measurements and those collected by conventional drain current transients and double pulse characterization.

  2. Identification of informative subgraphs in brain networks

    NASA Astrophysics Data System (ADS)

    Marinazzo, D.; Wu, G.; Pellicoro, M.; Stramaglia, S.

    2013-01-01

    Measuring directed interactions in the brain in terms of information flow is a promising approach, mathematically treatable and able to encompass several methods. Here we present a formal expansion of the transfer entropy that highlights irreducible sets of variables which provide information about the future state of each assigned target. Multiplets characterized by a large contribution to the expansion are associated with informational circuits present in the system, with an informational character (synergetic or redundant) which can be inferred from the sign of the contribution.
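
    For reference, the quantity being expanded is the transfer entropy; the standard bivariate definition is shown below (the paper's contribution is its expansion over sets of driving variables, which is not reproduced here):

```latex
% Transfer entropy from a source Y to a target X (standard definition):
TE_{Y \to X} = \sum_{x_{t+1},\, x_t,\, y_t} p\!\left(x_{t+1}, x_t, y_t\right)
               \log \frac{p\!\left(x_{t+1} \mid x_t, y_t\right)}
                         {p\!\left(x_{t+1} \mid x_t\right)}
% A multiplet of sources {Y_1, ..., Y_k} is synergetic or redundant
% according to the sign of its irreducible contribution in the expansion.
```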

  3. Measurement methods and algorithms for comparison of local and remote clocks

    NASA Technical Reports Server (NTRS)

    Levine, Judah

    1993-01-01

    Several methods for characterizing the performance of clocks, with special emphasis on using calibration information acquired via an unreliable or noisy channel, are discussed. Time-domain variance estimators and frequency-domain techniques such as cross-spectral analysis are discussed. Each of these methods has advantages and limitations that are illustrated using data obtained via GPS, ACTS, and other methods. No single technique is optimum for all of these analyses, and some of these problems cannot be completely characterized by any of the techniques discussed. The inverse problem of communicating frequency and time corrections to a real-time steered clock is also discussed. Methods were developed to mitigate the potentially disastrous problems of data corruption and loss of computer control.
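
    The abstract does not name a specific time-domain estimator, but the canonical example for clock characterization is the two-sample (Allan) variance, shown here for orientation:

```latex
% Two-sample (Allan) variance from averaged fractional-frequency values
% \bar{y}_k over intervals of length \tau:
\sigma_y^2(\tau) = \frac{1}{2}\left\langle \left(\bar{y}_{k+1} - \bar{y}_k\right)^2 \right\rangle
                 = \frac{1}{2(M-1)} \sum_{k=1}^{M-1} \left(\bar{y}_{k+1} - \bar{y}_k\right)^2
% Its slope versus \tau distinguishes noise types (white FM, flicker FM,
% random-walk FM), which is what makes it useful for clock characterization.
```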

  4. FW/CADIS-O: An Angle-Informed Hybrid Method for Neutron Transport

    NASA Astrophysics Data System (ADS)

    Munk, Madicken

    The development of methods for deep-penetration radiation transport is of continued importance for radiation shielding, nonproliferation, nuclear threat reduction, and medical applications. As these applications become more ubiquitous, the need for transport methods that can accurately and reliably model the systems' behavior will persist. For these types of systems, hybrid methods are often the best choice to obtain a reliable answer in a short amount of time. Hybrid methods leverage the speed and uniform uncertainty distribution of a deterministic solution to bias Monte Carlo transport and reduce the variance in the solution. At present, the Consistent Adjoint-Driven Importance Sampling (CADIS) and Forward-Weighted CADIS (FW-CADIS) hybrid methods are the gold standard by which to model systems that have deeply penetrating radiation. They use an adjoint scalar flux to generate variance reduction parameters for Monte Carlo. However, in problems where there is strong anisotropy in the flux, CADIS and FW-CADIS are not as effective at reducing the problem variance as in isotropic problems. This dissertation covers the theoretical background, implementation, and characterization of a set of angle-informed hybrid methods that can be applied to strongly anisotropic deep-penetration radiation transport problems. These methods use a forward-weighted adjoint angular flux to generate variance reduction parameters for Monte Carlo. As a result, they leverage both adjoint and contributon theory for variance reduction. They have been named CADIS-O and FW-CADIS-O. To characterize CADIS-O, several characterization problems with flux anisotropies were devised. These problems contain different physical mechanisms by which flux anisotropy is induced. Additionally, a series of novel anisotropy metrics to quantify flux anisotropy is used to characterize the methods beyond standard figure of merit (FOM) and relative error metrics. As a result, a more thorough investigation into the effects of anisotropy, and the degree of anisotropy, on Monte Carlo convergence is possible. The results from the characterization of CADIS-O show that it performs best in strongly anisotropic problems that have preferential particle flowpaths, but only if the flowpaths are not comprised of air. Further, the characterization of the method's sensitivity to deterministic angular discretization showed that CADIS-O is less sensitive to discretization than CADIS for both quadrature order and PN order, although more variation in the results was observed in response to changing quadrature order than PN order. Further, as a result of the forward normalization in the O-methods, ray-effect mitigation was observed in many of the characterization problems. The characterization of the CADIS-O method in this dissertation serves to outline a path forward for further hybrid methods development. In particular, the responses of the O-method to changes in quadrature order and PN order, and its ray-effect mitigation, are strong indicators that the method is more resilient than its predecessors to strong anisotropies in the flux. With further method characterization, the full potential of the O-methods can be realized. The method can then be applied to geometrically complex, materially diverse problems and help to advance system modeling in deep-penetration radiation transport problems with strong anisotropies in the flux.
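
    For context, the standard CADIS relations that the O-methods build on are shown below. The O-methods replace the adjoint scalar flux with a forward-weighted adjoint angular flux, so these scalar forms are the textbook starting point rather than the dissertation's final expressions:

```latex
% Standard CADIS: the adjoint scalar flux \phi^{\dagger} and source q give
% the detector response R, a biased source \hat{q}, and target weights \hat{w}:
R = \int \phi^{\dagger}(\vec{r}, E)\, q(\vec{r}, E)\, dE\, d\vec{r}
\qquad
\hat{q}(\vec{r}, E) = \frac{\phi^{\dagger}(\vec{r}, E)\, q(\vec{r}, E)}{R}
\qquad
\hat{w}(\vec{r}, E) = \frac{R}{\phi^{\dagger}(\vec{r}, E)}
% Particles are born with weights matching the weight windows, so history
% weight stays near \hat{w} and variance in the tally of R is suppressed.
```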

  5. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  6. Spatio-temporal correlations in models of collective motion ruled by different dynamical laws.

    PubMed

    Cavagna, Andrea; Conti, Daniele; Giardina, Irene; Grigera, Tomas S; Melillo, Stefania; Viale, Massimiliano

    2016-11-15

    Information transfer is an essential factor in determining the robustness of biological systems with distributed control. The most direct way to study the mechanisms ruling information transfer is to experimentally observe the propagation across the system of a signal triggered by some perturbation. However, this method may be inefficient for experiments in the field, as the possibilities to perturb the system are limited and empirical observations must rely on natural events. An alternative approach is to use spatio-temporal correlations to probe the information transfer mechanism directly from the spontaneous fluctuations of the system, without the need to have an actual propagating signal on record. Here we test this method on models of collective behaviour in their deeply ordered phase by using ground truth data provided by numerical simulations in three dimensions. We compare two models characterized by very different dynamical equations and information transfer mechanisms: the classic Vicsek model, describing an overdamped noninertial dynamics, and the inertial spin model, characterized by an underdamped inertial dynamics. By using dynamic finite-size scaling, we show that spatio-temporal correlations are able to distinguish unambiguously the diffusive information transfer mechanism of the Vicsek model from the linear mechanism of the inertial spin model.
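
    For reference, the alignment rule of the classic Vicsek model mentioned above is shown below; the inertial spin model augments this overdamped rule with an underdamped spin variable. The notation is the generic textbook form, not copied from the paper:

```latex
% Vicsek model (discrete-time, overdamped): each particle adopts the mean
% heading of neighbors within radius r, perturbed by noise \eta:
\theta_i(t + \Delta t) = \big\langle \theta_j(t) \big\rangle_{|\vec{r}_j - \vec{r}_i| < r} + \eta_i(t)
\qquad
\vec{r}_i(t + \Delta t) = \vec{r}_i(t) + v_0\, \hat{e}\!\left(\theta_i(t + \Delta t)\right) \Delta t
% With no inertial term, orientation information spreads diffusively;
% the inertial spin model's extra degree of freedom makes it propagate linearly.
```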

  7. System and method for characterizing voiced excitations of speech and acoustic signals, removing acoustic noise from speech, and synthesizing speech

    DOEpatents

    Burnett, Greg C.; Holzrichter, John F.; Ng, Lawrence C.

    2002-01-01

    Low power EM waves are used to detect motions of vocal tract tissues of the human speech system before, during, and after voiced speech. A voiced excitation function is derived. The excitation function provides speech production information to enhance speech characterization and to enable noise removal from human speech.

  8. Irrigation Water and Nitrate Loss Characterization in South Florida Nurseries: Cumulative Volumes, Runoff Rates, NO3-N Concentrations and Loadings, and Implications for Management

    USDA-ARS?s Scientific Manuscript database

    Enrichment of surface water with nitrate-nitrogen is a significant problem throughout the world. In support of developing a method for removing nitrate from water using denitrification, this project characterized runoff events at two nurseries in South Florida to provide information needed for desi...

  9. Application of Electrical Resistivity Method (ERM) in Groundwater Exploration

    NASA Astrophysics Data System (ADS)

    Izzaty Riwayat, Akhtar; Nazri, Mohd Ariff Ahmad; Hazreek Zainal Abidin, Mohd

    2018-04-01

    Geophysical methods, traditionally the domain of geophysicists, have become some of the most popular methods applied by engineers in civil engineering. The Electrical Resistivity Method (ERM) is a geophysical tool that offers a very attractive technique for subsurface profile characterization over large areas. As an applicable alternative technique in groundwater exploration, ERM complements existing conventional methods and can produce comprehensive and convincing output, making it effective in terms of cost, time, data coverage, and sustainability. ERM has been applied in various groundwater exploration applications. Over the years, conventional methods such as excavation and test boring have been the tools used to obtain information on earth layers, especially during site investigation. Such conventional techniques have several limitations, as they provide information only at the actual drilling points. This review paper presents the application of ERM in groundwater exploration. Results from ERM can provide additional information to experts for problem solving, such as information on groundwater pollution, leachate, and underground sources of water supply.
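
    The basic measured quantity in an ERM survey can be stated compactly; the Wenner-array geometric factor below is one standard example (the array choice is illustrative, not specific to this review):

```latex
% Apparent resistivity from injected current I, measured potential
% difference \Delta V, and a geometric factor K set by the electrode array:
\rho_a = K \, \frac{\Delta V}{I}, \qquad
K = 2\pi a \quad \text{(Wenner array with electrode spacing } a\text{)}
% In groundwater surveys, low-\rho_a zones often indicate saturated or
% clay-rich material; interpretation still requires local calibration.
```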

  10. Rapid characterization of microorganisms by mass spectrometry--what can be learned and how?

    PubMed

    Fenselau, Catherine C

    2013-08-01

    Strategies for the rapid and reliable analysis of microorganisms have been sought to meet national needs in defense, homeland security, space exploration, food and water safety, and clinical diagnosis. Mass spectrometry has long been a candidate technique because it is extremely rapid and can provide highly specific information. It has excellent sensitivity. Molecular and fragment ion masses provide detailed fingerprints, which can also be interpreted. Mass spectrometry is also a broadband method--everything has a mass--and it is automatable. Mass spectrometry is a physicochemical method that is orthogonal and complementary to biochemical and morphological methods used to characterize microorganisms.

  11. Characterization of Change and Significance for Clinical Findings in Radiology Reports Through Natural Language Processing.

    PubMed

    Hassanpour, Saeed; Bay, Graham; Langlotz, Curtis P

    2017-06-01

    We built a natural language processing (NLP) method to automatically extract clinical findings in radiology reports and characterize their level of change and significance according to a radiology-specific information model. We utilized a combination of machine learning and rule-based approaches for this purpose. Our method is unique in capturing different features and levels of abstractions at surface, entity, and discourse levels in text analysis. This combination has enabled us to recognize the underlying semantics of radiology report narratives for this task. We evaluated our method on radiology reports from four major healthcare organizations. Our evaluation showed the efficacy of our method in highlighting important changes (accuracy 99.2%, precision 96.3%, recall 93.5%, and F1 score 94.7%) and identifying significant observations (accuracy 75.8%, precision 75.2%, recall 75.7%, and F1 score 75.3%) to characterize radiology reports. This method can help clinicians quickly understand the key observations in radiology reports and facilitate clinical decision support, review prioritization, and disease surveillance.
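
    To make the rule-based side of such a hybrid concrete, the toy sketch below tags a change level and a significance flag for one report sentence. The keyword lists, labels, and example sentence are invented for illustration and are not the authors' system, which combines such rules with learned classifiers over surface, entity, and discourse features.

    ```python
    import re

    # Toy rule-based tagging of change level for findings mentioned in a
    # radiology report sentence. Keyword lists and labels are illustrative only.
    CHANGE_RULES = [
        (r"\b(resolved|no longer seen)\b", "resolved"),
        (r"\b(decreased|improved|smaller)\b", "improved"),
        (r"\b(increased|worsened|larger|new)\b", "worsened"),
        (r"\b(unchanged|stable)\b", "stable"),
    ]
    SIGNIFICANT = re.compile(r"\b(mass|nodule|effusion|pneumothorax|fracture)\b")

    def characterize(sentence: str) -> dict:
        s = sentence.lower()
        change = next((label for pat, label in CHANGE_RULES if re.search(pat, s)),
                      "no prior comparison")
        return {"change": change,
                "significant": bool(SIGNIFICANT.search(s))}

    print(characterize("There is a new right pleural effusion, increased from prior."))
    # {'change': 'worsened', 'significant': True}
    ```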

  12. Virtual substrate method for nanomaterials characterization

    PubMed Central

    Da, Bo; Liu, Jiangwei; Yamamoto, Mahito; Ueda, Yoshihiro; Watanabe, Kazuyuki; Cuong, Nguyen Thanh; Li, Songlin; Tsukagoshi, Kazuhito; Yoshikawa, Hideki; Iwai, Hideo; Tanuma, Shigeo; Guo, Hongxuan; Gao, Zhaoshun; Sun, Xia; Ding, Zejun

    2017-01-01

    Characterization techniques available for bulk or thin-film solid-state materials have been extended to substrate-supported nanomaterials, but generally non-quantitatively. This is because the nanomaterial signals are inevitably buried in the signals from the underlying substrate in common reflection-configuration techniques. Here, we propose a virtual substrate method, inspired by the four-point probe technique for resistance measurement as well as the chop-nod method in infrared astronomy, to characterize nanomaterials without the influence of underlying substrate signals from four interrelated measurements. By implementing this method in secondary electron (SE) microscopy, a SE spectrum (white electrons) associated with the reflectivity difference between two different substrates can be tracked and controlled. The SE spectrum is used to quantitatively investigate the covering nanomaterial based on subtle changes in the transmission of the nanomaterial with high efficiency rivalling that of conventional core-level electrons. The virtual substrate method represents a benchmark for surface analysis to provide ‘free-standing' information about supported nanomaterials. PMID:28548114

  13. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented, according to its nature, as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results.
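
    The outranking step can be illustrated in isolation. The sketch below implements standard PROMETHEE II net flows with a linear preference function on crisp scores; the site scores, criterion weights, and preference threshold are invented, and the paper's 2-tuple linguistic transformation that precedes this step is omitted.

    ```python
    import numpy as np

    # Minimal PROMETHEE II ranking with a linear preference function.
    # Site scores and criterion weights are invented for illustration.
    scores = np.array([      # rows: sites, columns: risk criteria (higher = worse)
        [0.8, 0.3, 0.5],
        [0.4, 0.9, 0.2],
        [0.6, 0.6, 0.7],
    ])
    weights = np.array([0.5, 0.3, 0.2])
    p = 0.5                  # preference threshold of the linear function

    n = len(scores)
    phi_plus = np.zeros(n)
    phi_minus = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]             # criterion-wise advantage of a over b
            pref = np.clip(d / p, 0.0, 1.0)       # linear preference in [0, 1]
            pi_ab = float(weights @ pref)         # weighted aggregated preference
            phi_plus[a] += pi_ab / (n - 1)
            phi_minus[b] += pi_ab / (n - 1)

    phi = phi_plus - phi_minus                    # net outranking flow
    print("ranking (worst risk first):", np.argsort(-phi))
    ```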

  14. Data-centric method for object observation through scattering media

    NASA Astrophysics Data System (ADS)

    Tanida, Jun; Horisaki, Ryoichi

    2018-03-01

    A data-centric method is introduced for object observation through scattering media. A large number of training pairs are used to characterize the relation between the object and the observation signals based on machine learning. Using the method, object information can be retrieved even from strongly disturbed signals. As potential applications, object recognition, imaging, and focusing through scattering media were demonstrated.
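
    A minimal stand-in for the learning step might look as follows: a random matrix plays the role of the scattering medium, and ridge regression learns the inverse map from training pairs. The model choice, noise level, and dimensions are illustrative assumptions, not the authors' setup.

    ```python
    import numpy as np

    # Toy version of the data-centric idea: learn a linear map from speckle-like
    # observations back to the object.
    rng = np.random.default_rng(1)
    n_obj, n_obs, n_train = 16, 64, 2000

    A = rng.normal(size=(n_obs, n_obj))              # unknown scattering medium
    objects = rng.normal(size=(n_train, n_obj))      # training objects
    speckle = objects @ A.T + 0.1 * rng.normal(size=(n_train, n_obs))

    # Ridge regression: W maps observation -> object estimate.
    lam = 1e-2
    W = np.linalg.solve(speckle.T @ speckle + lam * np.eye(n_obs),
                        speckle.T @ objects)

    x_true = rng.normal(size=n_obj)                  # unseen object
    y = A @ x_true + 0.1 * rng.normal(size=n_obs)    # strongly mixed observation
    x_hat = y @ W
    print("correlation:", np.corrcoef(x_true, x_hat)[0, 1])
    ```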

  15. A Novel Method for Characterization of Superconductors: Physical Measurements and Modeling of Thin Films

    NASA Technical Reports Server (NTRS)

    Kim, B. F.; Moorjani, K.; Phillips, T. E.; Adrian, F. J.; Bohandy, J.; Dolecek, Q. E.

    1993-01-01

    A method for characterization of granular superconducting thin films has been developed which encompasses both the morphological state of the sample and its fabrication process parameters. The broad scope of this technique is due to the synergism between experimental measurements and their interpretation using numerical simulation. Two novel technologies form the substance of this system: the magnetically modulated resistance method for characterizing superconductors; and a powerful new computer peripheral, the Parallel Information Processor card, which provides enhanced computing capability for PC computers. This enhancement allows PC computers to operate at speeds approaching that of supercomputers. This makes atomic scale simulations possible on low cost machines. The present development of this system involves the integration of these two technologies using mesoscale simulations of thin film growth. A future stage of development will incorporate atomic scale modeling.

  16. Methods to characterize environmental settings of stream and groundwater sampling sites for National Water-Quality Assessment

    USGS Publications Warehouse

    Nakagaki, Naomi; Hitt, Kerie J.; Price, Curtis V.; Falcone, James A.

    2012-01-01

    Characterization of natural and anthropogenic features that define the environmental settings of sampling sites for streams and groundwater, including drainage basins and groundwater study areas, is an essential component of water-quality and ecological investigations being conducted as part of the U.S. Geological Survey's National Water-Quality Assessment program. Quantitative characterization of environmental settings, combined with physical, chemical, and biological data collected at sampling sites, contributes to understanding the status of, and influences on, water-quality and ecological conditions. To support studies for the National Water-Quality Assessment program, a geographic information system (GIS) was used to develop a standard set of methods to consistently characterize the sites, drainage basins, and groundwater study areas across the nation. This report describes three methods used for characterization (simple overlay, area-weighted areal interpolation, and land-cover-weighted areal interpolation) and their appropriate applications to geographic analyses that have different objectives and data constraints. In addition, this document records the GIS thematic datasets that are used for the Program's national design and data analyses.
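
    Of the three methods, area-weighted areal interpolation is the easiest to show compactly. The sketch below uses the shapely library with invented rectangular geometries and attribute values; the Program's actual thematic datasets and GIS tooling are not represented.

    ```python
    from shapely.geometry import Polygon, box

    # Area-weighted areal interpolation: estimate a basin's attribute value as
    # the area-weighted mean of the source polygons it overlaps.
    sources = [
        (box(0, 0, 2, 2), 10.0),   # (polygon, attribute value)
        (box(2, 0, 4, 2), 30.0),
    ]
    basin = Polygon([(1, 0.5), (3, 0.5), (3, 1.5), (1, 1.5)])

    total_area = 0.0
    weighted_sum = 0.0
    for poly, value in sources:
        overlap = basin.intersection(poly).area   # shared area drives the weight
        weighted_sum += overlap * value
        total_area += overlap

    print("basin estimate:", weighted_sum / total_area)   # -> 20.0
    ```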

  17. "Master-Slave" Biological Network Alignment

    NASA Astrophysics Data System (ADS)

    Ferraro, Nicola; Palopoli, Luigi; Panni, Simona; Rombo, Simona E.

    Performing global alignment between protein-protein interaction (PPI) networks of different organisms is important to infer knowledge about conservation across species. Known methods that perform this task operate symmetrically, that is, they do not assign distinct roles to the input PPI networks. However, in most cases the input networks are in fact distinguishable on the basis of how well the corresponding organism is biologically characterized. For well-characterized organisms, the associated PPI network supposedly encodes in a sound manner all the information about their proteins and associated interactions, which is far from being the case for poorly characterized ones. Here we develop the new idea of a method for global alignment of PPI networks that exploits these differences in the characterization of the organisms at hand. The PPI network of the better-characterized organism (called the Master) is used as a fingerprint to guide the alignment process toward the second input network (called the Slave), so that the generated results preferably retain the structural characteristics of the Master network. We tested our method, showing that the results it returns are biologically relevant.

  18. Measuring information transfer in a soft robotic arm.

    PubMed

    Nakajima, K; Schmidt, N; Pfeifer, R

    2015-05-13

    Soft robots can exhibit diverse behaviors with simple types of actuation by partially outsourcing control to the morphological and material properties of their soft bodies, which is made possible by the tight coupling between control, body, and environment. In this paper, we present a method that will quantitatively characterize these diverse spatiotemporal dynamics of a soft body based on the information-theoretic approach. In particular, soft bodies have the ability to propagate the effect of actuation through the entire body, with a certain time delay, due to their elasticity. Our goal is to capture this delayed interaction in a quantitative manner based on a measure called momentary information transfer. We extend this measure to soft robotic applications and demonstrate its power using a physical soft robotic platform inspired by the octopus. Our approach is illustrated in two ways. First, we statistically characterize the delayed actuation propagation through the body as a strength of information transfer. Second, we capture this information propagation directly as local information dynamics. As a result, we show that our approach can successfully characterize the spatiotemporal dynamics of the soft robotic platform, explicitly visualizing how information transfers through the entire body with delays. Further extension scenarios of our approach are discussed for soft robotic applications in general.
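
    As a simplified relative of the momentary information transfer measure used here, the sketch below estimates a delayed transfer entropy between two synthetic time series with a histogram estimator; the coupling, delay, and binning are illustrative assumptions, not the paper's setup.

    ```python
    import numpy as np

    # Histogram estimate of delayed transfer entropy T_{X->Y}(tau), a simpler
    # relative of momentary information transfer. The coupled series are synthetic.
    rng = np.random.default_rng(2)
    T, tau = 20000, 3
    x = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(tau, T):
        y[t] = 0.6 * y[t - 1] + 0.8 * x[t - tau] + 0.3 * rng.normal()

    def transfer_entropy(x, y, tau, n_bins=6):
        def disc(v):  # discretize into roughly equiprobable bins
            return np.digitize(v, np.quantile(v, np.linspace(0, 1, n_bins + 1)[1:-1]))
        xd, yd = disc(x), disc(y)
        trip = np.stack([yd[tau:], yd[tau - 1:-1], xd[:-tau]], axis=1)
        def H(cols):  # joint entropy of the selected columns
            _, counts = np.unique(trip[:, cols], axis=0, return_counts=True)
            p = counts / counts.sum()
            return -(p * np.log(p)).sum()
        # TE = H(y_now, y_past) - H(y_past) - H(y_now, y_past, x_past) + H(y_past, x_past)
        return H([0, 1]) - H([1]) - H([0, 1, 2]) + H([1, 2])

    print("TE at true delay: ", transfer_entropy(x, y, tau))
    print("TE at wrong delay:", transfer_entropy(x, y, 1))
    ```

    Evaluated at the true delay, the estimate is clearly larger than at a wrong delay, which is the kind of signature used to localize delayed actuation propagation through a soft body.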

  19. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    NASA Astrophysics Data System (ADS)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

    With the increase in resolution, remote sensing images carry a larger information load, more noise, and more complex feature geometry and texture information, which makes the extraction of building information more difficult. To solve this problem, this paper designs a high-resolution remote sensing image building extraction method based on a Markov model. The method introduces Contourlet-domain map clustering and a Markov model, captures and enhances the contour and texture information of high-resolution remote sensing image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the building area. Through multi-scale segmentation and extraction of image features, fine extraction from the building area down to individual buildings is realized. Experiments show that, compared with traditional pixel-level image information extraction, this method suppresses the noise of high-resolution remote sensing images, reduces the interference of non-target ground texture information, removes shadow, vegetation, and other pseudo-building information, and performs better in building extraction precision, accuracy, and completeness.

  20. Integrated workflow for characterizing and modeling fracture network in unconventional reservoirs using microseismic data

    NASA Astrophysics Data System (ADS)

    Ayatollahy Tafti, Tayeb

    We develop a new method for integrating information and data from different sources, and we construct a comprehensive workflow for characterizing and modeling a fracture network in unconventional reservoirs using microseismic data. The methodology is based on a combination of several mathematical and artificial intelligence techniques, including geostatistics, fractal analysis, fuzzy logic, and neural networks. The study contributes to the scholarly knowledge base on the characterization and modeling of fractured reservoirs in several ways, including a versatile workflow with novel objective functions. Some of the characteristics of the methods are listed below:

    1. The new method is an effective fracture characterization procedure that estimates different fracture properties. Unlike existing methods, the new approach does not depend on the location of events and is able to integrate multi-scaled and diverse fracture information from different methodologies.

    2. It offers an improved procedure to create compressional and shear velocity models as a preamble for delineating anomalies and mapping structures of interest, and for correlating velocity anomalies with fracture swarms and other reservoir properties of interest.

    3. It offers an effective way to obtain the fractal dimension of microseismic events and to identify the pattern complexity, connectivity, and mechanism of the created fracture network.

    4. It offers an innovative method for monitoring fracture movement in different stages of stimulation that can be used to optimize the process.

    5. Our newly developed MDFN approach allows the creation of a discrete fracture network model using only microseismic data, with potential cost reduction. It also imposes fractal dimension as a constraint on other fracture modeling approaches, which increases the visual similarity between the modeled networks and the real network over the simulated volume.

  1. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael C.

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated no appreciable benefit over relatively few gears (e.g., up to four) used in optimal seasons. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to the monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  2. The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilch, Martin M.

    Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.

  3. The Use of Information and Communication Technology (ICT) as a Teaching Method in Vocational Education and Training in Tourism

    ERIC Educational Resources Information Center

    Mocanu, Elena Madalina; Deaconu, Alecxandrina

    2017-01-01

    Globalization and technological change that have characterized recent years have created a new global economy powered by technology, fueled by information and knowledge, with serious implications for the nature and purpose of education institutions. Effective integration of ICT into the education system is a complex, multilateral process that…

  4. User-Based Information Retrieval System Interface Evaluation: An Examination of an On-Line Public Access Catalog.

    ERIC Educational Resources Information Center

    Hert, Carol A.; Nilan, Michael S.

    1991-01-01

    Presents preliminary data that characterizes the relationship between what users say they are trying to accomplish when using an online public access catalog (OPAC) and their perceptions of what input to give the system. Human-machine interaction is discussed, and appropriate methods for evaluating information retrieval systems are considered. (18…

  5. Design and Optimization of Nanomaterials for Sensing Applications

    NASA Astrophysics Data System (ADS)

    Sanderson, Robert Noboru

    Nanomaterials, materials with one or more of their dimensions on the nanoscale, have emerged as an important field in the development of next-generation sensing systems. Their high surface-to-volume ratio makes them useful for sensing, but also makes them sensitive to processing defects and inherent material defects. To develop and optimize these systems, it is thus necessary to characterize these defects to understand their origin and how to work around them. Scanning probe microscopy (SPM) techniques like atomic force microscopy (AFM) and scanning tunneling microscopy (STM) are important characterization methods which can measure nanoscale topography and electronic structure. These methods are appealing in nanomaterial systems because they are non-damaging and provide local, high-resolution data, and so are capable of detecting nanoscale features such as single defect sites. There are difficulties, however, in the interpretation of SPM data. For instance, AFM-based methods are prone to experimental artifacts due to long-range interactions, such as capacitive crosstalk in Kelvin probe force microscopy (KPFM), and artifacts due to the finite size of the probe tip, such as incorrect surface tracking at steep topographical features. Mechanical characterization (via force spectroscopy) of nanomaterials with significant nanoscale variations, such as tethered lipid bilayer membranes (tLBMs), is also difficult since variations in the bulk system's mechanical behavior must be distinguished from local fluctuations. Additionally, interpretation of STM data is non-trivial due to local variations in electron density in addition to topographical variations. In this thesis we overcome some limitations of SPM methods by supplementing them with additional surface analytical methods as well as computational methods, and we characterize several nanomaterial systems. Current-carrying vapor-liquid-solid Si nanowires (useful for interdigitated-electrode-based sensors) are characterized using finite-element-method (FEM)-supplemented KPFM to retrieve useful information about processing defects, contact resistance, and the primary charge carriers. Next, a tLBM system's stiffness and its dependence on tethering-molecule concentration are measured using statistical analysis of thousands of AFM force spectra, demonstrating a biosensor-compatible system with a controllable bulk rigidity. Finally, we utilize surface analytical techniques to inform the development of a novel three-dimensional graphene system for sensing applications.

  6. Modeling and quantification of repolarization feature dependency on heart rate.

    PubMed

    Minchole, A; Zacur, E; Pueyo, E; Laguna, P

    2014-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Biosignal Interpretation: Advanced Methods for Studying Cardiovascular and Respiratory Systems". This work aims at providing an efficient method to estimate the parameters of a nonlinear model including memory, previously proposed to characterize rate adaptation of repolarization indices. The physiological restrictions on the model parameters have been included in the cost function in such a way that unconstrained optimization techniques such as descent optimization methods can be used for parameter estimation. The proposed method has been evaluated on electrocardiogram (ECG) recordings of healthy subjects performing a tilt test, where rate adaptation of QT and Tpeak-to-Tend (Tpe) intervals has been characterized. The proposed strategy results in an efficient methodology to characterize rate adaptation of repolarization features, improving the convergence time with respect to previous strategies. Moreover, the Tpe interval adapts faster to changes in heart rate than the QT interval. In this work an efficient estimation of the parameters of a model aimed at characterizing rate adaptation of repolarization features has been proposed. The Tpe interval has been shown to be rate related and to have a shorter memory lag than the QT interval.
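
    The trick of folding constraints into the cost so that unconstrained optimizers apply can be sketched generically. Below, a first-order exponential memory model with an admissible band on its time constant stands in for the paper's rate-adaptation model; the band, the synthetic data, and the model form are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hedged sketch of the penalty idea: fit qt_hat[t] = a + b * rr_smooth[t],
    # where rr_smooth is an exponential average of RR whose time constant tau
    # must stay in an assumed physiological band.
    rng = np.random.default_rng(3)
    t = np.arange(600)
    rr = 0.9 - 0.25 * (t > 200)                     # step change in heart rate (tilt-like)
    tau_true, a_true, b_true = 40.0, 0.22, 0.16

    def smooth(rr, tau):
        out = np.empty_like(rr)
        out[0] = rr[0]
        alpha = 1.0 / tau
        for i in range(1, len(rr)):                 # exponential memory of past RR
            out[i] = (1 - alpha) * out[i - 1] + alpha * rr[i]
        return out

    qt = a_true + b_true * smooth(rr, tau_true) + 0.002 * rng.normal(size=len(t))

    def cost(params, lam=1e3):
        a, b, tau = params
        # Physiological restriction folded into the cost: penalize tau outside
        # the admissible band, so an unconstrained optimizer can be used.
        penalty = max(0.0, 1.0 - tau) ** 2 + max(0.0, tau - 300.0) ** 2
        tau_c = min(max(tau, 1.0), 300.0)           # evaluate the model inside the band
        resid = qt - (a + b * smooth(rr, tau_c))
        return np.sum(resid ** 2) + lam * penalty

    fit = minimize(cost, x0=[0.2, 0.1, 10.0], method="Nelder-Mead")
    print("a, b, tau =", np.round(fit.x, 3))
    ```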

  7. Authorship attribution based on Life-Like Network Automata

    PubMed Central

    Machicao, Jeaneth; Corrêa, Edilson A.; Miranda, Gisele H. B.; Amancio, Diego R.

    2018-01-01

    Authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple, yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful to identify informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. Thus, we proposed a novel method to characterize text networks, by considering both topological and dynamical aspects of networks. Using concepts and methods from cellular automata theory, we devised a strategy to grasp informative spatio-temporal patterns from this model. Our experiments revealed an outperformance over structural analysis relying only on topological measurements, such as clustering coefficient, betweenness and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks. PMID:29566100
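
    A minimal version of the network-construction step (without the cellular-automata dynamics the paper adds on top) could look like this; the sample text, window size, and the networkx measurements are illustrative choices, not the authors' pipeline.

    ```python
    import networkx as nx

    # Build a word co-occurrence network from a text and read off some of the
    # topological measurements discussed above.
    text = ("the authorship of a disputed document can be inferred from the "
            "network of words the author tends to use together").split()

    G = nx.Graph()
    window = 2
    for i, w in enumerate(text):
        for u in text[i + 1:i + 1 + window]:   # link words that co-occur nearby
            if u != w:
                G.add_edge(w, u)

    features = {
        "nodes": G.number_of_nodes(),
        "clustering": nx.average_clustering(G),
        "avg_shortest_path": nx.average_shortest_path_length(G),
    }
    print(features)
    ```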

  8. 1998 report on Hanford Site land disposal restrictions for mixed waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, D.G.

    1998-04-10

    This report was submitted to meet the requirements of Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) Milestone M-26-01H. This milestone requires the preparation of an annual report that covers characterization, treatment, storage, minimization, and other aspects of managing land-disposal-restricted mixed waste at the Hanford Facility. The US Department of Energy, its predecessors, and contractors on the Hanford Facility were involved in the production and purification of nuclear defense materials from the early 1940s to the late 1980s. These production activities have generated large quantities of liquid and solid mixed waste. This waste is regulated under authority of both the Resource Conservation and Recovery Act of 1976 and the Atomic Energy Act of 1954. This report covers only mixed waste. The Washington State Department of Ecology, US Environmental Protection Agency, and US Department of Energy have entered into the Tri-Party Agreement to bring the Hanford Facility operations into compliance with dangerous waste regulations. The Tri-Party Agreement required development of the original land disposal restrictions (LDR) plan and its annual updates to comply with LDR requirements for mixed waste. This report is the eighth update of the plan first issued in 1990. The Tri-Party Agreement requires, and the baseline plan and annual update reports provide, the following information: (1) Waste Characterization Information -- Provides information about characterizing each LDR mixed waste stream. The sampling and analysis methods and protocols, past characterization results, and, where available, a schedule for providing the characterization information are discussed. (2) Storage Data -- Identifies and describes the mixed waste on the Hanford Facility. Storage data include the Resource Conservation and Recovery Act of 1976 dangerous waste codes, generator process knowledge needed to identify the waste and to make LDR determinations, quantities stored, generation rates, location and method of storage, an assessment of storage-unit compliance status, storage capacity, and the bases and assumptions used in making the estimates.

  9. GPR impedance inversion for imaging and characterization of buried archaeological remains: A case study at Mudu city site in Suzhou, China

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Shi, Zhanjie; Wang, Bangbing; Yu, Tianxiang

    2018-01-01

    As a method with high resolution, GPR has been used extensively in archaeological surveys. However, a conventional GPR profile provides only limited geometric information, such as the shape or location of interfaces, and cannot give the distribution of physical properties that could help identify historical remains more directly. A common way for GPR to map parameter distributions is common-midpoint velocity analysis, but it provides limited resolution. Another research hotspot, full-waveform inversion, is unstable and relatively dependent on the initial model. Coring gives direct information at the drilling site, but accurate results are limited to a few boreholes. In this paper, we propose a new scheme to enhance imaging and characterization of archaeological targets by fusing GPR and coring data. The scheme mainly involves the impedance inversion of conventional common-offset GPR data, which uses well logs to compensate the GPR data and finally obtains a high-resolution estimate of permittivity. The core analysis results also contribute to the interpretation of the inversion result. To test this method, we carried out a case study at the Mudu city site in Suzhou, China. The results provide clear images of the ancient city's moat and wall subsurface and improve the characterization of archaeological targets. It is shown that this method is effective and feasible for archaeological exploration.

  10. Principle, system, and applications of tip-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhang, MingQian; Wang, Rui; Wu, XiaoBin; Wang, Jia

    2012-08-01

    Raman spectroscopy is a powerful technique for chemical information characterization. However, this spectral method faces two obstacles in nano-material detection. One is diffraction-limited spatial resolution, and the other is the inherently small Raman cross section and weak signal. To resolve these problems, a new approach has been developed, denoted tip-enhanced Raman spectroscopy (TERS). TERS is capable of high-resolution and high-sensitivity detection and has been demonstrated to be a promising spectroscopic and micro-topographic method for characterizing nano-materials and nanostructures. In this paper, the principle and experimental system of TERS are discussed. The latest applications of TERS in molecule detection, biological specimen identification, nano-material characterization, and semiconductor material determination are presented with specific experimental examples.

  11. Decoding the dynamics of cellular metabolism and the action of 3-bromopyruvate and 2-deoxyglucose using pulsed stable isotope-resolved metabolomics.

    PubMed

    Pietzke, Matthias; Zasada, Christin; Mudrich, Susann; Kempa, Stefan

    2014-01-01

    Cellular metabolism is highly dynamic and continuously adjusts to the physiological program of the cell. The regulation of metabolism appears at all biological levels: (post-) transcriptional, (post-) translational, and allosteric. This regulatory information is expressed in the metabolome, but in a complex manner. To decode such complex information, new methods are needed in order to facilitate dynamic metabolic characterization at high resolution. Here, we describe pulsed stable isotope-resolved metabolomics (pSIRM) as a tool for the dynamic metabolic characterization of cellular metabolism. We have adapted gas chromatography-coupled mass spectrometric methods for metabolomic profiling and stable isotope-resolved metabolomics. In addition, we have improved robustness and reproducibility and implemented a strategy for the absolute quantification of metabolites. By way of examples, we have applied this methodology to characterize central carbon metabolism of a panel of cancer cell lines and to determine the mode of metabolic inhibition of glycolytic inhibitors in times ranging from minutes to hours. Using pSIRM, we observed that 2-deoxyglucose is a metabolic inhibitor, but does not directly act on the glycolytic cascade.

  12. Common Effects Methodology for Pesticides

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  13. A Method to Quantify Visual Information Processing in Children Using Eye Tracking

    PubMed Central

    Kooiker, Marlou J.G.; Pel, Johan J.M.; van der Steen-Kant, Sanny P.; van der Steen, Johannes

    2016-01-01

    Visual problems that occur early in life can have major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently to enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii) construct an individual visual profile for each child. PMID:27500922
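
    Two of the reported parameters can be computed from a gaze stream in a few lines. The sketch below assumes an invented 60 Hz sampling rate, normalized screen coordinates, and a stimulus in the top-right quadrant; it is a toy computation, not the published pipeline.

    ```python
    import numpy as np

    # Toy computation of reaction time to a stimulus and fixation duration
    # inside its quadrant, from synthetic gaze samples.
    FS = 60.0                                   # sampling rate (Hz), assumed
    stimulus_quadrant = (0.5, 1.0, 0.5, 1.0)    # x_min, x_max, y_min, y_max

    # Normalized gaze positions; the child looks at the stimulus from sample 30 on.
    gaze = np.vstack([np.random.default_rng(4).uniform(0, 0.5, (30, 2)),
                      np.full((45, 2), 0.75)])

    x0, x1, y0, y1 = stimulus_quadrant
    on_target = ((gaze[:, 0] >= x0) & (gaze[:, 0] < x1) &
                 (gaze[:, 1] >= y0) & (gaze[:, 1] < y1))

    first = int(np.argmax(on_target))           # first sample inside the quadrant
    reaction_time_ms = 1000.0 * first / FS
    fixation_ms = 1000.0 * on_target[first:].sum() / FS

    print(f"reaction time: {reaction_time_ms:.0f} ms, fixation: {fixation_ms:.0f} ms")
    ```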

  14. A Method to Quantify Visual Information Processing in Children Using Eye Tracking.

    PubMed

    Kooiker, Marlou J G; Pel, Johan J M; van der Steen-Kant, Sanny P; van der Steen, Johannes

    2016-07-09

    Visual problems that occur early in life can have major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently to enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii) construct an individual visual profile for each child.

  15. Automated Weld Characterization Using the Thermoelectric Method

    NASA Technical Reports Server (NTRS)

    Fulton, J. P.; Wincheski, B.; Namkung, M.

    1992-01-01

    The effective assessment of the integrity of welds is a complicated NDE problem that continues to be a challenge. To be able to completely characterize a weld, detailed knowledge of its tensile strength, ductility, hardness, microstructure, macrostructure, and chemical composition is needed. NDE techniques which can provide information on any of these features are extremely important. In this paper, we examine a seldom-used approach based on the thermoelectric (TE) effect for characterizing welds and their associated heat affected zone (HAZ). The thermoelectric method monitors the thermoelectric power which is sensitive to small changes in the kinetics of the conduction electrons near the Fermi surface that can be caused by changes in the local microstructure. The technique has been applied to metal sorting, quality testing, flaw detection, thickness gauging of layers, and microscopic structural analysis. To demonstrate the effectiveness of the technique for characterizing welds, a series of tungsten-inert-gas welded Inconel-718 samples were scanned with a computer controlled TE probe. The samples were then analyzed using a scanning electron microscope and Rockwell hardness tests to characterize the weld and the associated HAZ. We then correlated the results with the TE measurements to provide quantitative information on the size of the HAZ and the degree of hardness of the material in the weld region. This provides potentially valuable information on the strength and fatigue life of the weld. We begin the paper by providing a brief review of the TE technique and then highlight some of the factors that can affect the measurements. Next, we provide an overview of the experimental procedure and discuss the results. Finally, we summarize our findings and consider areas for future research.

  16. Processes in construction of failure management expert systems from device design information

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Lance, Nick

    1987-01-01

    This paper analyzes the tasks and problem-solving methods used by an engineer in constructing a failure management expert system from design information about the device to be diagnosed. An expert test engineer developed a trouble-shooting expert system based on device design information and experience with similar devices, rather than on specific expert knowledge gained from operating the device or troubleshooting its failures. The construction of the expert system was intensively observed and analyzed. This paper characterizes the knowledge, tasks, methods, and design decisions involved in constructing this type of expert system, and makes recommendations concerning tools for aiding and automating construction of such systems.

  17. Airborne and Ground-Based Optical Characterization of Legacy Underground Nuclear Test Sites

    NASA Astrophysics Data System (ADS)

    Vigil, S.; Craven, J.; Anderson, D.; Dzur, R.; Schultz-Fellenz, E. S.; Sussman, A. J.

    2015-12-01

    Detecting, locating, and characterizing suspected underground nuclear test sites is a U.S. security priority. Currently, global underground nuclear explosion monitoring relies on seismic and infrasound sensor networks to provide rapid initial detection of potential underground nuclear tests. While seismic and infrasound might be able to generally locate potential underground nuclear tests, additional sensing methods might be required to further pinpoint test site locations. Optical remote sensing is a robust approach for site location and characterization due to the ability it provides to search large areas relatively quickly, resolve surface features in fine detail, and perform these tasks non-intrusively. Optical remote sensing provides both cultural and surface geological information about a site, for example, operational infrastructure and surface fractures. Surface geological information, when combined with known or estimated subsurface geologic information, could provide clues concerning test parameters. We have characterized two legacy nuclear test sites on the Nevada National Security Site (NNSS), U20ak and U20az, using helicopter-, ground-, and unmanned aerial system-based RGB imagery and light detection and ranging (lidar) systems. The multi-faceted information garnered from these different sensing modalities has allowed us to build a knowledge base of how a nuclear test site might look when sensed remotely, and of the standoff distances required to resolve important site characteristics.

  18. Material characterization in partially filled waveguides using inverse scattering and multiple sample orientations

    NASA Astrophysics Data System (ADS)

    Sjöberg, Daniel; Larsson, Christer

    2015-06-01

    We present a method aimed at reducing uncertainties and instabilities when characterizing materials in waveguide setups. The method is based on measuring the S parameters for three different orientations of a rectangular sample block in a rectangular waveguide. The corresponding geometries are modeled in a commercial full-wave simulation program, taking any material parameters as input. The material parameters of the sample are found by minimizing the squared distance between measured and calculated S parameters. The information added by the different sample orientations is quantified using the Cramér-Rao lower bound. The flexibility of the method allows the determination of material parameters of an arbitrarily shaped sample that fits in the waveguide.
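
    The fitting step can be sketched as a least-squares problem over the stacked S parameters of the three orientations. The closed-form forward model below is a toy stand-in for the commercial full-wave solver used in the paper; parameter names, values, and the noise level are invented.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Skeleton of the fit: find material parameters minimizing the squared
    # distance between measured and modeled S parameters over three orientations.
    def s_model(eps_r, tan_d, orientation):
        # Illustrative closed-form response; real work would call a solver here.
        k = eps_r * (1 - 1j * tan_d) * (1 + 0.2 * orientation)
        s11 = (1 - k) / (1 + k)
        return np.array([s11.real, s11.imag])

    true = (4.2, 0.02)
    orientations = [0, 1, 2]
    measured = np.concatenate([s_model(*true, o) for o in orientations])
    measured += 0.005 * np.random.default_rng(5).normal(size=measured.size)

    def residuals(p):
        eps_r, tan_d = p
        model = np.concatenate([s_model(eps_r, tan_d, o) for o in orientations])
        return model - measured

    fit = least_squares(residuals, x0=[2.0, 0.1], bounds=([1.0, 0.0], [20.0, 1.0]))
    print("eps_r, tan_d =", np.round(fit.x, 3))
    ```

    Stacking the orientations enlarges the residual vector, which is the mechanism by which the extra measurements tighten the Cramér-Rao bound on the estimated parameters.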

  19. Three-Dimensional Bayesian Geostatistical Aquifer Characterization at the Hanford 300 Area using Tracer Test Data

    NASA Astrophysics Data System (ADS)

    Chen, X.; Murakami, H.; Hahn, M. S.; Hammond, G. E.; Rockhold, M. L.; Rubin, Y.

    2010-12-01

    Tracer testing under natural or forced gradient flow provides useful information for characterizing subsurface properties, by monitoring and modeling the tracer plume migration in a heterogeneous aquifer. At the Hanford 300 Area, non-reactive tracer experiments, in addition to constant-rate injection tests and electromagnetic borehole flowmeter (EBF) profiling, were conducted to characterize the heterogeneous hydraulic conductivity field. A Bayesian data assimilation technique, method of anchored distributions (MAD), is applied to assimilate the experimental tracer test data and to infer the three-dimensional heterogeneous structure of the hydraulic conductivity in the saturated zone of the Hanford formation. In this study, the prior information of the underlying random hydraulic conductivity field was obtained from previous field characterization efforts using the constant-rate injection tests and the EBF data. The posterior distribution of the random field is obtained by further conditioning the field on the temporal moments of tracer breakthrough curves at various observation wells. The parallel three-dimensional flow and transport code PFLOTRAN is implemented to cope with the highly transient flow boundary conditions at the site and to meet the computational demand of the proposed method. The validation results show that the field conditioned on the tracer test data better reproduces the tracer transport behavior compared to the field characterized previously without the tracer test data. A synthetic study proves that the proposed method can effectively assimilate tracer test data to capture the essential spatial heterogeneity of the three-dimensional hydraulic conductivity field. These characterization results will improve conceptual models developed for the site, including reactive transport models. The study successfully demonstrates the capability of MAD to assimilate multi-scale multi-type field data within a consistent Bayesian framework. The MAD framework can potentially be applied to combine geophysical data with other types of data in site characterization.
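
    The temporal moments assimilated in such a framework are simple integrals of the breakthrough curve. A minimal computation on a synthetic curve (the Gaussian pulse below is illustrative, not site data):

    ```python
    import numpy as np

    # Temporal moments of a tracer breakthrough curve.
    t = np.linspace(0, 50, 501)                     # time (h)
    c = np.exp(-(t - 18.0) ** 2 / (2 * 4.0 ** 2))   # concentration at a well
    dt = t[1] - t[0]

    m0 = (c * dt).sum()                             # zeroth moment: mass proxy
    m1 = (t * c * dt).sum() / m0                    # first moment: mean arrival time
    m2 = ((t - m1) ** 2 * c * dt).sum() / m0        # second central moment: spreading

    print(f"mean arrival {m1:.1f} h, variance {m2:.1f} h^2")
    ```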

  20. Cartographic and geodetic methods to characterize the potential landing sites for the future Russian missions Luna-Glob and Luna-Resurs

    NASA Astrophysics Data System (ADS)

    Karachevtseva, I. P.; Kokhanov, A. A.; Konopikhin, A. A.; Nadezhdina, I. E.; Zubarev, A. E.; Patratiy, V. D.; Kozlova, N. A.; Uchaev, D. V.; Uchaev, Dm. V.; Malinnikov, V. A.; Oberst, J.

    2015-04-01

    Characterization of the potential landing sites for the planned Luna-Glob and Luna-Resurs Russian missions requires cartographic and geodetic support prepared with special methods and techniques that are briefly overviewed here. The data used in the analysis, including the digital terrain models (DTMs) and the orthoimages acquired in the survey carried out from the Lunar Reconnaissance Orbiter and Kaguya spacecraft, are described and evaluated. By way of illustration, different regions of the lunar surface, including the subpolar regions of the Moon, are characterized with the suggested methods and the GIS-technologies. The development of the information support for the future lunar missions started in 2011, and it is now carried on in MIIGAiK Extraterrestrial Laboratory (MExLab), which is a department of the Moscow State University of Geodesy and Cartography (MIIGAiK).

  1. Intact glycopeptide characterization using mass spectrometry.

    PubMed

    Cao, Li; Qu, Yi; Zhang, Zhaorui; Wang, Zhe; Prytkova, Iya; Wu, Si

    2016-05-01

    Glycosylation is one of the most prominent and extensively studied protein post-translational modifications. However, traditional proteomic studies at the peptide level (bottom-up) rarely characterize intact glycopeptides (glycosylated peptides without removing glycans), so no glycoprotein heterogeneity information is retained. Intact glycopeptide characterization, on the other hand, provides opportunities to simultaneously elucidate the glycan structure and the glycosylation site needed to reveal the actual biological function of protein glycosylation. Recently, significant improvements have been made in the characterization of intact glycopeptides, ranging from enrichment and separation, to mass spectrometry (MS) detection, to bioinformatics analysis. In this review, we recapitulate currently available intact glycopeptide characterization methods with respect to their advantages and limitations as well as their potential applications.

  2. Characterization of coronary plaque regions in intravascular ultrasound images using a hybrid ensemble classifier.

    PubMed

    Hwang, Yoo Na; Lee, Ju Hwan; Kim, Ga Young; Shin, Eun Seok; Kim, Sung Min

    2018-01-01

    The purpose of this study was to propose a hybrid ensemble classifier to characterize coronary plaque regions in intravascular ultrasound (IVUS) images. Pixels were allocated to one of four tissues (fibrous tissue (FT), fibro-fatty tissue (FFT), necrotic core (NC), and dense calcium (DC)) through processes of border segmentation, feature extraction, feature selection, and classification. Grayscale IVUS images and their corresponding virtual histology images were acquired from 11 patients with known or suspected coronary artery disease using a 20 MHz catheter. A total of 102 hybrid textural features including first order statistics (FOS), gray level co-occurrence matrix (GLCM), extended gray level run-length matrix (GLRLM), Laws, local binary pattern (LBP), intensity, and discrete wavelet features (DWF) were extracted from IVUS images. To select optimal feature sets, a genetic algorithm was implemented. A hybrid ensemble classifier based on histogram and texture information was then used for plaque characterization, with the optimal feature set as its input. After tissue characterization, parameters including sensitivity, specificity, and accuracy were calculated to validate the proposed approach. A ten-fold cross-validation approach was used to determine the statistical significance of the proposed method. Our experimental results showed that the proposed method had reliable performance for tissue characterization in IVUS images. The hybrid ensemble classification method outperformed other existing methods by achieving characterization accuracy of 81% for FFT and 75% for NC. In addition, this study showed that Laws features (SSV and SAV) were key indicators for coronary tissue characterization. The proposed method has high clinical applicability for image-based tissue characterization.
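
    A drastically reduced version of the pipeline, with two first-order statistics plus one GLCM feature and a small voting ensemble in place of the 102 hybrid features and the paper's histogram/texture ensemble, might look like this (synthetic patches and labels, invented base learners):

    ```python
    import numpy as np
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(6)

    def glcm_contrast(patch, levels=8):
        # Quantize the patch, accumulate horizontal neighbour pairs, and
        # compute the GLCM contrast statistic.
        q = np.digitize(patch, np.linspace(patch.min(), patch.max(), levels)[1:-1])
        glcm = np.zeros((levels, levels))
        for i in range(q.shape[0]):
            for j in range(q.shape[1] - 1):
                glcm[q[i, j], q[i, j + 1]] += 1
        glcm /= glcm.sum()
        idx = np.arange(levels)
        return float(((idx[:, None] - idx[None, :]) ** 2 * glcm).sum())

    def features(patch):
        return [patch.mean(), patch.std(), glcm_contrast(patch)]

    # Synthetic "tissue" patches: class 1 is brighter and more textured.
    X, y = [], []
    for label in (0, 1):
        for _ in range(100):
            patch = rng.normal(0.3 + 0.3 * label, 0.05 + 0.1 * label, (16, 16))
            X.append(features(patch))
            y.append(label)
    X, y = np.array(X), np.array(y)

    clf = VotingClassifier([("lr", LogisticRegression(max_iter=1000)),
                            ("dt", DecisionTreeClassifier(max_depth=3))])
    clf.fit(X[::2], y[::2])                      # even rows train, odd rows test
    print("test accuracy:", clf.score(X[1::2], y[1::2]))
    ```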

  3. Method for detecting damage in carbon-fibre reinforced plastic-steel structures based on eddy current pulsed thermography

    NASA Astrophysics Data System (ADS)

    Li, Xuan; Liu, Zhiping; Jiang, Xiaoli; Lodewijks, Gabrol

    2018-01-01

    Eddy current pulsed thermography (ECPT) is well established for non-destructive testing of electrically conductive materials, featuring the advantages of contactless operation, intuitive detection, and efficient heating. The concept of divergence characterization of the damage rate of carbon fibre-reinforced plastic (CFRP)-steel structures can be extended to ECPT thermal pattern characterization. It was found in this study that the use of ECPT technology on CFRP-steel structures generated a sizeable amount of valuable information for comprehensive material diagnostics. The relationship between divergence and transient thermal patterns can be identified and analysed by deploying mathematical models to analyse the information about fibre texture, such as orientations, gaps, and undulations, in these multi-layered materials. The developed algorithm enabled the removal of information about fibre texture and the extraction of damage features. The model of the CFRP-glue-steel structures with damage was established using COMSOL Multiphysics® software, and a quantitative non-destructive damage evaluation from the ECPT image areas was derived. The results of this proposed method illustrate that damaged areas are highly affected by the available information about fibre texture. The proposed approach can be applied for the detection of impact-induced damage and the quantitative evaluation of CFRP structures.

  4. Model-Data Fusion and Adaptive Sensing for Large Scale Systems: Applications to Atmospheric Release Incidents

    NASA Astrophysics Data System (ADS)

    Madankan, Reza

    All across the world, toxic material clouds emitted from sources such as industrial plants, vehicular traffic, and volcanic eruptions can contain chemical, biological, or radiological material. With the growing fear of natural, accidental, or deliberate release of toxic agents, there is tremendous interest in precise source characterization and in generating accurate hazard maps of toxic material dispersion for appropriate disaster management. In this dissertation, an end-to-end framework has been developed for probabilistic source characterization and forecasting of atmospheric release incidents. The proposed methodology consists of three major components which are combined to perform the task of source characterization and forecasting. These components include Uncertainty Quantification, Optimal Information Collection, and Data Assimilation. Precise approximation of prior statistics is crucial to ensure the performance of the source characterization process. In this work, an efficient quadrature-based method has been utilized for the quantification of uncertainty in plume dispersion models that are subject to uncertain source parameters. In addition, a fast and accurate approach is utilized for the approximation of probabilistic hazard maps, based on a combination of polynomial chaos theory and the method of quadrature points. Besides precise quantification of uncertainty, having useful measurement data is also highly important to warrant accurate source parameter estimation. The performance of source characterization is highly affected by the sensor orientation applied for data observation. Hence, a general framework has been developed for the optimal allocation of data observation sensors, to improve the performance of the source characterization process. The key goal of this framework is to optimally locate a set of mobile sensors such that the measurement of better data is guaranteed. This is achieved by maximizing the mutual information between model predictions and observed data, given a set of kinetic constraints on the mobile sensors. The dynamic programming method has been utilized to solve the resulting optimal control problem. To complete the loop of the source characterization process, two different estimation techniques, a minimum variance estimation framework and a Bayesian inference method, have been developed to fuse model forecasts with measurement data. Incomplete information regarding the distribution of the noise associated with measurement data is another major challenge in the source characterization of plume dispersion incidents. This frequently happens in the assimilation of atmospheric data from satellite imagery, because satellite imagery data can be polluted with noise depending on weather conditions, clouds, humidity, etc. Unfortunately, there is no accurate procedure to quantify the error in recorded satellite data, so using classical data assimilation methods in this situation is not straightforward. In this dissertation, the basic idea of a novel approach is proposed to tackle these types of real-world problems with more accuracy and robustness. A simple example demonstrating the real-world scenario is presented to validate the developed methodology.
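
    The sensor-allocation idea can be sketched under a linear-Gaussian surrogate, where the mutual information has a closed form and greedy selection is straightforward. The observation operator, prior, and noise level below are invented, and the dissertation's kinematic constraints and dynamic programming solution are omitted.

    ```python
    import numpy as np

    # Greedy sensor selection maximizing mutual information between source
    # parameters theta and observations y_s = H[s] @ theta + noise.
    rng = np.random.default_rng(7)
    n_params, n_sites, budget = 4, 30, 3
    H = rng.normal(size=(n_sites, n_params))        # sensitivity of each site
    prior_cov = np.eye(n_params)
    noise_var = 0.1

    def mutual_info(selected):
        # I(theta; y_S) = 0.5 * log det(I + H_S Sigma H_S^T / noise_var)
        Hs = H[selected]
        gram = np.eye(len(selected)) + Hs @ prior_cov @ Hs.T / noise_var
        return 0.5 * np.linalg.slogdet(gram)[1]

    chosen = []
    for _ in range(budget):
        gains = [(mutual_info(chosen + [s]), s)
                 for s in range(n_sites) if s not in chosen]
        best = max(gains)[1]                        # site with the largest MI gain
        chosen.append(best)
        print(f"pick site {best:2d}, MI = {mutual_info(chosen):.3f}")
    ```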

  5. Molecular characterization of multivalent bioconjugates by size-exclusion chromatography with multiangle laser light scattering.

    PubMed

    Pollock, Jacob F; Ashton, Randolph S; Rode, Nikhil A; Schaffer, David V; Healy, Kevin E

    2012-09-19

    The degree of substitution and valency of bioconjugate reaction products are often poorly judged or require multiple time- and product-consuming chemical characterization methods. These aspects become critical when analyzing and optimizing the potency of costly polyvalent bioactive conjugates. In this study, size-exclusion chromatography with multiangle laser light scattering was paired with refractive index detection and ultraviolet spectroscopy (SEC-MALS-RI-UV) to characterize the reaction efficiency, degree of substitution, and valency of the products of conjugation of either peptides or proteins to a biopolymer scaffold, i.e., hyaluronic acid (HyA). Molecular characterization was more complete compared to estimates from a protein quantification assay, and exploitation of this method led to more accurate deduction of the molecular structures of polymer bioconjugates. Information obtained using this technique can improve macromolecular engineering design principles and help to better understand multivalent macromolecular interactions in biological systems.

  6. Molecular characterization of multivalent bioconjugates by size-exclusion chromatography (SEC) with multi-angle laser light scattering (MALS)

    PubMed Central

    Pollock, Jacob F.; Ashton, Randolph S.; Rode, Nikhil A.; Schaffer, David V.; Healy, Kevin E.

    2013-01-01

    The degree of substitution and valency of bioconjugate reaction products are often poorly judged or require multiple time- and product- consuming chemical characterization methods. These aspects become critical when analyzing and optimizing the potency of costly polyvalent bioactive conjugates. In this study, size-exclusion chromatography with multi-angle laser light scattering was paired with refractive index detection and ultraviolet spectroscopy (SEC-MALS-RI-UV) to characterize the reaction efficiency, degree of substitution, and valency of the products of conjugation of either peptides or proteins to a biopolymer scaffold, i.e., hyaluronic acid (HyA). Molecular characterization was more complete compared to estimates from a protein quantification assay, and exploitation of this method led to more accurate deduction of the molecular structures of polymer bioconjugates. Information obtained using this technique can improve macromolecular engineering design principles and better understand multivalent macromolecular interactions in biological systems. PMID:22794081

  7. Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.

    2017-11-01

    This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
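
    A minimal sketch of the idea, under stated simplifications: discretize an Itô SDE that is ergodic for the Bayesian posterior and use its trajectory as MCMC samples. For brevity this uses an explicit Euler-Maruyama step on a one-dimensional standard-normal posterior; the paper itself uses an implicit Euler discretization.

```python
import numpy as np

# Unadjusted Langevin sampler: dX = -grad U(X) dt + sqrt(2) dW has invariant
# density p(x) proportional to exp(-U(x)); here U(x) = x^2/2 (standard normal).
def grad_U(x):
    return x

rng = np.random.default_rng(0)
dt, n_steps = 1e-2, 200_000
x = 0.0
samples = np.empty(n_steps)
for i in range(n_steps):
    x += -grad_U(x) * dt + np.sqrt(2 * dt) * rng.standard_normal()
    samples[i] = x

# Should be close to 0 and 1, up to discretization bias.
print(samples.mean(), samples.var())
```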

  8. Optimization-based methods for road image registration

    DOT National Transportation Integrated Search

    2008-02-01

    A number of transportation agencies are now relying on direct imaging for monitoring and cataloguing the state of their roadway systems. Images provide objective information to characterize the pavement as well as roadside hardware. The tasks of proc...

  9. Optical stress generator and detector

    DOEpatents

    Maris, Humphrey J.; Stoner, Robert J

    2001-01-01

    Disclosed is a system for the characterization of thin films and interfaces between thin films through measurements of their mechanical and thermal properties. In the system light is absorbed in a thin film or in a structure made up of several thin films, and the change in optical transmission or reflection is measured and analyzed. The change in reflection or transmission is used to give information about the ultrasonic waves that are produced in the structure. The information that is obtained from the use of the measurement methods and apparatus of this invention can include: (a) a determination of the thickness of thin films with a speed and accuracy that is improved compared to earlier methods; (b) a determination of the thermal, elastic, and optical properties of thin films; (c) a determination of the stress in thin films; and (d) a characterization of the properties of interfaces, including the presence of roughness and defects.

  10. Optical stress generator and detector

    DOEpatents

    Maris, Humphrey J.; Stoner, Robert J.

    1998-01-01

    Disclosed is a system for the characterization of thin films and interfaces between thin films through measurements of their mechanical and thermal properties. In the system light is absorbed in a thin film or in a structure made up of several thin films, and the change in optical transmission or reflection is measured and analyzed. The change in reflection or transmission is used to give information about the ultrasonic waves that are produced in the structure. The information that is obtained from the use of the measurement methods and apparatus of this invention can include: (a) a determination of the thickness of thin films with a speed and accuracy that is improved compared to earlier methods; (b) a determination of the thermal, elastic, and optical properties of thin films; (c) a determination of the stress in thin films; and (d) a characterization of the properties of interfaces, including the presence of roughness and defects.

  11. Optical stress generator and detector

    DOEpatents

    Maris, H.J.; Stoner, R.J.

    1998-05-05

    Disclosed is a system for the characterization of thin films and interfaces between thin films through measurements of their mechanical and thermal properties. In the system light is absorbed in a thin film or in a structure made up of several thin films, and the change in optical transmission or reflection is measured and analyzed. The change in reflection or transmission is used to give information about the ultrasonic waves that are produced in the structure. The information that is obtained from the use of the measurement methods and apparatus of this invention can include: (a) a determination of the thickness of thin films with a speed and accuracy that is improved compared to earlier methods; (b) a determination of the thermal, elastic, and optical properties of thin films; (c) a determination of the stress in thin films; and (d) a characterization of the properties of interfaces, including the presence of roughness and defects. 32 figs.

  12. Optical stress generator and detector

    DOEpatents

    Maris, Humphrey J.; Stoner, Robert J

    2002-01-01

    Disclosed is a system for the characterization of thin films and interfaces between thin films through measurements of their mechanical and thermal properties. In the system light is absorbed in a thin film or in a structure made up of several thin films, and the change in optical transmission or reflection is measured and analyzed. The change in reflection or transmission is used to give information about the ultrasonic waves that are produced in the structure. The information that is obtained from the use of the measurement methods and apparatus of this invention can include: (a) a determination of the thickness of thin films with a speed and accuracy that is improved compared to earlier methods; (b) a determination of the thermal, elastic, and optical properties of thin films; (c) a determination of the stress in thin films; and (d) a characterization of the properties of interfaces, including the presence of roughness and defects.

  13. Optical stress generator and detector

    DOEpatents

    Maris, Humphrey J.; Stoner, Robert J

    1999-01-01

    Disclosed is a system for the characterization of thin films and interfaces between thin films through measurements of their mechanical and thermal properties. In the system light is absorbed in a thin film or in a structure made up of several thin films, and the change in optical transmission or reflection is measured and analyzed. The change in reflection or transmission is used to give information about the ultrasonic waves that are produced in the structure. The information that is obtained from the use of the measurement methods and apparatus of this invention can include: (a) a determination of the thickness of thin films with a speed and accuracy that is improved compared to earlier methods; (b) a determination of the thermal, elastic, and optical properties of thin films; (c) a determination of the stress in thin films; and (d) a characterization of the properties of interfaces, including the presence of roughness and defects.

  14. Practical characterization of quantum devices without tomography

    NASA Astrophysics Data System (ADS)

    Landon-Cardinal, Olivier; Flammia, Steven; Silva, Marcus; Liu, Yi-Kai; Poulin, David

    2012-02-01

    Quantum tomography is the main method used to assess the quality of quantum information processing devices, but its complexity presents a major obstacle for the characterization of even moderately large systems. Part of the reason for this complexity is that tomography generates much more information than is usually sought. Taking a more targeted approach, we develop schemes that enable (i) estimating the fidelity of an experiment to a theoretical ideal description, (ii) learning which description within a reduced subset best matches the experimental data. Both these approaches yield a significant reduction in resources compared to tomography. In particular, we show how to estimate the fidelity between a predicted pure state and an arbitrary experimental state using only a constant number of Pauli expectation values selected at random according to an importance-weighting rule. In addition, we propose methods for certifying quantum circuits and learning continuous-time quantum dynamics that are described by local Hamiltonians or Lindbladians.
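
    A minimal sketch of the importance-weighted Pauli sampling behind such a fidelity estimator, with an illustrative two-qubit target state and a synthetic "experimental" density matrix (both assumptions, not the paper's data):

```python
import itertools
import math
import numpy as np

# Direct fidelity estimation: F = sum_k x_psi(k) * x_rho(k), where
# x(k) = Tr(rho P_k)/sqrt(d). Sample Pauli operators P_k with probability
# x_psi(k)^2 and average x_rho(k)/x_psi(k).
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

n = 2
d = 2 ** n
paulis = []
for ops in itertools.product([I2, X, Y, Z], repeat=n):
    P = np.array([[1.0 + 0j]])
    for op in ops:
        P = np.kron(P, op)
    paulis.append(P)

psi = np.zeros(d); psi[0] = psi[-1] = 1 / math.sqrt(2)      # target pure state
rho = 0.9 * np.outer(psi, psi) + 0.1 * np.eye(d) / d        # mock experiment

x_psi = np.array([np.real(psi @ P @ psi) / math.sqrt(d) for P in paulis])
prob = x_psi ** 2                       # sums to 1 for a pure target state
prob /= prob.sum()                      # guard against floating-point drift

rng = np.random.default_rng(1)
idx = rng.choice(len(paulis), size=200, p=prob)
est = np.mean([np.real(np.trace(rho @ paulis[k])) / math.sqrt(d) / x_psi[k]
               for k in idx])
print("estimated fidelity ~", round(est, 3))    # true value here: 0.925
```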

  15. 76 FR 58268 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    ... simplify some assumptions and to make estimation methods consistent; and characterization as Agency burden...-1007 to (1) EPA online using http://www.regulations.gov (our preferred method), by e-mail to oppt.ncic...-HQ-OPPT-2010-1007, which is available for online viewing at http://www.regulations.gov , or in person...

  16. A framework for characterizing eHealth literacy demands and barriers.

    PubMed

    Chan, Connie V; Kaufman, David R

    2011-11-17

    Consumer eHealth interventions are of growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies and in communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing the complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task, to illustrate both the detailed analysis and the aggregate measures obtained, and the potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase the accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform the development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.

  17. Designing quantum information processing via structural physical approximation.

    PubMed

    Bae, Joonwoo

    2017-10-01

    In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems by the laws of quantum mechanics is more restrictive than that of classical systems, being identified with a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on the properties of structural physical approximations and their applications in practical quantum information processing.
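
    A concrete miniature of the idea, for the qubit transpose map (a standard textbook example, not drawn from the paper's text): mix the positive-but-not-completely-positive map with the completely depolarizing channel and find the mixing weight at which the Choi matrix becomes positive semidefinite.

```python
import numpy as np

d = 2  # one qubit

def choi(channel):
    # Choi matrix: C = sum_{i,j} |i><j| (x) channel(|i><j|)
    C = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            E = np.zeros((d, d), dtype=complex)
            E[i, j] = 1.0
            C[i * d:(i + 1) * d, j * d:(j + 1) * d] = channel(E)
    return C

def spa(p):
    # Structural physical approximation of the transpose:
    # T_p(rho) = p * rho^T + (1 - p) * Tr(rho) * I/d
    return lambda rho: p * rho.T + (1 - p) * np.trace(rho) * np.eye(d) / d

# The smallest Choi eigenvalue crosses zero at p = 1/(d+1) = 1/3 here:
for p in (0.5, 1 / 3, 0.2):
    lam_min = np.linalg.eigvalsh(choi(spa(p))).min()
    print(f"p = {p:.3f}: min Choi eigenvalue = {lam_min:+.4f}")
```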

  18. Designing quantum information processing via structural physical approximation

    NASA Astrophysics Data System (ADS)

    Bae, Joonwoo

    2017-10-01

    In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. In a fundamental point of view, however, evolution of quantum systems by the laws of quantum mechanics is more restrictive than classical systems, identified to a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps to subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it has been proposed as a method of detecting entangled states, it has stimulated fundamental problems on classifications of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement such as quantum design in quantum information theory. It has developed efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review on quantum information processing with structural physical approximations and the related progress. The review mainly focuses on properties of structural physical approximations and their applications toward practical information applications.

  19. Idaho National Engineering Laboratory code assessment of the Rocky Flats transuranic waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-07-01

    This report is an assessment of the content codes associated with transuranic waste shipped from the Rocky Flats Plant in Golden, Colorado, to INEL. The primary objective of this document is to characterize and describe the transuranic wastes shipped to INEL from Rocky Flats by item description code (IDC). This information will aid INEL in determining if the waste meets the waste acceptance criteria (WAC) of the Waste Isolation Pilot Plant (WIPP). The waste covered by this content code assessment was shipped from Rocky Flats between 1985 and 1989. These years coincide with the dates for information available in the Rocky Flats Solid Waste Information Management System (SWIMS). The majority of waste shipped during this time was certified to the existing WIPP WAC. This waste is referred to as precertified waste. Reassessment of these precertified waste containers is necessary because of changes in the WIPP WAC. To accomplish this assessment, the analytical and process knowledge available on the various IDCs used at Rocky Flats was evaluated. Rocky Flats sources for this information include employee interviews, SWIMS, the Transuranic Waste Certification Program, the Transuranic Waste Inspection Procedure, the Backlog Waste Baseline Books, the WIPP Experimental Waste Characterization Program (headspace analysis), and other related documents, procedures, and programs. Summaries are provided of: (a) certification information, (b) waste description, (c) generation source, (d) recovery method, (e) waste packaging and handling information, (f) container preparation information, (g) assay information, (h) inspection information, (i) analytical data, and (j) RCRA characterization.

  20. Characterization of Interfacial Chemistry of Adhesive/Dentin Bond Using FTIR Chemical Imaging With Univariate and Multivariate Data Processing

    PubMed Central

    Wang, Yong; Yao, Xiaomei; Parthasarathy, Ranganathan

    2008-01-01

    Fourier transform infrared (FTIR) chemical imaging can be used to investigate molecular chemical features of adhesive/dentin interfaces. However, the information is not straightforward and is not easily extracted. The objective of this study was to use multivariate analysis methods, principal component analysis and fuzzy c-means clustering, to analyze spectral data in comparison with univariate analysis. Spectral imaging data collected from both adhesive/healthy-dentin and adhesive/caries-affected-dentin specimens were used and compared. Univariate statistical methods, such as mapping the intensities of a specific functional group, do not always accurately identify functional group locations and concentrations, owing to greater or lesser band overlap in adhesive and dentin. Apart from the ease with which information can be extracted, multivariate methods highlight subtle and often important changes in the spectra that are difficult to observe using univariate methods. The results showed that the multivariate methods gave more satisfactory, interpretable results than univariate methods and were conclusive in showing that they can discriminate and classify differences between healthy dentin and caries-affected dentin within the interfacial regions. It is demonstrated that multivariate FTIR imaging approaches can be used in the rapid characterization of heterogeneous, complex structures. PMID:18980198
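
    A schematic of the multivariate pipeline on synthetic "spectra"; fuzzy c-means is implemented inline since it is not part of scikit-learn, and the data, component count, and cluster count are illustrative assumptions, not the study's.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wn = np.linspace(0, 6, 200)                      # mock wavenumber axis
# 500 pixel spectra from two overlapping "phases" (adhesive vs. dentin stand-ins).
spectra = np.vstack([np.sin(wn) + rng.normal(0, 0.8, (250, 200)),
                     np.cos(wn) + rng.normal(0, 0.8, (250, 200))])

scores = PCA(n_components=5).fit_transform(spectra)   # compress the spectra

def fuzzy_cmeans(Xd, c=2, m=2.0, iters=100):
    U = rng.dirichlet(np.ones(c), size=len(Xd))       # random soft memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ Xd) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(Xd[:, None, :] - centers[None], axis=2) + 1e-12
        U = dist ** (-2 / (m - 1))                    # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return U

U = fuzzy_cmeans(scores)
print("pixels leaning toward cluster 0:", int((U[:, 0] > 0.5).sum()))
```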

  1. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  2. Hierarchical structures of amorphous solids characterized by persistent homology

    PubMed Central

    Hiraoka, Yasuaki; Nakamura, Takenobu; Hirata, Akihiko; Escolar, Emerson G.; Matsue, Kaname; Nishiura, Yasumasa

    2016-01-01

    This article proposes a topological method that extracts hierarchical structures of various amorphous solids. The method is based on the persistence diagram (PD), a mathematical tool for capturing shapes of multiscale data. The input to the PDs is given by an atomic configuration and the output is expressed as 2D histograms. Then, specific distributions such as curves and islands in the PDs identify meaningful shape characteristics of the atomic configuration. Although the method can be applied to a wide variety of disordered systems, it is applied here to silica glass, the Lennard-Jones system, and Cu-Zr metallic glass as standard examples of continuous random network and random packing structures. In silica glass, the method classified the atomic rings as short-range and medium-range orders and unveiled hierarchical ring structures among them. These detailed geometric characterizations clarified a real space origin of the first sharp diffraction peak and also indicated that PDs contain information on elastic response. Even in the Lennard-Jones system and Cu-Zr metallic glass, the hierarchical structures in the atomic configurations were derived in a similar way using PDs, although the glass structures and properties substantially differ from silica glass. These results suggest that the PDs provide a unified method that extracts greater depth of geometric information in amorphous solids than conventional methods. PMID:27298351
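
    A minimal sketch of computing a persistence diagram from an "atomic configuration", assuming the GUDHI topological data analysis library is available (the point cloud here is a noisy ring, not the paper's silica or metallic-glass data):

```python
import numpy as np
import gudhi  # assumed dependency: the GUDHI library

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 100)
# Noisy ring of "atoms": its one long-lived H1 class mimics a ring structure.
points = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (100, 2))

rips = gudhi.RipsComplex(points=points, max_edge_length=2.0)
simplex_tree = rips.create_simplex_tree(max_dimension=2)
diagram = simplex_tree.persistence()

# (birth, death) pairs in dimension 1; pairs far from the diagonal are
# meaningful loops, the kind of feature the PD method classifies.
h1 = [(b, d) for dim, (b, d) in diagram if dim == 1]
print(sorted(h1, key=lambda bd: bd[1] - bd[0], reverse=True)[:3])
```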

  3. A simple and rapid electrophoretic method to characterize simple phenols, lignans, complex phenols, phenolic acids, and flavonoids in extra-virgin olive oil.

    PubMed

    Carrasco-Pancorbo, Alegria; Gómez-Caravaca, Ana Maria; Cerretani, Lorenzo; Bendini, Alessandra; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto

    2006-09-01

    We have devised a simple and rapid capillary electrophoretic method which provides the analyst with a useful tool for the characterization of the polyphenolic fraction of extra-virgin olive oil. This method, which uses a capillary with 50 microm id and a total length of 47 cm (40 cm to the detector), a detection window of 100 x 200 microm, and a buffer solution containing 45 mM of sodium tetraborate at pH 9.3, offers valuable information about all the families of compounds present in the polar fraction of olive oil. Detection was carried out by UV absorption at 200, 240, 280, and 330 nm in order to facilitate the identification of the compounds. Concretely, the method permits the identification of simple phenols, lignans, complex phenols (isomeric forms of secoiridoids), phenolic acids, and flavonoids in the SPE-Diol extracts from extra-virgin olive oil in a short time (less than 10 min) and provides satisfactory resolution. Peak identification was done by comparing both migration time and spectral data obtained from olive oil samples and standards (commercial, or isolated by HPLC-MS), by spiking methanol-water extracts of olive oil with HPLC-collected compounds and commercially available standards at several concentration levels, by studying the information of the electropherograms obtained at several wavelengths, and by using information previously reported.

  4. Automatic registration of ICG images using mutual information and perfusion analysis

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Jong-Mo; Lee, June-goo; Kim, Jong Hyo; Park, Kwangsuk; Yu, Hyeong-Gon; Yu, Young Suk; Chung, Hum

    2005-04-01

    Introduction: Indocyanine green fundus angiography (ICGA) of the eye is a useful method for detecting and characterizing choroidal neovascularization (CNV), which is the major cause of blindness in people over 65 years of age. To investigate the quantitative analysis of blood flow on ICGA, a systematic approach for automatic registration using mutual information and a quantitative analysis was developed. Methods: Intermittent sequential images of indocyanine green angiography were acquired by a Heidelberg retinal angiograph, which uses a laser scanning system for image acquisition. Misalignment of each image, generated by the minute eye movements of the patients, was corrected by the mutual information method, because the distribution of the contrast media in the images changes throughout the time sequence. Several regions of interest (ROIs) were selected by a physician, and the intensities of the selected regions were plotted over the time sequence. Results: The registration of ICGA time-sequential images requires not only a translational transform but also a rotational transform. Signal intensities showed variation based on a gamma-variate function depending on the ROI, and capillary vessels showed more variance of signal intensity than major vessels. CNV showed intermediate variance of signal intensity and a prolonged transit time. Conclusion: The resulting registered images can be used not only for quantitative analysis, but also for perfusion analysis. Various investigative approaches to CNV using this method will be helpful in the characterization of the lesion and in follow-up.
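
    The registration criterion can be sketched in a few lines: mutual information computed from a joint intensity histogram peaks when two frames are aligned. The toy frames below are random images, not ICGA data.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    # Joint histogram -> joint and marginal probabilities -> MI in nats.
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(0)
frame = rng.random((128, 128))
for shift in (0, 2, 8):                 # MI drops as misalignment grows
    moved = np.roll(frame, shift, axis=1)
    print(shift, round(mutual_information(frame, moved), 3))
```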

  5. Texture characterization for joint compression and classification based on human perception in the wavelet domain.

    PubMed

    Fahmy, Gamal; Black, John; Panchanathan, Sethuraman

    2006-06-01

    Today's multimedia applications demand sophisticated compression and classification techniques in order to store, transmit, and retrieve audio-visual information efficiently. Over the last decade, perceptually based image compression methods have been gaining importance. These methods take into account the abilities (and the limitations) of human visual perception (HVP) when performing compression. The upcoming MPEG 7 standard also addresses the need for succinct classification and indexing of visual content for efficient retrieval. However, there has been no research that has attempted to exploit the characteristics of the human visual system to perform both compression and classification jointly. One area of HVP that has unexplored potential for joint compression and classification is spatial frequency perception. Spatial frequency content that is perceived by humans can be characterized in terms of three parameters, which are: 1) magnitude; 2) phase; and 3) orientation. While the magnitude of spatial frequency content has been exploited in several existing image compression techniques, the novel contribution of this paper is its focus on the use of phase coherence for joint compression and classification in the wavelet domain. Specifically, this paper describes a human visual system-based method for measuring the degree to which an image contains coherent (perceptible) phase information, and then exploits that information to provide joint compression and classification. Simulation results that demonstrate the efficiency of this method are presented.

  6. Systematic screening and characterization of flavonoid glycosides in Carthamus tinctorius L. by liquid chromatography/UV diode-array detection/electrospray ionization tandem mass spectrometry.

    PubMed

    Jin, Yu; Xiao, Yuan-sheng; Zhang, Fei-fang; Xue, Xing-ya; Xu, Qing; Liang, Xin-miao

    2008-02-13

    Traditional Chinese medicine (TCM) is a complex system, which typically consists of numerous compounds with significant differences in content and in physical and chemical properties. In this paper, a screening method based on target molecular weights was developed to characterize the flavonoid glycosides in the flower of Carthamus tinctorius L. Screening tables of aglycones and glycans were designed, respectively, in order to allow free selection and combination. The multiple reaction monitoring (MRM) scan mode, with higher sensitivity and selectivity, was adopted in the screening, which benefits the characterization of minor components. Seventy-seven flavonoid glycosides were ultimately screened out, and their structures were characterized by tandem mass spectrometry in both positive and negative ion modes. The glycosylation mode, aglycone, sequence and/or interglycosidic linkages of the glycan portion, and glycosylation position were elucidated from the fragmentation rules in the MS. The numerous compounds screened out with this method showed the structural variety of secondary plant metabolites, and the systematic targeted screening and subsequent structure characterization offered more information about the chemical constituents of TCM.

  7. A rapid method to characterize seabed habitats and associated macro-organisms

    USGS Publications Warehouse

    Anderson, T.J.; Cochrane, G.R.; Roberts, D.A.; Chezar, H.; Hatcher, G.; ,

    2007-01-01

    This study presents a method for rapidly collecting, processing, and interrogating real-time abiotic and biotic seabed data to determine seabed habitat classifications. This is done from data collected over a large area of an acoustically derived seabed map, along multidirectional transects, using a towed small camera-sled. The seabed, within the newly designated Point Harris Marine Reserve on the northern coast of San Miguel Island, California, was acoustically imaged using sidescan sonar then ground-truthed using a towed small camera-sled. Seabed characterizations were made from video observations, and were logged to a laptop computer (PC) in real time. To ground-truth the acoustic mosaic, and to characterize abiotic and biotic aspects of the seabed, a three-tiered characterization scheme was employed that described the substratum type, physical structure (i.e., bedform or vertical relief), and the occurrence of benthic macrofauna and flora. A crucial advantage of the method described here is that preliminary seabed characterizations can be interrogated and mapped over the sidescan mosaic and other seabed information within hours of data collection. This ability to rapidly process seabed data is invaluable to scientists and managers, particularly in modifying concurrent or planning subsequent surveys.

  8. Approach to estimation of level of information security at enterprise based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Stepanov, L. V.; Parinov, A. V.; Korotkikh, L. P.; Koltsov, A. S.

    2018-05-01

    In this article, a way of formalizing the different types of information security threats and the vulnerabilities of an enterprise information system is considered. In view of the complexity of ensuring the information security of any newly organized system, well-founded concepts and decisions in the sphere of information security are expedient; one such approach is the genetic algorithm method. For enterprises in any field of activity, the question of a complex estimation of the level of security of information systems, taking into account the quantitative and qualitative factors characterizing the components of information security, is relevant.
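
    Since the abstract does not spell out the fitness function, the sketch below shows only the generic shape of such a genetic algorithm: evolving a weight vector over security factors to match expert scores. The data, weights, and GA settings are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
factors = rng.random((30, 6))      # 30 audited systems x 6 security factors
true_w = np.array([0.30, 0.10, 0.20, 0.15, 0.15, 0.10])
expert = factors @ true_w + rng.normal(0, 0.02, 30)   # mock expert scores

def fitness(w):
    w = np.abs(w) / np.abs(w).sum()            # normalize to a convex mix
    return -np.mean((factors @ w - expert) ** 2)

pop = rng.random((50, 6))
for _ in range(200):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-25:]]    # truncation selection
    pairs = rng.integers(0, 25, (50, 2))       # random mating pairs
    alpha = rng.random((50, 1))                # blend crossover
    pop = alpha * parents[pairs[:, 0]] + (1 - alpha) * parents[pairs[:, 1]]
    pop += rng.normal(0, 0.05, pop.shape)      # Gaussian mutation

best = pop[np.argmax([fitness(w) for w in pop])]
print(np.round(np.abs(best) / np.abs(best).sum(), 2))   # close to true_w
```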

  9. Microstructural characterization of Ti-6Al-4V alloy subjected to the duplex SMAT/plasma nitriding.

    PubMed

    Pi, Y; Faure, J; Agoda-Tandjawa, G; Andreazza, C; Potiron, S; Levesque, A; Demangel, C; Retraint, D; Benhayoune, H

    2013-09-01

    In this study, microstructural characterization of a Ti-6Al-4V alloy subjected to a duplex surface mechanical attrition treatment (SMAT)/nitriding treatment, which improves its mechanical properties, was carried out through novel and original sample preparation methods. Instead of acid etching, which is of limited use for morphological characterization by scanning electron microscopy (SEM), an original ion polishing method was developed. Moreover, for structural characterization by transmission electron microscopy (TEM), an ion milling method based on the use of two ion guns was also developed for cross-section preparation. To demonstrate the efficiency of the two developed methods, morphological investigations were performed by traditional SEM and field emission gun SEM. This was followed by structural investigations through selected area electron diffraction (SAED) coupled with TEM and X-ray diffraction techniques. The results demonstrated that ion polishing revealed a variation of the microstructure with the surface treatment that could not be observed with acid etching preparation. TEM associated with SAED and X-ray diffraction provided information on the nanostructure and compositional changes induced by the duplex SMAT/nitriding process. Copyright © 2013 Wiley Periodicals, Inc.

  10. Detailed description of oil shale organic and mineralogical heterogeneity via Fourier transform infrared microscopy

    USGS Publications Warehouse

    Washburn, Kathryn E.; Birdwell, Justin E.; Foster, Michael; Gutierrez, Fernando

    2015-01-01

    Mineralogical and geochemical information on reservoir and source rocks is necessary to assess and produce from petroleum systems. The standard methods in the petroleum industry for obtaining these properties are bulk measurements on homogenized, generally crushed, and pulverized rock samples and can take from hours to days to perform. New methods using Fourier transform infrared (FTIR) spectroscopy have been developed to more rapidly obtain information on mineralogy and geochemistry. However, these methods are also typically performed on bulk, homogenized samples. We present a new approach to rock sample characterization incorporating multivariate analysis and FTIR microscopy to provide non-destructive, spatially resolved mineralogy and geochemistry on whole rock samples. We are able to predict bulk mineralogy and organic carbon content within the same margin of error as standard characterization techniques, including X-ray diffraction (XRD) and total organic carbon (TOC) analysis. Validation of the method was performed using two oil shale samples from the Green River Formation in the Piceance Basin with differing sedimentary structures. One sample represents laminated Green River oil shales, and the other is representative of oil shale breccia. The FTIR microscopy results on the oil shales agree with XRD and LECO TOC data from the homogenized samples but also give additional detail regarding sample heterogeneity by providing information on the distribution of mineral phases and organic content. While measurements for this study were performed on oil shales, the method could also be applied to other geological samples, such as other mudrocks, complex carbonates, and soils.
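
    The multivariate calibration step can be sketched with synthetic spectra (a stand-in, not the USGS data set): partial least squares regression mapping FTIR spectra to total organic carbon.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, k = 120, 300                              # samples x wavenumber channels
toc = rng.uniform(2, 25, n)                  # hypothetical wt% TOC values
band = np.exp(-0.5 * ((np.arange(k) - 150) / 10) ** 2)   # one organic band
spectra = np.outer(toc, band) + rng.normal(0, 0.5, (n, k))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, toc, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("held-out R^2:", round(pls.score(X_te, y_te), 3))
```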

  11. Information Transfer in the Brain: Insights from a Unified Approach

    NASA Astrophysics Data System (ADS)

    Marinazzo, Daniele; Wu, Guorong; Pellicoro, Mario; Stramaglia, Sebastiano

    Measuring directed interactions in the brain in terms of information flow is a promising approach, mathematically tractable and able to encompass several methods. In this chapter we propose some approaches rooted in this framework for the analysis of neuroimaging data. First we will explore how the transfer of information depends on the network structure, showing how for hierarchical networks the information flow pattern is characterized by an exponential distribution of the incoming information and a fat-tailed distribution of the outgoing information, as a signature of the law of diminishing marginal returns. This was reported to be true also for effective connectivity networks from human EEG data. Then we address the problem of partial conditioning to a limited subset of variables, chosen as the most informative ones for the driver node. We will then propose a formal expansion of the transfer entropy to highlight irreducible sets of variables which provide information for the future state of each assigned target. Multiplets characterized by a large contribution to the expansion are associated with informational circuits present in the system, with an informational character (synergetic or redundant) which can be associated with the sign of the contribution. Applications are reported for EEG and fMRI data.
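
    For linear-Gaussian signals, the transfer entropy discussed here reduces to a ratio of residual variances from two regressions, which makes a compact sketch possible (the bivariate toy process below is an assumption for illustration):

```python
import numpy as np

# TE(X -> Y) = 0.5 * ln( Var(Y_t | Y_past) / Var(Y_t | Y_past, X_past) )
def residual_var(y, regressors):
    beta, *_ = np.linalg.lstsq(regressors, y, rcond=None)
    return np.var(y - regressors @ beta)

rng = np.random.default_rng(0)
n = 10_000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):                        # X drives Y with a one-step lag
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

Y_t, Y_p, X_p = y[1:], y[:-1, None], x[:-1, None]
te_xy = 0.5 * np.log(residual_var(Y_t, Y_p) /
                     residual_var(Y_t, np.hstack([Y_p, X_p])))
print("TE(X -> Y) ~", round(te_xy, 3))       # clearly positive here
```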

  12. Imprecise Probability Methods for Weapons UQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
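
    One flavor of the idea, reduced to a toy robust-Bayes computation (the data and the prior set are invented; the report's actual analyses concern detonator impact tests): replacing a single informative prior with a set of priors yields an interval of posterior answers rather than a single, possibly overconfident number.

```python
# Interval of posterior means for a success probability, over a set of
# Beta(a, b) priors instead of one informative prior (values illustrative).
successes, trials = 9, 10
prior_set = [(a, b) for a in (0.5, 1, 2, 5) for b in (0.5, 1, 2, 5)]
post_means = [(successes + a) / (trials + a + b) for a, b in prior_set]
print(f"posterior mean lies in [{min(post_means):.3f}, {max(post_means):.3f}]")
```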

  13. Common Effects Methodology National Stakeholder Meeting December 1, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  14. Probabilistic classification method on multi wavelength chromatographic data for photosynthetic pigments identification

    NASA Astrophysics Data System (ADS)

    Prilianti, K. R.; Setiawan, Y.; Indriatmoko; Adhiwibawa, M. A. S.; Limantara, L.; Brotosudarmo, T. H. P.

    2014-02-01

    Environmental and health problems caused by artificial colorants encourage the increasing use of natural colorants nowadays. Natural colorant refers to colorants derived from living organisms or minerals. Extensive research has been done to exploit these colorants, but recent data show that only 0.5% of the wide range of plant pigments on earth has been exhaustively used. Hence, the development of pigment characterization techniques is an important consideration. High-performance liquid chromatography (HPLC) is a widely used technique to separate the pigments in a mixture and identify them. In former HPLC fingerprinting, pigment characterization was based on a single chromatogram at a fixed wavelength (one-dimensional), discarding the information contained at other wavelengths. Therefore, two-dimensional fingerprints have been proposed to use more of the chromatographic information. Unfortunately, this method leads to a data processing problem due to the size of its data matrix. The other common problem in chromatogram analysis is the subjectivity of the researcher in recognizing the chromatogram pattern. In this research, an automated analysis method for multi-wavelength chromatographic data is proposed. Principal component analysis (PCA) was used to compress the data matrix, and maximum likelihood (ML) classification was applied to identify the chromatogram patterns of the pigments present in a mixture. Three photosynthetic pigments were selected to demonstrate the proposed method: β-carotene, fucoxanthin, and zeaxanthin. The results suggest that the method can reliably indicate the presence of these pigments in a particular mixture. A simple computer application was also developed to facilitate real-time analysis. The input of the application is a multi-wavelength chromatographic data matrix, and the output is information about the presence of the three pigments.
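
    The pipeline's shape, PCA compression followed by a Gaussian maximum-likelihood classifier, can be sketched on synthetic chromatograms (the peak position, width, and class labels below are invented, not the paper's data):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 400)                     # mock retention-time axis

def chromatogram(has_pigment):
    peak = 1.5 * np.exp(-0.5 * ((t - 4.0) / 0.2) ** 2) if has_pigment else 0.0
    return peak + rng.normal(0, 0.05, t.size)

labels = rng.integers(0, 2, 200)                # pigment present / absent
data = np.array([chromatogram(bool(y)) for y in labels])

scores = PCA(n_components=3).fit_transform(data)     # compress the matrix
clf = QuadraticDiscriminantAnalysis().fit(scores[:150], labels[:150])
print("held-out accuracy:", clf.score(scores[150:], labels[150:]))
```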

  15. Learning to classify wakes from local sensory information

    NASA Astrophysics Data System (ADS)

    Alsalman, Mohamad; Colvert, Brendan; Kanso, Eva; Kanso Team

    2017-11-01

    Aquatic organisms exhibit remarkable abilities to sense local flow signals contained in their fluid environment and to surmise the origins of these flows. For example, fish can discern the information contained in various flow structures and utilize this information for obstacle avoidance and prey tracking. Flow structures created by flapping and swimming bodies are well characterized in the fluid dynamics literature; however, such characterization relies on classical methods that use an external observer to reconstruct global flow fields. The reconstructed flows, or wakes, are then classified according to the unsteady vortex patterns. Here, we propose a new approach for wake identification: we classify the wakes resulting from a flapping airfoil by applying machine learning algorithms to local flow information. In particular, we simulate the wakes of an oscillating airfoil in an incoming flow, extract the downstream vorticity information, and train a classifier to learn the different flow structures and classify new ones. This data-driven approach provides a promising framework for underwater navigation and detection in application to autonomous bio-inspired vehicles.

  16. The Early Detection of the Emerald Ash Borer (eab) Using Multi-Source Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Hu, B.; Naveed, F.; Tasneem, F.; Xing, C.

    2018-04-01

    The objectives of this study were to exploit the synergy of hyperspectral imagery, Light Detection And Ranging (LiDAR), and high-spatial-resolution data in the early detection of EAB (Emerald Ash Borer) presence in trees within urban areas, and to develop a framework to combine information extracted from multiple data sources. To achieve these, an object-oriented framework was developed to combine information derived from the available data sets to characterize ash trees. Within this framework, an advanced individual tree delineation method was developed that uses high-spatial-resolution WorldView-3 imagery together with LiDAR data. Individual trees were then classified into ash and non-ash trees using spectral and spatial information. In order to characterize the health state of individual ash trees, leaves from ash trees in various health states were sampled and measured using a field spectrometer. Based on the field measurements, the indices most sensitive to leaf chlorophyll content were selected. The developed framework and methods were tested using WorldView-3 and airborne LiDAR data over the Keele campus of York University, Toronto, Canada, with satisfactory results in terms of individual tree crown delineation, ash tree identification, and characterization of the health state of individual ash trees. Quantitative evaluation is being carried out.

  17. Petri net-based method for the analysis of the dynamics of signal propagation in signaling pathways.

    PubMed

    Hardy, Simon; Robillard, Pierre N

    2008-01-15

    Cellular signaling networks are dynamic systems that propagate and process information and, ultimately, cause phenotypical responses. Understanding the circuitry of the information flow in cells is one of the keys to understanding complex cellular processes. The development of computational quantitative models is a promising avenue for attaining this goal. Not only does the analysis of simulation data based on the concentration variations of biological compounds yield information about systemic state changes, but it is also very helpful for obtaining information about the dynamics of signal propagation. This article introduces a new method for analyzing the dynamics of signal propagation in signaling pathways using Petri net theory. The method is demonstrated with the Ca(2+)/calmodulin-dependent protein kinase II (CaMKII) regulation network. The results constitute temporal information about signal propagation in the network, a simplified graphical representation of the network and of the signal propagation dynamics, and a characterization of some signaling routes as regulation motifs.
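
    The token-game semantics underlying such an analysis fits in a short sketch. The three-place, two-transition net below is a made-up miniature, not the CaMKII network from the article.

```python
import numpy as np

# Places P0, P1, P2; transitions T0: P0 -> P1 and T1: P1 -> P2.
pre = np.array([[1, 0],      # tokens each transition consumes, per place
                [0, 1],
                [0, 0]])
post = np.array([[0, 0],     # tokens each transition produces, per place
                 [1, 0],
                 [0, 1]])
marking = np.array([3, 0, 0])            # initial tokens: "signal" at P0

step = 0
while True:
    enabled = [t for t in range(pre.shape[1]) if np.all(marking >= pre[:, t])]
    if not enabled:
        break
    t = enabled[0]                       # fire the first enabled transition
    marking = marking - pre[:, t] + post[:, t]
    step += 1
    print(f"step {step}: fired T{t}, marking = {marking}")
```

    Tracing which transitions fire, and when, is the simplest form of the signal propagation information such a model exposes.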

  18. An improved method for LCD displays colorimetric characterization

    NASA Astrophysics Data System (ADS)

    Li, Tong; Xie, Kai; Wang, Qiaojie; He, Nannan; Ye, Yushan

    2018-03-01

    The colorimetric characterization of a display makes it possible to control the color of the monitor precisely. This paper describes an improvement of the method of Xiao et al. for estimating the gamma value of liquid-crystal displays (LCDs) without using a measurement device. The method relies on the observer's luminance matching, presenting eight half-tone patterns with luminance from 1/9 to 8/9 of the maximum value of each color channel. Since the previous method lacked part of the low-frequency information, we partially replaced the half-tone patterns. A large number of experiments show that the color difference is reduced from 3.726 to 2.835, and that our half-tone patterns can better estimate the visual gamma value of LCDs.
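
    The underlying estimate can be sketched numerically (the simulated observer matches below assume an ideal display with gamma 2.2; real matches would carry observer noise): each matched digital level v_k against a half-tone of relative luminance k/9 gives one constraint (v_k / v_max)^gamma = k/9, and gamma follows by least squares.

```python
import numpy as np

gamma_true = 2.2
fractions = np.arange(1, 9) / 9.0                 # half-tone luminances k/9
v_match = 255 * fractions ** (1 / gamma_true)     # simulated observer matches

# Least-squares slope through the origin in log-log coordinates:
# log(k/9) = gamma * log(v_k / 255).
x = np.log(v_match / 255)
gamma_est = np.sum(x * np.log(fractions)) / np.sum(x ** 2)
print("estimated gamma:", round(gamma_est, 3))    # ~ 2.2
```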

  19. A taxonomy of adolescent health: development of the adolescent health profile-types.

    PubMed

    Riley, A W; Green, B F; Forrest, C B; Starfield, B; Kang, M; Ensminger, M E

    1998-08-01

    The aim of this study was to develop a taxonomy of health profile-types that describe adolescents' patterns of health as self-reported on a health status questionnaire. The intent was to be able to assign individuals to mutually exclusive and exhaustive groups that characterize the important aspects of their health and need for health services. Cluster analytic empirical methods and clinically based conceptual methods were used to identify patterns of health in samples of adolescents from schools and from clinics that serve adolescents with chronic conditions and acute illnesses. Individuals with similar patterns of scores across multiple domains were assigned to the same profile-type. Results from the empirical and conceptually based methods were integrated to produce a practical system for assigning youths to profile-types. Four domains of health (Satisfaction, Discomfort, Risks and Resilience) were used to group individuals into 13 distinct profile-types. The profile-types were characterized primarily by the number of domains in which health is poor, identifying the unique combinations of problems that characterize different subgroups of adolescents. This method of reporting the information available on health status surveys is potentially a more informative way of identifying and classifying the health needs of subgroups in the population than is available from global scores or multiple scale scores. The reliability and validity of this taxonomy of health profile-types for the purposes of planning and evaluating health services must be demonstrated. That is the purpose of the accompanying study.

  20. Subsurface and Surface Characterization using an Information Framework Model

    NASA Astrophysics Data System (ADS)

    Samuel-Ojo, Olusola

    Groundwater plays a critical dual role as a reservoir of fresh water for human consumption and as a cause of the most severe problems when dealing with construction works below the water table. This is why it is critical to monitor groundwater recharge, distribution, and discharge on a continuous basis. The conventional method of monitoring groundwater employs a network of sparsely distributed monitoring wells and it is laborious, expensive, and intrusive. The problem of sparse data and undersampling reduces the accuracy of sampled survey data giving rise to poor interpretation. This dissertation addresses this problem by investigating groundwater-deformation response in order to augment the conventional method. A blend of three research methods was employed, namely design science research, geological methods, and geophysical methods, to examine whether persistent scatterer interferometry, a remote sensing technique, might augment conventional groundwater monitoring. Observation data (including phase information for displacement deformation from permanent scatterer interferometric synthetic aperture radar and depth to groundwater data) was obtained from the Water District, Santa Clara Valley, California. An information framework model was built and applied, and then evaluated. Data was preprocessed and decomposed into five components or parts: trend, seasonality, low frequency, high frequency and octave bandwidth. Digital elevation models of observed and predicted hydraulic head were produced, illustrating the piezometric or potentiometric surface. The potentiometric surface characterizes the regional aquifer of the valley showing areal variation of rate of percolation, velocity and permeability, and completely defines flow direction, advising characteristics and design levels. The findings show a geologic forcing phenomenon which explains in part the long-term deformation behavior of the valley, characterized by poroelastic, viscoelastic, elastoplastic and inelastic deformations under the influence of an underlying geologic southward plate motion within the theory of plate tectonics. It also explains the impact of a history of heavy pumpage of groundwater during the agricultural and urbanization era. Thus the persistent scatterer interferometry method offers an attractive, non-intrusive, cost-effective augmentation of the conventional method of monitoring groundwater for water resource development and stability of soil mass.

  1. Characterization of winemaking yeast by cell number-size distribution analysis through flow field-flow fractionation with multi-wavelength turbidimetric detection.

    PubMed

    Zattoni, Andrea; Melucci, Dora; Reschiglian, Pierluigi; Sanz, Ramsés; Puignou, Lluís; Galceran, Maria Teresa

    2004-10-29

    Yeasts are widely used in several areas of the food industry, e.g. baking, beer brewing, and wine production. Interest in new analytical methods for the quality control and characterization of yeast cells is thus increasing. The biophysical properties of yeast cells, among which is cell size, are related to the cells' capability to produce primary and secondary metabolites during the fermentation process. The biophysical properties of winemaking yeast strains can be screened by field-flow fractionation (FFF). In this work we present the use of flow FFF (FlFFF) with turbidimetric multi-wavelength detection for the number-size distribution analysis of different commercial winemaking yeast varieties. The use of a diode-array detector allows the recently developed method for number-size (or mass-size) analysis in flow-assisted separation techniques to be applied to dispersed samples such as yeast cells. Results for six commercial winemaking yeast strains are compared with data obtained by a standard method for cell sizing (Coulter counter). The method proposed here gives, in a short analysis time, accurate information on the number of cells of a given size and on the total number of cells.

  2. Relevant Scatterers Characterization in SAR Images

    NASA Astrophysics Data System (ADS)

    Chaabouni, Houda; Datcu, Mihai

    2006-11-01

    Recognizing scenes in single-look, meter-resolution Synthetic Aperture Radar (SAR) images requires the capability to identify relevant signal signatures under conditions of variable image acquisition geometry and arbitrary object poses and configurations. Among the methods to detect relevant scatterers in SAR images, we can mention internal coherence. The SAR spectrum split in azimuth generates a series of images which preserve high coherence only for particular object scattering. The detection of relevant scatterers can be done by correlation study or Independent Component Analysis (ICA) methods. The present article reviews the state of the art in SAR internal correlation analysis and proposes further extensions using elements of inference based on information theory applied to complex-valued signals. The set of azimuth-look images is analyzed using mutual information measures, and an equivalent channel capacity is derived. The localization of the "target" requires analysis in a small image window, thus resulting in imprecise estimation of the second-order statistics of the signal. For better precision, a Hausdorff measure is introduced. The method is applied to detect and characterize relevant objects in urban areas.

  3. Dynamics in Complex Coacervates

    NASA Astrophysics Data System (ADS)

    Perry, Sarah

    Understanding the dynamics of a material provides detailed information about the self-assembly, structure, and intermolecular interactions present in that material. While rheological methods have long been used for the characterization of complex coacervate-based materials, it remains a challenge to predict the dynamics of a new system of materials. Furthermore, most work reports only qualitative trends as to how parameters such as charge stoichiometry, ionic strength, and polymer chain length impact self-assembly and material dynamics, and there is little information on the effects of polymer architecture or the organization of charges within a polymer. We seek to link thermodynamic studies of coacervation phase behavior with material dynamics through a carefully controlled, systematic study of coacervate linear viscoelasticity for different polymer chemistries. We couple various methods of characterizing the dynamics of polymer-based complex coacervates, including the time-salt superposition methods developed first by Spruijt and coworkers, to establish a more mechanistic strategy for comparing the material dynamics and linear viscoelasticity of different systems. Acknowledgment is made to the Donors of the American Chemical Society Petroleum Research Fund for support of this research.

  4. On the magnetic polarizability tensor of US coinage

    NASA Astrophysics Data System (ADS)

    Davidson, John L.; Abdel-Rehim, Omar A.; Hu, Peipei; Marsh, Liam A.; O'Toole, Michael D.; Peyton, Anthony J.

    2018-03-01

    The magnetic dipole polarizability tensor of a metallic object gives unique information about the size, shape and electromagnetic properties of the object. In this paper, we present a novel method of coin characterization based on the spectroscopic response of the absolute tensor. The experimental measurements are validated using a combination of tests with a small set of bespoke coin surrogates and simulated data. The method is applied to an uncirculated set of US coins. Measured and simulated spectroscopic tensor responses of the coins show significant differences between different coin denominations. The presented results are encouraging as they strongly demonstrate the ability to characterize coins using an absolute tensor approach.

  5. Rapid Characterization of Microorganisms by Mass Spectrometry—What Can Be Learned and How?

    NASA Astrophysics Data System (ADS)

    Fenselau, Catherine C.

    2013-08-01

    Strategies for the rapid and reliable analysis of microorganisms have been sought to meet national needs in defense, homeland security, space exploration, food and water safety, and clinical diagnosis. Mass spectrometry has long been a candidate technique because it is extremely rapid and can provide highly specific information. It has excellent sensitivity. Molecular and fragment ion masses provide detailed fingerprints, which can also be interpreted. Mass spectrometry is also a broad band method—everything has a mass—and it is automatable. Mass spectrometry is a physiochemical method that is orthogonal and complementary to biochemical and morphological methods used to characterize microorganisms.

  6. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms, which prevents the evaluation of fast phenomena. In this work, a new RTN measurement procedure, which improves the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows obtaining more detailed and precise information about the RTN phenomenon.
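
    The Weighted Time Lag method cited above builds a weighted two-dimensional map of consecutive samples (x_n, x_{n+1}); discrete RTN levels show up as lobes on the main diagonal. A minimal sketch of that construction follows (grid size and kernel width are illustrative assumptions, not the authors' code):

    ```python
    import numpy as np

    def weighted_time_lag(x, bins=128, alpha=None):
        """Weighted Time Lag map: for each grid point (i, j), accumulate
        Gaussian weights centred on consecutive-sample pairs (x[n], x[n+1]).
        RTN current levels appear as lobes on the diagonal."""
        lo, hi = x.min(), x.max()
        grid = np.linspace(lo, hi, bins)
        if alpha is None:
            alpha = 0.5 * (hi - lo) / bins        # kernel width heuristic
        xi, xj = x[:-1], x[1:]
        psi = np.zeros((bins, bins))
        for n in range(len(xi)):
            wi = np.exp(-(grid - xi[n]) ** 2 / (2 * alpha ** 2))
            wj = np.exp(-(grid - xj[n]) ** 2 / (2 * alpha ** 2))
            psi += np.outer(wj, wi)               # row: x[n+1], col: x[n]
        return grid, psi / psi.sum()              # normalized map
    ```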

  7. Topograph for inspection of engine cylinder walls.

    PubMed

    Franz, S; Leonhardt, K; Windecker, R; Tiziani, H J

    1999-12-20

    The microstructural inspection of engine cylinder walls is an important task for quality management in the automotive industry. Until recently, mainly tactile methods were used for this purpose. We present an optical instrument based on microscopic fringe projection that permits fast, reliable, and nondestructive measurements of microstructure. The field of view is 0.8 mm x 1.2 mm, with a spatial sampling of 1100 x 700 pixels. In contrast to conventional tactile sensors, the optical method provides fast in situ three-dimensional surface characterizations that provide more information about the surface than do line profiles. Measurements are presented, and advantages of this instrument for characterization of a surface are discussed.

  8. The development of internationally managed information systems and their prospects.

    PubMed

    East, H

    1978-12-01

    This paper reviews a selection of international collaborative efforts in the production of information services and attempts to characterize modes of cooperation. Information systems specifically discussed include: the International Nuclear Information System (INIS); Nuclear Science Abstracts (NSA); EURATOM; AGRIS; AGRINDEX; Information Retrieval Limited (IRL); the International Food Information Service (IFIS); Chemical Abstracts Service (CAS); MEDLARS; and TITUS. Three methods of international information transfer are discussed: commercial transactions; negotiated (bilateral) barter arrangements; and contribution to internationally managed systems. Technical, economic, and professional objectives support the rationale for international cooperation. It is argued that economic and political considerations, as much as improved technology or information transfer, will determine the nature of collaboration in the future.

  9. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  10. Common Effects Methodology National Stakeholder Meeting December 1, 2010 White Papers

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  11. Common Effects Methodology Regional Stakeholder Meeting January 11 -22, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  12. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field-flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by ICP-MS, but also by coulometry, are on their way to gaining a foothold. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential for managing such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.
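
    As one concrete example of the single-particle ICP-MS approach mentioned above, each particle event produces an ion pulse whose integrated intensity is proportional to the particle's element mass; assuming a spherical particle, that mass converts to a spherical-equivalent diameter. The sketch below is a simplified, hypothetical calculation; the calibration factor bundles sensitivity and transport efficiency, which in practice are determined experimentally:

    ```python
    import numpy as np

    def pulse_to_diameter(pulse_counts, k_counts_per_fg, mass_fraction, rho_g_cm3):
        """Spherical-equivalent diameter (nm) from one spICP-MS pulse.

        pulse_counts     -- integrated counts of a single particle event
        k_counts_per_fg  -- detection sensitivity, counts per femtogram of the
                            measured element (assumed known from dissolved
                            standards and transport-efficiency calibration)
        mass_fraction    -- element mass fraction in the particle (1.0 for Au)
        rho_g_cm3        -- particle density
        """
        m_fg = pulse_counts / k_counts_per_fg / mass_fraction  # particle mass, fg
        m_g = m_fg * 1e-15
        d_cm = (6.0 * m_g / (np.pi * rho_g_cm3)) ** (1.0 / 3.0)
        return d_cm * 1e7                                      # cm -> nm
    ```

    Histogramming the diameters of many such events then gives the number-based size distribution that makes this technique attractive for complex samples.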

  13. How Analysis Informs Regulation: Success and Failure of ...

    EPA Pesticide Factsheets

    How Analysis Informs Regulation: Success and Failure of Evolving Approaches to Polyfluoroalkyl Acid Contamination. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  14. Multiscale tomography of buried magnetic structures: its use in the localization and characterization of archaeological structures

    NASA Astrophysics Data System (ADS)

    Saracco, Ginette; Moreau, Frédérique; Mathé, Pierre-Etienne; Hermitte, Daniel; Michel, Jean-Marie

    2007-10-01

    We have previously developed a method for characterizing and localizing `homogeneous' buried sources from measurements of potential anomalies (magnetic, electric, and gravity) at a fixed height above the ground. This method is based on potential theory and uses the properties of the Poisson kernel (real by definition) and continuous wavelet theory. Here, we relax the assumption on the sources and introduce a method that we call `multiscale tomography'. Our approach is based on the harmonic extension of the observed magnetic field to produce a complex source, by use of a complex Poisson kernel that solves the Laplace equation for a complex potential field. A phase and a modulus are defined. We show that the phase provides additional information on the total magnetic inclination and the structure of the sources, while the modulus allows us to characterize their spatial location, depth, and `effective degree'. This method is compared to the `complex dipolar tomography', an extension of the Patella method that we developed previously. We applied both methods, together with a classical electrical resistivity tomography, to detect and localize buried archaeological structures such as antique ovens from magnetic measurements at the Fox-Amphoux site (France). The estimates are then compared with the results of excavations.

  15. Automatic identification of resting state networks: an extended version of multiple template-matching

    NASA Astrophysics Data System (ADS)

    Guaje, Javier; Molina, Juan; Rudas, Jorge; Demertzi, Athena; Heine, Lizette; Tshibanda, Luaba; Soddu, Andrea; Laureys, Steven; Gómez, Francisco

    2015-12-01

    Functional magnetic resonance imaging in resting state (fMRI-RS) constitutes an informative protocol to investigate several pathological and pharmacological conditions. A common approach to studying this data source is the analysis of changes in the so-called resting state networks (RSNs). These networks correspond to well-defined functional entities that have been associated with different low- and high-order brain functions. RSNs may be characterized by using Independent Component Analysis (ICA). ICA provides a decomposition of the fMRI-RS signal into sources of brain activity, but it lacks information about the nature of the signal, i.e., whether a source is artifactual or not. Recently, a multiple template-matching (MTM) approach was proposed to automatically recognize RSNs in a set of Independent Components (ICs). This method provides valuable information for assessing subjects at the individual level. Nevertheless, it lacks a mechanism to quantify how much certainty there is about the existence or absence of each network. This information may be important for the assessment of patients with severely damaged brains, in whom RSNs may be greatly affected as a result of the pathological condition. In this work we propose a set of changes to the original MTM that improves the RSN recognition task and also extends the functionality of the method. The key points of this improvement are a standardization strategy and a modification of the method's constraints that adds flexibility to the approach. Additionally, we introduce an analysis of the trustworthiness measurement of each RSN obtained with the template-matching approach. This analysis consists of a thresholding strategy applied over the computed Goodness-of-Fit (GOF) between the set of templates and the ICs. The proposed method was validated on two independent studies (Baltimore, 23 healthy subjects; Liege, 27 healthy subjects) with different configurations of MTM. Results suggest that the method will provide complementary information for the characterization of RSNs at the individual level.
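
    A common GOF definition in this template-matching literature is the mean absolute component value inside the template mask minus the mean absolute value outside it; thresholding that score then flags networks whose identification cannot be trusted. The sketch below illustrates that scheme under those assumptions; the function names and the threshold are hypothetical, not the authors' code:

    ```python
    import numpy as np

    def goodness_of_fit(ic_map, template_mask):
        """GOF between an IC spatial map and a binary RSN template:
        mean |value| inside the template minus mean |value| outside."""
        inside = np.abs(ic_map[template_mask])
        outside = np.abs(ic_map[~template_mask])
        return inside.mean() - outside.mean()

    def label_components(ics, templates, threshold):
        """Assign each RSN template to the IC with the highest GOF and
        report whether that best match exceeds the trust threshold."""
        labels = {}
        for name, mask in templates.items():
            gofs = [goodness_of_fit(ic, mask) for ic in ics]
            best = int(np.argmax(gofs))
            labels[name] = (best, gofs[best], gofs[best] >= threshold)
        return labels
    ```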

  16. Time-resolved SERS for characterizing extracellular vesicles

    NASA Astrophysics Data System (ADS)

    Rojalin, Tatu; Saari, Heikki; Somersalo, Petter; Laitinen, Saara; Turunen, Mikko; Viitala, Tapani; Wachsmann-Hogiu, Sebastian; Smith, Zachary J.; Yliperttula, Marjo

    2017-02-01

    The aim of this work is to develop a platform for characterizing extracellular vesicles (EVs) using gold-polymer nanopillar SERS arrays, while circumventing the photoluminescence-related disadvantages of Raman with a time-resolved approach. EVs are rich in biochemical information, reporting, for example, on the diseased state of the biological system. Currently, straightforward, label-free, and fast EV characterization methods with low sample consumption are needed. In this study, SERS spectra of red blood cell and platelet derived EVs were successfully measured and their biochemical contents analyzed using multivariate data analysis techniques. The developed platform could be conveniently used for EV analytics in general.

  17. Harnessing glycomics technologies: integrating structure with function for glycan characterization

    PubMed Central

    Robinson, Luke N.; Artpradit, Charlermchai; Raman, Rahul; Shriver, Zachary H.; Ruchirawat, Mathuros; Sasisekharan, Ram

    2013-01-01

    Glycans, or complex carbohydrates, are a ubiquitous class of biological molecules which impinge on a variety of physiological processes ranging from signal transduction to tissue development and microbial pathogenesis. In comparison to DNA and proteins, glycans present unique challenges to the study of their structure and function owing to their complex and heterogeneous structures and the dominant role played by multivalency in their sequence-specific biological interactions. Arising from these challenges, there is a need to integrate information from multiple complementary methods to decode structure-function relationships. Focusing on acidic glycans, we describe here key glycomics technologies for characterizing their structural attributes, including linkage, modifications, and topology, as well as for elucidating their role in biological processes. Two case studies, one involving sialylated branched glycans and the other sulfated glycosaminoglycans, are used to highlight how integration of orthogonal information from diverse datasets enables rapid convergence of glycan characterization for development of robust structure-function relationships. PMID:22522536

  18. Gas-water two-phase flow characterization with Electrical Resistance Tomography and Multivariate Multiscale Entropy analysis.

    PubMed

    Tan, Chao; Zhao, Jia; Dong, Feng

    2015-03-01

    Flow behavior characterization is important for understanding gas-liquid two-phase flow mechanics and for establishing its description model. Electrical Resistance Tomography (ERT) provides information regarding flow conditions in the different directions where the sensing electrodes are implemented. We extracted the multivariate sample entropy (MSampEn) by treating ERT data as a multivariate time series. The dynamic experimental results indicate that MSampEn is sensitive to complexity changes among flow patterns including bubbly flow, stratified flow, plug flow, and slug flow. MSampEn can characterize the flow behavior in different directions of two-phase flow and reveal the transition between flow patterns when the flow velocity changes. The proposed method is effective for analyzing two-phase flow pattern transitions by incorporating information from different scales and different spatial directions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
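
    Sample entropy is the building block behind MSampEn; a univariate sketch is given below (MSampEn extends the same template-match counting to composite multichannel embeddings, and the m and r choices here are illustrative assumptions):

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.15):
        """Univariate sample entropy SampEn(m, r) with Chebyshev distance.
        Lower values indicate a more regular (less complex) signal."""
        x = np.asarray(x, float)
        r = r_frac * x.std()                     # tolerance as fraction of SD

        def match_count(mm):
            # all length-mm templates, then count pairs within tolerance r
            emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
            c = 0
            for i in range(len(emb) - 1):
                d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
                c += np.sum(d <= r)
            return c

        b, a = match_count(m), match_count(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf
    ```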

  19. Characterizing Surfaces of the Wide Bandgap Semiconductor Ilmenite with Scanning Probe Microscopies

    NASA Technical Reports Server (NTRS)

    Wilkins, R.; Powell, Kirk St. A.

    1997-01-01

    Ilmenite (FeTiO3) is a wide bandgap semiconductor with an energy gap of about 2.5 eV. Initial radiation studies indicate that ilmenite has properties suited for radiation-tolerant applications, as well as a variety of other electronic applications. Two scanning probe microscopy methods have been used to characterize the surface of samples taken from Czochralski grown single crystals. The two methods, atomic force microscopy (AFM) and scanning tunneling microscopy (STM), are based on different physical principles and therefore provide different information about the samples. AFM provides a direct, three-dimensional image of the surface of the samples, while STM gives a convolution of the topographic and electronic properties of the surface. We will discuss the differences between the methods and present preliminary data from each method for ilmenite samples.

  20. Applications of multi-frequency single beam sonar fisheries analysis methods for seep quantification and characterization

    NASA Astrophysics Data System (ADS)

    Price, V.; Weber, T.; Jerram, K.; Doucet, M.

    2016-12-01

    The analysis of multi-frequency, narrow-band single-beam acoustic data for fisheries applications has long been established, with methodology focusing on characterizing targets in the water column by utilizing complex algorithms and false-color time series data to create and compare frequency response curves for dissimilar biological groups. These methods were built on concepts developed for multi-frequency analysis of satellite imagery for terrestrial analysis and have been applied to a broad range of data types and applications. Single-beam systems operating at multiple frequencies are also used for the detection and identification of seeps in water column data. Here we apply the same analysis and visualization techniques used for fisheries applications to characterize and quantify seeps, by creating and comparing frequency response curves and applying false coloration to shallow and deep multi-channel seep data. From this information, we can establish methods to differentiate bubble sizes in the echogram and distinguish seep composition. These techniques are also useful in differentiating plume content from biological noise (volume reverberation) created by euphausiid layers and fish with gas-filled swim bladders. Combining the multiple frequencies using false coloring and other image analysis techniques, after applying established normalization and beam-pattern correction algorithms, is a novel approach to quantitatively describing seeps. Further, this information could be paired with geological models, backscatter, and bathymetry data to assess seep distribution.

  1. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes.

    PubMed

    Cafaro, Carlo; Alsing, Paul M

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.
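
    For reference, the Fisher information referred to above can be written for a parametrized density p(x|θ) and, equivalently, for its probability amplitude ψ = √p, the form relevant when studying squared probability amplitudes; the paper's three cases then correspond to constant, exponentially decaying, and power-law F(θ):

    ```latex
    F(\theta)
      = \int p(x \mid \theta)
        \left[\frac{\partial \ln p(x \mid \theta)}{\partial \theta}\right]^{2} dx
      = 4 \int
        \left[\frac{\partial \psi(x \mid \theta)}{\partial \theta}\right]^{2} dx,
    \qquad \psi(x \mid \theta) = \sqrt{p(x \mid \theta)} .
    ```

    The second equality follows by substituting p = ψ² into the first integral, which is the standard amplitude form of the Fisher information.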

  2. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes

    NASA Astrophysics Data System (ADS)

    Cafaro, Carlo; Alsing, Paul M.

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.

  3. Preparation for Scaling Studies of Ice-Crystal Icing at the NRC Research Altitude Test Facility

    NASA Technical Reports Server (NTRS)

    Struk, Peter M.; Bencic, Timothy J.; Tsao, Jen-Ching; Fuleki, Dan; Knezevici, Daniel C.

    2013-01-01

    This paper describes experiments conducted at the National Research Council (NRC) of Canada's Research Altitude Test Facility between March 26 and April 11, 2012. The tests, conducted collaboratively between NASA and NRC, focus on three key aspects in preparation for later scaling work to be conducted with a NACA 0012 airfoil model in the NRC Cascade rig: (1) cloud characterization, (2) scaling model development, and (3) ice-shape profile measurements. Regarding cloud characterization, the experiments focus on particle spectra measurements using two shadowgraphy methods, cloud uniformity via particle scattering from a laser sheet, and characterization of the SEA Multi-Element probe. Overviews of each aspect as well as detailed information on the diagnostic methods are presented. Select results from the measurements, and their interpretation, are presented; these will help guide future work.

  4. Application toward Confocal Full-Field Microscopic X-ray Absorption Near Edge Structure Spectroscopy.

    PubMed

    Tack, Pieter; Vekemans, Bart; Laforce, Brecht; Rudloff-Grund, Jennifer; Hernández, Willinton Y; Garrevoet, Jan; Falkenberg, Gerald; Brenker, Frank; Van Der Voort, Pascal; Vincze, Laszlo

    2017-02-07

    Using X-ray absorption near edge structure (XANES) spectroscopy, information on the local chemical structure and oxidation state of an element of interest can be acquired. Conventionally, this information is obtained in a spatially resolved manner by scanning a sample through a focused X-ray beam. Recently, full-field methods have been developed to obtain direct 2D chemical-state information by imaging a large sample area. These methods usually operate in transmission mode, restricting their use to thin, transmitting samples. Here, a fluorescence-mode method is demonstrated using an energy-dispersive pnCCD detector, the SLcam, with measurement times far superior to those generally attainable. Additionally, this method operates in confocal mode, providing direct 3D spatially resolved chemical-state information from a selected subvolume of a sample without the need to rotate the sample. The method is applied to two samples: a gold-supported magnesia catalyst (Au/MgO) and a natural diamond containing Fe-rich inclusions. Both samples provide XANES spectra that can be overlapped with reference XANES spectra, allowing this method to be used for fingerprinting and linear combination analysis of known XANES reference compounds.
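
    The linear combination analysis mentioned at the end is typically a non-negative least-squares fit of a measured, normalized spectrum against reference spectra on a common energy grid. A minimal sketch under that assumption (function and variable names are illustrative):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def lcf(spectrum, references):
        """Linear-combination fit of a normalized XANES spectrum against
        reference spectra (given as rows), with non-negative weights.
        Returns the weight fractions and the fit residual norm."""
        A = np.asarray(references, float).T       # shape: (n_energy, n_refs)
        w, resid = nnls(A, np.asarray(spectrum, float))
        total = w.sum()
        if total == 0:                            # degenerate fit guard
            return w, resid
        return w / total, resid                   # fractions sum to 1
    ```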

  5. Determination of Ten Perfluorinated Compounds in Bluegill Sunfish (Lepomis macrochirus) Fillets

    EPA Science Inventory

    Limited information is available about the environmental distributions of perfluorinated compounds (PFCs), such as perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA), in part due to a lack of well-characterized analytical methods that can be used to accurately mea...

  6. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
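
    As a toy illustration of the PEA idea summarized above, a Monte Carlo sketch can propagate variability in concentration, intake rate, and body weight through to an average daily dose distribution; all distribution parameters below are hypothetical placeholders, not values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    # Illustrative (hypothetical) input distributions:
    conc   = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=n)  # mg/kg food
    intake = rng.lognormal(mean=np.log(0.2), sigma=0.5, size=n)  # kg/day
    bw     = rng.normal(70.0, 12.0, size=n).clip(min=30.0)       # kg body wt

    add = conc * intake / bw            # average daily dose, mg/kg-day
    p50, p95, p99 = np.percentile(add, [50, 95, 99])
    print(f"ADD median {p50:.3g}, 95th {p95:.3g}, 99th {p99:.3g} mg/kg-day")
    ```

    Reporting upper percentiles rather than a single point estimate is what lets the analysis distinguish typical from highly exposed individuals.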

  7. An Introduction to Using Surface Geophysics to Characterize Sand and Gravel Deposits

    USGS Publications Warehouse

    Lucius, Jeffrey E.; Langer, William H.; Ellefsen, Karl J.

    2006-01-01

    This report is an introduction to surface geophysical techniques that aggregate producers can use to characterize known deposits of sand and gravel. Five well-established and well-tested geophysical methods are presented: seismic refraction and reflection, resistivity, ground penetrating radar, time-domain electromagnetism, and frequency-domain electromagnetism. Depending on site conditions and the selected method(s), geophysical surveys can provide information concerning areal extent and thickness of the deposit, thickness of overburden, depth to the water table, critical geologic contacts, and location and correlation of geologic features. In addition, geophysical surveys can be conducted prior to intensive drilling to help locate auger or drill holes, reduce the number of drill holes required, calculate stripping ratios to help manage mining costs, and provide continuity between sampling sites to upgrade the confidence of reserve calculations from probable reserves to proved reserves. Perhaps the greatest value of geophysics to aggregate producers may be the speed of data acquisition, reduced overall costs, and improved subsurface characterization.

  8. An Introduction to Using Surface Geophysics to Characterize Sand and Gravel Deposits

    USGS Publications Warehouse

    Lucius, Jeffrey E.; Langer, William H.; Ellefsen, Karl J.

    2007-01-01

    This report is an introduction to surface geophysical techniques that aggregate producers can use to characterize known deposits of sand and gravel. Five well-established and well-tested geophysical methods are presented: seismic refraction and reflection, resistivity, ground penetrating radar, time-domain electromagnetism, and frequency-domain electromagnetism. Depending on site conditions and the selected method(s), geophysical surveys can provide information concerning areal extent and thickness of the deposit, thickness of overburden, depth to the water table, critical geologic contacts, and location and correlation of geologic features. In addition, geophysical surveys can be conducted prior to intensive drilling to help locate auger or drill holes, reduce the number of drill holes required, calculate stripping ratios to help manage mining costs, and provide continuity between sampling sites to upgrade the confidence of reserve calculations from probable reserves to proved reserves. Perhaps the greatest value of geophysics to aggregate producers may be the speed of data acquisition, reduced overall costs, and improved subsurface characterization.

  9. Method of and Apparatus for Histological Human Tissue Characterization Using Ultrasound

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H. (Inventor); Taler, George A. (Inventor)

    1999-01-01

    A method and apparatus for determining important histological characteristics of tissue, including a determination of the tissue's health, are described. Electrical pulses are converted into meaningful numerical representations through the use of Fourier Transforms. These numerical representations are then used to determine important histological characteristics of tissue. This novel invention does not require rectification and thus provides for detailed information from the ultrasonic scan.

  10. Method of and Apparatus for Histological Human Tissue Characterization Using Ultrasound

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H. (Inventor); Taler, George A. (Inventor)

    1998-01-01

    A method and apparatus for determining important histological characteristics of tissue, including a determination of the tissue's health, is discussed. Electrical pulses are converted into meaningful numerical representations through the use of Fourier Transforms. These numerical representations are then used to determine important histological characteristics of tissue. This novel invention does not require rectification and thus provides for detailed information from the ultrasonic scan.

  11. RH-TRU Waste Inventory Characterization by AK and Proposed WIPP RH-TRU Waste Characterization Objectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Most, W. A.; Kehrman, R.; Gist, C.

    2002-02-26

    The U.S. Department of Energy (DOE)-Carlsbad Field Office (CBFO) has developed draft documentation to present the proposed Waste Isolation Pilot Plant (WIPP) remote-handled (RH-) transuranic (TRU) waste characterization program to its regulators, the U.S. Environmental Protection Agency and the New Mexico Environment Department. Compliance with Title 40, Code of Federal Regulations, Parts 191 and 194; the WIPP Land Withdrawal Act (PL 102-579); and the WIPP Hazardous Waste Facility Permit, as well as the Certificates of Compliance for the 72-B and 10-160B Casks, requires that specific waste parameter limits be imposed on DOE sites disposing of TRU waste at WIPP. The DOE-CBFO must control the sites' compliance with the limits by specifying allowable characterization methods. As with the established WIPP contact-handled TRU waste characterization program, the DOE-CBFO has proposed a Remote-Handled TRU Waste Acceptance Criteria (RH-WAC) document consolidating the requirements from various regulatory drivers and the proposed allowable characterization methods. These criteria are consistent with a recent National Academy of Sciences/National Research Council recommendation to develop an RH-TRU waste characterization approach that removes current self-imposed requirements lacking a legal or safety basis. As proposed in the draft RH-WAC and other preliminary documents, the DOE-CBFO RH-TRU waste characterization program proposes the use of acceptable knowledge (AK) as the primary method for obtaining required characterization information. The use of AK involves applying knowledge of the waste in light of the materials or processes used to generate it. Documentation, records, or processes providing information about various attributes of a waste stream, such as chemical, physical, and radiological properties, may be used as AK and may be applied to individual waste containers either independently or in conjunction with radiography, visual examination, assay, and other sampling and analytical data. RH-TRU waste cannot be shipped to WIPP on the basis of AK alone if documentation demonstrating that all of the prescribed limits in the RH-WAC are met is not available; if discrepancies exist among AK source documents describing the same waste stream and the most conservative assumptions regarding those documents indicate that a limit will not be met; or if all required data are not available for a given waste stream.

  12. A cross-site comparison of methods used for hydrogeologic characterization of the Galena-Platteville aquifer in Illinois and Wisconsin, with examples from selected Superfund sites

    USGS Publications Warehouse

    Kay, Robert T.; Mills, Patrick C.; Dunning, Charles P.; Yeskis, Douglas J.; Ursic, James R.; Vendl, Mark

    2004-01-01

    The effectiveness of 28 methods used to characterize the fractured Galena-Platteville aquifer at eight sites in northern Illinois and Wisconsin is evaluated. Analysis of government databases, previous investigations, topographic maps, aerial photographs, and outcrops was essential to understanding the hydrogeology in the area to be investigated. The effectiveness of surface-geophysical methods depended on site geology. Lithologic logging provided essential information for site characterization. Cores were used for stratigraphy and geotechnical analysis. Natural-gamma logging helped identify the effect of lithology on the location of secondary-permeability features. Caliper logging identified large secondary-permeability features. Neutron logs identified trends in matrix porosity. Acoustic-televiewer logs identified numerous secondary-permeability features and their orientation. Borehole-camera logs also identified a number of secondary-permeability features. Borehole ground-penetrating radar identified lithologic and secondary-permeability features; however, the accuracy and completeness of this method are uncertain. Single-point-resistance, density, and normal resistivity logs were of limited use. Water-level and water-quality data identified flow directions and indicated the horizontal and vertical distribution of aquifer permeability and the depth of the permeable features. Temperature, spontaneous potential, and fluid-resistivity logging identified few secondary-permeability features at some sites and several features at others. Flowmeter logging was the most effective geophysical method for characterizing secondary-permeability features. Aquifer tests provided insight into the permeability distribution, identified hydraulically interconnected features and the presence of heterogeneity and anisotropy, and determined effective porosity. Aquifer heterogeneity prevented calculation of accurate hydraulic properties from some tests. Different methods, such as flowmeter logging and slug testing, occasionally produced different interpretations. Aquifer characterization improved with increases in the number of data points, the period of data collection, and the number of methods used.

  13. Invited Article: Concepts and tools for the evaluation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Iyer, Hari K.

    2017-01-01

    Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.
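
    As a concrete illustration of the Monte Carlo approach mentioned above, uncertainty can be propagated through a measurement model by sampling the inputs and summarizing the output distribution. A minimal sketch for a hypothetical density measurement rho = m/V (the standard uncertainties below are made up for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000
    # Hypothetical measurement model: rho = m / V, inputs sampled from
    # normal distributions with their standard uncertainties.
    m = rng.normal(12.3456, 0.0021, n)      # mass, g
    V = rng.normal(5.4321, 0.0034, n)       # volume, cm^3
    rho = m / V

    y = rho.mean()                          # Monte Carlo estimate
    u = rho.std(ddof=1)                     # standard uncertainty
    lo, hi = np.percentile(rho, [2.5, 97.5])
    print(f"rho = {y:.5f} g/cm^3, u = {u:.5f}, 95% interval [{lo:.5f}, {hi:.5f}]")
    ```

    Unlike first-order "propagation of errors", this approach makes no linearity assumption about the measurement model.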

  14. Characterizing natural colloidal/particulate-protein interactions using fluorescence-based techniques and principal component analysis.

    PubMed

    Peiris, Ramila H; Ignagni, Nicholas; Budman, Hector; Moresoli, Christine; Legge, Raymond L

    2012-09-15

    Characterization of the interactions between natural colloidal/particulate- and protein-like matter is important for understanding their contribution to different physicochemical phenomena such as membrane fouling, the adsorption of bacteria onto surfaces, and various applications of nanoparticles in nanomedicine and nanotoxicology. Precise interpretation of the extent of such interactions is, however, hindered by the limitations of most characterization methods in allowing rapid, sensitive, and accurate measurements. Here we report on a fluorescence-based excitation-emission matrix (EEM) approach in combination with principal component analysis (PCA) to extract information related to the interaction between natural colloidal/particulate- and protein-like matter. Surface plasmon resonance (SPR) analysis and fiber-optic-probe-based surface fluorescence measurements were used to confirm that the proposed approach can characterize colloidal/particulate-protein interactions at the physical level. This method has the potential to provide a fundamental measurement of these interactions, with the advantage that it can be performed rapidly and with high sensitivity. Copyright © 2012 Elsevier B.V. All rights reserved.
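
    A minimal sketch of the EEM-plus-PCA idea: each sample's excitation-emission matrix is unfolded into a vector, and PCA (here via SVD) extracts dominant spectral patterns and per-sample scores whose shifts can track interactions. Array shapes and names are illustrative, not the authors' pipeline:

    ```python
    import numpy as np

    def eem_pca(eems, n_components=3):
        """PCA of a stack of EEMs. eems: (n_samples, n_ex, n_em).
        Returns per-sample scores, loading maps, and explained variance."""
        X = eems.reshape(eems.shape[0], -1)          # unfold each EEM
        Xc = X - X.mean(axis=0)                      # mean-center
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = U[:, :n_components] * s[:n_components]
        loadings = Vt[:n_components].reshape(n_components, *eems.shape[1:])
        explained = s[:n_components] ** 2 / (s ** 2).sum()
        return scores, loadings, explained
    ```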

  15. Standardization of Nanoparticle Characterization: Methods for Testing Properties, Stability, and Functionality of Edible Nanoparticles.

    PubMed

    McClements, Jake; McClements, David Julian

    2016-06-10

    There has been a rapid increase in the fabrication of various kinds of edible nanoparticles for oral delivery of bioactive agents, such as those constructed from proteins, carbohydrates, lipids, and/or minerals. It is currently difficult to compare the relative advantages and disadvantages of different kinds of nanoparticle-based delivery systems because researchers use different analytical instruments and protocols to characterize them. In this paper, we briefly review the various analytical methods available for characterizing the properties of edible nanoparticles, such as composition, morphology, size, charge, physical state, and stability. This information is then used to propose a number of standardized protocols for characterizing nanoparticle properties, for evaluating their stability to environmental stresses, and for predicting their biological fate. Implementation of these protocols would facilitate comparison of the performance of nanoparticles under standardized conditions, which would facilitate the rational selection of nanoparticle-based delivery systems for different applications in the food, health care, and pharmaceutical industries.

  16. Bioinspired sensory systems for local flow characterization

    NASA Astrophysics Data System (ADS)

    Colvert, Brendan; Chen, Kevin; Kanso, Eva

    2016-11-01

    Empirical evidence suggests that many aquatic organisms sense differential hydrodynamic signals. This sensory information is decoded to extract relevant flow properties. This task is challenging because it relies on local and partial measurements, whereas classical flow characterization methods depend on an external observer to reconstruct global flow fields. Here, we introduce a mathematical model in which a bioinspired sensory array measuring differences in local flow velocities characterizes the flow type and intensity. We linearize the flow field around the sensory array and express the velocity gradient tensor in terms of frame-independent parameters. We develop decoding algorithms that allow the sensory system to characterize the local flow and discuss the conditions under which this is possible. We apply this framework to the canonical problem of a circular cylinder in uniform flow, finding excellent agreement between sensed and actual properties. Our results imply that combining suitable velocity sensors with physics-based methods for decoding sensory measurements leads to a powerful approach for understanding and developing underwater sensory systems.
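
    The linearization step described above can be written compactly: the sensed velocity differences sample the velocity gradient tensor, whose symmetric and antisymmetric parts carry the strain and rotation content from which frame-independent flow descriptors can be built. In the notation below (a standard decomposition, not copied from the paper):

    ```latex
    \mathbf{u}(\mathbf{x}) \;\approx\; \mathbf{u}_0 + (\nabla\mathbf{u})\,\mathbf{x},
    \qquad
    \nabla\mathbf{u}
    = \underbrace{\tfrac{1}{2}\left(\nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf{T}}\right)}_{\text{strain rate } \mathbf{S}}
    + \underbrace{\tfrac{1}{2}\left(\nabla\mathbf{u} - \nabla\mathbf{u}^{\mathsf{T}}\right)}_{\text{rotation } \boldsymbol{\Omega}}
    ```

    Invariants of S and Ω (for example, their magnitudes and relative weight) are independent of sensor orientation, which is roughly what makes a local array able to report flow type and intensity.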

  17. A Novel Physical Sensing Principle for Liquid Characterization Using Paper-Based Hygro-Mechanical Systems (PB-HMS).

    PubMed

    Perez-Cruz, Angel; Stiharu, Ion; Dominguez-Gonzalez, Aurelio

    2017-07-20

    In recent years, paper-based microfluidic systems have emerged as versatile tools for developing sensors in different areas. In this work, we report a novel physical sensing principle for the characterization of liquids using a paper-based hygro-mechanical system (PB-HMS). The PB-HMS is formed by the interaction of liquid droplets and paper-based mini-structures such as cantilever beams. The proposed principle takes advantage of the hygroscopic properties of paper to produce hygro-mechanical motion. The dynamic response of the PB-HMS reveals information about the tested liquid that can be applied to characterize certain properties of liquids. A suggested method to characterize liquids by means of the proposed principle is introduced, and the experimental results show the feasibility of such a method. It is expected that the proposed principle may be applied to sense properties of liquids in different applications where both disposability and portability are of extreme importance.

  18. Characterization of NIST food-matrix Standard Reference Materials for their vitamin C content.

    PubMed

    Thomas, Jeanice B; Yen, James H; Sharpless, Katherine E

    2013-05-01

    The vitamin C concentrations in three food-matrix Standard Reference Materials (SRMs) from the National Institute of Standards and Technology (NIST) have been determined by liquid chromatography (LC) with absorbance detection. These materials (SRM 1549a Whole Milk Powder, SRM 1849a Infant/Adult Nutritional Formula, and SRM 3233 Fortified Breakfast Cereal) have been characterized to support analytical measurements made by food processors that are required to provide information about their products' vitamin C content on the labels of products distributed in the United States. The SRMs are primarily intended for use in validating analytical methods for the determination of selected vitamins, elements, fatty acids, and other nutrients in these materials and in similar matrixes. They can also be used for quality assurance in the characterization of test samples or in-house control materials, and for establishing measurement traceability. Within-day precision of the LC method used to measure vitamin C in the food-matrix SRMs characterized in this study ranged from 2.7% to 6.5%.

  19. Surface Characterization of Nanomaterials and Nanoparticles. Important needs and challenging opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Donald R.; Engelhard, Mark H.; Johnson, Grant E.

    2013-08-27

    This review examines the characterization challenges inherently associated with understanding nanomaterials and how surface characterization methods can help meet those challenges. In parts of the research community, there is growing recognition that many studies and published reports on the properties and behaviors of nanomaterials have involved inadequate characterization. As a consequence, the true value of the data in these reports is, at best, uncertain. As the importance of nanomaterials in fundamental research and technological applications increases, it is necessary for researchers to recognize the challenges associated with reproducible materials synthesis, maintaining desired materials properties during handling and processing, and the dynamic nature of nanomaterials, especially nanoparticles. Researchers also need to understand how characterization approaches (surface and otherwise) can be used to minimize synthesis surprises and to determine how (and how quickly) materials and properties change in different environments. The types of information that can be provided by traditional surface sensitive analysis methods (including X-ray photoelectron and Auger electron spectroscopies, scanning probe microscopy and secondary ion mass spectroscopy) and less common or evolving surface sensitive methods (e.g., nuclear magnetic resonance, sum frequency generation, and low and medium energy ion scattering) are discussed, and examples of their use in nanomaterial research are presented.

  20. Remediation of a Former USAF Radioactive Material Disposal Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, D. E.; Cushman, M; Tupyi, B.

    2003-02-25

    This paper describes the remediation of a low-level radiological waste burial site located at the former James Connally Air Force Base in Waco, Texas. Burial activities at the site occurred during the 1950s, when the property was under the ownership of the United States Air Force. Included is a discussion of the methods and strategies that were used to successfully exhume and characterize the wastes for proper disposal at offsite disposal facilities. Worker and environmental protection measures are also described. Information gained from this project may be used at other similar project sites. A total of nine burial tubes had been identified for excavation, characterization, and removal from the site. The disposal tubes were constructed of 4-ft lengths of concrete pipe buried upright with the upper ends flush with the ground surface. Initial ground-level observations of the burial tubes indicated that some weathering had occurred; however, the condition of the subsurface portions of the tubes was unknown. Soil excavation occurred in 1-foot lifts so that the tubes could be inspected and the soils characterized at each stage of the excavation. Due to the weight of the concrete pipe and the condition of the piping joints, it was determined that special measures would be required to keep the tubes intact during their removal. Special tube anchoring and handling methods were required to relocate the tubes from their initial positions to a staging area where they could be further characterized. Characterization of the disposal tubes was accomplished using a combination of gamma spectroscopy and activity mapping methods. Important aspects of the project included the use of specialized excavation and disposal-tube reinforcement measures to maintain the disposal tubes intact during excavation, removal, and subsequent characterization. The non-intrusive gamma spectroscopy and data logging methods allowed for effective characterization of the wastes while minimizing disposal costs. In addition, worker exposures were maintained ALARA as a result of the removal and characterization methods employed.

  1. Colorization and Automated Segmentation of Human T2 MR Brain Images for Characterization of Soft Tissues

    PubMed Central

    Attique, Muhammad; Gilanie, Ghulam; Hafeez-Ullah; Mehmood, Malik S.; Naweed, Muhammad S.; Ikram, Masroor; Kamran, Javed A.; Vitkin, Alex

    2012-01-01

    Characterization of tissues such as brain by using magnetic resonance (MR) images, and colorization of the gray-scale image, have been reported in the literature, along with their advantages and drawbacks. Here, we present two independent methods: (i) a novel colorization method to underscore the variability in brain MR images, indicative of the underlying physical density of biological tissue, and (ii) a segmentation method (both hard and soft segmentation) to characterize gray-scale brain MR images. The segmented images are then transformed into color using the above-mentioned colorization method, yielding promising results for manual tracing. Our color transformation incorporates the voxel classification by matching the luminance of voxels of the source MR image and the provided color image, measuring the distance between them. The segmentation method is based on single-phase clustering for 2D and 3D image segmentation with a new auto centroid selection method, which divides the image into three distinct regions (gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF)) using prior anatomical knowledge. Results have been successfully validated on human T2-weighted (T2) brain MR images. The proposed method can potentially be applied to gray-scale images from other imaging modalities, bringing out additional diagnostic tissue information through the colorization approach described. PMID:22479421
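
    To make the hard-segmentation step concrete, here is a generic intensity-clustering sketch (plain k-means into three classes on voxel intensities; the paper's auto centroid selection and soft segmentation are not reproduced, and all names are illustrative):

    ```python
    import numpy as np

    def kmeans_segment(img, k=3, iters=50, seed=0):
        """Cluster voxel intensities into k classes (e.g. GM, WM, CSF).
        Works for 2D or 3D arrays; returns an integer label volume."""
        x = img.reshape(-1, 1).astype(float)
        rng = np.random.default_rng(seed)
        centers = rng.choice(x.ravel(), size=k, replace=False).reshape(k, 1)
        for _ in range(iters):
            labels = np.argmin(np.abs(x - centers.T), axis=1)  # nearest center
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = x[labels == j].mean()         # update center
        return labels.reshape(img.shape)
    ```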

  2. Beyond mind-reading: multi-voxel pattern analysis of fMRI data.

    PubMed

    Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V

    2006-09-01

    A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
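
    A minimal illustration of the decoding idea behind MVPA: a linear classifier is trained on multi-voxel patterns and evaluated with cross-validation, with above-chance accuracy taken as evidence that the patterns carry condition information. The scikit-learn calls are real; the data below are synthetic placeholders, not an actual fMRI pipeline:

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 500))          # (timepoints, voxels) patterns
    y = np.repeat([0, 1], 60)                # two experimental conditions
    X[y == 1, :25] += 0.4                    # weak distributed signal

    clf = LinearSVC(C=1.0, dual=False)       # linear decoder
    acc = cross_val_score(clf, X, y, cv=6)   # cross-validated accuracy
    print(f"decoding accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")
    ```

    In practice the folds follow scanner runs and the labels follow the experimental design, but the logic is the same.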

  3. Characterizing Storm Event Dynamics of a Forested Watershed in the Lower Atlantic Coastal Plain, South Carolina USA

    NASA Astrophysics Data System (ADS)

    Latorre Torres, I. B.; Amatya, D. M.; Callahan, T. J.; Levine, N. S.

    2007-12-01

    Hydrology research in the Southeast U.S. has primarily focused on upland mountainous areas; however, much less is known about hydrological processes in Lower Coastal Plain (LCP) watersheds. Such watersheds are difficult to characterize due to shallow water table conditions, low topographic gradient, complex surface-subsurface water interaction, and a lack of detailed soil information. Although opportunities to conduct long-term monitoring in relatively undeveloped watersheds are often limited, stream flow and rainfall in the Turkey Creek watershed (a third-order watershed of about 7200 ha in the Francis Marion National Forest near Charleston, SC) have been monitored since 1964. In this study, event runoff-rainfall ratios have been determined for 51 storm events using historical data from 1964-1973. One of our objectives was to characterize relationships between seasonal event rainfall and storm outflow in this watershed. To this end, observed storm event data were compared with values predicted by established hydrological methods such as the Soil Conservation Service runoff curve number (SCS-CN) method and the rational method, integrated within a Geographical Information System (GIS), to estimate total event runoff and peak discharge, respectively. Available 1:15000 scale aerial images were digitized to obtain land uses, which were used with the SCS soil hydrologic groups to obtain the runoff coefficients (C) for the rational method and the CN values for the SCS-CN method. These methods are being tested against historical storm event responses at the Turkey Creek watershed scale, and will then be used to predict event runoff in Quinby Creek, an ungauged third-order watershed (8700 ha) adjacent to Turkey Creek. Successful testing, with refinement of parameters in the rational and SCS-CN methods, both designed for small urban and agriculture-dominated watersheds, may allow widespread application of these methods for studying event rainfall-runoff dynamics in similar watersheds of the Lower Coastal Plain of the Southeast U.S.
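
    For context, the SCS-CN event-runoff relation being tested has a simple closed form; a minimal sketch in English units, with the standard initial-abstraction assumption Ia = 0.2S, is:

    ```python
    def scs_cn_runoff(p_in, cn):
        """SCS curve-number event runoff (inches):
        Q = (P - Ia)^2 / (P - Ia + S), with S = 1000/CN - 10 and Ia = 0.2 S."""
        s = 1000.0 / cn - 10.0        # potential maximum retention, in
        ia = 0.2 * s                  # initial abstraction, in
        if p_in <= ia:
            return 0.0                # all rainfall abstracted, no runoff
        return (p_in - ia) ** 2 / (p_in - ia + s)

    print(scs_cn_runoff(3.0, 70))     # ~0.71 in of runoff from a 3 in storm
    ```

    The CN values come from land use and soil hydrologic group, which is exactly the information the digitized aerial images supply.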

  4. Bioelectrical Impedance Methods for Noninvasive Health Monitoring: A Review

    PubMed Central

    Bera, Tushar Kanti

    2014-01-01

    Under alternating electrical excitation, biological tissues produce a complex electrical impedance which depends on tissue composition, structure, health status, and the applied signal frequency; hence, bioelectrical impedance methods can be utilized for noninvasive tissue characterization. As the impedance responses of these tissue parameters vary with the frequency of the applied signal, impedance analysis conducted over a wide frequency band provides more information about the tissue interior, which helps us to better understand tissue anatomy, physiology, and pathology. Over the past few decades, a number of impedance-based noninvasive tissue characterization techniques, such as bioelectrical impedance analysis (BIA), electrical impedance spectroscopy (EIS), electrical impedance plethysmography (IPG), impedance cardiography (ICG), and electrical impedance tomography (EIT), have been proposed, and much research has been conducted on these methods for noninvasive tissue characterization and disease diagnosis. In this paper, the BIA, EIS, IPG, ICG, and EIT techniques and their applications in different fields are reviewed, and a technical perspective on these impedance methods is presented. The working principles, applications, merits, and demerits of these methods are discussed in detail, along with other technical issues, followed by present status and future trends. PMID:27006932
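
    The frequency dependence described above is commonly summarized by the Cole (Cole-Cole) model of tissue impedance; a small sketch under that standard assumption (parameter values are illustrative, not from the review):

    ```python
    import numpy as np

    def cole_impedance(f_hz, r0, rinf, tau, alpha):
        """Cole tissue impedance Z = Rinf + (R0 - Rinf) / (1 + (j w tau)^alpha),
        the standard lumped model behind BIA/EIS tissue characterization.
        r0/rinf are the low-/high-frequency resistances; 0 < alpha <= 1."""
        w = 2.0 * np.pi * np.asarray(f_hz, float)
        return rinf + (r0 - rinf) / (1.0 + (1j * w * tau) ** alpha)

    freqs = np.logspace(2, 6, 9)     # 100 Hz to 1 MHz sweep
    z = cole_impedance(freqs, r0=600.0, rinf=250.0, tau=1e-5, alpha=0.8)
    print(np.abs(z))                 # |Z| falls as current penetrates cells
    ```

    Fitting measured spectra to this model yields parameters (R0, Rinf, tau, alpha) that serve as compact tissue descriptors.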

  5. Analytical approaches for the characterization and quantification of nanoparticles in food and beverages.

    PubMed

    Mattarozzi, Monica; Suman, Michele; Cascio, Claudia; Calestani, Davide; Weigel, Stefan; Undas, Anna; Peters, Ruud

    2017-01-01

    Estimating consumer exposure to nanomaterials (NMs) in food products and predicting their toxicological properties are necessary steps in the assessment of the risks of this technology. To this end, analytical methods have to be available to detect, characterize and quantify NMs in food and materials related to food, e.g. food packaging and biological samples following metabolization of food. The challenge for the analytical sciences is that the characterization of NMs requires chemical as well as physical information. This article offers a comprehensive analysis of methods available for the detection and characterization of NMs in food and related products. Special attention was paid to the crucial role of sample preparation methods, since these have been partially neglected in the scientific literature so far. The currently available instrumental methods are grouped as fractionation, counting and ensemble methods, and their advantages and limitations are discussed. We conclude that much progress has been made over the last 5 years but that many challenges still exist. Future perspectives and priority research needs are pointed out. Graphical abstract: two possible analytical strategies for the sizing and quantification of nanoparticles are asymmetric flow field-flow fractionation with multiple detectors (which yields the true size and a mass-based particle size distribution) and single-particle inductively coupled plasma mass spectrometry (which yields a spherical equivalent diameter and a number-based particle size distribution).

  6. Integrating asthma hazard characterization methods for consumer products.

    PubMed

    Maier, A; Vincent, M J; Gadagbui, B; Patterson, J; Beckett, W; Dalton, P; Kimber, I; Selgrade, M J K

    2014-10-01

    Despite extensive study, definitive conclusions regarding the relationship between asthma and consumer products remain elusive. Uncertainties reflect the multi-faceted nature of asthma (i.e., contributions of immunologic and non-immunologic mechanisms). Many substances used in consumer products are associated with occupational asthma or asthma-like syndromes. However, risk assessment methods do not adequately predict the potential for consumer product exposures to trigger asthma and related syndromes under lower-level end-user conditions. A decision tree system is required to characterize asthma and respiratory-related hazards associated with consumer products. A system can be built to incorporate the best features of existing guidance, frameworks, and models using a weight-of-evidence (WoE) approach. With this goal in mind, we have evaluated chemical hazard characterization methods for asthma and asthma-like responses. Despite the wealth of information available, current hazard characterization methods do not definitively identify whether a particular ingredient will cause or exacerbate asthma, asthma-like responses, or sensitization of the respiratory tract at lower levels associated with consumer product use. Effective use of hierarchical lines of evidence relies on consideration of the relevance and potency of assays, organization of assays by mode of action, and better assay validation. It is anticipated that the analysis of existing methods will support the development of a refined WoE approach. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Advancing Models and Data for Characterizing Exposures to Chemicals in Consumer Products

    EPA Science Inventory

    EPA’s Office of Research and Development (ORD) is leading several efforts to develop data and methods for estimating population chemical exposures related to the use of consumer products. New curated chemical, ingredient, and product use information is being collected fro...

  8. Social Interest in High-Functioning Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Fletcher-Watson, Sue; Leekam, Susan R.; Findlay, John M.

    2013-01-01

    Autism spectrum disorders (ASD) are principally characterized by impairments in social functioning. Experimental investigation often is conducted using methods measuring social attention, social cognition, and social communication. In this study, we instead measured interest in social information, making a distinction between basic-level…

  9. From 2D to 3D Supervised Segmentation and Classification for Cultural Heritage Applications

    NASA Astrophysics Data System (ADS)

    Grilli, E.; Dininno, D.; Petrucci, G.; Remondino, F.

    2018-05-01

    The digital management of architectural heritage information is still a complex problem, as a heritage object requires an integrated representation of various types of information in order to develop appropriate restoration or conservation strategies. Currently, extensive research focuses on automatic procedures for the segmentation and classification of 3D point clouds or meshes, which can accelerate the study of a monument and enrich it with heterogeneous information and attributes. The aim of this study is to propose an optimal, repeatable and reliable procedure to manage various types of 3D surveying data and associate them with heterogeneous information and attributes that characterize and describe the surveyed object. In particular, this paper presents an approach for classifying 3D heritage models, starting from the segmentation of their textures based on supervised machine learning methods. Experimental results on three different case studies demonstrate that the proposed approach is effective and has considerable further potential.
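
    A minimal sketch of the supervised texture-classification step described above, under assumed per-pixel features and class labels of the kind a heritage annotator might supply; the feature set, class coding, and random forest choice are illustrative, not the authors' exact pipeline.

        # Per-pixel supervised classification of a texture image; predicted
        # labels would then be projected onto the 3D model via its UV map.
        # Features (RGB plus one local-statistics channel) and labels
        # (e.g. 0 = stone, 1 = mortar) are synthetic stand-ins.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X_train = rng.random((1000, 4))          # annotated pixel features
        y_train = rng.integers(0, 2, 1000)       # material class per pixel

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_train, y_train)

        X_texture = rng.random((512 * 512, 4))   # all pixels of one texture
        label_map = clf.predict(X_texture).reshape(512, 512)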

  10. The exposure of the nursing profession in online and print media

    PubMed Central

    Cardoso, Rodrigo José Martins; Graveto, João Manuel Garcia de Nascimento; Queiroz, Ana Maria Correia Albuquerque

    2014-01-01

    Objective to describe the coverage of news concerning the nursing profession in the Portuguese media: informative sites on the Internet and in print media. Method a total of 1,271 health news items were collected in September and October of 2011 (956 online news items and 325 news items originating from the press review of the Portuguese Order of Nurses). Statistical analysis was used to characterize the variables. Results nurses were the sources of information in 6.6% of cases, suggesting limited media exposure. The health news collected was characterized by production based on limited information sources (that is, male and official sources) and by information disseminated by news agencies, focused on economic and political issues in the health field. Conclusion the presence of nurses in news concerning nursing and health is limited. We suggest that nurses develop public communication skills to disseminate the importance of their profession in society and strengthen their relationship with the media. PMID:24553715

  11. Characterization of trabecular bone using the backscattered spectral centroid shift.

    PubMed

    Wear, Keith A

    2003-04-01

    Ultrasonic attenuation in bone in vivo is generally measured using a through-transmission method at the calcaneus. Although attenuation in the calcaneus has been demonstrated to be a useful predictor of osteoporotic fracture risk, measurements at other clinically important sites, such as the hip and spine, could potentially contain additional useful diagnostic information. Through-transmission measurements may not be feasible at these sites due to complex bone shapes and the increased amount of intervening soft tissue. The centroid shift of the backscattered signal is an index of attenuation slope and has been used previously to characterize soft tissues. In this paper, the centroid shift of signals backscattered from 30 trabecular bone samples was measured in vitro. The attenuation slope was also measured using a through-transmission method. The correlation coefficient between centroid shift and attenuation slope was -0.71. The 95% confidence interval was (-0.86, -0.47). These results suggest that the backscattered spectral centroid shift may contain useful diagnostic information potentially applicable to the hip and spine.
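
    A minimal sketch of the spectral centroid computation underlying this index, assuming gated RF echo data; the sampling rate, gate split, and analysis band below are assumptions, and the paper's calibration and gating details are not reproduced.

        # Centroid frequency of the backscattered power spectrum; the shift
        # between a shallow and a deep range gate indexes attenuation slope.
        import numpy as np

        def spectral_centroid(rf_gate, fs, band=(0.3e6, 0.7e6)):
            """Centroid frequency (Hz) of the power spectrum within a band."""
            power = np.abs(np.fft.rfft(rf_gate)) ** 2
            freqs = np.fft.rfftfreq(len(rf_gate), d=1.0 / fs)
            m = (freqs >= band[0]) & (freqs <= band[1])
            return np.sum(freqs[m] * power[m]) / np.sum(power[m])

        fs = 50e6                                  # assumed sampling rate
        rf = np.random.default_rng(1).standard_normal(2048)  # stand-in echo
        shift = spectral_centroid(rf[:1024], fs) - spectral_centroid(rf[1024:], fs)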

  12. Establishment of a reference collection of additives and an analytical handbook of reference data to support enforcement of EU regulations on food contact plastics.

    PubMed

    van Lierop, B; Castle, L; Feigenbaum, A; Ehlert, K; Boenke, A

    1998-10-01

    A collection has been made of additives that are required as analytical standards for enforcement of European Union legislation on food contact plastics. The 100 additives have been characterized by mass spectrometry, infra-red spectroscopy and proton nuclear magnetic resonance spectroscopy to provide reference spectra. Gas chromatographic retention times have been recorded to facilitate identification by retention index. This information has been further supplemented by physico-chemical data. Finally, chromatographic methods have been used to indicate the presence of any impurities in the commercial chemicals. Samples of the reference substances are available on request and the collection of spectra and other information will be made available in printed format and on-line through the Internet. This paper gives an overview of the work done to establish the reference collection and the spectral atlas, which together will assist enforcement laboratories in the characterization of plastics and the selection of analytical methods for additives that may migrate.

  13. A global parallel model based design of experiments method to minimize model output uncertainty.

    PubMed

    Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E

    2012-03-01

    Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
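
    The sketch below is not the authors' sparse-grid algorithm; it only illustrates, under a toy model and assumed parameter bounds, the underlying idea of choosing design points where the ensemble of model predictions is most uncertain.

        # Greedy uncertainty-driven selection of one design point: propagate
        # a bounded uncertain parameter through a toy model and pick the
        # measurement time where the predicted response spread is largest.
        import numpy as np

        def model(k, t):
            """Toy response: exponential relaxation with uncertain rate k."""
            return 1.0 - np.exp(-np.outer(k, t))

        rng = np.random.default_rng(0)
        k_samples = rng.uniform(0.1, 2.0, 500)   # bounded uncertain parameter
        times = np.linspace(0.1, 10.0, 100)      # candidate design points

        ensemble = model(k_samples, times)       # (500, 100) predictions
        spread = ensemble.std(axis=0)            # predictive uncertainty
        next_design_point = times[np.argmax(spread)]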

  14. Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices.

    PubMed

    Li, Zhigang; Liu, Boying; Yuan, Mengxiong; Zhang, Feifei; Guo, Jiaqiang

    2016-01-01

    Newly manufactured electronic devices are subject to different levels of potential defects existing among the initial parameter information of the devices. In this study, a characterization of electromagnetic relays that were operated at their optimal performance with appropriate and steady parameter values was performed to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the initial parameter information value and stability were quantified to measure the performance of the electronics. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined by using the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small-sized samples was proposed on the basis of both measures. Finally, a model for the relationship of the initial contact resistance and stability over the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted based on their initial parameter information.

  15. Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices

    PubMed Central

    Li, Zhigang; Liu, Boying; Yuan, Mengxiong; Zhang, Feifei; Guo, Jiaqiang

    2016-01-01

    Newly manufactured electronic devices are subject to different levels of potential defects existing among the initial parameter information of the devices. In this study, a characterization of electromagnetic relays that were operated at their optimal performance with appropriate and steady parameter values was performed to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the initial parameter information value and stability were quantified to measure the performance of the electronics. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined by using the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small-sized samples was proposed on the basis of both measures. Finally, a model for the relationship of the initial contact resistance and stability over the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted based on their initial parameter information. PMID:27907188

  16. Hierarchical mutual information for the comparison of hierarchical community structures in complex networks

    NASA Astrophysics Data System (ADS)

    Perotti, Juan Ignacio; Tessone, Claudio Juan; Caldarelli, Guido

    2015-12-01

    The quest for a quantitative characterization of community and modular structure of complex networks produced a variety of methods and algorithms to classify different networks. However, it is not clear if such methods provide consistent, robust, and meaningful results when considering hierarchies as a whole. Part of the problem is the lack of a similarity measure for the comparison of hierarchical community structures. In this work we give a contribution by introducing the hierarchical mutual information, which is a generalization of the traditional mutual information and makes it possible to compare hierarchical partitions and hierarchical community structures. The normalized version of the hierarchical mutual information should behave analogously to the traditional normalized mutual information. Here the correct behavior of the hierarchical mutual information is corroborated on an extensive battery of numerical experiments. The experiments are performed on artificial hierarchies and on the hierarchical community structure of artificial and empirical networks. Furthermore, the experiments illustrate some of the practical applications of the hierarchical mutual information, namely the comparison of different community detection methods and the study of the consistency, robustness, and temporal evolution of the hierarchical modular structure of networks.
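
    For reference, the flat measure that the hierarchical mutual information generalizes can be computed in one call; the example labelings below are arbitrary.

        # Traditional normalized mutual information between two flat
        # partitions (community labelings) of the same eight nodes.
        from sklearn.metrics import normalized_mutual_info_score

        partition_a = [0, 0, 0, 1, 1, 1, 2, 2]
        partition_b = [0, 0, 1, 1, 1, 2, 2, 2]
        print(normalized_mutual_info_score(partition_a, partition_b))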

  17. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.
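
    A minimal sketch of the AIC/BIC model-selection step, fitting a single Weibull to illustrative data; a full treatment would fit k-component Weibull mixtures by EM and compare the criteria across k, which is not attempted here.

        # AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L for a fitted Weibull.
        import numpy as np
        from scipy import stats

        data = stats.weibull_min.rvs(c=2.0, scale=1.5, size=1000, random_state=0)
        c, loc, scale = stats.weibull_min.fit(data, floc=0)  # shape, loc, scale
        loglik = np.sum(stats.weibull_min.logpdf(data, c, loc, scale))
        k, n = 2, len(data)            # free parameters: shape and scale
        aic = 2 * k - 2 * loglik
        bic = k * np.log(n) - 2 * loglik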

  18. High-resolution remotely sensed small target detection by imitating fly visual perception mechanism.

    PubMed

    Huang, Fengchen; Xu, Lizhong; Li, Min; Tang, Min

    2012-01-01

    The difficulties and limitations of small target detection methods for high-resolution remote sensing data have become a recent research hot spot. Inspired by the information capture and processing theory of the fly visual system, this paper endeavors to construct a characterized model of information perception and to exploit the fly's fast and accurate small target detection under complex, varied natural environments. The proposed model forms a theoretical basis of small target detection for high-resolution remote sensing data. After comparing the prevailing simulation mechanisms behind fly visual systems, we propose a fly-imitated visual system method of information processing for high-resolution remote sensing data. A small target detector and corresponding detection algorithm are designed by simulating the mechanism of information acquisition, compression, and fusion of the fly visual system, the function of the pool cell, and the character of nonlinear self-adaption. Experiments verify the feasibility and rationality of the proposed small target detection model and fly-imitated visual perception method.

  19. Coming to Grips with Ambiguity: Ion Mobility-Mass Spectrometry for Protein Quaternary Structure Assignment

    NASA Astrophysics Data System (ADS)

    Eschweiler, Joseph D.; Frank, Aaron T.; Ruotolo, Brandon T.

    2017-10-01

    Multiprotein complexes are central to our understanding of cellular biology, as they play critical roles in nearly every biological process. Despite many impressive advances associated with structural characterization techniques, large and highly-dynamic protein complexes are too often refractory to analysis by conventional, high-resolution approaches. To fill this gap, ion mobility-mass spectrometry (IM-MS) methods have emerged as a promising approach for characterizing the structures of challenging assemblies due in large part to the ability of these methods to characterize the composition, connectivity, and topology of large, labile complexes. In this Critical Insight, we present a series of bioinformatics studies aimed at assessing the information content of IM-MS datasets for building models of multiprotein structure. Our computational data highlights the limits of current coarse-graining approaches, and compelled us to develop an improved workflow for multiprotein topology modeling, which we benchmark against a subset of the multiprotein complexes within the PDB. This improved workflow has allowed us to ascertain both the minimal experimental restraint sets required for generation of high-confidence multiprotein topologies, and quantify the ambiguity in models where insufficient IM-MS information is available. We conclude by projecting the future of IM-MS in the context of protein quaternary structure assignment, where we predict that a more complete knowledge of the ultimate information content and ambiguity within such models will undoubtedly lead to applications for a broader array of challenging biomolecular assemblies.

  20. Micro-XRF for characterization of Moroccan glazed ceramics and Portuguese tiles

    NASA Astrophysics Data System (ADS)

    Guilherme, A.; Manso, M.; Pessanha, S.; Zegzouti, A.; Elaatmani, M.; Bendaoud, R.; Coroado, J.; dos Santos, J. M. F.; Carvalho, M. L.

    2013-02-01

    A set of enamelled terracotta samples (Zellij) collected from five different monuments in Morocco was the object of this study. With the aim of characterizing these typically Moroccan artistic objects, X-ray spectroscopic techniques were used as analytical tools to provide elemental and compound information. A search of the international scientific literature reveals a lack of information about these types of artistic ceramics, so this investigation is an opportunity to fill that gap. For this purpose, micro-energy dispersive X-ray fluorescence (μ-EDXRF), wavelength dispersive X-ray fluorescence (WDXRF) and X-ray diffraction (XRD) were the chosen methods. As complementary information, a comparison with another sort of artistic pottery object is given, more precisely Portuguese glazed wall tiles (Azulejos), based on Islamic pottery traditions. Differences between these two types of decorative pottery were found and are presented in this manuscript.

  1. A Parallel Stochastic Framework for Reservoir Characterization and History Matching

    DOE PAGES

    Thomas, Sunil G.; Klie, Hector M.; Rodriguez, Adolfo A.; ...

    2011-01-01

    The spatial distribution of parameters that characterize the subsurface is never known to any reasonable level of accuracy required to solve the governing PDEs of multiphase flow or species transport through porous media. This paper presents a numerically cheap, yet efficient, accurate and parallel framework to estimate reservoir parameters, for example, medium permeability, using sensor information from measurements of the solution variables such as phase pressures, phase concentrations, fluxes, and seismic and well log data. Numerical results are presented to demonstrate the method.

  2. Solar Technology Validation Project - USS Data, LLC: Cooperative Research and Development Final Report, CRADA Number CRD-09-367-04

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilcox, S.

    2013-08-01

    Under this Agreement, NREL will work with Participant to improve concentrating solar power system performance characterizations. This work includes, but is not limited to, research and development of methods for acquiring renewable resource characterization information using site-specific measurements of solar radiation and meteorological conditions; collecting system performance data; and developing tools for improving the design, installation, operation, and maintenance of solar energy conversion systems. This work will be conducted at NREL and Participant facilities.

  3. Solar Technology Validation Project - RES Americas: Cooperative Research and Development Final Report, CRADA Number CRD-09-367-11

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilcox, S.

    Under this Agreement, NREL will work with Participant to improve concentrating solar power system performance characterizations. This work includes, but is not limited to, research and development of methods for acquiring renewable resource characterization information using site-specific measurements of solar radiation and meteorological conditions; collecting system performance data; and developing tools for improving the design, installation, operation, and maintenance of solar energy conversion systems. This work will be conducted at NREL and Participant facilities.

  4. Towards a detailed anthropometric body characterization using the Microsoft Kinect.

    PubMed

    Domingues, Ana; Barbosa, Filipa; Pereira, Eduardo M; Santos, Márcio Borgonovo; Seixas, Adérito; Vilas-Boas, João; Gabriel, Joaquim; Vardasca, Ricardo

    2016-01-01

    Anthropometry has been widely used in different fields, providing relevant information for medicine, ergonomics and biometric applications. However, existing solutions present marked disadvantages that limit the adoption of this type of evaluation. Studies have been conducted to determine anthropometric measures easily from data provided by low-cost sensors, such as the Microsoft Kinect. In this work, a methodology is proposed and implemented for estimating anthropometric measures from the information acquired with this sensor. The measures obtained with this method were compared with the ones from a validation system, Qualisys. Comparing the relative errors against state-of-the-art references, lower errors were verified for some of the estimated measures, and a more complete characterization of the whole-body structure was achieved.

  5. System and method for identifying, validating, weighing and characterizing moving or stationary vehicles and cargo

    DOEpatents

    Beshears, David L.; Batsell, Stephen G.; Abercrombie, Robert K.; Scudiere, Matthew B.; White, Clifford P.

    2007-12-04

    An asset identification and information infrastructure management (AI3M) device having an automated identification technology system (AIT), a Transportation Coordinators' Automated Information for Movements System II (TC-AIMS II), a weigh-in-motion system (WIM-II), and an Automated Air Load Planning system (AALPS) all in electronic communication for measuring and calculating actual asset characteristics, either statically or in-motion, and further calculating an actual load plan.

  6. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that is present in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons without influence of data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods, including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on simulated, biophysically realistic synapses, an Izhikevich cortical network based on this neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and exactly identify spurious causality in a network of three neurons. We conclude that the proposed method can obtain more reliable comparisons of interactions between different pairs of neurons and is a promising tool to uncover more details of the neural code. PMID:23940662
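
    A minimal sketch of a permutation-based transfer entropy estimate in the spirit of NPTE, under an assumed ordinal order and toy lag-coupled series; the paper's normalization and spike-train preprocessing are not reproduced.

        # Symbolize both series with ordinal patterns, then compute the
        # transfer entropy over the symbol sequences (bits). Order 3 and
        # the toy coupled series are illustrative choices.
        import numpy as np
        from collections import Counter

        def ordinal_symbols(x, order=3):
            """Map a series to ordinal-pattern symbols of the given order."""
            windows = np.lib.stride_tricks.sliding_window_view(x, order)
            return [tuple(r) for r in np.argsort(windows, axis=1)]

        def transfer_entropy(source, target, order=3):
            s, t = ordinal_symbols(source, order), ordinal_symbols(target, order)
            joint3 = Counter(zip(t[1:], t[:-1], s[:-1]))   # (y_next, y, x)
            joint2 = Counter(zip(t[:-1], s[:-1]))          # (y, x)
            pair = Counter(zip(t[1:], t[:-1]))             # (y_next, y)
            marg = Counter(t[:-1])                         # (y,)
            n = sum(joint3.values())
            return sum((c / n) * np.log2(c * marg[y] / (joint2[(y, x)] * pair[(yn, y)]))
                       for (yn, y, x), c in joint3.items())

        rng = np.random.default_rng(0)
        x = rng.standard_normal(5000)                         # "source" train
        y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)   # lag-driven target
        print(transfer_entropy(x, y), transfer_entropy(y, x))  # first exceeds second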

  7. Wavelet entropy: a new tool for analysis of short duration brain electrical signals.

    PubMed

    Rosso, O A; Blanco, S; Yordanova, J; Kolev, V; Figliola, A; Schürmann, M; Başar, E

    2001-01-30

    Since traditional electrical brain signal analysis is mostly qualitative, the development of new quantitative methods is crucial for restricting the subjectivity in the study of brain signals. These methods are particularly fruitful when they are strongly correlated with intuitive physical concepts that allow a better understanding of brain dynamics. Here, a new method based on the orthogonal discrete wavelet transform (ODWT) is applied. It takes as a basic element the ODWT of the EEG signal, and defines the relative wavelet energy, the wavelet entropy (WE) and the relative wavelet entropy (RWE). The relative wavelet energy provides information about the relative energy associated with the different frequency bands present in the EEG and their corresponding degree of importance. The WE carries information about the degree of order/disorder associated with a multi-frequency signal response, and the RWE measures the degree of similarity between different segments of the signal. In addition, the time evolution of the WE is calculated to give information about the dynamics in the EEG records. Within this framework, the major objective of the present work was to characterize in a quantitative way the functional dynamics of order/disorder microstates in short duration EEG signals. For that aim, spontaneous EEG signals under different physiological conditions were analyzed. Further, specific quantifiers were derived to characterize how a stimulus affects electrical events in terms of frequency synchronization (tuning) in the event related potentials.
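
    A minimal sketch of the relative wavelet energy and wavelet entropy defined above, computed with PyWavelets; the wavelet family, decomposition depth, and stand-in signal are assumptions, not the study's settings.

        # Relative wavelet energy per decomposition band and the normalized
        # wavelet entropy of a signal epoch, via an orthogonal DWT.
        import numpy as np
        import pywt

        def wavelet_entropy(signal, wavelet="db4", level=5):
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            p = energies / energies.sum()      # relative wavelet energy
            we = -np.sum(p * np.log(p))        # wavelet entropy
            return p, we / np.log(len(p))      # normalized to [0, 1]

        eeg_epoch = np.random.default_rng(0).standard_normal(1024)  # stand-in
        rel_energy, norm_we = wavelet_entropy(eeg_epoch)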

  8. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  9. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  10. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  11. Separation and characterization of gold nanoparticle mixtures by flow-field-flow fractionation.

    PubMed

    Calzolai, Luigi; Gilliland, Douglas; Garcìa, César Pascual; Rossi, François

    2011-07-08

    We show that using asymmetric flow-field-flow fractionation with a UV-vis detector it is possible to separate, characterize, and quantify the correct number size distribution of gold nanoparticle (AuNP) mixtures of various sizes in the 5-60 nm range, for which simple dynamic light scattering measurements give misleading information. The size of the collected nanoparticle fractions can be determined both in solution and in the solid state, and their surface chemistry can be characterized by NMR. This method will find widespread applications both in the process of "size purification" after the synthesis of AuNP and in the identification and characterization of gold-based nanomaterials in consumer products. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Four-dimensional characterization of a sheet-forming web

    DOEpatents

    Sari-Sarraf, Hamed; Goddard, James S.

    2003-04-22

    A method and apparatus are provided by which a sheet-forming web may be characterized in four dimensions. Light images of the web are recorded at a point adjacent the initial stage of the web, for example, near the headbox in a paperforming operation. The images are digitized, and the resulting data is processed by novel algorithms to provide a four-dimensional measurement of the web. The measurements include two-dimensional spatial information, the intensity profile of the web, and the depth profile of the web. These measurements can be used to characterize the web, predict its properties and monitor production events, and to analyze and quantify headbox flow dynamics.

  13. Characterization of Structural and Configurational Properties of DNA by Atomic Force Microscopy.

    PubMed

    Meroni, Alice; Lazzaro, Federico; Muzi-Falconi, Marco; Podestà, Alessandro

    2018-01-01

    We describe a method to extract quantitative information on DNA structural and configurational properties from high-resolution topographic maps recorded by atomic force microscopy (AFM). DNA molecules are deposited on mica surfaces from an aqueous solution, carefully dehydrated, and imaged in air in Tapping Mode. Upon extraction of the spatial coordinates of the DNA backbones from AFM images, several parameters characterizing DNA structure and configuration can be calculated. Here, we explain how to obtain the distribution of contour lengths, end-to-end distances, and gyration radii. This modular protocol can also be used to characterize other statistical parameters from AFM topographies.
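
    A minimal sketch of the three statistics named above, computed from a traced backbone; the trace here is a synthetic stand-in for coordinates extracted from an AFM topograph.

        # Contour length, end-to-end distance, and gyration radius from an
        # N x 2 array of traced backbone coordinates (units of nm).
        import numpy as np

        def chain_statistics(xy):
            steps = np.diff(xy, axis=0)
            contour = np.sum(np.linalg.norm(steps, axis=1))
            end_to_end = np.linalg.norm(xy[-1] - xy[0])
            centroid = xy.mean(axis=0)
            r_gyr = np.sqrt(np.mean(np.sum((xy - centroid) ** 2, axis=1)))
            return contour, end_to_end, r_gyr

        trace = np.cumsum(np.random.default_rng(0).standard_normal((200, 2)), axis=0)
        print(chain_statistics(trace))   # stand-in for an AFM-traced molecule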

  14. The Chameleon Effect: characterization challenges due to the variability of nanoparticles and their surfaces

    NASA Astrophysics Data System (ADS)

    Baer, Donald R.

    2018-05-01

    Nanoparticles in a variety of forms are increasingly important in fundamental research, technological and medical applications, and environmental or toxicology studies. Physical and chemical drivers that lead to multiple types of particle instabilities complicate the ability to produce, appropriately characterize, and consistently deliver well-defined particles, frequently leading to inconsistencies and conflicts in the published literature. This perspective suggests that provenance information, beyond that often recorded or reported, and a set of core characterization methods, including a surface sensitive technique, consistently applied at critical times, can serve as tools in the effort to minimize reproducibility issues.

  15. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product, typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is the lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate the environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology, used in conjunction with GIS, to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  16. Emotional First Aid: Crisis Development and Systems of Intervention.

    ERIC Educational Resources Information Center

    Rosenbluh, Edward S.; And Others

    This instructional manual takes a developmental approach toward understanding the psychological, social and behavioral dynamics of human crisis. The manual describes the behavior patterns characterizing various psychological and physical crises, and provides background information and methods of crisis intervention with which to manage each. In…

  17. Analysis methods for Kevlar shield response to rotor fragments

    NASA Technical Reports Server (NTRS)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  18. Abdominal Tumor Characterization and Recognition Using Superior-Order Cooccurrence Matrices, Based on Ultrasound Images

    PubMed Central

    Mitrea, Delia; Mitrea, Paulina; Nedevschi, Sergiu; Badea, Radu; Lupsor, Monica; Socaciu, Mihai; Golea, Adela; Hagiu, Claudia; Ciobanu, Lidia

    2012-01-01

    The noninvasive diagnosis of malignant tumors is an important issue in current research. Our purpose is to elaborate computerized, texture-based methods for performing computer-aided characterization and automatic diagnosis of these tumors, using only the information from ultrasound images. In this paper, we considered some of the most frequent abdominal malignant tumors: the hepatocellular carcinoma and the colonic tumors. We compared these structures with the benign tumors and with other visually similar diseases. Besides the textural features that proved in our previous research to be useful in the characterization and recognition of malignant tumors, we improved our method by using the grey level cooccurrence matrix and the edge orientation cooccurrence matrix of superior order. As our experiments showed, the new textural features increased the malignant tumor classification performance, also revealing visual and physical properties of these structures that emphasize the complex, chaotic structure of the corresponding tissue. PMID:22312411
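
    For orientation, the sketch below computes classical second-order cooccurrence features with scikit-image; the superior-order matrices of the paper extend this idea to more than two pixels and are not implemented here. The region of interest is synthetic, and in scikit-image versions before 0.19 the functions are spelled greycomatrix/greycoprops.

        # Second-order grey level cooccurrence matrix and four classical
        # Haralick-style features, averaged over distances and angles.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        roi = np.random.default_rng(0).integers(0, 64, (64, 64), dtype=np.uint8)
        glcm = graycomatrix(roi, distances=[1, 2], angles=[0, np.pi / 2],
                            levels=64, symmetric=True, normed=True)
        features = {prop: graycoprops(glcm, prop).mean()
                    for prop in ("contrast", "homogeneity", "energy", "correlation")}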

  19. Method for non-referential defect characterization using fractal encoding and active contours

    DOEpatents

    Gleason, Shaun S [Knoxville, TN; Sari-Sarraf, Hamed [Lubbock, TX

    2007-05-15

    A method for identification of anomalous structures, such as defects, includes the steps of providing a digital image and applying fractal encoding to identify a location of at least one anomalous portion of the image. The method does not require a reference image to identify the location of the anomalous portion. The method can further include the step of initializing an active contour based on the location information obtained from the fractal encoding step and deforming an active contour to enhance the boundary delineation of the anomalous portion.

  20. Evidential analysis of difference images for change detection of multitemporal remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Yin; Peng, Lijuan; Cremers, Armin B.

    2018-03-01

    In this article, we develop two methods for unsupervised change detection in multitemporal remote sensing images based on Dempster-Shafer's theory of evidence (DST). In most unsupervised change detection methods, the probability distribution of the difference image is assumed to be characterized by mixture models, whose parameters are estimated by the expectation maximization (EM) method. However, the main drawback of the EM method is that it does not consider spatial contextual information, which may entail rather noisy detection results with numerous spurious alarms. To remedy this, we first develop an evidence theory based EM method (EEM) which incorporates spatial contextual information in EM by iteratively fusing the belief assignments of neighboring pixels to the central pixel. Second, an evidential labeling method in the sense of maximizing a posteriori probability (MAP) is proposed in order to further enhance the detection result. It first uses the parameters estimated by EEM to initialize the class labels of a difference image. Then it iteratively fuses class conditional information and spatial contextual information, and updates labels and class parameters. Finally it converges to a fixed state which gives the detection result. A simulated image set and two real remote sensing data sets are used to evaluate the two evidential change detection methods. Experimental results show that the new evidential methods are comparable to other prevalent methods in terms of total error rate.
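
    A minimal sketch of the conventional EM baseline the paper improves on: fitting a two-component Gaussian mixture to difference-image values and labeling pixels without any spatial context. The synthetic data and component count are assumptions.

        # Plain EM for a two-component Gaussian mixture over difference-image
        # values; each pixel is then labeled by its most probable component.
        import numpy as np

        rng = np.random.default_rng(0)
        d = np.concatenate([rng.normal(0.1, 0.05, 9000),   # unchanged pixels
                            rng.normal(0.6, 0.10, 1000)])  # changed pixels

        w, mu, sig = np.array([0.5, 0.5]), np.array([0.0, 1.0]), np.array([0.2, 0.2])
        for _ in range(100):
            # E-step: posterior responsibility of each component per pixel
            pdf = np.exp(-0.5 * ((d[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
            resp = w * pdf
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: update weights, means, and standard deviations
            nk = resp.sum(axis=0)
            w, mu = nk / len(d), (resp * d[:, None]).sum(axis=0) / nk
            sig = np.sqrt((resp * (d[:, None] - mu) ** 2).sum(axis=0) / nk)

        change_map = resp.argmax(axis=1) == np.argmax(mu)  # True = changed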

  1. A Novel Method for Analyzing Microbially Affiliated Volatile Organic Compounds in Soil Environments

    NASA Astrophysics Data System (ADS)

    Ruhs, C. V.; McNeal, K. S.

    2010-12-01

    A concerted, international effort by citizens, governments, industries and educational systems is necessary to address the myriad environmental issues that face us today. The authors of this paper concentrate on soil environments and, specifically, the methods currently used to characterize them. The ability to efficiently and effectively monitor and characterize various soils is desirable; it allows for the study, supervision, and protection of natural and cultivated ecosystems, and may assist stakeholders in meeting governmentally-imposed environmental standards. This research addresses soil characterization by a comparison of four methods that emphasize a combination of microbial community and metabolic measures: BIOLOG, fatty acid methyl-ester analysis (FAME), descriptive physical and chemical analysis (moisture content, pH, carbon content, nutrient content, and grain size), and the novel soil-microbe volatile organic compound analysis (SMVOC) presented in this work. In order to achieve the method comparison, soils were collected from three climatic regions (Bahamas, Michigan, and Mississippi), with three samples taken from niche ecosystems found at each climatic region (a total of nine sites). Of interest to the authors is whether or not an investigation of microbial communities and the volatile organic compounds (VOCs) produced by microbial communities from nine separate soil ecosystems provides useful information about soil dynamics. In essence, is analysis of soil-derived VOCs using gas chromatography-mass spectrometry (GC-MS) an effective method for characterizing the microbial communities of soils and their metabolic activity, rapidly and accurately, compared with the other three traditional characterization methods? Preliminary results suggest that VOCs in each of these locales differ with changes in soil types, soil moisture, and bacterial community. Each niche site shows distinct patterns in both VOCs and BIOLOG readings. Results will be presented to show the efficacy of the SMVOC approach and the statistical alignment of the VOC and community measures.

  2. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract: This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.

  3. A dual-modality optical coherence tomography and selective plane illumination microscopy system for mouse embryonic imaging

    NASA Astrophysics Data System (ADS)

    Wu, Chen; Ran, Shihao; Le, Henry; Singh, Manmohan; Larina, Irina V.; Mayerich, David; Dickinson, Mary E.; Larin, Kirill V.

    2017-02-01

    Both optical coherence tomography (OCT) and selective plane illumination microscopy (SPIM) are frequently used in mouse embryonic research for high-resolution three-dimensional imaging. However, each of these imaging methods provides a unique and independent advantage: SPIM provides morpho-functional information through immunofluorescence, and OCT provides a method for whole-embryo 3D imaging. In this study, we have combined rotational imaging OCT and SPIM into a single, dual-modality device to image E9.5 mouse embryos. The results demonstrate that the dual-modality setup is able to provide both anatomical and functional information simultaneously for more comprehensive tissue characterization.

  4. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection.

    PubMed

Monrad Aas, I H

    2014-11-01

    Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview - unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants - as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these influences is fundamental. Further development of GAF is feasible and proposals for this are presented.

  5. Teaching at the edge of knowledge: Non-equilibrium statistical physics

    NASA Astrophysics Data System (ADS)

    Schmittmann, Beate

    2007-03-01

    As physicists become increasingly interested in biological problems, we frequently find ourselves confronted with complex open systems, involving many interacting constituents and characterized by non-vanishing fluxes of mass or energy. Faced with the task of predicting macroscopic behaviors from microscopic information for these non-equilibrium systems, the familiar Gibbs-Boltzmann framework fails. The development of a comprehensive theoretical characterization of non-equilibrium behavior is one of the key challenges of modern condensed matter physics. In its absence, several approaches have been developed, from master equations to thermostatted molecular dynamics, which provide key insights into the rich and often surprising phenomenology of systems far from equilibrium. In my talk, I will address some of these methods, selecting those that are most relevant for a broad range of interdisciplinary problems from biology to traffic, finance, and sociology. The ``portability'' of these methods makes them valuable for graduate students from a variety of disciplines. To illustrate how different methods can complement each other when probing a problem from, e.g., the life sciences, I will discuss some recent attempts at modeling translation, i.e., the process by which the genetic information encoded on an mRNA is translated into the corresponding protein.

  6. The use of high resolution magnetic resonance on 3.0-T system in the diagnosis and surgical planning of intraosseous lesions of the jaws: preliminary results of a retrospective study.

    PubMed

    Cassetta, M; Di Carlo, S; Pranno, N; Stagnitti, A; Pompa, V; Pompa, G

    2012-12-01

    The pre-operative evaluation in oral and maxillofacial surgery is currently performed by computerized tomography (CT). However, in some cases the information provided by traditional imaging methods is not sufficient for diagnosis and surgical planning. The efficacy of these imaging methods in the evaluation of soft tissues is lower than that of magnetic resonance imaging (MRI). The aim of the study was to show the use of MRI in evaluating the relation between intraosseous lesions of the jaws and anatomical structures when this is difficult using traditional radiographic methods, and to evaluate the usefulness of MRI in depicting the morphostructural characterization of the lesions and infiltration of the soft tissues. Ten patients with a lesion of the jaw were selected. All the patients underwent panoramic radiography (OPT), CT and MRI. The images were examined by a dental and maxillofacial radiologist, who compared the different imaging methods to analyze the morphological and structural characteristics of the lesion and assessed the relationship between the lesion and the anatomical structures. Magnetic resonance imaging provided more detailed spatial and structural information than the other imaging methods. MRI allowed us to characterize the intraosseous lesions of the jaws and to plan the surgery, resulting in a lower risk of surgical injury to anatomic structures.

  7. SCMPSP: Prediction and characterization of photosynthetic proteins based on a scoring card method.

    PubMed

    Vasylenko, Tamara; Liou, Yi-Fan; Chen, Hong-An; Charoenkwan, Phasit; Huang, Hui-Ling; Ho, Shinn-Ying

    2015-01-01

    Photosynthetic proteins (PSPs) greatly differ in their structure and function as they are involved in numerous subprocesses that take place inside an organelle called a chloroplast. Few studies predict PSPs from sequences due to their high variety of sequences and structures. This work aims to predict and characterize PSPs by establishing datasets of PSP and non-PSP sequences and developing prediction methods. A novel bioinformatics method of predicting and characterizing PSPs based on the scoring card method (SCMPSP) was used. First, a dataset was established consisting of 649 PSPs, identified using the Gene Ontology term GO:0015979, and 649 non-PSPs from the SwissProt database with sequence identity <= 25%. Several prediction methods are presented based on support vector machine (SVM), decision tree J48, Bayes, BLAST, and SCM. The SVM method using dipeptide features performed well and yielded a test accuracy of 72.31%. The SCMPSP method uses the estimated propensity scores of 400 dipeptides as PSPs and has a test accuracy of 71.54%, which is comparable to that of the SVM method. The derived propensity scores of 20 amino acids were further used to identify informative physicochemical properties for characterizing PSPs. The analytical results reveal the following four characteristics of PSPs: 1) PSPs favour hydrophobic side chain amino acids; 2) PSPs are composed of the amino acids prone to form helices in membrane environments; 3) PSPs have low interaction with water; and 4) PSPs prefer to be composed of the amino acids of electron-reactive side chains. The SCMPSP method not only estimates the propensity of a sequence to be a PSP, it also discovers characteristics that further improve understanding of PSPs. The SCMPSP source code and the datasets used in this study are available at http://iclab.life.nctu.edu.tw/SCMPSP/.

  8. A flexible computational framework for detecting, characterizing, and interpreting statistical patterns of epistasis in genetic studies of human disease susceptibility.

    PubMed

    Moore, Jason H; Gilbert, Joshua C; Tsai, Chia-Ti; Chiang, Fu-Tien; Holden, Todd; Barney, Nate; White, Bill C

    2006-07-21

    Detecting, characterizing, and interpreting gene-gene interactions or epistasis in studies of human disease susceptibility is both a mathematical and a computational challenge. To address this problem, we have previously developed a multifactor dimensionality reduction (MDR) method for collapsing high-dimensional genetic data into a single dimension (i.e. constructive induction) thus permitting interactions to be detected in relatively small sample sizes. In this paper, we describe a comprehensive and flexible framework for detecting and interpreting gene-gene interactions that utilizes advances in information theory for selecting interesting single-nucleotide polymorphisms (SNPs), MDR for constructive induction, machine learning methods for classification, and finally graphical models for interpretation. We illustrate the usefulness of this strategy using artificial datasets simulated from several different two-locus and three-locus epistasis models. We show that the accuracy, sensitivity, specificity, and precision of a naïve Bayes classifier are significantly improved when SNPs are selected based on their information gain (i.e. class entropy removed) and reduced to a single attribute using MDR. We then apply this strategy to detecting, characterizing, and interpreting epistatic models in a genetic study (n = 500) of atrial fibrillation and show that both classification and model interpretation are significantly improved.
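
    A minimal sketch of two steps of this strategy, ranking SNPs by information gain and classifying with naive Bayes; the MDR constructive-induction step is omitted, and the simulated genotypes, toy two-locus interaction, and classifier choice are assumptions.

        # Rank SNPs by mutual information with case/control status, keep the
        # most informative ones, and score a naive Bayes classifier on them.
        import numpy as np
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.naive_bayes import CategoricalNB
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        snps = rng.integers(0, 3, (500, 50))            # genotypes coded 0/1/2
        y = (snps[:, 3] + snps[:, 17]) % 2              # toy two-locus effect

        gain = mutual_info_classif(snps, y, discrete_features=True,
                                   random_state=0)
        top = np.argsort(gain)[-2:]                     # best-ranked SNPs
        acc = cross_val_score(CategoricalNB(), snps[:, top], y, cv=5).mean()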

  9. Bayesian characterization of uncertainty in species interaction strengths.

    PubMed

    Wolf, Christopher; Novak, Mark; Gitelman, Alix I

    2017-06-01

    Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.
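
    The interval behavior described above can be seen in a conjugate toy version of the problem (not the paper's observational model): with a Gamma prior on a per capita attack rate and a nonzero observed feeding count, the 95% posterior interval has a strictly positive lower bound. The count, survey effort, and prior settings below are hypothetical.

        # Gamma-Poisson posterior for a per capita attack rate: y feeding
        # events observed over a known survey effort (e.g. predator-hours).
        from scipy import stats

        feeding_events, survey_effort = 3, 120.0
        prior_shape, prior_rate = 1e-3, 1e-3       # vague but proper prior

        posterior = stats.gamma(a=prior_shape + feeding_events,
                                scale=1.0 / (prior_rate + survey_effort))
        lower, upper = posterior.ppf([0.025, 0.975])  # both bounds > 0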

  10. Validation of Rapid Radiochemical Method for Californium ...

    EPA Pesticide Factsheets

    Technical Brief In the event of a radiological/nuclear contamination event, the response community would need tools and methodologies to rapidly assess the nature and the extent of contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.

  11. Plot Description (PD)

    Treesearch

    Robert E. Keane

    2006-01-01

    The Plot Description (PD) form is used to describe general characteristics of the FIREMON macroplot to provide ecological context for data analyses. The PD data characterize the topographical setting, geographic reference point, general plant composition and cover, ground cover, fuels, and soils information. This method provides the general ecological data that can be...

  12. Production of biofuels and biochemicals: in need of an ORACLE.

    PubMed

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

    The engineering of cells for the production of fuels and chemicals involves the simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range, and improved tolerance, all under a great degree of uncertainty. Achieving these objectives under physiological and process constraints will be impossible without mathematical modeling. However, the scarcity of information, and the uncertainty in what is available, require new methods for modeling and simulation that characterize the uncertainty and quantify, in a statistical sense, the expectation of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE), a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.

  13. Untangling Brain-Wide Dynamics in Consciousness by Cross-Embedding

    PubMed Central

    Tajima, Satohiro; Yanagawa, Toru; Fujii, Naotaka; Toyoizumi, Taro

    2015-01-01

    Brain-wide interactions generating complex neural dynamics are considered crucial for emergent cognitive functions. However, the irreducible nature of nonlinear and high-dimensional dynamical interactions challenges conventional reductionist approaches. We introduce a model-free method, based on embedding theorems in nonlinear state-space reconstruction, that permits a simultaneous characterization of complexity in local dynamics, directed interactions between brain areas, and how the complexity is produced by the interactions. We demonstrate this method in large-scale electrophysiological recordings from awake and anesthetized monkeys. The cross-embedding method captures structured interaction underlying cortex-wide dynamics that may be missed by conventional correlation-based analysis, demonstrating a critical role of time-series analysis in characterizing brain state. The method reveals a consciousness-related hierarchy of cortical areas, where dynamical complexity increases along with cross-area information flow. These findings demonstrate the advantages of the cross-embedding method in deciphering large-scale and heterogeneous neuronal systems, suggesting a crucial contribution by sensory-frontoparietal interactions to the emergence of complex brain dynamics during consciousness. PMID:26584045
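
    A minimal Python sketch of the underlying idea: delay-embed one series and use nearest neighbors in the reconstructed state space to predict another series, with prediction skill serving as a directed-interaction measure. This follows the generic cross-mapping recipe from the state-space reconstruction literature rather than the authors' exact algorithm, and the coupled-map data are invented.

    ```python
    import numpy as np

    def delay_embed(x, dim, tau):
        """Takens-style time-delay embedding of a 1-D series."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    def cross_map_skill(x, y, dim=3, tau=1):
        """Predict y from x's reconstructed state space via nearest neighbors.
        High skill suggests that y's dynamics leave a signature in x."""
        mx = delay_embed(x, dim, tau)
        y_aligned = y[(dim - 1) * tau:]
        preds = np.empty(len(mx))
        for i, point in enumerate(mx):
            d = np.linalg.norm(mx - point, axis=1)
            d[i] = np.inf                         # exclude the point itself
            nn = np.argsort(d)[: dim + 1]         # simplex of nearest neighbors
            w = np.exp(-d[nn] / (d[nn][0] + 1e-12))
            preds[i] = np.dot(w, y_aligned[nn]) / w.sum()
        return np.corrcoef(preds, y_aligned)[0, 1]

    # Toy system: x is weakly driven by y, so y is recoverable from x.
    n = 500
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = 0.4, 0.2
    for t in range(n - 1):
        y[t + 1] = y[t] * (3.8 - 3.8 * y[t])
        x[t + 1] = x[t] * (3.7 - 3.7 * x[t] - 0.1 * y[t])
    print("cross-map skill for y from x:", cross_map_skill(x, y))
    ```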

  14. Alternatives Assessment Frameworks: Research Needs for the Informed Substitution of Hazardous Chemicals

    PubMed Central

    Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally

    2015-01-01

    Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778

  15. A method for estimating the diffuse attenuation coefficient (KdPAR)from paired temperature sensors

    USGS Publications Warehouse

    Read, Jordan S.; Rose, Kevin C.; Winslow, Luke A.; Read, Emily K.

    2015-01-01

    A new method for estimating the diffuse attenuation coefficient for photosynthetically active radiation (KdPAR) from paired temperature sensors was derived. We show that when the attenuation of penetrating shortwave solar radiation is the dominant source of temperature change, time series measurements of water temperature at two depths (z1 and z2) are related to one another by a linear scaling factor (a). KdPAR can then be estimated by the simple equation KdPAR = ln(a)/(z2 - z1). A suggested workflow is presented that outlines procedures for calculating KdPAR according to this paired temperature sensor (PTS) method. The method is best suited to conditions in which radiative temperature gains are large relative to physical noise. Such conditions occur frequently on water bodies with low wind and/or high KdPAR, but the method can also be applied to other types of lakes during periods of low wind and/or where spatially redundant measurements of temperature are available. The optimal vertical placement of temperature sensors according to a priori knowledge of KdPAR is also described. This information can be used to inform the design of future sensor deployments using the PTS method or of campaigns where characterizing sub-daily changes in temperature is important. The PTS method provides a novel way to characterize light attenuation in aquatic ecosystems without expensive radiometric equipment or the user subjectivity inherent in Secchi depth measurements. It also enables the estimation of KdPAR at higher frequencies than many manual monitoring programs allow.
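
    A minimal Python sketch of the PTS calculation under the stated assumption that radiative heating dominates; the screening steps of the published workflow are omitted, and the sensor depths and synthetic data are invented.

    ```python
    import numpy as np

    def kd_pts(dT1, dT2, z1, z2):
        """Estimate Kd(PAR) from radiatively driven temperature-change series
        at a shallow depth z1 and a deeper depth z2 (z2 > z1).

        Shortwave heating decays as exp(-Kd*z), so the two series differ by a
        constant factor a = dT1/dT2 = exp(Kd*(z2 - z1)); a is fit here by
        least squares with no intercept."""
        a = np.dot(dT2, dT1) / np.dot(dT2, dT2)
        return np.log(a) / (z2 - z1)

    # Synthetic check: Kd = 0.8 m^-1, sensors at 0.5 m and 1.5 m.
    rng = np.random.default_rng(1)
    heating = rng.uniform(0.05, 0.2, size=48)          # surface heating signal
    dT1 = heating * np.exp(-0.8 * 0.5) + rng.normal(0, 1e-3, 48)
    dT2 = heating * np.exp(-0.8 * 1.5) + rng.normal(0, 1e-3, 48)
    print(kd_pts(dT1, dT2, 0.5, 1.5))                  # approximately 0.8
    ```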

  16. Combining fibre optic Raman spectroscopy and tactile resonance measurement for tissue characterization

    NASA Astrophysics Data System (ADS)

    Candefjord, Stefan; Nyberg, Morgan; Jalkanen, Ville; Ramser, Kerstin; Lindahl, Olof A.

    2010-12-01

    Tissue characterization is fundamental for the identification of pathological conditions. Raman spectroscopy (RS) and tactile resonance measurement (TRM) are two promising techniques that measure biochemical content and stiffness, respectively. They have the potential to complement the gold standard, histological analysis. By combining RS and TRM, complementary information about tissue content can be obtained and technique-specific drawbacks can be avoided. The aim of this study was to develop a multivariate approach to comparing RS and TRM information. The approach was evaluated on measurements taken at the same points on porcine abdominal tissue. The measurement points were divided into five groups by multivariate analysis of the RS data. A regression analysis was performed, and receiver operating characteristic (ROC) curves were used to compare the RS and TRM data. TRM identified one group efficiently (area under the ROC curve 0.99). The RS data showed that the proportion of saturated fat was high in this group. The regression analysis showed that stiffness was mainly determined by the amount of fat and its composition. We concluded that RS provided additional, important information for tissue identification that was not provided by TRM alone. The results are promising for the development of a method combining RS and TRM for intraoperative tissue characterization.

  17. Application of Asymmetric Flow Field-Flow Fractionation hyphenations for liposome-antimicrobial peptide interaction.

    PubMed

    Iavicoli, Patrizia; Urbán, Patricia; Bella, Angelo; Ryadnov, Maxim G; Rossi, François; Calzolai, Luigi

    2015-11-27

    Asymmetric Flow Field-Flow Fractionation (AF4) combined with multidetector analysis forms a promising technique in the field of nanoparticle characterization. This system is able to measure the dimensions and physicochemical properties of nanoparticles with unprecedented accuracy and precision. Here, for the first time, the technique is optimized to characterize the interaction between an archetypal antimicrobial peptide and synthetic membranes. By using charged and neutral liposomes it is possible to mimic some of the charge characteristics of biological membranes. The AF4 system allows determining, in a single analysis, the selectivity of the peptides, the quantity of peptide bound to each liposome, and the induced changes in the size distribution and morphology of the liposomes. The results provide relevant information for the study of structure-activity relationships in the context of membrane-induced antimicrobial action and will contribute to the rational design of potent antimicrobial agents. Moreover, the application of this method to other liposome systems is straightforward and would be extremely useful for comprehensive characterization of size distribution and protein interaction in the nanomedicine field. Copyright © 2015. Published by Elsevier B.V.

  18. Rapid functional screening of Streptomyces coelicolor regulators by use of a pH indicator and application to the MarR-like regulator AbsC.

    PubMed

    Yang, Yung-Hun; Song, Eunjung; Lee, Bo-Rahm; Kim, Eun-jung; Park, Sung-Hee; Kim, Yun-Gon; Lee, Chang-Soo; Kim, Byung-Gee

    2010-06-01

    To elucidate the function of an unknown regulator in Streptomyces, phenotype and antibiotic production were compared between a deletion mutant and the wild-type strain (WT). Such differences are easily masked by complex media. To determine the specific nutrient conditions that reveal them, we used a multiwell method containing different nutrients along with the pH indicator bromothymol blue. We found several nutrients that provide key information on characterization conditions. By comparing the growth of wild-type and mutant strains on the screened nutrients, we were able to measure growth, organic acid production, and antibiotic production for the elucidation of regulator function. Using this method, a member of the MarR-like regulator family, SCO5405 (AbsC), was newly characterized to control pyruvate dehydrogenase in Streptomyces coelicolor. Deletion of SCO5405 increased the pH of the culture broth, owing to decreased production of organic acids such as pyruvate and alpha-ketoglutarate, and increased extracellular actinorhodin (ACT) production in minimal medium containing glucose and alanine (MMGA). This approach could therefore serve as a high-throughput method for the characterization of unknown regulators.

  19. Solvate Structures and Computational/Spectroscopic Characterization of LiPF6 Electrolytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Sang D.; Yun, Sung-Hyun; Borodin, Oleg

    2015-04-23

    Raman spectroscopy is a powerful method for identifying ion-ion interactions, but only if the vibrational band signature for the anion coordination modes can be accurately deciphered. The present study characterizes the PF6- anion P-F Raman symmetric stretching vibrational band for evaluating the PF6-...Li+ cation interactions within LiPF6 crystalline solvates to create a characterization tool for liquid electrolytes. To facilitate this, the crystal structures for two new solvates, (G3)1:LiPF6 and (DEC)2:LiPF6 with triglyme and diethyl carbonate, respectively, are reported. The information obtained from this analysis provides key guidance about the ionic association information which may be obtained from a Raman spectroscopic evaluation of electrolytes containing the LiPF6 salt and aprotic solvents. Of particular note is the overlap of the Raman bands for both solvent-separated ion pair (SSIP) and contact ion pair (CIP) coordination, in which the PF6- anions are uncoordinated or coordinated to a single Li+ cation, respectively.

  20. Characterization of microplastic litter from oceans by an innovative approach based on hyperspectral imaging.

    PubMed

    Serranti, Silvia; Palmieri, Roberta; Bonifazi, Giuseppe; Cózar, Andrés

    2018-06-01

    An innovative approach based on HyperSpectral Imaging (HSI) was developed to set up an efficient method for analyzing marine microplastic litter. HSI was applied to samples collected by surface-trawling plankton nets from several parts of the world (i.e., the Arctic, Mediterranean, South Atlantic, and North Pacific). Reliable information on the abundance, size, shape, and polymer type of the whole ensemble of plastic particles in each sample was retrieved from single hyperspectral images. The simultaneous characterization of the polymeric composition of the plastic debris represents an important analytical advantage, considering that this information, and even the validation of the plastic nature of the small debris, is a common flaw in the analysis of marine microplastic pollution. HSI proved to be a rapid, non-invasive, non-destructive, and reliable technology for the characterization of microplastic waste, opening a promising avenue for improving plastic pollution monitoring. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Characterizing human activity induced impulse and slip-pulse excitations through structural vibration

    NASA Astrophysics Data System (ADS)

    Pan, Shijia; Mirshekari, Mostafa; Fagert, Jonathon; Ramirez, Ceferino Gabriel; Chung, Albert Jin; Hu, Chih Chi; Shen, John Paul; Zhang, Pei; Noh, Hae Young

    2018-02-01

    Many human activities induce excitations on ambient structures through various objects, causing the structures to vibrate. Accurate detection and characterization of vibration excitation sources enables the inference of human activity information, allowing human activity monitoring for various smart building applications. By utilizing structural vibrations, we can achieve sparse and non-intrusive sensing, unlike pressure- and vision-based methods. Many approaches to vibration-based source characterization have been presented, and they often either focus on one excitation type or have limited performance due to the dispersion and attenuation effects of the structures. In this paper, we present a method to characterize the two main types of excitation induced by human activities (impulse and slip-pulse) on multiple structures. By understanding the physical properties of waves and their propagation, the system can achieve accurate excitation tracking on different structures without large-scale labeled training data. Specifically, our algorithm takes into account the properties of the surface waves generated by impulses and of the body waves generated by slip-pulses to handle the dispersion and attenuation effects when different types of excitation occur on various structures. We then evaluate the algorithm through multiple scenarios. Our method achieves up to a sixfold improvement in impulse localization accuracy and a threefold improvement in slip-pulse trajectory length estimation compared with existing methods that do not take wave properties into account.

  2. Characterization of Maize Grains with Different Pigmentation Investigated by Photoacoustic Spectroscopy

    NASA Astrophysics Data System (ADS)

    Rico Molina, R.; Hernández Aguilar, C.; Dominguez Pacheco, A.; Cruz-Orea, A.; López Bonilla, J. L.

    2014-10-01

    Knowledge of the optical parameters of grains is of great relevance in maize grain technology practice. Such parameters provide information about absorption and reflectance, which in turn are related to color. In the dough and tortilla industries it is important to characterize this attribute of the corn kernel, as it is one of the attributes that directly affects the quality of the food product. It is therefore important to have techniques that contribute to the characterization of this raw material. The grain is traditionally characterized by conventional methods, which usually destroy it, involve laborious preparation of material, and are expensive. The objective of this study was to determine the optical absorption coefficient of maize grains (Zea mays L.) with different pigmentations by means of photoacoustic spectroscopy (PAS). Genotype A had a bluish coloration and genotype B a yellowish coloration. In addition, the photoacoustic signal was analyzed mathematically by two methods, the standard deviation and the first derivative, and the results were compared. In combination with mathematical analysis, PAS may be considered a potential diagnostic tool for the characterization of grains.

  3. PH-sensing 96-well microtitre plates for the characterization of acid production by dairy starter cultures.

    PubMed

    John, Gernot T; Goelling, Detlef; Klimant, Ingo; Schneider, Holger; Heinzle, Elmar

    2003-08-01

    A new method for the characterization of acid production by dairy starter cultures is presented. Microplates with integrated optical pH sensors were developed. Two fluorophores, one pH-sensitive and one pH-insensitive, are immobilised at the bottom of a polystyrene 96-well microtitre plate. The pH-insensitive fluorophore serves as an internal reference and makes calibration unnecessary. The sensor measures pH accurately in optically well-defined media. Particles and fluorophores contained in the bulk medium disturbed the measurements. Despite these disturbances, it was possible to clearly sense differences in inoculum type and inoculum size in cultures of Lactococcus lactis and Streptococcus thermophilus at 30 and 37 degrees C. Besides the pH-related signal, there is information about other changes during milk fermentation. The cultivation results were compared with those from the established CINAC method. From this comparison it can be concluded that the new method can be used reliably to characterize a large number of strains, particularly for screening purposes but also for quality control.

  4. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    PubMed Central

    2015-01-01

    The establishment of early life microbiota in the human infant gut is highly variable and plays a crucial role in host nutrient availability/uptake and maturation of immunity. Although high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the acquisition of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on double filtering (DF) the raw samples, a method that fractionates microbial from human cells to enhance microbial protein identification and characterization in complex fecal samples from healthy premature infants. This method dramatically improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low-abundance proteins and a greater than 2-fold improvement in microbial protein identification and quantification. This enhancement of proteome measurement depth enabled a more extensive microbiome comparison between infants by not only increasing the confidence of identified microbial functional categories but also revealing previously undetected categories. PMID:25350865

  5. Facial skin color measurement based on camera colorimetric characterization

    NASA Astrophysics Data System (ADS)

    Yang, Boquan; Zhou, Changhe; Wang, Shaoqing; Fan, Xin; Li, Chao

    2016-10-01

    The objective measurement of facial skin color and its variation is of great significance, as much information can be obtained from it. In this paper, we developed a new skin color measurement procedure comprising the following parts: first, a new skin tone color checker based on the Pantone SkinTone Guide was designed for camera colorimetric characterization; second, the chromaticity of the light source was estimated via a new scene illumination estimation method that draws on several previous algorithms; third, chromatic adaptation was used to convert the input facial image into an output facial image that appears to have been taken under a canonical light; finally, the validity and accuracy of our method were verified by comparing the results obtained by our procedure with those obtained by a spectrophotometer.
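
    The chromatic-adaptation step can be illustrated with a standard diagonal (von Kries) transform in a cone-like space. The Bradford matrix and illuminant white points below are textbook values, and this is a generic sketch rather than the authors' exact transform; the pixel value is invented.

    ```python
    import numpy as np

    # Bradford matrix: maps XYZ to a cone-like LMS space; standard coefficients.
    BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                         [-0.7502,  1.7135,  0.0367],
                         [ 0.0389, -0.0685,  1.0296]])

    def adapt_von_kries(xyz_pixels, src_white_xyz, dst_white_xyz):
        """Map pixel XYZ values from the estimated scene illuminant to a
        canonical illuminant with a diagonal (von Kries) scaling in LMS."""
        lms_src = BRADFORD @ src_white_xyz
        lms_dst = BRADFORD @ dst_white_xyz
        gain = np.diag(lms_dst / lms_src)
        m = np.linalg.inv(BRADFORD) @ gain @ BRADFORD
        return xyz_pixels @ m.T

    # Example: adapt from an estimated warm illuminant (~illuminant A) to D65.
    src_white = np.array([1.0985, 1.0000, 0.3558])
    d65_white = np.array([0.9504, 1.0000, 1.0888])
    pixels = np.array([[0.4, 0.35, 0.2]])   # XYZ of one skin-tone pixel
    print(adapt_von_kries(pixels, src_white, d65_white))
    ```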

  6. Observer for a thick layer of solid deuterium-tritium using backlit optical shadowgraphy and interferometry.

    PubMed

    Choux, Alexandre; Busvelle, Eric; Gauthier, Jean Paul; Pascal, Ghislain

    2007-11-20

    Our work is set in the context of the French "laser mégajoule" project on inertial confinement fusion. The project raises the problem of characterizing the inner surface of the approximately spherical target by optical shadowgraphy techniques. Our work is based on the idea that optical shadowgraphy produces "caustics" of systems of optical rays, which contain a great deal of 3D information about the surface to be characterized. We develop a 3D reconstruction method based on this idea together with a small-perturbations technique. Although the computations are made in the special spherical case, the method is in fact general and may be extended to several other situations.

  7. Aerosol generation and characterization of multi-walled carbon nanotubes exposed to cells cultured at the air-liquid interface.

    PubMed

    Polk, William W; Sharma, Monita; Sayes, Christie M; Hotchkiss, Jon A; Clippinger, Amy J

    2016-04-23

    Aerosol generation and characterization are critical components in the assessment of the inhalation hazards of engineered nanomaterials (NMs). An extensive review was conducted on aerosol generation and exposure apparatus as part of an international expert workshop convened to discuss the design of an in vitro testing strategy to assess pulmonary toxicity following exposure to aerosolized particles. More specifically, this workshop focused on the design of an in vitro method to predict the development of pulmonary fibrosis in humans following exposure to multi-walled carbon nanotubes (MWCNTs). Aerosol generators, for dry or liquid particle suspension aerosolization, and exposure chambers, including both commercially available systems and those developed by independent researchers, were evaluated. Additionally, characterization methods that can be used and the time points at which characterization can be conducted in order to interpret in vitro exposure results were assessed. Summarized below is the information presented and discussed regarding the relevance of various aerosol generation and characterization techniques specific to aerosolized MWCNTs exposed to cells cultured at the air-liquid interface (ALI). The generation of MWCNT aerosols relevant to human exposures and their characterization throughout exposure in an ALI system is critical for extrapolation of in vitro results to toxicological outcomes in humans.

  8. On-orbit characterization of hyperspectral imagers

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal of cross-calibration is to transfer the calibration of a well-known sensor to that of a different sensor. This dissertation presents a method for determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on a multispectral sensor, the Moderate-resolution Imaging Spectroradiometer (MODIS), as a reference for the hyperspectral sensor Hyperion. The test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. A method to predict hyperspectral surface reflectance using a combination of MODIS data and spectral shape information is developed and applied to the characterization of Hyperion. The spectral shape information is based on RSG's historical in situ data for the Railroad Valley test site and on spectral library data for the Libyan site. Average atmospheric parameters, also based on historical measurements, are used in reflectance prediction and transfer to space. Results of several cross-calibration scenarios that differ in image acquisition coincidence, test site, and reference sensor are found for the characterization of Hyperion. These are compared with results from the reflectance-based approach of vicarious calibration, a well-documented method developed by the RSG that serves as a performance baseline for the cross-calibration method developed here. Cross-calibration provides results that are within 2% of the reflectance-based results in most spectral regions. Larger disagreements exist at the shorter wavelengths studied in this work, as well as in spectral regions that experience absorption by the atmosphere.

  9. Development of a simple and sensitive method for the characterization of odorous waste gas emissions by means of solid-phase microextraction (SPME) and GC-MS/olfactometry.

    PubMed

    Kleeberg, K K; Liu, Y; Jans, M; Schlegelmilch, M; Streese, J; Stegmann, R

    2005-01-01

    A solid-phase microextraction (SPME) method has been developed for the extraction of odorous compounds from waste gas. The enriched compounds were characterized by gas chromatography-mass spectrometry (GC-MS) and by gas chromatography followed by simultaneous flame ionization detection and olfactometry (GC-FID/O). Five different SPME fiber coatings were tested, and the carboxen/polydimethylsiloxane (CAR/PDMS) fiber showed the highest ability to extract odorous compounds from the waste gas. Furthermore, parameters such as exposure time, desorption temperature, and desorption time were optimized. The SPME method was successfully used to characterize an odorous waste gas from a fat refinery before and after waste gas treatment, in order to describe the treatment efficiency of the laboratory-scale plant used, which consisted of a bioscrubber/biofilter combination and an activated carbon adsorber. The developed method is a valuable approach for providing detailed information on waste gas composition and complements existing methods for the determination of odors. However, caution should be exercised if CAR/PDMS fibers are used for the quantification of odorous compounds in multi-component matrices like waste gas emissions, since the relative affinity for each analyte was shown to differ according to the total amount of analytes present in the sample.

  10. Power of Ultra Performance Liquid Chromatography/Electrospray Ionization-MS Reconstructed Ion Chromatograms in the Characterization of Small Differences in Polymer Microstructure.

    PubMed

    Epping, Ruben; Panne, Ulrich; Falkenhagen, Jana

    2018-03-06

    From simple homopolymers to functionalized, three-dimensionally structured copolymers, polymeric materials have become more and more sophisticated. With new applications, for instance in the semiconductor or pharmaceutical industry, the requirements for characterization have risen with the complexity of the polymers used. Each additional distribution requires an additional dimension of analysis. Small, often isomeric heterogeneities in topology or microstructure usually cannot be separated chromatographically or distinguished by any common detector, yet they significantly affect material properties. In a drug delivery system, for example, the degree of branching and the branching distribution are crucial for the formation of micelles. Instead of a complicated, time-consuming, and/or expensive 2D chromatography or ion mobility spectrometry (IMS) method, which also has its limitations, this work proposes a simple approach using size exclusion chromatography (SEC) coupled with electrospray ionization (ESI) mass spectrometry. The online coupling allows the analysis of reconstructed ion chromatograms (RICs) for each degree of polymerization. While complete separation often cannot be achieved, the derived retention times and peak widths yield information on the existence and dispersity of heterogeneities. Although some microstructural heterogeneities, such as short-chain branching, can be characterized for large polymers with methods such as light scattering, they are inaccessible with these methods for oligomers, where the heterogeneities just start to form and their influence is greatest. It is also shown that, with proper calibration, even quantitative information can be obtained. This method is suitable for detecting small differences in, e.g., branching, 3D structure, monomer sequence, or tacticity, and could potentially be used in routine analysis to quickly detect deviations.

  11. Band-limited Green's Functions for Quantitative Evaluation of Acoustic Emission Using the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Leser, William P.; Yuan, Fuh-Gwo; Leser, William P.

    2013-01-01

    A method of numerically estimating dynamic Green's functions using the finite element method is proposed. These Green's functions are accurate in a limited frequency range dependent on the mesh size used to generate them. This range can often match or exceed the frequency sensitivity of the traditional acoustic emission sensors. An algorithm is also developed to characterize an acoustic emission source by obtaining information about its strength and temporal dependence. This information can then be used to reproduce the source in a finite element model for further analysis. Numerical examples are presented that demonstrate the ability of the band-limited Green's functions approach to determine the moment tensor coefficients of several reference signals to within seven percent, as well as accurately reproduce the source-time function.
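
    As a rough illustration of source characterization with a band-limited Green's function, the sketch below recovers a source-time function by regularized frequency-domain deconvolution. This is a generic approach rather than the paper's exact algorithm; the water-level parameter and toy waveforms are invented.

    ```python
    import numpy as np

    def estimate_source_time_function(measured, greens, eps=1e-3):
        """Deconvolve a measured signal by a (band-limited) Green's function.
        The water-level term eps stabilizes frequencies where G has little
        energy, which matters when G is only valid in a limited band."""
        n = len(measured)
        U = np.fft.rfft(measured, n)
        G = np.fft.rfft(greens, n)
        denom = np.abs(G) ** 2 + eps * np.max(np.abs(G)) ** 2
        return np.fft.irfft(U * np.conj(G) / denom, n)

    # Synthetic check: convolve a known ramp source with a decaying-sine
    # "Green's function", then recover it.
    t = np.arange(512) * 1e-6
    g = np.exp(-t / 5e-5) * np.sin(2 * np.pi * 1e5 * t)   # toy Green's fn
    s_true = np.clip(t / 2e-5, 0, 1) * (t < 1e-4)         # toy source ramp
    u = np.convolve(g, s_true)[:512]
    s_est = estimate_source_time_function(u, g)
    err = np.linalg.norm(s_est - s_true) / np.linalg.norm(s_true)
    print("relative recovery error:", err)
    ```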

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blume-Kohout, Robin J.; Gamble, John King; Nielsen, Erik

    Quantum tomography is used to characterize quantum operations implemented in quantum information processing (QIP) hardware. Traditionally, state tomography has been used to characterize the quantum state prepared in an initialization procedure, while quantum process tomography is used to characterize dynamical operations on a QIP system. As such, tomography is critical to the development of QIP hardware (since it is necessary both for debugging and validating as-built devices, and its results are used to influence the next generation of devices). But tomography suffers from several critical drawbacks. In this report, we present new research that resolves several of these flaws. We describe a new form of tomography called gate set tomography (GST), which unifies state and process tomography, avoids prior methods' critical reliance on precalibrated operations that are not generally available, and can achieve unprecedented accuracies. We report on theory and experimental development of adaptive tomography protocols that achieve far higher fidelity in state reconstruction than non-adaptive methods. Finally, we present a new theoretical and experimental analysis of process tomography on multispin systems, and demonstrate how to more effectively detect and characterize quantum noise using carefully tailored ensembles of input states.
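
    For context, the baseline that GST improves upon can be sketched in a few lines: plain linear-inversion state tomography for a single qubit, which presumes perfectly precalibrated measurements, exactly the assumption GST removes. The simulated shot counts below are invented.

    ```python
    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    def state_from_pauli_expectations(ex, ey, ez):
        """Linear-inversion single-qubit state tomography:
        rho = (I + <X>X + <Y>Y + <Z>Z) / 2."""
        return 0.5 * (I2 + ex * X + ey * Y + ez * Z)

    # Simulate measuring |+> with finite samples (shot noise only);
    # the probability of the +1 outcome for Pauli P is (1 + <P>)/2.
    rng = np.random.default_rng(42)
    shots = 1000
    ex = rng.binomial(shots, 1.0) / shots * 2 - 1   # <X> = +1 for |+>
    ey = rng.binomial(shots, 0.5) / shots * 2 - 1   # <Y> = 0
    ez = rng.binomial(shots, 0.5) / shots * 2 - 1   # <Z> = 0
    print(np.round(state_from_pauli_expectations(ex, ey, ez), 3))
    # close to [[0.5, 0.5], [0.5, 0.5]]
    ```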

  13. Characterizing heterogeneous cellular responses to perturbations.

    PubMed

    Slack, Michael D; Martinez, Elisabeth D; Wu, Lani F; Altschuler, Steven J

    2008-12-09

    Cellular populations have been widely observed to respond heterogeneously to perturbation. However, interpreting the observed heterogeneity is an extremely challenging problem because of the complexity of possible cellular phenotypes, the large dimension of potential perturbations, and the lack of methods for separating meaningful biological information from noise. Here, we develop an image-based approach to characterize cellular phenotypes based on patterns of signaling marker colocalization. Heterogeneous cellular populations are characterized as mixtures of phenotypically distinct subpopulations, and responses to perturbations are summarized succinctly as probabilistic redistributions of these mixtures. We apply our method to characterize the heterogeneous responses of cancer cells to a panel of drugs. We find that cells treated with drugs of (dis-)similar mechanism exhibit (dis-)similar patterns of heterogeneity. Despite the phenotypic diversity of the cells observed in our data, low-complexity models of heterogeneity were sufficient to distinguish most classes of drug mechanism. Our approach offers a computational framework for assessing the complexity of cellular heterogeneity, for investigating the degree to which perturbations induce redistributions of a limited but nontrivial repertoire of underlying states, and for revealing the functional significance contained within distinct patterns of heterogeneous response.
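
    The mixture-of-subpopulations summary can be sketched with a Gaussian mixture model: fit subpopulations on reference cells, then report each condition as a redistribution of mixture weights. The two-dimensional features and data below are invented stand-ins for the image-derived markers used in the study.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Invented data: two phenotype clusters in a 2-D feature space; the drug
    # shifts cells from one cluster toward the other.
    rng = np.random.default_rng(0)
    control = np.vstack([rng.normal(0, 1, (300, 2)), rng.normal(4, 1, (100, 2))])
    treated = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (300, 2))])

    # Fit reference subpopulations on control cells only.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(control)

    def subpopulation_profile(cells):
        """Fraction of cells assigned to each reference subpopulation."""
        labels = gmm.predict(cells)
        return np.bincount(labels, minlength=gmm.n_components) / len(cells)

    print("control profile:", subpopulation_profile(control))
    print("treated profile:", subpopulation_profile(treated))
    # The treated profile is a probabilistic redistribution of the mixture.
    ```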

  14. Baselining the New GSFC Information Systems Center: The Foundation for Verifiable Software Process Improvement

    NASA Technical Reports Server (NTRS)

    Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.

    1999-01-01

    This paper describes a study performed at the Information Systems Center (ISC) at NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims to characterize the people, processes, and products of the new center, providing a basis for proposing improvement actions and for comparing the center before and after those actions have been performed. The paper presents the ISC, the goals and methods of the study, and the results and suggestions for improvement from the branch-level portion of this baselining effort.

  15. Methods and apparatus for non-acoustic speech characterization and recognition

    DOEpatents

    Holzrichter, John F.

    1999-01-01

    By simultaneously recording EM wave reflections and acoustic speech information, the positions and velocities of the speech organs as speech is articulated can be defined for each acoustic speech unit. Well defined time frames and feature vectors describing the speech, to the degree required, can be formed. Such feature vectors can uniquely characterize the speech unit being articulated each time frame. The onset of speech, rejection of external noise, vocalized pitch periods, articulator conditions, accurate timing, the identification of the speaker, acoustic speech unit recognition, and organ mechanical parameters can be determined.

  16. Methods and apparatus for non-acoustic speech characterization and recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzrichter, J.F.

    By simultaneously recording EM wave reflections and acoustic speech information, the positions and velocities of the speech organs as speech is articulated can be defined for each acoustic speech unit. Well defined time frames and feature vectors describing the speech, to the degree required, can be formed. Such feature vectors can uniquely characterize the speech unit being articulated each time frame. The onset of speech, rejection of external noise, vocalized pitch periods, articulator conditions, accurate timing, the identification of the speaker, acoustic speech unit recognition, and organ mechanical parameters can be determined.

  17. Phenomenological and mechanics aspects of nondestructive evaluation and characterization by sound and ultrasound of material and fracture properties

    NASA Technical Reports Server (NTRS)

    Fu, L. S. W.

    1982-01-01

    Developments in fracture mechanics and elastic wave theory enhance the understanding of many physical phenomena in a mathematical context. The available literature on material and fracture characterization by NDT, and on the related mathematical methods in mechanics that provide the fundamental principles underlying its interpretation and evaluation, is reviewed. Information on the energy release mechanisms of defects and the interaction of microstructures within the material is basic to the formulation of the mechanics problems that supply guidance for nondestructive evaluation (NDE).

  18. Characteristics analysis of acupuncture electroencephalograph based on mutual information Lempel-Ziv complexity

    NASA Astrophysics Data System (ADS)

    Luo, Xi-Liu; Wang, Jiang; Han, Chun-Xiao; Deng, Bin; Wei, Xi-Le; Bian, Hong-Rui

    2012-02-01

    As a convenient approach to the characterization of cerebral cortex electrical information, electroencephalography (EEG) has potential clinical application in monitoring acupuncture effects. In this paper, a method combining the mutual information method and the Lempel-Ziv complexity method (MILZC) is proposed to investigate the effects of acupuncture on the complexity of information exchange between different brain regions based on EEGs. In the experiments, eight subjects were manually acupunctured at the 'Zusanli' acupuncture point (ST-36) with different frequencies (i.e., 50, 100, 150, and 200 times/min) and EEGs were recorded simultaneously. First, MILZC values were compared in general. Then average brain connections were used to quantify the effectiveness of acupuncture at the above four frequencies. Finally, significance index P values were used to study the spatial character of the acupuncture effect on the brain. Three main findings were obtained: (i) MILZC values increase during acupuncture; (ii) manual acupuncture (MA) at 100 and 150 times/min is more effective than at 50 and 200 times/min; (iii) contralateral hemisphere activation is more prominent than ipsilateral activation. All these findings suggest that acupuncture contributes to an increase in the complexity of brain information exchange and that the MILZC method can successfully describe these changes.
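
    The Lempel-Ziv half of the MILZC measure is easy to sketch. Below is a common LZ76-style parsing with the usual n/log2(n) normalization, applied to a median-binarized signal; this is a generic implementation (the mutual-information half is omitted), and the test signals are invented.

    ```python
    import numpy as np

    def lempel_ziv_complexity(sequence):
        """LZ76 complexity: count of new words encountered when scanning the
        binary sequence left to right (classic parsing scheme)."""
        s = "".join(map(str, sequence))
        i, c, n = 0, 0, len(s)
        while i < n:
            length = 1
            # Grow the word while it still appears in the preceding prefix.
            while i + length <= n and s[i : i + length] in s[: i + length - 1]:
                length += 1
            c += 1
            i += length
        return c

    def normalized_lzc(signal):
        """Binarize an epoch around its median, then normalize LZ complexity
        by its asymptotic value n/log2(n) so different lengths compare."""
        binary = (np.asarray(signal) > np.median(signal)).astype(int)
        n = len(binary)
        return lempel_ziv_complexity(binary) * np.log2(n) / n

    rng = np.random.default_rng(3)
    print(normalized_lzc(rng.normal(size=1000)))        # ~1 for random data
    print(normalized_lzc(np.sin(np.arange(1000) / 10))) # lower for regular data
    ```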

  19. The Chameleon Effect: Characterization Challenges Due to the Variability of Nanoparticles and Their Surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Donald R.

    Nanoparticles in a variety of forms are of increasing importance in fundamental research, technological and medical applications, and environmental and toxicology studies. Physical and chemical drivers that lead to multiple types of particle instability complicate both the ability to produce and consistently deliver well-defined particles and their appropriate characterization, frequently leading to inconsistencies and conflicts in the published literature. This perspective suggests that provenance information, beyond that often recorded or reported, and the application of a set of core characterization methods, including a surface-sensitive technique, consistently applied at critical times, can serve as tools in the effort to minimize reproducibility issues.

  20. Discovering the intelligence in molecular biology.

    PubMed

    Uberbacher, E

    1995-12-01

    The Third International Conference on Intelligent Systems in Molecular Biology was truly an outstanding event. Computational methods in molecular biology have reached a new level of maturity and utility, resulting in many high-impact applications. The success of this meeting bodes well for the rapid and continuing development of computational methods, intelligent systems and information-based approaches for the biosciences. The basic technology, originally most often applied to 'feasibility' problems, is now dealing effectively with the most difficult real-world problems. Significant progress has been made in understanding protein-structure information, structural classification, and how functional information and the relevant features of active-site geometry can be gleaned from structures by automated computational approaches. The value and limits of homology-based methods, and the ability to classify proteins by structure in the absence of homology, have reached a new level of sophistication. New methods for covariation analysis in the folding of large structures such as RNAs have shown remarkably good results, indicating the long-term potential to understand very complicated molecules and multimolecular complexes using computational means. Novel methods, such as HMMs, context-free grammars and the uses of mutual information theory, have taken center stage as highly valuable tools in our quest to represent and characterize biological information. A focus on creative uses of intelligent systems technologies and the trend toward biological application will undoubtedly continue and grow at the 1996 ISMB meeting in St Louis.

  1. Added Value of Assessing Adnexal Masses with Advanced MRI Techniques

    PubMed Central

    Thomassin-Naggara, I.; Balvay, D.; Rockall, A.; Carette, M. F.; Ballester, M.; Darai, E.; Bazot, M.

    2015-01-01

    This review will present the added value of perfusion and diffusion MR sequences to characterize adnexal masses. These two functional MR techniques are readily available in routine clinical practice. We will describe the acquisition parameters and a method of analysis to optimize their added value compared with conventional images. We will then propose a model of interpretation that combines the anatomical and morphological information from conventional MRI sequences with the functional information provided by perfusion and diffusion weighted sequences. PMID:26413542

  2. Participatory design of an integrated information system design to support public health nurses and nurse managers.

    PubMed

    Reeder, Blaine; Hills, Rebecca A; Turner, Anne M; Demiris, George

    2014-01-01

    The objectives of the study were to use persona-driven and scenario-based design methods to create a conceptual information system design to support public health nursing. We enrolled 19 participants from two local health departments to conduct an information needs assessment, create a conceptual design, and conduct a preliminary design validation. Interviews and thematic analysis were used to characterize information needs and solicit design recommendations from participants. Personas were constructed from participant background information, and scenario-based design was used to create a conceptual information system design. Two focus groups were conducted as a first iteration validation of information needs, personas, and scenarios. Eighty-nine information needs were identified. Two personas and 89 scenarios were created. Public health nurses and nurse managers confirmed the accuracy of information needs, personas, scenarios, and the perceived usefulness of proposed features of the conceptual design. Design artifacts were modified based on focus group results. Persona-driven design and scenario-based design are feasible methods to design for common work activities in different local health departments. Public health nurses and nurse managers should be engaged in the design of systems that support their work. © 2013 Wiley Periodicals, Inc.

  3. Participatory Design of an Integrated Information System Design to Support Public Health Nurses and Nurse Managers

    PubMed Central

    Reeder, Blaine; Hills, Rebecca A.; Turner, Anne M.; Demiris, George

    2014-01-01

    Objectives The objectives of the study were to use persona-driven and scenario-based design methods to create a conceptual information system design to support public health nursing. Design and Sample We enrolled 19 participants from two local health departments to conduct an information needs assessment, create a conceptual design, and conduct a preliminary design validation. Measures Interviews and thematic analysis were used to characterize information needs and solicit design recommendations from participants. Personas were constructed from participant background information, and scenario-based design was used to create a conceptual information system design. Two focus groups were conducted as a first iteration validation of information needs, personas, and scenarios. Results Eighty-nine information needs were identified. Two personas and 89 scenarios were created. Public health nurses and nurse managers confirmed the accuracy of information needs, personas, scenarios, and the perceived usefulness of proposed features of the conceptual design. Design artifacts were modified based on focus group results. Conclusion Persona-driven design and scenario-based design are feasible methods to design for common work activities in different local health departments. Public health nurses and nurse managers should be engaged in the design of systems that support their work. PMID:24117760

  4. Classification of weld defect based on information fusion technology for radiographic testing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Hongquan; Liang, Zeming, E-mail: heavenlzm@126.com; Gao, Jianmin

    Improving the efficiency and accuracy of weld defect classification is an important technical problem in developing radiographic testing systems. This paper proposes a novel weld defect classification method based on information fusion technology, namely Dempster–Shafer evidence theory. First, to characterize weld defects and improve the accuracy of their classification, 11 weld defect features were defined based on the sub-pixel level edges of radiographic images, four of which are presented for the first time in this paper. Second, we applied information fusion technology to combine different features for weld defect classification, including a mass function defined on the weld defect feature information and a quartile-method-based calculation of standard weld defect classes, which addresses the problem of a limited number of training samples. A steam turbine weld defect classification case study is also presented to illustrate the technique. The results show that the proposed method can increase the correct classification rate with limited training samples and address the uncertainties associated with weld defect classification.

  5. Classification of weld defect based on information fusion technology for radiographic testing system.

    PubMed

    Jiang, Hongquan; Liang, Zeming; Gao, Jianmin; Dang, Changying

    2016-03-01

    Improving the efficiency and accuracy of weld defect classification is an important technical problem in developing radiographic testing systems. This paper proposes a novel weld defect classification method based on information fusion technology, namely Dempster-Shafer evidence theory. First, to characterize weld defects and improve the accuracy of their classification, 11 weld defect features were defined based on the sub-pixel level edges of radiographic images, four of which are presented for the first time in this paper. Second, we applied information fusion technology to combine different features for weld defect classification, including a mass function defined on the weld defect feature information and a quartile-method-based calculation of standard weld defect classes, which addresses the problem of a limited number of training samples. A steam turbine weld defect classification case study is also presented to illustrate the technique. The results show that the proposed method can increase the correct classification rate with limited training samples and address the uncertainties associated with weld defect classification.
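
    Dempster's rule of combination itself is compact; the sketch below combines two mass functions over sets of candidate defect classes. The class names and mass values are invented for illustration and do not come from the paper's feature-derived masses.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule of combination for two mass functions whose focal
        elements are frozensets (e.g., sets of candidate defect classes)."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass assigned to the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict; evidence cannot be combined")
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Two features' evidence over defect classes {crack, pore, slag}.
    m_feature1 = {frozenset({"crack"}): 0.6,
                  frozenset({"crack", "pore"}): 0.4}
    m_feature2 = {frozenset({"crack"}): 0.5,
                  frozenset({"pore", "slag"}): 0.3,
                  frozenset({"crack", "pore", "slag"}): 0.2}
    print(dempster_combine(m_feature1, m_feature2))
    ```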

  6. Rural-Urban Disparities in Child Abuse Management Resources in the Emergency Department

    ERIC Educational Resources Information Center

    Choo, Esther K.; Spiro, David M.; Lowe, Robert A.; Newgard, Craig D.; Hall, Michael Kennedy; McConnell, Kenneth John

    2010-01-01

    Purpose: To characterize differences in child abuse management resources between urban and rural emergency departments (EDs). Methods: We surveyed ED directors and nurse managers at hospitals in Oregon to gain information about available abuse-related resources. Chi-square analysis was used to test differences between urban and rural EDs.…

  7. Associate Residency Training Directors in Psychiatry: Demographics, Professional Activities, and Job Satisfaction

    ERIC Educational Resources Information Center

    Arbuckle, Melissa R.; DeGolia, Sallie G.; Esposito, Karin; Miller, Deborah A.; Weinberg, Michael; Brenner, Adam M.

    2012-01-01

    Objective: The purpose of this study was to characterize associate training director (ATD) positions in psychiatry. Method: An on-line survey was e-mailed in 2009 to all ATDs identified through the American Association of Directors of Psychiatric Residency Training (AADPRT). Survey questions elicited information regarding demographics,…

  8. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1982-01-01

    The development of an accurate and efficient algorithm for analyzing the structure of MSS data, the application of the Akaike information criterion to mixture models, and a research plan to delineate some of the technical issues and associated tasks in the area of rice scene radiation characterization are discussed. The AMOEBA clustering algorithm is refined and documented.
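
    Applying the Akaike information criterion to mixture models reduces to comparing 2k - 2 ln L across candidate component counts; below is a minimal sketch with synthetic one-dimensional data (not MSS data), using an off-the-shelf Gaussian mixture fit.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic two-component data standing in for a scene's pixel values.
    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(0.2, 0.05, 400),
                           rng.normal(0.6, 0.08, 600)]).reshape(-1, 1)

    # AIC = 2 * n_params - 2 * log-likelihood; lower is better.
    for k in range(1, 5):
        gmm = GaussianMixture(n_components=k, random_state=0).fit(data)
        print(k, round(gmm.aic(data), 1))   # minimized near the true k = 2
    ```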

  9. Application of Terrestrial Geomorphic Threshold Theory to the Analysis of Small Channels on Mars

    NASA Technical Reports Server (NTRS)

    Rosenshein, E. B.; Greeley, R.; Arrowsmith, J. R.

    2001-01-01

    New terrestrial work on geomorphic thresholds for channel initiation uses the drainage area above a channel head versus the slope at the channel head to delineate surface process types. This method has been used to characterize martian landscapes. Additional information is contained in the original extended abstract.

  10. Method for measuring multiple scattering corrections between liquid scintillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.

    2016-04-11

    In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.

  11. An Overview of Prognosis Health Management Research at Glenn Research Center for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  12. p-Type Doping of GaN Nanowires Characterized by Photoelectrochemical Measurements.

    PubMed

    Kamimura, Jumpei; Bogdanoff, Peter; Ramsteiner, Manfred; Corfdir, Pierre; Feix, Felix; Geelhaar, Lutz; Riechert, Henning

    2017-03-08

    GaN nanowires (NWs) doped with Mg as a p-type impurity were grown on Si(111) substrates by plasma-assisted molecular beam epitaxy. In a systematic series of experiments, the amount of Mg supplied during NW growth was varied. The incorporation of Mg into the NWs was confirmed by the observation of donor-acceptor pairs and acceptor-bound excitons in low-temperature photoluminescence spectroscopy. Quantitative information about the Mg concentrations was deduced from Raman scattering by local vibrational modes related to Mg. In order to study the type and density of charge carriers present in the NWs, we employed two photoelectrochemical techniques, open-circuit potential and Mott-Schottky measurements. Both methods showed the expected transition from n-type to p-type conductivity with increasing Mg doping level, and the latter characterization technique allowed us to quantify the charge carrier concentration. Beyond the quantitative information obtained for Mg doping of GaN NWs, our systematic and comprehensive investigation demonstrates the benefit of photoelectrochemical methods for the analysis of doping in semiconductor NWs in general.
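
    As a concrete illustration of the second technique, a Mott-Schottky analysis reads the carrier type from the sign of the slope of 1/C² versus bias and the carrier density from its magnitude. The sketch below is a generic textbook reduction, not the authors' code; the dielectric constant and the per-area capacitance units are assumptions.

      import numpy as np

      E0  = 8.854e-14   # vacuum permittivity, F/cm
      Q   = 1.602e-19   # elementary charge, C
      EPS = 8.9         # static dielectric constant of GaN (literature value)

      def mott_schottky(v, c_per_area):
          """Fit 1/C^2 vs V for C in F/cm^2; Mott-Schottky relation:
          1/C^2 = 2 (V - V_fb - kT/q) / (q * EPS * E0 * N)."""
          slope, intercept = np.polyfit(v, 1.0 / c_per_area**2, 1)
          n = 2.0 / (Q * EPS * E0 * abs(slope))   # carrier density, cm^-3
          carrier_type = "n-type" if slope > 0 else "p-type"  # negative slope: p-type
          return carrier_type, n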

  13. Modification of measurement methods for evaluation of tissue-engineered cartilage function and biochemical properties using nanosecond pulsed laser

    NASA Astrophysics Data System (ADS)

    Ishihara, Miya; Sato, Masato; Kutsuna, Toshiharu; Ishihara, Masayuki; Mochida, Joji; Kikuchi, Makoto

    2008-02-01

    There is a demand in the field of regenerative medicine for measurement technology that enables determination of the functions and components of engineered tissue. To meet this demand, we developed a method for extracellular matrix characterization using time-resolved autofluorescence spectroscopy, which enabled simultaneous measurement of mechanical properties using the relaxation of a laser-induced stress wave. In this study, in addition to time-resolved fluorescence spectroscopy, a hyperspectral sensor, which captures both spectral and spatial information, was used for the biochemical characterization of tissue-engineered cartilage. The hyperspectral imaging system provides a spectral resolution of 1.2 nm and an image rate of 100 images/s. The imaging system consisted of the hyperspectral sensor, a scanner for x-y plane imaging, magnifying optics and a xenon lamp for transmissive lighting. Cellular imaging with the hyperspectral imaging system was achieved by improving the spatial resolution to 9 micrometers. Spectroscopic cellular imaging was demonstrated using cultured chondrocytes as samples. At an early stage of culture, hyperspectral imaging provided information about cellular function associated with endogenous fluorescent biomolecules.

  14. Use of comparative genomics approaches to characterize interspecies differences in response to environmental chemicals: Challenges, opportunities, and research needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgess-Herbert, Sarah L., E-mail: sarah.burgess@alum.mit.edu; Euling, Susan Y.

    A critical challenge for environmental chemical risk assessment is the characterization and reduction of uncertainties introduced when extrapolating inferences from one species to another. The purpose of this article is to explore the challenges, opportunities, and research needs surrounding the issue of how genomics data and computational and systems level approaches can be applied to inform differences in response to environmental chemical exposure across species. We propose that the data, tools, and evolutionary framework of comparative genomics be adapted to inform interspecies differences in chemical mechanisms of action. We compare and contrast existing approaches, from disciplines as varied as evolutionary biology, systems biology, mathematics, and computer science, that can be used, modified, and combined in new ways to discover and characterize interspecies differences in chemical mechanism of action which, in turn, can be explored for application to risk assessment. We consider how genetic, protein, pathway, and network information can be interrogated from an evolutionary biology perspective to effectively characterize variations in biological processes of toxicological relevance among organisms. We conclude that comparative genomics approaches show promise for characterizing interspecies differences in mechanisms of action, and further, for improving our understanding of the uncertainties inherent in extrapolating inferences across species in both ecological and human health risk assessment. To achieve long-term relevance and consistent use in environmental chemical risk assessment, improved bioinformatics tools, computational methods robust to data gaps, and quantitative approaches for conducting extrapolations across species are critically needed. Specific areas ripe for research to address these needs are recommended.

  15. Metrological characterization methods for confocal chromatic line sensors and optical topography sensors

    NASA Astrophysics Data System (ADS)

    Seppä, Jeremias; Niemelä, Karri; Lassila, Antti

    2018-05-01

    The increasing use of chromatic confocal technology for, e.g., fast in-line optical topography and the measurement of thickness, roughness and profiles implies a need for the characterization of various aspects of the sensors. Single-point, line and matrix versions of chromatic confocal technology, encoding depth information into wavelength, have been developed. Of these, line sensors are particularly suitable for in-line process measurement. Metrological characterization and practical methods for calibration and checking are needed for new optical methods and devices. Compared to, e.g., tactile methods, optical topography measurement techniques have limitations related to light wavelength and coherence, optical properties of the sample including reflectivity, specularity, roughness and colour, and the definition of optical versus mechanical surfaces. In this work, metrological characterization methods for optical line sensors were developed for scale magnification and linearity, sensitivity to sample properties, and dynamic characteristics. An accurate depth scale calibration method using a single prototype groove depth sample was developed for a line sensor and validated with laser-interferometric sample tracking, attaining (sub)micrometre-level or better-than-0.1% scale accuracy. Furthermore, the effect of different surfaces and materials on the measurement and depth scale was studied, in particular slope angle, specularity and colour. In addition, dynamic performance, noise, lateral scale and resolution were measured using the developed methods. In the case of the LCI1200 sensor used in this study, which has an 11.3 mm  ×  2.8 mm measurement range, the instrument depth scale was found to depend only minimally on sample colour, whereas measuring steeply sloped specular surfaces in the peripheral measurement area caused, in the worst case, a somewhat larger relative sample-dependent change (1%) in scale.

  16. Mapping the distribution of materials in hyperspectral data using the USGS Material Identification and Characterization Algorithm (MICA)

    USGS Publications Warehouse

    Kokaly, R.F.; King, T.V.V.; Hoefen, T.M.

    2011-01-01

    Identifying materials by measuring and analyzing their reflectance spectra has been an important method in analytical chemistry for decades. Airborne and space-based imaging spectrometers allow scientists to detect materials and map their distributions across the landscape. With new satellite-borne hyperspectral sensors planned for the future, for example, HYSPIRI (HYPerspectral InfraRed Imager), robust methods are needed to fully exploit the information content of hyperspectral remote sensing data. A method of identifying and mapping materials using spectral-feature based analysis of reflectance data in an expert-system framework called MICA (Material Identification and Characterization Algorithm) is described in this paper. The core concepts and calculations of MICA are presented. A MICA command file has been developed and applied to map minerals in the full-country coverage of the 2007 Afghanistan HyMap hyperspectral data. © 2011 IEEE.
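
    The spectral-feature comparison at the heart of such expert systems can be sketched in a few lines: a diagnostic absorption feature is isolated by dividing out a linear continuum, and the continuum-removed feature is fit to each library reference. This simplified Python sketch is illustrative only; MICA's actual feature definitions, thresholds, and weighting are specified in the USGS documentation.

      import numpy as np

      def continuum_removed(wavelength, reflectance):
          """Divide out the straight-line continuum across one feature."""
          line = np.interp(wavelength,
                           [wavelength[0], wavelength[-1]],
                           [reflectance[0], reflectance[-1]])
          return reflectance / line

      def feature_fit(obs_cr, ref_cr):
          """Least-squares fit of an observed continuum-removed feature to a
          reference; returns the goodness of fit and band-depth scaling."""
          obs_d = 1.0 - obs_cr          # band depths
          ref_d = 1.0 - ref_cr
          scale = np.dot(obs_d, ref_d) / np.dot(ref_d, ref_d)
          fit = np.corrcoef(obs_d, ref_d)[0, 1]
          return fit, scale

    The library material with the best fit over its diagnostic features is then assigned to the pixel.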

  17. FTIR characterization of Bi2Sr2Can-1(Cu1-xFex)3O10+δ with (n=3, x = 0.01) ceramic superconductor

    NASA Astrophysics Data System (ADS)

    Kumar, Rohitash; Singh, H. S.; Singh, Yadunath

    2018-05-01

    We synthesized the ceramic superconductor Bi2Sr2Can-1(Cu1-xFex)3O10+δ with (n = 3, x = 0.01) by the usual solid-state method for oxide superconductors. In this paper, we report the characterization of this sample by the Fourier transform infrared (FTIR) spectroscopic method. This method provides information about the structure and chemical bond formation of the sample in powder form. The sharp peaks in the recorded spectra correspond to functional groups in the high-frequency stretching and low-frequency bending modes. In this study, the Cu-O and Fe-O bonds occupy octahedral and tetrahedral positions according to the occupancy of the cations and anions. Increasing the Fe content changes the transmittance (T%) behavior of the different bond vibration modes.

  18. Characterization and Developmental History of Problem Solving Methods in Medicine

    PubMed Central

    Harbort, Robert A.

    1980-01-01

    The central thesis of this paper is the importance of the framework in which information is structured. It is technically important in the design of systems; it is also important in guaranteeing that systems are usable by clinicians. Progress in medical computing depends on our ability to develop a more quantitative understanding of the role of context in our choice of problem solving techniques. This in turn will help us to design more flexible and responsive computer systems. The paper contains an overview of some models of knowledge and problem solving methods, a characterization of modern diagnostic techniques, and a discussion of skill development in medical practice. Diagnostic techniques are examined in terms of how they are taught, what problem solving methods they use, and how they fit together into an overall theory of interpretation of the medical status of a patient.

  19. Thermal characterization of TiCxOy thin films

    NASA Astrophysics Data System (ADS)

    Fernandes, A. C.; Vaz, F.; Gören, A.; Junge, K. H.; Gibkes, J.; Bein, B. K.; Macedo, F.

    2008-01-01

    Thermal wave characterization of thin films used in industrial applications can be a useful tool, not just to get information on the films' thermal properties, but to get information on structural-physical parameters, e.g. crystalline structure and surface roughness, and on the film deposition conditions, since the thermal film properties are directly related to the structural-physical parameters and to the deposition conditions. Different sets of TiCxOy thin films, deposited by reactive magnetron sputtering on steel, have been prepared, changing only one deposition parameter at a time. Here, the effect of the oxygen flow on the thermal film properties is studied. The thermal waves have been measured by modulated IR radiometry, and the phase lag data have been interpreted using an extremum method by which the thermal coating parameters are directly related to the values and modulation frequencies of the relative extrema of the inverse calibrated thermal wave phases. Structural/morphological characterization has been done using X-ray diffraction (XRD) and atomic force microscopy (AFM). The characterization of the films also includes thickness, hardness, and electric resistivity measurements. The results obtained so far indicate strong correlations between the thermal diffusivity and conductivity, on the one hand, and the oxygen flow on the other hand.

  1. Kernel-aligned multi-view canonical correlation analysis for image recognition

    NASA Astrophysics Data System (ADS)

    Su, Shuzhi; Ge, Hongwei; Yuan, Yun-Hao

    2016-09-01

    Existing kernel-based correlation analysis methods mainly adopt a single kernel in each view. However, a single kernel is usually insufficient to characterize the nonlinear distribution information of a view. To solve the problem, we transform each original feature vector into a 2-dimensional feature matrix by means of kernel alignment, and then propose a novel kernel-aligned multi-view canonical correlation analysis (KAMCCA) method on the basis of the feature matrices. Our proposed method can simultaneously employ multiple kernels to better capture the nonlinear distribution information of each view, so that the correlation features learned by KAMCCA have good discriminating power in real-world image recognition. Extensive experiments were designed on five real-world image datasets, including NIR face images, thermal face images, visible face images, handwritten digit images, and object images. Promising experimental results on these datasets demonstrate the effectiveness of our proposed method.
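
    For orientation, the classical two-view CCA that KAMCCA generalizes fits in a few lines: whiten each view, then take the SVD of the whitened cross-covariance. This is a textbook sketch, not the authors' KAMCCA implementation, which adds kernel alignment and multiple views.

      import numpy as np

      def cca(x, y, reg=1e-6):
          """x: (n, dx), y: (n, dy). Returns projection matrices for the two
          views and the canonical correlations."""
          x = x - x.mean(0); y = y - y.mean(0)
          cxx = x.T @ x / len(x) + reg * np.eye(x.shape[1])
          cyy = y.T @ y / len(y) + reg * np.eye(y.shape[1])
          cxy = x.T @ y / len(x)
          lx = np.linalg.cholesky(np.linalg.inv(cxx))  # whitening: lx.T @ cxx @ lx = I
          ly = np.linalg.cholesky(np.linalg.inv(cyy))
          u, s, vt = np.linalg.svd(lx.T @ cxy @ ly)
          return lx @ u, ly @ vt.T, s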

  2. Bayesian hierarchical functional data analysis via contaminated informative priors.

    PubMed

    Scarpa, Bruno; Dunson, David B

    2009-09-01

    A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.

  3. Electronic Nose Characterization of the Quality Parameters of Freeze-Dried Bacteria

    NASA Astrophysics Data System (ADS)

    Capuano, R.; Santonico, M.; Martinelli, E.; Paolesse, R.; Passot, S.; Fonseca, F.; Cenard, S.; Trelea, C.; Di Natale, C.

    2011-09-01

    Freeze-drying is the method of choice for preserving heat-sensitive biological products such as microorganisms. The development of a fast analytical method for evaluating the properties of dehydrated bacteria is therefore necessary for proper utilization of the product in several food processes. In this paper, the headspace of dried bacteria is analyzed by GC-MS and an electronic nose. Results indicate that the headspace contains enough information to assess the product's quality.

  4. SPF Full-scale emissions test method development status ...

    EPA Pesticide Factsheets

    This is a non-technical presentation intended to inform ASTM task group members about our intended approach to full-scale emissions testing, which includes the application of spray foam in an environmental chamber. The presentation describes the approach to emissions characterization, the types of measurement systems employed, and the expected outcomes of the planned tests. The purpose of this presentation is to update the ASTM D22.05 work group on the status of our full-scale emissions test method development.

  5. Physical interpretation and development of ultrasonic nondestructive evaluation techniques applied to the quantitative characterization of textile composite materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1995-01-01

    In this Progress Report, the author describes continuing research to explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize complex materials. Images obtained using an unmodified medical ultrasonic imaging system of a bonded aluminum plate sample with a simulated disbond region are presented. The disbond region was produced by adhering a piece of plain white paper to a piece of cellophane tape and applying the paper-tape combination to one of the aluminum plates. Because the area under the paper was not adhesively bonded to the aluminum plate, this arrangement more closely simulates a disbond. Images are also presented for an aluminum plate sample with an epoxy strip adhered to one side to help provide information for the interpretation of the images of the bonded aluminum plate sample containing the disbond region. These images are compared with corresponding conventional ultrasonic contact transducer measurements in order to provide information regarding the nature of the disbonded region. The results of this ongoing investigation may provide a step toward the development of a rapid, real-time, and portable method of ultrasonic inspection and characterization based on linear array technology. In Section 2 of this Progress Report, the preparation of the aluminum plate specimens is described. Section 3 describes the method of linear array imaging. Sections 4 and 5 present the linear array images and results from contact transducer measurements, respectively. A discussion of the results is presented in Section 6.

  6. Scalable subsurface inverse modeling of huge data sets with an application to tracer concentration breakthrough data from magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; Werth, Charles J.; Valocchi, Albert J.

    2016-07-01

    Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydrogeophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with "big data" processing and numerous large-scale numerical simulations. To tackle such difficulties, the principal component geostatistical approach (PCGA) has been proposed as a "Jacobian-free" inversion method that requires far fewer forward simulation runs per iteration than the number of unknown parameters and measurements needed in traditional inversion methods. PCGA can be conveniently linked to any multiphysics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zeroth temporal moment of breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Only about 2000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.
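
    The data-compression step is simple enough to sketch directly: each breakthrough curve c(t) is reduced to its zeroth temporal moment, and the mean travel time follows as the ratio of first to zeroth moments. Array shapes below are assumptions, not the authors' file format.

      import numpy as np

      def temporal_moments(t, c):
          """t: (nt,) sample times; c: (nt, nvox) concentration histories.
          Returns the zeroth moment and mean travel time per voxel."""
          m0 = np.trapz(c, t, axis=0)                 # zeroth moment
          m1 = np.trapz(t[:, None] * c, t, axis=0)    # first moment
          return m0, m1 / np.where(m0 > 0, m0, np.nan)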

  7. Decoding of quantum dots encoded microbeads using a hyperspectral fluorescence imaging method.

    PubMed

    Liu, Yixi; Liu, Le; He, Yonghong; Zhu, Liang; Ma, Hui

    2015-05-19

    We present a decoding method for quantum-dot-encoded microbeads based on their fluorescence spectra, using a line-scan hyperspectral fluorescence imaging (HFI) method. The HFI system was developed to attain both the spectra of the fluorescence signal and the spatial information of the encoded microbeads. A decoding scheme was adopted to decode the spectra of multicolor microbeads acquired by the HFI system. Comparison experiments between the HFI system and a flow cytometer were conducted. The results showed that the HFI system has higher spectral resolution; thus, more channels in the spectral dimension can be used. A detection and decoding experiment with single-stranded DNA (ssDNA)-immobilized multicolor beads was performed, and the result showed the efficiency of the HFI system. Surface modification of the microbeads with polydopamine was characterized by scanning electron microscopy, and ssDNA immobilization was characterized by laser confocal microscopy. These results indicate that the designed HFI system can be applied to practical biological and medical applications.
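
    One plausible way to implement the spectral decoding step, matching a measured bead spectrum against known single-quantum-dot reference spectra, is non-negative least-squares unmixing, sketched below; the paper's exact decoding scheme may differ.

      import numpy as np
      from scipy.optimize import nnls

      def decode(spectrum, references):
          """spectrum: (nbands,) measured bead spectrum;
          references: (nbands, ndots) QD emission reference spectra.
          Returns non-negative abundances per QD color and the residual."""
          abundances, residual = nnls(references, spectrum)
          return abundances, residual

    Thresholding the abundances then yields the bead's color-intensity code.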

  8. Developments in ambient noise analysis for the characterization of dynamic response of slopes to seismic shaking

    NASA Astrophysics Data System (ADS)

    Del Gaudio, Vincenzo; Wasowski, Janusz

    2016-04-01

    In the last few decades, we have witnessed a growing awareness of the role of site dynamic response to seismic shaking in slope failures during earthquakes. Considering the time and costs involved in acquiring accelerometer data on landslide-prone slopes, the analysis of ambient noise offers a profitable investigative alternative. Standard procedures of ambient noise analysis, according to the technique known as HVNR or Nakamura's method, were originally devised to interpret data under simple site conditions similar to 1D layering (flat horizontal layering infinitely extended). In such cases, conditions of site amplification, characterized by a strong impedance contrast between a soft surface layer and a stiff bedrock, result in a single pronounced isotropic maximum of spectral ratios between the horizontal and vertical components of ambient noise. However, previous studies have shown that the dynamic response of slopes affected by landslides is rather complex, being characterized by multiple resonance peaks with directional variability; thus, the use of standard techniques can encounter difficulties in providing reliable information. A new approach of data analysis has recently been proposed to exploit the information content of Rayleigh waves present in ambient noise, with regard to the identification of the frequency and orientation of directional resonance. By exploiting ground motion ellipticity, this approach can also provide information on the vertical distribution of S-wave velocity, which controls site amplification factors. The method, based on the identification of Rayleigh wave packets from instantaneous polarization properties of ambient noise, was first tested using synthetic signals in order to optimize the data processing system. The improved processing scheme was then adopted to re-process and re-interpret the ambient noise data acquired on landslide-prone slopes around Caramanico Terme (central Italy), at sites monitored also with accelerometer stations. The comparison of ambient noise analysis results with the outcomes of accelerometer monitoring reveals the potential and limits of the new method for investigations of slope dynamic response.
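
    The standard HVNR computation that this work builds on is compact enough to sketch: the spectral ratio of the merged horizontal components to the vertical component of the same noise window. Windowing and the geometric-mean merging below are common choices, used here as assumptions.

      import numpy as np

      def hv_ratio(ns, ew, z, fs):
          """ns, ew, z: equal-length ambient-noise traces; fs: sampling rate (Hz).
          Returns frequencies and the H/V spectral-ratio curve."""
          win = np.hanning(len(z))
          spec = lambda x: np.abs(np.fft.rfft((x - x.mean()) * win))
          h = np.sqrt(spec(ns) * spec(ew))   # geometric mean of horizontals
          v = np.maximum(spec(z), 1e-20)     # guard against division by zero
          freqs = np.fft.rfftfreq(len(z), d=1.0 / fs)
          return freqs, h / v

    Peaks of the curve locate candidate resonance frequencies; the directional analysis described above goes further by exploiting Rayleigh-wave polarization.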

  9. A new method to study the change of miRNA-mRNA interactions due to environmental exposures.

    PubMed

    Petralia, Francesca; Aushev, Vasily N; Gopalakrishnan, Kalpana; Kappil, Maya; W Khin, Nyan; Chen, Jia; Teitelbaum, Susan L; Wang, Pei

    2017-07-15

    Integrative approaches characterizing the interactions among different types of biological molecules have been demonstrated to be useful for revealing informative biological mechanisms. One such example is the interaction between microRNA (miRNA) and messenger RNA (mRNA), whose deregulation may be sensitive to environmental insult leading to altered phenotypes. The goal of this work is to develop an effective data integration method to characterize deregulation between miRNA and mRNA due to environmental toxicant exposures. We will use data from an animal experiment designed to investigate the effect of low-dose environmental chemical exposure on normal mammary gland development in rats to motivate and evaluate the proposed method. We propose a new network approach, integrative Joint Random Forest (iJRF), which characterizes the regulatory system between miRNAs and mRNAs using a network model. iJRF is designed to work under the high-dimension low-sample-size regime, and can borrow information across different treatment conditions to achieve more accurate network inference. It also effectively takes into account prior information of miRNA-mRNA regulatory relationships from existing databases. When iJRF is applied to the data from the environmental chemical exposure study, we detected a few important miRNAs that regulated a large number of mRNAs in the control group but not in the exposed groups, suggesting the disruption of miRNA activity due to chemical exposure. Effects of chemical exposure on two affected miRNAs were further validated using breast cancer human cell lines. R package iJRF is available at CRAN.
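
    Since iJRF itself is distributed as an R package, the Python fragment below only sketches the general idea of random-forest network scoring: each mRNA is regressed on all miRNAs and the feature importances score candidate edges, computed separately per condition so the score matrices can be compared. The prior-weighted, cross-condition joint estimation that distinguishes iJRF is omitted.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def rf_edge_scores(mirna, mrna, n_trees=500, seed=0):
          """mirna: (n_samples, n_mirna); mrna: (n_samples, n_mrna).
          Returns an importance matrix scoring miRNA -> mRNA edges."""
          scores = np.zeros((mirna.shape[1], mrna.shape[1]))
          for j in range(mrna.shape[1]):
              rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
              rf.fit(mirna, mrna[:, j])
              scores[:, j] = rf.feature_importances_
          return scores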

  10. Creating NDA working standards through high-fidelity spent fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E

    2012-01-01

    The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods. This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent calculations performed using SCALE and MCNP. This work is supported by the Next Generation Safeguards Initiative, Office of Nuclear Safeguards and Security, National Nuclear Security Administration.

  11. Study of the correlation parameters of the surface structure of disordered semiconductors by the two-dimensional DFA and average mutual information methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpatov, A. V.; Vikhrov, S. P.; Rybina, N. V., E-mail: pgnv@mail.ru

    The processes of self-organization of the surface structure of hydrogenated amorphous silicon are studied by the methods of fluctuation analysis and average mutual information on the basis of atomic-force-microscopy images of the surface. It is found that all of the structures can be characterized by a correlation vector and represented as a superposition of harmonic components and noise. It is shown that, under variations in the technological parameters of the production of a-Si:H films, the correlation properties of their structure vary as well. As the substrate temperature is increased, the formation of structural irregularities becomes less efficient; in this case, the length of the correlation vector and the degree of structural ordering increase. It is shown that the procedure based on the method of fluctuation analysis in combination with the method of average mutual information provides a means for studying the self-organization processes in any structures on different length scales.
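
    The one-dimensional DFA building block (the study applies a two-dimensional variant to AFM images) is short enough to sketch; the scaling exponent characterizing the correlations is the log-log slope of the returned fluctuation function.

      import numpy as np

      def dfa(signal, window_sizes):
          """First-order detrended fluctuation analysis of a 1-D profile.
          Returns the fluctuation function F(n) for each window size n."""
          profile = np.cumsum(signal - np.mean(signal))
          f = []
          for n in window_sizes:
              resid = []
              for k in range(len(profile) // n):
                  seg = profile[k * n:(k + 1) * n]
                  t = np.arange(n)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)  # local detrend
                  resid.append(np.mean((seg - trend) ** 2))
              f.append(np.sqrt(np.mean(resid)))
          return np.asarray(f)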

  12. The Advancement of Public Awareness, Concerning TRU Waste Characterization, Using a Virtual Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, T. B.; Burns, T. P.; Estill, W. G.

    2002-02-28

    Building public trust and confidence through openness is a goal of the DOE Carlsbad Field Office for the Waste Isolation Pilot Plant (WIPP). The objective of the virtual document described in this paper is to give the public an overview of the waste characterization steps, an understanding of how waste characterization instrumentation works, and the type and amount of data generated from a batch of drums. The document is intended to be published on a web page and/or distributed at public meetings on CDs. Users may gain as much information as they desire regarding the transuranic (TRU) waste characterization program, starting at the highest level requirements (drivers) and progressing to more and more detail regarding how the requirements are met. Included are links to: drivers (which include laws, permits and DOE Orders); various characterization steps required for transportation and disposal under WIPP's Hazardous Waste Facility Permit; physical/chemical basis for each characterization method; types of data produced; and quality assurance process that accompanies each measurement. Examples of each type of characterization method in use across the DOE complex are included. The original skeleton of the document was constructed in a PowerPoint presentation and included descriptions of each section of the waste characterization program. This original document had a brief overview of Acceptable Knowledge, Non-Destructive Examination, Non-Destructive Assay, Small Quantity sites, and the National Certification Team. A student intern was assigned the project of converting the document to a virtual format and to discuss each subject in depth. The resulting product is a fully functional virtual document that works in a web browser and functions like a web page. All documents that were referenced, linked to, or associated, are included on the virtual document's CD. WIPP has been engaged in a variety of Hazardous Waste Facility Permit modification activities. During the public meetings, discussion centered on proposed changes to the characterization program. The philosophy behind the virtual document is to show the characterization process as a whole, rather than as isolated parts. In addition to public meetings, other uses for the information might be as a training tool for new employees at the WIPP facility to show them where their activities fit into the overall scheme, as well as an employee review to help prepare for waste certification audits.

  13. Results and lessons learned from MODIS polarization sensitivity characterization

    NASA Astrophysics Data System (ADS)

    Sun, J.; Xiong, X.; Wang, X.; Qiu, S.; Xiong, S.; Waluschka, E.

    2006-08-01

    In addition to radiometric, spatial, and spectral calibration requirements, MODIS design specifications include polarization sensitivity requirements of less than 2% for all Reflective Solar Bands (RSB) except for the band centered at 412 nm. To the best of our knowledge, MODIS was the first imaging radiometer that went through comprehensive system-level (end-to-end) polarization characterization. MODIS polarization sensitivity was measured pre-launch at a number of sensor view angles using a laboratory Polarization Source Assembly (PSA) that consists of a rotatable source, a polarizer (Ahrens prism design), and a collimator. This paper describes MODIS polarization characterization approaches used by the MODIS Characterization Support Team (MCST) at NASA/GSFC and addresses issues and concerns in the measurements. Results (polarization factor and phase angle) obtained using different analysis methods are discussed. Also included in this paper is a polarization characterization comparison between Terra and Aqua MODIS. Our previous and recent analyses of MODIS RSB polarization sensitivity could provide useful information for future Earth-observing sensor design, development, and characterization.
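
    The two reported quantities, polarization factor and phase angle, are conventionally obtained by fitting responses measured at several polarizer angles to a two-harmonic model, R(theta) = a0 * (1 + p * cos(2*(theta - phi))). The sketch below is that generic fit, not MCST's actual reduction code.

      import numpy as np

      def polarization_fit(theta_rad, response):
          """Linearize R = a0 + c*cos(2t) + s*sin(2t) and solve by least
          squares; then p = |(c, s)| / a0 and phi = atan2(s, c) / 2."""
          a = np.column_stack([np.ones_like(theta_rad),
                               np.cos(2 * theta_rad),
                               np.sin(2 * theta_rad)])
          a0, c, s = np.linalg.lstsq(a, response, rcond=None)[0]
          return np.hypot(c, s) / a0, 0.5 * np.arctan2(s, c)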

  14. Software Suite to Support In-Flight Characterization of Remote Sensing Systems

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross

    2014-01-01

    A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before it can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of ground truth data, which has been used to provide reproducible characterizations on a number of commercial remote sensing systems. Overall, this characterization software suite improves the reliability of ground-truth data processing techniques that are required for remote sensing system in-flight characterizations.

  15. Characteristics of Health Information Gatherers, Disseminators, and Blockers Within Families at Risk of Hereditary Cancer: Implications for Family Health Communication Interventions

    PubMed Central

    Peters, June A.; Kenen, Regina; Hoskins, Lindsey M.; Ersig, Anne L.; Kuhn, Natalia R.; Loud, Jennifer T.; Greene, Mark H.

    2009-01-01

    Objectives. Given the importance of the dissemination of accurate family history to assess disease risk, we characterized the gatherers, disseminators, and blockers of health information within families at high genetic risk of cancer. Methods. A total of 5466 personal network members of 183 female participants of the Breast Imaging Study from 124 families with known mutations in the BRCA1/2 genes (associated with high risk of breast, ovarian, and other types of cancer) were identified by using the Colored Eco-Genetic Relationship Map (CEGRM). Hierarchical nonlinear models were fitted to characterize information gatherers, disseminators, and blockers. Results. Gatherers of information were more often female (P < .001), parents (P < .001), and emotional support providers (P < .001). Disseminators were more likely female first- and second-degree relatives (both P < .001), family members in the older or same generation as the participant (P < .001), those with a cancer history (P < .001), and providers of emotional (P < .001) or tangible support (P < .001). Blockers tended to be spouses or partners (P < .001) and male first-degree relatives (P < .001). Conclusions. Our results provide insight into which family members may, within a family-based intervention, effectively gather family risk information, disseminate information, and encourage discussions regarding shared family risk. PMID:19833996

  16. Automated Interpretation of Subcellular Patterns in Fluorescence Microscope Images for Location Proteomics

    PubMed Central

    Chen, Xiang; Velliste, Meel; Murphy, Robert F.

    2010-01-01

    Proteomics, the large scale identification and characterization of many or all proteins expressed in a given cell type, has become a major area of biological research. In addition to information on protein sequence, structure and expression levels, knowledge of a protein’s subcellular location is essential to a complete understanding of its functions. Currently subcellular location patterns are routinely determined by visual inspection of fluorescence microscope images. We review here research aimed at creating systems for automated, systematic determination of location. These employ numerical feature extraction from images, feature reduction to identify the most useful features, and various supervised learning (classification) and unsupervised learning (clustering) methods. These methods have been shown to perform significantly better than human interpretation of the same images. When coupled with technologies for tagging large numbers of proteins and high-throughput microscope systems, the computational methods reviewed here enable the new subfield of location proteomics. This subfield will make critical contributions in two related areas. First, it will provide structured, high-resolution information on location to enable Systems Biology efforts to simulate cell behavior from the gene level on up. Second, it will provide tools for Cytomics projects aimed at characterizing the behaviors of all cell types before, during and after the onset of various diseases. PMID:16752421

  17. Chromosome preference of disease genes and vectorization for the prediction of non-coding disease genes.

    PubMed

    Peng, Hui; Lan, Chaowang; Liu, Yuansheng; Liu, Tao; Blumenstein, Michael; Li, Jinyan

    2017-10-03

    Disease-related protein-coding genes have been widely studied, but disease-related non-coding genes remain largely unknown. This work introduces a new vector to represent diseases, and applies the newly vectorized data in a positive-unlabeled learning algorithm to predict and rank disease-related long non-coding RNA (lncRNA) genes. The novel vector representation for diseases consists of two sub-vectors. The first is composed of 45 elements characterizing the information entropies of the disease gene distribution over 45 chromosome substructures. This idea is supported by our observation that some substructures (e.g., the chromosome 6 p-arm) are highly preferred by disease-related protein-coding genes, while some (e.g., the 21 p-arm) are not favored at all. The second sub-vector is 30-dimensional, characterizing the distribution of disease-gene-enriched KEGG pathways in comparison with our manually created pathway groups. The second sub-vector complements the first to differentiate between various diseases. Our prediction method outperforms the state-of-the-art methods on benchmark datasets for prioritizing disease-related lncRNA genes. The method also works well when only the sequence information of an lncRNA gene is known, or even when a given disease has no currently recognized long non-coding genes.
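
    One plausible reading of the first sub-vector is sketched below: each of the 45 elements is the -p*log2(p) contribution of one chromosome substructure to the entropy of a disease's gene distribution. The counts are toy inputs, and the authors' exact definition may differ.

      import numpy as np

      def substructure_entropy_vector(counts):
          """counts: (n_diseases, 45) disease-gene counts per substructure.
          Returns one 45-element entropy vector per disease."""
          p = counts / counts.sum(axis=1, keepdims=True)
          with np.errstate(divide="ignore", invalid="ignore"):
              h = np.where(p > 0, -p * np.log2(p), 0.0)
          return h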

  18. Monitoring the bio-stimulation of hydrocarbon-contaminated soils by measurements of soil electrical properties, and CO2 content and its 13C/12C isotopic signature

    NASA Astrophysics Data System (ADS)

    Noel, C.; Gourry, J.; Ignatiadis, I.; Colombano, S.; Dictor, M.; Guimbaud, C.; Chartier, M.; Dumestre, A.; Dehez, S.; Naudet, V.

    2013-12-01

    Hydrocarbon-contaminated soils represent an environmental issue, as they impact ecosystems and aquifers. Where significant subsurface heterogeneity exists, conventional intrusive investigations and groundwater sampling can be insufficient for robust monitoring of hydrocarbon contaminants, as the information they provide is restricted to vertical profiles at discrete locations, with no information between sampling points. In order to obtain information over a wider volume of the subsurface, complementary methods such as geophysics can be used. Among geophysical methods, geoelectrical techniques such as electrical resistivity (ER) and induced polarization (IP) seem the most promising, especially for studying the effects of biodegradation processes. Laboratory and field geoelectrical experiments to characterize soils contaminated by oil products have shown that mature hydrocarbon-contaminated soils are characterized by enhanced electrical conductivity, although hydrocarbons are electrically resistive. This high bulk conductivity is due to bacterial impacts on the geological media, resulting in changes in the chemical and physical properties and thus in the geophysical properties of the ground. Moreover, microbial activity induces CO2 production and an isotopic deviation of carbon: the produced CO2 reflects the pollutant's isotopic signature, so the ratio δ13C(CO2) approaches δ13C(hydrocarbon). BIOPHY, a project supported by the French National Research Agency (ANR), proposes to use electrical methods and gas analyses to develop an operational and non-destructive method for monitoring in situ biodegradation of hydrocarbons in order to optimize soil treatment. The demonstration field is located in the south of Paris (France), where liquid fuels (gasoline and diesel) leaked from tanks in 1997. In order to stimulate biodegradation, a trench was dug to supply oxygen to the water table and thus stimulate aerobic metabolic bioprocesses. ER and IP surveys are performed regularly to monitor the stimulated biodegradation and the progress of remediation until soil cleanup. Microbial activity is characterized by an increase in CO2 production and a δ13C isotopic deviation in the produced CO2, measured by infrared laser spectroscopy, and by an evolution of electrical conductivity and IP responses in correlation with microbiological and chemical analyses.

  19. Recent developments on algal biochar production and characterization.

    PubMed

    Yu, Kai Ling; Lau, Beng Fye; Show, Pau Loke; Ong, Hwai Chyuan; Ling, Tau Chuan; Chen, Wei-Hsin; Ng, Eng Poh; Chang, Jo-Shu

    2017-12-01

    Algal biomass is known as a promising sustainable feedstock for the production of biofuels and other valuable products. Over the last decade, however, substantial interest has turned to converting algal biomass into biochar. Due to their high nutrient content and ion-exchange capacity, algal biochars can be used as soil amendments for agricultural purposes or as adsorbents in wastewater treatment for the removal of organic or inorganic pollutants. This review describes the conventional (e.g., slow and microwave-assisted pyrolysis) and newly developed (e.g., hydrothermal carbonization and torrefaction) methods used for the synthesis of algae-based biochars. The characterization of algal biochar and a comparison between algal biochar and biochar produced from other feedstocks are also presented. This review aims to provide updated information on the development of algal biochar in terms of production methods and the characterization of its physical and chemical properties, to justify and expand their potential applications.

  1. Separation and characterization of silybin, isosilybin, silydianin and silychristin in milk thistle extract by liquid chromatography-electrospray tandem mass spectrometry.

    PubMed

    Lee, James I; Hsu, Bih H; Wu, Di; Barrett, Jeffrey S

    2006-05-26

    A selective and sensitive liquid chromatography/tandem mass spectrometry (LC/MS/MS) method has been developed for the characterization of silymarin in commercially available milk thistle extract. In this study, six main active constituents, including silydianin, silychristin, diastereomers of silybin (silybin A and B) and diastereomers of isosilybin (isosilybin A and B) in silymarin, were completely separated on a YMC ODS-AQ HPLC column using a gradient mobile phase system comprised of ammonium acetate and methanol/water/formic acid. Identification and characterization of the major constituents were based not only on the product ion scan, which provided unique fragmentation information of a selected molecular ion, but also on the specific fragmentation of multiple reaction monitoring (MRM) data, which confirmed the retention times of LC chromatographic peaks. The method was applied in the analysis of human plasma samples in the presence of silymarin and appeared to be suitable for the pharmacokinetic studies in which the discrimination of silymarin constituents is essential.

  2. DNA binding site characterization by means of Rényi entropy measures on nucleotide transitions.

    PubMed

    Perera, A; Vallverdu, M; Claria, F; Soria, J M; Caminal, P

    2008-06-01

    In this work, parametric information-theory measures for the characterization of binding sites in DNA are extended with the use of transitional probabilities on the sequence. We propose the use of parametric uncertainty measures such as Rényi entropies obtained from the transition probabilities for the study of binding sites, in addition to nucleotide frequency-based Rényi measures. Results are reported comparing transition frequencies (i.e., dinucleotides) and base frequencies for Shannon and parametric Rényi entropies for a number of binding sites found in E. coli, lambda, and T7 organisms. We observe that the information provided by the two approaches is not redundant. Furthermore, in the presence of noise in the binding site matrix, we observe overall improved robustness of nucleotide transition-based algorithms when compared with nucleotide frequency-based methods.
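
    The measures themselves are compact; a sketch with toy inputs (not the binding-site data of the paper) follows.

      import numpy as np

      def renyi_entropy(p, alpha):
          """H_alpha = log2(sum p^alpha) / (1 - alpha); Shannon as alpha -> 1."""
          p = p[p > 0]
          if np.isclose(alpha, 1.0):
              return float(-np.sum(p * np.log2(p)))
          return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

      def transition_probs(seqs, pos):
          """Frequencies of the 16 dinucleotides spanning positions pos, pos+1
          in a set of aligned binding-site sequences."""
          pairs = [s[pos:pos + 2] for s in seqs]
          counts = np.array([pairs.count(a + b)
                             for a in "ACGT" for b in "ACGT"], float)
          return counts / counts.sum()

    Comparing renyi_entropy(transition_probs(...), alpha) across positions and orders alpha reproduces the kind of transition-based profile studied above.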

  3. Documentation of the U.S. Geological Survey Stress and Sediment Mobility Database

    USGS Publications Warehouse

    Dalyander, P. Soupy; Butman, Bradford; Sherwood, Christopher R.; Signell, Richard P.

    2012-01-01

    The U.S. Geological Survey Sea Floor Stress and Sediment Mobility Database contains estimates of bottom stress and sediment mobility for the U.S. continental shelf. This U.S. Geological Survey database provides information that is needed to characterize sea floor ecosystems and evaluate areas for human use. The estimates contained in the database are designed to spatially and seasonally resolve the general characteristics of bottom stress over the U.S. continental shelf and to estimate sea floor mobility by comparing critical stress thresholds based on observed sediment texture data to the modeled stress. This report describes the methods used to make the bottom stress and mobility estimates, statistics used to characterize stress and mobility, data validation procedures, and the metadata for each dataset and provides information on how to access the database online.
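
    The mobility estimate reduces to a simple exceedance computation, sketched below with assumed array shapes: a grid cell is mobile whenever the modeled bottom stress exceeds the critical stress derived from the observed sediment texture.

      import numpy as np

      def mobility_fraction(stress, tau_crit):
          """stress: (nt, ncells) modeled bottom stress (Pa);
          tau_crit: (ncells,) critical stress from sediment texture (Pa).
          Returns the fraction of time each cell is mobile."""
          return (stress > tau_crit[None, :]).mean(axis=0)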

  4. Upgrade of the Surface Spectrometer at NEPOMUC for PAES, XPS and STM Investigations

    NASA Astrophysics Data System (ADS)

    Zimnik, S.; Lippert, F.; Hugenschmidt, C.

    2014-04-01

    The characterization of the elemental composition of surfaces is of great importance for the understanding of many surface processes, such as surface segregation or oxidation. Positron-annihilation-induced Auger Electron Spectroscopy (PAES) is a powerful technique for gathering information about the elemental composition of only the topmost atomic layer of a sample. The upgraded surface spectrometer at NEPOMUC (NEutron induced POsitron source MUniCh) enables a comprehensive surface analysis with the complementary techniques STM, XPS and PAES. A new X-ray source for X-ray induced photoelectron spectroscopy (XPS) was installed to gather additional information on oxidation states. A new scanning tunneling microscope (STM) is used as a complementary method to investigate the surface electron density with atomic resolution. The combination of PAES, XPS and STM allows the characterization of both the elemental composition and the surface topology.

  5. A thermodynamic-like characterization of Colombia’s presidential elections in 2010, and a comparison with other Latin American countries

    NASA Astrophysics Data System (ADS)

    Campos, Diógenes

    2011-05-01

    A thermodynamic-like characterization of Colombia’s presidential election is presented. We assume that the electoral system consists of citizens embedded in a political environment, and that this environment can be considered as an information bath characterized by the entropic parameter q (q ∈ [0, ∞]). First, for q = 1, the electoral outcomes of 2010 are translated into a set of probabilities (relative frequencies of the events) P = {P1, P2, …, PN}, with N possible independent results. Then, for 0 ≤ q < ∞, the electoral system is characterized by using the thermodynamic-like method for probabilistic systems proposed in a previous article. Some general considerations on the macro-description of a probabilistic system and a comparison of presidential elections in five countries are also included.

  6. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. Such analyses should be derived using an assimilation approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Single CPT samplings were modeled as probability density curves by maximum entropy theory. A spatial prior multivariate probability density function (PDF) and a likelihood PDF linking the CPT positions to the prediction point were built from the borehole experiments; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated in the Bayesian inverse interpolation framework. Results from Gaussian sequential stochastic simulation and the Bayesian method were compared. Differences between normally distributed single CPT samplings and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization of a stratum and identifies the limitations associated with inadequate geostatistical interpolation techniques. These characterization results provide a multi-precision information assimilation method for other geotechnical parameters.

  7. TECHNICAL APPROACHES TO CHARACTERIZING AND ...

    EPA Pesticide Factsheets

    The document provides brownfields planners with an overview of the technical methods that can be used to achieve successful site assessment and cleanup, which are two key components of the brownfields redevelopment process. No two brownfields sites are identical, and planners will need to base assessment and cleanup activities on the conditions of the particular sites with which they are dealing. A site assessment strategy should address: the type and extent of contamination, if any, that is present; the types of data needed to adequately assess the site; appropriate sampling and analytical methods to characterize the contamination; the acceptable level of uncertainty; and cleanup technologies that contain or treat the types of wastes present. This document includes references to state agency roles, including the Voluntary Cleanup Program, public involvement, and other guidance that may be used.

  8. Physicochemical characterization of modified clay based composites obtained by a novel method

    NASA Astrophysics Data System (ADS)

    Kalra, Swati; Dudi, D.; Singh, G. P.; Verma, S. K.; Bhojak, N.

    2018-05-01

    Materials science is one of the important fields where the absorption spectra of lanthanide ions have been the subject of several investigations because of their possible use in laser materials, diagnostic tools and sensors. The study of absorption spectra in the visible and near-infrared regions yields useful information regarding energy and intensity parameters and the nature and probabilities of transitions. Chemical physics provides a fundamental tool for developing lanthanide chemistry, which has become increasingly significant in the last few years due to the wide variety of potential applications of lanthanide complexes in many important areas of biology and medicine. The present work describes the development of a novel method of composite preparation based on clay, and the physicochemical characterization of the product. Simultaneous measurement of some thermal properties has made the study more useful. The results match accepted models.

  9. Characterization of perovskite film prepared by pulsed laser deposition on ferritic stainless steel using microscopic and optical methods

    NASA Astrophysics Data System (ADS)

    Durda, E.; Jaglarz, J.; Kąc, S.; Przybylski, K.; El Kouari, Y.

    2016-06-01

    The perovskite La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF48) film was deposited on Crofer 22 APU ferritic stainless steel by pulsed laser deposition (PLD). Morphological studies of the sample were performed using scanning electron microscopy (SEM) and atomic force microscopy (AFM). Information about the film thickness and the surface topography of the film and the steel substrate was obtained using the following optical methods: spectroscopic ellipsometry (SE), bidirectional reflection distribution function (BRDF) measurements, and total integrated scattering (TIS). In particular, the BRDF study, being complementary to atomic force microscopy, yielded information about surface topography. Using the previously mentioned methods, the following statistical surface parameters were determined: root-mean-square (rms) roughness and autocorrelation length, obtained by determining the power spectral density (PSD) function of surface irregularities.

  10. Computer synthesis of high resolution electron micrographs

    NASA Technical Reports Server (NTRS)

    Nathan, R.

    1976-01-01

    Specimen damage, spherical aberration, low contrast, and noisy sensors combine to prevent direct atomic viewing in a conventional electron microscope. The paper describes two methods for obtaining ultra-high resolution in biological specimens under the electron microscope. The first method assumes the physical limits of the electron objective lens and uses a series of dark-field images of biological crystals to obtain direct information on the phases of the Fourier diffraction maxima; this information is then used computationally to synthesize a large-aperture lens for 1-Å resolution. The second method assumes there is sufficient amplitude scatter in images recorded in focus, which can be exploited with a sensitive densitometer and computer contrast stretching to yield fine structural image details. Cancer virus characterization is discussed as an illustrative example. Numerous photographs supplement the text.

  11. Comparison of traditional nondestructive analysis of RERTR fuel plates with digital radiographic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidsmeier, T.; Koehl, R.; Lanham, R.

    2008-07-15

    The current design and fabrication process for RERTR fuel plates utilizes film radiography for nondestructive testing and characterization. Digital radiographic methods offer potential increases in efficiency and accuracy. The traditional and digital radiographic methods are described and demonstrated on a fuel plate constructed, using the dispersion method, with an average fuel loading of 51% by volume. Fuel loading data from each method are analyzed and compared to a third baseline method to assess accuracy. The new digital method is shown to be more accurate, to save hours of work, and to provide additional information not easily available with the traditional method. Additional possible improvements suggested by the new digital method are also raised. (author)

  12. Characterization, thermal stability studies, and analytical method development of Paromomycin for formulation development.

    PubMed

    Khan, Wahid; Kumar, Neeraj

    2011-06-01

    Paromomycin (PM) is an aminoglycoside antibiotic, first isolated in the 1950s and approved in 2006 for the treatment of visceral leishmaniasis. Although isolated six decades ago, PM lacks much of the information essential for the development of a pharmaceutical formulation. The purpose of this paper was to determine the thermal stability of PM and to develop a new analytical method to support formulation development. PM was characterized by thermoanalytical (DSC, TGA, and HSM) and spectroscopic (FTIR) techniques, and these techniques were used to establish the thermal stability of PM after heating at 100, 110, 120, and 130 °C for 24 h. The biological activity of the heated samples was also determined by microbiological assay. Subsequently, a simple, rapid, and sensitive RP-HPLC method for the quantitative determination of PM was developed using pre-column derivatization with 9-fluorenylmethyl chloroformate. The developed method was applied to quantify PM in two parenteral dosage forms. PM was successfully characterized by the stated techniques, which indicated that PM is stable on heating up to 120 °C for 24 h but liable to degradation when heated at 130 °C. This degradation was also observed in the microbiological assay, where PM lost ∼30% of its biological activity when heated at 130 °C for 24 h. The new analytical method was developed for PM over the concentration range of 25-200 ng/ml with intra-day and inter-day variability of < 2% RSD. Characterization techniques were established and the stability of PM was determined successfully. The developed analytical method was found to be sensitive, accurate, and precise for the quantification of PM. Copyright © 2010 John Wiley & Sons, Ltd.

  13. The growth and in situ characterization of chemical vapor deposited SiO2

    NASA Technical Reports Server (NTRS)

    Iyer, R.; Chang, R. R.; Lile, D. L.

    1987-01-01

    This paper reports the results of studies of the kinetics of remote (indirect) plasma enhanced low pressure CVD growth of SiO2 on Si and InP and of the in situ characterization of the electrical surface properties of InP during CVD processing. In the latter case photoluminescence was employed as a convenient and sensitive noninvasive method for characterizing surface trap densities. It was determined that, provided certain precautions are taken, the growth of SiO2 occurs in a reproducible and systematic fashion that can be expressed in an analytic form useful for growth rate prediction. Moreover, the in situ photoluminescence studies have yielded information on sample degradation resulting from heating and chemical exposure during the CVD growth.

  14. Characterization of intermittency in renewal processes: Application to earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akimoto, Takuma; Hasumi, Tomohiro; Aizawa, Yoji

    2010-03-15

    We construct a one-dimensional piecewise linear intermittent map from the interevent time distribution for a given renewal process. Then, we characterize intermittency by the asymptotic behavior near the indifferent fixed point in the piecewise linear intermittent map. Thus, we provide a framework to understand a unified characterization of intermittency and also present the Lyapunov exponent for renewal processes. This method is applied to the occurrence of earthquakes using the Japan Meteorological Agency and the National Earthquake Information Center catalog. By analyzing the return map of interevent times, we find that interevent times are not independent and identically distributed random variables but that the conditional probability distribution functions in the tail obey the Weibull distribution.
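
    The Weibull tail behavior reported above can be checked with a direct fit. In the sketch below, synthetic interevent times stand in for catalog data, and the tail threshold is an arbitrary choice:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Synthetic interevent times standing in for a real earthquake catalog
      interevent = stats.weibull_min.rvs(c=0.8, scale=120.0, size=5000,
                                         random_state=rng)

      # Keep only the tail: interevent times above a threshold quantile
      threshold = np.quantile(interevent, 0.7)
      tail = interevent[interevent > threshold] - threshold

      # Maximum-likelihood Weibull fit with the location fixed at zero
      shape, loc, scale = stats.weibull_min.fit(tail, floc=0)
      print(f"Weibull shape c = {shape:.2f}, scale = {scale:.1f}")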

  15. Towards precision medicine: from quantitative imaging to radiomics

    PubMed Central

    Acharya, U. Rajendra; Hagiwara, Yuki; Sudarshan, Vidya K.; Chan, Wai Yee; Ng, Kwan Hoong

    2018-01-01

    Radiology (imaging) and imaging-guided interventions, which provide multi-parametric morphologic and functional information, are playing an increasingly significant role in precision medicine. Radiologists are trained to understand imaging phenotypes, to transcribe those observations (phenotypes) so as to correlate them with underlying diseases, and to characterize the images. However, in order to understand and characterize the molecular phenotype (to obtain genomic information) of solid heterogeneous tumours, advanced sequencing of biopsied tissue is required. Thus, radiologists image the tissues from various views and angles in order to capture the complete image phenotype, thereby acquiring a huge amount of data. Deriving meaningful details from all these radiological data becomes challenging and raises big-data issues. Therefore, interest in the application of radiomics has been growing in recent years, as it has the potential to provide significant interpretive and predictive information for decision support. Radiomics is a combination of conventional computer-aided diagnosis, deep learning methods, and human skills, and thus can be used for quantitative characterization of tumour phenotypes. This paper discusses the overview of the radiomics workflow, the results of various radiomics-based studies conducted using radiological images such as computed tomography (CT), magnetic resonance imaging (MRI), and positron-emission tomography (PET), the challenges we are facing, and the potential contribution of radiomics towards precision medicine. PMID:29308604

  16. Characterizing uncertain sea-level rise projections to support investment decisions.

    PubMed

    Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus

    2018-01-01

    Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly localized and context-dependent nature of applying Robust Decision Making methods to inform investment decisions.

  17. Characterizing uncertain sea-level rise projections to support investment decisions

    PubMed Central

    Lempert, Robert J.; Wikman-Svahn, Per; Keller, Klaus

    2018-01-01

    Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly localized and context-dependent nature of applying Robust Decision Making methods to inform investment decisions. PMID:29414978

  18. Graph characterization via Ihara coefficients.

    PubMed

    Ren, Peng; Wilson, Richard C; Hancock, Edwin R

    2011-02-01

    The novel contributions of this paper are twofold. First, we demonstrate how to characterize unweighted graphs in a permutation-invariant manner using the polynomial coefficients from the Ihara zeta function, i.e., the Ihara coefficients. Second, we generalize the definition of the Ihara coefficients to edge-weighted graphs. For an unweighted graph, the Ihara zeta function is the reciprocal of a quasi characteristic polynomial of the adjacency matrix of the associated oriented line graph. Since the Ihara zeta function has poles that give rise to infinities, the most convenient numerically stable representation is to work with the coefficients of the quasi characteristic polynomial. Moreover, the polynomial coefficients are invariant to vertex order permutations and also convey information concerning the cycle structure of the graph. To generalize the representation to edge-weighted graphs, we make use of the reduced Bartholdi zeta function. We prove that the computation of the Ihara coefficients for unweighted graphs is a special case of our proposed method for unit edge weights. We also present a spectral analysis of the Ihara coefficients and indicate their advantages over other graph spectral methods. We apply the proposed graph characterization method to capturing graph-class structure and clustering graphs. Experimental results reveal that the Ihara coefficients are more effective than methods based on Laplacian spectra.
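
    As a concrete reading of the unweighted construction, the sketch below builds the adjacency operator T of the oriented line graph (an arc may be followed by any arc leaving its head, except the reversal) and takes the characteristic-polynomial coefficients of T; up to the sign convention relating det(I - uT) to det(uI - T), these carry the Ihara coefficients. This is our own minimal illustration, not the authors' code:

      import numpy as np

      def ihara_coefficients(adj):
          """Characteristic-polynomial coefficients of the oriented
          line graph's adjacency matrix T for an unweighted graph."""
          n = adj.shape[0]
          # Each undirected edge {u, v} yields two directed arcs
          arcs = [(u, v) for u in range(n) for v in range(n) if adj[u, v]]
          T = np.zeros((len(arcs), len(arcs)))
          for i, (u, v) in enumerate(arcs):
              for j, (w, x) in enumerate(arcs):
                  if w == v and x != u:   # follow (u,v) by (v,x), no backtracking
                      T[i, j] = 1.0
          return np.poly(T)               # invariant to vertex permutations

      # Example: the triangle graph K3
      K3 = np.array([[0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])
      print(ihara_coefficients(K3))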

  19. Application of advanced geophysical logging methods in the characterization of a fractured-sedimentary bedrock aquifer, Ventura County, California

    USGS Publications Warehouse

    Williams, John H.; Lane, John W.; Singha, Kamini; Haeni, F. Peter

    2002-01-01

    An integrated suite of advanced geophysical logging methods was used to characterize the geology and hydrology of three boreholes completed in fractured-sedimentary bedrock in Ventura County, California. The geophysical methods included caliper, gamma, electromagnetic induction, borehole deviation, optical and acoustic televiewer, borehole radar, fluid resistivity, temperature, and electromagnetic flowmeter. The geophysical logging 1) provided insights useful for the overall geohydrologic characterization of the bedrock and 2) enhanced the value of information collected by other methods from the boreholes including core-sample analysis, multiple-level monitoring, and packer testing. The logged boreholes, which have open intervals of 100 to 200 feet, penetrate a sequence of interbedded sandstone and mudstone with bedding striking 220 to 250 degrees and dipping 15 to 40 degrees to the northwest. Fractures intersected by the boreholes include fractures parallel to bedding and fractures with variable strike that dip moderately to steeply. Two to three flow zones were detected in each borehole. The flow zones consist of bedding-parallel or steeply dipping fractures or a combination of bedding-parallel fractures and moderately to steeply dipping fractures. About 75 to more than 90 percent of the measured flow under pumped conditions was produced by only one of the flow zones in each borehole.

  20. Materials characterization of propellants using ultrasonics

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Jones, David

    1993-01-01

    Propellant characteristics for solid rocket motors had not been completely determined for use as processing variables in today's production facilities. A major effort to determine the propellant characteristics obtainable through ultrasonic measurement techniques was performed in this task. The information obtained was then used to assess the uniformity of manufacturing methods and/or the ability to detect non-uniformity in processes.

  1. An Overview of Addiction Research Center Inventory Scales (ARCI): An Appendix and Manual of Scales.

    ERIC Educational Resources Information Center

    Haertzen, C.A.

    The Addiction Research Center Inventory is a 550-item multipurpose test measuring the broad range of physical, emotive, cognitive, and subjective effects of drugs. This manual provides technical information concerning the 38 most valid scales, a quantitative method for characterizing the similarity of a profile of scores for the subject, group, or…

  2. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
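
    The core observation is easy to reproduce numerically. In this minimal sketch (the full modulation scheme is not reproduced), a sinusoid sampled over one full cycle is binned by amplitude, and the resulting histogram approximates the arcsine-shaped PDF that peaks at the waveform extremes:

      import numpy as np

      t = np.linspace(0.0, 1.0, 4000, endpoint=False)   # one cycle of a 1 Hz carrier
      samples = np.sin(2.0 * np.pi * 1.0 * t)

      # Sort samples by frequency of occurrence: a histogram approximating the PDF
      pdf, edges = np.histogram(samples, bins=32, density=True)

      # Density is largest near +/-1, where the sinusoid spends the most time
      print(pdf[0], pdf[16], pdf[-1])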

  3. Method and device for diagnosing and controlling combustion instabilities in internal combustion engines operating in or transitioning to homogeneous charge compression ignition mode

    DOEpatents

    Wagner, Robert M [Knoxville, TN; Daw, Charles S [Knoxville, TN; Green, Johney B [Knoxville, TN; Edwards, Kevin D [Knoxville, TN

    2008-10-07

    This invention is a method of achieving stable, optimal mixtures of HCCI and SI in practical gasoline internal combustion engines comprising the steps of: characterizing the combustion process based on combustion process measurements, determining the ratio of conventional and HCCI combustion, determining the trajectory (sequence) of states for consecutive combustion processes, and determining subsequent combustion process modifications using said information to steer the engine combustion toward desired behavior.

  4. Characterization of organic matter of plants from lakes by thermal analysis in a N2 atmosphere

    NASA Astrophysics Data System (ADS)

    Guo, Fei; Wu, Fengchang; Mu, Yunsong; Hu, Yan; Zhao, Xiaoli; Meng, Wei; Giesy, John P.; Lin, Ying

    2016-03-01

    Organic matter (OM) has been characterized using thermal analysis in O2 atmospheres, but it is not clear whether OM can be characterized using slow thermal degradation in N2 atmospheres (STDN). This article presents a new method to estimate the behavior of OM in anaerobic environments. Seventeen different plants from Tai Lake (Ch: Taihu), China were heated to 600 °C at a rate of 10 °C min-1 in a N2 atmosphere and characterized by differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). DSC thermograms were compared with those of 9 standard compounds. Seven peaks were observed in the DSC thermograms; 2 main peaks correlated strongly with biochemical indices, and one main peak corresponded to a transitional stage. Energy absorbed by a peak at approximately 200 °C was well correlated with total organic carbon, while energy absorbed at approximately 460 °C was negatively correlated with lignin content. The presence of peaks at approximately 350 and 420 °C varied among plant biomass sources, providing potential evidence for biomass identification. The STDN method reported here is a rapid and accurate way to quantitatively characterize OM, which may provide useful information for understanding the anaerobic behavior of natural organic matter.

  5. Inverse approaches with lithologic information for a regional groundwater system in southwest Kansas

    USGS Publications Warehouse

    Tsou, Ming‐shu; Perkins, S.P.; Zhan, X.; Whittemore, Donald O.; Zheng, Lingyun

    2006-01-01

    Two practical approaches incorporating lithologic information for groundwater modeling calibration are presented to estimate distributed, cell-based hydraulic conductivity. The first approach is to estimate optimal hydraulic conductivities for geological materials by incorporating thickness distribution of materials into inverse modeling. In the second approach, residuals for the groundwater model solution are minimized according to a globalized Newton method with the aid of a Geographic Information System (GIS) to calculate a cell-wise distribution of hydraulic conductivity. Both approaches honor geologic data and were effective in characterizing the heterogeneity of a regional groundwater modeling system in southwest Kansas. © 2005 Elsevier Ltd. All rights reserved.

  6. Addressing unmeasured confounding in comparative observational research.

    PubMed

    Zhang, Xiang; Faries, Douglas E; Li, Hu; Stamey, James D; Imbens, Guido W

    2018-04-01

    Observational pharmacoepidemiological studies can provide valuable information on the effectiveness or safety of interventions in the real world, but one major challenge is the existence of unmeasured confounder(s). While many analytical methods have been developed for dealing with this challenge, they appear under-utilized, perhaps due to the complexity and varied requirements for implementation. Thus, there is an unmet need to improve understanding of the appropriate course of action to address unmeasured confounding under a variety of research scenarios. We implemented a stepwise search strategy to find articles discussing the assessment of unmeasured confounding in electronic literature databases. Identified publications were reviewed and characterized by the applicable research settings and the information requirements for implementing each method. We further used this information to develop a best practice recommendation to help guide the selection of appropriate analytical methods for assessing the potential impact of unmeasured confounding. Over 100 papers were reviewed, and 15 methods were identified. We used a flowchart to illustrate the best practice recommendation, which was driven by 2 critical components: (1) availability of information on the unmeasured confounders; and (2) goals of the unmeasured confounding assessment. Key factors for implementation of each method were summarized in a checklist to provide further assistance to researchers for implementing these methods. When assessing comparative effectiveness or safety in observational research, the impact of unmeasured confounding should not be ignored. Instead, we suggest quantitatively evaluating the impact of unmeasured confounding and provide a best practice recommendation for selecting appropriate analytical methods. Copyright © 2018 John Wiley & Sons, Ltd.

  7. Automation of aggregate characterization using laser profiling and digital image analysis

    NASA Astrophysics Data System (ADS)

    Kim, Hyoungkwan

    2002-08-01

    Particle morphological properties such as size, shape, angularity, and texture are key properties that are frequently used to characterize aggregates. The characteristics of aggregates are crucial to the strength, durability, and serviceability of the structure in which they are used. Thus, it is important to select aggregates that have proper characteristics for each specific application. Use of improper aggregate can cause rapid deterioration or even failure of the structure. The current standard aggregate test methods are generally labor-intensive, time-consuming, and subject to human errors. Moreover, important properties of aggregates may not be captured by the standard methods due to a lack of an objective way of quantifying critical aggregate properties. Increased quality expectations of products along with recent technological advances in information technology are motivating new developments to provide fast and accurate aggregate characterization. The resulting information can enable a real time quality control of aggregate production as well as lead to better design and construction methods of portland cement concrete and hot mix asphalt. This dissertation presents a system to measure various morphological characteristics of construction aggregates effectively. Automatic measurement of various particle properties is of great interest because it has the potential to solve such problems in manual measurements as subjectivity, labor intensity, and slow speed. The main efforts of this research are placed on three-dimensional (3D) laser profiling, particle segmentation algorithms, particle measurement algorithms, and generalized particle descriptors. First, true 3D data of aggregate particles obtained by laser profiling are transformed into digital images. Second, a segmentation algorithm and a particle measurement algorithm are developed to separate particles and process each particle data individually with the aid of various kinds of digital image technologies. Finally, in order to provide a generalized, quantitative, and representative way to characterize aggregate particles, 3D particle descriptors are developed using the multi-resolution analysis feature of wavelet transforms. Verification tests show that this approach could characterize various aggregate properties in a fast, accurate, and reliable way. When implemented, this ability to automatically analyze multiple characteristics of an aggregate sample is expected to provide not only economic but also intangible strategic gains.
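
    The multi-resolution descriptor idea can be indicated in simplified form. The sketch below uses a hypothetical 1D radius-versus-angle boundary signature rather than the dissertation's true 3D data, summarizing each wavelet scale by its detail-coefficient energy:

      import numpy as np
      import pywt

      def wavelet_shape_descriptor(radii, wavelet="db2", level=4):
          """Energy of the wavelet detail coefficients at each scale of a
          particle's radius-vs-angle outline signature."""
          coeffs = pywt.wavedec(radii, wavelet, level=level)
          # coeffs[0] is the coarse approximation; coeffs[1:] are detail bands
          return np.array([np.sum(c ** 2) for c in coeffs[1:]])

      # Hypothetical particle: a bumpy circle sampled at 256 boundary angles
      theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
      radii = 1.0 + 0.08 * np.sin(7 * theta) + 0.02 * np.sin(31 * theta)
      print(wavelet_shape_descriptor(radii))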

  8. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique: a review

    PubMed Central

    Xiao, Li; Wei, Hui; Himmel, Michael E.; Jameel, Hasan; Kelley, Stephen S.

    2014-01-01

    Optimizing the use of lignocellulosic biomass as a feedstock for renewable energy production is currently being pursued globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins, as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time consuming. In order to characterize biomass quickly and efficiently, several high-throughput technologies have been successfully developed. Among them, near-infrared spectroscopy (NIR) and pyrolysis-molecular beam mass spectrometry (Py-mbms) are complementary tools capable of evaluating a large number of raw or modified biomass samples in a short period of time. NIR shows vibrations associated with specific chemical structures, whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures, providing complementary chemical insight into biomaterials. However, it is challenging to interpret the results because of the large number of overlapping bands or decomposition fragments contained in the spectra. In order to improve the efficiency of data analysis, multivariate analysis tools have been adapted to identify the significant correlations among data variables, so that the large number of bands/peaks can be replaced by a small number of reconstructed variables representing the original variation. The reconstructed variables are used for sample comparison (principal component analysis) and for building regression models (partial least squares regression) between biomass chemical structures and properties of interest. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized, and the advantages and disadvantages of conventional and multivariate data analysis methods are introduced, compared, and evaluated. This review aims to serve as a guide for choosing the most effective data analysis methods for NIR and Py-mbms characterization of biomass. PMID:25147552
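
    A minimal sketch of the multivariate workflow described above, using scikit-learn with synthetic spectra standing in for NIR data (the array shapes and the property variable are hypothetical):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 500))    # 60 spectra x 500 wavelength channels
      # Hypothetical property of interest (e.g., lignin content) tied to two bands
      y = 2.0 * X[:, 40] + X[:, 300] + rng.normal(scale=0.1, size=60)

      # PCA: compress overlapping bands into a few scores for sample comparison
      scores = PCA(n_components=5).fit_transform(X)

      # PLS: regression model relating spectra to the property of interest
      pls = PLSRegression(n_components=5).fit(X, y)
      print("calibration R^2:", pls.score(X, y))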

  9. Paper-Based MicroRNA Expression Profiling from Plasma and Circulating Tumor Cells.

    PubMed

    Leong, Sai Mun; Tan, Karen Mei-Ling; Chua, Hui Wen; Huang, Mo-Chao; Cheong, Wai Chye; Li, Mo-Huang; Tucker, Steven; Koay, Evelyn Siew-Chuan

    2017-03-01

    Molecular characterization of circulating tumor cells (CTCs) holds great promise for monitoring metastatic progression and characterizing metastatic disease. However, leukocyte and red blood cell contamination of routinely isolated CTCs makes CTC-specific molecular characterization extremely challenging. Here we report the use of a paper-based medium for efficient extraction of microRNAs (miRNAs) from limited amounts of biological samples such as rare CTCs harvested from cancer patient blood. Specifically, we devised a workflow involving the use of Flinders Technology Associates (FTA®) Elute Cards with a digital PCR-inspired "partitioning" method to extract and purify miRNAs from plasma and CTCs. We demonstrated the sensitivity of this method to detect miRNA expression from as few as 3 cancer cells spiked into human blood. Using this method, background miRNA expression was excluded from contaminating blood cells, and CTC-specific miRNA expression profiles were derived from breast and colorectal cancer patients. Plasma separated out during purification of CTCs could likewise be processed using the same paper-based method for miRNA detection, thereby maximizing the amount of patient-specific information that can be derived from a single blood draw. Overall, this paper-based extraction method enables an efficient, cost-effective workflow for maximized recovery of small RNAs from limited biological samples for downstream molecular analyses. © 2016 American Association for Clinical Chemistry.

  10. Invariance algorithms for processing NDE signals

    NASA Astrophysics Data System (ADS)

    Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William

    1996-11-01

    Signals obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw but also about variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect-related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe wall. The MFL signals contain information about both the permeability of the pipe wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters while retaining defect-related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.

  11. Characterizing individual scattering events by measuring the amplitude and phase of the electric field diffusing through a random medium.

    PubMed

    Jian, Zhongping; Pearce, Jeremy; Mittleman, Daniel M

    2003-07-18

    We describe observations of the amplitude and phase of an electric field diffusing through a three-dimensional random medium, using terahertz time-domain spectroscopy. These measurements are spatially resolved with a resolution smaller than the speckle spot size and temporally resolved with a resolution better than one optical cycle. By computing correlation functions between fields measured at different positions and with different temporal delays, it is possible to obtain information about individual scattering events experienced by the diffusing field. This represents a new method for characterizing a multiply scattered wave.
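
    The position- and delay-resolved correlations can be sketched generically. Below, synthetic complex fields stand in for the measured terahertz waveforms, and the averaging convention is one plausible reading of the approach:

      import numpy as np

      rng = np.random.default_rng(2)
      # E[x, t]: complex field at 20 positions x 512 time samples (synthetic)
      E = rng.normal(size=(20, 512)) + 1j * rng.normal(size=(20, 512))

      def field_correlation(E, dx, tau):
          """<E(x, t) E*(x + dx, t + tau)> averaged over positions and times."""
          a = E[: E.shape[0] - dx, : E.shape[1] - tau]
          b = E[dx:, tau:]
          return np.mean(a * np.conj(b))

      print(abs(field_correlation(E, dx=1, tau=3)))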

  12. Assessment of Technologies Used to Characterize Wildlife Populations in the Offshore Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duberstein, Corey A.; Tagestad, Jerry D.; Larson, Kyle B.

    Wind energy development in the offshore environment can have both direct and indirect effects on wildlife, yet little is known about most species that use near-shore and offshore waters, due in part to the difficulty involved in studying animals in remote, challenging environments. Traditional methods to characterize offshore wildlife populations include shipboard observations. Technological advances have provided researchers with an array of technologies to gather information about fauna from afar. This report describes the use and application of radar, thermal and optical imagery, and acoustic detection technologies for monitoring birds, bats, and marine mammals in offshore environments.

  13. Electrochemical Quartz Crystal Microbalance with Dissipation Real-Time Hydrodynamic Spectroscopy of Porous Solids in Contact with Liquids.

    PubMed

    Sigalov, Sergey; Shpigel, Netanel; Levi, Mikhael D; Feldberg, Moshe; Daikhin, Leonid; Aurbach, Doron

    2016-10-18

    Using multiharmonic electrochemical quartz crystal microbalance with dissipation (EQCM-D) monitoring, a new method of characterization of porous solids in contact with liquids has been developed. The dynamic gravimetric information on the growing, dissolving, or stationary stored solid deposits is supplemented by their precise in-operando porous structure characterization on a mesoscopic scale. We present a very powerful method of quartz-crystal admittance modeling of hydrodynamic solid-liquid interactions in order to extract the porous structure parameters of solids during their formation in real time, using different deposition modes. The unique hydrodynamic spectroscopic characterization of electrolytic and rf-sputtered solid Cu coatings that we use for our "proof of concept" provides a new strategy for probing various electrochemically active thin and thick solid deposits, thereby offering inexpensive, noninvasive, and highly efficient quantitative control over their properties. A broad spectrum of applications of our method is proposed, from various metal electroplating and finishing technologies to deeper insight into dynamic build-up and subsequent development of solid-electrolyte interfaces in the operation of Li-battery electrodes, as well as monitoring hydrodynamic consequences of metal corrosion, and growth of biomass coatings (biofouling) on different solid surfaces in seawater.

  14. High throughput integrated thermal characterization with non-contact optical calorimetry

    NASA Astrophysics Data System (ADS)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments with limited throughput: only one sample is examined at a time. This work reports an infrared-based optical calorimetry, with its theoretical foundation, that provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). The method further allows these thermal properties to be determined for multiple samples rapidly, remotely, and simultaneously. In a proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolution of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.

  15. Synthesis of hexavalent molybdenum formo- and aceto-hydroxamates and deferoxamine via liquid-liquid metal partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breshears, Andrew T.; Brown, M. Alex; Bloom, Ira

    We report a new method of crystal growth and synthesis based on liquid-liquid partitioning that allows, for the first time, isolation and in-depth characterization of molybdenyl bis(formohydroxamate), Mo-FHA, molybdenyl bis(acetohydroxamate), Mo-AHA, and molybdenyl deferoxamine, Mo-DFO. This novel approach affords shorter crystal growth times (on an hourly timeframe) without sacrificing crystal size or integrity where other methods of crystallization were unsuccessful. All three Mo complexes are characterized in solution via FTIR, NMR, UV-vis, and EXAFS spectroscopy. The Mo-AHA and Mo-FHA structures are resolved by single-crystal X-ray diffraction. Using the molybdenyl hydroxamate structural information, the speciation of Mo in a siderophore complex (Mo-DFO) is determined via complementary spectroscopic methods and confirmed by DFT calculations. ESI-MS verifies that a 1:1 molybdenum-to-deferoxamine complex is present in solution. Additionally, the Mo solution speciation in the precursor organic phase, MoO2(NO3)2HEH[EHP]2 (where HEH[EHP] is 2-ethylhexylphosphonic acid mono-2-ethylhexyl ester), is characterized by FTIR and EXAFS spectroscopy as well as DFT calculations.

  16. Altered cerebral blood flow velocity features in fibromyalgia patients in resting-state conditions

    PubMed Central

    Rodríguez, Alejandro; Tembl, José; Mesa-Gresa, Patricia; Muñoz, Miguel Ángel; Montoya, Pedro

    2017-01-01

    The aim of this study is to characterize, in resting-state conditions, the cerebral blood flow velocity (CBFV) signals of fibromyalgia patients. The anterior and middle cerebral arteries of both hemispheres of 15 women with fibromyalgia and 15 healthy women were monitored using transcranial Doppler (TCD) during a 5-minute eyes-closed resting period. Several signal processing methods based on time, information theory, frequency, and time-frequency analyses were used to extract different features characterizing the CBFV signals in the different vessels. The main results indicated that, in comparison with control subjects, fibromyalgia patients showed a higher complexity of the envelope CBFV and a different distribution of the power spectral density. In addition, the complexity and spectral features were observed to correlate with clinical pain parameters and emotional factors. The characterization features were used in a linear model to discriminate between fibromyalgia patients and healthy controls, providing high accuracy. These findings indicate that CBFV signals, specifically their complexity and spectral characteristics, contain information that may be relevant for the assessment of fibromyalgia patients in resting-state conditions. PMID:28700720
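
    A sketch of the kind of spectral feature extraction described (illustrative only; the sampling rate, band limits, and feature set below are assumptions, not the study's):

      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(3)
      fs = 100.0                                # assumed sampling rate (Hz)
      cbfv = rng.normal(size=int(fs * 300))     # synthetic 5-minute CBFV envelope

      # Welch power spectral density, then relative power in a chosen band
      f, psd = welch(cbfv, fs=fs, nperseg=1024)
      band = (f >= 0.02) & (f < 0.2)            # hypothetical band of interest
      rel_power = np.trapz(psd[band], f[band]) / np.trapz(psd, f)
      print(f"relative band power: {rel_power:.3f}")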

  17. SDS-PAGE analysis of Aβ oligomers is disserving research into Alzheimer's disease: appealing for ESI-IM-MS

    NASA Astrophysics Data System (ADS)

    Pujol-Pina, Rosa; Vilaprinyó-Pascual, Sílvia; Mazzucato, Roberta; Arcella, Annalisa; Vilaseca, Marta; Orozco, Modesto; Carulla, Natàlia

    2015-10-01

    The characterization of the forms and structures of amyloid-beta peptide (Aβ) oligomers is crucial to advances in the field of Alzheimer's disease (AD). Here we report a critical evaluation of two methods used for this purpose, namely sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE), extensively used in the field, and ion mobility coupled to electrospray ionization mass spectrometry (ESI-IM-MS), an emerging technique with great potential for oligomer characterization. To evaluate their performance, we first obtained pure cross-linked Aβ40 and Aβ42 oligomers of well-defined order. Analysis of these samples by SDS-PAGE revealed that SDS affects the oligomerization state of Aβ42 oligomers, thus providing flawed information on their order and distribution. In contrast, ESI-IM-MS provided accurate information while also reporting on the chemical nature and structure of the oligomers. Our findings have important implications, as they challenge scientific paradigms in the AD field built upon SDS-PAGE characterization of Aβ oligomer samples.

  18. Characterization of single α-tracks by photoresist detection and AFM analysis-focus on biomedical science and technology

    NASA Astrophysics Data System (ADS)

    Falzone, Nadia; Myhra, Sverre; Chakalova, Radka; Hill, Mark A.; Thomson, James; Vallis, Katherine A.

    2013-11-01

    The interactions between energetic ions and biological and/or organic target materials have recently attracted theoretical and experimental attention, due to their implications for detector and device technologies and for therapeutic applications. Most of the attention has focused on detection of the primary ionization tracks and their effects, while recoil target atom tracks remain largely unexplored. Detection of tracks by a negative-tone photoresist (SU-8), followed by standard development, in combination with analysis by atomic force microscopy, shows that both primary and recoil tracks are revealed as conical spikes and can be characterized at high spatial resolution. The methodology has the potential to provide detailed information about single impact events, which may lead to more effective and informative detector technologies and advanced therapeutic procedures. In comparison with current characterization methods, the advantageous features include: greater spatial resolution by an order of magnitude (20 nm); detection of single primary and associated recoil tracks; an increased range of fluence (to 2.5 × 10⁹ cm⁻²); sensitivity to impacts at grazing-angle incidence; and better definition of the lateral interaction volume in target materials.

  19. Characterization of methicillin-resistant Staphylococcus aureus isolated at Tripoli Medical Center, Libya, between 2008 and 2014.

    PubMed

    BenDarif, Elloulu; Khalil, Asma; Rayes, Abdunnabi; Bennour, Emad; Dhawi, Abdulgader; Lowe, John J; Gibbs, Shawn; Goering, Richard V

    2016-12-01

    Bacterial pathogens such as methicillin-resistant Staphylococcus aureus (MRSA) represent a well-known public health problem affecting both healthcare-associated and community populations. Past studies have clearly shown the value of characterizing problem organisms including MRSA through the use of molecular techniques (i.e. strain typing), with the aim of informing local, regional and national efforts in epidemiological analysis and infection control. The country of Libya represents a challenge for such analysis due to limited historical infectious disease information and major political unrest culminating in the Libyan Civil War (Libyan Revolution) in 2011. A MRSA study population of 202 isolates, cultured from patients in Tripoli Medical Center through this historical period (2008-2014), was characterized by both phenotypic and molecular methods. The results revealed a diversification of epidemic MRSA strains over time with generally increasing resistance to fluoroquinolone antibiotics. The study identified prevalent MRSA in comparison to known global epidemic types, providing unique insight into the change of strains and/or characteristics over time especially with reference to the potential influence of the political revolution (i.e. pre- and post-2011).

  20. Design, fabrication and characterization of Computer Generated Holograms for anti-counterfeiting applications using OAM beams as light decoders.

    PubMed

    Ruffato, Gianluca; Rossi, Roberto; Massari, Michele; Mafakheri, Erfan; Capaldo, Pietro; Romanato, Filippo

    2017-12-21

    In this paper, we present the design, fabrication and optical characterization of computer-generated holograms (CGH) encoding information for light beams carrying orbital angular momentum (OAM). Through the use of a numerical code, based on an iterative Fourier transform algorithm, a phase-only diffractive optical element (PO-DOE) specifically designed for OAM illumination has been computed, fabricated and tested. In order to shape the incident beam into a helicoidal phase profile and generate light carrying phase singularities, a method based on transmission through high-order spiral phase plates (SPPs) has been used. The phase pattern of the designed holographic DOEs has been fabricated using high-resolution Electron-Beam Lithography (EBL) over glass substrates coated with a positive photoresist layer (polymethylmethacrylate). To the best of our knowledge, the present study is the first attempt, in a comprehensive work, to design, fabricate and characterize computer-generated holograms encoding information for structured light carrying OAM and phase singularities. These optical devices appear promising as high-security optical elements for anti-counterfeiting applications.
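
    To indicate how such an iterative Fourier transform algorithm operates, here is a generic Gerchberg-Saxton sketch for a phase-only DOE. It is not the authors' code, and a design for OAM illumination would additionally account for the helicoidal phase imposed by the spiral phase plates:

      import numpy as np

      def gerchberg_saxton(target_amp, n_iter=200, seed=0):
          """Phase-only hologram whose far-field intensity approximates
          |target_amp|^2, found by alternating projections."""
          rng = np.random.default_rng(seed)
          phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
          for _ in range(n_iter):
              far = np.fft.fft2(np.exp(1j * phase))           # DOE plane -> far field
              far = target_amp * np.exp(1j * np.angle(far))   # impose target amplitude
              near = np.fft.ifft2(far)                        # back to DOE plane
              phase = np.angle(near)                          # keep phase only
          return phase

      # Hypothetical target: a bright rectangle in the far field
      target = np.zeros((128, 128))
      target[48:80, 56:72] = 1.0
      hologram_phase = gerchberg_saxton(np.fft.ifftshift(target))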

  1. Altered cerebral blood flow velocity features in fibromyalgia patients in resting-state conditions.

    PubMed

    Rodríguez, Alejandro; Tembl, José; Mesa-Gresa, Patricia; Muñoz, Miguel Ángel; Montoya, Pedro; Rey, Beatriz

    2017-01-01

    The aim of this study is to characterize in resting-state conditions the cerebral blood flow velocity (CBFV) signals of fibromyalgia patients. The anterior and middle cerebral arteries of both hemispheres from 15 women with fibromyalgia and 15 healthy women were monitored using Transcranial Doppler (TCD) during a 5-minute eyes-closed resting period. Several signal processing methods based on time, information theory, frequency and time-frequency analyses were used in order to extract different features to characterize the CBFV signals in the different vessels. Main results indicated that, in comparison with control subjects, fibromyalgia patients showed a higher complexity of the envelope CBFV and a different distribution of the power spectral density. In addition, it has been observed that complexity and spectral features show correlations with clinical pain parameters and emotional factors. The characterization features were used in a lineal model to discriminate between fibromyalgia patients and healthy controls, providing a high accuracy. These findings indicate that CBFV signals, specifically their complexity and spectral characteristics, contain information that may be relevant for the assessment of fibromyalgia patients in resting-state conditions.

  2. Application of constrained deconvolution technique for reconstruction of electron bunch profile with strongly non-Gaussian shape

    NASA Astrophysics Data System (ADS)

    Geloni, G.; Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    2004-08-01

    An effective and practical technique based on detection of the coherent synchrotron radiation (CSR) spectrum can be used to characterize the profile function of ultra-short bunches. The CSR spectrum measurement has an important limitation: no spectral phase information is available, so the complete profile function cannot be obtained in general. In this paper we propose to use a constrained deconvolution method for bunch profile reconstruction, based on a priori information about the formation of the electron bunch. Application of the method is illustrated with a practically important example of a bunch formed in a single bunch compressor. Downstream of the bunch compressor the bunch charge distribution is strongly non-Gaussian, with a narrow leading peak and a long tail. The longitudinal bunch distribution is derived by measuring the bunch tail constant with a streak camera and by using a priori information about the profile function.
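
    A generic sketch of deconvolution under a constraint, with nonnegativity of the charge profile as one simple form of a priori information (the paper's actual constraints and the CSR forward model are not reproduced):

      import numpy as np
      from scipy.linalg import convolution_matrix
      from scipy.optimize import nnls

      rng = np.random.default_rng(4)
      n = 200
      z = np.linspace(-5.0, 5.0, n)
      # Hypothetical profile: narrow leading peak plus a long tail
      true = np.exp(-(z + 2.0) ** 2 / 0.2) + 0.3 * np.exp(-z / 2.0) * (z > -1.0)
      kernel = np.exp(-0.5 * (np.linspace(-2, 2, 41) / 0.3) ** 2)
      kernel /= kernel.sum()

      A = convolution_matrix(kernel, n, mode="same")     # smearing operator
      meas = A @ true + rng.normal(scale=1e-3, size=n)   # simulated measurement

      # Tikhonov-regularized, nonnegativity-constrained deconvolution
      lam = 1e-2
      A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
      b_aug = np.concatenate([meas, np.zeros(n)])
      profile, _ = nnls(A_aug, b_aug)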

  3. Contribution of in situ geophysical methods for the definition of the São Sebastião crater model (Azores)

    NASA Astrophysics Data System (ADS)

    Lopes, Isabel; Deidda, Gian Piero; Mendes, Manuela; Strobbia, Claudio; Santos, Jaime

    2013-11-01

    The area located inside the São Sebastião volcanic crater, at the southeast end of Terceira Island (Azores), is characterized by an important amplification of ground motion with respect to the surrounding area, as clearly demonstrated by the spatial distribution of the damage that occurred during the Terceira earthquake (the strongest earthquake felt on the island in recent decades - 01/01/1980 - M = 7.2). Geological and geophysical studies have been conducted to characterize the volcanic crater and understand the different site effects that occurred in the village of São Sebastião. The complexity of the subsurface geology, with intercalations of compact basalt and soft pyroclastic deposits, is associated with extreme vertical and lateral velocity contrasts and poses a serious challenge to different geophysical characterization methods. The available qualitative model did not allow a complete understanding of the site effects. A new seismic campaign has therefore been designed and acquired, and a single, geologically consistent geophysical model has been generated by integrating the existing and new data. The new campaign included two cross-line P-wave seismic refraction profiles, four short SH-wave seismic reflection profiles, and seven multichannel surface wave acquisitions. The integration and joint interpretation of geophysical and geological data allowed mutual validation and confirmation of the data processing steps. In particular, the combined use of refraction, reflection, and surface wave techniques made it possible to face the complexity of a geology that poses different challenges to each method when used individually: velocity inversions, limited reflectivity, and lateral variations. It is shown how the integration of seismic data from different methods, in the framework of a geological model, allowed the geometrical and dynamic characterization of the site. Correlation with further borehole information then allowed the definition of a subsoil model for the crater, providing information that enabled a better understanding of the earthquake site effects in the São Sebastião village. The new near-surface geological model includes a lava layer within the soft infill materials of the crater. This new model matches closely with the damage distribution map and explains the spatial variation of building stock performance in the 1980 earthquake.

  4. An MLC-based linac QA procedure for the characterization of radiation isocenter and room lasers' position.

    PubMed

    Rosca, Florin; Lorenz, Friedlieb; Hacker, Fred L; Chin, Lee M; Ramakrishna, Naren; Zygmanski, Piotr

    2006-06-01

    We have designed and implemented a new stereotactic linac QA test with stereotactic precision. The test is used to characterize gantry sag, couch wobble, cone placement, MLC offsets, and the room lasers' positions relative to the radiation isocenter. Two MLC star patterns, a cone pattern, and the laser line patterns are recorded on the same imaging medium. Phosphor plates are used as the imaging medium due to their sensitivity to red light: the red light of the room lasers erases some of the irradiation information stored on the phosphor plates, enabling accurate and direct measurement of the positions of the room lasers and the radiation isocenter. Using film instead of a phosphor plate as the imaging medium is possible; however, it is less practical. The QA method consists of irradiating four phosphor plates that record the gantry sag between the 0 degrees and 180 degrees gantry angles, the position and stability of the couch rotational axis, the sag between the 90 degrees and 270 degrees gantry angles, the accuracy of cone placement on the collimator, the MLC offsets from the collimator rotational axis, and the position of laser lines relative to the radiation isocenter. The estimated accuracy of the method is +/- 0.2 mm. The observed reproducibility of the method is about +/- 0.1 mm. The total irradiation/illumination time is about 10 min per image. Data analysis, including the phosphor plate scanning, takes less than 5 min for each image. The method characterizes the radiation isocenter geometry with the high accuracy required for stereotactic radiosurgery. In this respect, it is similar to the standard ball test for stereotactic machines. However, because it uses the MLC instead of the cross-hair/ball, it does not depend on cross-hair/ball placement errors with respect to the lasers, and it provides more information on the mechanical integrity of the linac/couch/laser system. Alternatively, it can be used as a highly accurate QA procedure for nonstereotactic machines.

  5. Loss of Coolant Accident (LOCA) / Emergency Core Coolant System (ECCS) Evaluation of Risk-Informed Margins Management Strategies for a Representative Pressurized Water Reactor (PWR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques

    A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.

  6. Characterizing fluid dynamics in a bubble column aimed for the determination of reactive mass transfer

    NASA Astrophysics Data System (ADS)

    Kováts, Péter; Thévenin, Dominique; Zähringer, Katharina

    2018-02-01

    Bubble column reactors are multiphase reactors that are used in many process engineering applications. In these reactors a gas phase comes into contact with a fluid phase to initiate or support reactions. The transport process from the gas to the liquid phase is often the limiting factor. Characterizing this process is therefore essential for the optimization of multiphase reactors. For a better understanding of the transfer mechanisms and subsequent chemical reactions, a laboratory-scale bubble column reactor was investigated. First, to characterize the flow field in the reactor, two different methods have been applied. The shadowgraphy technique is used for the characterisation of the bubbles (bubble diameter, velocity, shape or position) for various process conditions. This technique is based on particle recognition with backlight illumination, combined with particle tracking velocimetry (PTV). The bubble trajectories in the column can also be obtained in this manner. Secondly, the liquid phase flow has been analysed by particle image velocimetry (PIV). The combination of both methods, delivering relevant information concerning disperse (bubbles) and continuous (liquid) phases, leads to a complete fluid dynamical characterization of the reactor, which is the pre-condition for the analysis of mass transfer between both phases.

  7. Alternative Test Methods for Developmental Neurotoxicity: A ...

    EPA Pesticide Factsheets

    Exposure to environmental contaminants is well documented to adversely impact the development of the nervous system. However, the time-, animal-, and resource-intensive EPA and OECD testing guideline methods for developmental neurotoxicity (DNT) are not a viable solution for characterizing the potential hazards of the thousands of untested chemicals currently in commerce. Thus, research efforts over the past decade have endeavored to develop cost-effective alternative DNT testing methods. These efforts have begun to generate data that can inform regulatory decisions, yet there are major challenges to both the acceptance and the use of these data. Major scientific challenges for DNT include the development of new methods and models that are "fit for purpose", the development of a decision-use framework, and regulatory acceptance of the methods. It is critical to understand that the use of data from these methods will be driven mainly by the regulatory problems being addressed. Some problems may be addressed with limited datasets, while others may require data for large numbers of chemicals or the development and use of new biological and computational models. For example, mechanistic information derived from in vitro DNT assays can be used to inform weight-of-evidence (WoE) or integrated approaches to testing and assessment (IATA) for chemical-specific assessments. Alternatively, in vitro data can be used to prioritize (for further testing) the thousands

  8. Wind turbine acoustics

    NASA Technical Reports Server (NTRS)

    Hubbard, Harvey H.; Shepherd, Kevin P.

    1990-01-01

    Available information on the physical characteristics of the noise generated by wind turbines is summarized, with example sound pressure time histories, narrow- and broadband frequency spectra, and noise radiation patterns. Reviewed are noise measurement standards, analysis technology, and a method of characterizing wind turbine noise. Prediction methods are given for both low-frequency rotational harmonics and broadband noise components. Also included are atmospheric propagation data showing the effects of distance and refraction by wind shear. Human perception thresholds, based on laboratory and field tests, are given. Building vibration analysis methods are summarized. The bibliography of this report lists technical publications on all aspects of wind turbine acoustics.

  9. Characterization of human arterial tissue affected by atherosclerosis using multimodal nonlinear optical microscopy

    NASA Astrophysics Data System (ADS)

    Baria, Enrico; Cicchi, Riccardo; Rotellini, Matteo; Nesi, Gabriella; Massi, Daniela; Pavone, Francesco S.

    2016-03-01

    Atherosclerosis is a widespread cardiovascular disease caused by the deposition of lipids (such as cholesterol and triglycerides) on the inner arterial wall. The rupture of an atherosclerotic plaque, resulting in a thrombus, is one of the leading causes of death in the Western World. Preventive assessment of plaque vulnerability is therefore extremely important and can be performed by studying collagen organization and lipid composition in atherosclerotic arterial tissues. Routinely used diagnostic methods, such as histopathological examination, are limited to morphological analysis of the examined tissues, whereas an exhaustive characterization requires immunohistochemical examination and a morpho-functional approach. A label-free and non-invasive alternative is provided by nonlinear microscopy. In this study, we combined SHG and FLIM microscopy in order to characterize collagen organization and lipids in ex vivo human carotid tissues affected by atherosclerosis. SHG and TPF images, acquired from different regions within atherosclerotic plaques, were processed through image pattern analysis methods (FFT, GLCM). The resulting information on collagen and cholesterol distribution and anisotropy, combined with collagen and lipid fluorescence lifetimes measured from FLIM images, allowed characterization of carotid samples and discrimination of different tissue regions. The presented method can be applied for automated classification of atherosclerotic lesions and plaque vulnerability. Moreover, it lays the foundation for a potential in vivo diagnostic tool to be used in a clinical setting.
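
    The GLCM texture statistics used here to quantify collagen anisotropy are easy to state concretely: co-occurrence counts of quantized gray levels at a fixed pixel offset, from which contrast-type measures are derived and compared across directions. A rough NumPy sketch (function names and parameters are ours; the paper's exact settings are not given in the abstract):

    ```python
    import numpy as np

    def glcm(img, dx=1, dy=0, levels=8):
        """Normalized gray-level co-occurrence matrix for one offset (dx, dy >= 0)."""
        q = np.floor(img / img.max() * (levels - 1)).astype(int)
        h, w = q.shape
        src, dst = q[:h - dy, :w - dx], q[dy:, dx:]
        g = np.zeros((levels, levels))
        np.add.at(g, (src.ravel(), dst.ravel()), 1)   # accumulate co-occurrences
        return g / g.sum()

    def contrast(p):
        i, j = np.indices(p.shape)
        return float((p * (i - j) ** 2).sum())

    # Directional ratio as a crude anisotropy indicator for fibrous textures:
    # anisotropy = contrast(glcm(img, 1, 0)) / contrast(glcm(img, 0, 1))
    ```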

  10. Minimum information specification for in situ hybridization and immunohistochemistry experiments (MISFISHIE).

    PubMed

    Deutsch, Eric W; Ball, Catherine A; Berman, Jules J; Bova, G Steven; Brazma, Alvis; Bumgarner, Roger E; Campbell, David; Causton, Helen C; Christiansen, Jeffrey H; Daian, Fabrice; Dauga, Delphine; Davidson, Duncan R; Gimenez, Gregory; Goo, Young Ah; Grimmond, Sean; Henrich, Thorsten; Herrmann, Bernhard G; Johnson, Michael H; Korb, Martin; Mills, Jason C; Oudes, Asa J; Parkinson, Helen E; Pascal, Laura E; Pollet, Nicolas; Quackenbush, John; Ramialison, Mirana; Ringwald, Martin; Salgado, David; Sansone, Susanna-Assunta; Sherlock, Gavin; Stoeckert, Christian J; Swedlow, Jason; Taylor, Ronald C; Walashek, Laura; Warford, Anthony; Wilkinson, David G; Zhou, Yi; Zon, Leonard I; Liu, Alvin Y; True, Lawrence D

    2008-03-01

    One purpose of the biomedical literature is to report results in sufficient detail that the methods of data collection and analysis can be independently replicated and verified. Here we present reporting guidelines for gene expression localization experiments: the minimum information specification for in situ hybridization and immunohistochemistry experiments (MISFISHIE). MISFISHIE is modeled after the Minimum Information About a Microarray Experiment (MIAME) specification for microarray experiments. Both guidelines define what information should be reported without dictating a format for encoding that information. MISFISHIE describes six types of information to be provided for each experiment: experimental design, biomaterials and treatments, reporters, staining, imaging data and image characterizations. This specification has benefited the consortium within which it was developed and is expected to benefit the wider research community. We welcome feedback from the scientific community to help improve our proposal.
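
    To make the six categories concrete, one hypothetical encoding of a MISFISHIE-conformant record follows; the specification mandates only the information content, not a format, so every key and value below is illustrative rather than part of the standard:

    ```python
    # Hypothetical record layout; only the six top-level categories come from
    # MISFISHIE, and all field names and values are invented for illustration.
    misfishie_record = {
        "experimental_design": "IHC comparison of protein X in tumor vs. normal tissue",
        "biomaterials_and_treatments": {"species": "Homo sapiens", "tissue": "prostate",
                                        "preparation": "FFPE sections, 5 um"},
        "reporters": {"type": "antibody", "target": "protein X",
                      "identifier": "vendor clone AB-1 (hypothetical)"},
        "staining": {"protocol": "standard IHC with DAB detection",
                     "controls": ["no-primary-antibody section"]},
        "imaging_data": {"instrument": "brightfield microscope", "objective": "20x"},
        "image_characterizations": {"scoring": "intensity graded 0-3 by a pathologist"},
    }
    ```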

  11. Drug Information Education in Doctor of Pharmacy Programs

    PubMed Central

    Wang, Fei; Troutman, William G.; Seo, Teresa; Peak, Amy; Rosenberg, Jack M.

    2006-01-01

    Objective To characterize pharmacy program standards and trends in drug information education. Methods A questionnaire containing 34 questions addressing general demographic characteristics, organization, and content of drug information education was distributed to 86 colleges and schools of pharmacy in the United States using a Web-based survey system. Results Sixty colleges responded (73% response rate). All colleges offered a campus-based 6-year first-professional degree PharmD program. Didactic drug information was a required course in over 70% of these schools. Only 51 of the 60 colleges offered an advanced pharmacy practice experience (APPE) in drug information, and 62% of these did so only on an elective basis. Conclusion Although almost all of the PharmD programs in the US include a required course in drug information, the majority do not have a required APPE in this important area. PMID:17136172

  12. Hybrid ontology for semantic information retrieval model using keyword matching indexing system.

    PubMed

    Uthayan, K R; Mala, G S Anandha

    2015-01-01

    Ontology development is the process of growing and elucidating the concepts of an information domain that are shared by a group of users. Incorporating ontologies into information retrieval is an established way to improve the relevance of the results that users retrieve. Matching keywords against a historical or domain-specific corpus is central to current approaches for finding the best match for a given input query. This research presents an improved querying mechanism for information retrieval that integrates ontology queries with keyword search. The ontology-based query is translated into a first-order predicate logic query, which is used to route the query to the appropriate servers. Matching algorithms are an active area of research in computer science and artificial intelligence, and in text matching it is more reliable to model the semantics of both the query and the stored terms so that matching can be performed at the semantic level. This research develops semantic matching between input queries and the information in the ontology field. The proposed algorithm is a hybrid method based on matching instances extracted from the queries against the information field. Both the queries and the information domain are focused on semantic matching, in order to discover the best match and to speed up the retrieval process. In conclusion, the hybrid ontology in the semantic web is better suited to retrieving documents than a standard ontology.

  13. Hybrid Ontology for Semantic Information Retrieval Model Using Keyword Matching Indexing System

    PubMed Central

    Uthayan, K. R.; Anandha Mala, G. S.

    2015-01-01

    Ontology development is the process of growing and elucidating the concepts of an information domain that are shared by a group of users. Incorporating ontologies into information retrieval is an established way to improve the relevance of the results that users retrieve. Matching keywords against a historical or domain-specific corpus is central to current approaches for finding the best match for a given input query. This research presents an improved querying mechanism for information retrieval that integrates ontology queries with keyword search. The ontology-based query is translated into a first-order predicate logic query, which is used to route the query to the appropriate servers. Matching algorithms are an active area of research in computer science and artificial intelligence, and in text matching it is more reliable to model the semantics of both the query and the stored terms so that matching can be performed at the semantic level. This research develops semantic matching between input queries and the information in the ontology field. The proposed algorithm is a hybrid method based on matching instances extracted from the queries against the information field. Both the queries and the information domain are focused on semantic matching, in order to discover the best match and to speed up the retrieval process. In conclusion, the hybrid ontology in the semantic web is better suited to retrieving documents than a standard ontology. PMID:25922851
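
    A toy sketch of the hybrid idea described in these two records: expand query terms through an ontology, then score documents by keyword overlap with the expanded set. The miniature ontology, function names, and scoring rule are all invented for illustration and are far simpler than the predicate-logic translation the authors describe:

    ```python
    # Toy subclass/synonym ontology; invented for illustration.
    ONTOLOGY = {
        "vehicle": {"car", "truck", "automobile"},
        "car": {"sedan", "hatchback"},
    }

    def expand(terms):
        """Transitively expand query terms through the ontology."""
        frontier, seen = set(terms), set(terms)
        while frontier:
            nxt = set().union(*(ONTOLOGY.get(t, set()) for t in frontier)) - seen
            seen |= nxt
            frontier = nxt
        return seen

    def score(query_terms, doc_terms):
        """Keyword overlap between the expanded query and a document."""
        q = expand(query_terms)
        return len(q & set(doc_terms)) / len(q)

    docs = {"d1": {"sedan", "engine"}, "d2": {"boat", "engine"}}
    best = max(docs, key=lambda d: score({"vehicle"}, docs[d]))  # -> "d1"
    ```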

  14. Surveillance methods for identifying, characterizing, and monitoring tobacco products: potential reduced exposure products as an example

    PubMed Central

    O’Connor, Richard J.; Cummings, K. Michael; Rees, Vaughan W.; Connolly, Gregory N.; Norton, Kaila J.; Sweanor, David; Parascandola, Mark; Hatsukami, Dorothy K.; Shields, Peter G.

    2015-01-01

    Tobacco products are widely sold and marketed, yet integrated data systems for identifying, tracking, and characterizing products are lacking. Tobacco manufacturers recently have developed potential reduced exposure products (PREPs) with implied or explicit health claims. Currently, a systematic approach for identifying, defining, and evaluating PREPs sold at the local, state or national levels in the US has not been developed. Identifying, characterizing, and monitoring new tobacco products could be greatly enhanced with a responsive surveillance system. This paper critically reviews available surveillance data sources for identifying and tracking tobacco products, including PREPs, evaluating the strengths and weaknesses of potential data sources in light of their reliability and validity. Absent regulations mandating disclosure of product-specific information, it is likely that public health officials will need to rely on a variety of imperfect data sources to help identify, characterize, and monitor tobacco products, including PREPs. PMID:19959680

  15. Conformal piezoelectric systems for clinical and experimental characterization of soft tissue biomechanics

    NASA Astrophysics Data System (ADS)

    Dagdeviren, Canan; Shi, Yan; Joe, Pauline; Ghaffari, Roozbeh; Balooch, Guive; Usgaonkar, Karan; Gur, Onur; Tran, Phat L.; Crosby, Jessi R.; Meyer, Marcin; Su, Yewang; Chad Webb, R.; Tedesco, Andrew S.; Slepian, Marvin J.; Huang, Yonggang; Rogers, John A.

    2015-07-01

    Mechanical assessment of soft biological tissues and organs has broad relevance in clinical diagnosis and treatment of disease. Existing characterization methods are invasive, lack microscale spatial resolution, and are tailored only for specific regions of the body under quasi-static conditions. Here, we develop conformal and piezoelectric devices that enable in vivo measurements of soft tissue viscoelasticity in the near-surface regions of the epidermis. These systems achieve conformal contact with the underlying complex topography and texture of the targeted skin, as well as other organ surfaces, under both quasi-static and dynamic conditions. Experimental and theoretical characterization of the responses of piezoelectric actuator-sensor pairs laminated on a variety of soft biological tissues and organ systems in animal models provide information on the operation of the devices. Studies on human subjects establish the clinical significance of these devices for rapid and non-invasive characterization of skin mechanical properties.

  16. Joint two dimensional inversion of gravity and magnetotelluric data using correspondence maps

    NASA Astrophysics Data System (ADS)

    Carrillo Lopez, J.; Gallardo, L. A.

    2016-12-01

    Inverse problems in the Earth sciences are inherently non-unique. To improve the models and reduce the number of admissible solutions, extra information must be supplied. In a geological context, this can be a priori information, for example geological knowledge, well-log data, or smoothness constraints, or it can come from measurements made with a different kind of survey. Joint inversion improves the solution and reduces the errors that stem from the assumptions of each individual method; to perform it, we need a link between the two or more models involved. Several such links have been explored successfully in recent years. For example, Gallardo and Meju (2003, 2004, 2011) and Gallardo et al. (2012) measured the structural similarity between models through the directions of the property gradients by minimizing their cross-gradients. In this work, we propose a joint iterative inversion method that uses the spatial distribution of the properties as the link. Correspondence maps may characterize specific Earth systems better because they take into account the relation between the properties themselves. We implemented a Fortran code for two-dimensional joint inversion of magnetotelluric and gravity data, two of the standard methods in geophysical exploration. Synthetic tests show the advantages of joint inversion using correspondence maps over separate inversions. Finally, we applied the technique to magnetotelluric and gravity data from the Cerro Prieto geothermal zone, México.
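
    For reference, the cross-gradient measure of structural similarity cited here (Gallardo and Meju) is simple to evaluate on a 2-D grid: it is the cross product of the two model gradients, vanishing wherever the models change in the same (or opposite) spatial direction. A minimal sketch, not the correspondence-map link the authors actually propose:

    ```python
    import numpy as np

    def cross_gradient(m1, m2, dz=1.0, dx=1.0):
        """Out-of-plane component of t = grad(m1) x grad(m2) for two 2-D models
        (rows = depth, columns = distance); t ~ 0 means structural alignment."""
        g1z, g1x = np.gradient(m1, dz, dx)
        g2z, g2x = np.gradient(m2, dz, dx)
        return g1x * g2z - g1z * g2x
    ```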

  17. WIPP waste characterization program sampling and analysis guidance manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  18. EDITORIAL: (Nano)characterization of semiconductor materials and structures (Nano)characterization of semiconductor materials and structures

    NASA Astrophysics Data System (ADS)

    Bonanni, Alberta

    2011-06-01

    The latest impressive advancements in the epitaxial fabrication of semiconductors and in the refinement of characterization techniques have the potential to allow insight into the deep relation between materials' structural properties and their physical and chemical functionalities. Furthermore, while the comprehensive (nano)characterization of semiconductor materials and structures is becoming more and more necessary, a compendium of the currently available techniques is lacking. We are positive that an overview of the hurdles related to the specific methods, often leading to deceptive interpretations, will be most informative for the broad community working on semiconductors, and will help in shining some light onto a plethora of controversial reports found in the literature. From this perspective, with this special issue we address and highlight the challenges and misinterpretations related to complementary local (nanoscale) and more global experimental methods for the characterization of semiconductors. The six topical reviews and the three invited papers by leading experts in the specific fields collected in here are intended to provide the required broad overview on the possibilities of actual (nano)characterization methods, from the microscopy of single quantum structures, over the synchrotron-based absorption and diffraction of nano-objects, to the contentious detection of tiny magnetic signals by quantum interference and resonance techniques. We are grateful to all the authors for their valuable contributions. Moreover, I would like to thank the Editorial Board of the journal for supporting the realization of this special issue and for inviting me to serve as Guest Editor. We greatly appreciate the work of the reviewers, of the editorial staff of Semiconductor Science and Technology and of IOP Publishing. In particular, the efforts of Alice Malhador in coordinating this special issue are acknowledged.

  19. Minimum Information Specification For In Situ Hybridization and Immunohistochemistry Experiments (MISFISHIE)

    PubMed Central

    Deutsch, Eric W; Ball, Catherine A; Berman, Jules J; Bova, G Steven; Brazma, Alvis; Bumgarner, Roger E; Campbell, David; Causton, Helen C; Christiansen, Jeffrey H; Daian, Fabrice; Dauga, Delphine; Davidson, Duncan R; Gimenez, Gregory; Goo, Young Ah; Grimmond, Sean; Henrich, Thorsten; Herrmann, Bernhard G; Johnson, Michael H; Korb, Martin; Mills, Jason C; Oudes, Asa J; Parkinson, Helen E; Pascal, Laura E; Pollet, Nicolas; Quackenbush, John; Ramialison, Mirana; Ringwald, Martin; Salgado, David; Sansone, Susanna-Assunta; Sherlock, Gavin; Stoeckert, Christian J; Swedlow, Jason; Taylor, Ronald C; Walashek, Laura; Warford, Anthony; Wilkinson, David G; Zhou, Yi; Zon, Leonard I; Liu, Alvin Y; True, Lawrence D

    2015-01-01

    One purpose of the biomedical literature is to report results in sufficient detail so that the methods of data collection and analysis can be independently replicated and verified. Here we present for consideration a minimum information specification for gene expression localization experiments, called the “Minimum Information Specification For In Situ Hybridization and Immunohistochemistry Experiments (MISFISHIE)”. It is modelled after the MIAME (Minimum Information About a Microarray Experiment) specification for microarray experiments. Data specifications like MIAME and MISFISHIE specify the information content without dictating a format for encoding that information. The MISFISHIE specification describes six types of information that should be provided for each experiment: Experimental Design, Biomaterials and Treatments, Reporters, Staining, Imaging Data, and Image Characterizations. This specification has benefited the consortium within which it was initially developed and is expected to benefit the wider research community. We welcome feedback from the scientific community to help improve our proposal. PMID:18327244

  20. Multi-scale integration and predictability in resting state brain activity

    PubMed Central

    Kolchinsky, Artemy; van den Heuvel, Martijn P.; Griffa, Alessandra; Hagmann, Patric; Rocha, Luis M.; Sporns, Olaf; Goñi, Joaquín

    2014-01-01

    The human brain displays heterogeneous organization in both structure and function. Here we develop a method to characterize brain regions and networks in terms of information-theoretic measures. We look at how these measures scale when larger spatial regions as well as larger connectome sub-networks are considered. This framework is applied to human brain fMRI recordings of resting-state activity and DSI-inferred structural connectivity. We find that strong functional coupling across large spatial distances distinguishes functional hubs from unimodal low-level areas, and that this long-range functional coupling correlates with structural long-range efficiency on the connectome. We also find a set of connectome regions that are both internally integrated and coupled to the rest of the brain, and which resemble previously reported resting-state networks. Finally, we argue that information-theoretic measures are useful for characterizing the functional organization of the brain at multiple scales. PMID:25104933
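
    As one example of the kind of information-theoretic coupling measure involved, mutual information between two regional activity time series can be estimated from a joint histogram. A minimal sketch (a plug-in estimator with invented parameter choices, not the authors' exact measure):

    ```python
    import numpy as np

    def mutual_information(x, y, bins=16):
        """Plug-in MI estimate (in nats) between two activity time series."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal of x
        py = pxy.sum(axis=0, keepdims=True)   # marginal of y
        nz = pxy > 0                          # skip log(0) terms
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
    ```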

  1. Characterization of Nanoporous Materials with Atom Probe Tomography.

    PubMed

    Pfeiffer, Björn; Erichsen, Torben; Epler, Eike; Volkert, Cynthia A; Trompenaars, Piet; Nowak, Carsten

    2015-06-01

    A method to characterize open-cell nanoporous materials with atom probe tomography (APT) has been developed. For this, open-cell nanoporous gold with pore diameters of around 50 nm was used as a model system, and filled by electron beam-induced deposition (EBID) to obtain a compact material. Two different EBID precursors were successfully tested: dicobalt octacarbonyl [Co2(CO)8] and diiron nonacarbonyl [Fe2(CO)9]. Penetration and filling depth are sufficient for focused ion beam-based APT sample preparation. With this approach, stable APT analysis of the nanoporous material can be performed. Reconstruction reveals the composition of the deposited precursor and the nanoporous material, as well as chemical information of the interfaces between them. Thus, it is shown that, using an appropriate EBID process, local chemical information in three dimensions with sub-nanometer resolution can be obtained from nanoporous materials using APT.

  2. Real time explosive hazard information sensing, processing, and communication for autonomous operation

    DOEpatents

    Versteeg, Roelof J; Few, Douglas A; Kinoshita, Robert A; Johnson, Doug; Linda, Ondrej

    2015-02-24

    Methods, computer readable media, and apparatuses provide robotic explosive hazard detection. A robot intelligence kernel (RIK) includes a dynamic autonomy structure with two or more autonomy levels between operator intervention and robot initiative. A mine sensor and processing module (ESPM) operating separately from the RIK perceives environmental variables indicative of a mine using subsurface perceptors. The ESPM processes mine information to determine the likelihood of the presence of a mine. A robot can autonomously modify its behavior in response to an indication of a detected mine. The behavior is modified between detection of mines, detailed scanning and characterization of the mine, developing mine indication parameters, and resuming detection. Real-time messages are passed between the RIK and the ESPM. A combination of ESPM-bound messages and RIK-bound messages causes the robot platform to switch between modes including a calibration mode, the mine detection mode, and the mine characterization mode.

  3. Real time explosive hazard information sensing, processing, and communication for autonomous operation

    DOEpatents

    Versteeg, Roelof J.; Few, Douglas A.; Kinoshita, Robert A.; Johnson, Douglas; Linda, Ondrej

    2015-12-15

    Methods, computer readable media, and apparatuses provide robotic explosive hazard detection. A robot intelligence kernel (RIK) includes a dynamic autonomy structure with two or more autonomy levels between operator intervention and robot initiative. A mine sensor and processing module (ESPM) operating separately from the RIK perceives environmental variables indicative of a mine using subsurface perceptors. The ESPM processes mine information to determine the likelihood of the presence of a mine. A robot can autonomously modify its behavior in response to an indication of a detected mine. The behavior is modified between detection of mines, detailed scanning and characterization of the mine, developing mine indication parameters, and resuming detection. Real-time messages are passed between the RIK and the ESPM. A combination of ESPM-bound messages and RIK-bound messages causes the robot platform to switch between modes including a calibration mode, the mine detection mode, and the mine characterization mode.

  4. Attrition from surgical residency training: perspectives from those who left.

    PubMed

    Bongiovanni, Tasce; Yeo, Heather; Sosa, Julie A; Yoo, Peter S; Long, Theodore; Rosenthal, Marjorie; Berg, David; Curry, Leslie; Nunez-Smith, Marcella

    2015-10-01

    High rates of attrition from general surgery residency may threaten the surgical workforce. We sought to gain further insight regarding resident motivations for leaving general surgery residency. We conducted in-depth interviews to generate rich narrative data that explored individual experiences. An interdisciplinary team used the constant comparative method to analyze the data. Four themes characterized experiences of our 19 interviewees who left their residency program. Participants (1) felt an informal contract was breached when clinical duties were prioritized over education, (2) characterized a culture in which there was no safe space to share personal and programmatic concerns, (3) expressed a scarcity of role models who demonstrated better work-life balance, and (4) reported negative interactions with authority resulting in a profound loss of commitment. As general surgery graduate education continues to evolve, our findings may inform interventions and policies regarding programmatic changes to boost retention in surgical residency. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Functional Classification of Immune Regulatory Proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubinstein, Rotem; Ramagopal, Udupi A.; Nathenson, Stanley G.

    2013-05-01

    Members of the immunoglobulin superfamily (IgSF) control innate and adaptive immunity and are prime targets for the treatment of autoimmune diseases, infectious diseases, and malignancies. We describe a computational method, termed the Brotherhood algorithm, which utilizes intermediate sequence information to classify proteins into functionally related families. This approach identifies functional relationships within the IgSF and predicts additional receptor-ligand interactions. As a specific example, we examine the nectin/nectin-like family of cell adhesion and signaling proteins and propose receptor-ligand interactions within this family. Guided by the Brotherhood approach, we present the high-resolution structural characterization of a homophilic interaction involving the class-I MHC-restricted T-cell-associated molecule, which we now classify as a nectin-like family member. The Brotherhood algorithm is likely to have a significant impact on structural immunology by identifying those proteins and complexes for which structural characterization will be particularly informative.

  6. Dual Roadside Seismic Sensor for Moving Road Vehicle Detection and Characterization

    PubMed Central

    Wang, Hua; Quan, Wei; Wang, Yinhai; Miller, Gregory R.

    2014-01-01

    This paper presents a method for detecting moving vehicles on roadways using a dual roadside seismic sensor installed on the road shoulder. The recorded seismic signals are split into fixed time intervals. In each interval, the time delay of arrival (TDOA) is estimated using a generalized cross-correlation approach with phase transform (GCC-PHAT). Various kinds of vehicle characterization information, including vehicle speed, axle spacing, detection of vehicle axles, and moving direction, can also be extracted from the collected seismic signals, as demonstrated in this paper. The error of both the vehicle speed and the axle spacing detected by this approach has been shown to be less than 20% in field tests conducted on an urban street in Seattle. Compared to most existing sensors, this new design of dual seismic sensor is cost-effective, easy to install, and effective in gathering information for various traffic management applications. PMID:24526304
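
    The GCC-PHAT estimator at the heart of this method whitens the cross-spectrum of the two channels so that only phase (i.e., delay) information remains, then reads the TDOA off the correlation peak. A compact sketch under the usual formulation (the function name and regularization constant are ours):

    ```python
    import numpy as np

    def gcc_phat(sig, ref, fs, max_tau=None):
        """TDOA (seconds) between two channels via GCC with phase transform."""
        n = len(sig) + len(ref)
        spec = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
        spec /= np.abs(spec) + 1e-15            # PHAT weighting: keep phase only
        cc = np.fft.irfft(spec, n=n)
        max_shift = n // 2 if max_tau is None else min(int(fs * max_tau), n // 2)
        cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
        return (np.argmax(np.abs(cc)) - max_shift) / fs
    ```

    With the two sensors a known distance apart along the road, vehicle speed then follows from that spacing divided by the estimated delay.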

  7. Performance Evaluation of EnKF-based Hydrogeological Site Characterization using Color Coherent Vectors

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.

    2017-12-01

    Complexity of hydrogeological systems arises from their multi-scale heterogeneity, while measurements of their underlying parameters, such as hydraulic conductivity and porosity, are scarce. An inadequate characterization of hydrogeological properties can significantly decrease the trustworthiness of numerical models that predict groundwater flow and solute transport. Therefore, a variety of data assimilation methods have been proposed to estimate hydrogeological parameters from spatially scarce data by incorporating the governing physical models. In this work, we propose a novel framework for evaluating the performance of these estimation methods. We focus on the Ensemble Kalman Filter (EnKF), a widely used data assimilation technique that reconciles multiple sources of measurements to sequentially estimate model parameters such as the hydraulic conductivity. Several methods have been used in the literature to quantify the accuracy of the estimates obtained by EnKF, including rank histograms, RMSE and ensemble spread. However, these commonly used metrics do not account for the spatial information and variability of geological formations, so hydraulic conductivity fields with very different spatial structures can have similar histograms or RMSE. We propose a vision-based approach that quantifies the accuracy of the estimates by considering the spatial structure embedded in the estimated fields. It consists of adapting a metric from image comparison, Color Coherent Vectors (CCV), to evaluate the accuracy of the fields estimated by EnKF. CCV is a histogram-based technique for comparing images that incorporates spatial information: we represent estimated fields as digital three-channel images and use CCV to compare and quantify the accuracy of the estimates. The sensitivity of CCV to spatial information makes it a suitable metric for assessing the performance of spatial data assimilation techniques. We compare the performance of CCV with other metrics such as RMSE under various data assimilation factors, such as the number, layout, and type of measurements. By simulating hydrogeological processes using the estimated and true fields, we observe that CCV outperforms the existing evaluation metrics.
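
    The CCV idea itself is compact: quantize the field, and for each level split the pixels into "coherent" (belonging to a large connected region) and "incoherent" ones. A single-channel sketch (the paper treats estimated fields as three-channel images; the threshold and names here are ours):

    ```python
    import numpy as np
    from scipy import ndimage

    def ccv(field, levels=8, tau=50):
        """Per level: (coherent, incoherent) pixel counts, where coherent means
        lying in a connected region of at least tau pixels."""
        q = np.floor(field / field.max() * (levels - 1)).astype(int)
        out = np.zeros((levels, 2), dtype=int)
        for level in range(levels):
            labels, _ = ndimage.label(q == level)
            sizes = np.bincount(labels.ravel())[1:]   # drop background label 0
            out[level] = sizes[sizes >= tau].sum(), sizes[sizes < tau].sum()
        return out

    # Two fields can then be compared by, e.g., the L1 distance of their CCVs.
    ```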

  8. Scalable subsurface inverse modeling of huge data sets with an application to tracer concentration breakthrough data from magnetic resonance imaging

    DOE PAGES

    Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; ...

    2016-06-09

    Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydro-geophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with “big data” processing and numerous large-scale numerical simulations. To tackle such difficulties, the Principal Component Geostatistical Approach (PCGA) has been proposed as a “Jacobian-free” inversion method that requires far fewer forward simulation runs for each iteration than the number of unknown parameters and measurements needed in the traditional inversion methods. PCGA can be conveniently linked to any multi-physics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation carries little information on the K distribution, the data were compressed using the zeroth temporal moment of the breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Moreover, only about 2,000 forward simulations in total were required to obtain the best estimate with its corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.
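
    The compression step is worth making explicit: each breakthrough curve collapses to a temporal moment, e.g. the mean arrival time obtained by normalizing the first moment by the zeroth. A minimal sketch (trapezoidal integration; variable names are ours):

    ```python
    import numpy as np

    def breakthrough_moments(t, c):
        """Zeroth temporal moment of a breakthrough curve c(t) and the
        normalized first moment (mean arrival time)."""
        m0 = np.trapz(c, t)
        m1 = np.trapz(t * c, t)
        return m0, m1 / m0
    ```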

  9. Scalable subsurface inverse modeling of huge data sets with an application to tracer concentration breakthrough data from magnetic resonance imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.

    Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydro-geophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with “big data” processing and numerous large-scale numerical simulations. To tackle such difficulties, the Principal Component Geostatistical Approach (PCGA) has been proposed as a “Jacobian-free” inversion method that requires far fewer forward simulation runs for each iteration than the number of unknown parameters and measurements needed in the traditional inversion methods. PCGA can be conveniently linked to any multi-physics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation carries little information on the K distribution, the data were compressed using the zeroth temporal moment of the breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Moreover, only about 2,000 forward simulations in total were required to obtain the best estimate with its corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.

  10. Surface characterization of retinal tissues for the enhancement of vitreoretinal surgical methods

    NASA Astrophysics Data System (ADS)

    Valentin-Rodriguez, Celimar

    Diabetic retinopathy is the most common ophthalmic complication of diabetes and the leading cause of blindness among adults, ages 30 to 70. Surgery to remove scar tissue in the eye is the only corrective treatment once the retina is affected. Visual recovery is often hampered by retinal trauma during surgery and by low patient compliance. Our work in this project aimed to improve vitreoretinal surgical methods from information gathered by sensitive surface analysis of pre-retinal tissues found at the vitreoretinal interface. Atomic force microscopy characterization of human retinal tissues revealed that surgically excised inner limiting membrane (ILM) has a heterogeneous surface and is mainly composed of globular and fibrous structures. ILM tissues also show low adhesion for clean unmodified surfaces as opposed to those with functional groups attractive to those on the ILM surface, due to their charge. Based on these observations, layer-by-layer films with embedded gold nanoparticles with a positive outer charge were designed. These modifications increased the adhesion between surgical instruments and ILM by increasing the roughness and tuning the film surface charge. These films proved to be stable under physiological conditions. Finally, the effect of vital dyes on the topographical characteristics of ILMs was characterized and new imaging modes to further reveal ILM topography were utilized. Roughness and adhesion force data suggest that second generation dyes have no effect on the surface nanostructure of ILMs, but increase adhesion at the tip sample interface. This project clearly illustrates that physicochemical information from tissues can be used to rationally re-design surgical procedures, in this case for tissue removal purposes. This rational design method can be applied to other soft tissue excision procedures as is the case of cataract surgery or laparoscopic removal of endometrial tissue.

  11. Characterization of cytochrome c as marker for retinal cell degeneration by uv/vis spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Hollmach, Julia; Schweizer, Julia; Steiner, Gerald; Knels, Lilla; Funk, Richard H. W.; Thalheim, Silko; Koch, Edmund

    2011-07-01

    Retinal diseases like age-related macular degeneration have become an important cause of visual loss, owing to increasing life expectancy and lifestyle habits. Because no satisfactory treatment exists, early diagnosis and prevention are the only ways to stop the degeneration. The protein cytochrome c (cyt c) is a suitable marker for degeneration processes and apoptosis because it is part of the respiratory chain and involved in the apoptotic pathway. The determination of the local distribution and oxidative state of cyt c in living cells allows the characterization of cell degeneration processes. Since cyt c exhibits characteristic absorption bands between 400 and 650 nm wavelength, uv/vis in situ spectroscopic imaging was used for its characterization in retinal ganglion cells. The large amount of data, consisting of spatial and spectral information, was processed by multivariate data analysis; the challenge consists in isolating the molecular information of cyt c. Baseline correction, principal component analysis (PCA) and cluster analysis (CA) were performed in order to identify cyt c within the spectral dataset. The combination of PCA and CA reveals cyt c and its oxidative state. The results demonstrate that uv/vis spectroscopic imaging in conjunction with sophisticated multivariate methods is a suitable tool to characterize cyt c under in situ conditions.
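
    The PCA-plus-clustering pipeline named here follows a familiar pattern: project the baseline-corrected pixel spectra onto a few leading principal components, then cluster the scores to segment pixels by spectral signature. A generic sketch of that pattern (not the authors' exact parameters or estimator):

    ```python
    import numpy as np
    from scipy.cluster.vq import kmeans2

    def pca_scores(spectra, n_components=3):
        """Scores of pixel spectra (rows) on the leading principal components."""
        x = spectra - spectra.mean(axis=0)
        _, _, vt = np.linalg.svd(x, full_matrices=False)
        return x @ vt[:n_components].T

    # Cluster pixels by spectral signature, e.g. to separate the absorption
    # signatures of oxidized and reduced cytochrome c:
    # scores = pca_scores(dataset)                  # dataset: pixels x wavelengths
    # centroids, labels = kmeans2(scores, 3, seed=0)
    ```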

  12. Three-Dimensional Bayesian Geostatistical Aquifer Characterization at the Hanford 300 Area using Tracer Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xingyuan; Murakami, Haruko; Hahn, Melanie S.

    2012-06-01

    Tracer testing under natural or forced gradient flow holds the potential to provide useful information for characterizing subsurface properties, through monitoring, modeling and interpretation of the tracer plume migration in an aquifer. Non-reactive tracer experiments were conducted at the Hanford 300 Area, along with constant-rate injection tests and electromagnetic borehole flowmeter (EBF) profiling. A Bayesian data assimilation technique, the method of anchored distributions (MAD) [Rubin et al., 2010], was applied to assimilate the experimental tracer test data with the other types of data and to infer the three-dimensional heterogeneous structure of the hydraulic conductivity in the saturated zone of the Hanford formation. In this study, the Bayesian prior information on the underlying random hydraulic conductivity field was obtained from previous field characterization efforts using the constant-rate injection tests and the EBF data. The posterior distribution of the conductivity field was obtained by further conditioning the field on the temporal moments of tracer breakthrough curves at various observation wells. MAD was implemented with the massively-parallel three-dimensional flow and transport code PFLOTRAN to cope with the highly transient flow boundary conditions at the site and to meet the computational demands of MAD. A synthetic study proved that the proposed method could effectively invert tracer test data to capture the essential spatial heterogeneity of the three-dimensional hydraulic conductivity field. Application of MAD to actual field data shows that the hydrogeological model, when conditioned on the tracer test data, can reproduce the tracer transport behavior better than the field characterized without the tracer test data. This study successfully demonstrates that MAD can sequentially assimilate multi-scale multi-type field data through a consistent Bayesian framework.

  13. Multivariate analysis and extraction of parameters in resistive RAMs using the Quantum Point Contact model

    NASA Astrophysics Data System (ADS)

    Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.

    2018-01-01

    A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help to shed light on the current components that contribute to conduction in the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method that allows the QPC model parameters to be determined. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to obtain information about the filamentary pathways associated with the LRS in the low-voltage conduction regime.
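
    The abstract does not detail the novel differentiation scheme, but a standard baseline for obtaining a smooth second derivative from a noisy I-V curve is Savitzky-Golay filtering with deriv=2, shown below as a stand-in (the window and polynomial order are arbitrary choices of ours):

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    def second_derivative(v, i, window=31, polyorder=4):
        """Smoothed d2I/dV2 of a reset I-V curve on a uniform voltage grid."""
        dv = v[1] - v[0]
        return savgol_filter(i, window, polyorder, deriv=2, delta=dv)
    ```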

  14. Using informative priors in facies inversion: The case of C-ISR method

    NASA Astrophysics Data System (ADS)

    Valakas, G.; Modis, K.

    2016-08-01

    Inverse problems involving the characterization of hydraulic properties of groundwater flow systems by conditioning on observations of the state variables are mathematically ill-posed because they have multiple solutions and are sensitive to small changes in the data. In the framework of McMC methods for nonlinear optimization, and under an iterative spatial resampling transition kernel, we present an algorithm for narrowing the prior and thus producing improved proposal realizations. To achieve this goal, we cosimulate the facies distribution conditional on facies observations and normal-score-transformed hydrologic response measurements, assuming a linear coregionalization model. The approach works by creating an importance sampling effect that steers the process to selected areas of the prior. The effectiveness of our approach is demonstrated by an example application to a synthetic underdetermined inverse problem in aquifer characterization.

  15. Google matrix analysis of directed networks

    NASA Astrophysics Data System (ADS)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
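
    The construction behind the Google matrix is brief enough to state in code: column-normalize the adjacency matrix, replace dangling columns with a uniform distribution, and damp with the teleportation parameter alpha; the PageRank vector is then the leading eigenvector, obtainable by power iteration. A dense-matrix sketch (real applications use sparse arithmetic):

    ```python
    import numpy as np

    def pagerank(adj, alpha=0.85, tol=1e-10):
        """PageRank from adjacency matrix adj (adj[i, j] = 1 for link j -> i)."""
        n = adj.shape[0]
        cols = adj.sum(axis=0)
        # Column-stochastic matrix S; dangling columns become uniform.
        s = np.where(cols > 0, adj / np.where(cols > 0, cols, 1), 1.0 / n)
        p = np.full(n, 1.0 / n)
        while True:
            p_next = alpha * s @ p + (1 - alpha) / n   # Google matrix applied to p
            if np.abs(p_next - p).sum() < tol:
                return p_next
            p = p_next
    ```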

  16. A method for work modeling at complex systems: towards applying information systems in family health care units.

    PubMed

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. In environments where work is complex, characterized by interdependence among a large number of variables, adopting technological devices to support the activities makes understanding how work is done both more important and more difficult. Therefore, this study presents a method for modeling work in complex systems that improves knowledge about the way activities are performed where these activities do not simply happen by following procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, this work seeks to provide a method capable of giving a detailed and accurate vision of how people perform their tasks, in order to apply information systems for supporting work in organizations.

  17. Laboratory analytical methods for the determination of the hydrocarbon status of soils (a review)

    NASA Astrophysics Data System (ADS)

    Pikovskii, Yu. I.; Korotkov, L. A.; Smirnova, M. A.; Kovach, R. G.

    2017-10-01

    Laboratory analytical methods suitable for the determination of the hydrocarbon status of soils (a specific soil characteristic involving information on the total content and qualitative features of soluble (bitumoid) carbonaceous substances and individual hydrocarbons (polycyclic aromatic hydrocarbons, alkanes, etc.) in bitumoid, as well as the composition and content of hydrocarbon gases) have been considered. Among different physicochemical methods of study, attention is focused on the methods suitable for the wide use. Luminescence-bituminological analysis, low-temperature spectrofluorimetry (Shpolskii spectroscopy), infrared (IR) spectroscopy, gas chromatography, chromatography-mass spectrometry, and some other methods have been characterized, as well as sample preparation features. Advantages and limitations of each of these methods are described; their efficiency, instrumental complexity, analysis duration, and accuracy are assessed.

  18. Sector Identification in a Set of Stock Return Time Series Traded at the London Stock Exchange

    NASA Astrophysics Data System (ADS)

    Coronnello, C.; Tumminello, M.; Lillo, F.; Micciche, S.; Mantegna, R. N.

    2005-09-01

    We compare some methods recently used in the literature to detect the existence of a certain degree of common behavior of stock returns belonging to the same economic sector. Specifically, we discuss methods based on random matrix theory and hierarchical clustering techniques. We apply these methods to a portfolio of stocks traded at the London Stock Exchange. The investigated time series are recorded both at a daily time horizon and at a 5-minute time horizon. The correlation coefficient matrix is very different at different time horizons confirming that more structured correlation coefficient matrices are observed for long time horizons. All the considered methods are able to detect economic information and the presence of clusters characterized by the economic sector of stocks. However, different methods present a different degree of sensitivity with respect to different sectors. Our comparative analysis suggests that the application of just a single method could not be able to extract all the economic information present in the correlation coefficient matrix of a stock portfolio.
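
    One of the hierarchical methods compared here can be sketched directly: turn the correlation matrix into the standard distance d = sqrt(2(1 - rho)) and feed it to agglomerative clustering. A compact illustration (average linkage is an arbitrary choice of ours; much of this literature uses single linkage, which corresponds to the minimum spanning tree):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def sector_clusters(returns, n_clusters=10):
        """Cluster stocks (rows of `returns`) from their return correlations."""
        rho = np.corrcoef(returns)
        d = np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))  # correlation distance
        np.fill_diagonal(d, 0.0)
        z = linkage(squareform(d, checks=False), method="average")
        return fcluster(z, n_clusters, criterion="maxclust")
    ```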

  19. Tail-scope: Using friends to estimate heavy tails of degree distributions in large-scale complex networks

    NASA Astrophysics Data System (ADS)

    Eom, Young-Ho; Jo, Hang-Hyun

    2015-05-01

    Many complex networks in natural and social phenomena have often been characterized by heavy-tailed degree distributions. However, due to rapidly growing size of network data and concerns on privacy issues about using these data, it becomes more difficult to analyze complete data sets. Thus, it is crucial to devise effective and efficient estimation methods for heavy tails of degree distributions in large-scale networks only using local information of a small fraction of sampled nodes. Here we propose a tail-scope method based on local observational bias of the friendship paradox. We show that the tail-scope method outperforms the uniform node sampling for estimating heavy tails of degree distributions, while the opposite tendency is observed in the range of small degrees. In order to take advantages of both sampling methods, we devise the hybrid method that successfully recovers the whole range of degree distributions. Our tail-scope method shows how structural heterogeneities of large-scale complex networks can be used to effectively reveal the network structure only with limited local information.
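
    The sampling step that powers the tail-scope method is tiny: instead of recording the degree of a uniformly sampled node, record the degree of a random neighbor of that node, which by the friendship paradox preferentially lands on hubs. A sketch of that biased sampler (the bias correction that turns these counts into a tail estimate is the paper's contribution and is omitted here):

    ```python
    import random
    from collections import Counter

    def neighbor_degree_sample(graph, n_samples, seed=0):
        """Degrees of random neighbors of random nodes; graph maps node -> list
        of neighbors. Hubs are oversampled, resolving the distribution tail."""
        rng = random.Random(seed)
        nodes = list(graph)
        counts = Counter()
        for _ in range(n_samples):
            nbrs = graph[rng.choice(nodes)]
            if nbrs:
                counts[len(graph[rng.choice(nbrs)])] += 1
        return counts
    ```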

  20. Using an internet questionnaire to characterize bat survey efforts in the United States and Canada

    Treesearch

    Theodore J. Weller; William J. Zielinski

    2006-01-01

    Standardized survey methods are important for obtaining reliable information on wildlife populations. As a precursor to creating a regional bat-survey (Chiroptera) protocol, we distributed a questionnaire via e-mail to biologists responsible for conducting bat surveys in the United States and Canada. We received 415 responses from 45 states and 7 Canadian provinces or...

  1. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  2. Non-contact fluid characterization in containers using ultrasonic waves

    DOEpatents

    Sinha, Dipen N [Los Alamos, NM

    2012-05-15

    Apparatus and method for non-contact (stand-off) ultrasonic determination of certain characteristics of fluids in containers or pipes are described. A combination of swept frequency acoustic interferometry (SFAI), wide-bandwidth, air-coupled acoustic transducers, narrowband frequency data acquisition, and data conversion from the frequency domain to the time domain, if required, permits meaningful information to be extracted from such fluids.

  3. State Methods for a Cyber Incident

    DTIC Science & Technology

    2012-03-01

    Glossary: S905 - Incident Submission and Response Standard; S910 - Data Breach Notification Standard ... Our state characterizes information system ... Executive Office of the President, Office of Management and Budget (2011a). Legislative Language: Data Breach Notification. Retrieved September 20, 2010, from http://www.whitehouse.gov/sites/default/files/omb/legislative/letters/data-breach-notification.pdf

  4. Youth crime and preventive policing in post-war Scotland (c.1945-71).

    PubMed

    Bartie, Angela; Jackson, Louise A

    2011-01-01

    This article explores debates concerning the methods and styles used by the police service in its dealings with children and young people in post-war Scotland (in comparison with England). Study of the implementation of Police Juvenile Liaison Schemes is used to consider shifting points of tension as well as cooperation between the police and other occupational groups engaged in work at the nexus of youth justice-welfare. Whilst often characterized as contradictory tendencies, the article demonstrates that a social welfare ethic and a criminal justice ethic were coexistent within the rhetoric and practice of policing, but that they operated in a state of flux. It also argues that styles of policing were subject to change, particularly as the use of discretionary and informal methods was increasingly challenged, as physical violence was increasingly seen as an outmoded recourse for the institutions of criminal justice, and as the policing of youth was increasingly politicized. The post-war period can be characterized in terms of greater levels of public scrutiny, the formalization of processes previously undertaken through informal or semi-formal mechanisms, and attempts (not always successful) to systematize procedures nationally in terms of the Scottish state.

  5. Three-dimensional Hessian matrix-based quantitative vascular imaging of rat iris with optical-resolution photoacoustic microscopy in vivo

    NASA Astrophysics Data System (ADS)

    Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo

    2018-04-01

    For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of the vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging the blood vasculature in the iris. However, advanced vascular quantification algorithms are still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate the in vivo 3-D vascular structure of a rat iris with a label-free imaging method and accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
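
    Hessian-based vessel quantification generally starts from the eigenvalues of the Gaussian-smoothed second-derivative matrix at every voxel; for a bright tube, one eigenvalue is near zero (along the vessel) and two are strongly negative (across it). A generic sketch of that first stage (not the authors' specific algorithm; the scale parameter is arbitrary):

    ```python
    import numpy as np
    from scipy import ndimage

    def hessian_eigenvalues(volume, sigma=2.0):
        """Sorted eigenvalues of the Gaussian-regularized 3-D Hessian per voxel."""
        h = np.empty(volume.shape + (3, 3))
        for a in range(3):
            for b in range(3):
                order = [0, 0, 0]
                order[a] += 1
                order[b] += 1                 # differentiate once along a and b
                h[..., a, b] = ndimage.gaussian_filter(volume, sigma, order=order)
        return np.linalg.eigvalsh(h)          # ascending along the last axis
    ```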

  6. Better bioinformatics through usability analysis.

    PubMed

    Bolchini, Davide; Finkelstein, Anthony; Perrone, Vito; Nagl, Sylvia

    2009-02-01

    Improving the usability of bioinformatics resources enables researchers to find, interact with, share, compare and manipulate important information more effectively and efficiently. It thus enables researchers to gain improved insights into biological processes with the potential, ultimately, of yielding new scientific results. Usability 'barriers' can pose significant obstacles to a satisfactory user experience and force researchers to spend unnecessary time and effort to complete their tasks. The number of online biological databases available is growing and there is an expanding community of diverse users. In this context there is an increasing need to ensure the highest standards of usability. Using 'state-of-the-art' usability evaluation methods, we have identified and characterized a sample of usability issues potentially relevant to web bioinformatics resources, in general. These specifically concern the design of the navigation and search mechanisms available to the user. The usability issues we have discovered in our substantial case studies are undermining the ability of users to find the information they need in their daily research activities. In addition to characterizing these issues, specific recommendations for improvements are proposed leveraging proven practices from web and usability engineering. The methods and approach we exemplify can be readily adopted by the developers of bioinformatics resources.

  7. Purification and characterization of nattokinase from Bacillus subtilis natto B-12.

    PubMed

    Wang, Cong; Du, Ming; Zheng, Dongmei; Kong, Fandong; Zu, Guoren; Feng, Yibing

    2009-10-28

    Bacillus subtilis natto B-12 was isolated from natto, a traditional fermented soybean food in Japan. A fibrinolytic enzyme (B-12 nattokinase) was purified from the supernatant of B. subtilis natto B-12 culture broth and showed strong fibrinolytic activity. The enzyme was purified 56.1-fold, with a recovery of 43.2% of the initial activity. B-12 nattokinase was demonstrated to be homogeneous by SDS-PAGE and was identified as a monomer of 29,000 +/- 300 Da in its native state by SDS-PAGE and size-exclusion methods. The optimal pH and temperature were 8.0 and 40 degrees C, respectively. Purified nattokinase showed high thermostability at temperatures from 30 to 50 degrees C and alkaline stability within the range of pH 6.0-9.0. The enzyme activity was enhanced by Zn(2+) and markedly inhibited by Fe(3+) and Al(3+). This study provides important information on the factors affecting fibrinolytic activity, on purification methods, and on the characterization of nattokinase from B. subtilis natto B-12, enriching the theoretical basis for the research and development of nattokinase as a functional food additive.

  8. Influence of Flavors on the Propagation of E-Cigarette–Related Information: Social Media Study

    PubMed Central

    Zhou, Jiaqi; Zeng, Daniel Dajun; Tsui, Kwok Leung

    2018-01-01

    Background Modeling the influence of e-cigarette flavors on information propagation could provide quantitative decision support for policies concerning smoking initiation and contagion, as well as e-cigarette regulation. Objective The objective of this study was to characterize the influence of flavors on e-cigarette–related information propagation on social media. Methods We collected a comprehensive dataset of e-cigarette–related discussions from public Pages on Facebook. We identified 11 categories of flavors based on commonly used categorizations. Each post’s frequency of being shared served as a proxy measure of information propagation. We evaluated a set of regression models and chose the hurdle negative binomial model to characterize the influence of different flavors and nonflavor control variables on e-cigarette–related information propagation. Results We found that 5 flavors (sweet, dessert & bakery, fruits, herbs & spices, and tobacco) had significantly negative influences on e-cigarette–related information propagation, indicating users’ tendency not to share posts related to these flavors. We did not find a significantly positive influence for any flavor, which contradicts previous research. In addition, we found that a set of nonflavor-related factors were associated with information propagation. Conclusions Mentions of flavors in posts did not enhance the popularity of e-cigarette–related information. Certain flavors could even have reduced the popularity of information, indicating users’ lack of interest in flavors. Promoting e-cigarette–related information with mention of flavors is not an effective marketing approach. This study implies a potential concern of users about flavorings and suggests a need to regulate the use of flavorings in e-cigarettes. PMID:29572202
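
    For readers unfamiliar with hurdle models, the sketch below shows the two-part structure in Python with statsmodels: a logit for whether a post is shared at all, and a count model for how often it is shared given at least one share. It is a simplified approximation (a full hurdle model uses a zero-truncated negative binomial for the second part); the design matrix `X` of flavor indicators and controls is assumed.

        import numpy as np
        import statsmodels.api as sm

        # X: flavor indicators plus nonflavor controls; y: share counts per post
        def fit_hurdle(X, y):
            Xc = sm.add_constant(X)
            # part 1 (the hurdle): is the post shared at all?
            logit = sm.Logit((y > 0).astype(int), Xc).fit(disp=0)
            # part 2: how often, given at least one share; a plain NB on the
            # positive counts approximates the zero-truncated NB of a true hurdle
            pos = y > 0
            nb = sm.NegativeBinomial(y[pos], Xc[pos]).fit(disp=0)
            return logit, nb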

  9. Zinc Oxide—From Synthesis to Application: A Review

    PubMed Central

    Kołodziejczak-Radzimska, Agnieszka; Jesionowski, Teofil

    2014-01-01

    Zinc oxide can be called a multifunctional material thanks to its unique physical and chemical properties. The first part of this paper presents the most important methods of preparing ZnO, divided into metallurgical and chemical methods. The mechanochemical process, controlled precipitation, the sol-gel method, solvothermal and hydrothermal methods, methods using emulsion and microemulsion environments, and other methods of obtaining zinc oxide are classified as chemical methods. In the next part of this review, methods of modifying ZnO are characterized, chiefly modification with organic compounds (carboxylic acids, silanes), inorganic compounds (metal oxides), and polymer matrices. Finally, we present possible applications in various branches of industry: rubber, pharmaceuticals, cosmetics, textiles, electronics and electrotechnology, and photocatalysis. This review provides useful information for specialists dealing with zinc oxide. PMID:28788596

  10. Spectral Induced Polarization approaches to characterize reactive transport parameters and processes

    NASA Astrophysics Data System (ADS)

    Schmutz, M.; Franceschi, M.; Revil, A.; Peruzzo, L.; Maury, T.; Vaudelet, P.; Ghorbani, A.; Hubbard, S. S.

    2017-12-01

    For almost a decade, geophysical methods have been explored for their potential to characterize reactive transport parameters and processes relevant to hydrogeology, contaminant remediation, and oil and gas applications. Spectral Induced Polarization (SIP) methods show particular promise in this endeavour, given the sensitivity of the SIP signature to the electrical double layer properties of geological materials and the critical role of the electrical double layer in reactive transport processes such as adsorption. In this presentation, we discuss results from several recent studies performed to quantify the value of SIP parameters for characterizing reactive transport parameters. The advances have been realized by performing experimental studies and interpreting their responses using theoretical and numerical approaches. We describe a series of controlled experimental studies performed to quantify the SIP responses to variations in grain size and specific surface area, pore fluid geochemistry, and other factors. We also model chemical reactions at the fluid/matrix interface linked to part of our experimental data set. For some examples, both geochemical modelling and measurements are integrated into a physico-chemically based SIP model. Our studies indicate both the potential of and the opportunity for using SIP to estimate reactive transport parameters. For well-sorted samples, we find that grain size (as well as permeability, in some specific examples) can be estimated using SIP. We show that SIP is sensitive to physico-chemical conditions at the fluid/mineral interface, including different dissolved pore-fluid ions (Na+, Cu2+, Zn2+, Pb2+), owing to their different adsorption behavior. We also show the relevance of our approach for characterizing fluid/matrix interactions for various organic contents (wetting and non-wetting oils). We also discuss early efforts to jointly interpret SIP and other information for improved estimation, approaches to use SIP information to constrain mechanistic flow and transport models, and the potential to apply some of the approaches at the field scale.
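
    SIP spectra of this kind are commonly summarized with a Cole-Cole (Pelton) model, whose relaxation time relates to grain size and whose chargeability reflects polarization at the fluid/mineral interface. The sketch below, a generic illustration rather than the authors' model, fits such a spectrum with SciPy; the starting values and bounds are assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        def cole_cole(omega, rho0, m, tau, c):
            # Pelton-type Cole-Cole complex resistivity model
            return rho0 * (1 - m * (1 - 1 / (1 + (1j * omega * tau) ** c)))

        def residuals(p, omega, data):
            diff = cole_cole(omega, *p) - data
            return np.concatenate([diff.real, diff.imag])

        def fit_sip(omega, z, p0=(100.0, 0.1, 0.01, 0.5)):
            # omega: angular frequencies; z: measured complex resistivity
            out = least_squares(residuals, p0, args=(omega, z),
                                bounds=([0, 0, 1e-6, 0], [np.inf, 1, 1e3, 1]))
            return out.x  # rho0, chargeability m, relaxation time tau, exponent c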

  11. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors necessitates efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods (isarithm, variogram, and triangular prism), along with the spatial autocorrelation measures Moran's I and Geary's C, that have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of surfaces with higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Like the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
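
    The triangular prism method itself is compact enough to sketch: at each step size the surface is tiled with prisms whose four triangular facets connect the cell corners to the elevation at the cell centre, and the fractal dimension follows from the regression of log(total area) on log(step). The code below is a straightforward, unoptimized rendering of that idea, not the ICAMS implementation; the step sizes assume a grid of at least 32 x 32 cells.

        import numpy as np

        def tri_area(p1, p2, p3):
            # area of a 3-D triangle via the cross product of two edges
            return 0.5 * np.linalg.norm(np.cross(p2 - p1, p3 - p1))

        def triangular_prism_dimension(z, steps=(1, 2, 4, 8, 16)):
            areas = []
            for s in steps:
                total = 0.0
                for i in range(0, z.shape[0] - s, s):
                    for j in range(0, z.shape[1] - s, s):
                        a = np.array([i,     j,     z[i, j]])
                        b = np.array([i + s, j,     z[i + s, j]])
                        c = np.array([i + s, j + s, z[i + s, j + s]])
                        d = np.array([i,     j + s, z[i, j + s]])
                        e = np.array([i + s / 2, j + s / 2,
                                      (a[2] + b[2] + c[2] + d[2]) / 4])  # apex
                        total += sum(tri_area(*t) for t in
                                     ((a, b, e), (b, c, e), (c, d, e), (d, a, e)))
                areas.append(total)
            # D = 2 - slope of the log-log regression; the slope is negative
            # for rough surfaces, so D lies between 2 and 3
            slope = np.polyfit(np.log(steps), np.log(areas), 1)[0]
            return 2 - slope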

  12. Docking-based modeling of protein-protein interfaces for extensive structural and functional characterization of missense mutations.

    PubMed

    Barradas-Bautista, Didier; Fernández-Recio, Juan

    2017-01-01

    Next-generation sequencing (NGS) technologies are providing genomic information for an increasing number of healthy individuals and patient populations. In the context of the large amount of genomic data being generated, understanding the effect of disease-related mutations at the molecular level can help close the gap between genotype and phenotype and thus improve prevention, diagnosis, or treatment of a pathological condition. In order to fully characterize the effect of a pathological mutation and obtain information useful for prediction purposes, it is important first to identify whether the mutation is located at a protein-binding interface, and second to understand its effect on the binding affinity of the affected interaction(s). Computational methods such as protein docking are currently used to complement experimental efforts and could help to build the human structural interactome. Here we have extended the original pyDockNIP method to predict the location of disease-associated nsSNPs at protein-protein interfaces when there is no available structure for the protein-protein complex. We have applied this approach to the pathological interaction networks of six diseases with little structural data on PPIs. This approach can almost double the number of nsSNPs that can be characterized and identifies previously unknown edgetic effects of many nsSNPs. This can help to annotate and interpret genomic data from large-scale population studies and to achieve a better understanding of disease at the molecular level.

  13. Thermal and Chemical Characterization of Composite Materials. MSFC Center Director's Discretionary Fund Final Report, Project No. ED36-18

    NASA Technical Reports Server (NTRS)

    Stanley, D. C.; Huff, T. L.

    2003-01-01

    The purpose of this research effort was to: (1) provide a concise and well-defined property profile of current and developing composite materials using thermal and chemical characterization techniques and (2) optimize analytical testing requirements of materials. This effort applied a diverse array of methodologies to ascertain composite material properties. Often, a single method or technique will provide useful, but nonetheless incomplete, information on material composition and/or behavior. To more completely understand and predict material properties, a broad-based analytical approach is required. By developing a database of information comprising both thermal and chemical properties, material behavior under varying conditions may be better understood. This is even more important in the aerospace community, where new composite materials and those in the development stage have little reference data. For example, Fourier transform infrared (FTIR) spectroscopy spectral databases available for identification of vapor phase spectra, such as those generated during experiments, generally refer to well-defined chemical compounds. Because this method renders a unique thermal decomposition spectral pattern, even larger, more diverse databases, such as those found in solid and liquid phase FTIR spectroscopy libraries, cannot be used. By combining this and other available methodologies, a database specifically for new materials and materials being developed at Marshall Space Flight Center can be generated. In addition, characterizing materials using this approach will be extremely useful in the verification of materials and identification of anomalies in NASA-wide investigations.

  14. Detection of Memory B Activity Against a Therapeutic Protein in Treatment-Naïve Subjects.

    PubMed

    Liao, Karen; Derbyshire, Stacy; Wang, Kai-Fen; Caucci, Cherilyn; Tang, Shuo; Holland, Claire; Loercher, Amy; Gunn, George R

    2018-03-16

    Bridging immunoassays commonly used to detect and characterize immunogenicity during biologic development do not provide direct information on the presence or development of a memory anti-drug antibody (ADA) response. In this study, a B cell ELISPOT assay was used to evaluate pre-existing ADA against the anti-TNFR1 domain antibody GSK1995057, an experimental biologic, in treatment-naïve subjects. The assay utilized a 7-day activation of PBMCs by a combination of GSK1995057 (antigen) and a polyclonal stimulator, followed by a GSK1995057-specific ELISPOT for the enumeration of memory B cells that have differentiated into antibody-secreting cells (ASC) in vitro. We demonstrated that GSK1995057-specific ASC were detectable in treatment-naïve subjects with pre-existing ADA; the frequency of drug-specific ASC was low, ranging from 1 to 10 spot-forming units (SFU) per million cells. Interestingly, the frequency of drug-specific ASC correlated with the ADA level measured using an in vitro ADA assay. We further confirmed that the ASC originated from CD27+ memory B cells, not from CD27- naïve B cells. Our data demonstrate the utility of the B cell ELISPOT method in therapeutic protein immunogenicity evaluation, providing a novel way to confirm and characterize the cell population producing pre-existing ADA. This novel application of a B cell ELISPOT assay informs on and characterizes immune memory activity, regarding incidence and magnitude, associated with a pre-existing ADA response.

  15. Docking-based modeling of protein-protein interfaces for extensive structural and functional characterization of missense mutations

    PubMed Central

    2017-01-01

    Next-generation sequencing (NGS) technologies are providing genomic information for an increasing number of healthy individuals and patient populations. In the context of the large amount of genomic data being generated, understanding the effect of disease-related mutations at the molecular level can help close the gap between genotype and phenotype and thus improve prevention, diagnosis, or treatment of a pathological condition. In order to fully characterize the effect of a pathological mutation and obtain information useful for prediction purposes, it is important first to identify whether the mutation is located at a protein-binding interface, and second to understand its effect on the binding affinity of the affected interaction(s). Computational methods such as protein docking are currently used to complement experimental efforts and could help to build the human structural interactome. Here we have extended the original pyDockNIP method to predict the location of disease-associated nsSNPs at protein-protein interfaces when there is no available structure for the protein-protein complex. We have applied this approach to the pathological interaction networks of six diseases with little structural data on PPIs. This approach can almost double the number of nsSNPs that can be characterized and identifies previously unknown edgetic effects of many nsSNPs. This can help to annotate and interpret genomic data from large-scale population studies and to achieve a better understanding of disease at the molecular level. PMID:28841721

  16. Gene Ontology annotations at SGD: new data sources and annotation methods

    PubMed Central

    Hong, Eurie L.; Balakrishnan, Rama; Dong, Qing; Christie, Karen R.; Park, Julie; Binkley, Gail; Costanzo, Maria C.; Dwight, Selina S.; Engel, Stacia R.; Fisk, Dianna G.; Hirschman, Jodi E.; Hitz, Benjamin C.; Krieger, Cynthia J.; Livstone, Michael S.; Miyasato, Stuart R.; Nash, Robert S.; Oughtred, Rose; Skrzypek, Marek S.; Weng, Shuai; Wong, Edith D.; Zhu, Kathy K.; Dolinski, Kara; Botstein, David; Cherry, J. Michael

    2008-01-01

    The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org/) collects and organizes biological information about the chromosomal features and gene products of the budding yeast Saccharomyces cerevisiae. Although published data from traditional experimental methods are the primary sources of evidence supporting Gene Ontology (GO) annotations for a gene product, high-throughput experiments and computational predictions can also provide valuable insights in the absence of an extensive body of literature. Therefore, GO annotations available at SGD now include high-throughput data as well as computational predictions provided by the GO Annotation Project (GOA UniProt; http://www.ebi.ac.uk/GOA/). Because the annotation method used to assign GO annotations varies by data source, GO resources at SGD have been modified to distinguish data sources and annotation methods. In addition to providing information for genes that have not been experimentally characterized, GO annotations from independent sources can be compared to those made by SGD to help keep the literature-based GO annotations current. PMID:17982175

  17. Calling depths of baleen whales from single sensor data: development of an autocorrelation method using multipath localization.

    PubMed

    Valtierra, Robert D; Glynn Holt, R; Cholewiak, Danielle; Van Parijs, Sofie M

    2013-09-01

    Multipath localization techniques have not previously been applied to baleen whale vocalizations because of difficulties in applying them to tonal signals. Here it is shown that an autocorrelation method coupled with the direct-reflected time-difference-of-arrival localization technique can successfully resolve location information. A derivation was made to model the autocorrelation of a direct signal and its overlapping reflections, illustrating that an autocorrelation may be used to extract reflection information from longer-duration signals containing a frequency sweep, such as some calls produced by baleen whales. An analysis was performed to characterize how the behavior of the autocorrelation differs across call types with varying parameters (sweep rate, call duration). The method's feasibility was tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The method was then used to estimate the depth and range of a single North Atlantic right whale (Eubalaena glacialis) and humpback whale (Megaptera novaeangliae) from two separate experiments.
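
    The core signal-processing idea, namely that the autocorrelation of a frequency-swept call containing an echo shows a secondary peak at the direct-reflected time difference, can be illustrated in a few lines. The sketch below synthesizes a downswept call plus a delayed surface reflection and recovers the delay; the sample rate, sweep, delay, and reflection loss are all invented for the example.

        import numpy as np
        from scipy.signal import chirp, correlate

        fs = 4000.0                                  # sample rate (Hz)
        t = np.arange(0, 1.0, 1 / fs)
        call = chirp(t, f0=100, f1=50, t1=1.0)       # downswept tonal call
        delay_s, alpha = 0.050, 0.6                  # echo delay and loss
        d = int(delay_s * fs)
        sig = call.copy()
        sig[d:] += alpha * call[:-d]                 # direct + reflected arrival

        ac = correlate(sig, sig, mode='full')[len(sig) - 1:]  # one-sided ACF
        min_lag = int(0.03 * fs)                     # skip the zero-lag lobe
        lag = min_lag + np.argmax(ac[min_lag:])
        print(lag / fs)                              # recovered delay ~ 0.050 s

    In the paper's setting, this recovered time difference then feeds the direct-reflected time-difference-of-arrival geometry to yield depth and range.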

  18. Automated segmentation of retinal pigment epithelium cells in fluorescence adaptive optics images.

    PubMed

    Rangel-Fonseca, Piero; Gómez-Vieyra, Armando; Malacara-Hernández, Daniel; Wilson, Mario C; Williams, David R; Rossi, Ethan A

    2013-12-01

    Adaptive optics (AO) imaging methods allow the histological characteristics of retinal cell mosaics, such as photoreceptors and retinal pigment epithelium (RPE) cells, to be studied in vivo. The high-resolution images obtained with ophthalmic AO imaging devices are rich with information that is difficult and/or tedious to quantify using manual methods. Thus, robust, automated analysis tools that can provide reproducible quantitative information about the cellular mosaics under examination are required. Automated algorithms have been developed to detect the position of individual photoreceptor cells; however, most of these methods are not well suited for characterizing the RPE mosaic. We have developed an algorithm for RPE cell segmentation and show its performance here on simulated and real fluorescence AO images of the RPE mosaic. Algorithm performance was compared to manual cell identification and yielded better than 91% correspondence. This method can be used to segment RPE cells for morphometric analysis of the RPE mosaic and speed the analysis of both healthy and diseased RPE mosaics.

  19. [Lithology feature extraction of CASI hyperspectral data based on fractal signal algorithm].

    PubMed

    Tang, Chao; Chen, Jian-Ping; Cui, Jing; Wen, Bo-Tao

    2014-05-01

    Hyperspectral data are characterized by the combination of image and spectrum and by large data volumes; dimension reduction is thus the main research direction, with band selection and feature extraction the primary methods used for this objective. In the present article, the authors tested methods for lithology feature extraction from hyperspectral data. Based on the self-similarity of hyperspectral data, the authors explored the application of a fractal algorithm to lithology feature extraction from CASI hyperspectral data. The "carpet method" was corrected and then applied to calculate the fractal value of every pixel in the hyperspectral data. The results show that the fractal information highlights the exposed bedrock lithology better than the original hyperspectral data. The fractal signal and the characteristic scale are influenced by the spectral curve shape, the initial scale selection, and the iteration step. At present, research on the fractal signal of spectral curves is rare, implying the necessity of further quantitative analysis and investigation of its physical implications.

  20. Multipulse technique exploiting the intermodulation of ultrasound waves in a nonlinear medium.

    PubMed

    Biagi, Elena; Breschi, Luca; Vannacci, Enrico; Masotti, Leonardo

    2009-03-01

    In recent years, the nonlinear properties of materials have attracted much interest in nondestructive testing and in ultrasound diagnostic applications. Acoustic nonlinear parameters represent an opportunity to improve the information that can be extracted from a medium, such as the structural organization and pathologic status of tissue. In this paper, a method called pulse subtraction intermodulation (PSI), based on a multipulse technique, is presented and investigated both theoretically and experimentally. This method allows separation of the intermodulation products, which arise when 2 separate frequencies are transmitted in a nonlinear medium, from the fundamental and second harmonic components, making them available for improved imaging techniques or signal processing algorithms devoted to tissue characterization. The theory of intermodulation product generation was developed according to the Khokhlov-Zabolotskaya-Kuznetsov (KZK) nonlinear propagation equation and is consistent with experimental results. The description of the proposed method, characterization of the intermodulation spectral contents, and quantitative results from in vitro experimentation are reported and discussed in this paper.
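
    The principle of isolating intermodulation products by pulse subtraction can be demonstrated with a toy quadratic nonlinearity in place of full KZK propagation: transmitting the two tones separately and together, then subtracting, cancels the fundamentals and the individual harmonics and leaves only the cross terms. The sketch below is such a simplified illustration, not the published pulse sequence; the frequencies, sampling, and nonlinearity coefficient are assumptions.

        import numpy as np

        fs = 50e6                                   # sampling rate (Hz)
        t = np.arange(0, 4e-6, 1 / fs)
        f1, f2 = 2e6, 3e6                           # the two transmit tones
        x1 = np.sin(2 * np.pi * f1 * t)
        x2 = np.sin(2 * np.pi * f2 * t)

        def medium(x, eps=0.1):
            # toy quadratic nonlinearity standing in for KZK propagation
            return x + eps * x ** 2

        # separate and joint transmissions; the subtraction leaves 2*eps*x1*x2,
        # i.e. pure intermodulation at f2 - f1 and f1 + f2
        resid = medium(x1 + x2) - medium(x1) - medium(x2)
        spec = np.abs(np.fft.rfft(resid))
        freqs = np.fft.rfftfreq(len(t), 1 / fs)
        print(freqs[spec.argmax()])  # 1 MHz (difference); 5 MHz is equally strong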

  1. Environmental risk analysis of oil handling facilities in port areas. Application to Tarragona harbor (NE Spain).

    PubMed

    Valdor, Paloma F; Gómez, Aina G; Puente, Araceli

    2015-01-15

    Diffuse pollution from oil spills is a widespread problem in port areas (as a result of fuel supply, navigation and loading/unloading activities). This article presents a method to assess the environmental risk of oil handling facilities in port areas. The method is based on (i) identification of environmental hazards, (ii) characterization of meteorological and oceanographic conditions, (iii) characterization of environmental risk scenarios, and (iv) assessment of environmental risk. The procedure has been tested by application to the Tarragona harbor. The results show that the method is capable of representing (i) specific local pollution cases (i.e., discriminating between products and quantities released by a discharge source), (ii) oceanographic and meteorological conditions (selecting a representative subset data), and (iii) potentially affected areas in probabilistic terms. Accordingly, it can inform the design of monitoring plans to study and control the environmental impact of these facilities, as well as the design of contingency plans. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Spectrum of the Laplace-Beltrami operator and the phase structure of causal dynamical triangulations

    NASA Astrophysics Data System (ADS)

    Clemente, Giuseppe; D'Elia, Massimo

    2018-06-01

    We propose a new method to characterize the different phases observed in the nonperturbative numerical approach to quantum gravity known as causal dynamical triangulations. The method is based on the analysis of the eigenvalues and the eigenvectors of the Laplace-Beltrami operator computed on the triangulations: it generalizes previous works based on the analysis of diffusive processes and proves capable of providing more detailed information on the geometric properties of the triangulations. In particular, we apply the method to the analysis of spatial slices, showing that the different phases can be characterized by a new order parameter related to the presence or absence of a gap in the spectrum of the Laplace-Beltrami operator, and deriving an effective dimensionality of the slices at the different scales. We also propose quantities derived from the spectrum that could be used to monitor the running to the continuum limit around a suitable critical point in the phase diagram, if any is found.
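
    As a schematic illustration of the observables involved, the sketch below computes the Laplacian spectrum of a toy graph (a 2-D grid standing in for the dual graph of a spatial slice), reads off the gap above the zero mode, and estimates an effective spectral dimension from the low-lying eigenvalue counting N(lambda) ~ lambda^(d/2). This is a generic demonstration, not the authors' triangulation code.

        import numpy as np
        import networkx as nx

        g = nx.grid_2d_graph(20, 20)       # toy slice with spectral dimension 2
        lap = nx.laplacian_matrix(g).toarray().astype(float)
        eigs = np.sort(np.linalg.eigvalsh(lap))

        gap = eigs[1] - eigs[0]            # gap above the zero mode

        # effective dimension from N(lambda) ~ lambda^(d/2), fitted over the
        # low end of the spectrum where the asymptotics hold
        k = slice(1, 80)
        n = np.arange(len(eigs))[k]
        slope = np.polyfit(np.log(eigs[k]), np.log(n), 1)[0]
        print(gap, 2 * slope)              # small gap, effective dimension ~ 2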

  3. Mapping of ligand-binding cavities in proteins.

    PubMed

    Andersson, C David; Chen, Brian Y; Linusson, Anna

    2010-05-01

    The complex interactions between proteins and small organic molecules (ligands) are intensively studied because they play key roles in biological processes and drug activities. Here, we present a novel approach to characterize and map the ligand-binding cavities of proteins without direct geometric comparison of structures, based on Principal Component Analysis of cavity properties (related mainly to size, polarity, and charge). This approach can provide valuable information on the similarities and dissimilarities of binding cavities due to mutations, between-species differences, and flexibility upon ligand binding. The presented results show that information on ligand-binding cavity variations can complement information on protein similarity obtained from sequence comparisons. The predictive aspect of the method is exemplified by successful predictions of serine proteases that were not included in the model construction. The presented strategy to compare ligand-binding cavities of related and unrelated proteins has many potential applications within protein and medicinal chemistry, for example in the characterization and mapping of "orphan structures", selection of protein structures for docking studies in structure-based design, and identification of proteins for selectivity screens in drug design programs. 2009 Wiley-Liss, Inc.

  4. Location contexts of user check-ins to model urban geo life-style patterns.

    PubMed

    Hasan, Samiul; Ukkusuri, Satish V

    2015-01-01

    Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests in different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items: either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior.
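
    Probabilistic topic models of this kind treat each user as a 'document' whose 'words' are visited location categories (or neighborhoods). The sketch below shows the idea with scikit-learn's LDA on invented check-in sequences; the study's actual data, vocabulary, and model details are not reproduced here.

        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.feature_extraction.text import CountVectorizer

        # one "document" per user: the categories of that user's check-ins
        docs = ["coffee_shop office gym coffee_shop bar",
                "park playground museum park zoo",
                "office coffee_shop office bar nightclub"]

        vec = CountVectorizer()
        counts = vec.fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

        # top categories per inferred geo life-style pattern
        vocab = vec.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            print(k, [vocab[i] for i in topic.argsort()[::-1][:3]])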

  5. Thermal Performance Benchmarking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Xuhui; Moreno, Gilbert; Bennion, Kevin

    2016-06-07

    The goal of this project is to thoroughly characterize the thermal performance of state-of-the-art (SOA) in-production automotive power electronics and electric motor thermal management systems. Information obtained from these studies will be used to: evaluate advantages and disadvantages of different thermal management strategies; establish baseline metrics for the thermal management systems; identify methods of improvement to advance the SOA; increase the publicly available information related to automotive traction-drive thermal management systems; and help guide future electric drive technologies (EDT) research and development (R&D) efforts. The thermal performance results, combined with component efficiency and heat generation information obtained by Oak Ridge National Laboratory (ORNL), may then be used to determine the operating temperatures for the EDT components under drive-cycle conditions. In FY16, the 2012 Nissan LEAF and 2014 Honda Accord Hybrid power electronics thermal management systems were characterized. A comparison of the two systems was also conducted to provide insight into the various cooling strategies and the current SOA in thermal management for automotive power electronics and electric motors.

  6. Entropy of dynamical social networks

    NASA Astrophysics Data System (ADS)

    Zhao, Kun; Karsai, Marton; Bianconi, Ginestra

    2012-02-01

    Dynamical social networks are evolving rapidly and are highly adaptive. Characterizing the information encoded in social networks is essential to gain insight into their structure, evolution, adaptability, and dynamics. Recently, entropy measures have been used to quantify the information in email correspondence, static networks, and mobility patterns. Nevertheless, we still lack methods to quantify the information encoded in time-varying dynamical social networks. In this talk we present a model to quantify the entropy of dynamical social networks and use this model to analyze phone-call communication data. We show evidence that the entropy of the phone-call interaction network changes according to circadian rhythms. Moreover, we show that social networks are extremely adaptive and are modified by the use of technologies such as mobile phone communication. Indeed, the statistics of phone-call durations are described by a Weibull distribution and differ significantly from the distribution of durations of face-to-face interactions in a conference. Finally, we investigate how much the entropy of dynamical social networks changes in realistic models of phone-call or face-to-face interactions, characterizing in this way different types of human social behavior.
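
    The Weibull claim is easy to make concrete: given a sample of call durations, the shape parameter of a fitted Weibull distribution summarizes how strongly the durations deviate from the exponential (shape 1) case. The sketch below uses SciPy on synthetic durations; only the fitting call, not the data, reflects the talk.

        import numpy as np
        from scipy import stats

        # synthetic call durations (seconds); real logs would replace this
        durations = stats.weibull_min.rvs(0.7, scale=120, size=5000,
                                          random_state=0)

        # fit with the location pinned at zero, since durations are non-negative
        shape, loc, scale = stats.weibull_min.fit(durations, floc=0)
        print(shape, scale)   # shape < 1 indicates a heavy-tailed duration mix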

  7. Shotgun metagenomic data streams: surfing without fear

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berendzen, Joel R

    2010-12-06

    Timely information about bio-threat prevalence, consequence, propagation, attribution, and mitigation is needed to support decision-making, both routinely and in a crisis. One DNA sequencer can stream 25 Gbp of information per day, but sampling strategies and analysis techniques are needed to turn raw sequencing power into actionable knowledge. Shotgun metagenomics can enable biosurveillance at the level of a single city, hospital, or airplane. Metagenomics characterizes viruses and bacteria from complex environments such as soil, air filters, or sewage. Unlike targeted-primer-based sequencing, shotgun methods are not blind to sequences that are truly novel, and they can measure absolute prevalence. Shotgun metagenomic sampling can be non-invasive, efficient, and inexpensive while being informative. We have developed analysis techniques for shotgun metagenomic sequencing that rely upon phylogenetic signature patterns. They work by indexing local sequence patterns in a manner similar to web search engines. Our methods are laptop-fast, and favorable scaling properties ensure they will remain sustainable as sequencing throughput grows. We show examples of application to soil metagenomic samples.

  8. Radiological characterization of clay mixed red mud in particular as regards its leaching features.

    PubMed

    Hegedűs, Miklós; Sas, Zoltán; Tóth-Bodrogi, Edit; Szántó, Tamás; Somlai, János; Kovács, Tibor

    2016-10-01

    The reuse of industrial by-products such as red mud is of great importance. In the building material industry, the reuse of red mud requires a cautious attitude, since its enhanced radionuclide content can have an effect on human health. The natural radionuclide content of red mud from the Ajka red mud reservoir and of a clay sample from a Hungarian brick factory was determined by gamma spectrometry. It was found that a maximum of 27.8% red mud content can be added while still fulfilling the conditions of the EU-BSS. The effect of heat treatment was investigated on a red mud-clay mixture, and it was found that the applied heat remarkably reduced the radon and thoron exhalation capacities. The leaching features of red mud and different mixtures were studied according to the MSZ-21470-50 Hungarian standard, the British CEN/TS 14429 standard, and the Tessier sequential extraction method. The Tessier method and the MSZ-21470-50 standard are suitable for the characterization of materials; however, they do not provide enough information for waste deposition purposes. To this end, we propose using the CEN/TS 14429 method, because it is easy to use and gives detailed information about the material's behaviour under different pH conditions; however, further measurements are necessary. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Using radial NMR profiles to characterize pore size distributions

    NASA Astrophysics Data System (ADS)

    Deriche, Rachid; Treilhard, John

    2012-02-01

    Extracting information about axon diameter distributions in the brain is a challenging task which provides useful information for medical purposes; for example, the ability to characterize and monitor axon diameters would be useful in diagnosing and investigating diseases like amyotrophic lateral sclerosis (ALS)1 or autism.2 Three families of operators are defined by Ozarslan,3 whose action upon an NMR attenuation signal extracts the moments of the pore size distribution of the ensemble under consideration; a numerical method is also proposed to continuously reconstruct a discretely sampled attenuation profile using the eigenfunctions of the simple harmonic oscillator Hamiltonian: the SHORE basis. The work presented here extends Ozarslan's method to other bases that can offer a better description of attenuation signal behaviour; in particular, we propose the use of the radial Spherical Polar Fourier (SPF) basis. Testing is performed to contrast the efficacy of the radial SPF basis and the SHORE basis in practical attenuation signal reconstruction. The robustness of the method to additive noise is tested and analysed. We demonstrate that a low-order attenuation signal reconstruction outperforms a higher-order reconstruction in subsequent moment estimation under noisy conditions. We propose the simulated annealing algorithm for basis function scale parameter estimation. Finally, analytic expressions are derived and presented for the action of the operators on the radial SPF basis (obviating the need for numerical integration, thus avoiding a spectrum of possible sources of error).

  10. An effective parameter optimization technique for vibration flow field characterization of PP melts via LS-SVM combined with SALS in an electromagnetism dynamic extruder

    NASA Astrophysics Data System (ADS)

    Xian, Guangming

    2018-03-01

    A method for predicting the optimal vibration field parameters by least squares support vector machine (LS-SVM) is presented in this paper. One convenient and commonly used technique for characterizing the vibration flow field of polymer melt films is small-angle light scattering (SALS) in a visualized slit die of an electromagnetism dynamic extruder. The optimal values of vibration frequency and vibration amplitude, and the maximum light-intensity projection area, can be obtained by using LS-SVM for prediction. To illustrate this method and show its validity, polypropylene (PP) is used as the flowing material, and fifteen samples are tested at a screw rotation speed of 36 rpm. This paper first describes the SALS apparatus used to perform the experiments, then gives the theoretical basis of this new method, and finally details the experimental results for parameter prediction of the vibration flow field. It is demonstrated that it is possible to use the SALS method and obtain detailed information on the optimal parameters of the vibration flow field of PP melts by LS-SVM.
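
    LS-SVM regression with an RBF kernel is closely related to kernel ridge regression, so the prediction step can be sketched with scikit-learn: fit the measured (frequency, amplitude) -> intensity-area samples, then search a grid of operating points for the predicted optimum. Everything numerical below (the synthetic response surface, kernel settings, grid) is invented for illustration.

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        # 15 toy samples: columns = vibration frequency (Hz), amplitude (mm);
        # target = maximum light-intensity projection area from SALS images
        rng = np.random.default_rng(0)
        X = rng.uniform([1.0, 0.1], [15.0, 1.0], size=(15, 2))
        y = 50 - (X[:, 0] - 8) ** 2 - 30 * (X[:, 1] - 0.5) ** 2

        # kernel ridge regression as a stand-in for the LS-SVM regressor
        model = KernelRidge(kernel='rbf', alpha=0.1, gamma=0.5).fit(X, y)

        # scan the operating window for the predicted optimal parameters
        grid = np.array([[f, a] for f in np.linspace(1, 15, 60)
                                for a in np.linspace(0.1, 1.0, 30)])
        best = grid[model.predict(grid).argmax()]
        print(best)          # predicted optimal (frequency, amplitude)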

  11. Analysis of biomedical time signals for characterization of cutaneous diabetic micro-angiopathy

    NASA Astrophysics Data System (ADS)

    Kraitl, Jens; Ewald, Hartmut

    2007-02-01

    Photo-plethysmography (PPG) is frequently used in research on the microcirculation of blood. It is a non-invasive procedure and takes minimal time to carry out. Usually PPG time series are analyzed by conventional linear methods, mainly Fourier analysis. These methods may not be optimal for investigating nonlinear effects of the heart and circulatory system, such as vasomotion, autoregulation, thermoregulation, breathing, heartbeat, and vessel dynamics. Wavelet analysis of the PPG time series is a specific, sensitive nonlinear method for the in vivo identification of heart circulation patterns and human health status. This nonlinear analysis of PPG signals provides additional information which cannot be detected using conventional approaches. The wavelet analysis has been used to study healthy subjects and to characterize the health status of patients with a functional cutaneous microangiopathy associated with diabetic neuropathy. The non-invasive in vivo method is based on the radiation of monochromatic light through an area of skin on the finger. A Photometrical Measurement Device (PMD) has been developed. The PMD is suitable for non-invasive continuous online monitoring of one or more biologic constituent values and blood circulation patterns.
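
    A wavelet analysis of this kind typically computes a continuous wavelet transform and inspects the energy in the frequency bands associated with heartbeat, respiration, and vasomotion. The sketch below applies a Morlet CWT (via PyWavelets) to a synthetic PPG-like signal; the component frequencies and amplitudes are invented stand-ins for real finger PPG data.

        import numpy as np
        import pywt

        fs = 100.0                                   # PPG sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        # toy PPG: cardiac (~1.2 Hz), respiratory (~0.25 Hz), vasomotion (~0.1 Hz)
        ppg = (np.sin(2 * np.pi * 1.2 * t)
               + 0.5 * np.sin(2 * np.pi * 0.25 * t)
               + 0.3 * np.sin(2 * np.pi * 0.1 * t))

        # scales chosen so the Morlet CWT covers 0.05-5 Hz
        target = np.geomspace(0.05, 5, 80)
        scales = pywt.central_frequency('morl') * fs / target
        coeffs, freqs = pywt.cwt(ppg, scales, 'morl', sampling_period=1 / fs)
        power = np.abs(coeffs) ** 2                  # time-frequency energy map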

  12. Performance Analysis of Continuous Black-Box Optimization Algorithms via Footprints in Instance Space.

    PubMed

    Muñoz, Mario A; Smith-Miles, Kate A

    2017-01-01

    This article presents a method for the objective assessment of an algorithm's strengths and weaknesses. Instead of examining the performance of only one or more algorithms on a benchmark set, or generating custom problems that maximize the performance difference between two algorithms, our method quantifies both the nature of the test instances and the algorithm performance. Our aim is to gather information about possible phase transitions in performance, that is, the points at which a small change in problem structure produces algorithm failure. The method is based on the accurate estimation and characterization of algorithm footprints, that is, the regions of instance space in which good or exceptional performance is expected from an algorithm. A footprint can be estimated for each algorithm and for the overall portfolio. Therefore, we select a set of features to generate a common instance space, which we validate by constructing a sufficiently accurate prediction model. We characterize the footprints by their area and density. Our method identifies complementary performance between algorithms, quantifies the common features of hard problems, and locates regions where a phase transition may lie.

  13. Integrated DNA walking system to characterize a broad spectrum of GMOs in food/feed matrices.

    PubMed

    Fraiture, Marie-Alice; Herman, Philippe; Lefèvre, Loic; Taverniers, Isabel; De Loose, Marc; Deforce, Dieter; Roosens, Nancy H

    2015-08-14

    In order to provide a system fully integrated with the qPCR screening usually used in GMO routine analysis, and able to detect, characterize, and identify a broad spectrum of GMOs in food/feed matrices, two bidirectional DNA walking methods targeting p35S or tNOS, the most common transgenic elements found in GM crops, were developed. These newly developed DNA walking methods complement the previously implemented DNA walking method targeting the t35S pCAMBIA element. Food/feed matrices containing transgenic crops (Bt rice or MON863 maize) were analysed using the integrated DNA walking system. First, the newly developed DNA walking methods, anchored on the sequences used for the p35S or tNOS qPCR screening, were tested on Bt rice, which contains these two transgenic elements. Second, the methods were assessed on a maize sample containing a low amount of the GM MON863 event, representing a more complex matrix in terms of genome size and sensitivity. Finally, to illustrate its applicability in GMO routine analysis by enforcement laboratories, the entire workflow of the integrated strategy, including qPCR screening to detect the potential presence of GMOs and the subsequent DNA walking methods to characterize and identify the detected GMOs, was applied to a GeMMA Scheme Proficiency Test matrix. Via the characterization of the transgene flanking region between the transgenic cassette and the plant genome, as well as of a part of the transgenic cassette, the presence of GMOs was properly confirmed or ruled out in all tested samples. Owing to their simple procedure and the short time frame needed to obtain results, the DNA walking methods proposed here can easily be implemented in GMO routine analysis by enforcement laboratories. In providing crucial information about the transgene flanking regions and/or the transgenic cassettes, this DNA walking strategy is a key molecular tool for proving the presence of GMOs in any given food/feed matrix.

  14. Improved Measures of Integrated Information

    PubMed Central

    Tegmark, Max

    2016-01-01

    Although there is growing interest in measuring integrated information in computational and cognitive systems, current methods for doing so in practice are computationally infeasible. Existing and novel integration measures are investigated and classified by various desirable properties. A simple taxonomy of Φ-measures is presented, in which each measure is characterized by its choice of factorization method (5 options), probability distributions to compare (3 × 4 options), and measure for comparing probability distributions (7 options). When the Φ-measures are required to satisfy a minimum of attractive properties, these hundreds of options reduce to a mere handful, some of which turn out to be identical. Useful exact and approximate formulas are derived that can be applied to real-world data from laboratory experiments without posing unreasonable computational demands. PMID:27870846

  15. From the outside looking in: developing snapshot imaging spectro-polarimeters

    NASA Astrophysics Data System (ADS)

    Dereniak, E. L.

    2014-09-01

    The information from a scene is critical in autonomous optical systems, and the variety of information that can be extracted is determined by the application. To characterize a target, the information of interest captured is spectral (λ), polarization (S) and distance (Z). There are many technologies that capture this information in different ways to identify the target. In many fields, such as mining and military reconnaissance, there is a need for rapid data acquisition and, for this reason, a relatively new method has been devised that can obtain all this information simultaneously. The need for snapshot acquisition of data without moving parts was the goal of the research. This paper reviews the chain of novel research instruments that were sequentially developed to capture spectral and polarization information of a scene in a snapshot or flash. The distance (Z) is yet to be integrated.

  16. Method development and strategy for the characterization of complexly faulted and fractured rhyolitic tuffs, Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karasaki, K.; Galloway, D.

    1991-06-01

    The planned high-level nuclear waste repository at Yucca Mountain, Nevada, would exist in unsaturated, fractured welded tuff. One possible contaminant pathway to the accessible environment is transport by groundwater infiltrating to the water table and flowing through the saturated zone. Therefore, an effort to characterize the hydrology of the saturated zone is being undertaken in parallel with that of the unsaturated zone. As a part of the saturated zone investigation, three wells (UE-25c#1, UE-25c#2, and UE-25c#3, hereafter called the c-holes) were drilled to study hydraulic and transport properties of rock formations underlying the planned waste repository. The location of the c-holes is such that the formations penetrated in the unsaturated zone occur at similar depths and with similar thicknesses as at the planned repository site. In characterizing a highly heterogeneous flow system, several issues emerge. (1) The characterization strategy should allow for the virtual impossibility of enumerating and characterizing all heterogeneities. (2) The methodology to characterize the heterogeneous flow system at the scale of the well tests needs to be established. (3) Tools need to be developed for scaling up the information obtained at the well-test scale to the larger scale of the site. In the present paper, the characterization strategy and the methods under development are discussed, with a focus on the design and analysis of the field experiments at the c-holes.

  17. Diagnostic value of contrast-enhanced ultrasonography in the characterization of ovarian tumors

    PubMed Central

    Sconfienza, L.M.; Perrone, N.; Delnevo, A.; Lacelli, F.; Murolo, C.; Gandolfo, N.; Serafini, G.

    2009-01-01

    Introduction Vascularity influences the characteristics of gynecologic tumors observed with direct imaging techniques that reveal the macrovascular component of these lesions (color and power Doppler) and with indirect imaging involving the administration of contrast agents to examine the microcirculation and interstitial perfusion (contrast-enhanced computed tomography [CT] and magnetic resonance [MR] imaging). The purpose of this study was to determine whether contrast-enhanced ultrasonography (CEUS) of ovarian lesions provides useful information that cannot be obtained with conventional US. Materials and methods We used CEUS to assess 72 nonspecific adnexal lesions in 61 patients. CEUS was performed with a 4.8-ml bolus of a second-generation ultrasonographic contrast agent and dedicated imaging algorithms. For each lesion, B-mode morphology, CEUS morphology, and time/intensity curves were evaluated. Results In 8/61 cases (13.1%), CEUS offered no additional morphovascular information. In 38/61 cases (62.3%), it provided additional information that did not modify the management of the lesion, and in 15/61 cases (24.6%) it gave additional information that modified the management of the lesion. Malignant lesions were characterized by significantly shorter times to peak enhancement (11.9 ± 3.1 s vs 19.8 ± 4.0 s, p < 0.01) and significantly higher peak intensity (24.7 ± 4.2 dB vs 17.8 ± 3.3 dB, p < 0.01) compared with benign lesions. Conclusions CEUS improves diagnostic confidence in the characterization of liquid-corpuscular lesions where conventional US is inconclusive. CEUS can be proposed as a valid alternative to CT and MR. However, information obtained by CEUS influences the therapy in a limited percentage of cases (24.6%). PMID:23396092

  18. Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints

    NASA Technical Reports Server (NTRS)

    Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.

    2000-01-01

    The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with Wright State University, Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxic agent, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters (sperm concentration, motility, and morphology) were evaluated to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive, and sensitive method for characterizing reproductive toxicity. The results presented in this poster report validation information showing that exposure to methoxyacetic acid causes reproductive toxicity and that inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest to formulate permissible exposure limits.

  19. Ronchi test for characterization of nanofocusing optics at a hard x-ray free-electron laser.

    PubMed

    Nilsson, Daniel; Uhlén, Fredrik; Holmberg, Anders; Hertz, Hans M; Schropp, Andreas; Patommel, Jens; Hoppe, Robert; Seiboth, Frank; Meier, Vivienne; Schroer, Christian G; Galtier, Eric; Nagler, Bob; Lee, Hae Ja; Vogt, Ulrich

    2012-12-15

    We demonstrate the use of the classical Ronchi test to characterize aberrations in focusing optics at a hard x-ray free-electron laser. A grating is placed close to the focus and the interference between the different orders after the grating is observed in the far field. Any aberrations in the beam or the optics will distort the interference fringes. The method is simple to implement and can provide single-shot information about the focusing quality. We used the Ronchi test to measure the aberrations in a nanofocusing Fresnel zone plate at the Linac Coherent Light Source at 8.194 keV.
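
    A toy model conveys how aberrations show up in a Ronchigram: the grating makes sheared copies of the wavefront interfere, so the fringe pattern encodes the wavefront difference across the shear. The sketch below is a simplified two-beam shear picture (real Ronchigrams involve several diffraction orders); the Zernike coefficients, shear, and carrier frequency are arbitrary.

        import numpy as np

        n = 512
        x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
        r2 = x ** 2 + y ** 2
        pupil = r2 <= 1.0

        # wavefront error in waves: defocus plus primary spherical aberration
        W = 0.3 * (2 * r2 - 1) + 0.15 * (6 * r2 ** 2 - 6 * r2 + 1)

        # two-beam shear model: interference of the wavefront with a copy
        # sheared by a fraction of the pupil; aberrations bend the fringes
        shear_px = int(0.08 * n / 2)
        Ws = np.roll(W, shear_px, axis=1)           # edge wrap hidden by pupil
        carrier = 2 * np.pi * x / 0.1               # straight fringes if perfect
        ronchigram = pupil * 0.5 * (1 + np.cos(carrier + 2 * np.pi * (W - Ws)))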

  20. Extraction and textural characterization of above-ground areas from aerial stereo pairs: a quality assessment

    NASA Astrophysics Data System (ADS)

    Baillard, C.; Dissard, O.; Jamet, O.; Maître, H.

    Above-ground analysis is key to the reconstruction of urban scenes, but it is a difficult task because of the diversity of the objects involved. We propose a new method for above-ground extraction from an aerial stereo pair which does not require any assumption about object shape or nature. A Digital Surface Model is first produced by a stereoscopic matching stage that preserves discontinuities, and then processed by a region-based Markovian classification algorithm. The extracted above-ground areas are finally characterized as man-made or natural according to the grey-level information. The quality of the results is assessed and discussed.

  1. New secondary batteries utilizing electronically conductive polymer cathodes

    NASA Technical Reports Server (NTRS)

    Martin, Charles R.; White, Ralph E.

    1989-01-01

    The objectives of this project are to characterize the transport properties in electronically conductive polymers and to assess the utility of these films as cathodes in lithium/polymer secondary batteries. During this research period, progress has been made on a literature survey of the historical background, methods of preparation, physical and chemical properties, and potential technological applications of polythiophene. Progress has also been made in the characterization of polypyrrole flat films and fibrillar films. Cyclic voltammetry and potential-step chronocoulometry were used to gain information on peak currents and potentials, switching reaction rates, charge capacity, and charge retention.

  2. Bayesian approach to analyzing holograms of colloidal particles.

    PubMed

    Dimiduk, Thomas G; Manoharan, Vinothan N

    2016-10-17

    We demonstrate a Bayesian approach to tracking and characterizing colloidal particles from in-line digital holograms. We model the formation of the hologram using Lorenz-Mie theory. We then use a tempered Markov-chain Monte Carlo method to sample the posterior probability distributions of the model parameters: particle position, size, and refractive index. Compared to least-squares fitting, our approach allows us to more easily incorporate prior information about the parameters and to obtain more accurate uncertainties, which are critical for both particle tracking and characterization experiments. Our approach also eliminates the need to supply accurate initial guesses for the parameters, so it requires little tuning.
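
    The posterior-sampling step generalizes well beyond holography; the sketch below shows the same pattern with the ensemble sampler emcee and a stand-in forward model (the real work uses Lorenz-Mie scattering and a tempered sampler). The model, priors, and noise level here are all invented.

        import numpy as np
        import emcee

        def model(theta, x):
            z, r = theta                     # stand-in 'position' and 'size'
            return 1 + r * np.sin(z * x) / (z * x + 1e-9)

        def log_prob(theta, x, data, sigma):
            z, r = theta
            if not (0 < z < 10 and 0 < r < 2):      # flat box priors
                return -np.inf
            resid = data - model(theta, x)
            return -0.5 * np.sum((resid / sigma) ** 2)

        x = np.linspace(0.1, 20, 200)
        data = model((4.0, 0.8), x) \
               + 0.01 * np.random.default_rng(0).normal(size=x.size)

        ndim, nwalkers = 2, 16
        p0 = np.array([4.0, 0.8]) \
             + 1e-3 * np.random.default_rng(1).normal(size=(nwalkers, ndim))
        sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob,
                                        args=(x, data, 0.01))
        sampler.run_mcmc(p0, 2000)
        samples = sampler.get_chain(discard=500, flat=True)
        print(samples.mean(axis=0), samples.std(axis=0))  # posterior + errors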

  3. Endoscopic ultrasound: Elastographic lymph node evaluation.

    PubMed

    Dietrich, Christoph F; Jenssen, Christian; Arcidiacono, Paolo G; Cui, Xin-Wu; Giovannini, Marc; Hocke, Michael; Iglesias-Garcia, Julio; Saftoiu, Adrian; Sun, Siyu; Chiorean, Liliana

    2015-01-01

    Different imaging techniques provide different information that contributes to the final diagnosis and further management of patients. Since the time of Hippocrates, palpation has been used to detect and characterize a body mass. So-called virtual palpation has now become a reality thanks to elastography, a recently developed technique. Elastography has already proven its added value as a complementary imaging method, helping to better characterize and differentiate between benign and malignant masses. The current applications of elastography in lymph node (LN) assessment by endoscopic ultrasonography are discussed in this paper, with a review of the literature and future perspectives.

  4. A-TEEM™, a new molecular fingerprinting technique: simultaneous absorbance-transmission and fluorescence excitation-emission matrix method

    NASA Astrophysics Data System (ADS)

    Quatela, Alessia; Gilmore, Adam M.; Steege Gall, Karen E.; Sandros, Marinella; Csatorday, Karoly; Siemiarczuk, Alex; Yang, Boqian (Ben); Camenen, Loïc

    2018-04-01

    We investigate the new simultaneous absorbance-transmission and fluorescence excitation-emission matrix (A-TEEM) method for rapid and effective characterization of the varying components of a mixture. The method uniquely facilitates correction of fluorescence inner-filter effects to yield quantitative fluorescence spectral information that is largely independent of component concentration. This is significant because it allows one to effectively monitor quantitative component changes using multivariate methods and to generate and evaluate spectral libraries. We present the use of this novel instrument in different fields, e.g., tracking changes in complex mixtures including natural water and wine, as well as monitoring the stability and aggregation of hormones for biotherapeutics.
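
    The inner-filter correction that the simultaneous absorbance measurement enables is the classical one: each EEM point is rescaled by 10 raised to the mean of the absorbances at its excitation and emission wavelengths. A minimal sketch, assuming a 1 cm cell with centred detection (the array shapes are illustrative):

        import numpy as np

        def ife_correct(eem, a_ex, a_em):
            # eem:  observed fluorescence EEM, shape (n_em, n_ex)
            # a_ex: absorbance at each excitation wavelength, shape (n_ex,)
            # a_em: absorbance at each emission wavelength, shape (n_em,)
            # classical primary + secondary inner-filter correction:
            # F_corr = F_obs * 10 ** ((A_ex + A_em) / 2)
            factor = 10.0 ** ((a_ex[None, :] + a_em[:, None]) / 2)
            return eem * factor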

  5. Characterization and quantification of flavonoids and saponins in adzuki bean (Vigna angularis L.) by HPLC-DAD-ESI-MSn analysis.

    PubMed

    Liu, Rui; Cai, Zongwei; Xu, Baojun

    2017-09-22

    Bioactive activities of adzuki bean have been widely reported; however, the phytochemical information on adzuki bean is incomplete. The aim of this study was to characterize and quantify flavonoids and saponins in adzuki bean. High-performance liquid chromatography with diode array detection and electrospray ionization tandem multi-stage mass spectrometry (HPLC-DAD-ESI-MSn) was applied for qualitative and quantitative analyses. A total of 15 compounds from adzuki bean were identified by HPLC-DAD-ESI-MSn. Among the 15 compounds identified, four flavonoids (catechin, vitexin-4″-O-glucoside, quercetin-3-O-glucoside, and quercetin-3-O-rutinoside) and six saponins (azukisaponins I, II, III, IV, V, and VI) were further quantified by an external calibration method using HPLC-MS with time-segment and extracted ion chromatogram (EIC) analysis. The current qualitative and quantitative method based on HPLC and MS techniques provides a scientific basis for future in vitro and in vivo pharmacological studies. Graphical abstract: Isolation and characterization of flavonoids and saponins from adzuki bean.

  6. Experimental Characterization of Guided Waves by Their Surface Displacement Vector Field

    NASA Astrophysics Data System (ADS)

    Barth, M.; Köhler, B.; Schubert, L.

    2009-03-01

    The development of new nondestructive evaluation (NDE) and structural health monitoring (SHM) methods utilizing guided elastic waves requires a good understanding of wave propagation properties and of the interaction of the waves with structures and defects. If the geometrical and stiffness properties of the components are well known, these effects can be studied very efficiently by numerical modeling. Very often, however, precise knowledge of all the necessary elastic properties is lacking; in these cases there is no alternative to accurate, non-disturbing measurements. The mapping of wave fields can be done by scanning laser vibrometers, as demonstrated in a number of cases. A laser vibrometer, however, natively provides information on only one displacement component. To obtain all three displacement components, simultaneous measurement with three vibrometers is offered commercially, but this is a very expensive approach. The paper describes a method which uses a single vibrometer sequentially to obtain all three vector components. It allows additional parameters characterizing the wave modes, such as the ellipticity, to be determined. The capability of this approach is demonstrated for the characterization of Lamb waves.

  7. Analyzing Information Seeking and Drug-Safety Alert Response by Health Care Professionals as New Methods for Surveillance

    PubMed Central

    Pernek, Igor; Stiglic, Gregor; Leskovec, Jure; Strasberg, Howard R; Shah, Nigam Haresh

    2015-01-01

    Background Patterns in general consumer online search logs have been used to monitor health conditions and to predict health-related activities, but the multiple contexts within which consumers perform online searches make significant associations difficult to interpret. Physician information-seeking behavior has typically been analyzed through survey-based approaches and literature reviews. Activity logs from health care professionals using online medical information resources are thus a valuable yet relatively untapped resource for large-scale medical surveillance. Objective To analyze health care professionals’ information-seeking behavior and assess the feasibility of measuring drug-safety alert response from the usage logs of an online medical information resource. Methods Using two years (2011-2012) of usage logs from UpToDate, we measured the volume of searches related to medical conditions with significant burden in the United States, as well as the seasonal distribution of those searches. We quantified the relationship between searches and resulting page views. Using a large collection of online mainstream media articles and Web log posts we also characterized the uptake of a Food and Drug Administration (FDA) alert via changes in UpToDate search activity compared with general online media activity related to the subject of the alert. Results Diseases and symptoms dominate UpToDate searches. Some searches result in page views of only short duration, while others consistently result in longer-than-average page views. The response to an FDA alert for Celexa, characterized by a change in UpToDate search activity, differed considerably from general online media activity. Changes in search activity appeared later and persisted longer in UpToDate logs. The volume of searches and page view durations related to Celexa before the alert also differed from those after the alert. Conclusions Understanding the information-seeking behavior associated with online evidence sources can offer insight into the information needs of health professionals and enable large-scale medical surveillance. Our Web log mining approach has the potential to monitor responses to FDA alerts at a national level. Our findings can also inform the design and content of evidence-based medical information resources such as UpToDate. PMID:26293444

  8. Ames expedited site characterization demonstration at the former manufactured gas plant site, Marshalltown, Iowa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevolo, A.J.; Kjartanson, B.H.; Wonder, J.D.

    1996-03-01

    The goal of the Ames Expedited Site Characterization (ESC) project is to evaluate and promote both innovative technologies (IT) and state-of-the-practice technologies (SOPT) for site characterization and monitoring. In April and May 1994, the ESC project conducted site characterization, technology comparison, and stakeholder demonstration activities at a former manufactured gas plant (FMGP) owned by Iowa Electric Services (IES) Utilities, Inc., in Marshalltown, Iowa. Three areas of technology were fielded at the Marshalltown FMGP site: geophysical, analytical and data integration. The geophysical technologies are designed to assess the subsurface geological conditions so that the location, fate and transport of the target contaminants may be assessed and forecasted. The analytical technologies/methods are designed to detect and quantify the target contaminants. The data integration technology area consists of hardware and software systems designed to integrate all the site information compiled and collected into a conceptual site model on a daily basis at the site; this conceptual model then becomes the decision-support tool. Simultaneous fielding of different methods within each of the three areas of technology provided data for direct comparison of the technologies fielded, both SOPT and IT. This document reports the results of the site characterization, technology comparison, and ESC demonstration activities associated with the Marshalltown FMGP site. 124 figs., 27 tabs.

  9. Design of diversity and focused combinatorial libraries in drug discovery.

    PubMed

    Young, S Stanley; Ge, Nanxiang

    2004-05-01

    Using well-characterized chemical reactions and readily available monomers, chemists are able to create sets of compounds, termed libraries, which are useful in drug discovery processes. The design of combinatorial chemical libraries can be complex, and much has recently been published offering suggestions on how the design process can be carried out. This review surveys that literature with the goal of organizing current thinking. At this point, benchmarking of the currently suggested methods is needed more than further new methods.

  10. Analogy between gambling and measurement-based work extraction

    NASA Astrophysics Data System (ADS)

    Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri

    2016-04-01

    In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e. using ‘information engines’. However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g. due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.
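
    The stated result, that side information raises the optimal (Kelly) log-growth rate by exactly the mutual information I(X;Y), can be checked numerically. Below is a minimal Python sketch for a hypothetical binary race with fair odds; the joint distribution is invented for illustration.

    ```python
    import numpy as np

    # Joint distribution p(x, y) of outcome X and side information Y
    p_xy = np.array([[0.40, 0.10],    # x = 0
                     [0.15, 0.35]])   # x = 1
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    odds = 1.0 / p_x                  # fair odds

    # Kelly growth without side information (bet b(x) = p(x))
    w0 = np.sum(p_x * np.log2(p_x * odds))            # zero for fair odds

    # Kelly growth with side information (bet b(x|y) = p(x|y))
    p_x_given_y = p_xy / p_y
    w1 = np.sum(p_xy * np.log2(p_x_given_y * odds[:, None]))

    # Mutual information I(X;Y)
    mi = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))

    print(w1 - w0, mi)   # the two values agree (~0.191 bits here)
    ```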

  11. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    PubMed

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes where CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU process yielded a complex with the expected chemical and physical fingerprints of a CUR:CD inclusion complex, which maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Use of bioclimatic indexes to characterize phenological phases of apple varieties in Northern Italy.

    PubMed

    Valentini, N; Me, G; Ferrero, R; Spanna, F

    2001-11-01

    The research was designed to characterize the phenological behaviour of different apple varieties and to compare different bioclimatic indexes in order to evaluate their suitability for describing the phenological phases of fruit species. A field study on the winter chilling requirement (chilling units) and the accumulation of growing degree hours of 15 native apple cultivars was carried out in a fruit-growing area in North West Italy (Cuneo Province, Piedmont). From 1991 to 1993, climatic data were collected at meteorological stations installed in an experimental orchard (Verzuolo, Cuneo). Four methods were compared to determine the winter chilling requirement: Hutchins, Weinberger-Eggert, Utah and North Carolina. The Utah method was applied to determine the time when the accumulated chilling units become effective in meeting the rest requirements. A comparison of the different methods indicated that the Weinberger-Eggert method is the best, as it showed the lowest statistical variability during the 3 years of observations. The growing degree hour (GDH) requirement was estimated by the North Carolina method with two different base temperatures: 4.4 degrees C and 6.1 degrees C. More difficulties were met when the date of rest completion and the beginning of GDH accumulation were determined. The best base temperature for the estimation of GDH is 4.4 degrees C. Phenological and climatic characterizations are two basic tools for giving farmers and agricultural advisors important information about which varieties to choose and which cultivation practices are best suited to them.
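
    As a rough illustration of the degree-hour accumulation compared above, here is a simplified linear growing-degree-hour computation with the two base temperatures from the study. The full North Carolina model also involves optimum and critical temperatures, so this sketch (with invented hourly data) only conveys the idea.

    ```python
    def growing_degree_hours(hourly_temps, t_base=4.4):
        """Linear growing-degree-hour accumulation above a base temperature
        (simplified; not the full North Carolina model)."""
        return sum(max(0.0, t - t_base) for t in hourly_temps)

    # Invented hourly temperatures (deg C) for one mild spring day
    day = [3, 2, 2, 3, 4, 6, 8, 11, 13, 15, 16, 17,
           18, 18, 17, 16, 14, 12, 10, 8, 7, 6, 5, 4]
    print(growing_degree_hours(day))              # base 4.4 deg C
    print(growing_degree_hours(day, t_base=6.1))  # base 6.1 deg C
    ```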

  13. New method for evaluating irreversible adsorption and stationary phase bleed in gas chromatographic capillary columns.

    PubMed

    Wright, Bob W; Wright, Cherylyn W

    2012-10-26

    A novel method is described for the evaluation of irreversible adsorption and column bleed in gas chromatographic (GC) columns using a tandem GC approach. This work specifically determined the degree of irreversible adsorption of specific sulfur- and phosphorus-containing test probe compounds at levels ranging from approximately 50 picograms (pg) to 1 nanogram (ng) on selected gas chromatographic columns. This method does not replace existing evaluation methods that characterize reversible adsorption but provides an additional tool. The test compounds were selected for their ease of adsorption and their importance in the specific trace analytical detection methodology being developed. Replicate chromatographic columns with 5% phenylmethylpolysiloxane (PMS), polyethylene glycol (wax), trifluoropropylpolysiloxane (TFP), or 78% cyanopropylpolysiloxane stationary phases from a variety of vendors were evaluated. As expected, the results demonstrate that the different chromatographic phases exhibit differing degrees of irreversible adsorption. The results also indicate that not all manufacturers produce equally inert columns, nor are columns from a given manufacturer identical. For the test probes used, the wax-coated columns were more inert as a group than the 5% PMS-coated columns, and they were more reproducibly manufactured. Both TFP and 78% cyanopropylpolysiloxane columns displayed superior inertness to the test compounds compared with either 5% PMS- or wax-coated columns. Irreversible adsorption was characterized for a limited range of stationary phase film thicknesses. In addition, the method was shown to be effective for characterizing column bleed and methods to remove bleed components. This method is useful for screening columns for demanding applications and for obtaining diagnostic information related to improved preparation methods. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. (abstract) Oblique Insonification Ultrasonic NDE of Composite Materials for Space Applications

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Y.; Lih, S. S.; Mal, A. K.

    1997-01-01

    In recent years, a great deal of research has been devoted to developing NDE methods for the characterization of the material properties of composites as well as other space structural materials. The need for information about such parameters as the elastic properties, density, and thickness is critical to the safe design and operation of such structural materials. Ultrasonics using immersion methods has played an important role in these efforts due to its capability, cost effectiveness, and ease of use. The authors designed a series of ultrasonic oblique insonification experiments in order to develop a practical, field-applicable NDE method for space structures.

  15. Simple performance evaluation of pulsed spontaneous parametric down-conversion sources for quantum communications.

    PubMed

    Smirr, Jean-Loup; Guilbaud, Sylvain; Ghalbouni, Joe; Frey, Robert; Diamanti, Eleni; Alléaume, Romain; Zaquine, Isabelle

    2011-01-17

    Fast characterization of pulsed spontaneous parametric down conversion (SPDC) sources is important for applications in quantum information processing and communications. We propose a simple method to perform this task, which only requires measuring the counts on the two output channels and the coincidences between them, as well as modeling the filter used to reduce the source bandwidth. The proposed method is experimentally tested and used for a complete evaluation of SPDC sources (pair emission probability, total losses, and fidelity) of various bandwidths. This method can find applications in the setting up of SPDC sources and in the continuous verification of the quality of quantum communication links.
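
    As an illustration of how channel counts and coincidences translate into source parameters, the sketch below applies the textbook lossy single-pair relations (Klyshko efficiencies and mean pair number per pulse). It ignores dark counts, multi-pair emission, and the filter modeling that the paper's method includes; all numbers are hypothetical.

    ```python
    def spdc_summary(n1, n2, c, pulse_rate):
        """Estimate SPDC source parameters from count rates.

        n1, n2     : singles rates on the two channels (counts/s)
        c          : coincidence rate (counts/s)
        pulse_rate : pump repetition rate (pulses/s)

        Simple lossy single-pair model: eta_i = C/N_j, mu = N1*N2/(C*R).
        """
        eta1 = c / n2                      # Klyshko (heralding) efficiency
        eta2 = c / n1
        mu = n1 * n2 / (c * pulse_rate)    # mean pairs emitted per pulse
        return eta1, eta2, mu

    print(spdc_summary(n1=40_000, n2=35_000, c=2_000, pulse_rate=76e6))
    ```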

  16. Characterization of Athabasca lean oil sands and mixed surficial materials: Comparison of capillary electrophoresis/low-resolution mass spectrometry and high-resolution mass spectrometry.

    PubMed

    MacLennan, Matthew S; Peru, Kerry M; Swyngedouw, Chris; Fleming, Ian; Chen, David D Y; Headley, John V

    2018-05-15

    Oil sands mining in Alberta, Canada, requires removal and stockpiling of considerable volumes of near-surface overburden material. This overburden includes lean oil sands (LOS) which cannot be processed economically but contain sparingly soluble petroleum hydrocarbons and naphthenic acids, which can leach into environmental waters. In order to measure and track the leaching of dissolved constituents and distinguish industrially derived organics from naturally occurring organics in local waters, practical methods were developed for characterizing multiple sources of contaminated water leakage. Capillary electrophoresis/positive-ion electrospray ionization low-resolution time-of-flight mass spectrometry (CE/LRMS), high-resolution negative-ion electrospray ionization Orbitrap mass spectrometry (HRMS) and conventional gas chromatography/flame ionization detection (GC/FID) were used to characterize porewater samples collected from within Athabasca LOS and mixed surficial materials. GC/FID was used to measure total petroleum hydrocarbon and HRMS was used to measure total naphthenic acid fraction components (NAFCs). HRMS and CE/LRMS were used to characterize samples according to source. The amounts of total petroleum hydrocarbon in each sample as measured by GC/FID ranged from 0.1 to 15.1 mg/L, while the amounts of NAFCs as measured by HRMS ranged from 5.3 to 82.3 mg/L. Factor analysis (FA) on HRMS data visually demonstrated clustering according to sample source and was correlated with molecular formula. LRMS coupled to capillary electrophoresis separation (CE/LRMS) provides important information on NAFC isomers by adding analyte migration time data to m/z and peak intensity. Differences in the amounts of total petroleum hydrocarbons measured by GC/FID and of NAFCs measured by HRMS indicate that the two methods provide complementary information about the nature of dissolved organic species in soil or water leachate samples. The NAFC molecular class OxSy is a possible tracer for LOS seepage. CE/LRMS provides complementary information and is a feasible and practical option for source evaluation of NAFCs in water. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Efficient high-dimensional characterization of conductivity in a sand box using massive MRI-imaged concentration data

    NASA Astrophysics Data System (ADS)

    Lee, J. H.; Yoon, H.; Kitanidis, P. K.; Werth, C. J.; Valocchi, A. J.

    2015-12-01

    Characterizing subsurface properties, particularly hydraulic conductivity, is crucial for reliable and cost-effective groundwater supply management, contaminant remediation, and emerging deep subsurface activities such as geologic carbon storage and unconventional resources recovery. With recent advances in sensor technology, a large volume of hydro-geophysical and chemical data can be obtained to achieve high-resolution images of subsurface properties, which can be used for accurate subsurface flow and reactive transport predictions. However, subsurface characterization with a plethora of information requires high, often prohibitive, computational costs associated with "big data" processing and large-scale numerical simulations. As a result, traditional inversion techniques are not well-suited for problems that require coupled multi-physics simulation models with massive data. In this work, we apply a scalable inversion method called the Principal Component Geostatistical Approach (PCGA) for characterizing the heterogeneous hydraulic conductivity (K) distribution in a 3-D sand box. The PCGA is a Jacobian-free geostatistical inversion approach that uses the leading principal components of the prior information to reduce computational costs, sometimes dramatically, and can be easily linked with any simulation software. Sequential images of transient tracer concentrations in the sand box were obtained using the magnetic resonance imaging (MRI) technique, resulting in 6 million tracer-concentration data points [Yoon et al., 2008]. Since each individual tracer observation carries little information on the K distribution, the dimension of the data was reduced using temporal moments and the discrete cosine transform (DCT). Consequently, 100,000 unknown K values consistent with the scale of the MRI data (at a scale of (0.25 cm)^3) were estimated by matching temporal moments and DCT coefficients of the original tracer data. The estimated K fields are close to the true K field, and even the small-scale variability of the sand box was captured, highlighting high-K connectivity and the contrasts between low- and high-K zones. A total of 1,000 MODFLOW and MT3DMS simulations was required to obtain the final estimates and corresponding estimation uncertainty, showing the efficiency and effectiveness of our method.
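
    The dimension-reduction step described above, replacing each breakthrough curve with temporal moments plus leading DCT coefficients, can be sketched in a few lines of Python. The synthetic curve, the moment orders, and the number of retained coefficients are illustrative assumptions, not the study's exact choices.

    ```python
    import numpy as np
    from scipy.fft import dct

    def reduce_breakthrough(t, conc, n_dct=20):
        """Compress one voxel's tracer breakthrough curve into a few
        features: low-order temporal moments and leading DCT terms."""
        m0 = np.trapz(conc, t)                          # mass (0th moment)
        m1 = np.trapz(t * conc, t) / m0                 # mean arrival time
        m2 = np.trapz((t - m1) ** 2 * conc, t) / m0     # temporal spread
        coeffs = dct(conc, norm='ortho')[:n_dct]        # leading DCT terms
        return np.concatenate(([m0, m1, m2], coeffs))

    t = np.linspace(0, 10, 200)
    conc = np.exp(-(t - 4.0) ** 2 / 0.8)        # synthetic breakthrough curve
    print(reduce_breakthrough(t, conc).shape)   # (23,) instead of 200 samples
    ```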

  18. Windows to the soul: vision science as a tool for studying biological mechanisms of information processing deficits in schizophrenia.

    PubMed

    Yoon, Jong H; Sheremata, Summer L; Rokem, Ariel; Silver, Michael A

    2013-10-31

    Cognitive and information processing deficits are core features and important sources of disability in schizophrenia. Our understanding of the neural substrates of these deficits remains incomplete, in large part because the complexity of impairments in schizophrenia makes the identification of specific deficits very challenging. Vision science presents unique opportunities in this regard: many years of basic research have led to detailed characterization of relationships between structure and function in the early visual system and have produced sophisticated methods to quantify visual perception and characterize its neural substrates. We present a selective review of research that illustrates the opportunities for discovery provided by visual studies in schizophrenia. We highlight work that has been particularly effective in applying vision science methods to identify specific neural abnormalities underlying information processing deficits in schizophrenia. In addition, we describe studies that have utilized psychophysical experimental designs that mitigate generalized deficit confounds, thereby revealing specific visual impairments in schizophrenia. These studies contribute to accumulating evidence that early visual cortex is a useful experimental system for the study of local cortical circuit abnormalities in schizophrenia. The high degree of similarity across neocortical areas of neuronal subtypes and their patterns of connectivity suggests that insights obtained from the study of early visual cortex may be applicable to other brain regions. We conclude with a discussion of future studies that combine vision science and neuroimaging methods. These studies have the potential to address pressing questions in schizophrenia, including the dissociation of local circuit deficits vs. impairments in feedback modulation by cognitive processes such as spatial attention and working memory, and the relative contributions of glutamatergic and GABAergic deficits.

  19. Edge detection and localization with edge pattern analysis and inflection characterization

    NASA Astrophysics Data System (ADS)

    Jiang, Bo

    2012-05-01

    In general, edges are considered to be abrupt changes or discontinuities in two-dimensional image signal intensity distributions. The accuracy of front-end edge detection methods in image processing impacts the eventual success of higher-level pattern analysis downstream. To generalize edge detectors designed from a simple ideal step-function model to the real distortions found in natural images, this research on one-dimensional edge pattern analysis proposes an edge detection algorithm built on three basic edge patterns: ramp, impulse, and step. After mathematical analysis, general rules for edge representation based on the classification of edge types into the three categories ramp, impulse, and step (RIS) are developed to reduce detection and localization errors, in particular the "double edge" effect that is an important drawback of derivative methods. When one-dimensional edge patterns are applied to two-dimensional image processing, however, a new issue arises: the edge detector should correctly mark inflections or junctions of edges. Research on human visual perception of objects and on information theory has pointed out that a pattern lexicon of "inflection micro-patterns" carries more information than a straight line, and research on scene perception suggests that information-rich contours are a more important factor in determining the success of scene categorization. Inflections and junctions are therefore extremely useful features, whose accurate description and reconstruction are significant in solving correspondence problems in computer vision. Hence, alongside edge pattern analysis, inflection and junction characterization is also utilized to extend the traditional derivative edge detection algorithm. Experiments were conducted to test these propositions; the results support the idea that the proposed improvements are effective in enhancing the accuracy of edge detection and localization.
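
    To make the ramp/impulse/step (RIS) idea concrete, here is a deliberately crude Python sketch that labels a one-dimensional intensity profile by inspecting its first derivative. It is not the paper's algorithm, only an illustration of the three patterns.

    ```python
    import numpy as np

    def classify_edge_profile(x, grad_thresh=0.1):
        """Toy RIS labeling of a 1-D intensity profile.

        impulse: the gradient changes sign sharply (a peak in the signal)
        step:    a single narrow run of large gradient
        ramp:    a wide run of moderate gradient
        """
        g = np.gradient(np.asarray(x, dtype=float))
        strong = np.abs(g) > grad_thresh
        if not strong.any():
            return "no edge"
        if g[strong].max() > 0 and g[strong].min() < 0:
            return "impulse"
        return "step" if strong.sum() <= 2 else "ramp"

    print(classify_edge_profile([0, 0, 0, 1, 1, 1]))        # step
    print(classify_edge_profile([0, .25, .5, .75, 1, 1]))   # ramp
    print(classify_edge_profile([0, 0, 1, 0, 0]))           # impulse
    ```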

  20. Annotating novel genes by integrating synthetic lethals and genomic information

    PubMed Central

    Schöner, Daniel; Kalisch, Markus; Leisner, Christian; Meier, Lukas; Sohrmann, Marc; Faty, Mahamadou; Barral, Yves; Peter, Matthias; Gruissem, Wilhelm; Bühlmann, Peter

    2008-01-01

    Background Large scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large scale screen far exceed the capacities of experimental characterization of every identified target. Thus, there is a need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members in this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Unlike existing methods that require large amounts of synthetic lethal data, our method merely relies on synthetic lethality information from two single screens. Using a Multivariate Gaussian Mixture Model, we determine the best subset of features that assign the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates, and we present she1 (YBL031W) as a novel gene involved in spindle migration. We also applied the statistical methodology to TOR2 signaling as another example. Conclusion We demonstrate the general use of Multivariate Gaussian Mixture Modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process. PMID:18194531
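
    A minimal sketch of the core statistical step, fitting a two-component multivariate Gaussian mixture to integrated gene features and shortlisting the smaller high-posterior group, is given below with scikit-learn; the synthetic feature matrix stands in for the real expression, phenotype, and sequence data.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Hypothetical feature matrix: one row per synthetic-lethal hit,
    # columns from heterogeneous data sources.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (40, 4)),    # background-like genes
                   rng.normal(2, 1, (10, 4))])   # candidate-like genes

    gmm = GaussianMixture(n_components=2, covariance_type='full',
                          random_state=0).fit(X)
    labels = gmm.predict(X)
    post = gmm.predict_proba(X)              # posterior memberships

    # Genes confidently assigned to the smaller component form the
    # shortlist for experimental follow-up.
    group = np.argmin(np.bincount(labels))
    shortlist = np.where((labels == group) & (post[:, group] > 0.9))[0]
    print(shortlist)
    ```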

  1. Glycopeptide Analysis, Recent Developments and Applications*

    PubMed Central

    Desaire, Heather

    2013-01-01

    Glycopeptide-based analysis is used to inform researchers about the glycans on one or more proteins. The method's key attractive feature is its ability to link glycosylation information to exact locations (glycosylation sites) on proteins. Numerous applications for glycopeptide analysis are known, and several examples are described herein. The techniques used to characterize glycopeptides are still emerging, and recently, research focused on facilitating aspects of glycopeptide analysis has advanced significantly in the areas of sample preparation, MS fragmentation, and automation of data analysis. These recent developments, described herein, provide the foundation for the growth of glycopeptide analysis as a blossoming field. PMID:23389047

  2. Determination of Inorganic Arsenic in a Wide Range of Food Matrices using Hydride Generation - Atomic Absorption Spectrometry.

    PubMed

    de la Calle, Maria B; Devesa, Vicenta; Fiamegos, Yiannis; Vélez, Dinoraz

    2017-09-01

    The European Food Safety Authority (EFSA) underlined in its Scientific Opinion on Arsenic in Food that, in order to support a sound assessment of dietary exposure to inorganic arsenic, information about the distribution of arsenic species in various food types must be generated. A method previously validated in a collaborative trial has been applied to determine inorganic arsenic (iAs) in a wide variety of food matrices, covering grains, mushrooms and food of marine origin (31 samples in total). The method is based on detection by flow injection-hydride generation-atomic absorption spectrometry of the iAs selectively extracted into chloroform after digestion of the proteins with concentrated HCl. The method is characterized by a limit of quantification of 10 µg/kg dry weight, which allowed quantification of inorganic arsenic in a large number of food matrices. Information is provided about the performance scores given to results obtained with this method, as reported by different laboratories in several proficiency tests. The percentage of satisfactory results obtained with the discussed method is higher than that of results obtained with other analytical approaches.

  3. Information Based Numerical Practice.

    DTIC Science & Technology

    1987-02-01

    [Fragmentary OCR text] ... characterization by comparative computational studies of various benchmark problems; see e.g. [MacNeal, Harder (1985)], [Robinson, Blackham (1981)] ... FOR NONADAPTIVE METHODS: 2.1. THE QUADRATURE FORMULA. The simplest example studied in detail in the literature is the problem of the optimal quadrature ... for formulae and the functional analytic prerequisites for the study of optimal formulae, we refer to the large monograph (808 pp.) of [Sobolev (1974)] ...

  4. 2012 NRL Review: Building a Workforce and Assembling Scientific Tools for the Future

    DTIC Science & Technology

    2012-01-01

    [Fragmentary OCR text] ... fiber optics, electro-optics, microelectronics, fracture mechanics, vacuum science, laser physics and joining technology, and radio frequency ...ics, elastic/plastic fracture mechanics, materials, finite-element methods, nondestructive evaluation, characterization of fracture resistance of ... NRL Review chapter entitled "Programs for Professional Development." For additional information about NRL, the NRL Fact Book lists the organizations ...

  5. Evaluation between ultrahigh pressure liquid chromatography and high-performance liquid chromatography analytical methods for characterizing natural dyestuffs.

    PubMed

    Serrano, Ana; van Bommel, Maarten; Hallett, Jessica

    2013-11-29

    An evaluation was undertaken of ultrahigh pressure liquid chromatography (UHPLC) in comparison to high-performance liquid chromatography (HPLC) for characterizing natural dyes in cultural heritage objects. A new UHPLC method was optimized by testing several analytical parameters adapted from prior UHPLC studies developed in diverse fields of research. Different gradient elution programs were tested on seven UHPLC columns with different dimensions and stationary phase compositions by applying several mobile phases, flow rates, temperatures, and runtimes. The UHPLC method provided markedly better data than the HPLC method. Indeed, even though carminic acid showed circa 146% higher resolution with HPLC, UHPLC yielded an increase of 41-61% in resolution and a decrease of 91-422% in the limit of detection, depending on the dye compound. The optimized method was subsequently used to analyse 59 natural reference materials, in which 85 different components with different physicochemical properties were ascribed, in order to create a spectral database for future characterization of dyes in cultural heritage objects. The majority of these reference samples could be successfully distinguished with one single method through examination of the compounds' retention times and their spectra acquired with a photodiode array detector. These results demonstrate that UHPLC analyses are extremely valuable for acquiring more precise chromatographic information on natural dyes with complex mixtures of different and/or closely related physicochemical properties, essential for distinguishing the similar species of plants and animals used to colour cultural heritage objects. Copyright © 2013 Elsevier B.V. All rights reserved.
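
    The resolution figures quoted above follow from the standard chromatographic resolution formula Rs = 2(t2 - t1)/(w1 + w2). A small Python sketch with hypothetical neighbouring peaks shows why the narrower UHPLC peaks raise Rs:

    ```python
    def resolution(t1, w1, t2, w2):
        """Chromatographic resolution from retention times and baseline
        peak widths (same units): Rs = 2*(t2 - t1) / (w1 + w2)."""
        return 2.0 * (t2 - t1) / (w1 + w2)

    # Hypothetical neighbouring dye peaks
    print(resolution(t1=12.1, w1=0.50, t2=12.6, w2=0.55))  # HPLC,  ~0.95
    print(resolution(t1=3.05, w1=0.08, t2=3.21, w2=0.09))  # UHPLC, ~1.88
    ```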

  6. Revised Methods for Characterizing Stream Habitat in the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Fitzpatrick, Faith A.; Waite, Ian R.; D'Arconte, Patricia J.; Meador, Michael R.; Maupin, Molly A.; Gurtz, Martin E.

    1998-01-01

    Stream habitat is characterized in the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. The goal of stream habitat characterization is to relate habitat to other physical, chemical, and biological factors that describe water-quality conditions. To accomplish this goal, environmental settings are described at sites selected for water-quality assessment. In addition, spatial and temporal patterns in habitat are examined at local, regional, and national scales. This habitat protocol contains updated methods for evaluating habitat in NAWQA Study Units. Revisions are based on lessons learned after 6 years of applying the original NAWQA habitat protocol to NAWQA Study Unit ecological surveys. Similar to the original protocol, these revised methods for evaluating stream habitat are based on a spatially hierarchical framework that incorporates habitat data at basin, segment, reach, and microhabitat scales. This framework provides a basis for national consistency in collection techniques while allowing flexibility in habitat assessment within individual Study Units. Procedures are described for collecting habitat data at basin and segment scales; these procedures include use of geographic information system data bases, topographic maps, and aerial photographs. Data collected at the reach scale include channel, bank, and riparian characteristics.

  7. Satellite Articulation Characterization from an Image Trajectory Matrix Using Optimization

    NASA Astrophysics Data System (ADS)

    Curtis, D. H.; Cobb, R. G.

    Autonomous on-orbit satellite servicing and inspection benefits from an inspector satellite that can autonomously gain as much information as possible about the primary satellite, including the performance of articulated objects such as solar arrays, antennas, and sensors. This paper presents a method of characterizing the articulation of a satellite using resolved monocular imagery. A simulated point cloud representing a nominal satellite with articulating solar panels and a complex articulating appendage is developed and projected to the image coordinates that would be seen from an inspector following a given inspection route. A method is developed to analyze the resulting image trajectory matrix. The developed method takes advantage of the fact that the route of the inspector satellite is known to assist in the segmentation of the points into different rigid bodies, the creation of the 3D point cloud, and the identification of the articulation parameters. Once the point cloud and the articulation parameters are calculated, they can be compared to the known truth. The error in the calculated point cloud is determined, as well as the difference between the true workspace of the satellite and the calculated workspace. These metrics can be used to compare the quality of various inspection routes for characterizing the satellite and its articulation.

  8. Postprocessing for character recognition using pattern features and linguistic information

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Takatoshi; Okamoto, Masayosi; Horii, Hiroshi

    1993-04-01

    We propose a new method of post-processing for character recognition using pattern features and linguistic information. This method corrects errors in the recognition of handwritten Japanese sentences containing Kanji characters. The post-processing method is characterized by two stages of character recognition. Improving the recognition rate for Japanese characters is made difficult by the large number of characters and the existence of characters with similar patterns. It is therefore not practical for a character recognition system to recognize all characters in detail. First, this post-processing method generates a candidate character table by recognizing the simplest features of characters. Then it selects suitable words corresponding to the candidate characters by referring to word and grammar dictionaries. If the correct character is included in the candidate character table, this process can correct an error; if the character is not included, it cannot. The method can, however, use linguistic information (the word and grammar dictionaries) to presume that the correct character is missing from the candidate character table, and it can then verify the presumed character by character recognition using complex features. When this method is applied to an online character recognition system, the accuracy of character recognition improves from 93.5% to 94.7%. This proved to be the case when it was used on editorials from a Japanese newspaper (Asahi Shinbun).
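
    The candidate-table-plus-dictionary idea can be sketched in a few lines. The toy Python below keeps, for each character slot, the candidates from a coarse first pass and retains a sequence only if it forms a dictionary word; the lexicon and candidates are invented, and a real system would rank alternatives by recognition confidence.

    ```python
    from itertools import product

    LEXICON = {"cat", "car", "cart", "oat"}   # stand-in word dictionary

    def correct(candidate_tables):
        """candidate_tables: one list of candidate characters per slot."""
        for chars in product(*candidate_tables):
            word = "".join(chars)
            if word in LEXICON:
                return word       # a real system would rank by confidence
        return None               # would trigger second-pass recognition

    # First pass confused 'a'/'o' and 't'/'r' in slots 2 and 3
    print(correct([["c", "e"], ["a", "o"], ["t", "r"]]))   # -> "cat"
    ```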

  9. Statistical Characterization of Environmental Error Sources Affecting Electronically Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Green, Del L.; Walker, Eric L.; Everhart, Joel L.

    2006-01-01

    Minimization of uncertainty is essential to extend the usable range of the 15-psid Electronically Scanned Pressure (ESP) transducer measurements to the low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources inducing much of this uncertainty requires a well defined and controlled calibration method. Employing such a controlled calibration system, several studies were conducted that provide quantitative information detailing the required controls needed to minimize environmental and human induced error sources. Results of temperature, environmental pressure, over-pressurization, and set point randomization studies for the 15-psid transducers are presented along with a comparison of two regression methods using data acquired with both 0.36-psid and 15-psid transducers. Together these results provide insight into procedural and environmental controls required for long term high-accuracy pressure measurements near 0.01 psia in the hypersonic testing environment using 15-psid ESP transducers.
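
    A comparison of two regression methods for transducer calibration, of the kind mentioned above, can be prototyped with ordinary least squares. The sketch below fits linear and quadratic calibration curves to hypothetical pressure/count pairs and reports the RMS residual of each; it is not the study's actual regression pair.

    ```python
    import numpy as np

    # Hypothetical calibration data for one ESP port: applied pressure
    # (psi) versus raw transducer counts.
    applied = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 15.0])
    counts = np.array([12, 61, 118, 604, 1210, 6020, 12150, 18400])

    for degree in (1, 2):
        coeffs = np.polyfit(counts, applied, degree)
        resid = applied - np.polyval(coeffs, counts)
        print(degree, np.sqrt(np.mean(resid ** 2)))  # RMS calibration error
    ```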

  10. Statistical Characterization of Environmental Error Sources Affecting Electronically Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Green, Del L.; Walker, Eric L.; Everhart, Joel L.

    2006-01-01

    Minimization of uncertainty is essential to extend the usable range of the 15-psid Electronically Scanned Pressure (ESP) transducer measurements to the low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources inducing much of this uncertainty requires a well defined and controlled calibration method. Employing such a controlled calibration system, several studies were conducted that provide quantitative information detailing the required controls needed to minimize environmental and human induced error sources. Results of temperature, environmental pressure, over-pressurization, and set point randomization studies for the 15-psid transducers are presented along with a comparison of two regression methods using data acquired with both 0.36-psid and 15-psid transducers. Together these results provide insight into procedural and environmental controls required for long term high-accuracy pressure measurements near 0.01 psia in the hypersonic testing environment using 15-psid ESP transducers.

  11. Reprint of: Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-11-01

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. Here we present relevant background on this emerging suite of techniques. We focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  12. Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-07-04

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. Here, we present relevant background on this emerging suite of techniques. Finally, we focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  13. Abseq: Ultrahigh-throughput single cell protein profiling with droplet microfluidic barcoding.

    PubMed

    Shahi, Payam; Kim, Samuel C; Haliburton, John R; Gartner, Zev J; Abate, Adam R

    2017-03-14

    Proteins are the primary effectors of cellular function, including cellular metabolism, structural dynamics, and information processing. However, quantitative characterization of proteins at the single-cell level is challenging due to the tiny amount of protein available. Here, we present Abseq, a method to detect and quantitate proteins in single cells at ultrahigh throughput. Like flow and mass cytometry, Abseq uses specific antibodies to detect epitopes of interest; however, unlike these methods, antibodies are labeled with sequence tags that can be read out with microfluidic barcoding and DNA sequencing. We demonstrate this novel approach by characterizing surface proteins of different cell types at the single-cell level and distinguishing between the cells by their protein expression profiles. DNA-tagged antibodies provide multiple advantages for profiling proteins in single cells, including the ability to amplify low-abundance tags to make them detectable with sequencing, to use molecular indices for quantitative results, and essentially limitless multiplexing.
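
    The quantitative use of molecular indices mentioned above amounts to collapsing PCR duplicates before counting. A minimal Python sketch with hypothetical parsed reads:

    ```python
    from collections import Counter

    # Hypothetical reads from DNA-tagged antibodies, already parsed into
    # (cell barcode, antibody tag, molecular index) triples.
    reads = [
        ("CELL01", "CD3",  "AAT"), ("CELL01", "CD3",  "AAT"),  # duplicate
        ("CELL01", "CD3",  "GGC"), ("CELL01", "CD19", "TTA"),
        ("CELL02", "CD19", "CAG"), ("CELL02", "CD19", "ACA"),
    ]

    # Keep unique (cell, antibody, UMI) triples, then count distinct
    # molecular indices per cell/antibody pair.
    unique_molecules = set(reads)
    counts = Counter((cell, ab) for cell, ab, _ in unique_molecules)
    print(counts)   # ('CELL01', 'CD3') -> 2 molecules, etc.
    ```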

  14. Abseq: Ultrahigh-throughput single cell protein profiling with droplet microfluidic barcoding

    NASA Astrophysics Data System (ADS)

    Shahi, Payam; Kim, Samuel C.; Haliburton, John R.; Gartner, Zev J.; Abate, Adam R.

    2017-03-01

    Proteins are the primary effectors of cellular function, including cellular metabolism, structural dynamics, and information processing. However, quantitative characterization of proteins at the single-cell level is challenging due to the tiny amount of protein available. Here, we present Abseq, a method to detect and quantitate proteins in single cells at ultrahigh throughput. Like flow and mass cytometry, Abseq uses specific antibodies to detect epitopes of interest; however, unlike these methods, antibodies are labeled with sequence tags that can be read out with microfluidic barcoding and DNA sequencing. We demonstrate this novel approach by characterizing surface proteins of different cell types at the single-cell level and distinguishing between the cells by their protein expression profiles. DNA-tagged antibodies provide multiple advantages for profiling proteins in single cells, including the ability to amplify low-abundance tags to make them detectable with sequencing, to use molecular indices for quantitative results, and essentially limitless multiplexing.

  15. Abseq: Ultrahigh-throughput single cell protein profiling with droplet microfluidic barcoding

    PubMed Central

    Shahi, Payam; Kim, Samuel C.; Haliburton, John R.; Gartner, Zev J.; Abate, Adam R.

    2017-01-01

    Proteins are the primary effectors of cellular function, including cellular metabolism, structural dynamics, and information processing. However, quantitative characterization of proteins at the single-cell level is challenging due to the tiny amount of protein available. Here, we present Abseq, a method to detect and quantitate proteins in single cells at ultrahigh throughput. Like flow and mass cytometry, Abseq uses specific antibodies to detect epitopes of interest; however, unlike these methods, antibodies are labeled with sequence tags that can be read out with microfluidic barcoding and DNA sequencing. We demonstrate this novel approach by characterizing surface proteins of different cell types at the single-cell level and distinguishing between the cells by their protein expression profiles. DNA-tagged antibodies provide multiple advantages for profiling proteins in single cells, including the ability to amplify low-abundance tags to make them detectable with sequencing, to use molecular indices for quantitative results, and essentially limitless multiplexing. PMID:28290550

  16. Large amplitude oscillatory measurements as mechanical characterization methods for soft elastomers

    NASA Astrophysics Data System (ADS)

    Skov, Anne L.

    2012-04-01

    Mechanical characterization of soft elastomers is usually done either by traditional shear rheometry in the linear viscoelastic (LVE) regime (i.e. low strains) or by extensional rheology in the nonlinear regime. However, in many commercially available rheometers for nonlinear extension the measurements rely on certain assumptions, such as a predefined shape alteration, and are in most cases very hard to perform on soft elastomers. The LVE data provide information on parameters important for dielectric electroactive polymer (DEAP) purposes, such as the Young's modulus and the tendency to viscous dissipation (at low strains only), but provide no information on strain hardening or softening effects at larger strains, or on the mechanical breakdown strength. It is therefore clear that LVE measurements cannot serve as the sole mechanical characterization tool in large-strain applications. We show how the combination of LVE, large amplitude oscillatory elongation (LAOE) [1], and planar elongation [2,3] data makes an ideal set of experiments for evaluating the mechanical performance of DEAPs. We evaluate the mechanical performance of several soft elastomers applicable for DEAP purposes, such as poly(propylene oxide) (PPO) networks [3,4] and traditional unfilled silicone (PDMS) networks [5].
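
    One common way to quantify the nonlinear large-strain response in oscillatory tests is the relative third-harmonic intensity I3/I1 of the stress signal. This is a generic LAOS-style diagnostic, not necessarily the author's analysis; the signal below is synthetic.

    ```python
    import numpy as np

    # Synthetic oscillatory response: stress with a small third harmonic,
    # mimicking strain hardening under large-amplitude driving.
    n_cycles, pts = 16, 256
    t = np.linspace(0, n_cycles * 2 * np.pi, n_cycles * pts, endpoint=False)
    stress = np.sin(t + 0.3) + 0.05 * np.sin(3 * t)

    spec = np.abs(np.fft.rfft(stress))
    i1 = spec[n_cycles]          # fundamental: one peak per driving cycle
    i3 = spec[3 * n_cycles]      # third harmonic
    print(i3 / i1)               # ~0.05; nonlinearity indicator I3/I1
    ```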

  17. Characterization of alkaloids in Sophora flavescens Ait. by high-performance liquid chromatography-electrospray ionization tandem mass spectrometry.

    PubMed

    Liu, Guoqiang; Dong, Jing; Wang, Hong; Hashi, Yuki; Chen, Shizhong

    2011-04-05

    Sophora flavescens Ait., a well-known Chinese herbal medicine, is widely used in clinical practice for the treatment of viral hepatitis, cancer, gastrointestinal hemorrhage, and skin diseases. This paper is the first report of a method based on the combined use of high-performance liquid chromatography, photodiode array detection, and electrospray ionization tandem mass spectrometry for the comprehensive and systematic separation and characterization of bioactive alkaloids in Sophora flavescens Ait. A total of 22 constituents were identified on the basis of the extracted ion chromatograms for different [M+H](+) ions of the alkaloids present in S. flavescens Ait. Among these, 5 constituents were unambiguously identified by comparing the experimental data on their retention times and MS(n) spectra with those of authentic compounds, and 17 other constituents were tentatively identified on the basis of their MS(n) fragmentation behaviors and/or molecular weight information from the literature. Furthermore, some characteristic fragmentation pathways of the alkaloids in S. flavescens Ait. were detected and examined. This information may be useful for characterizing the bioactive alkaloids present in S. flavescens Ait. and for possible applications in formulations. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Self-Taught Low-Rank Coding for Visual Learning.

    PubMed

    Li, Sheng; Li, Kang; Fu, Yun

    2018-03-01

    The lack of labeled data presents a common challenge in many computer vision and machine learning tasks. Semisupervised learning and transfer learning methods have been developed to tackle this challenge by utilizing auxiliary samples from the same domain or from a different domain, respectively. Self-taught learning, which is a special type of transfer learning, has fewer restrictions on the choice of auxiliary data. It has shown promising performance in visual learning. However, existing self-taught learning methods usually ignore the structure information in data. In this paper, we focus on building a self-taught coding framework, which can effectively utilize the rich low-level pattern information abstracted from the auxiliary domain, in order to characterize the high-level structural information in the target domain. By leveraging a high quality dictionary learned across auxiliary and target domains, the proposed approach learns expressive codings for the samples in the target domain. Since many types of visual data have been proven to contain subspace structures, a low-rank constraint is introduced into the coding objective to better characterize the structure of the given target set. The proposed representation learning framework is called self-taught low-rank (S-Low) coding, which can be formulated as a nonconvex rank-minimization and dictionary learning problem. We devise an efficient majorization-minimization augmented Lagrange multiplier algorithm to solve it. Based on the proposed S-Low coding mechanism, both unsupervised and supervised visual learning algorithms are derived. Extensive experiments on five benchmark data sets demonstrate the effectiveness of our approach.
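
    At the heart of rank-minimization solvers such as the augmented Lagrange scheme mentioned above is singular value thresholding, the proximal operator of the nuclear norm. A minimal Python sketch of that single step (not the full S-Low algorithm):

    ```python
    import numpy as np

    def svt(z, tau):
        """Singular value thresholding: shrink singular values by tau and
        zero out the rest, the nuclear-norm proximal step."""
        u, s, vt = np.linalg.svd(z, full_matrices=False)
        return u @ np.diag(np.maximum(s - tau, 0.0)) @ vt

    z = np.random.default_rng(1).normal(size=(8, 6))
    low_rank = svt(z, tau=2.0)
    print(np.linalg.matrix_rank(low_rank))   # typically below full rank 6
    ```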

  19. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-01-01

    Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.
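
    The information-combining step behind such Bayesian inverse methods can be sketched, for a linearised forward model with Gaussian noise and a Gaussian (Markov-style) spatial prior, as a single information-form update. All matrices below are hypothetical stand-ins, not the DFM implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, m = 20, 8                     # unknowns (log-K field), measurements
    H = rng.normal(size=(m, n))      # linearised forward model (Jacobian)
    x_true = rng.normal(size=n)
    y = H @ x_true + 0.1 * rng.normal(size=m)

    # Exponential-decay prior covariance stands in for the Markov random
    # field; R is the measurement-noise covariance.
    Q = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 3.0)
    R = 0.01 * np.eye(m)

    # MAP estimate and posterior covariance (information-form update)
    A = H.T @ np.linalg.inv(R) @ H + np.linalg.inv(Q)
    x_map = np.linalg.solve(A, H.T @ np.linalg.inv(R) @ y)
    post_std = np.sqrt(np.diag(np.linalg.inv(A)))  # quantified uncertainty
    print(x_map[:5], post_std[:5])
    ```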

  20. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-12-31

    Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.

  1. Tools for the functional interpretation of metabolomic experiments.

    PubMed

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family, and it deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more comprehensive and allow larger sets of metabolites to be characterized, automatic methods for analyzing these sets in order to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers performing metabolomic experiments in the near future.
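
    Annotation enrichment analysis, the approach these metabolomics tools borrow from transcriptomics, typically reduces to a hypergeometric test per annotation term. A minimal sketch with invented counts:

    ```python
    from scipy.stats import hypergeom

    # N metabolites in the background, K annotated to a pathway, n found
    # changed in the experiment, k of those in the pathway (made-up values).
    N, K, n, k = 1500, 60, 80, 12

    # P(X >= k): chance of seeing at least k pathway members at random
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(p_value)
    ```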

  2. New Electrical Resistivity Tomography approach for karst cave characterization: Castello di Lepre karst cave (Marsico Nuovo, Southern Italy).

    NASA Astrophysics Data System (ADS)

    Guerriero, Merilisa; Capozzoli, Luigi; De Martino, Gregory; Perciante, Felice; Gueguen, Erwan; Rizzo, Enzo

    2017-04-01

    Geophysical methods are commonly applied to characterize karst caves. Several geophysical methods are used, such as electrical resistivity tomography (ERT), gravimetric prospecting (G), ground-penetrating radar (GPR) and seismic methods (S), in order to provide information on cave geometry and subsurface geological structure. In complex karst systems, each geophysical method used in the conventional way can give only partial information, owing to low resolution for deep targets. In order to reduce uncertainty and avoid misinterpretations based on conventional use of the electrical resistivity tomography method, a new ERT approach has been applied at the Castello di Lepre karst cave (Marsico Nuovo, Basilicata region, Italy), located in the Meso-Cenozoic carbonate substratum of the Monti della Maddalena ridge (Southern Apennines). Specifically, a cross-ERT acquisition system was applied in order to improve the resolution of the electrical resistivity distribution in the geological structure surrounding the karst cave. The cross-ERT system provides a more uniform model resolution vertically, increasing the resolution of the resistivity imaging. Conventional cross-ERT places electrodes in two or more boreholes in order to acquire the resistivity data distribution. In this work, the cross-ERT survey was made between electrodes located on the surface and along the karst cave, in order to obtain a high-resolution image of the electrical resistivity distribution between the cave and the surface topography. Finally, the acquired cross-ERT data are potentially well suited for imaging fracture zones, since electrical current flow in fractured rock is primarily electrolytic via the secondary porosity associated with the fractures.

  3. Identification of high versus lower risk clinical subgroups in a group of adult patients with supratentorial anaplastic astrocytomas.

    PubMed

    Decaestecker, C; Salmon, I; Camby, I; Dewitte, O; Pasteels, J L; Brotchi, J; Van Ham, P; Kiss, R

    1995-05-01

    The present work investigates whether computer-assisted techniques can contribute any significant information to the characterization of astrocytic tumor aggressiveness. Two complementary computer-assisted methods were used. The first was digital image analysis of Feulgen-stained nuclei, making it possible to compute 15 morphonuclear and 8 nuclear DNA content-related (ploidy level) parameters. The second, the Decision Tree technique from the family of supervised learning algorithms, enabled the most discriminatory parameters to be determined. These two techniques were applied to a series of 250 supratentorial astrocytic tumors in adults. This series included 39 low-grade (astrocytomas, AST) and 211 high-grade (47 anaplastic astrocytomas, ANA, and 164 glioblastomas, GBM) astrocytic tumors. The results show that some AST, ANA and GBM did not fit within simple logical rules. These "complex" cases were labeled NC-AST, NC-ANA and NC-GBM because they were "non-classical" (NC) with respect to their cytological features. An analysis of survival data revealed that patients with NC-GBM had the same survival period as patients with GBM. In sharp contrast, patients with ANA survived significantly longer than patients with NC-ANA. In fact, patients with ANA had the same survival period as patients who died from AST, while patients with NC-ANA had a survival period similar to those with GBM. All these data show that the computer-assisted techniques used in this study can provide the pathologist with significant information on the characterization of astrocytic tumor aggressiveness.
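
    As a generic illustration of the decision-tree step (not the authors' exact pipeline or features), readable classification rules can be induced over per-tumor feature vectors as follows; the feature matrix here is random placeholder data standing in for the 15 morphonuclear and 8 ploidy parameters:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# X: one row per tumor, 23 columns standing in for the morphonuclear
# and ploidy parameters (random placeholder data, not real measurements)
rng = np.random.default_rng(0)
X = rng.normal(size=(250, 23))
y = rng.choice(["AST", "ANA", "GBM"], size=250)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree))   # the induced logical rules; cases that fall
                           # outside simple rules correspond to "NC" groups
```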

  4. State-space reduction and equivalence class sampling for a molecular self-assembly model.

    PubMed

    Packwood, Daniel M; Han, Patrick; Hitosugi, Taro

    2016-07-01

    Direct simulation of a model with a large state space will generate enormous volumes of data, much of which is not relevant to the questions under study. In this paper, we consider a molecular self-assembly model as a typical example of a large state-space model, and present a method for selectively retrieving 'target information' from this model. This method partitions the state space into equivalence classes, as identified by an appropriate equivalence relation. The set of equivalence classes H, which serves as a reduced state space, contains none of the superfluous information of the original model. After construction and characterization of a Markov chain with state space H, the target information is efficiently retrieved via Markov chain Monte Carlo sampling. This approach represents a new breed of simulation techniques which are highly optimized for studying molecular self-assembly and, moreover, serves as a valuable guideline for analysis of other large state-space models.
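
    A minimal sketch of the sampling step, assuming a symmetric proposal over classes: Metropolis sampling on the reduced state space H, with a model-specific (here hypothetical) log-weight for each equivalence class.

```python
import math
import random

def metropolis_over_classes(classes, log_weight, neighbors, n_steps=10000, seed=1):
    """Metropolis sampling on the reduced state space H of equivalence
    classes. `log_weight(h)` is an unnormalized log-probability for a class
    (e.g. log of its Boltzmann weight times its size) and `neighbors(h)`
    proposes classes reachable from h; both are model-specific stand-ins.
    A symmetric proposal is assumed for the acceptance rule below."""
    rng = random.Random(seed)
    h = rng.choice(classes)
    counts = {c: 0 for c in classes}
    for _ in range(n_steps):
        h_new = rng.choice(neighbors(h))
        if math.log(rng.random()) < log_weight(h_new) - log_weight(h):
            h = h_new                       # accept the proposed class
        counts[h] += 1
    return counts   # visit frequencies estimate the target distribution on H
```

    The target information is then read off the visit frequencies over H rather than from the far larger original state space.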

  5. A traveling salesman approach for predicting protein functions.

    PubMed

    Johnson, Olin; Liu, Jing

    2006-10-12

    Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful sample of using computer science knowledge and algorithms to study biological problems.
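
    The abstract (and the companion record below) does not spell out the clustering details, but the general idea can be sketched as follows: order proteins along a heuristic traveling-salesman tour of an interaction-derived distance matrix, then cut the tour at its longest edges to obtain clusters. This is a hedged sketch of that idea, not the authors' algorithm:

```python
import numpy as np

def nearest_neighbor_tour(dist):
    """Greedy nearest-neighbor TSP heuristic over a distance matrix."""
    n = len(dist)
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist[last][j])
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def clusters_from_tour(tour, dist, cut_quantile=0.9):
    """Cut the tour at its longest edges to form clusters of proteins."""
    edges = [dist[a][b] for a, b in zip(tour, tour[1:])]
    cutoff = np.quantile(edges, cut_quantile)
    clusters, current = [], [tour[0]]
    for (a, b), d in zip(zip(tour, tour[1:]), edges):
        if d > cutoff:                 # a long edge separates two clusters
            clusters.append(current)
            current = []
        current.append(b)
    clusters.append(current)
    return clusters
```

    An uncharacterized protein would then be assigned the most common annotation among the characterized members of its cluster.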

  6. A traveling salesman approach for predicting protein functions

    PubMed Central

    Johnson, Olin; Liu, Jing

    2006-01-01

    Background Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Results Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Conclusion Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful sample of using computer science knowledge and algorithms to study biological problems. PMID:17147783

  7. Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    Information theoretic measures can be used to identify nonlinear interactions between source and target variables through reductions in uncertainty. In information partitioning, multivariate mutual information is decomposed into synergistic, unique, and redundant components. Synergy is information shared only when sources influence a target together, uniqueness is information only provided by one source, and redundancy is overlapping shared information from multiple sources. While this partitioning has been applied to provide insights into complex dependencies, several proposed partitioning methods overestimate redundant information and omit a component of unique information because they do not account for source dependencies. Additionally, information partitioning has only been applied to time-series data in a limited context, using basic pdf estimation techniques or a Gaussian assumption. We develop a Rescaled Redundancy measure (Rs) to solve the source dependency issue, and present Gaussian, autoregressive, and chaotic test cases to demonstrate its advantages over existing techniques in the presence of noise, various source correlations, and different types of interactions. This study constitutes the first rigorous application of information partitioning to environmental time-series data, and addresses how noise, pdf estimation technique, or source dependencies can influence detected measures. We illustrate how our techniques can unravel the complex nature of forcing and feedback within an ecohydrologic system with an application to 1 min environmental signals of air temperature, relative humidity, and wind speed. The methods presented here are applicable to the study of a broad range of complex systems composed of interacting variables.
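
    The bookkeeping of the partitioning is fixed once a redundancy measure is chosen. The sketch below uses histogram (plug-in) mutual information and the crude minimum-MI redundancy rather than the Rescaled Redundancy measure Rs introduced above, so it illustrates the decomposition arithmetic only:

```python
import numpy as np

def discretize(x, bins=8):
    """Bin a continuous signal into integer codes 0..bins-1."""
    edges = np.histogram_bin_edges(x, bins)
    return np.digitize(x, edges[1:-1])

def mutual_info(x, y):
    """Plug-in mutual information (bits) between integer-coded arrays."""
    pxy = np.zeros((x.max() + 1, y.max() + 1))
    np.add.at(pxy, (x, y), 1)
    pxy /= pxy.sum()
    px = pxy.sum(1, keepdims=True)
    py = pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def partition(s1, s2, t, bins=8):
    """Unique, redundant, and synergistic components of I(S1,S2;T)."""
    d1, d2, dt = (discretize(v, bins) for v in (s1, s2, t))
    i1, i2 = mutual_info(d1, dt), mutual_info(d2, dt)
    itot = mutual_info(d1 * bins + d2, dt)      # joint-source information
    R = min(i1, i2)       # crude minimum-MI redundancy, NOT the paper's Rs
    U1, U2 = i1 - R, i2 - R                     # unique information
    S = itot - U1 - U2 - R                      # synergy as the remainder
    return U1, U2, R, S
```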

  8. Characterizing the EEG correlates of exploratory behavior.

    PubMed

    Bourdaud, Nicolas; Chavarriaga, Ricardo; Galan, Ferran; Millan, José Del R

    2008-12-01

    This study aims to characterize the electroencephalography (EEG) correlates of exploratory behavior. Decision making in an uncertain environment raises a conflict between two opposing needs: gathering information about the environment and exploiting this knowledge in order to optimize the decision. Exploratory behavior has already been studied using functional magnetic resonance imaging (fMRI). Based on a usual paradigm in reinforcement learning, that work showed bilateral activation in the frontal and parietal cortex. To our knowledge, no previous study has addressed it using EEG. The study of exploratory behavior using EEG signals raises two difficulties. First, the labels of trials as exploitation or exploration cannot be directly derived from the subject's actions; to access this information, a model of how the subject makes decisions must be built, and the exploration-related information can then be derived from it. Second, because of the complexity of the task, its EEG correlates are not necessarily time-locked to the action, so the EEG processing methods used should be designed to handle signals that shift in time across trials. Using the same experimental protocol as the fMRI study, results show that the bilateral frontal and parietal areas are also the most discriminant. This strongly suggests that the EEG signal also conveys information about exploratory behavior.

  9. Physical interpretation and development of ultrasonic nondestructive evaluation techniques applied to the quantitative characterization of textile composite materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1994-01-01

    In this Progress Report, we describe our continuing research activities concerning the development and implementation of advanced ultrasonic nondestructive evaluation methods applied to the inspection and characterization of complex composite structures. We explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize complex materials. As an initial step toward the application of linear array imaging technology to the interrogation of a wide range of complex composite structures, we present images obtained using an unmodified medical ultrasonic imaging system of two epoxy-bonded aluminum plate specimens, each with intentionally disbonded regions. These images are compared with corresponding conventional ultrasonic contact transducer measurements in order to assess whether these images can detect disbonded regions and provide information regarding the nature of the disbonded region. We present a description of a standoff/delay fixture which has been designed, constructed, and implemented on a Hewlett-Packard SONOS 1500 medical imaging system. This standoff/delay fixture, when attached to a 7.5 MHz linear array probe, greatly enhances our ability to interrogate flat plate specimens. The final section of this Progress Report describes a woven composite plate specimen that has been specially machined to include intentional flaws. This woven composite specimen will allow us to assess the feasibility of applying linear array imaging technology to the inspection and characterization of complex textile composite materials. We anticipate the results of this on-going investigation may provide a step toward the development of a rapid, real-time, and portable method of ultrasonic inspection and characterization based on linear array technology.

  10. OGRO: The Overview of functionally characterized Genes in Rice online database.

    PubMed

    Yamamoto, Eiji; Yonemaru, Jun-Ichi; Yamamoto, Toshio; Yano, Masahiro

    2012-12-01

    The high-quality sequence information and rich bioinformatics tools available for rice have contributed to remarkable advances in functional genomics. To facilitate the application of gene function information to the study of natural variation in rice, we comprehensively searched for articles related to rice functional genomics and extracted information on functionally characterized genes. As of 31 March 2012, 702 functionally characterized genes were annotated. This number represents about 1.6% of the predicted loci in the Rice Annotation Project Database. The compiled gene information is organized to facilitate direct comparisons with quantitative trait locus (QTL) information in the Q-TARO database. Comparison of genomic locations between functionally characterized genes and the QTLs revealed that QTL clusters were often co-localized with high-density gene regions, and that the genes associated with the QTLs in these clusters were different genes, suggesting that these QTL clusters are likely to be explained by tightly linked but distinct genes. Information on the functionally characterized genes compiled during this study is now available in the Overview of functionally characterized Genes in Rice Online database (OGRO) on the Q-TARO website (http://qtaro.abr.affrc.go.jp/ogro). The database has two interfaces: a table containing gene information, and a genome viewer that allows users to compare the locations of QTLs and functionally characterized genes. OGRO on Q-TARO will facilitate a candidate-gene approach to identifying the genes responsible for QTLs. Because the QTL descriptions in Q-TARO contain information on agronomic traits, such comparisons will also facilitate the annotation of functionally characterized genes in terms of their effects on traits important for rice breeding. The increasing amount of information on rice gene function being generated from mutant panels and other types of studies will make the OGRO database even more valuable in the future.

  11. Label-free direct surface-enhanced Raman scattering (SERS) of nucleic acids (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Guerrini, Luca; Morla-Folch, Judit; Gisbert-Quilis, Patricia; Xie, Hainan; Alvarez-Puebla, Ramon

    2016-03-01

    Recently, plasmonic-based biosensing has experienced an unprecedented level of attention, with a particular focus on nucleic acid detection, offering efficient solutions for engineering simple, fast, highly sensitive sensing platforms while overcoming important limitations of PCR and microarray techniques. In the broad field of plasmonics, surface-enhanced Raman scattering (SERS) spectroscopy has arisen as a powerful analytical tool for the detection and structural characterization of biomolecules. Today, applications of SERS to nucleic acid analysis largely rely on indirect strategies, which have been demonstrated to be very effective for pure sensing purposes but completely dismiss the exquisite structural information provided by direct acquisition of the biomolecular vibrational fingerprint. By contrast, direct label-free SERS of nucleic acids shows outstanding potential in terms of chemically specific information, which, however, has remained largely unexpressed, mainly because of inherently poor spectral reproducibility and/or limited sensitivity. To address these limitations, we developed a fast and affordable high-throughput direct SERS screening method for gaining detailed genomic information on nucleic acids (DNA and RNA) and for the characterization and quantitative recognition of DNA interactions with exogenous agents. The simple strategy relies on the electrostatic adhesion of DNA/RNA onto positively charged silver colloids, which promotes nanoparticle aggregation into stable clusters yielding intense and reproducible SERS spectra at the picogram level (i.e., the analysis can be performed without amplification steps, thus providing realistic direct information on the nucleic acid in its native state). We anticipate this method will have a broad impact and a wide set of applications in different fields, including medical diagnostics, genomic screening, drug discovery, forensic science and even molecular electronics.

  12. Texture-specific bag of visual words model and spatial cone matching-based method for the retrieval of focal liver lesions using multiphase contrast-enhanced CT images.

    PubMed

    Xu, Yingying; Lin, Lanfen; Hu, Hongjie; Wang, Dan; Zhu, Wenchao; Wang, Jian; Han, Xian-Hua; Chen, Yen-Wei

    2018-01-01

    The bag of visual words (BoVW) model is a powerful tool for feature representation that can integrate various handcrafted features like intensity, texture, and spatial information. In this paper, we propose a novel BoVW-based method that incorporates texture and spatial information for content-based image retrieval to assist radiologists in clinical diagnosis. This paper presents a texture-specific BoVW method to represent focal liver lesions (FLLs). Pixels in the region of interest (ROI) are classified into nine texture categories using the rotation-invariant uniform local binary pattern method. The BoVW-based features are calculated for each texture category. In addition, a spatial cone matching (SCM)-based representation strategy is proposed to describe the spatial information of the visual words in the ROI. In a pilot study, eight radiologists with different levels of clinical experience performed diagnoses for 20 cases with and without the top six retrieved results. A total of 132 multiphase computed tomography volumes including five pathological types were collected. The texture-specific BoVW was compared to other BoVW-based methods using the constructed dataset of FLLs. The results show that our proposed model outperforms the other three BoVW methods in discriminating different lesions. The SCM method, which adds spatial information to the orderless BoVW model, also influenced the retrieval performance. In the pilot trial, the average diagnosis accuracy of the radiologists improved from 66% to 80% using the retrieval system. The preliminary results indicate that the texture-specific features and the SCM-based BoVW features can effectively characterize various liver lesions. The retrieval system has the potential to improve the diagnostic accuracy and the confidence of the radiologists.
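
    A minimal sketch of the texture-labeling idea: assign each ROI pixel a rotation-invariant uniform LBP code and build one per-category histogram. Plain intensity histograms stand in for the learned visual-word histograms, and the P + 2 LBP codes produced here differ from the paper's nine categories, so this is an illustrative variant only:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def texture_specific_histograms(roi, P=8, R=1, n_intensity_bins=16):
    """Label each ROI pixel with its rotation-invariant uniform LBP code
    (P + 2 codes for P neighbors), then build one normalized intensity
    histogram per texture category and concatenate them. An 8-bit
    grayscale ROI is assumed."""
    codes = local_binary_pattern(roi, P, R, method='uniform')
    feats = []
    for cat in range(P + 2):
        vals = roi[codes == cat]
        h, _ = np.histogram(vals, bins=n_intensity_bins, range=(0, 255))
        total = h.sum()
        feats.append(h / total if total else h.astype(float))
    return np.concatenate(feats)   # one feature vector per lesion ROI
```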

  13. Real-time X-ray Diffraction: Applications to Materials Characterization

    NASA Technical Reports Server (NTRS)

    Rosemeier, R. G.

    1984-01-01

    With the high-speed growth of materials, it becomes necessary to develop measuring systems that are also capable of characterizing these materials at high speed. X-ray diffraction is one of the conventional techniques for characterizing materials. Film, which is the oldest method of recording the X-ray diffraction phenomenon, is not quite adequate in most circumstances for recording fast-changing events. Even though conventional proportional counters and scintillation counters can provide the speed necessary to record these changing events, they lack the ability to provide image information, which may be important in some types of experimental or production arrangements. A selected number of novel applications of X-ray diffraction to characterize materials in real time are discussed. Device characteristics of some X-ray intensifiers useful in instantaneous X-ray diffraction applications are also briefly presented. Real-time X-ray diffraction experiments with the incorporation of X-ray image intensification add a new dimension to the characterization of materials. The uses of real-time image intensification in laboratory and production arrangements are virtually unlimited, and their application depends largely upon the ingenuity of the scientist or engineer.

  14. Characterization and preliminary toxicity assay of nano-titanium dioxide additive in sugar-coated chewing gum.

    PubMed

    Chen, Xin-Xin; Cheng, Bin; Yang, Yi-Xin; Cao, Aoneng; Liu, Jia-Hui; Du, Li-Jing; Liu, Yuanfang; Zhao, Yuliang; Wang, Haifang

    2013-05-27

    Nanotechnology shows great potential for producing food with higher quality and better taste through new additives, improved nutrient delivery, and better packaging. However, the lack of investigations of the safety issues of nanofood has resulted in public fears. How to characterize engineered nanomaterials in food and assess the toxicity and health impact of nanofood remains a big challenge. Herein, a facile and highly reliable method for separating TiO2 particles from food products (focusing on sugar-coated chewing gum) is reported, and the first comprehensive characterization study of food nanoparticles by multiple qualitative and quantitative methods is provided. The detailed information on nanoparticles in gum includes chemical composition, morphology, size distribution, crystalline phase, particle and mass concentration, surface charge, and aggregation state. Surprisingly, the results show that the number of food products containing nano-TiO2 (<200 nm) is much larger than previously known, and consumers have already often been exposed to engineered nanoparticles in daily life. Over 93% of the TiO2 in gum is nano-TiO2, and it is unexpectedly easily released and swallowed by a person chewing the gum. Preliminary cytotoxicity assays show that the gum nano-TiO2 particles are relatively safe for gastrointestinal cells within 24 h, even at a concentration of 200 μg/mL. This comprehensive study provides accurate physicochemical property, exposure, and cytotoxicity information on engineered nanoparticles in food, which is a prerequisite for the successful safety assessment of nanofood products.

  15. Three-dimensional Bayesian geostatistical aquifer characterization at the Hanford 300 Area using tracer test data

    NASA Astrophysics Data System (ADS)

    Chen, Xingyuan; Murakami, Haruko; Hahn, Melanie S.; Hammond, Glenn E.; Rockhold, Mark L.; Zachara, John M.; Rubin, Yoram

    2012-06-01

    Tracer tests performed under natural or forced gradient flow conditions can provide useful information for characterizing subsurface properties, through monitoring, modeling, and interpretation of the tracer plume migration in an aquifer. Nonreactive tracer experiments were conducted at the Hanford 300 Area, along with constant-rate injection tests and electromagnetic borehole flowmeter tests. A Bayesian data assimilation technique, the method of anchored distributions (MAD) (Rubin et al., 2010), was applied to assimilate the experimental tracer test data with the other types of data and to infer the three-dimensional heterogeneous structure of the hydraulic conductivity in the saturated zone of the Hanford formation. In this study, the Bayesian prior information on the underlying random hydraulic conductivity field was obtained from previous field characterization efforts using constant-rate injection and borehole flowmeter test data. The posterior distribution of the conductivity field was obtained by further conditioning the field on the temporal moments of tracer breakthrough curves at various observation wells. MAD was implemented with the massively parallel three-dimensional flow and transport code PFLOTRAN to cope with the highly transient flow boundary conditions at the site and to meet the computational demands of MAD. A synthetic study proved that the proposed method could effectively invert tracer test data to capture the essential spatial heterogeneity of the three-dimensional hydraulic conductivity field. Application of MAD to actual field tracer data at the Hanford 300 Area demonstrates that inverting for spatial heterogeneity of hydraulic conductivity under transient flow conditions is challenging and more work is needed.
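
    The temporal moments used to condition the conductivity field are simple integrals of each breakthrough curve; a minimal sketch with hypothetical numbers:

```python
import numpy as np

def trapz(y, x):
    """Trapezoid-rule integral of y over x."""
    return float(((y[1:] + y[:-1]) / 2 * np.diff(x)).sum())

def temporal_moments(t, c):
    """Zeroth moment (total mass proxy) and normalized first moment
    (mean arrival time) of a tracer breakthrough curve c(t)."""
    m0 = trapz(c, t)
    return m0, trapz(t * c, t) / m0

# toy breakthrough curve at one observation well (hypothetical numbers)
t = np.linspace(0.0, 100.0, 500)                  # time, hours
c = np.exp(-(t - 40.0) ** 2 / (2 * 8.0 ** 2))     # tracer concentration
m0, mean_arrival = temporal_moments(t, c)         # mean_arrival is ~40 h
```

    Conditioning on low-order moments rather than full concentration histories keeps the inversion tractable while retaining the arrival-time information that constrains conductivity.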

  16. Deep Learning from EEG Reports for Inferring Underspecified Information

    PubMed Central

    Goodwin, Travis R.; Harabagiu, Sanda M.

    2017-01-01

    Secondary use of electronic health records (EHRs) often relies on the ability to automatically identify and extract information from EHRs. Unfortunately, EHRs are known to suffer from a variety of idiosyncrasies; most prevalently, they have been shown to often omit or underspecify information. Adapting traditional machine learning methods for inferring underspecified information relies on manually specifying features characterizing the specific information to recover (e.g. particular findings, test results, or physician's impressions). By contrast, in this paper, we present a method for jointly (1) automatically extracting word- and report-level features and (2) inferring underspecified information from EHRs. Our approach accomplishes these two tasks jointly by combining recent advances in deep neural learning with access to textual data in electroencephalogram (EEG) reports. We evaluate the performance of our model on the problem of inferring the neurologist's overall impression (normal or abnormal) from electroencephalogram (EEG) reports and report an accuracy of 91.4%, precision of 94.4%, recall of 91.2%, and F1 measure of 92.8% (a 40% improvement over the performance obtained using Doc2Vec). These promising results demonstrate the power of our approach, while error analysis reveals remaining obstacles as well as areas for future improvement. PMID:28815118
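
    The model described above is a deep neural network; as a runnable stand-in for the task itself (inferring the overall impression from report text), a bag-of-words baseline can be sketched as follows. The report snippets and labels are hypothetical:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# hypothetical EEG report snippets with the neurologist's overall impression
reports = ["posterior dominant rhythm of 10 Hz, no epileptiform activity",
           "diffuse slowing with intermittent left temporal sharp waves"]
labels = ["normal", "abnormal"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(reports, labels)
print(clf.predict(["focal slowing and sharp waves over the right hemisphere"]))
```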

  17. Evaluation of risk communication in a mammography patient decision aid

    PubMed Central

    Klein, Krystal A.; Watson, Lindsey; Ash, Joan S.; Eden, Karen B.

    2016-01-01

    Objectives We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified the perceived importance of numeric risk information in medical decision making. Methods Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest–posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Results Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis we identified themes describing why participants value quantitative risk information, as well as obstacles to understanding. We describe ways in which the most complicated graphic was incompletely comprehended. Conclusions Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including the potential for confusion between statistics, should be identified and mitigated in PtDA design. Practice implications Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of the benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. PMID:26965020
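
    The positive predictive value the participants were asked to estimate follows from Bayes' rule; a short worked sketch with purely illustrative screening numbers shows why the correct answer is unintuitively small:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule: the probability that a
    positive screening result indicates disease."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# illustrative (hypothetical) values, not figures from the study
print(round(ppv(sensitivity=0.85, specificity=0.92, prevalence=0.005), 3))
# ~0.051: most positives are false positives, which is the intuition the
# pictographs in a decision aid are meant to convey
```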

  18. Saturation-Transfer Difference (STD) NMR: A Simple and Fast Method for Ligand Screening and Characterization of Protein Binding

    ERIC Educational Resources Information Center

    Viegas, Aldino; Manso, Joao; Nobrega, Franklin L.; Cabrita, Eurico J.

    2011-01-01

    Saturation transfer difference (STD) NMR has emerged as one of the most popular ligand-based NMR techniques for the study of protein-ligand interactions. The success of this technique is a consequence of its robustness and the fact that it is focused on the signals of the ligand, without any need of processing NMR information about the receptor…

  19. Composite Materials Handbook. Volume 1. Polymer Matrix Composites Guidelines for Characterization of Structural Materials

    DTIC Science & Technology

    2002-06-17

    Fragmented excerpt (extraction artifact); recoverable phrases: methods of the power law type (References 6.8.6.1(h) and (i)); attempts to use fracture mechanics based methods for predicting failure; participation in the MIL-HDBK-17 coordination activity; "All information and data contained in this handbook have been coordinated with industry and the U.S. ..."; table-of-contents entries for statistically-based properties and issues of data equivalence.

  20. Data normalization in biosurveillance: an information-theoretic approach.

    PubMed

    Peter, William; Najmi, Amir H; Burkom, Howard

    2007-10-11

    An approach to identifying public health threats by characterizing syndromic surveillance data in terms of their surprisability is discussed. Surprisability in our model is measured by assigning a probability distribution to a time series and then calculating its entropy, leading to a straightforward designation of an alert. An initial application of our method is to investigate the applicability of using suitably normalized syndromic counts (i.e., proportions) to improve early event detection.
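
    A minimal sketch of the alerting idea, assuming the distribution is taken across syndrome categories each day; the paper assigns a probability distribution to the time series itself, so this is an illustrative variant:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability distribution."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def surprisability_alerts(counts, z_thresh=3.0):
    """counts: (n_days, n_syndromes) daily syndromic counts. Each day's
    counts are normalized to proportions; a day alerts when the entropy of
    its proportions deviates strongly from the historical mean."""
    props = counts / counts.sum(axis=1, keepdims=True)
    h = np.array([entropy(p) for p in props])
    mu, sigma = h.mean(), h.std()
    return np.abs(h - mu) > z_thresh * max(sigma, 1e-12)   # boolean alerts
```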
