Sample records for "extract physical information"

  1. The methodology of semantic analysis for extracting physical effects

    NASA Astrophysics Data System (ADS)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects, and a new text-analysis algorithm is presented.

  2. Translations on USSR Science and Technology, Physical Sciences and Technology, Number 45

    DTIC Science & Technology

    1978-08-14

    …hundred pages. [Question] In the public’s perception a cosmonaut seems to be, in terms of physical and mental fitness, something like a superman. How … indicate how the original information was processed. Where no processing indicator is given, the information was summarized or extracted. Unfamiliar names …

  3. Data assimilation to extract soil moisture information from SMAP observations

    USDA-ARS's Scientific Manuscript database

    This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural Network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the NASA Catchment model over the contiguous United Sta...

  4. Mutual information, neural networks and the renormalization group

    NASA Astrophysics Data System (ADS)

    Koch-Janusz, Maciej; Ringel, Zohar

    2018-06-01

    Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains `slow' degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.
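
    As context for the real-space RG step described above, the sketch below performs one round of majority-rule block-spin coarse-graining on a 2D Ising configuration; the lattice size, block size, and random test configuration are illustrative assumptions, not the authors' neural-network procedure.

    ```python
    import numpy as np

    def block_spin(config, b=2):
        """One real-space RG step: majority-rule coarse-graining of an Ising
        configuration (+1/-1 spins) into b x b blocks."""
        L = config.shape[0]
        assert L % b == 0, "lattice size must be divisible by the block size"
        blocks = config.reshape(L // b, b, L // b, b).sum(axis=(1, 3))
        coarse = np.sign(blocks)                       # majority rule
        ties = coarse == 0                             # break ties randomly
        coarse[ties] = np.random.choice([-1, 1], size=ties.sum())
        return coarse.astype(int)

    # Illustrative use on a random (infinite-temperature) configuration.
    rng = np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(64, 64))
    print(spins.shape, "->", block_spin(spins).shape)
    ```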

  5. Physical data measurements and mathematical modelling of simple gas bubble experiments in glass melts

    NASA Technical Reports Server (NTRS)

    Weinberg, Michael C.

    1986-01-01

    In this work consideration is given to the problem of the extraction of physical data information from gas bubble dissolution and growth measurements. The discussion is limited to the analysis of the simplest experimental systems consisting of a single, one component gas bubble in a glassmelt. It is observed that if the glassmelt is highly under- (super-) saturated, then surface tension effects may be ignored, simplifying the task of extracting gas diffusivity values from the measurements. If, in addition, the bubble rise velocity is very small (or very large) the ease of obtaining physical property data is enhanced. Illustrations are given for typical cases.
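
    A worked illustration of the simplification noted above (hedged, not taken from the paper): in the quasi-static, highly undersaturated limit with surface tension and bubble rise neglected, the squared bubble radius decays roughly linearly in time, so a straight-line fit to R^2(t) gives a shrinkage constant from which a diffusivity can be estimated once the melt's saturation deficit and gas density are specified. The sketch below shows only that fitting step on synthetic data; the constant k and the data are hypothetical.

    ```python
    import numpy as np

    # Synthetic R(t) data (cm, s) mimicking diffusion-controlled shrinkage:
    # R^2(t) = R0^2 - k*t, where k bundles the diffusivity with the (assumed
    # known) saturation deficit and gas density of the specific melt.
    t = np.linspace(0.0, 500.0, 26)
    R0, k_true = 0.05, 2.0e-6
    R = np.sqrt(np.clip(R0**2 - k_true * t, 0.0, None))
    R += np.random.default_rng(1).normal(0.0, 5e-4, R.size)  # measurement noise

    # Least-squares fit of R^2 vs t; the slope magnitude estimates k.
    slope, intercept = np.polyfit(t, R**2, 1)
    k_est = -slope
    print(f"fitted shrinkage constant k = {k_est:.2e} cm^2/s (true {k_true:.1e})")
    # A diffusivity would follow from k once the concentration difference and
    # gas density are specified for the particular glass melt (not done here).
    ```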

  6. Future Directions of Nonlinear Dynamics in Physical and Biological Systems. (Physica D Nonlinear. Volume 68, Number 1)

    DTIC Science & Technology

    1993-09-15

    … and structure of the equations. The Lagrangian formulation gives us an extremum principle for the … and we can extract information for any speed of … How do we augment the DNLSE (4) to treat … obtained for arbitrary initial conditions, and a number of physical features have been extracted [12]. … [26] For references on this see e.g. N.F. Pedersen, in: SQUID 80, eds. H. Hahlbohm and H. … Dueholm and N.F. Pedersen, J. Appl. Phys. 60 (1986) 1447.

  7. Overview of image processing tools to extract physical information from JET videos

    NASA Astrophysics Data System (ADS)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the automatic detection of MARFE (multifaceted asymmetric radiation from the edge) occurrences, which precede disruptions in density limit discharges. An original spot detection method has been developed for large surveys of videos in JET, and for the assessment of the long term trends in their evolution. The analysis of JET IR videos, recorded during JET operation with the ITER-like wall, allows the retrieval of data and hence correlation of the evolution of spots properties with macroscopic events, in particular series of intentional disruptions.
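
    The optical-flow methodology mentioned above can be illustrated with a minimal dense-flow computation on two consecutive frames. The snippet below uses OpenCV's Farneback estimator on synthetic frames; it is a generic stand-in, not the JET analysis pipeline or its MPEG-compressed-domain approximation.

    ```python
    import cv2
    import numpy as np

    # Two synthetic 8-bit grayscale frames: a bright blob shifted by a few
    # pixels, standing in for consecutive frames of a moving plasma feature.
    frame0 = np.zeros((128, 128), dtype=np.uint8)
    frame1 = np.zeros_like(frame0)
    cv2.circle(frame0, (40, 64), 10, 255, -1)
    cv2.circle(frame1, (46, 64), 10, 255, -1)

    # Dense optical flow (Farneback): returns a per-pixel (dx, dy) field.
    flow = cv2.calcOpticalFlowFarneback(frame0, frame1, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # Average displacement inside the blob gives its apparent velocity (px/frame).
    mask = frame0 > 0
    print("mean flow in blob:", flow[mask].mean(axis=0))
    ```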

  8. The Extraction of Information From Visual Persistence

    ERIC Educational Resources Information Center

    Erwin, Donald E.

    1976-01-01

    This research sought to distinguish among three concepts of visual persistence by substituting the physical presence of the target stimulus while simultaneously inhibiting the formation of a persisting representation. Reportability of information about the stimuli was compared to a condition in which visual persistence was allowed to fully develop…

  9. EDGE COMPUTING AND CONTEXTUAL INFORMATION FOR THE INTERNET OF THINGS SENSORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Levente

    Interpreting sensor data requires knowledge about sensor placement and the surrounding environment. For a single sensor measurement it is easy to document the context by visual observation; however, for millions of sensors reporting data back to a server, the contextual information needs to be extracted automatically, either from data analysis or by leveraging complementary data sources. Data layers that overlap spatially or temporally with sensor locations can be used to extract the context and to validate the measurement. To minimize the amount of data transmitted through the internet while preserving signal information content, two methods are explored: computation at the edge and compressed sensing. We validate the above methods on wind and chemical sensor data to (1) eliminate redundant measurements from wind sensors and (2) extract the peak value of a chemical sensor measuring a methane plume. We present a general cloud-based framework to validate sensor data based on statistical and physical modeling and on contextual data extracted from geospatial data.

  10. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
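
    One simple arrival-time extraction scheme of the kind reviewed here compares consecutive inter-arrival intervals and keeps a bit only when they differ, a von Neumann-style discard of ties that yields unbiased bits for independent, identically distributed intervals. The sketch below applies it to simulated exponential photon gaps; it is a generic illustration, not a specific scheme analyzed in the paper.

    ```python
    import numpy as np

    def bits_from_intervals(intervals):
        """Compare successive inter-arrival times pairwise: 1 if the first of a
        pair is longer, 0 if shorter, discard equal pairs. For i.i.d. intervals
        the output bits are unbiased and independent."""
        pairs = intervals[: len(intervals) // 2 * 2].reshape(-1, 2)
        keep = pairs[:, 0] != pairs[:, 1]
        return (pairs[keep, 0] > pairs[keep, 1]).astype(np.uint8)

    rng = np.random.default_rng(42)
    dt = rng.exponential(scale=1.0, size=100_000)   # simulated photon gaps
    bits = bits_from_intervals(dt)
    print(len(bits), "bits, fraction of ones =", bits.mean())
    ```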

  11. Heat engine driven by purely quantum information.

    PubMed

    Park, Jung Jun; Kim, Kang-Hwan; Sagawa, Takahiro; Kim, Sang Wook

    2013-12-06

    The key question of this Letter is whether work can be extracted from a heat engine by using purely quantum mechanical information. If the answer is yes, what is its mathematical formula? First, by using a bipartite memory we show that the work extractable from a heat engine is bounded not only by the free energy change and the sum of the entropy change of an individual memory but also by the change of quantum mutual information contained inside the memory. We then find that the engine can be driven by purely quantum information, expressed as the so-called quantum discord, forming a part of the quantum mutual information. To confirm it, as a physical example we present the Szilard engine containing a diatomic molecule with a semipermeable wall.
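
    For orientation, the classical measurement-feedback bound that this Letter generalizes can be stated as follows (a standard textbook result, quoted as context rather than as the paper's own formula): the work extractable in an isothermal feedback process is limited by the free-energy change plus a mutual-information term,

    ```latex
    \[
      W_{\mathrm{ext}} \;\le\; -\Delta F \;+\; k_{\mathrm B} T \, I(S\!:\!M),
    \]
    ```

    where I(S:M) is the information the memory M holds about the system S. The Letter's contribution is to refine the information term for a bipartite quantum memory, isolating the part expressible as quantum discord.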

  12. The Grasp of Physics Concepts of Motion: Identifying Particular Patterns in Students' Thinking

    ERIC Educational Resources Information Center

    Obaidat, Ihab; Malkawi, Ehab

    2009-01-01

    We have investigated the grasp of some of the basic concepts of motion by students taking the introductory physics course in Mechanics at United Arab Emirates University (UAEU). We have developed a short research-based multiple-choice test where we were able to extract some information about the state of knowledge of the students. In general, the…

  13. Attending to suggestion and trance in the pediatric history and physical examination: a case study.

    PubMed

    Berberich, F Ralph

    2011-07-01

    Obtaining complete information lies at the heart of accurate diagnosis in all healthcare fields. Extracting information is a time-honored purpose of the history and physical examination. Practitioners may not be aware that these functions also provide opportunities to impart positive verbal and nonverbal suggestions. Paying attention to language promotes patient self-mastery and helps forge a therapeutic alliance for successful outcomes. Principles taught in hypnosis workshops can also help the practitioner avoid negative, undermining suggestions that could diminish diagnostic and therapeutic effectiveness.

  15. Two-dimensional thermal video analysis of offshore bird and bat flight

    DOE PAGES

    Matzner, Shari; Cullinan, Valerie I.; Duberstein, Corey A.

    2015-09-11

    Thermal infrared video can provide essential information about bird and bat presence and activity for risk assessment studies, but the analysis of recorded video can be time-consuming and may not extract all of the available information. Automated processing makes continuous monitoring over extended periods of time feasible, and maximizes the information provided by video. This is especially important for collecting data in remote locations that are difficult for human observers to access, such as proposed offshore wind turbine sites. We present guidelines for selecting an appropriate thermal camera based on environmental conditions and the physical characteristics of the target animals. We developed new video image processing algorithms that automate the extraction of bird and bat flight tracks from thermal video, and that characterize the extracted tracks to support animal identification and behavior inference. The algorithms use a video peak store process followed by background masking and perceptual grouping to extract flight tracks. The extracted tracks are automatically quantified in terms that could then be used to infer animal type and possibly behavior. The developed automated processing generates results that are reproducible and verifiable, and reduces the total amount of video data that must be retained and reviewed by human experts. Finally, we suggest models for interpreting thermal imaging information.
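
    The processing chain named in this abstract (video peak store, then background masking, then grouping into tracks) can be sketched in a few lines. The version below uses a pixel-wise maximum over frames, a median background, and connected-component labeling as crude stand-ins for the authors' algorithms; the synthetic data and threshold are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    def extract_track_blobs(frames, thresh=30):
        """frames: (T, H, W) uint8 thermal video.
        Returns labeled connected components of the peak-store image after
        background removal, a crude proxy for flight-track extraction."""
        peak = frames.max(axis=0).astype(np.int16)         # video peak store
        background = np.median(frames, axis=0).astype(np.int16)
        residual = np.clip(peak - background, 0, None)     # background masking
        labels, n = ndimage.label(residual > thresh)       # grouping proxy
        return labels, n

    # Tiny synthetic example: a warm target moving across a cold background.
    T, H, W = 20, 64, 64
    frames = np.full((T, H, W), 10, dtype=np.uint8)
    for t in range(T):
        frames[t, 30:34, 2 + 3 * t: 6 + 3 * t] = 200
    labels, n = extract_track_blobs(frames)
    print("connected track segments found:", n)
    ```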

  16. Two-dimensional thermal video analysis of offshore bird and bat flight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Cullinan, Valerie I.; Duberstein, Corey A.

    Thermal infrared video can provide essential information about bird and bat presence and activity for risk assessment studies, but the analysis of recorded video can be time-consuming and may not extract all of the available information. Automated processing makes continuous monitoring over extended periods of time feasible, and maximizes the information provided by video. This is especially important for collecting data in remote locations that are difficult for human observers to access, such as proposed offshore wind turbine sites. We present guidelines for selecting an appropriate thermal camera based on environmental conditions and the physical characteristics of the target animals. We developed new video image processing algorithms that automate the extraction of bird and bat flight tracks from thermal video, and that characterize the extracted tracks to support animal identification and behavior inference. The algorithms use a video peak store process followed by background masking and perceptual grouping to extract flight tracks. The extracted tracks are automatically quantified in terms that could then be used to infer animal type and possibly behavior. The developed automated processing generates results that are reproducible and verifiable, and reduces the total amount of video data that must be retained and reviewed by human experts. Finally, we suggest models for interpreting thermal imaging information.

  17. SPECTRa-T: machine-based data extraction and semantic searching of chemistry e-theses.

    PubMed

    Downing, Jim; Harvey, Matt J; Morgan, Peter B; Murray-Rust, Peter; Rzepa, Henry S; Stewart, Diana C; Tonge, Alan P; Townsend, Joe A

    2010-02-22

    The SPECTRa-T project has developed text-mining tools to extract named chemical entities (NCEs), such as chemical names and terms, and chemical objects (COs), e.g., experimental spectral assignments and physical chemistry properties, from electronic theses (e-theses). Although NCEs were readily identified within the two major document formats studied, only the use of structured documents enabled identification of chemical objects and their association with the relevant chemical entity (e.g., systematic chemical name). A corpus of theses was analyzed and it is shown that a high degree of semantic information can be extracted from structured documents. This integrated information has been deposited in a persistent Resource Description Framework (RDF) triple-store that allows users to conduct semantic searches. The strengths and weaknesses of several document formats are reviewed.

  18. Cost-Effectiveness of Non-Invasive and Non-Pharmacological Interventions for Low Back Pain: a Systematic Literature Review.

    PubMed

    Andronis, Lazaros; Kinghorn, Philip; Qiao, Suyin; Whitehurst, David G T; Durrell, Susie; McLeod, Hugh

    2017-04-01

    Low back pain (LBP) is a major health problem, having a substantial effect on people's quality of life and placing a significant economic burden on healthcare systems and, more broadly, societies. Many interventions to alleviate LBP are available but their cost effectiveness is unclear. To identify, document and appraise studies reporting on the cost effectiveness of non-invasive and non-pharmacological treatment options for LBP. Relevant studies were identified through systematic searches in bibliographic databases (EMBASE, MEDLINE, PsycINFO, Cochrane Library, CINAHL and the National Health Service Economic Evaluation Database), 'similar article' searches and reference list scanning. Study selection was carried out by three assessors, independently. Study quality was assessed using the Consensus on Health Economic Criteria checklist. Data were extracted using customized extraction forms. Thirty-three studies were identified. Study interventions were categorised as: (1) combined physical exercise and psychological therapy, (2) physical exercise therapy only, (3) information and education, and (4) manual therapy. Interventions assessed within each category varied in terms of their components and delivery. In general, combined physical and psychological treatments, information and education interventions, and manual therapies appeared to be cost effective when compared with the study-specific comparators. There is inconsistent evidence around the cost effectiveness of physical exercise programmes as a whole, with yoga, but not group exercise, being cost effective. The identified evidence suggests that combined physical and psychological treatments, medical yoga, information and education programmes, spinal manipulation and acupuncture are likely to be cost-effective options for LBP.

  19. Random bits, true and unbiased, from atmospheric turbulence

    PubMed Central

    Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo

    2014-01-01

    Random numbers represent a fundamental ingredient for secure communications and numerical simulation, as well as for games and, more generally, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation in strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extracting algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499

  20. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may introduce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud contaminated radiances: (1) cloud-clearing using spatially adjacent cloud contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field of view case. However, the accuracy of the clear air radiance signal depends upon the cloud height and optical property uniformity across the two fields of view used in the cloud clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud top levels, the accuracy near the cloud top level being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used in the profile retrieval process.
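
    The adjacent-field-of-view cloud clearing described in option (1) is commonly written with the classic N* construction: assuming two neighbouring fields of view share the same clear-sky and cloudy radiances but have different cloud fractions, the clear radiance is recovered by linear extrapolation. As a hedged reminder of the standard formulation (not transcribed from this paper):

    ```latex
    \[
      \hat R_{\mathrm{clear}} \;=\; R_1 + \eta\,(R_1 - R_2),
      \qquad
      \eta \;=\; \frac{\alpha_1}{\alpha_2 - \alpha_1},
    \]
    ```

    where R_1 and R_2 are the radiances measured in the two fields of view and alpha_1 < alpha_2 are their cloud fractions. The requirement that cloud height and optical properties be uniform across the pair is exactly the accuracy limitation the abstract flags.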

  1. Remote sensing and extractable biological resources

    NASA Technical Reports Server (NTRS)

    Cronin, L. E.

    1972-01-01

    The nature and quantity of extractable biological resources available in the Chesapeake Bay are discussed. The application of miniaturized radio sensors to track the movement of fish and birds is described. The specific uses of remote sensors for detecting and mapping areas of algae, red tide, thermal pollution, and vegetation beds are presented. The necessity for obtaining information on the physical, chemical, and meteorological features of the entire bay in order to provide improved resources management is emphasized.

  2. Information retrieval and terminology extraction in online resources for patients with diabetes.

    PubMed

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means for information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in a foreign and/or native language) when searching for information on self-education in basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or in document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, aiming to detect the role of terminology in online resources, is conducted on English and Croatian manuals and Croatian online texts, and is divided into three interrelated parts: i) comparison of professional and popular terminology use, ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts, and iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates are evaluated by comparison with three types of reference lists: a list created by a professional medical person, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods on the English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall, precision and F-measure.

  3. Effective Information Extraction Framework for Heterogeneous Clinical Reports Using Online Machine Learning and Controlled Vocabularies

    PubMed Central

    Zheng, Shuai; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A

    2017-01-01

    Background Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback to improve the extraction algorithm in real time. Objective Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. Methods A clinical information extraction system IDEAL-X has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Results Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports—each combines history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. Conclusions IDEAL-X adopts a unique online machine learning–based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. PMID:28487265

  4. Recognition techniques for extracting information from semistructured documents

    NASA Astrophysics Data System (ADS)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are increasingly widely employed, with demand driven also by new norms sanctioning the legal value of digital documents, provided they are stored on physically unalterable media. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those of magnetic memories. The remaining bottleneck in these systems is indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine-supported to a large degree, with evident advantages both in the organization of the work and in extracting information, providing data that are much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  5. Standardized Patients versus Volunteer Patients for Physical Therapy Students' Interviewing Practice: A Pilot Study.

    PubMed

    Murphy, Sue; Imam, Bita; MacIntyre, Donna L

    2015-01-01

    To compare the use of standardized patients (SPs) and volunteer patients (VPs) for physical therapy students' interviewing practice in terms of students' perception and overall costs. Students in the Master of Physical Therapy programme (n=80) at a Canadian university were divided into 20 groups of 4 and were randomly assigned to interview either an SP (10 groups) or a VP (10 groups). Students completed a survey about their perception of the usefulness of the activity and the ease and depth of information extraction. Survey responses as well as costs of the interview exercise were compared between SP and VP groups. No statistically significant between-groups difference was found for the majority of survey items. The cost of using an SP was $148, versus $50 for a VP. Students' perceptions of the usefulness of the activity in helping them to develop their interview skills and of the ease and depth of extracting information were similar for both SPs and VPs. Because the cost of using an SP is about three times that of using a VP, using VPs seems to be a more cost-effective option.

  6. Analogy between gambling and measurement-based work extraction

    NASA Astrophysics Data System (ADS)

    Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri

    2016-04-01

    In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e. using ‘information engines’. However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g. due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.
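
    The quantitative core of the analogy can be summarized by the two standard bounds it connects (stated here from the general literature as context): in the idealized horse-race setting, side information raises the optimal Kelly doubling rate by the mutual information, while in the thermodynamic setting measurement-based control raises the extractable work by the corresponding amount,

    ```latex
    \[
      \Delta W_{\mathrm{Kelly}} \;=\; I(X;Y)\ \text{bits per bet},
      \qquad
      W_{\mathrm{ext}} \;\le\; k_{\mathrm B} T \ln 2 \; I(X;Y),
    \]
    ```

    so one bit of side information is worth one extra wealth doubling in the gambling picture and at most k_B T ln 2 of work in the information-engine picture.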

  7. DNA confinement in nanochannels: physics and biological applications

    NASA Astrophysics Data System (ADS)

    Reisner, Walter; Pedersen, Jonas N.; Austin, Robert H.

    2012-10-01

    DNA is the central storage molecule of genetic information in the cell, and reading that information is a central problem in biology. While sequencing technology has made enormous advances over the past decade, there is growing interest in platforms that can read out genetic information directly from long single DNA molecules, with the ultimate goal of single-cell, single-genome analysis. Such a capability would obviate the need for ensemble averaging over heterogeneous cellular populations and eliminate uncertainties introduced by cloning and molecular amplification steps (thus enabling direct assessment of the genome in its native state). In this review, we will discuss how the information contained in genomic-length single DNA molecules can be accessed via physical confinement in nanochannels. Due to self-avoidance interactions, DNA molecules will stretch out when confined in nanochannels, creating a linear unscrolling of the genome along the channel for analysis. We will first review the fundamental physics of DNA nanochannel confinement—including the effect of varying ionic strength—and then discuss recent applications of these systems to genomic mapping. Apart from the intense biological interest in extracting linear sequence information from elongated DNA molecules, from a physics view these systems are fascinating as they enable probing of single-molecule conformation in environments with dimensions that intersect key physical length-scales in the 1 nm to 100 µm range.

  8. DNA confinement in nanochannels: physics and biological applications.

    PubMed

    Reisner, Walter; Pedersen, Jonas N; Austin, Robert H

    2012-10-01

    DNA is the central storage molecule of genetic information in the cell, and reading that information is a central problem in biology. While sequencing technology has made enormous advances over the past decade, there is growing interest in platforms that can read out genetic information directly from long single DNA molecules, with the ultimate goal of single-cell, single-genome analysis. Such a capability would obviate the need for ensemble averaging over heterogeneous cellular populations and eliminate uncertainties introduced by cloning and molecular amplification steps (thus enabling direct assessment of the genome in its native state). In this review, we will discuss how the information contained in genomic-length single DNA molecules can be accessed via physical confinement in nanochannels. Due to self-avoidance interactions, DNA molecules will stretch out when confined in nanochannels, creating a linear unscrolling of the genome along the channel for analysis. We will first review the fundamental physics of DNA nanochannel confinement--including the effect of varying ionic strength--and then discuss recent applications of these systems to genomic mapping. Apart from the intense biological interest in extracting linear sequence information from elongated DNA molecules, from a physics view these systems are fascinating as they enable probing of single-molecule conformation in environments with dimensions that intersect key physical length-scales in the 1 nm to 100 µm range.

  9. FY10 Report on Multi-scale Simulation of Solvent Extraction Processes: Molecular-scale and Continuum-scale Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, Kent E.; Frey, Kurt; Pereira, Candido

    2014-02-02

    This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulation on the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. Through combination of information gained through simulations at each of these two tiers, along with advanced techniques such as the Lattice Boltzmann Method (LBM) which can bridge these two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool, along with enabling optimized design and operation of extraction units, would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales will be described below. As the initial application of FELBM in the work performed during FY10 has been on annular mixing, it will be discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be in its use as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating exploration and development of droplet models including breakup and coalescence, which will be needed for the large-scale simulations where droplet-level physics cannot be resolved. In this area, it can have a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.

  10. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert; Lovely, David

    1999-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snap-shot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). Methods must also be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: (1) Shocks, (2) Vortex cores, (3) Regions of recirculation, (4) Boundary layers, (5) Wakes. Three papers and an initial specification for the FX (Fluid eXtraction tool kit) Programmer's Guide are included. The papers, submitted to the AIAA Computational Fluid Dynamics Conference, are entitled: (1) Using Residence Time for the Extraction of Recirculation Regions, (2) Shock Detection from Computational Fluid Dynamics results and (3) On the Velocity Gradient Tensor and Fluid Feature Extraction.
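
    One widely used velocity-gradient-tensor diagnostic of the kind referenced in the third paper is the Q-criterion, which flags vortex cores where rotation dominates strain. The sketch below computes Q on a synthetic 2D field with numpy; it is a generic illustration, not the FX toolkit's implementation.

    ```python
    import numpy as np

    def q_criterion_2d(u, v, dx=1.0, dy=1.0):
        """Q = 0.5*(||Omega||^2 - ||S||^2) from the velocity gradient tensor;
        Q > 0 marks regions where rotation dominates strain (vortex cores)."""
        du_dy, du_dx = np.gradient(u, dy, dx)
        dv_dy, dv_dx = np.gradient(v, dy, dx)
        # Symmetric (strain) and antisymmetric (rotation) parts of grad(u).
        s_norm2 = du_dx**2 + dv_dy**2 + 0.5 * (du_dy + dv_dx) ** 2
        o_norm2 = 0.5 * (du_dy - dv_dx) ** 2
        return 0.5 * (o_norm2 - s_norm2)

    # Synthetic solid-body vortex: u = -y, v = x on a small grid.
    y, x = np.mgrid[-1:1:64j, -1:1:64j]
    Q = q_criterion_2d(-y, x, dx=2 / 63, dy=2 / 63)
    print("fraction of grid flagged as vortex core:", (Q > 0).mean())
    ```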

  11. Daemonic ergotropy: enhanced work extraction from quantum correlations

    NASA Astrophysics Data System (ADS)

    Francica, Gianluca; Goold, John; Plastina, Francesco; Paternostro, Mauro

    2017-03-01

    We investigate how the presence of quantum correlations can influence work extraction in closed quantum systems, establishing a new link between the field of quantum non-equilibrium thermodynamics and the one of quantum information theory. We consider a bipartite quantum system and we show that it is possible to optimize the process of work extraction, thanks to the correlations between the two parts of the system, by using an appropriate feedback protocol based on the concept of ergotropy. We prove that the maximum gain in the extracted work is related to the existence of quantum correlations between the two parts, quantified by either quantum discord or, for pure states, entanglement. We then illustrate our general findings on a simple physical situation consisting of a qubit system.
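
    The ergotropy on which the feedback protocol is based is the maximal work a cyclic unitary can extract from a state rho with Hamiltonian H; its standard definition (quoted as background, not from the paper itself) is

    ```latex
    \[
      \mathcal{W}(\rho, H) \;=\; \operatorname{tr}(\rho H)\;-\;\min_{U}\,\operatorname{tr}\!\left(U \rho\, U^{\dagger} H\right),
    \]
    ```

    and the "daemonic" gain studied here is the increase of this quantity when the unitary is conditioned on the outcome of a measurement performed on the correlated ancilla.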

  12. Thermodynamical detection of entanglement by Maxwell's demons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maruyama, Koji; Vedral, Vlatko; Morikoshi, Fumiaki

    2005-01-01

    Quantum correlation, or entanglement, is now believed to be an indispensable physical resource for certain tasks in quantum information processing, for which classically correlated states cannot be useful. Besides information processing, what kind of physical processes can exploit entanglement? In this paper, we show that there is indeed a more basic relationship between entanglement and its usefulness in thermodynamics. We derive an inequality showing that we can extract more work out of a heat bath via entangled systems than via classically correlated ones. We also analyze the work balance of the process as a heat engine, in connection with the second law of thermodynamics.

  13. Engaging the community to improve nutrition and physical activity among houses of worship.

    PubMed

    Evans, Kiameesha R; Hudson, Shawna V

    2014-03-13

    Obesity, physical inactivity, and poor nutrition have been linked to many chronic diseases. Research indicates that interventions in community-based settings such as houses of worship can build on attendees' trust to address health issues and help them make behavioral changes. New Brunswick, New Jersey, has low rates of physical activity and a high prevalence of obesity. An adapted community-based intervention was implemented there to improve nutrition and physical activity among people who attend houses of worship and expand and enhance the network of partners working with Rutgers Cancer Institute of New Jersey. An adapted version of Body & Soul: A Celebration of Healthy Living and Eating was created using a 3-phase model to 1) educate lay members on nutrition and physical activity, 2) provide sustainable change through the development of physical activity programming, and 3) increase access to local produce through collaborations with community partners. Nineteen houses of worship were selected for participation in this program. Houses of worship provided a questionnaire to a convenience sample of its congregation to assess congregants' physical activity levels and produce consumption behaviors at baseline using questions from the Health Information National Trends Survey instrument. This information was also used to inform future program activities. Community-based health education can be a promising approach when appropriate partnerships are identified, funding is adequate, ongoing information is extracted to inform future action, and there is an expectation from all parties of long-term engagement and capacity building.

  14. Kinetics from Replica Exchange Molecular Dynamics Simulations.

    PubMed

    Stelzl, Lukas S; Hummer, Gerhard

    2017-08-08

    Transitions between metastable states govern many fundamental processes in physics, chemistry and biology, from nucleation events in phase transitions to the folding of proteins. The free energy surfaces underlying these processes can be obtained from simulations using enhanced sampling methods. However, their altered dynamics makes kinetic and mechanistic information difficult or impossible to extract. Here, we show that, with replica exchange molecular dynamics (REMD), one can not only sample equilibrium properties but also extract kinetic information. For systems that strictly obey first-order kinetics, the procedure to extract rates is rigorous. For actual molecular systems whose long-time dynamics are captured by kinetic rate models, accurate rate coefficients can be determined from the statistics of the transitions between the metastable states at each replica temperature. We demonstrate the practical applicability of the procedure by constructing master equation (Markov state) models of peptide and RNA folding from REMD simulations.
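
    A minimal version of the rate-extraction step described here is to count transitions between metastable states along a temperature-resolved trajectory and convert the counts into a transition (Markov state) matrix. The sketch below builds a row-normalized transition matrix from an already-discretized state sequence; the toy trajectory is an assumption, and the replica-exchange bookkeeping handled in the paper is not reproduced.

    ```python
    import numpy as np

    def transition_matrix(states, n_states, lag=1):
        """Row-normalized transition matrix estimated at a given lag time
        from a 1D sequence of integer state labels."""
        counts = np.zeros((n_states, n_states))
        for i, j in zip(states[:-lag], states[lag:]):
            counts[i, j] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

    # Toy two-state trajectory with rare 0 <-> 1 transitions.
    rng = np.random.default_rng(0)
    traj, state = [], 0
    for _ in range(10_000):
        if rng.random() < 0.01:          # small switching probability per step
            state = 1 - state
        traj.append(state)
    T = transition_matrix(np.array(traj), n_states=2)
    print(T)                              # off-diagonal entries ~ 0.01 per step
    ```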

  15. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  16. A fresh approach to forecasting in astroparticle physics and dark matter searches

    NASA Astrophysics Data System (ADS)

    Edwards, Thomas D. P.; Weniger, Christoph

    2018-02-01

    We present a toolbox of new techniques and concepts for the efficient forecasting of experimental sensitivities. These are applicable to a large range of scenarios in (astro-)particle physics, and based on the Fisher information formalism. Fisher information provides an answer to the question 'what is the maximum extractable information from a given observation?'. It is a common tool for the forecasting of experimental sensitivities in many branches of science, but rarely used in astroparticle physics or searches for particle dark matter. After briefly reviewing the Fisher information matrix of general Poisson likelihoods, we propose very compact expressions for estimating expected exclusion and discovery limits ('equivalent counts method'). We demonstrate by comparison with Monte Carlo results that they remain surprisingly accurate even deep in the Poisson regime. We show how correlated background systematics can be efficiently accounted for by a treatment based on Gaussian random fields. Finally, we introduce the novel concept of Fisher information flux. It can be thought of as a generalization of the commonly used signal-to-noise ratio, while accounting for the non-local properties and saturation effects of background and instrumental uncertainties. It is a powerful and flexible tool ready to be used as core concept for informed strategy development in astroparticle physics and searches for particle dark matter.
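
    The Poisson-likelihood Fisher matrix the authors build on has a familiar closed form (a standard expression, included for reference rather than transcribed from the paper): for expected counts mu_k(theta) in bins k,

    ```latex
    \[
      \mathcal{I}_{ij}(\theta) \;=\; \sum_{k} \frac{1}{\mu_k(\theta)}\,
      \frac{\partial \mu_k}{\partial \theta_i}\,
      \frac{\partial \mu_k}{\partial \theta_j},
    \]
    ```

    whose inverse gives the Cramér-Rao bound on the parameter covariance and underlies the 'equivalent counts' limit estimates described in the abstract.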

  17. Can administrative claim file review be used to gather physical therapy, occupational therapy, and psychology payment data and functional independence measure scores? Implications for rehabilitation providers in the private health sector.

    PubMed

    Riis, Viivi; Jaglal, Susan; Boschen, Kathryn; Walker, Jan; Verrier, Molly

    2011-01-01

    Rehabilitation costs for spinal-cord injury (SCI) are increasingly borne by Canada's private health system. Because of poor outcomes, payers are questioning the value of their expenditures, but there is a paucity of data informing analysis of rehabilitation costs and outcomes. This study evaluated the feasibility of using administrative claim file review to extract rehabilitation payment data and functional status for a sample of persons with work-related SCI. Researchers reviewed 28 administrative e-claim files for persons who sustained a work-related SCI between 1996 and 2000. Payment data were extracted for physical therapy (PT), occupational therapy (OT), and psychology services. Functional Independence Measure (FIM) scores were targeted as a surrogate measure for functional outcome. Feasibility was tested using an existing approach for evaluating health services data. The process of administrative e-claim file review was not practical for extraction of the targeted data. While administrative claim files contain some rehabilitation payment and outcome data, in their present form the data are not suitable to inform rehabilitation services research. A new strategy to standardize collection, recording, and sharing of data in the rehabilitation industry should be explored as a means of promoting best practices.

  18. Effective Information Extraction Framework for Heterogeneous Clinical Reports Using Online Machine Learning and Controlled Vocabularies.

    PubMed

    Zheng, Shuai; Lu, James J; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A; Wang, Fusheng

    2017-05-09

    Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback to improve the extraction algorithm in real time. Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. A clinical information extraction system IDEAL-X has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports-each combines history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. ©Shuai Zheng, James J Lu, Nima Ghasemzadeh, Salim S Hayek, Arshed A Quyyumi, Fusheng Wang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 09.05.2017.

  19. Quantum Szilard engines with arbitrary spin.

    PubMed

    Zhuang, Zekun; Liang, Shi-Dong

    2014-11-01

    The quantum Szilard engine (QSZE) is a conceptual quantum engine for understanding the fundamental physics of quantum thermodynamics and information physics. We generalize the QSZE to an arbitrary spin case, i.e., a spin QSZE (SQSZE), and we systematically study the basic physical properties of both fermion and boson SQSZEs in a low-temperature approximation. We give the analytic formulation of the total work. For the fermion SQSZE, the work might be absorbed from the environment, and the change rate of the work with temperature exhibits periodicity and even-odd oscillation, which is a generalization of a spinless QSZE. It is interesting that the average absorbed work oscillates regularly and periodically in a large-number limit, which implies that the average absorbed work in a fermion SQSZE is neither an intensive quantity nor an extensive quantity. The phase diagrams of both fermion and boson SQSZEs give the SQSZE doing positive or negative work in the parameter space of the temperature and the particle number of the system, but they have different behaviors because the spin degrees of the fermion and the boson play different roles in their configuration states and corresponding statistical properties. The critical temperature of phase transition depends sensitively on the particle number. By using Landauer's erasure principle, we give the erasure work in a thermodynamic cycle, and we define an efficiency (we refer to it as information-work efficiency) to measure the engine's ability of utilizing information to extract work. We also give the conditions under which the maximum extracted work and highest information-work efficiencies for fermion and boson SQSZEs can be achieved.

  20. Constraints on the phase gamma and new physics from B --> kpi decays

    PubMed

    He; Hsueh; Shi

    2000-01-03

    Recent results from CLEO on B-->Kpi indicate that the phase gamma may be substantially different from that obtained from other fits to the KM matrix elements in the standard model. We show that gamma extracted using B-->Kpi,pipi is sensitive to new physics occurring at loop level. It provides a powerful method to probe new physics in electroweak penguin interactions. Using effects due to anomalous gauge couplings as an example, we show that within the allowed ranges for these couplings, information about gamma obtained from B-->Kpi,pipi can be very different from the standard model prediction.

  1. Multispectral system analysis through modeling and simulation

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Gleason, J. M.; Cicone, R. C.

    1977-01-01

    The design and development of multispectral remote sensor systems and associated information extraction techniques should be optimized under the physical and economic constraints encountered and yet be effective over a wide range of scene and environmental conditions. Direct measurement of the full range of conditions to be encountered can be difficult, time consuming, and costly. Simulation of multispectral data by modeling scene, atmosphere, sensor, and data classifier characteristics is set forth as a viable alternative, particularly when coupled with limited sets of empirical measurements. A multispectral system modeling capability is described. Use of the model is illustrated for several applications - interpretation of remotely sensed data from agricultural and forest scenes, evaluating atmospheric effects in Landsat data, examining system design and operational configuration, and development of information extraction techniques.

  3. Thermal feature extraction of servers in a datacenter using thermal image registration

    NASA Astrophysics Data System (ADS)

    Liu, Hang; Ran, Jian; Xie, Ting; Gao, Shan

    2017-09-01

    Thermal cameras provide fine-grained thermal information that enhances monitoring and enables automatic thermal management in large datacenters. Recent approaches employing mobile robots or thermal camera networks can already identify the physical locations of hot spots. Other distribution information used to optimize datacenter management can also be obtained automatically using pattern recognition technology. However, most of the features extracted from thermal images, such as shape and gradient, may be affected by changes in the position and direction of the thermal camera. This paper presents a method for extracting the thermal features of a hot spot or a server in a container datacenter. First, thermal and visual images are registered based on textural characteristics extracted from images acquired in datacenters. Then, the thermal distribution of each server is standardized. The features of a hot spot or server extracted from the standard distribution can reduce the impact of camera position and direction. The results of experiments show that image registration is efficient for aligning the corresponding visual and thermal images in the datacenter, and the standardization procedure reduces the impacts of camera position and direction on hot spot or server features.
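
    A minimal sketch of one plausible registration step (generic OpenCV feature matching and a homography, not the authors' textural-characteristics method; file names and parameters are hypothetical):

        # Align a visual image to a thermal image with ORB features + RANSAC homography.
        # Cross-modal matching is often fragile, so this is only an illustrative baseline.
        import cv2
        import numpy as np

        def register_visual_to_thermal(visual_gray, thermal_gray):
            orb = cv2.ORB_create(nfeatures=2000)
            kp1, des1 = orb.detectAndCompute(visual_gray, None)
            kp2, des2 = orb.detectAndCompute(thermal_gray, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
            src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            h, w = thermal_gray.shape[:2]
            return cv2.warpPerspective(visual_gray, H, (w, h))

        aligned = register_visual_to_thermal(
            cv2.imread("rack_visual.png", cv2.IMREAD_GRAYSCALE),    # hypothetical files
            cv2.imread("rack_thermal.png", cv2.IMREAD_GRAYSCALE))

    Once the two modalities share a coordinate frame, per-server regions can be cropped from the thermal image and standardized as described above.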

  4. The influence of geomorphology on the role of women at artisanal and small-scale mine sites

    USGS Publications Warehouse

    Malpeli, Katherine C.; Chirico, Peter G.

    2013-01-01

    The geologic and geomorphic expressions of a mineral deposit determine its location, size, and accessibility, characteristics which in turn greatly influence the success of artisans mining the deposit. Despite this critical information, which can be garnered through studying the surficial physical expression of a deposit, the geologic and geomorphic sciences have been largely overlooked in artisanal mining-related research. This study demonstrates that a correlation exists between the roles of female miners at artisanal diamond and gold mining sites in western and central Africa and the physical expression of the deposits. Typically, women perform ore processing and ancillary roles at mine sites. On occasion, however, women participate in the extraction process itself. Women were found to participate in the extraction of ore only when a deposit had a thin overburden layer, thus rendering the mineralized ore more accessible. When deposits required a significant degree of manual labour to access the ore due to thick overburden layers, women were typically relegated to other roles. The identification of this link encourages the establishment of an alternative research avenue in which the physical and social sciences merge to better inform policymakers, so that the most appropriate artisanal mining assistance programs can be developed and implemented.

  5. Deep Learning of Atomically Resolved Scanning Transmission Electron Microscopy Images: Chemical Identification and Tracking Local Transformations.

    PubMed

    Ziatdinov, Maxim; Dyck, Ondrej; Maksov, Artem; Li, Xufan; Sang, Xiahan; Xiao, Kai; Unocic, Raymond R; Vasudevan, Rama; Jesse, Stephen; Kalinin, Sergei V

    2017-12-26

    Recent advances in scanning transmission electron and scanning probe microscopies have opened exciting opportunities in probing materials' structural parameters and various functional properties in real space with angstrom-level precision. This progress has been accompanied by an exponential increase in the size and quality of data sets produced by microscopic and spectroscopic experimental techniques. These developments necessitate adequate methods for extracting relevant physical and chemical information from the large data sets, for which a priori information on the structures of various atomic configurations and lattice defects is limited or absent. Here we demonstrate an application of deep neural networks to extract information from atomically resolved images, including the location of the atomic species and the type of defects. We develop a "weakly supervised" approach that uses information on the coordinates of all atomic species in the image, extracted via a deep neural network, to identify a rich variety of defects that are not part of an initial training set. We further apply our approach to interpret complex atomic and defect transformations, including switching between different coordinations of silicon dopants in graphene as a function of time, formation of a peculiar silicon dimer with mixed 3-fold and 4-fold coordination, and the motion of a molecular "rotor". This deep learning-based approach resembles the logic of a human operator but can be scaled up, leading to a significant shift in the way information is extracted and analyzed from raw experimental data.
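
    As a hedged sketch of the kind of network such an "atom finder" could use (the architecture and layer sizes here are assumed for illustration, not the authors' model), a small fully convolutional network can map a STEM image to per-pixel atom-species probability maps:

        # Toy fully convolutional "atom finder": image in, per-pixel class probabilities out.
        from tensorflow.keras import layers, models

        def build_atom_finder(n_species=2, size=256):
            inp = layers.Input(shape=(size, size, 1))
            x = layers.Conv2D(16, 3, padding="same", activation="relu")(inp)
            x = layers.MaxPooling2D(2)(x)
            x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
            x = layers.UpSampling2D(2)(x)
            # n_species channels plus one background channel, softmax over classes
            out = layers.Conv2D(n_species + 1, 1, activation="softmax")(x)
            return models.Model(inp, out)

        model = build_atom_finder()
        model.compile(optimizer="adam", loss="categorical_crossentropy")

    Peak finding on the predicted probability maps then yields the atomic coordinates that a weakly supervised defect analysis could build on.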

  6. Using Information from the Electronic Health Record to Improve Measurement of Unemployment in Service Members and Veterans with mTBI and Post-Deployment Stress

    PubMed Central

    Dillahunt-Aspillaga, Christina; Finch, Dezon; Massengale, Jill; Kretzmer, Tracy; Luther, Stephen L.; McCart, James A.

    2014-01-01

    Objective: The purpose of this pilot study is 1) to develop an annotation schema and a training set of annotated notes to support the future development of a natural language processing (NLP) system to automatically extract employment information, and 2) to determine if information about employment status, goals and work-related challenges reported by service members and Veterans with mild traumatic brain injury (mTBI) and post-deployment stress can be identified in the Electronic Health Record (EHR). Design: Retrospective cohort study using data from selected progress notes stored in the EHR. Setting: Post-deployment Rehabilitation and Evaluation Program (PREP), an in-patient rehabilitation program for Veterans with TBI at the James A. Haley Veterans' Hospital in Tampa, Florida. Participants: Service members and Veterans with TBI who participated in the PREP program (N = 60). Main Outcome Measures: Documentation of employment status, goals, and work-related challenges reported by service members and recorded in the EHR. Results: Two hundred notes were examined and unique vocational information was found indicating a variety of self-reported employment challenges. Current employment status and future vocational goals along with information about cognitive, physical, and behavioral symptoms that may affect return-to-work were extracted from the EHR. The annotation schema developed for this study provides an excellent tool upon which NLP studies can be developed. Conclusions: Information related to employment status and vocational history is stored in text notes in the EHR system. Information stored in text does not lend itself to easy extraction or summarization for research and rehabilitation planning purposes. Development of NLP systems to automatically extract text-based employment information provides data that may improve the understanding and measurement of employment in this important cohort. PMID:25541956
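
    A hedged sketch of how annotated vocational vocabulary could seed a simple extractor (the term list and example note are hypothetical; this is not the study's annotation schema or NLP system):

        # Match employment-related phrases in a clinical note with spaCy's PhraseMatcher.
        import spacy
        from spacy.matcher import PhraseMatcher

        nlp = spacy.blank("en")
        matcher = PhraseMatcher(nlp.vocab, attr="LOWER")
        terms = ["unemployed", "return to work", "job goal", "vocational rehabilitation"]
        matcher.add("EMPLOYMENT", [nlp.make_doc(t) for t in terms])

        note = nlp("Veteran reports he is currently unemployed and has a job goal of welding.")
        for _, start, end in matcher(note):
            print(note[start:end].text)       # -> unemployed, job goal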

  7. The Past, Present and Future of the Meteorological Phenomena Identification Near the Ground (mPING) Project

    NASA Astrophysics Data System (ADS)

    Elmore, K. L.

    2016-12-01

    The Meteorological Phenomena Identification Near the Ground (mPING) project is an example of a crowd-sourced, citizen science effort to gather data of sufficient quality and quantity needed by new post-processing methods that use machine learning. Transportation and infrastructure are particularly sensitive to precipitation type in winter weather. We extract attributes from operational numerical forecast models and use them in a random forest to generate forecasts of winter precipitation type. We find that random forests applied to forecast soundings are effective at generating skillful forecasts of surface precipitation type (ptype), with considerably more skill than the current algorithms, especially for ice pellets and freezing rain. We also find that three very different forecast models yield similar overall results, showing that random forests are able to extract essentially equivalent information from different forecast models. We also show that the random forest for each model and each profile type is unique to the particular forecast model, and that random forests developed using a particular model suffer significant degradation when given attributes derived from a different model. This implies that no single algorithm can perform well across all forecast models. Clearly, random forests extract information unavailable to "physically based" methods because the physical information in the models does not appear as we expect. One interesting result is that results from the classic "warm nose" sounding profile are, by far, the most sensitive to the particular forecast model, yet this profile is also the one for which random forests are most skillful. Finally, a method for calibrating probabilities for each ptype using multinomial logistic regression is shown.
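
    A hedged sketch of the random-forest step (synthetic features and labels; the real attributes come from forecast-model soundings and mPING reports):

        # Classify winter precipitation type from sounding-derived attributes.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 8))        # hypothetical layer temperatures, wet-bulb depths, etc.
        y = rng.integers(0, 4, size=5000)     # 0=rain, 1=snow, 2=ice pellets, 3=freezing rain

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(X_tr, y_tr)
        proba = rf.predict_proba(X_te)        # per-type probabilities, ready for calibration

    The per-type probabilities could then be recalibrated with a multinomial logistic regression, as the abstract describes.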

  8. Light Microscopy at Maximal Precision

    NASA Astrophysics Data System (ADS)

    Bierbaum, Matthew; Leahy, Brian D.; Alemi, Alexander A.; Cohen, Itai; Sethna, James P.

    2017-10-01

    Microscopy is the workhorse of the physical and life sciences, producing crisp images of everything from atoms to cells well beyond the capabilities of the human eye. However, the analysis of these images is frequently little more accurate than manual marking. Here, we revolutionize the analysis of microscopy images, extracting all the useful information theoretically contained in a complex microscope image. Using a generic, methodological approach, we extract the information by fitting experimental images with a detailed optical model of the microscope, a method we call parameter extraction from reconstructing images (PERI). As a proof of principle, we demonstrate this approach with a confocal image of colloidal spheres, improving measurements of particle positions and radii by 10-100 times over current methods and attaining the maximum possible accuracy. With this unprecedented accuracy, we measure nanometer-scale colloidal interactions in dense suspensions solely with light microscopy, a previously impossible feat. Our approach is generic and applicable to imaging methods from brightfield to electron microscopy, where we expect accuracies of 1 nm and 0.1 pm, respectively.
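
    A hedged, toy version of the fitting idea behind PERI (a made-up one-sphere image model, not the microscope's actual optical model):

        # Fit a generative image model to data by nonlinear least squares and read off
        # the particle position, radius, amplitude and background from the best fit.
        import numpy as np
        from scipy.optimize import least_squares

        yy, xx = np.mgrid[0:64, 0:64]

        def sphere_image(params):
            x0, y0, r, amp, bg = params
            edge = np.clip(np.hypot(xx - x0, yy - y0) - r, -50, 50)
            return bg + amp / (1.0 + np.exp(edge))        # soft-edged disk

        observed = sphere_image([30.0, 34.0, 10.0, 1.0, 0.1])
        observed = observed + 0.02 * np.random.default_rng(1).normal(size=observed.shape)

        def residuals(p):
            return (sphere_image(p) - observed).ravel()

        fit = least_squares(residuals, x0=[32.0, 32.0, 8.0, 0.8, 0.0])
        print(fit.x)                                      # recovered parameters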

  9. Wavelets and their applications past and future

    NASA Astrophysics Data System (ADS)

    Coifman, Ronald R.

    2009-04-01

    As this is a conference on mathematical tools for defense, I would like to dedicate this talk to the memory of Louis Auslander, who, through his insights and visionary leadership, brought powerful new mathematics into DARPA and provided the main impetus for the development and insertion of wavelet-based processing in defense. My goal here is to describe the evolution of a stream of ideas in Harmonic Analysis, ideas which in the past have mostly been applied to the analysis and extraction of information from physical data, and which are now increasingly applied to organize and extract information and knowledge from any set of digital documents, from text to music to questionnaires. This form of signal processing on digital data is part of the future of wavelet analysis.

  10. Simultaneous parameter optimization of x-ray and neutron reflectivity data using genetic algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surendra, E-mail: surendra@barc.gov.in; Basu, Saibal

    2016-05-23

    X-ray and neutron reflectivity are two non-destructive techniques which provide a wealth of information on thickness, structure and interfacial properties at nanometer length scales. The combination of X-ray and neutron reflectivity is well suited for obtaining physical parameters of nanostructured thin films and superlattices. Neutrons provide a different contrast between the elements than X-rays and are also sensitive to the magnetization depth profile in thin films and superlattices. The real-space information is extracted by fitting a model for the structure of the thin film sample in reflectometry experiments. We have applied a genetic algorithm technique to extract depth-dependent structure and magnetism in thin film and multilayer systems by simultaneously fitting X-ray and neutron reflectivity data.
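
    A hedged sketch of a simultaneous evolutionary fit (a crude toy reflectivity model with a shared thickness parameter, not the authors' code or physics):

        # Fit X-ray and neutron "reflectivity" curves together with differential evolution,
        # sharing the structural parameter (thickness) and minimizing a combined chi-square.
        import numpy as np
        from scipy.optimize import differential_evolution

        q = np.linspace(0.01, 0.3, 200)                   # momentum transfer, 1/Angstrom

        def toy_reflectivity(q, thickness, roughness):
            fresnel = 1.0 / (1.0 + (q * 50.0) ** 4)
            return fresnel * (1.0 + 0.5 * np.cos(q * thickness)) * np.exp(-(q * roughness) ** 2)

        xray_data = toy_reflectivity(q, 300.0, 5.0)       # pretend measurements
        neutron_data = toy_reflectivity(q, 300.0, 8.0)

        def cost(p):
            thickness, rough_x, rough_n = p
            chi2_x = np.sum((toy_reflectivity(q, thickness, rough_x) - xray_data) ** 2)
            chi2_n = np.sum((toy_reflectivity(q, thickness, rough_n) - neutron_data) ** 2)
            return chi2_x + chi2_n

        result = differential_evolution(cost, bounds=[(100, 500), (1, 20), (1, 20)], seed=0)
        print(result.x)                                   # shared thickness + two roughnesses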

  11. MedEx/J: A One-Scan Simple and Fast NLP Tool for Japanese Clinical Texts.

    PubMed

    Aramaki, Eiji; Yano, Ken; Wakamiya, Shoko

    2017-01-01

    Because of the recent replacement of physical documents with electronic medical records (EMR), the importance of information processing in the medical field has increased. In light of this trend, we have been developing MedEx/J, which retrieves important information from Japanese-language medical reports. MedEx/J executes two tasks simultaneously: (1) term extraction, and (2) positive and negative event classification. We designate this approach as a one-scan approach, providing system simplicity and reasonable accuracy. MedEx/J performance on the two tasks is described herein: (1) term extraction (F(β=1) = 0.87) and (2) positive-negative classification (F(β=1) = 0.63). This paper also presents a discussion and explains remaining issues in the medical natural language processing field.
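
    A hedged, toy illustration of the one-scan idea (an English example with a made-up dictionary; MedEx/J itself works on Japanese text with its own resources):

        # Single left-to-right scan: extract terms and flag each as positive or negated.
        import re

        TERMS = {"fever", "cough", "headache"}            # hypothetical term dictionary
        NEGATION = {"no", "denies", "without"}

        def one_scan(text):
            results, negate = [], False
            for tok in re.findall(r"\w+", text.lower()):
                if tok in NEGATION:
                    negate = True
                elif tok in TERMS:
                    results.append((tok, "negative" if negate else "positive"))
                    negate = False
            return results

        print(one_scan("Patient denies fever but reports cough"))
        # [('fever', 'negative'), ('cough', 'positive')]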

  12. A Circuit Extraction System and Graphical Display for VLSI (Very Large Scale Integrated) Design.

    DTIC Science & Technology

    1989-12-01

    understandable as a net-list. The file contains information on the different physical layers of a polysilicon chip, not how these layers combine to form...

  13. Principal component greenness transformation in multitemporal agricultural Landsat data

    NASA Technical Reports Server (NTRS)

    Abotteen, R. A.

    1978-01-01

    A data compression technique for multitemporal Landsat imagery which extracts phenological growth pattern information for agricultural crops is described. The principal component greenness transformation was applied to multitemporal agricultural Landsat data for information retrieval. The transformation was favorable for applications in agricultural Landsat data analysis because of its physical interpretability and its relation to the phenological growth of crops. It was also found that the first and second greenness eigenvector components define a temporal small-grain trajectory and nonsmall-grain trajectory, respectively.
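
    A hedged sketch of a principal component transformation on multitemporal pixel vectors (synthetic data; the band counts and acquisition dates are hypothetical):

        # Project stacked multitemporal Landsat band values onto principal components;
        # the leading components act as greenness-like temporal trajectory features.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        pixels = rng.normal(size=(10000, 12))   # rows = pixels, cols = 4 bands x 3 dates

        pca = PCA(n_components=3)
        scores = pca.fit_transform(pixels)      # per-pixel component scores
        print(pca.explained_variance_ratio_)    # variance captured by each component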

  14. Single fibre strength of cellulosic fibre extracted from "Belatlan roots" plant

    NASA Astrophysics Data System (ADS)

    M. Hanis. A., H.; Majid, M. S. Abdul; Ridzuan, M. J. M.; Fahmi, I.

    2017-12-01

    The tensile strength of a fibre extracted from the "Belatlan Root" plant was investigated as a potential reinforcement material in polymeric composites. Following a retting process, the fibres were manually extracted from the roots of the "Belatlan" plant. The fibres were treated with 5%, 10%, 15%, and 20% wt. sodium hydroxide (NaOH) concentrations for 24 h. Single fibre tests were then performed in accordance with the ASTM D3822-07 standard. The surfaces of the fibres before and after treatment were observed with a metallurgical microscope (MT8100) and the physical properties were recorded. Physically, after treatment the fibre diameter decreased with increasing NaOH concentration. The results of the mechanical testing indicate that samples subjected to the 5% NaOH treatment yielded the highest tensile strength and elastic modulus, at 89.05 ± 2.75 MPa and 3.81 ± 0.09 GPa respectively, compared to untreated fibres. This represents an increase of almost 72% in tensile strength and 42% in elastic modulus. The findings provide preliminary support for incorporating the "Belatlan Root" fibre as a possible reinforcing material in composite structures.

  15. Contributions to the understanding of large-scale coherent structures in developing free turbulent shear flows

    NASA Technical Reports Server (NTRS)

    Liu, J. T. C.

    1986-01-01

    Advances in the mechanics of boundary layer flow are reported. The physical problem of large-scale coherent structures in real, developing free turbulent shear flows is addressed from the nonlinear aspects of hydrodynamic stability. Whether or not fine-grained turbulence is present, the problem lacks a small parameter. The problem is formulated on the basis of conservation principles, with the dynamics directed towards extracting the most physical information, although it is emphasized that the formulation must also involve approximations.

  16. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format.

    PubMed

    Ahmed, Zeeshan; Dandekar, Thomas

    2015-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medicinal imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge in implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product line architecture based bioinformatics tool, 'Mining Scientific Literature (MSL)', which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats including text, PDF, XML and image files. Hence, MSL is an easy-to-install and easy-to-use analysis tool for interpreting published scientific literature in PDF format.
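
    A hedged sketch of the two extraction steps using generic libraries (pdfminer.six and Tesseract via pytesseract; the file names are hypothetical and this is not the MSL implementation):

        # Pull the text layer out of a PDF, then OCR the text embedded in a figure image.
        from pdfminer.high_level import extract_text
        from PIL import Image
        import pytesseract

        pdf_text = extract_text("paper.pdf")                        # hypothetical input
        figure_text = pytesseract.image_to_string(Image.open("figure1.png"))

        print(pdf_text[:500])
        print(figure_text)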

  17. Framework for automatic information extraction from research papers on nanocrystal devices

    PubMed Central

    Yoshioka, Masaharu; Hara, Shinjiro; Newton, Marcus C

    2015-01-01

    Summary To support nanocrystal device development, we have been working on a computational framework to utilize information in research papers on nanocrystal devices. We developed an annotated corpus called “ NaDev” (Nanocrystal Device Development) for this purpose. We also proposed an automatic information extraction system called “NaDevEx” (Nanocrystal Device Automatic Information Extraction Framework). NaDevEx aims at extracting information from research papers on nanocrystal devices using the NaDev corpus and machine-learning techniques. However, the characteristics of NaDevEx were not examined in detail. In this paper, we conduct system evaluation experiments for NaDevEx using the NaDev corpus. We discuss three main issues: system performance, compared with human annotators; the effect of paper type (synthesis or characterization) on system performance; and the effects of domain knowledge features (e.g., a chemical named entity recognition system and list of names of physical quantities) on system performance. We found that overall system performance was 89% in precision and 69% in recall. If we consider identification of terms that intersect with correct terms for the same information category as the correct identification, i.e., loose agreement (in many cases, we can find that appropriate head nouns such as temperature or pressure loosely match between two terms), the overall performance is 95% in precision and 74% in recall. The system performance is almost comparable with results of human annotators for information categories with rich domain knowledge information (source material). However, for other information categories, given the relatively large number of terms that exist only in one paper, recall of individual information categories is not high (39–73%); however, precision is better (75–97%). The average performance for synthesis papers is better than that for characterization papers because of the lack of training examples for characterization papers. Based on these results, we discuss future research plans for improving the performance of the system. PMID:26665057

  18. Framework for automatic information extraction from research papers on nanocrystal devices.

    PubMed

    Dieb, Thaer M; Yoshioka, Masaharu; Hara, Shinjiro; Newton, Marcus C

    2015-01-01

    To support nanocrystal device development, we have been working on a computational framework to utilize information in research papers on nanocrystal devices. We developed an annotated corpus called " NaDev" (Nanocrystal Device Development) for this purpose. We also proposed an automatic information extraction system called "NaDevEx" (Nanocrystal Device Automatic Information Extraction Framework). NaDevEx aims at extracting information from research papers on nanocrystal devices using the NaDev corpus and machine-learning techniques. However, the characteristics of NaDevEx were not examined in detail. In this paper, we conduct system evaluation experiments for NaDevEx using the NaDev corpus. We discuss three main issues: system performance, compared with human annotators; the effect of paper type (synthesis or characterization) on system performance; and the effects of domain knowledge features (e.g., a chemical named entity recognition system and list of names of physical quantities) on system performance. We found that overall system performance was 89% in precision and 69% in recall. If we consider identification of terms that intersect with correct terms for the same information category as the correct identification, i.e., loose agreement (in many cases, we can find that appropriate head nouns such as temperature or pressure loosely match between two terms), the overall performance is 95% in precision and 74% in recall. The system performance is almost comparable with results of human annotators for information categories with rich domain knowledge information (source material). However, for other information categories, given the relatively large number of terms that exist only in one paper, recall of individual information categories is not high (39-73%); however, precision is better (75-97%). The average performance for synthesis papers is better than that for characterization papers because of the lack of training examples for characterization papers. Based on these results, we discuss future research plans for improving the performance of the system.

  19. Deep Learning of Atomically Resolved Scanning Transmission Electron Microscopy Images: Chemical Identification and Tracking Local Transformations

    DOE PAGES

    Ziatdinov, Maxim; Dyck, Ondrej; Maksov, Artem; ...

    2017-12-07

    Recent advances in scanning transmission electron and scanning probe microscopies have opened unprecedented opportunities in probing materials' structural parameters and various functional properties in real space with angstrom-level precision. This progress has been accompanied by an exponential increase in the size and quality of datasets produced by microscopic and spectroscopic experimental techniques. These developments necessitate adequate methods for extracting relevant physical and chemical information from the large datasets, for which a priori information on the structures of various atomic configurations and lattice defects is limited or absent. Here we demonstrate an application of deep neural networks to extracting information from atomically resolved images, including the location of the atomic species and the type of defects. We develop a "weakly supervised" approach that uses information on the coordinates of all atomic species in the image, extracted via a deep neural network, to identify a rich variety of defects that are not part of an initial training set. We further apply our approach to interpret complex atomic and defect transformations, including switching between different coordinations of silicon dopants in graphene as a function of time, formation of a peculiar silicon dimer with mixed 3-fold and 4-fold coordination, and the motion of a molecular "rotor". In conclusion, this deep learning based approach resembles the logic of a human operator but can be scaled up, leading to a significant shift in the way information is extracted and analyzed from raw experimental data.

  20. Deep Learning of Atomically Resolved Scanning Transmission Electron Microscopy Images: Chemical Identification and Tracking Local Transformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziatdinov, Maxim; Dyck, Ondrej; Maksov, Artem

    Recent advances in scanning transmission electron and scanning probe microscopies have opened unprecedented opportunities in probing materials' structural parameters and various functional properties in real space with angstrom-level precision. This progress has been accompanied by an exponential increase in the size and quality of datasets produced by microscopic and spectroscopic experimental techniques. These developments necessitate adequate methods for extracting relevant physical and chemical information from the large datasets, for which a priori information on the structures of various atomic configurations and lattice defects is limited or absent. Here we demonstrate an application of deep neural networks to extracting information from atomically resolved images, including the location of the atomic species and the type of defects. We develop a "weakly supervised" approach that uses information on the coordinates of all atomic species in the image, extracted via a deep neural network, to identify a rich variety of defects that are not part of an initial training set. We further apply our approach to interpret complex atomic and defect transformations, including switching between different coordinations of silicon dopants in graphene as a function of time, formation of a peculiar silicon dimer with mixed 3-fold and 4-fold coordination, and the motion of a molecular "rotor". In conclusion, this deep learning based approach resembles the logic of a human operator but can be scaled up, leading to a significant shift in the way information is extracted and analyzed from raw experimental data.

  1. The use of analytical sedimentation velocity to extract thermodynamic linkage.

    PubMed

    Cole, James L; Correia, John J; Stafford, Walter F

    2011-11-01

    For 25 years, the Gibbs Conference on Biothermodynamics has focused on the use of thermodynamics to extract information about the mechanism and regulation of biological processes. This includes the determination of equilibrium constants for macromolecular interactions by high precision physical measurements. These approaches further reveal thermodynamic linkages to ligand binding events. Analytical ultracentrifugation has been a fundamental technique in the determination of macromolecular reaction stoichiometry and energetics for 85 years. This approach is highly amenable to the extraction of thermodynamic couplings to small molecule binding in the overall reaction pathway. In the 1980s this approach was extended to the use of sedimentation velocity techniques, primarily by the analysis of tubulin-drug interactions by Na and Timasheff. This transport method necessarily incorporates the complexity of both hydrodynamic and thermodynamic nonideality. The advent of modern computational methods in the last 20 years has subsequently made the analysis of sedimentation velocity data for interacting systems more robust and rigorous. Here we review three examples where sedimentation velocity has been useful at extracting thermodynamic information about reaction stoichiometry and energetics. Approaches to extract linkage to small molecule binding and the influence of hydrodynamic nonideality are emphasized. These methods are shown to also apply to the collection of fluorescence data with the new Aviv FDS. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. The use of analytical sedimentation velocity to extract thermodynamic linkage

    PubMed Central

    Cole, James L.; Correia, John J.; Stafford, Walter F.

    2011-01-01

    For 25 years, the Gibbs Conference on Biothermodynamics has focused on the use of thermodynamics to extract information about the mechanism and regulation of biological processes. This includes the determination of equilibrium constants for macromolecular interactions by high precision physical measurements. These approaches further reveal thermodynamic linkages to ligand binding events. Analytical ultracentrifugation has been a fundamental technique in the determination of macromolecular reaction stoichiometry and energetics for 85 years. This approach is highly amenable to the extraction of thermodynamic couplings to small molecule binding in the overall reaction pathway. In the 1980’s this approach was extended to the use of sedimentation velocity techniques, primarily by the analysis of tubulin-drug interactions by Na and Timasheff. This transport method necessarily incorporates the complexity of both hydrodynamic and thermodynamic nonideality. The advent of modern computational methods in the last 20 years has subsequently made the analysis of sedimentation velocity data for interacting systems more robust and rigorous. Here we review three examples where sedimentation velocity has been useful at extracting thermodynamic information about reaction stoichiometry and energetics. Approaches to extract linkage to small molecule binding and the influence of hydrodynamic nonideality are emphasized. These methods are shown to also apply to the collection of fluorescence data with the new Aviv FDS. PMID:21703752

  3. Fractional, biodegradable and spectral characteristics of extracted and fractionated sludge extracellular polymeric substances.

    PubMed

    Wei, Liang-Liang; Wang, Kun; Zhao, Qing-Liang; Jiang, Jun-Qiu; Kong, Xiang-Juan; Lee, Duu-Jong

    2012-09-15

    Correlation between fractional, biodegradable and spectral characteristics of sludge extracellular polymeric substances (EPS) by different protocols has not been well established. This work extracted sludge EPS using alkaline extractants (NH₄OH and formaldehyde + NaOH) and physical protocols (ultrasonication, heating at 80 °C or cation exchange resin (CER)) and then fractionated the extracts using XAD-8/XAD-4 resins. The alkaline extractants yielded more sludge EPS than the physical protocols. However, the physical protocols extracted principally the hydrophilic components which were readily biodegradable by microorganisms. The alkaline extractants dissolved additional humic-like substances from sludge solids which were refractory in nature. Different extraction protocols preferably extracted EPS with distinct fractional, biodegradable and spectral characteristics which could be applied in specific usages. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Comparative Evaluation of Efficacy of Physics Forceps versus Conventional Forceps in Orthodontic Extractions: A Prospective Randomized Split Mouth Study.

    PubMed

    Patel, Harsh S; Managutti, Anil M; Menat, Shailesh; Agarwal, Arvind; Shah, Dishan; Patel, Jigar

    2016-07-01

    Tooth extraction is one of the most commonly performed procedures in dentistry. It is usually a traumatic procedure, often resulting in immediate destruction and loss of alveolar bone and surrounding soft tissues. Various instruments have been described to perform atraumatic extractions, which can prevent damage to the paradental structures. The recently developed physics forceps is one instrument claimed to perform atraumatic extractions. The aim of the present study was to compare the efficacy of physics forceps with conventional forceps in terms of operating time, prevention of marginal bone loss and soft tissue loss, postoperative pain, and postoperative complications following bilateral premolar extractions for orthodontic purposes. In this prospective split-mouth study, outcomes of the two groups (n = 42 premolars) requiring extraction of premolars for orthodontic treatment, using physics forceps and conventional forceps respectively, were compared. Clinical outcomes in the form of time taken; loss of buccal soft tissue and buccal cortical plate, based on an extraction defect classification system; postoperative pain; and other complications associated with extraction were recorded and compared. A statistically significant reduction in operating time was noted in the physics forceps group. Marginal bone loss and soft tissue loss were also significantly lower in the physics forceps group than in the conventional forceps group. However, there was no statistically significant difference in the severity of postoperative pain between the groups. The results of the present study suggest that physics forceps were more efficient than conventional forceps in reducing operating time and preventing marginal bone and soft tissue loss in orthodontically indicated premolar extractions.

  5. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2000-01-01

    In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense.

  6. Control volume based hydrocephalus research; analysis of human data

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements have been limited to understanding of the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics based framework. Clinical data obtained for analysis are discussed along with data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
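
    As a minimal sketch of the control-volume statement referred to above (standard textbook form, not the authors' exact formulation), the integral mass-conservation equation for a control volume CV bounded by a control surface CS is

        \frac{d}{dt} \int_{CV} \rho \, dV + \oint_{CS} \rho \, \mathbf{u} \cdot \mathbf{n} \, dA = 0

    so clinically measured flow waveforms, which determine the surface flux term, directly constrain the rate of change of fluid volume inside the chosen control volume.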

  7. Extraction of quantitative surface characteristics from AIRSAR data for Death Valley, California

    NASA Technical Reports Server (NTRS)

    Kierein-Young, K. S.; Kruse, F. A.

    1992-01-01

    Polarimetric Airborne Synthetic Aperture Radar (AIRSAR) data were collected for the Geologic Remote Sensing Field Experiment (GRSFE) over Death Valley, California, USA, in Sep. 1989. AIRSAR is a four-look, quad-polarization, three-frequency instrument. It collects measurements at C-band (5.66 cm), L-band (23.98 cm), and P-band (68.13 cm), and has a GIFOV of 10 meters and a swath width of 12 kilometers. Because the radar measures at three wavelengths, different scales of surface roughness are measured. Also, dielectric constants can be calculated from the data. The AIRSAR data were calibrated using in-scene trihedral corner reflectors to remove cross-talk and to calibrate the phase, amplitude, and co-channel gain imbalance. The calibration allows for the extraction of accurate values of rms surface roughness, dielectric constants, σ₀ backscatter, and polarization information. The radar data sets allow quantitative characterization of the small-scale surface structure of geologic units, providing information about the physical and chemical processes that control the surface morphology. Combining the quantitative information extracted from the radar data with other remotely sensed data sets allows discrimination, identification and mapping of geologic units that may be difficult to discern using conventional techniques.

  8. Prevalence of physical inactivity in Iran: a systematic review.

    PubMed

    Fakhrzadeh, Hossein; Djalalinia, Shirin; Mirarefin, Mojdeh; Arefirad, Tahereh; Asayesh, Hamid; Safiri, Saeid; Samami, Elham; Mansourian, Morteza; Shamsizadeh, Morteza; Qorbani, Mostafa

    2016-01-01

    Introduction: Physical inactivity is one of the most important risk factors for chronic diseases, including cardiovascular disease, cancer, and stroke. We aimed to conduct a systematic review of the prevalence of physical inactivity in Iran. Methods: We searched the international databases ISI, PubMed/Medline, and Scopus, and the national databases Irandoc, the Barakat knowledge network system, and the Scientific Information Database (SID). We collected data on the prevalence of physical inactivity by sex, age, province, and year. Quality assessment and data extraction were conducted independently by two research experts. There were no limitations on time or language. Results: We analyzed data on the prevalence of physical inactivity in the Iranian population. Our search strategy found 254 records, of which 185 were from international databases and the remaining 69 from national databases; after refining the data, 34 articles that met the eligibility criteria remained for data extraction. Of these, 9, 20, 2, and 3 studies were at the national, provincial, regional, and local levels, respectively. The estimates for inactivity ranged from approximately 30% to almost 70% and varied considerably between sexes and studied sub-groups. Conclusion: In Iran, most studies reported a high prevalence of physical inactivity. Our findings reveal heterogeneity in the reported values, often arising from differences in study design, measurement tools and methods, target groups, and sub-population sampling. These data do not permit aggregation for a comprehensive inference.

  9. Use of Visual Cues by Adults With Traumatic Brain Injuries to Interpret Explicit and Inferential Information.

    PubMed

    Brown, Jessica A; Hux, Karen; Knollman-Porter, Kelly; Wallace, Sarah E

    2016-01-01

    Concomitant visual and cognitive impairments following traumatic brain injuries (TBIs) may be problematic when the visual modality serves as a primary source for receiving information. Further difficulties comprehending visual information may occur when interpretation requires processing inferential rather than explicit content. The purpose of this study was to compare the accuracy with which people with and without severe TBI interpreted information in contextually rich drawings. Fifteen adults with and 15 adults without severe TBI. Repeated-measures between-groups design. Participants were asked to match images to sentences that either conveyed explicit (ie, main action or background) or inferential (ie, physical or mental inference) information. The researchers compared accuracy between participant groups and among stimulus conditions. Participants with TBI demonstrated significantly poorer accuracy than participants without TBI extracting information from images. In addition, participants with TBI demonstrated significantly higher response accuracy when interpreting explicit rather than inferential information; however, no significant difference emerged between sentences referencing main action versus background information or sentences providing physical versus mental inference information for this participant group. Difficulties gaining information from visual environmental cues may arise for people with TBI given their difficulties interpreting inferential content presented through the visual modality.

  10. The role of social support on physical activity behaviour in adolescent girls: a systematic review and meta-analysis.

    PubMed

    Laird, Yvonne; Fawkner, Samantha; Kelly, Paul; McNamee, Lily; Niven, Ailsa

    2016-07-07

    Adolescent girls have been targeted as a priority group for promoting physical activity levels however it is unclear how this can be achieved. There is some evidence to suggest that social support could impact the physical activity levels of adolescent girls, although the relationship is complex and not well understood. We aimed to systematically review and meta-analyse the relationship between social support and physical activity in adolescent girls, exploring how different types and providers of social support might influence the relationship. Articles were identified through a systematic search of the literature using 14 electronic databases, personal resources, grey literature, and reference lists of included studies and previous reviews. Search terms representing social support, physical activity and adolescent girls were identified and used in various combinations to form a search strategy which was adapted for different databases. Cross-sectional or longitudinal articles published in English that reported an association between social support and physical activity in adolescent girls between the ages of 10 to 19 years were included. Studies that focused only on clinical or overweight populations were excluded. Data extraction was carried out by one reviewer using an electronic extraction form. A random 25 % of included articles were selected for data extraction by a second reviewer to check fidelity. Risk of bias was assessed using a custom tool informed by the Critical Appraisal Skills Programme Cohort Study Checklist in conjunction with data extraction. Cross-sectional results were meta-analysed and longitudinal results were presented narratively. Small but significant associations between all available providers of total social support (except teachers) and physical activity were found (r = .14-.24). Small but significant associations were also identified for emotional, instrumental and modelling support for some providers of support (r = .10-.21). Longitudinal research supported the cross-sectional analyses. Many of the meta-analysis results suggested high heterogeneity and there was some evidence of publication bias, therefore, the meta-analysis results should be interpreted with caution. In conclusion, the meta-analysis results suggest that social support is not a strong predictor of physical activity in adolescent girls though parents and friends may have a role in enhancing PA. PROSPERO 2014: CRD42014006738.

  11. Perceptual Span Depends on Font Size during the Reading of Chinese Sentences

    ERIC Educational Resources Information Center

    Yan, Ming; Zhou, Wei; Shu, Hua; Kliegl, Reinhold

    2015-01-01

    The present study explored the perceptual span (i.e., the physical extent of an area from which useful visual information is extracted during a single fixation) during the reading of Chinese sentences in 2 experiments. In Experiment 1, we tested whether the rightward span can go beyond 3 characters when visually similar masks were used. Results…

  12. Signal processing methods for MFE plasma diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-02-01

    The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.

  13. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1998-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: shocks; vortex cores; regions of recirculation; boundary layers; and wakes.

  14. Defining photon channels in strong-field physics: the photon-phase Fourier representation

    NASA Astrophysics Data System (ADS)

    Zeng, Shuo; Zohrabi, Mohammad; Berry, Ben; Ablikim, Utuq; Kling, Nora; Severt, Travis; Jochim, Bethany; Carnes, Kevin; Ben-Itzhak, Itzik; Esry, Brett

    2014-05-01

    In strong-field physics, complex atomic and molecular dynamics can be steered by the carrier-envelope phase (CEP). The general theory formulated in Refs. provides a rigorous foundation upon which this understanding might be built. By recognizing the underlying periodicity of the time-dependent Schrödinger equation--and thus its solutions--in the CEP, all CEP effects can be understood as the interference of different photon channels. We will show that this understanding can be turned around to extract information on the photon channel by examining the CEP dependence. In particular, by taking the Fourier transform with respect to the CEP, photon channel information can be extracted from both theory and experiment. Through several examples, we will also show that this technique can be applied to any system and provides knowledge of the net numbers of photons absorbed--even in few-cycle pulses--that is not available in any other way. This work was supported by the Chemical Sciences, Geosciences, and Biosciences Division, Office of Basic Energy Sciences, Office of Science, U.S. Department of Energy under Grant No. DE-FG02-86ER13491. The PULSAR laser was provided by Grant No. DE-FG02-09.
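
    A hedged numerical sketch of the Fourier step (synthetic CEP-dependent data, not the experiment's analysis code):

        # Fourier transform an observable with respect to the carrier-envelope phase;
        # the n-th coefficient isolates interference between channels differing by n photons.
        import numpy as np

        phi = np.linspace(0, 2 * np.pi, 64, endpoint=False)              # CEP values
        signal = 1.0 + 0.4 * np.cos(phi) + 0.1 * np.cos(2 * phi + 0.3)   # toy CEP-dependent yield

        coeffs = np.fft.rfft(signal) / len(phi)
        for n, c in enumerate(coeffs[:4]):
            print(f"n = {n}: |c_n| = {abs(c):.3f}")       # strength of each photon-channel term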

  15. A New Self-Constrained Inversion Method of Potential Fields Based on Probability Tomography

    NASA Astrophysics Data System (ADS)

    Sun, S.; Chen, C.; WANG, H.; Wang, Q.

    2014-12-01

    The self-constrained inversion method of potential fields uses a priori information self-extracted from potential field data. Differing from external a priori information, the self-extracted information consists of parameters derived exclusively from the analysis of the gravity and magnetic data (Paoletti et al., 2013). Here we develop a new self-constrained inversion method based on probability tomography. Probability tomography requires neither a priori information nor large inversion matrix operations. Moreover, its result can describe the sources entirely and clearly, especially sources whose distribution is complex and irregular. Therefore, we attempt to use the a priori information extracted from the probability tomography results to constrain the inversion for physical properties. Magnetic anomaly data were taken as an example in this work. The probability tomography result of the magnetic total field anomaly (ΔΤ) shows a smoother distribution than the anomalous source and cannot display the source edges exactly. However, the gradients of ΔΤ have higher resolution than ΔΤ in their own directions, and this characteristic is also present in their probability tomography results. We therefore use a set of rules to combine the probability tomography results of ∂ΔΤ/∂x, ∂ΔΤ/∂y and ∂ΔΤ/∂z into a new result from which a priori information is extracted, and then incorporate this information into the model objective function as spatial weighting functions to invert for the final magnetic susceptibility. Synthetic magnetic examples inverted with and without the a priori information extracted from the probability tomography results were compared; the results show that the former are more concentrated and resolve the source body edges with higher resolution. The method is finally applied to an iron mine in China with field-measured ΔΤ data and performs well. References: Paoletti, V., Ialongo, S., Florio, G., Fedi, M. & Cella, F., 2013. Self-constrained inversion of potential fields, Geophys J Int. This research is supported by the Fundamental Research Funds for the Institute for Geophysical and Geochemical Exploration, Chinese Academy of Geological Sciences (Grant Nos. WHS201210 and WHS201211).

  16. Effects of band selection on endmember extraction for forestry applications

    NASA Astrophysics Data System (ADS)

    Karathanassi, Vassilia; Andreou, Charoula; Andronis, Vassilis; Kolokoussis, Polychronis

    2014-10-01

    In spectral unmixing theory, data reduction techniques play an important role as hyperspectral imagery contains an immense amount of data, posing many challenging problems such as data storage, computational efficiency, and the so-called "curse of dimensionality". Feature extraction and feature selection are the two main approaches for dimensionality reduction. Feature extraction techniques reduce the dimensionality of the hyperspectral data by applying transforms to the data. Feature selection techniques retain the physical meaning of the data by selecting a set of bands from the input hyperspectral dataset which mainly contain the information needed for spectral unmixing. Although feature selection techniques are well known for their dimensionality reduction potential, they are rarely used in the unmixing process. The majority of the existing state-of-the-art dimensionality reduction methods set criteria on the spectral information derived from the whole wavelength range in order to define the optimum spectral subspace. These criteria are not associated with any particular application but with the data statistics, such as correlation and entropy values. However, each application is associated with specific land cover materials, whose spectral characteristics present variations in specific wavelengths. In forestry, for example, many applications focus on tree leaves, in which specific pigments such as chlorophyll, xanthophyll, etc. determine the wavelengths where tree species, diseases, etc. can be detected. For such applications, when the unmixing process is applied, the tree species, diseases, etc. are considered as the endmembers of interest. This paper focuses on investigating the effects of band selection on endmember extraction by exploiting the information in the vegetation absorbance spectral zones. More precisely, it is explored whether endmember extraction can be optimized when specific sets of initial bands related to leaf spectral characteristics are selected. Experiments comprise the application of well-known signal subspace estimation and endmember extraction methods to hyperspectral imagery of a forest area. Evaluation of the extracted endmembers showed that more forest species can be extracted as endmembers using selected bands.

  17. Memristive crypto primitive for building highly secure physical unclonable functions

    NASA Astrophysics Data System (ADS)

    Gao, Yansong; Ranasinghe, Damith C.; Al-Sarawi, Said F.; Kavehei, Omid; Abbott, Derek

    2015-08-01

    Physical unclonable functions (PUFs) exploit the intrinsic complexity and irreproducibility of physical systems to generate secret information. The advantage is that PUFs have the potential to provide fundamentally higher security than traditional cryptographic methods by preventing the cloning of devices and the extraction of secret keys. Most PUF designs focus on exploiting process variations in Complementary Metal Oxide Semiconductor (CMOS) technology. In recent years, progress in nanoelectronic devices such as memristors has demonstrated the prevalence of process variations in scaling electronics down to the nano region. In this paper, we exploit the extremely large information density available in nanocrossbar architectures and the significant resistance variations of memristors to develop an on-chip memristive device based strong PUF (mrSPUF). Our novel architecture demonstrates desirable characteristics of PUFs, including uniqueness, reliability, and large number of challenge-response pairs (CRPs) and desirable characteristics of strong PUFs. More significantly, in contrast to most existing PUFs, our PUF can act as a reconfigurable PUF (rPUF) without additional hardware and is of benefit to applications needing revocation or update of secure key information.
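
    A hedged, highly simplified model of the challenge-response idea (a random crossbar of resistances standing in for fabrication variation; this is not the mrSPUF circuit):

        # Toy memristive-crossbar PUF: a challenge selects two cells and the response bit
        # comes from comparing their device-unique resistances.
        import numpy as np

        rng = np.random.default_rng(7)
        crossbar = rng.lognormal(mean=np.log(1e4), sigma=0.3, size=(64, 64))   # ohms

        def response_bit(challenge):
            r1, c1, r2, c2 = challenge
            return int(crossbar[r1, c1] > crossbar[r2, c2])

        print(response_bit((3, 17, 42, 5)))   # reproducible on this chip, unpredictable elsewhere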

  18. Memristive crypto primitive for building highly secure physical unclonable functions.

    PubMed

    Gao, Yansong; Ranasinghe, Damith C; Al-Sarawi, Said F; Kavehei, Omid; Abbott, Derek

    2015-08-04

    Physical unclonable functions (PUFs) exploit the intrinsic complexity and irreproducibility of physical systems to generate secret information. The advantage is that PUFs have the potential to provide fundamentally higher security than traditional cryptographic methods by preventing the cloning of devices and the extraction of secret keys. Most PUF designs focus on exploiting process variations in Complementary Metal Oxide Semiconductor (CMOS) technology. In recent years, progress in nanoelectronic devices such as memristors has demonstrated the prevalence of process variations in scaling electronics down to the nano region. In this paper, we exploit the extremely large information density available in nanocrossbar architectures and the significant resistance variations of memristors to develop an on-chip memristive device based strong PUF (mrSPUF). Our novel architecture demonstrates desirable characteristics of PUFs, including uniqueness, reliability, and large number of challenge-response pairs (CRPs) and desirable characteristics of strong PUFs. More significantly, in contrast to most existing PUFs, our PUF can act as a reconfigurable PUF (rPUF) without additional hardware and is of benefit to applications needing revocation or update of secure key information.

  19. Memristive crypto primitive for building highly secure physical unclonable functions

    PubMed Central

    Gao, Yansong; Ranasinghe, Damith C.; Al-Sarawi, Said F.; Kavehei, Omid; Abbott, Derek

    2015-01-01

    Physical unclonable functions (PUFs) exploit the intrinsic complexity and irreproducibility of physical systems to generate secret information. The advantage is that PUFs have the potential to provide fundamentally higher security than traditional cryptographic methods by preventing the cloning of devices and the extraction of secret keys. Most PUF designs focus on exploiting process variations in Complementary Metal Oxide Semiconductor (CMOS) technology. In recent years, progress in nanoelectronic devices such as memristors has demonstrated the prevalence of process variations in scaling electronics down to the nano region. In this paper, we exploit the extremely large information density available in nanocrossbar architectures and the significant resistance variations of memristors to develop an on-chip memristive device based strong PUF (mrSPUF). Our novel architecture demonstrates desirable characteristics of PUFs, including uniqueness, reliability, and large number of challenge-response pairs (CRPs) and desirable characteristics of strong PUFs. More significantly, in contrast to most existing PUFs, our PUF can act as a reconfigurable PUF (rPUF) without additional hardware and is of benefit to applications needing revocation or update of secure key information. PMID:26239669

  20. How Nonlinear-Type Time-Frequency Analysis Can Help in Sensing Instantaneous Heart Rate and Instantaneous Respiratory Rate from Photoplethysmography in a Reliable Way

    PubMed Central

    Cicone, Antonio; Wu, Hau-Tieng

    2017-01-01

    Despite the popularity of the noninvasive, economical, comfortable, and easy-to-install photoplethysmography (PPG), a mathematically rigorous and stable algorithm that can simultaneously extract the instantaneous heart rate (IHR) and the instantaneous respiratory rate (IRR) from a single-channel PPG signal is still lacking. In this paper, a novel algorithm called deppG is provided to tackle this challenge. deppG is composed of two theoretically solid nonlinear-type time-frequency analysis techniques, the de-shape short time Fourier transform and the synchrosqueezing transform, which allow us to extract the instantaneous physiological information from the PPG signal in a reliable way. To test its performance, in addition to validating the algorithm by a simulated signal and discussing the meaning of “instantaneous,” the algorithm is applied to two publicly available batch databases, the Capnobase and the ICASSP 2015 signal processing cup. The former contains PPG signals relative to spontaneous or controlled breathing in static patients, and the latter is made up of PPG signals collected from subjects doing intense physical activities. The accuracies of the estimated IHR and IRR are compared with those obtained by other methods, and represent the state of the art in this field of research. The results suggest the potential of deppG to extract instantaneous physiological information from a signal acquired from widely available wearable devices, even when a subject carries out intense physical activities. PMID:29018352
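
    A minimal sketch of the underlying idea, assuming a synthetic single-channel PPG-like signal: the instantaneous heart rate is read off the dominant ridge of a time-frequency representation. A plain STFT spectrogram is used here instead of the de-shape STFT and synchrosqueezing transforms of deppG, so this is only a simplified stand-in.

```python
import numpy as np
from scipy.signal import stft

fs = 125.0                               # assumed PPG sampling rate (Hz)
t = np.arange(0, 60, 1.0 / fs)
# Synthetic PPG-like signal: cardiac component whose frequency drifts from
# 1.2 Hz to 1.5 Hz (72 -> 90 bpm), plus a respiratory baseline and noise.
inst_freq = 1.2 + 0.3 * t / t[-1]
phase = 2 * np.pi * np.cumsum(inst_freq) / fs
ppg = np.cos(phase) + 0.3 * np.cos(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)

f, tt, Z = stft(ppg, fs=fs, nperseg=1024, noverlap=896)
band = (f >= 0.7) & (f <= 3.0)           # plausible cardiac band, 42-180 bpm
ridge = f[band][np.abs(Z[band]).argmax(axis=0)]
ihr_bpm = 60.0 * ridge                   # instantaneous heart rate estimate

print(ihr_bpm[:5].round(1))
```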

  1. Identifying Nanoscale Structure-Function Relationships Using Multimodal Atomic Force Microscopy, Dimensionality Reduction, and Regression Techniques.

    PubMed

    Kong, Jessica; Giridharagopal, Rajiv; Harrison, Jeffrey S; Ginger, David S

    2018-05-31

    Correlating nanoscale chemical specificity with operational physics is a long-standing goal of functional scanning probe microscopy (SPM). We employ a data analytic approach combining multiple microscopy modes, pairing compositional information from infrared vibrational excitation maps acquired via photoinduced force microscopy (PiFM) with electrical information from conductive atomic force microscopy. We study a model polymer blend comprising insulating poly(methyl methacrylate) (PMMA) and semiconducting poly(3-hexylthiophene) (P3HT). We show that PiFM spectra differ from FTIR spectra but can still be used to identify local composition. We use principal component analysis to extract statistically significant principal components and principal component regression to predict local current and identify local polymer composition. In doing so, we observe evidence of semiconducting P3HT within PMMA aggregates. These methods are generalizable to correlated SPM data and provide a meaningful technique for extracting complex compositional information that is impossible to measure with any one technique alone.
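
    The following sketch illustrates principal component regression on synthetic stand-in data; the spectra, band positions, and current model are invented for the example and are not the PiFM/conductive-AFM measurements of the paper. Spectra are reduced with PCA and the scores are regressed against the co-located current.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for co-registered SPM data: each "pixel" has a PiFM-like
# spectrum (n_bands channels) and a measured current from conductive AFM.
rng = np.random.default_rng(0)
n_pixels, n_bands = 500, 128
composition = rng.uniform(0, 1, n_pixels)             # hidden P3HT-like fraction
bands = np.linspace(0, 1, n_bands)
spectra = (composition[:, None] * np.exp(-((bands - 0.3) ** 2) / 0.01)
           + (1 - composition)[:, None] * np.exp(-((bands - 0.7) ** 2) / 0.01)
           + 0.05 * rng.standard_normal((n_pixels, n_bands)))
current = 2.0 * composition + 0.1 * rng.standard_normal(n_pixels)

# Principal component regression: reduce the spectra to a few statistically
# significant components, then regress the local current on the scores.
pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)
pcr = LinearRegression().fit(scores, current)
print("explained variance:", pca.explained_variance_ratio_.round(3))
print("R^2 of current prediction:", round(pcr.score(scores, current), 3))
```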

  2. Environmentally friendly preparation of pectins from agricultural byproducts and their structural/rheological characterization.

    PubMed

    Min, Bockki; Lim, Jongbin; Ko, Sanghoon; Lee, Kwang-Geun; Lee, Sung Ho; Lee, Suyong

    2011-02-01

    Apple pomace, the main waste product of the fruit juice industry, was utilized to extract pectins in an environmentally friendly way; these were then compared with chemically extracted pectins. The water-based extraction with combined physical and enzymatic treatments produced pectins with 693.2 mg g(-1) galacturonic acid and a 4.6% yield, both lower than those of chemically extracted pectins. Chemically extracted pectins exhibited a lower degree of esterification (58%) than the pectin samples obtained by physical/enzymatic treatments (69%), which was also confirmed by FT-IR analysis. When subjected to steady-shear rheological conditions, both pectin solutions were shown to have shear-thinning properties. However, decreased viscosity was observed in the pectins extracted by the combined physical/enzymatic methods, which could be mainly attributed to the presence of more methyl esters limiting polymer chain interactions. Moreover, the pectins extracted by the combined physical/enzymatic treatments showed less elastic properties under high shear rate conditions compared to the chemically extracted pectins. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Metadata to Support Data Warehouse Evolution

    NASA Astrophysics Data System (ADS)

    Solodovnikova, Darja

    The focus of this chapter is metadata necessary to support data warehouse evolution. We present the data warehouse framework that is able to track evolution process and adapt data warehouse schemata and data extraction, transformation, and loading (ETL) processes. We discuss the significant part of the framework, the metadata repository that stores information about the data warehouse, logical and physical schemata and their versions. We propose the physical implementation of multiversion data warehouse in a relational DBMS. For each modification of a data warehouse schema, we outline the changes that need to be made to the repository metadata and in the database.
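
    A minimal sketch of what such repository metadata could look like in a relational DBMS, using hypothetical table and column names rather than the schema proposed in the chapter:

```python
import sqlite3

# Hypothetical minimal metadata repository for a multiversion data warehouse:
# each schema version is recorded together with the change that produced it,
# so ETL processes can be adapted when the schema evolves.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE schema_version (
    version_id   INTEGER PRIMARY KEY,
    valid_from   TEXT NOT NULL,
    description  TEXT
);
CREATE TABLE schema_change (
    change_id    INTEGER PRIMARY KEY,
    version_id   INTEGER REFERENCES schema_version(version_id),
    object_name  TEXT NOT NULL,          -- e.g. a dimension or fact table
    change_type  TEXT NOT NULL,          -- ADD_ATTRIBUTE, RENAME, DROP, ...
    details      TEXT
);
""")
con.execute("INSERT INTO schema_version VALUES (1, '2010-01-01', 'initial star schema')")
con.execute("INSERT INTO schema_version VALUES (2, '2011-06-01', 'customer dimension extended')")
con.execute("INSERT INTO schema_change VALUES (1, 2, 'dim_customer', 'ADD_ATTRIBUTE', 'loyalty_tier')")
for row in con.execute("SELECT * FROM schema_change"):
    print(row)
```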

  4. Spectroscopic Investigations of Fragment Species in the Coma

    NASA Technical Reports Server (NTRS)

    Feldman, Paul D.; Cochran, Anita L.; Combi, Michael R.

    2004-01-01

    The content of the gaseous coma of a comet is dominated by fragment species produced by photolysis of the parent molecules issuing directly from the icy nucleus of the comet. Spectroscopy of these species provides complementary information on the physical state of the coma to that obtained from observations of the parent species. Extraction of physical parameters requires detailed molecular and atomic data together with reliable high-resolution spectra and absolute fluxes of the primary source of excitation, the Sun. The large database of observations, dating back more than a century, provides a means to assess the chemical and evolutionary diversity of comets.

  5. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format

    PubMed Central

    Ahmed, Zeeshan; Dandekar, Thomas

    2018-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medicinal imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data, experimental and computational results in concise form. One major challenge for implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product line architecture based bioinformatics tool ‘Mining Scientific Literature (MSL)’, which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures and extraction of embedded text from all kinds of biological and biomedical figures using applied Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system’s output in different formats including text, PDF, XML and image files. Hence, MSL is an easy to install and use analysis tool to interpret published scientific literature in PDF format. PMID:29721305
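
    A minimal sketch of such a text-plus-image extraction step, assuming a recent PyMuPDF and a local Tesseract installation; this is an illustrative pipeline, not the MSL implementation, and the input file name is hypothetical.

```python
# Illustrative pipeline only (not the MSL implementation): extract the text
# layer of a PDF with PyMuPDF and OCR a rendered page image with pytesseract.
import io

import fitz                      # PyMuPDF
import pytesseract
from PIL import Image

doc = fitz.open("paper.pdf")     # hypothetical input file
for page in doc:
    text = page.get_text("text")                  # logical text layer
    pix = page.get_pixmap(dpi=200)                # physical page rendering
    img = Image.open(io.BytesIO(pix.tobytes("png")))
    ocr_text = pytesseract.image_to_string(img)   # text embedded in figures
    print(f"page {page.number}: {len(text)} chars of text, "
          f"{len(ocr_text)} chars recovered by OCR")
```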

  6. Enriching a document collection by integrating information extraction and PDF annotation

    NASA Astrophysics Data System (ADS)

    Powley, Brett; Dale, Robert; Anisimoff, Ilya

    2009-01-01

    Modern digital libraries offer all the hyperlinking possibilities of the World Wide Web: when a reader finds a citation of interest, in many cases she can now click on a link to be taken to the cited work. This paper presents work aimed at providing the same ease of navigation for legacy PDF document collections that were created before the possibility of integrating hyperlinks into documents was ever considered. To achieve our goal, we need to carry out two tasks: first, we need to identify and link citations and references in the text with high reliability; and second, we need the ability to determine physical PDF page locations for these elements. We demonstrate the use of a high-accuracy citation extraction algorithm which significantly improves on earlier reported techniques, and a technique for integrating PDF processing with a conventional text-stream based information extraction pipeline. We demonstrate these techniques in the context of a particular document collection, this being the ACL Anthology; but the same approach can be applied to other document sets.
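
    A toy sketch of the citation-spotting step on extracted text; the regular expression and example string are invented for illustration and fall far short of the high-accuracy algorithm described in the paper.

```python
import re

# Minimal sketch of citation spotting in extracted text: link in-text
# citations such as "(Powley et al., 2007)" or "[12]" to reference-list
# entries before mapping them to physical PDF page locations.
INTEXT = re.compile(r"\(([A-Z][A-Za-z-]+(?: et al\.)?(?:,? (?:19|20)\d{2}))\)"
                    r"|\[(\d{1,3})\]")

text = ("Earlier work (Powley et al., 2007) and follow-ups [12] improved "
        "citation extraction accuracy.")

for match in INTEXT.finditer(text):
    author_year, numeric = match.groups()
    print("citation:", author_year or numeric, "at offset", match.start())
```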

  7. A Foot-Mounted Inertial Measurement Unit (IMU) Positioning Algorithm Based on Magnetic Constraint

    PubMed Central

    Zou, Jiaheng

    2018-01-01

    With the development of related applications, indoor positioning techniques have been developed more and more widely. Indoor positioning techniques based on Wi-Fi, Bluetooth low energy (BLE), and geomagnetism often rely on the physical locations at which fingerprint information is collected. The focus and difficulty of establishing the fingerprint database are in obtaining a relatively accurate physical location with as little given information as possible. This paper presents a foot-mounted inertial measurement unit (IMU) positioning algorithm under the loop closure constraint based on magnetic information. It can provide relatively reliable position information without maps and geomagnetic information and provides a relatively accurate coordinate for the collection of a fingerprint database. In the experiment, the features extracted by the multi-level Fourier transform method proposed in this paper are validated and the validity of loop closure matching is tested with a RANSAC-based method. Moreover, the loop closure detection results show that the cumulative error of the trajectory processed by the graph optimization algorithm is significantly suppressed, presenting a good accuracy. The average error of the trajectory under loop closure constraint is controlled below 2.15 m. PMID:29494542

  8. A Foot-Mounted Inertial Measurement Unit (IMU) Positioning Algorithm Based on Magnetic Constraint.

    PubMed

    Wang, Yan; Li, Xin; Zou, Jiaheng

    2018-03-01

    With the development of related applications, indoor positioning techniques have been developed more and more widely. Indoor positioning techniques based on Wi-Fi, Bluetooth low energy (BLE), and geomagnetism often rely on the physical locations at which fingerprint information is collected. The focus and difficulty of establishing the fingerprint database are in obtaining a relatively accurate physical location with as little given information as possible. This paper presents a foot-mounted inertial measurement unit (IMU) positioning algorithm under the loop closure constraint based on magnetic information. It can provide relatively reliable position information without maps and geomagnetic information and provides a relatively accurate coordinate for the collection of a fingerprint database. In the experiment, the features extracted by the multi-level Fourier transform method proposed in this paper are validated and the validity of loop closure matching is tested with a RANSAC-based method. Moreover, the loop closure detection results show that the cumulative error of the trajectory processed by the graph optimization algorithm is significantly suppressed, presenting a good accuracy. The average error of the trajectory under loop closure constraint is controlled below 2.15 m.

  9. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information for physics and engineering. In the SCV model, all semantic information is related to underlying data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.

  10. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T.

    Cyber physical computing infrastructures typically consist of a number of interconnected sites. Their operation critically depends on both cyber components and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses by the NESCOR Working Group Study. From the Section 5 electric sector representative failure scenarios, we extracted the four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability) to the system. These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the cyber physical infrastructure network with respect to CIA.

  11. Knowledge Extraction from Atomically Resolved Images.

    PubMed

    Vlcek, Lukas; Maksov, Artem; Pan, Minghu; Vasudevan, Rama K; Kalinin, Sergei V

    2017-10-24

    Tremendous strides in the experimental capabilities of scanning transmission electron microscopy and scanning tunneling microscopy (STM) over the past 30 years have made atomically resolved imaging routine. However, consistent integration and use of atomically resolved data with generative models is unavailable, so information on local thermodynamics and other microscopic driving forces encoded in the observed atomic configurations remains hidden. Here, we present a framework based on statistical distance minimization to consistently utilize the information available from atomic configurations obtained from an atomically resolved image and extract meaningful physical interaction parameters. We illustrate the applicability of the framework on an STM image of a FeSe_xTe_{1-x} superconductor, with the segregation of the chalcogen atoms investigated using a nonideal interacting solid solution model. This universal method makes full use of the microscopic degrees of freedom sampled in an atomically resolved image and can be extended via Bayesian inference toward unbiased model selection with uncertainty quantification.

  12. Understanding the relationships between the physical environment and physical activity in older adults: a systematic review of qualitative studies.

    PubMed

    Moran, Mika; Van Cauwenberg, Jelle; Hercky-Linnewiel, Rachel; Cerin, Ester; Deforche, Benedicte; Plaut, Pnina

    2014-07-17

    While physical activity (PA) provides many physical, social, and mental health benefits for older adults, they are the least physically active age group. Ecological models highlight the importance of the physical environment in promoting PA. However, results of previous quantitative research revealed inconsistencies in environmental correlates of older adults' PA that may be explained by methodological issues. Qualitative studies can inform and complement quantitative research on environment-PA relationships by providing insight into how and why the environment influences participants' PA behaviors. The current study aimed to provide a systematic review of qualitative studies exploring the potential impact of the physical environment on older adults' PA behaviors. A systematic search was conducted in databases of various disciplines, including: health, architecture and urban planning, transportation, and interdisciplinary databases. From 3,047 articles identified in the initial search, 31 articles published from 1996 to 2012 met all inclusion criteria. An inductive content analysis was performed on the extracted findings to identify emerging environmental elements related to older adults' PA. The identified environmental elements were then grouped by study methodologies [indoor interviews (individual or focus groups) vs spatial methods (photo-voice, observations, walk-along interviews)]. This review provides detailed information about environmental factors that potentially influence older adults' PA behaviors. These factors were categorized into five themes: pedestrian infrastructure, safety, access to amenities, aesthetics, and environmental conditions. Environmental factors especially relevant to older adults (i.e., access to facilities, green open spaces and rest areas) tended to emerge more frequently in studies that combined interviews with spatial qualitative methods. Findings showed that qualitative research can provide in-depth information on environmental elements that influence older adults' PA. Future qualitative studies on the physical environment and older adults' PA would benefit from combining interviews with more spatially-oriented methods. Multidisciplinary mixed-methods studies are recommended to establish quantitative relationships complemented with in-depth qualitative information.

  13. Improved collagen extraction from jellyfish (Acromitus hardenbergi) with increased physical-induced solubilization processes.

    PubMed

    Khong, Nicholas M H; Yusoff, Fatimah Md; Jamilah, B; Basri, Mahiran; Maznah, I; Chan, Kim Wei; Armania, Nurdin; Nishikawa, Jun

    2018-06-15

    The efficiency and effectiveness of the collagen extraction process have a major impact on the quality, supply, and cost of the collagen produced. Jellyfish are a potentially sustainable source of collagen whose applications are not limited by religious constraints or the threat of transmissible diseases. The present study compared the extraction yield, physico-chemical properties and in vitro toxicology of collagens obtained by conventional acid-assisted and pepsin-assisted extraction with those from an improved physically aided extraction process. Increasing the physical intervention significantly raised the production yield compared to the conventional extraction processes (p < .05). Collagen extracted using the improved process was found to possess similar proximate and amino acid composition to that extracted using pepsin (p > .05) while retaining high molecular weight distributions and polypeptide profiles similar to those extracted using only acid. Moreover, it exhibited better appearance and instrumental colour and was found to be non-toxic in vitro and free of heavy metal contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Wavelet phase extracting demodulation algorithm based on scale factor for optical fiber Fabry-Perot sensing.

    PubMed

    Zhang, Baolin; Tong, Xinglin; Hu, Pan; Guo, Qian; Zheng, Zhiyuan; Zhou, Chaoran

    2016-12-26

    Optical fiber Fabry-Perot (F-P) sensors have been used for on-line monitoring of various physical parameters such as acoustics, temperature and pressure. In this paper, a wavelet phase extracting demodulation algorithm for optical fiber F-P sensing is first proposed. In this demodulation algorithm, the search range of the scale factor is determined by the estimated cavity length, which is obtained by a fast Fourier transform (FFT) algorithm. Phase information of each point on the optical interference spectrum can be directly extracted through the continuous complex wavelet transform without de-noising. The cavity length of the optical fiber F-P sensor is then calculated from the slope of the fitted phase curve. Theoretical analysis and experimental results show that this algorithm can greatly reduce the amount of computation and improve demodulation speed and accuracy.
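
    A simplified numerical sketch of the two-stage idea, assuming an ideal air-gap interference spectrum: an FFT of the fringes gives a coarse cavity length, and the slope of the unwrapped fringe phase versus optical frequency refines it. The analytic-signal phase is used here in place of the continuous complex wavelet coefficients, so this is an illustration of the principle rather than the published algorithm.

```python
import numpy as np
from scipy.signal import hilbert

c = 3.0e8                                   # speed of light (m/s)
L_true = 150e-6                             # assumed F-P cavity length (m)
nu = np.linspace(190e12, 200e12, 4000)      # optical frequency grid (Hz)
spectrum = 1 + 0.8 * np.cos(4 * np.pi * L_true * nu / c)   # ideal fringes

# Coarse estimate from the FFT of the fringe pattern: the fringe "frequency"
# in the optical-frequency domain is the round-trip delay 2L/c.
d_nu = nu[1] - nu[0]
spec_fft = np.fft.rfft(spectrum - spectrum.mean())
delay_axis = np.fft.rfftfreq(nu.size, d_nu)             # seconds
L_coarse = delay_axis[np.abs(spec_fft).argmax()] * c / 2.0

# Refinement: unwrap the fringe phase and fit its slope versus frequency,
# since phi(nu) = 4*pi*L*nu/c implies L = slope * c / (4*pi).
phase = np.unwrap(np.angle(hilbert(spectrum - spectrum.mean())))
slope = np.polyfit(nu - nu.mean(), phase, 1)[0]
L_fine = slope * c / (4 * np.pi)

print(f"coarse: {L_coarse*1e6:.2f} um, refined: {L_fine*1e6:.3f} um")
```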

  15. Digital Imaging

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Digital imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost efficient.

  16. [Lithology feature extraction of CASI hyperspectral data based on fractal signal algorithm].

    PubMed

    Tang, Chao; Chen, Jian-Ping; Cui, Jing; Wen, Bo-Tao

    2014-05-01

    Hyperspectral data are characterized by the combination of image and spectrum and by large data volume; dimension reduction is the main research direction. Band selection and feature extraction are the primary methods used for this objective. In the present article, the authors tested methods applied for lithology feature extraction from hyperspectral data. Based on the self-similarity of hyperspectral data, the authors explored the application of a fractal algorithm to lithology feature extraction from CASI hyperspectral data. The "carpet method" was corrected and then applied to calculate the fractal value of every pixel in the hyperspectral data. The results show that fractal information highlights the exposed bedrock lithology better than the original hyperspectral data. The fractal signal and characteristic scale are influenced by the spectral curve shape, the initial scale selection, and the iteration step. At present, research on the fractal signal of spectral curves is rare, implying the necessity of further quantitative analysis and investigation of its physical implications.

  17. Hyperpolarized xenon NMR and MRI signal amplification by gas extraction

    PubMed Central

    Zhou, Xin; Graziani, Dominic; Pines, Alexander

    2009-01-01

    A method is reported for enhancing the sensitivity of NMR of dissolved xenon by detecting the signal after extraction to the gas phase. We demonstrate hyperpolarized xenon signal amplification by gas extraction (Hyper-SAGE) in both NMR spectra and magnetic resonance images with time-of-flight information. Hyper-SAGE takes advantage of a change in physical phase to increase the density of polarized gas in the detection coil. At equilibrium, the concentration of gas-phase xenon is ≈10 times higher than that of the dissolved-phase gas. After extraction the xenon density can be further increased by several orders of magnitude by compression and/or liquefaction. Additionally, being a remote detection technique, the Hyper-SAGE effect is further enhanced in situations where the sample of interest would occupy only a small proportion of the traditional NMR receiver. Coupled with targeted xenon biosensors, Hyper-SAGE offers another path to highly sensitive molecular imaging of specific cell markers by detection of exhaled xenon gas. PMID:19805177

  18. Physical activity interventions to promote positive youth development among indigenous youth: a RE-AIM review.

    PubMed

    Baillie, Colin P T; Galaviz, Karla I; Emiry, Kevin; Bruner, Mark W; Bruner, Brenda G; Lévesque, Lucie

    2017-03-01

    Physical activity (PA) programs are a promising strategy to promote positive youth development (PYD). It is not known if published reports provide sufficient information to promote the implementation of effective PYD in indigenous youth. The purpose of this study was to assess the extent to which published literature on PA programs that promote PYD in indigenous youth report on RE-AIM (reach, effectiveness, adoption, implementation, maintenance) indicators. A systematic literature search was conducted to identify articles reporting on PA programs that promote PYD in indigenous youth. The search yielded 8084 articles. A validated 21-item RE-AIM abstraction tool assessing internal and external validity factors was used to extract data from 10 articles meeting eligibility criteria. The most commonly reported dimensions were effectiveness (73 %), adoption (48 %), and maintenance (43 %). Reach (34 %) and implementation (30 %) were less often reported. Published research provides insufficient information to inform real-world implementation of PA programs to promote PYD in indigenous youth.

  19. Numerical algebraic geometry: a new perspective on gauge and string theories

    NASA Astrophysics Data System (ADS)

    Mehta, Dhagash; He, Yang-Hui; Hauensteine, Jonathan D.

    2012-07-01

    There is a rich interplay between algebraic geometry and string and gauge theories which has been recently aided immensely by advances in computational algebra. However, symbolic (Gröbner) methods are severely limited by algorithmic issues such as exponential space complexity and being highly sequential. In this paper, we introduce a novel paradigm of numerical algebraic geometry which in a plethora of situations overcomes these shortcomings. The so-called `embarrassing parallelizability' allows us to solve many problems and extract physical information which elude symbolic methods. We describe the method and then use it to solve various problems arising from physics which could not be otherwise solved.

  20. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
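
    Dempster's rule of combination itself is compact enough to sketch directly; the focal elements and mass values below are invented to stand in for evidence produced by two diagnostic parity-equation models.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    Keys are frozensets of hypotheses (focal elements), values are masses.
    Mass assigned to conflicting (disjoint) pairs is removed and the rest is
    renormalized by 1 - K, where K is the total conflict.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two diagnostic models reporting evidence over fault hypotheses f1, f2, ok.
FRAME = frozenset({"f1", "f2", "ok"})
m_model1 = {frozenset({"f1"}): 0.6, FRAME: 0.4}
m_model2 = {frozenset({"f1", "f2"}): 0.5, frozenset({"ok"}): 0.2, FRAME: 0.3}
print(dempster_combine(m_model1, m_model2))
```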

  1. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts, and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  2. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2004-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, recirculation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts, and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
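
    As one concrete example of automated feature extraction, the sketch below computes vorticity and the (2-D) Q-criterion on a synthetic velocity field and flags cells with Q > 0 as candidate vortex cores; the flow field and threshold are assumptions for illustration, not part of the papers above.

```python
import numpy as np

# Minimal vortex-detection step: compute vorticity and the Q-criterion on a
# 2-D velocity field so that vortex cores (Q > 0) can be flagged automatically
# instead of being hunted for interactively.
nx, ny = 128, 128
x = np.linspace(-2, 2, nx)
y = np.linspace(-2, 2, ny)
X, Y = np.meshgrid(x, y, indexing="ij")
# Synthetic flow: a Gaussian vortex superposed on a uniform stream.
r2 = X**2 + Y**2
u = 1.0 - Y * np.exp(-r2)
v = X * np.exp(-r2)

du_dx, du_dy = np.gradient(u, x, y, edge_order=2)
dv_dx, dv_dy = np.gradient(v, x, y, edge_order=2)

vorticity = dv_dx - du_dy
strain2 = du_dx**2 + dv_dy**2 + 0.5 * (du_dy + dv_dx) ** 2    # ||S||^2
rotation2 = 0.5 * (dv_dx - du_dy) ** 2                        # ||Omega||^2
Q = 0.5 * (rotation2 - strain2)                               # Q-criterion (2-D)

print("cells flagged as vortex core:", int((Q > 0).sum()))
```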

  3. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  4. A statistical profile of physical therapists, 1980 and 1990.

    PubMed

    Chevan, J; Chevan, A

    1998-03-01

    To plan for future needs, human resource analysts require demographic data. In this research, US census data were used to develop a profile of physical therapists. Data were extracted from the Public Use Microdata Samples of the US censuses of population from 1980 and 1990. Samples of 3,112 physical therapists from 1990 and 1,530 therapists from 1980 were obtained. A profile was generated by use of descriptive statistics to examine geographic distribution, social characteristics, employment characteristics, and income. Linear regression was used to determine factors that influence income. During the 1980s, physical therapy demonstrated remarkable growth, with trends in physical therapist location, gender, age, and place of employment. Even as the profession aged, it stayed an occupation composed predominantly of women, but one less concentrated in hospitals. Geographically, physical therapists remained clustered in the Northeast and along the Pacific Coast. Income generated by physical therapists was predicted by social and geographic characteristics. This study presents a new data source to examine physical therapist characteristics. It provides information necessary for health care planners and analysts to better understand the nature of the profession and those who practice.

  5. Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Srivastava, Askok N.; Matthews, Bryan; Das, Santanu

    2008-01-01

    The analysis of spectral signals for features that represent physical phenomenon is ubiquitous in the science and engineering communities. There are two main approaches that can be taken to extract relevant features from these high-dimensional data streams. The first set of approaches relies on extracting features using a physics-based paradigm where the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology that uses a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss the following four algorithms: Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA) and compare their performance on a spectral emulator which we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the space shuttle main engine and can be used to validate the results that arise from various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms to detect potential system-health issues on data from a spectral emulator with tunable health parameters.
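
    A compact sketch of applying PCA, NMF, and ICA to emulated non-negative spectra; the line shapes and mixing weights are arbitrary assumptions standing in for the spectral emulator described above.

```python
import numpy as np
from sklearn.decomposition import PCA, NMF, FastICA

# Emulate non-negative spectra as mixtures of two Gaussian emission lines.
rng = np.random.default_rng(1)
channels = np.linspace(0, 1, 256)
sources = np.vstack([np.exp(-((channels - c) ** 2) / 0.002) for c in (0.35, 0.65)])
weights = rng.uniform(0.1, 1.0, size=(300, 2))
spectra = weights @ sources + 0.01 * rng.random((300, 256))

# Compare three decomposition methods on the same emulated data set.
for name, model in [("PCA", PCA(n_components=2)),
                    ("NMF", NMF(n_components=2, init="nndsvda", max_iter=500)),
                    ("ICA", FastICA(n_components=2, max_iter=500))]:
    scores = model.fit_transform(spectra)
    print(f"{name}: recovered component matrix shape {model.components_.shape}")
```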

  6. Feasibility of an anticipatory noncontact precrash restraint actuation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kercel, S.W.; Dress, W.B.

    1995-12-31

    The problem of providing an electronic warning of an impending crash to a precrash restraint system a fraction of a second before physical contact differs from more widely explored problems, such as providing several seconds of crash warning to a driver. One approach to precrash restraint sensing is to apply anticipatory system theory. This consists of nested simplified models of the system to be controlled and of the system's environment. It requires sensory information to describe the "current state" of the system and the environment. The models use the sensory data to make a faster-than-real-time prediction about the near future. Anticipation theory is well founded but rarely used. A major problem is to extract real-time current-state information from inexpensive sensors. Providing current-state information to the nested models is the weakest element of the system. Therefore, sensors and real-time processing of sensor signals command the most attention in an assessment of system feasibility. This paper describes problem definition, potential "showstoppers," and ways to overcome them. It includes experiments showing that inexpensive radar is a practical sensing element. It considers fast and inexpensive algorithms to extract information from sensor data.

  7. Parental perceptions of facilitators and barriers to physical activity for children with intellectual disabilities: A mixed methods systematic review.

    PubMed

    McGarty, Arlene M; Melville, Craig A

    2018-02-01

    There is a need to increase our understanding of what factors affect physical activity participation in children with intellectual disabilities (ID) and to develop effective methods to overcome barriers and increase activity levels. This study aimed to systematically review parental perceptions of facilitators and barriers to physical activity for children with ID. A systematic search of Embase, Medline, ERIC, Web of Science, and PsycINFO was conducted (up to and including August, 2017) to identify relevant papers. A meta-ethnography approach was used to synthesise qualitative and quantitative results through the generation of third-order themes and a theoretical model. Ten studies were included, whose quality ranged from weak to strong. Seventy-one second-order themes and 12 quantitative results were extracted. Five third-order themes were developed: family, child factors, inclusive programmes and facilities, social motivation, and child's experiences of physical activity. It is theorised that these factors can be facilitators or barriers to physical activity, depending on the information and education of relevant others, e.g. parents and coaches. Parents have an important role in supporting activity in children with ID. Increasing the information and education given to relevant others could be an important method of turning barriers into facilitators. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Factors that Influence Students in Choosing Physics Programmes at University Level: the Case of Greece

    NASA Astrophysics Data System (ADS)

    Meli, Kalliopi; Lavidas, Konstantinos; Koliopoulos, Dimitrios

    2018-04-01

    Low enrolment in undergraduate level physics programmes has drawn the attention of the relevant disciplines, education policy-makers, and researchers worldwide. Many reports released during the previous decades attempt to identify the factors that attract young people to study science, but only few of them focus explicitly on physics. In Greece, in contrast to many other countries, physics departments are overflowing with young students. However, there are two categories of students: those for whom physics was the optimal choice of a programme ("choosers") and those for whom physics was an alternative choice that they had to settle for. We suggest that the latter category be called "nearly-choosers," in order to be differentiated from choosers as well as from "non-choosers," namely those candidates that did not apply to a physics programme at all. We are interested in the factors that attract high school students to study physics and the differences (if any) between choosers and nearly-choosers. A newly formed questionnaire was distributed within a Greek physics department (University of Patras), and the students' responses (n = 105) were analysed with exploratory factor analysis and specifically principal component analysis so as to extract broad factors. Three broad factors have arisen: school-based, career, and informal learning. The first two factors proved to be motivating for pursuing a degree in physics, while the third factor appeared to have a rather indifferent association. t tests and Pearson correlations indicated mild differentiations between choosers and nearly-choosers that pertain to school-based influences and informal learning.

  9. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static-random-access-memory (SRAM) array or a standard cell library, or by using a simple design rule checking run-set. The resulting database was then used as an input for choosing locations for critical dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis for a large area of high density standard cell library was done. Another set of monitoring results, focused on a high density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1 enclosed contact was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  10. Ionization Electron Signal Processing in Single Phase LArTPCs II. Data/Simulation Comparison and Performance in MicroBooNE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, C.; et al.

    The single-phase liquid argon time projection chamber (LArTPC) provides a large amount of detailed information in the form of fine-grained drifted ionization charge from particle traces. To fully utilize this information, the deposited charge must be accurately extracted from the raw digitized waveforms via a robust signal processing chain. Enabled by the ultra-low noise levels associated with cryogenic electronics in the MicroBooNE detector, the precise extraction of ionization charge from the induction wire planes in a single-phase LArTPC is qualitatively demonstrated on MicroBooNE data with event display images, and quantitatively demonstrated via waveform-level and track-level metrics. Improved performance of induction plane calorimetry is demonstrated through the agreement of extracted ionization charge measurements across different wire planes for various event topologies. In addition to the comprehensive waveform-level comparison of data and simulation, a calibration of the cryogenic electronics response is presented and solutions to various MicroBooNE-specific TPC issues are discussed. This work presents an important improvement in LArTPC signal processing, the foundation of reconstruction and therefore physics analyses in MicroBooNE.

  11. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as the time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistical orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationary assumption is obviously not justifiable for many geophysical and climate variables even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set from (i). (iii) Dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the space and time amplitude and phase propagations. We present the results of CICA on simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
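
    Step (i) of the CICA construction can be sketched directly with a Hilbert transform; the synthetic field below is an assumption for illustration, and the complex ICA step itself (diagonalization of fourth-order cumulants) is only indicated in a comment.

```python
import numpy as np
from scipy.signal import hilbert

# Step (i) of the CICA recipe: build a complex-valued data set whose real part
# is the observed (centered) time series and whose imaginary part carries the
# temporal rate of variability, obtained here through the Hilbert transform.
rng = np.random.default_rng(3)
n_time, n_grid = 240, 50                       # e.g. 20 years of monthly fields
t = np.arange(n_time)
signal = np.sin(2 * np.pi * t / 12.0)[:, None] * rng.standard_normal(n_grid)
noise = 0.2 * rng.standard_normal((n_time, n_grid))
X = signal + noise

X_centered = X - X.mean(axis=0)
X_complex = hilbert(X_centered, axis=0)        # analytic signal per grid point

# The complex data set would then be decomposed with a complex ICA (e.g. by
# joint diagonalization of fourth-order cumulants); that step is not sketched.
print(X_complex.shape, X_complex.dtype)
```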

  12. Quantification of urban structure on building block level utilizing multisensoral remote sensing data

    NASA Astrophysics Data System (ADS)

    Wurm, Michael; Taubenböck, Hannes; Dech, Stefan

    2010-10-01

    Dynamics of urban environments are a challenge to sustainable development. Urban areas promise wealth, realization of individual dreams, and power. Hence, many cities are characterized by population growth as well as physical development. Traditional visual mapping and updating of urban structure information is a very laborious and cost-intensive task, especially for large urban areas. For this purpose, we developed a workflow for the extraction of the relevant information by means of object-based image classification. In this manner, multisensoral remote sensing data have been analyzed: very high resolution optical satellite imagery together with height information from a digital surface model is used to retrieve a detailed 3D city model with the relevant land-use / land-cover information. This information has been aggregated at the level of the building block to describe the urban structure by physical indicators. A comparison between the indicators derived from the classification and a reference classification of urban structure types has been accomplished to show their correlation. The indicators have then been used in a cluster analysis to group the individual blocks into similar clusters.

  13. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams

    PubMed Central

    Rouinfar, Amy; Agra, Elise; Larson, Adam M.; Rebello, N. Sanjay; Loschky, Lester C.

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants’ attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants’ verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues, which draw attention to solution-relevant information and aid in the organizing and integrating of it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers’ attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions. PMID:25324804

  14. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    PubMed

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues, which draw attention to solution-relevant information and aid in the organizing and integrating of it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  15. The Goddard Profiling Algorithm (GPROF): Description and Current Applications

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Yang, Song; Stout, John E.; Grecu, Mircea

    2004-01-01

    Atmospheric scientists use different methods for interpreting satellite data. In the early days of satellite meteorology, the analysis of cloud pictures from satellites was primarily subjective. As computer technology improved, satellite pictures could be processed digitally, and mathematical algorithms were developed and applied to the digital images in different wavelength bands to extract information about the atmosphere in an objective way. The kind of mathematical algorithm one applies to satellite data may depend on the complexity of the physical processes that lead to the observed image, and how much information is contained in the satellite images both spatially and at different wavelengths. Imagery from satellite-borne passive microwave radiometers has limited horizontal resolution, and the observed microwave radiances are the result of complex physical processes that are not easily modeled. For this reason, a type of algorithm called a Bayesian estimation method is utilized to interpret passive microwave imagery in an objective, yet computationally efficient manner.
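
    A toy sketch of such a Bayesian estimation step, in which database entries with pre-computed brightness temperatures are weighted by a Gaussian likelihood and averaged; the forward model, noise level, and rain-rate distribution are invented for the example and do not represent GPROF's actual database.

```python
import numpy as np

# Toy Bayesian retrieval in the spirit of a database approach: candidate rain
# profiles with pre-computed brightness temperatures are weighted by how well
# they match an observation, and the retrieved rain rate is the weighted mean.
rng = np.random.default_rng(5)
n_database, n_channels = 5000, 4
rain_rate = rng.gamma(shape=1.5, scale=2.0, size=n_database)        # mm/h
# Hypothetical linear forward model mapping rain rate to Tb in each channel.
sensitivities = np.array([3.0, -2.0, 1.5, -0.5])
tb_database = (250.0 + np.outer(rain_rate, sensitivities)
               + rng.normal(0.0, 1.0, size=(n_database, n_channels)))

tb_observed = 250.0 + 4.0 * sensitivities      # observation near 4 mm/h
obs_error = 2.0                                # assumed channel noise (K)

# Gaussian likelihood of each database entry given the observation.
chi2 = ((tb_database - tb_observed) ** 2).sum(axis=1) / obs_error**2
weights = np.exp(-0.5 * chi2)
retrieved = (weights * rain_rate).sum() / weights.sum()
print(f"retrieved rain rate: {retrieved:.2f} mm/h")
```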

  16. Accelerometry-based classification of human activities using Markov modeling.

    PubMed

    Mannini, Andrea; Sabatini, Angelo Maria

    2011-01-01

    Accelerometers are a popular choice as body-motion sensors: the reason is partly in their capability of extracting information that is useful for automatically inferring the physical activity in which the human subject is involved, besides their role in feeding biomechanical parameter estimators. Automatic classification of human physical activities is highly attractive for pervasive computing systems, where contextual awareness may ease human-machine interaction, and in biomedicine, where wearable sensor systems are proposed for long-term monitoring. This paper is concerned with the machine learning algorithms needed to perform the classification task. Hidden Markov Model (HMM) classifiers are studied by contrasting them with Gaussian Mixture Model (GMM) classifiers. HMMs incorporate the statistical information available on movement dynamics into the classification process, without discarding the time history of previous outcomes as GMMs do. An example of the benefits of the obtained statistical leverage is illustrated and discussed by analyzing two datasets of accelerometer time series.
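    A minimal sketch of the kind of comparison described, assuming synthetic accelerometer features and the hmmlearn and scikit-learn libraries (the paper's own features, datasets, and training protocol are not reproduced here): one GMM and one HMM are fitted per activity class, and an unlabeled sequence is assigned to the class whose model scores it highest.

```python
import numpy as np
from hmmlearn import hmm
from sklearn.mixture import GaussianMixture

# Illustrative sketch (not the paper's code): one model per activity class,
# trained on windows of accelerometer features; the data here are synthetic.
rng = np.random.default_rng(1)

def make_sequence(mean, n=200, d=3):
    return rng.normal(mean, 1.0, size=(n, d))

train = {"walking": make_sequence(0.0), "running": make_sequence(3.0)}
test_seq = make_sequence(3.0)                 # unlabeled sequence to classify

def fit_models(X):
    g = GaussianMixture(n_components=2, random_state=0).fit(X)
    h = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=50, random_state=0).fit(X)
    return g, h

models = {label: fit_models(X) for label, X in train.items()}

# GMM scores ignore temporal order; HMM scores also use the transition
# structure. Scores are only compared across classes within the same model type.
for kind, idx in (("GMM", 0), ("HMM", 1)):
    scores = {label: m[idx].score(test_seq) for label, m in models.items()}
    print(kind, "->", max(scores, key=scores.get), scores)
```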

  17. Extraction of physical Schottky parameters using the Lambert function in Ni/AlGaN/GaN HEMT devices with defined conduction phenomena

    NASA Astrophysics Data System (ADS)

    Latry, O.; Divay, A.; Fadil, D.; Dherbécourt, P.

    2017-01-01

    Electrical characterization analyses are proposed in this work using the Lambert function on Schottky junctions in GaN wide band gap semiconductor devices for extraction of physical parameters. The Lambert function is used to give an explicit expression of the current in the Schottky junction. This function is applied with defined conduction phenomena, whereas other work presented arbitrary (or undefined) conduction mechanisms in such parameter extractions. Based upon AlGaN/GaN HEMT structures, parameter extraction is carried out in order to provide physical characteristics. This work highlights a new expression of current with defined conduction phenomena in order to quantify the physical properties of Schottky contacts in AlGaN/GaN HEMT transistors. Project supported by the French Department of Defense (DGA).
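    The abstract does not reproduce the authors' expressions, but the standard Lambert-W closed form for a Schottky diode with series resistance illustrates the idea; all parameter values in the sketch below are invented.

```python
import numpy as np
from scipy.special import lambertw

# Sketch of the textbook Lambert-W closed form for a Schottky diode with series
# resistance, I = Is*(exp((V - I*Rs)/(n*Vt)) - 1); parameter values are invented.
k_B, q = 1.380649e-23, 1.602176634e-19
T = 300.0
Vt = k_B * T / q             # thermal voltage (~25.85 mV)

Is, n, Rs = 1e-9, 1.5, 10.0  # saturation current (A), ideality factor, series resistance (ohm)

def diode_current(V):
    a = n * Vt
    arg = (Is * Rs / a) * np.exp((V + Is * Rs) / a)
    return (a / Rs) * np.real(lambertw(arg)) - Is

V = np.linspace(0.0, 1.0, 11)
I = diode_current(V)
# Check that the implicit diode equation is satisfied at one bias point.
resid = I[5] - Is * (np.exp((V[5] - I[5] * Rs) / (n * Vt)) - 1.0)
print(I, resid)
```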

  18. [Physical fingerprint for quality control of traditional Chinese medicine extract powders].

    PubMed

    Zhang, Yi; Xu, Bing; Sun, Fei; Wang, Xin; Zhang, Na; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2016-06-01

    The physical properties of both raw materials and excipients are closely correlated with the quality of traditional Chinese medicine preparations in oral solid dosage forms. In this paper, based on the concept of the chemical fingerprint for quality control of traditional Chinese medicine products, the method of physical fingerprint for quality evaluation of traditional Chinese medicine extract powders was proposed. This novel physical fingerprint was built by the radar map, and consisted of five primary indexes (i.e. stackability, homogeneity, flowability, compressibility and stability) and 12 secondary indexes (i.e. bulk density, tap density, percentage of particles <50 μm, relative homogeneity index, Hausner ratio, angle of repose, powder flow time, inter-particle porosity, Carr index, cohesion index, loss on drying, hygroscopicity). Panax notoginseng saponins (PNS) extract was taken as an example. This paper introduced the application of physical fingerprint in the evaluation of source-to-source and batch-to-batch quality consistency of PNS extract powders. Moreover, the physical fingerprint of PNS was built by calculating the index of parameters, the index of parametric profile and the index of good compressibility, in order to successfully predict the compressibility of the PNS extract powder and relevant formulations containing PNS extract powder and conventional pharmaceutical excipients. The results demonstrated that the proposed method could provide new insights into the development and process control of traditional Chinese medicine solid dosage forms. Copyright © by the Chinese Pharmaceutical Association.
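    Two of the secondary indexes listed above, the Hausner ratio and the Carr (compressibility) index, follow directly from bulk and tapped density; a minimal sketch with invented density values:

```python
# Two of the secondary indexes in the physical fingerprint can be computed
# directly from bulk and tapped density; the density values below are invented.
bulk_density = 0.42   # g/mL, poured (bulk) density of an extract powder
tap_density = 0.58    # g/mL, tapped density

hausner_ratio = tap_density / bulk_density
carr_index = 100.0 * (tap_density - bulk_density) / tap_density   # % compressibility

print(f"Hausner ratio: {hausner_ratio:.2f}")   # >1.25 is commonly read as poor flowability
print(f"Carr index:    {carr_index:.1f} %")    # >25 % is commonly read as poor flow
```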

  19. Automated extraction and semantic analysis of mutation impacts from the biomedical literature

    PubMed Central

    2012-01-01

    Background Mutations as sources of evolution have long been the focus of attention in the biomedical literature. Accessing the mutational information and their impacts on protein properties facilitates research in various domains, such as enzymology and pharmacology. However, manually curating the rich and fast growing repository of biomedical literature is expensive and time-consuming. As a solution, text mining approaches have increasingly been deployed in the biomedical domain. While the detection of single-point mutations is well covered by existing systems, challenges still exist in grounding impacts to their respective mutations and recognizing the affected protein properties, in particular kinetic and stability properties together with physical quantities. Results We present an ontology model for mutation impacts, together with a comprehensive text mining system for extracting and analysing mutation impact information from full-text articles. Organisms, as sources of proteins, are extracted to help disambiguation of genes and proteins. Our system then detects mutation series to correctly ground detected impacts using novel heuristics. It also extracts the affected protein properties, in particular kinetic and stability properties, as well as the magnitude of the effects and validates these relations against the domain ontology. The output of our system can be provided in various formats, in particular by populating an OWL-DL ontology, which can then be queried to provide structured information. The performance of the system is evaluated on our manually annotated corpora. In the impact detection task, our system achieves a precision of 70.4%-71.1%, a recall of 71.3%-71.5%, and grounds the detected impacts with an accuracy of 76.5%-77%. The developed system, including resources, evaluation data and end-user and developer documentation is freely available under an open source license at http://www.semanticsoftware.info/open-mutation-miner. Conclusion We present Open Mutation Miner (OMM), the first comprehensive, fully open-source approach to automatically extract impacts and related relevant information from the biomedical literature. We assessed the performance of our work on manually annotated corpora and the results show the reliability of our approach. The representation of the extracted information into a structured format facilitates knowledge management and aids in database curation and correction. Furthermore, access to the analysis results is provided through multiple interfaces, including web services for automated data integration and desktop-based solutions for end user interactions. PMID:22759648
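    A toy illustration of one small piece of such a pipeline (not OMM's actual grammar): a regular expression for single-point protein mutation mentions written in the common wild-type/position/mutant form.

```python
import re

# Illustrative only: a simple regex for single-point protein mutation mentions
# such as "A123V" (wild-type residue, position, mutant residue). Real systems
# need additional filtering to avoid false positives (e.g. chemical formulas).
MUTATION = re.compile(r"\b([ACDEFGHIKLMNPQRSTVWY])(\d{1,4})([ACDEFGHIKLMNPQRSTVWY])\b")

text = ("The A123V and G45D substitutions reduced thermal stability, "
        "whereas L7F increased kcat by two-fold.")

for wild_type, position, mutant in MUTATION.findall(text):
    print(f"wild type {wild_type} at position {position} mutated to {mutant}")
```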

  20. Automated ancillary cancer history classification for mesothelioma patients from free-text clinical reports

    PubMed Central

    Wilson, Richard A.; Chapman, Wendy W.; DeFries, Shawn J.; Becich, Michael J.; Chapman, Brian E.

    2010-01-01

    Background: Clinical records are often unstructured, free-text documents that create information extraction challenges and costs. Healthcare delivery and research organizations, such as the National Mesothelioma Virtual Bank, require the aggregation of both structured and unstructured data types. Natural language processing offers techniques for automatically extracting information from unstructured, free-text documents. Methods: Five hundred and eight history and physical reports from mesothelioma patients were split into development (208) and test sets (300). A reference standard was developed and each report was annotated by experts with regard to the patient’s personal history of ancillary cancer and family history of any cancer. The Hx application was developed to process reports, extract relevant features, perform reference resolution and classify them with regard to cancer history. Two methods, Dynamic-Window and ConText, for extracting information were evaluated. Hx’s classification responses using each of the two methods were measured against the reference standard. The average Cohen’s weighted kappa served as the human benchmark in evaluating the system. Results: Hx had a high overall accuracy, with each method scoring 96.2%. F-measures using the Dynamic-Window and ConText methods were 91.8% and 91.6%, which were comparable to the human benchmark of 92.8%. For the personal history classification, Dynamic-Window scored highest with 89.2% and for the family history classification, ConText scored highest with 97.6%, and both methods were comparable to the human benchmark of 88.3% and 97.2%, respectively. Conclusion: We evaluated an automated application’s performance in classifying a mesothelioma patient’s personal and family history of cancer from clinical reports. To do so, the Hx application must process reports, identify cancer concepts, distinguish the known mesothelioma from ancillary cancers, recognize negation, perform reference resolution and determine the experiencer. Results indicated that both information extraction methods tested were dependent on the domain-specific lexicon and negation extraction. We showed that the more general method, ConText, performed as well as our task-specific method. Although Dynamic-Window could be modified to retrieve other concepts, ConText is more robust and performs better on inconclusive concepts. Hx could greatly improve and expedite the process of extracting data from free-text clinical records for a variety of research or healthcare delivery organizations. PMID:21031012

  1. Perspective: Sloppiness and emergent theories in physics, biology, and beyond.

    PubMed

    Transtrum, Mark K; Machta, Benjamin B; Brown, Kevin S; Daniels, Bryan C; Myers, Christopher R; Sethna, James P

    2015-07-07

    Large scale models of physical phenomena demand the development of new statistical and computational tools in order to be effective. Many such models are "sloppy," i.e., exhibit behavior controlled by a relatively small number of parameter combinations. We review an information theoretic framework for analyzing sloppy models. This formalism is based on the Fisher information matrix, which is interpreted as a Riemannian metric on a parameterized space of models. Distance in this space is a measure of how distinguishable two models are based on their predictions. Sloppy model manifolds are bounded with a hierarchy of widths and extrinsic curvatures. The manifold boundary approximation can extract the simple, hidden theory from complicated sloppy models. We argue that the success of simple effective models in physics likewise emerges from complicated processes exhibiting a low effective dimensionality. We discuss the ramifications and consequences of sloppy models for biochemistry and science more generally. We suggest that our complex world is understandable for the same fundamental reason: simple theories of macroscopic behavior are hidden inside complicated microscopic processes.
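    A minimal sketch of the Fisher-information view of sloppiness for a least-squares model (the sum-of-exponentials model and parameter values are invented): the FIM is approximated as J^T J from a finite-difference Jacobian, and its eigenvalue spectrum typically spans many decades.

```python
import numpy as np

# Sketch: for a least-squares model with unit noise, the Fisher information
# matrix is J^T J, and "sloppiness" appears as eigenvalues spread over many
# decades. The sum-of-exponentials model and parameters below are invented.
t = np.linspace(0.0, 5.0, 40)

def model(theta):
    a1, k1, a2, k2 = theta
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

def jacobian(theta, eps=1e-6):
    # Finite-difference Jacobian of the model output w.r.t. the parameters.
    J = np.empty((t.size, len(theta)))
    for i in range(len(theta)):
        d = np.zeros(len(theta)); d[i] = eps
        J[:, i] = (model(theta + d) - model(theta - d)) / (2 * eps)
    return J

theta0 = np.array([1.0, 1.1, 1.0, 1.3])    # nearly degenerate decay rates
fim = jacobian(theta0).T @ jacobian(theta0)
eigvals = np.sort(np.linalg.eigvalsh(fim))[::-1]
print("FIM eigenvalues:", eigvals)          # typically spans several orders of magnitude
```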

  2. ANALYSIS OF OUT OF DATE MCU MODIFIER LOCATED IN SRNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, C.

    2014-10-22

    SRNL recently completed density measurements and chemical analyses on modifier samples stored in drums within SRNL. The modifier samples date back to 2008 and are in various quantities up to 40 gallons. Vendor information on the original samples indicates a shelf life of 5 years. There is interest in determining if samples that have been stored for more than the 5 year shelf life are still acceptable for use. The Modular Caustic Side Solvent Extraction Unit (MCU) Solvent component Cs-7SB [(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol, CAS #308362-88-1] is used as a diluent modifier to increase extractant solubility and provide physical characteristics necessary for diluent trimming.

  3. Fault-tolerant quantum error detection.

    PubMed

    Linke, Norbert M; Gutierrez, Mauricio; Landsman, Kevin A; Figgatt, Caroline; Debnath, Shantanu; Brown, Kenneth R; Monroe, Christopher

    2017-10-01

    Quantum computers will eventually reach a size at which quantum error correction becomes imperative. Quantum information can be protected from qubit imperfections and flawed control operations by encoding a single logical qubit in multiple physical qubits. This redundancy allows the extraction of error syndromes and the subsequent detection or correction of errors without destroying the logical state itself through direct measurement. We show the encoding and syndrome measurement of a fault-tolerantly prepared logical qubit via an error detection protocol on four physical qubits, represented by trapped atomic ions. This demonstrates the robustness of a logical qubit to imperfections in the very operations used to encode it. The advantage persists in the face of large added error rates and experimental calibration errors.

  4. Imaging nanoscale lattice variations by machine learning of x-ray diffraction microscopy data

    DOE PAGES

    Laanait, Nouamane; Zhang, Zhan; Schlepütz, Christian M.

    2016-08-09

    In this paper, we present a novel methodology based on machine learning to extract lattice variations in crystalline materials, at the nanoscale, from an x-ray Bragg diffraction-based imaging technique. By employing a full-field microscopy setup, we capture real space images of materials, with imaging contrast determined solely by the x-ray diffracted signal. The data sets that emanate from this imaging technique are a hybrid of real space information (image spatial support) and reciprocal lattice space information (image contrast), and are intrinsically multidimensional (5D). By a judicious application of established unsupervised machine learning techniques and multivariate analysis to this multidimensional data cube, we show how to extract features that can be ascribed physical interpretations in terms of common structural distortions, such as lattice tilts and dislocation arrays. Finally, we demonstrate this 'big data' approach to x-ray diffraction microscopy by identifying structural defects present in an epitaxial ferroelectric thin-film of lead zirconate titanate.
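    A schematic sketch of the kind of unsupervised analysis described, on a synthetic stand-in for the multidimensional data cube: each real-space pixel becomes a sample whose features are the diffraction-contrast values at different reciprocal-space/angle settings, and PCA plus clustering groups pixels with similar lattice response.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Illustrative sketch (synthetic data, not the paper's pipeline): treat each
# real-space pixel as a sample whose features are the diffraction-contrast
# values recorded at different settings, then group pixels with similar
# local lattice response.
rng = np.random.default_rng(2)
ny, nx, n_settings = 64, 64, 20

cube = rng.normal(0.0, 0.1, size=(ny, nx, n_settings))
cube[20:40, 20:40, :] += np.linspace(0.0, 1.0, n_settings)    # a "tilted" region

X = cube.reshape(-1, n_settings)                # (pixels, settings)
scores = PCA(n_components=3).fit_transform(X)   # low-dimensional feature per pixel
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

label_map = labels.reshape(ny, nx)              # spatial map of distinct lattice behavior
print(np.unique(label_map, return_counts=True))
```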

  5. Imaging nanoscale lattice variations by machine learning of x-ray diffraction microscopy data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laanait, Nouamane; Zhang, Zhan; Schlepütz, Christian M.

    In this paper, we present a novel methodology based on machine learning to extract lattice variations in crystalline materials, at the nanoscale, from an x-ray Bragg diffraction-based imaging technique. By employing a full-field microscopy setup, we capture real space images of materials, with imaging contrast determined solely by the x-ray diffracted signal. The data sets that emanate from this imaging technique are a hybrid of real space information (image spatial support) and reciprocal lattice space information (image contrast), and are intrinsically multidimensional (5D). By a judicious application of established unsupervised machine learning techniques and multivariate analysis to this multidimensional data cube, we show how to extract features that can be ascribed physical interpretations in terms of common structural distortions, such as lattice tilts and dislocation arrays. Finally, we demonstrate this 'big data' approach to x-ray diffraction microscopy by identifying structural defects present in an epitaxial ferroelectric thin-film of lead zirconate titanate.

  6. On the Concept of Information and Its Role in Nature

    NASA Astrophysics Data System (ADS)

    Roederer, Juan G.

    2003-03-01

    In this article we address some fundamental questions concerning information: Can the existing laws of physics adequately deal with the most striking property of information, namely to cause specific changes in the structure and energy flows of a complex system, without the information in itself representing fields, forces or energy in any of their characteristic forms? Or is information irreducible to the laws of physics and chemistry? Are information and complexity related concepts? Does the Universe, in its evolution, constantly generate new information? Or are information and information-processing exclusive attributes of living systems, related to the very definition of life? If that were the case, what happens with the physical meanings of entropy in statistical mechanics or wave function in quantum mechanics? How many distinct classes of information and information processing do exist in the biological world? How does information appear in Darwinian evolution? Does the human brain have unique properties or capabilities in terms of information processing? In what ways does information processing bring about human self-consciousness? We shall introduce the meaning of "information" in a way that is detached from human technological systems and related algorithms and semantics, and that is not based on any mathematical formula. To accomplish this we turn to the concept of interaction as the basic departing point, and identify two fundamentally different classes, with information and information-processing appearing as the key discriminator: force-field driven interactions between elementary particles and ensembles of particles in the macroscopic physical domain, and information-based interactions between certain kinds of complex systems that form the biological domain. We shall show that in an abiotic world, information plays no role; physical interactions just happen, they are driven by energy exchange between the interacting parts and do not require any operations of information processing. Information only enters the non-living physical world when a living thing interacts with it-and when a scientist extracts information through observation and measurement. But for living organisms, information is the very essence of their existence: to maintain a long-term state of unstable thermodynamic equilibrium with its surroundings, consistently increase its organization and reproduce, an organism has to rely on information-based interactions in which form or pattern, not energy, is the controlling factor. This latter class comprises biomolecular information processes controlling the metabolism, growth, multiplication and differentiation of cells, and neural information processes controlling animal behavior and intelligence. The only way new information can appear is through the process of biological evolution and, in the short term, through sensory acquisition and the manipulation of images in the nervous system. Non-living informational systems such as books, computers, AI systems and other artifacts, as well as living organisms that are the result of breeding or cloning, are planned by human beings and will not be considered here.

  7. Remarks on the pion-nucleon σ-term

    NASA Astrophysics Data System (ADS)

    Hoferichter, Martin; Ruiz de Elvira, Jacobo; Kubis, Bastian; Meißner, Ulf-G.

    2016-09-01

    The pion-nucleon σ-term can be stringently constrained by the combination of analyticity, unitarity, and crossing symmetry with phenomenological information on the pion-nucleon scattering lengths. Recently, lattice calculations at the physical point have been reported that find lower values by about 3σ with respect to the phenomenological determination. We point out that a lattice measurement of the pion-nucleon scattering lengths could help resolve the situation by testing the values extracted from spectroscopy measurements in pionic atoms.

  8. Towards Photoplethysmography-Based Estimation of Instantaneous Heart Rate During Physical Activity.

    PubMed

    Jarchi, Delaram; Casson, Alexander J

    2017-09-01

    Recently numerous methods have been proposed for estimating average heart rate using photoplethysmography (PPG) during physical activity, overcoming the significant interference that motion causes in PPG traces. We propose a new algorithm framework for extracting instantaneous heart rate from wearable PPG and Electrocardiogram (ECG) signals to provide an estimate of heart rate variability during exercise. For ECG signals, we propose a new spectral masking approach which modifies a particle filter tracking algorithm, and for PPG signals constrains the instantaneous frequency obtained from the Hilbert transform to a region of interest around a candidate heart rate measure. Performance is verified using accelerometry and wearable ECG and PPG data from subjects while biking and running on a treadmill. Instantaneous heart rate provides more information than average heart rate alone. The instantaneous heart rate can be extracted during motion to an accuracy of 1.75 beats per min (bpm) from PPG signals and 0.27 bpm from ECG signals. Estimates of instantaneous heart rate can now be generated from PPG signals during motion. These estimates can provide more information on the human body during exercise. Instantaneous heart rate provides a direct measure of vagal nerve and sympathetic nervous system activity and is of substantial use in a number of analyses and applications. Previously it has not been possible to estimate instantaneous heart rate from wrist-wearable PPG signals.
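    A minimal sketch of extracting instantaneous heart rate from a PPG-like signal with the Hilbert transform, constrained to a plausible heart-rate band; the signal, band limits, and filter order are invented, and the paper's particle-filter and spectral-masking machinery is not reproduced.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

# Sketch of instantaneous heart rate from a PPG-like signal via the Hilbert
# transform; the signal is synthetic and the band limits are illustrative.
fs = 125.0                                               # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
hr_true = 150 + 10 * np.sin(2 * np.pi * t / 30)          # bpm, varying during exercise
phase_true = 2 * np.pi * np.cumsum(hr_true / 60) / fs
ppg = np.sin(phase_true) + 0.1 * np.random.randn(t.size)

# Band-pass around a candidate heart-rate region (here 1-4 Hz, i.e. 60-240 bpm).
b, a = butter(3, [1.0 / (fs / 2), 4.0 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, ppg)

analytic = hilbert(filtered)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)  # Hz
inst_hr = 60.0 * inst_freq                                             # bpm

print(f"median instantaneous HR: {np.median(inst_hr):.1f} bpm "
      f"(true mean {hr_true.mean():.1f} bpm)")
```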

  9. Unconventional Tools for an Unconventional Resource: Community and Landscape Planning for Shale in the Marcellus Region

    NASA Astrophysics Data System (ADS)

    Murtha, T., Jr.; Orland, B.; Goldberg, L.; Hammond, R.

    2014-12-01

    Deep shale natural gas deposits made accessible by new technologies are quickly becoming a considerable share of North America's energy portfolio. Unlike traditional deposits and extraction footprints, shale gas offers dispersed and complex landscape and community challenges. These challenges are both cultural and environmental. This paper describes the development and application of creative geospatial tools as a means to engage communities along the northern tier counties of Pennsylvania, experiencing Marcellus shale drilling in design and planning. Uniquely combining physical landscape models with predictive models of exploration activities, including drilling, pipeline construction and road reconstruction, the tools quantify the potential impacts of drilling activities for communities and landscapes in the commonwealth of Pennsylvania. Dividing the state into 9836 watershed sub-basins, we first describe the current state of Marcellus related activities through 2014. We then describe and report the results of three scaled predictive models designed to investigate probable sub-basins where future activities will be focused. Finally, the core of the paper reports on the second level of tools we have now developed to engage communities in planning for unconventional gas extraction in Pennsylvania. Using a geodesign approach we are working with communities to transfer information for comprehensive landscape planning and informed decision making. These tools not only quantify physical landscape impacts, but also quantify potential visual, aesthetic and cultural resource implications.

  10. Physical environment and life expectancy at birth in Mexico: an eco-epidemiological study.

    PubMed

    Idrovo, Alvaro J

    2011-06-01

    The objective of this ecological study was to ascertain the effects of physical environment on life expectancy at birth, using data from all 32 Mexican states. 50 environmental indicators with information about demography, housing, poverty, water, soils, biodiversity, forestry resources, and residues were included in exploratory factor analysis. Four factors were extracted: population vulnerability/susceptibility, and biodiversity (FC1), urbanization, industrialization, and environmental sustainability (FC2), ecological resilience (FC3), and free-plague environments (FC4). Using OLS regressions, FC2, FC3, and FC4 were found to be positively associated with life expectancy at birth, while FC1 was negatively associated. This study suggests that physical environment is an important macro-determinant of the health of the Mexican population, and highlights the usefulness of ecological concepts in epidemiological studies.

  11. Emerging interdependence between stock values during financial crashes.

    PubMed

    Rocchi, Jacopo; Tsui, Enoch Yan Lok; Saad, David

    2017-01-01

    To identify emerging interdependencies between traded stocks we investigate the behavior of the stocks of FTSE 100 companies in the period 2000-2015, by looking at daily stock values. Exploiting the power of information theoretical measures to extract direct influences between multiple time series, we compute the information flow across stock values to identify several different regimes. While only small information flows are detected in most of the period, a dramatically different situation occurs in the proximity of global financial crises, where stock values exhibit strong and substantial interdependence for a prolonged period. This behavior is consistent with what one would generally expect from a complex system near criticality in physical systems, showing the long-lasting effects of crashes on stock markets.
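    A minimal binned transfer-entropy estimator illustrates one information-theoretic measure of directed influence between two series (the paper's exact estimator may differ); the series below are synthetic.

```python
import numpy as np

# Minimal binned transfer entropy TE(X -> Y): how much past X reduces the
# uncertainty of the next Y beyond Y's own past. One of several possible
# directed-influence estimators; the series here are synthetic returns.
def transfer_entropy(x, y, bins=4):
    # Discretize by quantiles, then count joint occurrences of (y_{t+1}, y_t, x_t).
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    joint = np.zeros((bins, bins, bins))
    for a, b, c in zip(y_next, y_now, x_now):
        joint[a, b, c] += 1
    joint /= joint.sum()

    p_yy = joint.sum(axis=2)          # p(y_{t+1}, y_t)
    p_yx = joint.sum(axis=0)          # p(y_t, x_t)
    p_y = joint.sum(axis=(0, 2))      # p(y_t)

    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p = joint[a, b, c]
                if p > 0 and p_yy[a, b] > 0 and p_yx[b, c] > 0:
                    te += p * np.log2(p * p_y[b] / (p_yy[a, b] * p_yx[b, c]))
    return te

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = 0.6 * np.roll(x, 1) + 0.4 * rng.normal(size=2000)   # y is driven by lagged x
print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits, "
      f"TE(y->x) = {transfer_entropy(y, x):.3f} bits")
```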

  12. Interferometer with Continuously Varying Path Length Measured in Wavelengths to the Reference Mirror

    NASA Technical Reports Server (NTRS)

    Ohara, Tetsuo (Inventor)

    2016-01-01

    An interferometer in which the path length of the reference beam, measured in wavelengths, is continuously changing in sinusoidal fashion and the interference signal created by combining the measurement beam and the reference beam is processed in real time to obtain the physical distance along the measurement beam between the measured surface and a spatial reference frame such as the beam splitter. The processing involves analyzing the Fourier series of the intensity signal at one or more optical detectors in real time and using the time-domain multi-frequency harmonic signals to extract the phase information independently at each pixel position of one or more optical detectors and converting the phase information to distance information.
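    One textbook way to recover the phase when the reference path length is modulated sinusoidally is to compare the first and second harmonics of the detector intensity, whose amplitudes carry sin(phi) and cos(phi) weighted by Bessel functions; the sketch below uses invented signal parameters and is not the patented processing scheme itself.

```python
import numpy as np
from scipy.special import jv

# Textbook sinusoidal phase-modulation recovery: with
# I(t) = A + B*cos(phi + z*sin(w*t)), the sin(w*t) component is
# -2*B*J1(z)*sin(phi) and the cos(2*w*t) component is 2*B*J2(z)*cos(phi).
# All signal parameters below are invented for illustration.
fs, f_mod, z, phi_true = 100_000.0, 1_000.0, 2.0, 0.7
t = np.arange(0, 0.01, 1 / fs)            # ten full modulation periods
w = 2 * np.pi * f_mod
intensity = 5.0 + 2.0 * np.cos(phi_true + z * np.sin(w * t))

# Lock-in style projections onto sin(wt) and cos(2wt).
S1 = 2 * np.mean(intensity * np.sin(w * t))      # proportional to -J1(z)*sin(phi)
C2 = 2 * np.mean(intensity * np.cos(2 * w * t))  # proportional to  J2(z)*cos(phi)

phi_est = np.arctan2(-S1 * jv(2, z), C2 * jv(1, z))
print(f"recovered phase: {phi_est:.3f} rad (true {phi_true:.3f} rad)")
```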

  13. Emerging interdependence between stock values during financial crashes

    PubMed Central

    Tsui, Enoch Yan Lok; Saad, David

    2017-01-01

    To identify emerging interdependencies between traded stocks we investigate the behavior of the stocks of FTSE 100 companies in the period 2000-2015, by looking at daily stock values. Exploiting the power of information theoretical measures to extract direct influences between multiple time series, we compute the information flow across stock values to identify several different regimes. While only small information flows are detected in most of the period, a dramatically different situation occurs in the proximity of global financial crises, where stock values exhibit strong and substantial interdependence for a prolonged period. This behavior is consistent with what one would generally expect from a complex system near criticality in physical systems, showing the long-lasting effects of crashes on stock markets. PMID:28542278

  14. Split-mouth comparison of physics forceps and extraction forceps in orthodontic extraction of upper premolars.

    PubMed

    Hariharan, Samyuktha; Narayanan, Vinod; Soh, Chen Loong

    2014-12-01

    We compared outcome variables (operative complications, inflammatory complications, and operating time) in patients being treated by orthodontic extraction of upper premolars with the Physics forceps or the universal extraction forceps. We organised a single blind, split-mouth clinical trial to compare the outcomes of the 2 groups (n=54 premolars). The Physics forceps group had lower mean (SD) visual analogue scores (VAS) for pain (0.59 (0.57)) on the first postoperative day than the other group (1.04 (0.85)) (p=0.03). There were no other significant differences between the 2 groups in any other variable studied. Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  15. [Application of regular expression in extracting key information from Chinese medicine literatures about re-evaluation of post-marketing surveillance].

    PubMed

    Wang, Zhifei; Xie, Yanming; Wang, Yongyan

    2011-10-01

    Computerized extraction of information from Chinese medicine literature is more convenient than hand searching: it can simplify the search process and improve accuracy. Among the many automated extraction methods now in use, regular expressions are particularly efficient for extracting useful information in research. This article focuses on applying regular expressions to extract information from Chinese medicine literature. Two practical examples are reported, extracting "case number (non-terminology)" and "efficacy rate (subgroups for related information identification)", which illustrate how information can be extracted from Chinese medicine literature by means of this method.
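    An illustrative sketch (the article's actual expressions are not given here): simple patterns for a case count ("N 例") and an efficacy rate ("有效率 … %") applied to a fragment of invented Chinese text.

```python
import re

# Illustrative patterns only: extract a case count ("N 例") and total efficacy
# rates ("总有效率 ... %") from a fragment of invented Chinese text.
text = "本研究共纳入患者120例，治疗组总有效率为93.3%，对照组总有效率为76.7%。"

case_number = re.search(r"(\d+)\s*例", text)
efficacy_rates = re.findall(r"有效率[为是]?\s*(\d+(?:\.\d+)?)\s*%", text)

print("case number:", case_number.group(1) if case_number else None)
print("efficacy rates (%):", efficacy_rates)
```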

  16. Effects of Distant Green Space on Physical Activity in Sydney, Australia.

    PubMed

    Chong, Shanley; Byun, Roy; Mazumdar, Soumya; Bauman, Adrian; Jalaludin, Bin

    2017-01-01

    The aim was to investigate the association between distant green space and physical activity modified by local green space. Information about physical activity, demographic and socioeconomic background at the individual level was extracted from the New South Wales Population Health Survey. The proportion of a postcode that was parkland was used as a proxy measure for access to parklands and was calculated for each individual. There was a significant relationship between distant green space and engaging in moderate-to-vigorous physical activity (MVPA) at least once a week. No significant relationship was found between adequate physical activity and distant green space. No significant relationships were found between adequate physical activity, engaging in MVPA, and local green space. However, if respondents lived in greater local green space (≥25%), there was a significant relationship between engaging in MVPA at least once a week and distant green space of ≥20%. This study highlights the important effect of distant green space on physical activity. Our findings also suggest that a moderate amount of local green space together with a moderate amount of distant green space are important levers for participation in physical activity.

  17. Challenges in Managing Information Extraction

    ERIC Educational Resources Information Center

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  18. Thermal Physical Property-Based Fusion of Geostationary Meteorological Satellite Visible and Infrared Channel Images

    PubMed Central

    Han, Lei; Shi, Lu; Yang, Yiling; Song, Dalei

    2014-01-01

    Geostationary meteorological satellite infrared (IR) channel data contain important spectral information for meteorological research and applications, but their spatial resolution is relatively low. The objective of this study is to obtain higher-resolution IR images. One common method of increasing resolution fuses the IR data with high-resolution visible (VIS) channel data. However, most existing image fusion methods focus only on visual performance, and often fail to take into account the thermal physical properties of the IR images. As a result, spectral distortion occurs frequently. To tackle this problem, we propose a thermal physical properties-based correction method for fusing geostationary meteorological satellite IR and VIS images. In our two-step process, the high-resolution structural features of the VIS image are first extracted and incorporated into the IR image using a regular multi-resolution fusion approach, such as multiwavelet analysis. This step significantly increases the visual details in the IR image, but fake thermal information may be included. Next, the Stefan-Boltzmann Law is applied to correct the distortion, to retain or recover the thermal infrared nature of the fused image. The results of both the qualitative and quantitative evaluation demonstrate that the proposed physical correction method both improves the spatial resolution and preserves the infrared thermal properties. PMID:24919017
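    One simple way to impose the kind of Stefan-Boltzmann consistency described, sketched on synthetic data (the paper's actual correction may differ): after adding high-resolution detail, each low-resolution footprint is rescaled so its mean emitted flux sigma*T^4 matches the original IR pixel.

```python
import numpy as np

# Minimal sketch of a Stefan-Boltzmann-based consistency correction: after
# injecting high-resolution detail into an IR brightness-temperature image,
# rescale each low-resolution footprint so its mean emitted flux (sigma*T^4)
# matches the original IR pixel. The data below are synthetic.
sigma = 5.670374419e-8                        # Stefan-Boltzmann constant (W m^-2 K^-4)
block = 4                                     # high-res pixels per IR pixel (per axis)

rng = np.random.default_rng(4)
ir_lowres = 240.0 + 20.0 * rng.random((16, 16))           # IR brightness temps (K)
fused = np.kron(ir_lowres, np.ones((block, block)))       # upsampled IR
fused += 5.0 * rng.standard_normal(fused.shape)           # VIS-derived detail (fake)

corrected = fused.copy()
for i in range(ir_lowres.shape[0]):
    for j in range(ir_lowres.shape[1]):
        sl = np.s_[i * block:(i + 1) * block, j * block:(j + 1) * block]
        target_flux = sigma * ir_lowres[i, j] ** 4
        mean_flux = np.mean(sigma * corrected[sl] ** 4)
        corrected[sl] *= (target_flux / mean_flux) ** 0.25   # preserve block-mean flux

# Each block's mean T^4 now reproduces the original low-resolution pixel.
print(np.allclose((corrected.reshape(16, block, 16, block) ** 4).mean(axis=(1, 3)),
                  ir_lowres ** 4))
```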

  19. Thermal physical property-based fusion of geostationary meteorological satellite visible and infrared channel images.

    PubMed

    Han, Lei; Shi, Lu; Yang, Yiling; Song, Dalei

    2014-06-10

    Geostationary meteorological satellite infrared (IR) channel data contain important spectral information for meteorological research and applications, but their spatial resolution is relatively low. The objective of this study is to obtain higher-resolution IR images. One common method of increasing resolution fuses the IR data with high-resolution visible (VIS) channel data. However, most existing image fusion methods focus only on visual performance, and often fail to take into account the thermal physical properties of the IR images. As a result, spectral distortion occurs frequently. To tackle this problem, we propose a thermal physical properties-based correction method for fusing geostationary meteorological satellite IR and VIS images. In our two-step process, the high-resolution structural features of the VIS image are first extracted and incorporated into the IR image using a regular multi-resolution fusion approach, such as multiwavelet analysis. This step significantly increases the visual details in the IR image, but fake thermal information may be included. Next, the Stefan-Boltzmann Law is applied to correct the distortion, to retain or recover the thermal infrared nature of the fused image. The results of both the qualitative and quantitative evaluation demonstrate that the proposed physical correction method both improves the spatial resolution and preserves the infrared thermal properties.

  20. Modeling of In-stream Tidal Energy Development and its Potential Effects in Tacoma Narrows, Washington, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Wang, Taiping; Copping, Andrea E.

    Understanding and providing proactive information on the potential for tidal energy projects to cause changes to the physical system and to key water quality constituents in tidal waters is a necessary and cost-effective means to avoid costly regulatory involvement and late stage surprises in the permitting process. This paper presents a modeling study for evaluating the tidal energy extraction and its potential impacts on the marine environment in a real world site - Tacoma Narrows of Puget Sound, Washington State, USA. An unstructured-grid coastal ocean model, fitted with a module that simulates tidal energy devices, was applied to simulate the tidal energy extracted by different turbine array configurations and the potential effects of the extraction at local and system-wide scales in Tacoma Narrows and South Puget Sound. Model results demonstrated the advantage of an unstructured-grid model for simulating the far-field effects of tidal energy extraction in a large model domain, as well as assessing the near-field effect using a fine grid resolution near the tidal turbines. The outcome shows that a realistic near-term deployment scenario extracts a very small fraction of the total tidal energy in the system and that system wide environmental effects are not likely; however, near-field effects on the flow field and bed shear stress in the area of tidal turbine farm are more likely. Model results also indicate that from a practical standpoint, hydrodynamic or water quality effects are not likely to be the limiting factor for development of large commercial-scale tidal farms. Results indicate that very high numbers of turbines are required to significantly alter the tidal system; limitations on marine space or other environmental concerns are likely to be reached before reaching these deployment levels. These findings show that important information obtained from numerical modeling can be used to inform regulatory and policy processes for tidal energy development.

  1. PREdator: a python based GUI for data analysis, evaluation and fitting

    PubMed Central

    2014-01-01

    The analysis of a series of experimental data is an essential procedure in virtually every field of research. The information contained in the data is extracted by fitting the experimental data to a mathematical model. The type of the mathematical model (linear, exponential, logarithmic, etc.) reflects the physical laws that underlie the experimental data. Here, we aim to provide a readily accessible, user-friendly python script for data analysis, evaluation and fitting. PREdator is presented at the example of NMR paramagnetic relaxation enhancement analysis.
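    A generic fitting sketch of the kind such a script automates, with an invented exponential-decay data set: choose a model reflecting the underlying physics and extract its parameters and uncertainties by least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic least-squares fitting sketch; the exponential model and data are invented.
def exponential(x, amplitude, rate):
    return amplitude * np.exp(-rate * x)

x = np.linspace(0, 10, 30)
rng = np.random.default_rng(5)
y = exponential(x, 2.0, 0.4) + 0.05 * rng.standard_normal(x.size)

params, covariance = curve_fit(exponential, x, y, p0=(1.0, 1.0))
errors = np.sqrt(np.diag(covariance))
print(f"amplitude = {params[0]:.2f} ± {errors[0]:.2f}, rate = {params[1]:.2f} ± {errors[1]:.2f}")
```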

  2. Tethers in space handbook

    NASA Technical Reports Server (NTRS)

    Reese, T. G.; Baracat, W. A.; Butner, C. L.

    1986-01-01

    The handbook provides a list and description of ongoing tether programs. This includes the joint U.S.-Italy demonstration project, and individual U.S. and Italian studies and demonstration programs. An overview of the current activity level and areas of emphasis in this emerging field is provided. The fundamental physical principles behind the proposed tether applications are addressed. Four basic concepts of gravity gradient, rotation, momentum exchange, and electrodynamics are discussed. Information extracted from literature, which supplements and enhances the tether applications is also presented. A bibliography is appended.

  3. Cosmic X-ray physics

    NASA Technical Reports Server (NTRS)

    Mccammon, D.; Cox, D. P.; Kraushaar, W. L.; Sanders, W. T.

    1987-01-01

    The soft X-ray sky survey data are combined with the results from the UXT sounding rocket payload. Very strong constraints can then be placed on models of the origin of the soft diffuse background. Additional observational constraints force more complicated and realistic models. Significant progress was made in the extraction of more detailed spectral information from the UXT data set. Work was begun on a second generation proportional counter response model. The first flight of the sounding rocket will have a collimator to study the diffuse background.

  4. Infrared Spectroscopic Imaging: The Next Generation

    PubMed Central

    Bhargava, Rohit

    2013-01-01

    Infrared (IR) spectroscopic imaging seemingly matured as a technology in the mid-2000s, with commercially successful instrumentation and reports in numerous applications. Recent developments, however, have transformed our understanding of the recorded data, provided capability for new instrumentation, and greatly enhanced the ability to extract more useful information in less time. These developments are summarized here in three broad areas— data recording, interpretation of recorded data, and information extraction—and their critical review is employed to project emerging trends. Overall, the convergence of selected components from hardware, theory, algorithms, and applications is one trend. Instead of similar, general-purpose instrumentation, another trend is likely to be diverse and application-targeted designs of instrumentation driven by emerging component technologies. The recent renaissance in both fundamental science and instrumentation will likely spur investigations at the confluence of conventional spectroscopic analyses and optical physics for improved data interpretation. While chemometrics has dominated data processing, a trend will likely lie in the development of signal processing algorithms to optimally extract spectral and spatial information prior to conventional chemometric analyses. Finally, the sum of these recent advances is likely to provide unprecedented capability in measurement and scientific insight, which will present new opportunities for the applied spectroscopist. PMID:23031693

  5. Decoding Mode-mixing in Black-hole Merger Ringdown

    NASA Technical Reports Server (NTRS)

    Kelly, Bernard J.; Baker, John G.

    2013-01-01

    Optimal extraction of information from gravitational-wave observations of binary black-hole coalescences requires detailed knowledge of the waveforms. Current approaches for representing waveform information are based on spin-weighted spherical harmonic decomposition. Higher-order harmonic modes carrying a few percent of the total power output near merger can supply information critical to determining intrinsic and extrinsic parameters of the binary. One obstacle to constructing a full multi-mode template of merger waveforms is the apparently complicated behavior of some of these modes; instead of settling down to a simple quasinormal frequency with decaying amplitude, some |m| = 2 modes show periodic bumps characteristic of mode-mixing. We analyze the strongest of these modes, the anomalous (3, 2) harmonic mode, measured in a set of binary black-hole merger waveform simulations, and show that, to leading order, they are due to a mismatch between the spherical harmonic basis used for extraction in 3D numerical relativity simulations, and the spheroidal harmonics adapted to the perturbation theory of Kerr black holes. Other causes of mode-mixing arising from gauge ambiguities and physical properties of the quasinormal ringdown modes are also considered and found to be small for the waveforms studied here.

  6. PCA Tomography: how to extract information from data cubes

    NASA Astrophysics Data System (ADS)

    Steiner, J. E.; Menezes, R. B.; Ricci, T. V.; Oliveira, A. S.

    2009-05-01

    Astronomy has evolved almost exclusively by the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes in which one combines both techniques simultaneously, producing images with spectral resolution. To extract information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method of analysis of data cubes (data from single-field observations, containing two spatial and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in the form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we will call tomograms. The association of the tomograms (images) to eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this information is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvector's orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
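    A minimal sketch of the decomposition described, on a synthetic cube: pixels are treated as samples, the eigenvectors are eigenspectra, and the projections reshaped back to the spatial grid are the tomograms.

```python
import numpy as np

# Sketch of the PCA decomposition described above: pixels of a (ny, nx, nlam)
# data cube are treated as samples, eigenvectors are "eigenspectra", and the
# projections reshaped back to (ny, nx) are the tomograms. The cube is synthetic.
rng = np.random.default_rng(6)
ny, nx, nlam = 30, 30, 200
cube = rng.normal(0.0, 1.0, size=(ny, nx, nlam))
cube[10:20, 10:20, 80:90] += 5.0                    # a localized "emission line"

X = cube.reshape(ny * nx, nlam)
X -= X.mean(axis=0)                                 # remove the mean spectrum

# SVD of the mean-subtracted data gives uncorrelated components ordered by variance.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
eigenspectra = Vt                                   # one spectrum per principal component
tomograms = (X @ Vt.T).reshape(ny, nx, -1)          # projections viewed as images

explained = s**2 / np.sum(s**2)
print("variance fraction of first 3 components:", explained[:3])
```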

  7. The effect of Valerian root extract on the severity of pre menstrual syndrome symptoms.

    PubMed

    Behboodi Moghadam, Zahra; Rezaei, Elham; Shirood Gholami, Roghaieh; Kheirkhah, Masomeh; Haghani, Hamid

    2016-07-01

    Premenstrual syndrome (PMS) is a common disorder. Because the precise etiology of this syndrome is not known, different treatment methods are recommended, one of which is the use of medicinal herbs. This study aimed to investigate the effect of Valerian (xié cǎo) root extract on the intensity of PMS symptoms. In this double-blind clinical trial, 100 female students of Islamic Azad University, Tonekabon Branch, Mazandaran Province, Iran, with PMS were randomly divided into groups receiving Valerian (scientific name: Valeriana officinalis) and placebo in 2013. The participants received 2 pills daily in the last seven days of their menstrual cycle for 3 cycles and recorded their symptoms. The data collection tools included a demographic information questionnaire, a daily symptom severity questionnaire, and a provisional diagnosis of premenstrual syndrome questionnaire. Data from the cycle before the intervention and from one, two, and three cycles after the intervention were compared and analyzed by independent t-test, paired t-test, chi-squared test, and repeated-measures ANOVA in SPSS 16. A significant difference was seen in mean emotional, behavioral and physical premenstrual symptom severity in the intervention group before and after the intervention (P < 0.001). However, this difference was not statistically significant in the control group. The results of this study showed that Valerian root extract may reduce emotional, physical, and behavioral symptoms of premenstrual syndrome.

  8. Implementation and Initial Testing of Advanced Processing and Analysis Algorithms for Correlated Neutron Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santi, Peter Angelo; Cutler, Theresa Elizabeth; Favalli, Andrea

    In order to improve the accuracy and capabilities of neutron multiplicity counting, additional quantifiable information is needed in order to address the assumptions that are present in the point model. Extracting and utilizing higher order moments (Quads and Pents) from the neutron pulse train represents the most direct way of extracting additional information from the measurement data to allow for an improved determination of the physical properties of the item of interest. The extraction of higher order moments from a neutron pulse train required the development of advanced dead time correction algorithms which could correct for dead time effects in all of the measurement moments in a self-consistent manner. In addition, advanced analysis algorithms have been developed to address specific assumptions that are made within the current analysis model, namely that all neutrons are created at a single point within the item of interest, and that all neutrons that are produced within an item are created with the same energy distribution. This report will discuss the current status of implementation and initial testing of the advanced dead time correction and analysis algorithms that have been developed in an attempt to utilize higher order moments to improve the capabilities of correlated neutron measurement techniques.
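    A small sketch of extracting factorial moments from a measured multiplicity histogram (the orders beyond three correspond to the Quads and Pents mentioned above); the histogram is invented and no dead-time correction is applied.

```python
import numpy as np

# Sketch of factorial moments from a multiplicity histogram; the counts are
# invented and no dead-time correction is applied. Orders 4 and 5 correspond
# to the "Quads" and "Pents" referred to above.
counts = np.array([1200, 950, 420, 150, 45, 12, 3])   # events with n = 0..6 neutrons
n = np.arange(counts.size)
p = counts / counts.sum()                             # multiplicity distribution P(n)

def factorial_moment(p, n, k):
    # k-th falling-factorial moment: sum_n P(n) * n*(n-1)*...*(n-k+1)
    falling = np.ones_like(n, dtype=float)
    for j in range(k):
        falling *= np.clip(n - j, 0, None)
    return np.sum(p * falling)

for k in range(1, 6):
    print(f"order-{k} factorial moment: {factorial_moment(p, n, k):.4f}")
```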

  9. PropBase Query Layer: a single portal to UK subsurface physical property databases

    NASA Astrophysics Data System (ADS)

    Kingdon, Andrew; Nayembil, Martin L.; Richardson, Anne E.; Smith, A. Graham

    2013-04-01

    Until recently, the delivery of geological information for industry and public was achieved by geological mapping. Now pervasively available computers mean that 3D geological models can deliver realistic representations of the geometric location of geological units, represented as shells or volumes. The next phase of this process is to populate these with physical properties data that describe subsurface heterogeneity and its associated uncertainty. Achieving this requires capture and serving of physical, hydrological and other property information from diverse sources to populate these models. The British Geological Survey (BGS) holds large volumes of subsurface property data, derived both from their own research data collection and also other, often commercially derived data sources. This can be voxelated to incorporate this data into the models to demonstrate property variation within the subsurface geometry. All property data held by BGS has for many years been stored in relational databases to ensure their long-term continuity. However these have, by necessity, complex structures; each database contains positional reference data and model information, and also metadata such as sample identification information and attributes that define the source and processing. Whilst this is critical to assessing these analyses, it also hugely complicates the understanding of variability of the property under assessment and requires multiple queries to study related datasets making extracting physical properties from these databases difficult. Therefore the PropBase Query Layer has been created to allow simplified aggregation and extraction of all related data and its presentation of complex data in simple, mostly denormalized, tables which combine information from multiple databases into a single system. The structure from each relational database is denormalized in a generalised structure, so that each dataset can be viewed together in a common format using a simple interface. Data are re-engineered to facilitate easy loading. The query layer structure comprises tables, procedures, functions, triggers, views and materialised views. The structure contains a main table PRB_DATA which contains all of the data with the following attribution:
    • a unique identifier
    • the data source
    • the unique identifier from the parent database for traceability
    • the 3D location
    • the property type
    • the property value
    • the units
    • necessary qualifiers
    • precision information and an audit trail
    Data sources, property type and units are constrained by dictionaries, a key component of the structure which defines what properties and inheritance hierarchies are to be coded and also guides the process as to what and how these are extracted from the structure. Data types served by the Query Layer include site investigation derived geotechnical data, hydrogeology datasets, regional geochemistry, geophysical logs as well as lithological and borehole metadata. The size and complexity of the data sets with multiple parent structures requires a technically robust approach to keep the layer synchronised. This is achieved through Oracle procedures written in PL/SQL containing the logic required to carry out the data manipulation (inserts, updates, deletes) to keep the layer synchronised with the underlying databases either as regular scheduled jobs (weekly, monthly etc) or invoked on demand.
    The PropBase Query Layer's implementation has enabled rapid data discovery, visualisation and interpretation of geological data with greater ease, simplifying the parametrisation of 3D model volumes and facilitating the study of intra-unit heterogeneity.
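    An illustrative sketch only, using SQLite in place of the Oracle schema and with column names loosely inferred from the attributes listed above: a single denormalised PRB_DATA-style table lets one query span datasets that originally lived in separate databases.

```python
import sqlite3

# Illustrative sketch: a denormalised PRB_DATA-style table queried through a
# single portal. Column names are inferred loosely from the attributes listed
# above; the real system is an Oracle schema maintained by PL/SQL procedures.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE PRB_DATA (
        id INTEGER PRIMARY KEY,
        data_source TEXT, source_id TEXT,
        x REAL, y REAL, z REAL,
        property_type TEXT, property_value REAL, units TEXT,
        qualifier TEXT
    )""")
con.executemany(
    "INSERT INTO PRB_DATA VALUES (?,?,?,?,?,?,?,?,?,?)",
    [(1, "geotechnical_db", "BH001/S1", 451200.0, 206300.0, -12.5,
      "porosity", 0.21, "fraction", None),
     (2, "hydrogeology_db", "WELL42", 451900.0, 207100.0, -30.0,
      "porosity", 0.18, "fraction", "lab"),
     (3, "geophysics_db", "LOG7", 452400.0, 205800.0, -25.0,
      "density", 2.41, "g/cm3", None)])

# One query spanning datasets that originally lived in separate databases.
for row in con.execute(
        "SELECT data_source, source_id, z, property_value, units "
        "FROM PRB_DATA WHERE property_type = 'porosity' AND z BETWEEN -40 AND 0"):
    print(row)
```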

  10. a Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    NASA Astrophysics Data System (ADS)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from TM images in which true information is lost because of blocking clouds and missing data stripes. Because water is continuously distributed under natural conditions, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different kinds of disturbance from clouds and missing data stripes were simulated. Water information was extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller Areal Error and a higher Boundary Recall can be obtained using this method compared with the conventional methods.
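    A minimal global histogram-matching sketch of the kind used as a baseline in the comparison above, on synthetic images: the grey levels of a partly corrupted image are remapped so its cumulative distribution matches a clear reference.

```python
import numpy as np

# Minimal global histogram matching on synthetic images: map the grey levels of
# a partly corrupted image so its cumulative distribution matches a reference.
def match_histogram(source, reference):
    s_values, s_idx, s_counts = np.unique(source.ravel(),
                                          return_inverse=True, return_counts=True)
    r_values, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    matched = np.interp(s_cdf, r_cdf, r_values)      # map source quantiles to reference levels
    return matched[s_idx].reshape(source.shape)

rng = np.random.default_rng(7)
reference = rng.normal(100, 20, size=(64, 64))
corrupted = rng.normal(80, 10, size=(64, 64))        # e.g. affected by clouds/stripes
restored = match_histogram(corrupted, reference)
print(restored.mean(), restored.std())               # now close to the reference statistics
```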

  11. Model-Based Detection of Radioactive Contraband for Harbor Defense Incorporating Compton Scattering Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J V; Chambers, D H; Breitfeller, E F

    2010-03-02

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Photon emissions from threat materials challenge both detection and measurement technologies, especially when threats are concealed by various types of shielding, which complicates the transport physics significantly. This problem becomes especially important when ships are intercepted by U.S. Coast Guard harbor patrols searching for contraband. The development of a sequential model-based processor that captures both the underlying transport physics of gamma-ray emissions including Compton scattering and the measurement of photon energies offers a physics-based approach to attack this challenging problem. The inclusion of a basic radionuclide representation of absorbed/scattered photons at a given energy along with interarrival times is used to extract the physics information available from the noisy measurements of the portable radiation detection systems used to interdict contraband. It is shown that this physics representation can incorporate scattering physics, leading to an 'extended' model-based structure that can be used to develop an effective sequential detection technique. The resulting model-based processor is shown to perform quite well based on data obtained from a controlled experiment.
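    A sketch of a sequential likelihood-ratio detector on a photon stream, using only exponential interarrival times under invented source and background rates (the scattering physics and energy measurements of the actual processor are omitted).

```python
import numpy as np

# SPRT-like sequential detector: accumulate the log-likelihood ratio of
# "source present" vs "background only" from photon interarrival times and
# stop when a threshold is crossed. Rates and thresholds are invented, and the
# energy/scattering model of the actual processor is omitted.
rng = np.random.default_rng(8)

rate_bkg, rate_src = 5.0, 9.0                     # counts per second (invented)
true_rate = rate_src
interarrivals = rng.exponential(1.0 / true_rate, size=500)

log_A, log_B = np.log(99.0), np.log(1.0 / 99.0)   # thresholds for ~1% error rates

llr = 0.0
for k, dt in enumerate(interarrivals, start=1):
    # Exponential interarrival likelihoods under each hypothesis.
    llr += (np.log(rate_src) - rate_src * dt) - (np.log(rate_bkg) - rate_bkg * dt)
    if llr >= log_A:
        print(f"decide 'source present' after {k} photons"); break
    if llr <= log_B:
        print(f"decide 'background only' after {k} photons"); break
else:
    print("no decision within the photon budget")
```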

  12. Physical characteristics related to bra fit.

    PubMed

    Chen, Chin-Man; LaBat, Karen; Bye, Elizabeth

    2010-04-01

    Producing well-fitting garments has been a challenge for retailers and manufacturers since mass production began. Poorly fitted bras can cause discomfort or pain and result in lost sales for retailers. Because body contours are important factors affecting bra fit, this study analyses the relationship of physical characteristics to bra-fit problems. This study has used 3-D body-scanning technology to extract upper body angles from a sample of 103 college women; these data were used to categorise physical characteristics into shoulder slope, bust prominence, back curvature and acromion placement. Relationships between these physical categories and bra-fit problems were then analysed. Results show that significant main effects and two-way interactions of the physical categories exist in the fit problems of poor bra support and bra-motion restriction. The findings are valuable in helping the apparel industry create better-fitting bras. STATEMENT OF RELEVANCE: Poorly fitted bras can cause discomfort or pain and result in lost sales for retailers. The findings regarding body-shape classification provide researchers with a statistics method to quantify physical characteristics and the findings regarding the relationship analysis between physical characteristics and bra fit offer bra companies valuable information about bra-fit perceptions attributable to women with figure variations.

  13. Nonvolatile chemical cues affect host-plant ranking by gravid Polygonia c-album females.

    PubMed

    Mozūraitis, Raimondas; Murtazina, Rushana; Nylin, Sören; Borg-Karlson, Anna-Karin

    2012-01-01

    In a multiple-choice test, the preference of egg-laying Polygonia c-album (comma butterfly) females was studied for oviposition on plants bearing surrogate leaves treated with crude methanol extracts obtained from leaves of seven host-plant species: Humulus lupulus, Urtica dioica, Ulmus glabra, Salix caprea, Ribes nigrum, Corylus avellana, and Betula pubescens. The ranking order of surrogate leaves treated with host-plant extracts corresponded well to that reported on natural foliage, except R. nigrum. Thus, host-plant choice in P. c-album seems to be highly dependent on chemical cues. Moreover, after two subsequent fractionations using reversed-phase chromatography the nonvolatile chemical cues residing in the most polar water-soluble fractions evidently provided sufficient information for egg-laying females to discriminate and rank between the samples of more and less preferred plants, since the ranking in these assays was similar to that for natural foliage or whole methanol extracts, while the physical traits of the surrogate leaves remained uniform.

  14. Radiation crosslinking of highly plasticized PVC

    NASA Astrophysics Data System (ADS)

    Mendizabal, E.; Cruz, L.; Jasso, C. F.; Burillo, G.; Dakin, V. I.

    1996-02-01

    To improve the physical properties of highly plasticized PVC, the polymer was crosslinked by gamma irradiation using a dose rate of 91 kGy/h. The effect of plasticizer type was studied by using three different plasticizers, 2,2,4-trimethyl-1,3-pentanediol diisobutyrate (TXIB), di(2-ethylhexyl) phthalate (DOP) and di(2-ethylhexyl) terephthalate (DOTP), and varying irradiation doses. Gel content was determined by Soxhlet extraction, tensile measurements were made on a universal testing machine and the mechano-dynamic measurements were made in a dynamic rheometer. It was found that considerable bonding of plasticizer molecules to macromolecules takes place along with crosslinking, so that the use of the solvent extraction method for measuring the degree of crosslinking can give erroneous information. The radiation-chemical crosslinking yield (Gc) and the molecular weight of interjunction chains (Mc) were calculated for the different systems studied. Addition of ethylene glycol dimethacrylate (EGDM) as a crosslinking coagent and dioctyl tin oxide (DOTO) as a stabilizer was also studied. Plasticizer extraction resistance was increased by the irradiation treatment.

  15. Occurrence of pesticide non extractable residues in physical and chemical fractions from two natural soils.

    NASA Astrophysics Data System (ADS)

    Andreou, K.; Jones, K.; Semple, K.

    2009-04-01

    The distribution of pesticide non extractable residues resulting from the incubation of two natural soils with each of the pesticides isoproturon, diazinon and cypermethrin was assessed in this study. The distribution of pesticide non extractable residues in soil physical and chemical fractions is known to ultimately affect their fate. This study aimed to address the fate and behaviour of the non extractable residues in the context of their association with soil physical and chemical fractions with varying properties and characteristics. Non extractable residues were formed from incubation of each pesticide in the two natural soils over a period of 24 months. Soils containing the non extractable residues were fractionated into three solid-phase fractions using a physical fractionation procedure as follows: (I) Sediment (SED, >20 μm), (II) Microaggregate (MA, 20-2 μm) and (III) Colloid phase (COL, 2-0.05 μm). Each soil fraction was then fractionated into organic carbon chemical fractions as follows: Fulvic acid (FA), Humic acid (HA) and Humin (HM). A significant amount of the pesticides was lost during the incubation period. Enrichment factors for the organic carbon and the 14C-pesticide residues were higher in the MA and COL fractions than in the SED fraction. Greater association and enrichment of the fulvic acid fraction of the organic carbon in the soil was observed. Non extractable residues in the FA fraction diminished, while those in the HA fraction increased, with decreasing fraction size. An appreciable amount of non extractable residues was located in the HM fraction, but this was less than the amount recovered in the humic substances. The long-term fate of pesticide non extractable residues in soil structural components is important in order to assess any risk associated with them.

  16. Information extraction system

    DOEpatents

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  17. Fault-tolerant quantum error detection

    PubMed Central

    Linke, Norbert M.; Gutierrez, Mauricio; Landsman, Kevin A.; Figgatt, Caroline; Debnath, Shantanu; Brown, Kenneth R.; Monroe, Christopher

    2017-01-01

    Quantum computers will eventually reach a size at which quantum error correction becomes imperative. Quantum information can be protected from qubit imperfections and flawed control operations by encoding a single logical qubit in multiple physical qubits. This redundancy allows the extraction of error syndromes and the subsequent detection or correction of errors without destroying the logical state itself through direct measurement. We show the encoding and syndrome measurement of a fault-tolerantly prepared logical qubit via an error detection protocol on four physical qubits, represented by trapped atomic ions. This demonstrates the robustness of a logical qubit to imperfections in the very operations used to encode it. The advantage persists in the face of large added error rates and experimental calibration errors. PMID:29062889

  18. Predicting scattering scanning near-field optical microscopy of mass-produced plasmonic devices

    NASA Astrophysics Data System (ADS)

    Otto, Lauren M.; Burgos, Stanley P.; Staffaroni, Matteo; Ren, Shen; Süzer, Özgün; Stipe, Barry C.; Ashby, Paul D.; Hammack, Aeron T.

    2018-05-01

    Scattering scanning near-field optical microscopy enables optical imaging and characterization of plasmonic devices with nanometer-scale resolution well below the diffraction limit. This technique enables developers to probe and understand the waveguide-coupled plasmonic antenna in as-fabricated heat-assisted magnetic recording heads. In order to validate and predict results and to extract information from experimental measurements that is physically comparable to simulations, a model was developed to translate the simulated electric field into expected near-field measurements using physical parameters specific to scattering scanning near-field optical microscopy physics. The methods used in this paper prove that scattering scanning near-field optical microscopy can be used to determine critical sub-diffraction-limited dimensions of optical field confinement, which is a crucial metrology requirement for the future of nano-optics, semiconductor photonic devices, and biological sensing where the near-field character of light is fundamental to device operation.

  19. Buying drugs on a Darknet market: A better deal? Studying the online illicit drug market through the analysis of digital, physical and chemical data.

    PubMed

    Rhumorbarbe, Damien; Staehli, Ludovic; Broséus, Julian; Rossy, Quentin; Esseiva, Pierre

    2016-10-01

    Darknet markets, also known as cryptomarkets, are websites located on the Darknet and designed to allow the trafficking of illicit products, mainly drugs. This study aims at presenting the added value of combining digital, chemical and physical information to reconstruct sellers' activities. In particular, this research focuses on Evolution, one of the most popular cryptomarkets active from January 2014 to March 2015. Evolution source code files were analysed using Python scripts based on regular expressions to extract information about listings (i.e., sales proposals) and sellers. The results revealed more than 48,000 listings and around 2700 vendors claiming to send illicit drug products from 70 countries. The most frequent categories of illicit drugs offered by vendors were cannabis-related products (around 25%) followed by ecstasy (MDA, MDMA) and stimulants (cocaine, speed). The cryptomarket was then especially studied from a Swiss point of view. Illicit drugs were purchased from three sellers located in Switzerland. The purchases were carried out to confront digital information (e.g., the type of drug, the purity, the shipping country and the concealment methods mentioned on listings) with the physical analysis of the shipment packaging and the chemical analysis of the received product (purity, cutting agents, chemical profile based on minor and major alkaloids, chemical class). The results show that digital information, such as concealment methods and shipping country, seems accurate, but the purity of the illicit drugs was found to differ from the information indicated on their respective listings. Moreover, chemical profiling highlighted links between cocaine sold online and specimens seized in Western Switzerland. This study highlights that (1) the forensic analysis of the received products allows the evaluation of the accuracy of digital data collected on the website, and (2) the information from digital and physical/chemical traces is complementary for evaluating the practices of the online selling of illicit drugs on cryptomarkets. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
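    The abstract states that the Evolution source code files were parsed with Python scripts based on regular expressions; the snippet below is a minimal, hypothetical illustration of that style of extraction (the HTML structure, field names and patterns are invented, not the actual Evolution markup).

      import re

      # Hypothetical listing fragment; the markup is illustrative only.
      listing_html = """
      <div class="listing">
        <span class="title">MDMA 85% pure - 10g</span>
        <span class="vendor">vendorX</span>
        <span class="ships_from">Switzerland</span>
        <span class="price">1.25 BTC</span>
      </div>
      """

      PATTERNS = {
          "title":      re.compile(r'<span class="title">(.*?)</span>', re.S),
          "vendor":     re.compile(r'<span class="vendor">(.*?)</span>', re.S),
          "ships_from": re.compile(r'<span class="ships_from">(.*?)</span>', re.S),
          "price":      re.compile(r'<span class="price">(.*?)</span>', re.S),
      }

      def extract_listing(html):
          """Pull structured fields out of one listing block with regular expressions."""
          record = {}
          for field, pattern in PATTERNS.items():
              match = pattern.search(html)
              record[field] = match.group(1).strip() if match else None
          return record

      print(extract_listing(listing_html))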

  20. Optimal Correlations in Many-Body Quantum Systems

    NASA Astrophysics Data System (ADS)

    Amico, L.; Rossini, D.; Hamma, A.; Korepin, V. E.

    2012-06-01

    Information and correlations in a quantum system are closely related through the process of measurement. We explore such relation in a many-body quantum setting, effectively bridging between quantum metrology and condensed matter physics. To this aim we adopt the information-theory view of correlations and study the amount of correlations after certain classes of positive-operator-valued measurements are locally performed. As many-body systems, we consider a one-dimensional array of interacting two-level systems (a spin chain) at zero temperature, where quantum effects are most pronounced. We demonstrate how the optimal strategy to extract the correlations depends on the quantum phase through a subtle interplay between local interactions and coherence.

  1. Smart Networked Elements in Support of ISHM

    NASA Technical Reports Server (NTRS)

    Oostdyk, Rebecca; Mata, Carlos; Perotti, Jose M.

    2008-01-01

    At the core of ISHM is the ability to extract information and knowledge from raw data. Conventional data acquisition systems sample and convert physical measurements to engineering units, which higher-level systems use to derive health and information about processes and systems. Although health management is essential at the top level, there are considerable advantages to implementing health-related functions at the sensor level. The distribution of processing to lower levels reduces bandwidth requirements, enhances data fusion, and improves the resolution for detection and isolation of failures in a system, subsystem, component, or process. The Smart Networked Element (SNE) has been developed to implement intelligent functions and algorithms at the sensor level in support of ISHM.

  2. Unfolding single RNA molecules: bridging the gap between equilibrium and non-equilibrium statistical thermodynamics.

    PubMed

    Bustamante, Carlos

    2005-11-01

    During the last 15 years, scientists have developed methods that permit the direct mechanical manipulation of individual molecules. Using this approach, they have begun to investigate the effect of force and torque in chemical and biochemical reactions. These studies span from the study of the mechanical properties of macromolecules, to the characterization of molecular motors, to the mechanical unfolding of individual proteins and RNA. Here I present a review of some of our most recent results using mechanical force to unfold individual molecules of RNA. These studies make it possible to follow in real time the trajectory of each molecule as it unfolds and to characterize the various intermediates of the reaction. Moreover, if the process takes place reversibly, it is possible to extract both kinetic and thermodynamic information from these experiments at the same time that we characterize the forces that maintain the three-dimensional structure of the molecule in solution. These studies bring us closer to the biological unfolding processes in the cell, as they simulate in vitro the mechanical unfolding of RNAs carried out in the cell by helicases. If the unfolding process occurs irreversibly, I show here that single-molecule experiments can still provide equilibrium, thermodynamic information from non-equilibrium data by using recently discovered fluctuation theorems. Such theorems represent a bridge between equilibrium and non-equilibrium statistical mechanics. In fact, although fluctuation theorems were first derived in 1997, the first experimental demonstration of their validity was obtained by mechanically unfolding a single molecule of RNA. It is perhaps a sign of the times that important physical results are these days used to extract information about biological systems and that biological systems are being used to test and confirm fundamental new laws in physics.
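    The abstract does not name the specific theorem, but the 1997 date points to the Jarzynski equality, the standard example of how an equilibrium free-energy difference can be recovered from repeated non-equilibrium work measurements (shown here only as an illustration of the idea):

      % Jarzynski equality: W is the work measured in each irreversible pull,
      % \Delta G the equilibrium free-energy difference of unfolding.
      \left\langle e^{-W/k_{\mathrm{B}}T} \right\rangle = e^{-\Delta G/k_{\mathrm{B}}T}
      \qquad\Longleftrightarrow\qquad
      \Delta G = -k_{\mathrm{B}}T \,\ln \left\langle e^{-W/k_{\mathrm{B}}T} \right\rangle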

  3. BMI and attitudes and beliefs about physical activity and nutrition of parents of adolescents with intellectual disabilities.

    PubMed

    George, V A; Shacter, S D; Johnson, P M

    2011-11-01

    The purpose of this study was: (1) to evaluate the beliefs, attitudes and behaviours associated with nutrition and physical activity of parents with adolescents with intellectual disabilities (ID); (2) to determine if these variables related to the body mass index (BMI) of the adolescents and the parents' BMI; and (3) to investigate if the parents' perception of their child's weight status was accurate. A survey was used to collect information on BMI and attitudes and beliefs about nutrition and physical activity from parents (n = 207) of adolescents with ID attending schools participating in the Best Buddies Program. Approximately 45% of the adolescents were overweight or obese and over two-thirds of the parents were either overweight or obese. There was a significant difference in child's BMI by parents' description, F(3,158) = 72.75, P < 0.001. Factor analysis on questions on physical activity and nutrition revealed three factors (Factor 1 - Family Healthy Habits, Factor 2 - Parental Role and Factor 3 - Parental Activity) extracting 63% of the variance. The BMI of the adolescents significantly correlated with Factors 2 and 3. Children categorised as having a lower BMI had parents who agreed significantly more (r = -0.22, P < 0.005) with questions about being role models. There was a significant correlation between BMI for both the parents and adolescents and frequency of fast foods purchased. Efforts need to be made to provide parents of adolescents with ID tailored information about how they can assist their child in managing their weight. This information should emphasise to parents the important part they play as role models and as providers for healthy choices for physical activity as well as nutrition. © 2011 The Authors. Journal of Intellectual Disability Research © 2011 Blackwell Publishing Ltd.

  4. Reconstruction of dynamic structures of experimental setups based on measurable experimental data only

    NASA Astrophysics Data System (ADS)

    Chen, Tian-Yu; Chen, Yang; Yang, Hu-Jiang; Xiao, Jing-Hua; Hu, Gang

    2018-03-01

    Nowadays, massive amounts of data have accumulated in a wide variety of fields, and one of the central issues in interdisciplinary research has become how to analyze existing data and extract as much useful information as possible from them. Often the output data of systems are measurable while the dynamic structures producing these data are hidden; thus, revealing system structures by analyzing available data, i.e., system reconstruction, has become one of the most important tasks of information extraction. In the past, most of the work in this respect was based on theoretical analyses and numerical verifications, and direct analyses of experimental data are very rare. In physical science, most analyses of experimental setups have been based on the first principles of physics laws, i.e., so-called top-down analyses. In this paper, we conducted an experiment of “Boer resonant instrument for forced vibration” (BRIFV) and inferred the dynamic structure of the experimental setup purely from the analysis of the measurable experimental data, i.e., by applying the bottom-up strategy. The dynamics of the experimental setup is strongly nonlinear and chaotic, and it is subject to inevitable noise. We propose high-order correlation computations to treat the nonlinear dynamics and two-time correlations to treat noise effects. By applying these approaches, we have successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed from the measured data reproduces the experimental results well over a wide range of parameters.
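    As a rough sketch of the kind of estimators such a bottom-up analysis rests on (not the authors' full reconstruction pipeline), the following Python fragment computes a two-time correlation and a third-order correlation from a single measured time series; the synthetic series and all names are illustrative.

      import numpy as np

      def two_time_correlation(x, max_lag):
          """Estimate C(tau) = <x(t) x(t+tau)> for tau = 0..max_lag from one time series."""
          x = np.asarray(x) - np.mean(x)
          n = len(x)
          return np.array([np.mean(x[:n - tau] * x[tau:]) for tau in range(max_lag + 1)])

      def third_order_correlation(x, max_lag):
          """Estimate <x(t)^2 x(t+tau)>, which is sensitive to quadratic terms in the dynamics."""
          x = np.asarray(x) - np.mean(x)
          n = len(x)
          return np.array([np.mean(x[:n - tau] ** 2 * x[tau:]) for tau in range(max_lag + 1)])

      # Example: a noisy, weakly nonlinear oscillation sampled at 100 Hz
      rng = np.random.default_rng(1)
      t = np.arange(0, 200, 0.01)
      x = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.sin(2 * np.pi * 0.5 * t) ** 2
      x += 0.2 * rng.standard_normal(len(t))
      print(two_time_correlation(x, 5)[:3], third_order_correlation(x, 5)[:3])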

  5. Simultaneous Spectral-Spatial Feature Selection and Extraction for Hyperspectral Images.

    PubMed

    Zhang, Lefei; Zhang, Qian; Du, Bo; Huang, Xin; Tang, Yuan Yan; Tao, Dacheng

    2018-01-01

    In hyperspectral remote sensing data mining, it is important to take into account both spectral and spatial information, such as the spectral signature, texture feature, and morphological property, to improve performance, e.g., the image classification accuracy. From a feature representation point of view, a natural approach to handle this situation is to concatenate the spectral and spatial features into a single but high-dimensional vector and then apply a certain dimension reduction technique directly on that concatenated vector before feeding it into the subsequent classifier. However, multiple features from various domains have different physical meanings and statistical properties, and thus such concatenation does not efficiently exploit the complementary properties among different features, which should help boost feature discriminability. Furthermore, it is also difficult to interpret the transformed results of the concatenated vector. Consequently, finding a physically meaningful consensus low-dimensional feature representation of the original multiple features is still a challenging task. In order to address these issues, we propose a novel feature learning framework, i.e., the simultaneous spectral-spatial feature selection and extraction algorithm, for hyperspectral image spectral-spatial feature representation and classification. Specifically, the proposed method learns a latent low-dimensional subspace by projecting the spectral-spatial feature into a common feature space, where the complementary information has been effectively exploited, and simultaneously, only the most significant original features have been transformed. Encouraging experimental results on three publicly available hyperspectral remote sensing datasets confirm that our proposed method is effective and efficient.
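    For context, the baseline that the abstract criticizes, i.e. concatenating spectral and spatial features and reducing the stacked vector with a single projection, can be sketched as follows (synthetic data and hypothetical dimensions; this is not the proposed simultaneous selection-and-extraction algorithm):

      import numpy as np
      from sklearn.decomposition import PCA

      # Hypothetical per-pixel features: 200 spectral bands and 40 spatial
      # (texture / morphological) descriptors for 10,000 pixels.
      rng = np.random.default_rng(0)
      spectral = rng.standard_normal((10_000, 200))
      spatial = rng.standard_normal((10_000, 40))

      # Baseline: stack heterogeneous features and reduce them with one projection,
      # which ignores their different physical meanings and statistics.
      stacked = np.hstack([spectral, spatial])
      reduced = PCA(n_components=30).fit_transform(stacked)
      print(reduced.shape)  # (10000, 30), fed to the downstream classifier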

  6. Information extraction from multi-institutional radiology reports.

    PubMed

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We also evaluated the generalizability of our approach across different organizations by training and testing our system on data from different organizations. Our results show the efficacy of our machine learning approach in extracting the information model's elements (10-fold cross-validation average performance: precision: 87%, recall: 84%, F1 score: 85%) and its superiority and generalizability compared to the common non-machine learning approach (p-value<0.05). Our machine learning information extraction approach provides an effective automatic method to annotate and extract clinically significant information from a large collection of free text radiology reports. This information extraction system can help clinicians better understand the radiology reports and prioritize their review process. In addition, the extracted information can be used by researchers to link radiology reports to information from other data sources such as electronic health records and the patient's genome. Extracted information also can facilitate disease surveillance, real-time clinical decision support for the radiologist, and content-based image retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.
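    As an illustration only, a discriminative sequence classifier for report annotation can be set up with an off-the-shelf linear-chain CRF such as sklearn-crfsuite; the feature set, labels and example sentence below are invented and far simpler than the information model described in the paper.

      import sklearn_crfsuite  # off-the-shelf linear-chain CRF

      def token_features(tokens, i):
          """Tiny illustrative feature set for one token of a report sentence."""
          word = tokens[i]
          return {
              "lower": word.lower(),
              "is_digit": word.isdigit(),
              "suffix3": word[-3:].lower(),
              "prev": tokens[i - 1].lower() if i > 0 else "<BOS>",
              "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
          }

      # Hypothetical annotated sentence with BIO labels for an "observation" entity.
      sentences = [["No", "acute", "intracranial", "hemorrhage", "."]]
      labels = [["O", "B-OBS", "I-OBS", "I-OBS", "O"]]

      X = [[token_features(s, i) for i in range(len(s))] for s in sentences]
      crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
      crf.fit(X, labels)
      print(crf.predict(X))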

  7. Measurement of global functional performance in patients with rheumatoid arthritis using rheumatology function tests

    PubMed Central

    Escalante, Agustín; Haas, Roy W; del Rincón, Inmaculada

    2004-01-01

    Outcome assessment in patients with rheumatoid arthritis (RA) includes measurement of physical function. We derived a scale to quantify global physical function in RA, using three performance-based rheumatology function tests (RFTs). We measured grip strength, walking velocity, and shirt button speed in consecutive RA patients attending scheduled appointments at six rheumatology clinics, repeating these measurements after a median interval of 1 year. We extracted the underlying latent variable using principal component factor analysis. We used the Bayesian information criterion to assess the global physical function scale's cross-sectional fit to criterion standards. The criteria were joint tenderness, swelling, and deformity, pain, physical disability, current work status, and vital status at 6 years after study enrolment. We computed Guyatt's responsiveness statistic for improvement according to the American College of Rheumatology (ACR) definition. Baseline functional performance data were available for 777 patients, and follow-up data were available for 681. Mean ± standard deviation for each RFT at baseline were: grip strength, 14 ± 10 kg; walking velocity, 194 ± 82 ft/min; and shirt button speed, 7.1 ± 3.8 buttons/min. Grip strength and walking velocity departed significantly from normality. The three RFTs loaded strongly on a single factor that explained ≥70% of their combined variance. We rescaled the factor to vary from 0 to 100. Its mean ± standard deviation was 41 ± 20, with a normal distribution. The new global scale had a stronger fit than the primary RFT to most of the criterion standards. It correlated more strongly with physical disability at follow-up and was more responsive to improvement defined according to the ACR20 and ACR50 definitions. We conclude that a performance-based physical function scale extracted from three RFTs has acceptable distributional and measurement properties and is responsive to clinically meaningful change. It provides a parsimonious scale to measure global physical function in RA. PMID:15225367
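    A minimal sketch of the scale construction, using plain PCA as a stand-in for the principal component factor analysis reported in the paper, might look like this (the patient values are hypothetical, and the 0-100 rescaling is a simple min-max mapping rather than the authors' exact procedure):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Hypothetical measurements per patient:
      # [grip strength (kg), walking velocity (ft/min), button speed (buttons/min)]
      rft = np.array([
          [14.0, 194.0, 7.1],
          [22.0, 250.0, 9.5],
          [8.0, 120.0, 4.0],
          [30.0, 310.0, 11.0],
      ])

      # First principal component as the latent "global physical function" variable.
      # The component's sign is arbitrary and may need to be oriented so that
      # higher scores mean better function before rescaling to 0-100.
      z = StandardScaler().fit_transform(rft)
      score = PCA(n_components=1).fit_transform(z).ravel()
      scaled = 100 * (score - score.min()) / (score.max() - score.min())
      print(np.round(scaled, 1))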

  8. Review of Extracting Information From the Social Web for Health Personalization

    PubMed Central

    Karlsen, Randi; Bonander, Jason

    2011-01-01

    In recent years the Web has come into its own as a social platform where health consumers are actively creating and consuming Web content. Moreover, as the Web matures, consumers are gaining access to personalized applications adapted to their health needs and interests. The creation of personalized Web applications relies on extracted information about the users and the content to personalize. The Social Web itself provides many sources of information that can be used to extract information for personalization apart from traditional Web forms and questionnaires. This paper provides a review of different approaches for extracting information from the Social Web for health personalization. We reviewed research literature across different fields addressing the disclosure of health information in the Social Web, techniques to extract that information, and examples of personalized health applications. In addition, the paper includes a discussion of technical and socioethical challenges related to the extraction of information for health personalization. PMID:21278049

  9. Selectivity of physiotherapist programs in the United States does not differ by institutional funding source or research activity level.

    PubMed

    Riley, Sean P; Covington, Kyle; Landry, Michel D; McCallum, Christine; Engelhard, Chalee; Cook, Chad E

    2016-01-01

    This study aimed to compare selectivity characteristics among institution characteristics to determine differences by institutional funding source (public vs. private) or research activity level (research vs. non-research). This study included information provided by the Commission on Accreditation in Physical Therapy Education (CAPTE) and the Federation of State Boards of Physical Therapy. Data were extracted from all students who graduated in 2011 from accredited physical therapy programs in the United States. The public and private designations of the institutions were extracted directly from the classifications from the 'CAPTE annual accreditation report,' and high and low research activity was determined based on Carnegie classifications. The institutions were classified into four groups: public/research intensive, public/non-research intensive, private/research intensive, and private/non-research intensive. Descriptive and comparison analyses with post hoc testing were performed to determine whether there were statistically significant differences among the four groups. Although there were statistically significant baseline grade point average differences among the four categorized groups, there were no significant differences in licensure pass rates or for any of the selectivity variables of interest. Selectivity characteristics did not differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). This suggests that the concerns about reduced selectivity among physiotherapy programs, specifically the types that are experiencing the largest proliferation, appear less warranted.

  10. Research of information classification and strategy intelligence extract algorithm based on military strategy hall

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Li, Dehua; Yang, Jie

    2007-12-01

    Constructing a virtual international strategy environment needs many kinds of information, such as economy, politics, military affairs, diplomacy, culture, science, etc. Therefore, it is very important to build a highly efficient system for automatic information extraction, classification, recombination and analysis as the foundation and component of the military strategy hall. This paper first uses an improved boosting algorithm to classify the obtained initial information, and then uses a strategy intelligence extraction algorithm to extract strategy intelligence from the initial information to help strategists analyze it.

  11. Biologically-inspired data decorrelation for hyper-spectral imaging

    NASA Astrophysics Data System (ADS)

    Picon, Artzai; Ghita, Ovidiu; Rodriguez-Vaamonde, Sergio; Iriondo, Pedro Ma; Whelan, Paul F.

    2011-12-01

    Hyper-spectral data allows the construction of more robust statistical models to sample the material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of the hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods require complex and subjective training procedures, and in addition the compressed spectral information is not directly related to the physical (spectral) characteristics associated with the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates human vision. The proposed data decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.

  12. On the equivalence of the RTI and SVM approaches to time correlated analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, S.; Favalli, A.; Henzlova, D.

    2014-11-21

    Recently two papers on how to perform passive neutron auto-correlation analysis on time gated histograms formed from pulse train data, generically called time correlation analysis (TCA), have appeared in this journal [1,2]. For those of us working in international nuclear safeguards these treatments are of particular interest because passive neutron multiplicity counting is a widely deployed technique for the quantification of plutonium. The purpose of this letter is to show that the skewness-variance-mean (SVM) approach developed in [1] is equivalent in terms of assay capability to the random trigger interval (RTI) analysis laid out in [2]. Mathematically we could also use other numerical ways to extract the time correlated information from the histogram data, including for example what we might call the mean, mean square, and mean cube approach. The important feature however, from the perspective of real world applications, is that the correlated information extracted is the same, and subsequently gets interpreted in the same way based on the same underlying physics model.
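    As an illustration of the moment-based bookkeeping the two approaches share, the fragment below computes the mean, variance and skewness (third central moment) of a time-gated multiplicity histogram; the histogram values are invented and this is not the full RTI or SVM assay chain.

      import numpy as np

      def histogram_moments(counts):
          """Mean, variance and third central moment of a time-gated multiplicity histogram.

          counts[n] = number of gates in which exactly n pulses were recorded.
          """
          counts = np.asarray(counts, dtype=float)
          n = np.arange(len(counts))
          p = counts / counts.sum()            # empirical gate-content distribution
          mean = np.sum(n * p)
          var = np.sum((n - mean) ** 2 * p)
          skew = np.sum((n - mean) ** 3 * p)
          return mean, var, skew

      # Hypothetical histogram for gate contents of 0..5 pulses
      print(histogram_moments([1200, 800, 350, 90, 15, 2]))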

  13. Language extraction from zinc sulfide

    NASA Astrophysics Data System (ADS)

    Varn, Dowman Parks

    2001-09-01

    Recent advances in the analysis of one-dimensional temporal and spatial series allow for detailed characterization of disorder and computation in physical systems. One such system that has defied theoretical understanding since its discovery in 1912 is polytypism. Polytypes are layered compounds, exhibiting crystallinity in two dimensions, yet having complicated stacking sequences in the third direction. They can show both ordered and disordered sequences, sometimes each in the same specimen. We demonstrate a method for extracting two-layer correlation information from ZnS diffraction patterns and employ a novel technique for epsilon-machine reconstruction. We solve a long-standing problem---that of determining structural information for disordered materials from their diffraction patterns---for this special class of disorder. Our solution offers the most complete possible statistical description of the disorder. Furthermore, from our reconstructed epsilon-machines we find the effective range of the interlayer interaction in these materials, as well as the configurational energy of both ordered and disordered specimens. Finally, we can determine the 'language' (in terms of the Chomsky Hierarchy) these small rocks speak, and we find that regular languages are sufficient to describe them.

  14. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    NASA Astrophysics Data System (ADS)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

    With the increase of resolution, remote sensing images are characterized by increased information load, increased noise, and more complex feature geometry and texture information, which makes the extraction of building information more difficult. To solve this problem, this paper designs a high-resolution remote sensing image building extraction method based on a Markov model. The method introduces Contourlet-domain map clustering and a Markov model, captures and enhances the contour and texture information of high-resolution remote sensing image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the building area. Through multi-scale segmentation and extraction of image features, fine extraction from the building area down to individual buildings is realized. Experiments show that this method can suppress the noise of high-resolution remote sensing images, reduce the interference of non-target ground texture information, and remove shadow, vegetation and other pseudo-building information; compared with traditional pixel-level image information extraction, it achieves better performance in building extraction precision, accuracy and completeness.

  15. LiDAR Vegetation Investigation and Signature Analysis System (LVISA)

    NASA Astrophysics Data System (ADS)

    Höfle, Bernhard; Koenig, Kristina; Griesbaum, Luisa; Kiefer, Andreas; Hämmerle, Martin; Eitel, Jan; Koma, Zsófia

    2015-04-01

    Our physical environment undergoes constant changes in space and time with strongly varying triggers, frequencies, and magnitudes. Monitoring these environmental changes is crucial to improve our scientific understanding of complex human-environmental interactions and helps us to respond to environmental change by adaptation or mitigation. The three-dimensional (3D) description of the Earth surface features and the detailed monitoring of surface processes using 3D spatial data have gained increasing attention within the last decades, such as in climate change research (e.g., glacier retreat), carbon sequestration (e.g., forest biomass monitoring), precision agriculture and natural hazard management. In all those areas, 3D data have helped to improve our process understanding by allowing quantification of the structural properties of earth surface features and their changes over time. This advancement has been fostered by technological developments and increased availability of 3D sensing systems. In particular, LiDAR (light detection and ranging) technology, also referred to as laser scanning, has made significant progress and has evolved into an operational tool in environmental research and geosciences. The main result of LiDAR measurements is a highly spatially resolved 3D point cloud. Each point within the LiDAR point cloud has an XYZ coordinate associated with it and often additional information such as the strength of the returned backscatter. The point cloud provided by LiDAR contains rich geospatial, structural, and potentially biochemical information about the surveyed objects. To deal with the inherently unorganized datasets and the large data volume (frequently millions of XYZ coordinates) of LiDAR datasets, a multitude of algorithms for automatic 3D object detection (e.g., of single trees) and physical surface description (e.g., biomass) have been developed. However, so far the exchange of datasets and approaches (i.e., extraction algorithms) among LiDAR users lags behind. We propose a novel concept, the LiDAR Vegetation Investigation and Signature Analysis System (LVISA), which shall enhance sharing of i) reference datasets of single vegetation objects with rich reference data (e.g., plant species, basic plant morphometric information) and ii) approaches for information extraction (e.g., single tree detection, tree species classification based on waveform LiDAR features). We will build an extensive LiDAR data repository for supporting the development and benchmarking of LiDAR-based object information extraction. The LiDAR Vegetation Investigation and Signature Analysis System (LVISA) uses international web service standards (Open Geospatial Consortium, OGC) for geospatial data access and also analysis (e.g., OGC Web Processing Services). This will allow the research community to identify plant-object-specific vegetation features from LiDAR data, while accounting for differences in LiDAR systems (e.g., beam divergence), settings (e.g., point spacing), and calibration techniques. It is the goal of LVISA to develop generic 3D information extraction approaches, which can be seamlessly transferred to other datasets, timestamps and also extraction tasks. The current prototype of LVISA can be visited and tested online via http://uni-heidelberg.de/lvisa. Video tutorials provide a quick overview and entry into the functionality of LVISA.
We will present the current advances of LVISA and we will highlight future research and extension of LVISA, such as integrating low-cost LiDAR data and datasets acquired by highly temporal scanning of vegetation (e.g., continuous measurements). Everybody is invited to join the LVISA development and share datasets and analysis approaches in an interoperable way via the web-based LVISA geoportal.

  16. A survey of social media data analysis for physical activity surveillance.

    PubMed

    Liu, Sam; Young, Sean D

    2018-07-01

    Social media data can provide valuable information regarding people's behaviors and health outcomes. Previous studies have shown that social media data can be extracted to monitor and predict infectious disease outbreaks. These same approaches can be applied to other fields including physical activity research and forensic science. Social media data have the potential to provide real-time monitoring and prediction of physical activity level in a given region. This tool can be valuable to public health organizations as it can overcome the time lag in the reporting of physical activity epidemiology data faced by traditional research methods (e.g. surveys, observational studies). As a result, this tool could help public health organizations better mobilize and target physical activity interventions. The first part of this paper aims to describe current approaches (e.g. topic modeling, sentiment analysis and social network analysis) that could be used to analyze social media data to provide real-time monitoring of physical activity level. The second aim of this paper was to discuss ways to apply social media analysis to other fields such as forensic sciences and provide recommendations to further social media research. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  17. Measuring presenteeism: which questionnaire to use in physical activity research?

    PubMed

    Brown, Helen Elizabeth; Burton, Nicola; Gilson, Nicholas David; Brown, Wendy

    2014-02-01

    An emerging area of interest in workplace health is presenteeism; the measurable extent to which physical or psychosocial symptoms, conditions and disease adversely affect the work productivity of those who choose to remain at work. Given established links between presenteeism and health, and health and physical activity, presenteeism could be an important outcome in workplace physical activity research. This study provides a narrative review of questionnaires for use in such research. Eight self-report measures of presenteeism were identified. Information regarding development, constructs measured and psychometric properties was extracted from relevant articles. Questionnaires were largely self-administered, had 4-44 items, and recall periods ranging from 1 week to 1 year. Items were identified as assessing work performance, physical tolerance, psychological well-being and social or role functioning. Samples used to test questionnaires were predominantly American male employees, with an age range of 30-59 years. All instruments had undergone psychometric assessment, most commonly discriminant and construct validity. Based on instrument characteristics, the range of conceptual foci covered and acceptable measurement properties, the Health and Work Questionnaire, Work Ability Index, and Work Limitations Questionnaire are suggested as most suitable for further exploring the relationship between physical activity and presenteeism.

  18. Experimental Rectification of Entropy Production by Maxwell's Demon in a Quantum System

    NASA Astrophysics Data System (ADS)

    Camati, Patrice A.; Peterson, John P. S.; Batalhão, Tiago B.; Micadei, Kaonan; Souza, Alexandre M.; Sarthour, Roberto S.; Oliveira, Ivan S.; Serra, Roberto M.

    2016-12-01

    Maxwell's demon explores the role of information in physical processes. Employing information about microscopic degrees of freedom, this "intelligent observer" is capable of compensating entropy production (or extracting work), apparently challenging the second law of thermodynamics. In a modern standpoint, it is regarded as a feedback control mechanism and the limits of thermodynamics are recast incorporating information-to-energy conversion. We derive a trade-off relation between information-theoretic quantities empowering the design of an efficient Maxwell's demon in a quantum system. The demon is experimentally implemented as a spin-1 /2 quantum memory that acquires information, and employs it to control the dynamics of another spin-1 /2 system, through a natural interaction. Noise and imperfections in this protocol are investigated by the assessment of its effectiveness. This realization provides experimental evidence that the irreversibility in a nonequilibrium dynamics can be mitigated by assessing microscopic information and applying a feed-forward strategy at the quantum scale.
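    The specific trade-off relation derived in the paper is not reproduced in the abstract; as a generic illustration of how acquired information enters the entropy balance under feedback, the Sagawa-Ueda form of the second law reads

      % Illustrative only: \Sigma is the entropy production, I the mutual information
      % the demon (memory) acquires about the system, F the free energy.
      \langle \Sigma \rangle \ge -\,k_{\mathrm{B}}\, I ,
      \qquad
      \langle W_{\mathrm{ext}} \rangle \le -\,\Delta F + k_{\mathrm{B}} T\, I .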

  19. Experimental Rectification of Entropy Production by Maxwell's Demon in a Quantum System.

    PubMed

    Camati, Patrice A; Peterson, John P S; Batalhão, Tiago B; Micadei, Kaonan; Souza, Alexandre M; Sarthour, Roberto S; Oliveira, Ivan S; Serra, Roberto M

    2016-12-09

    Maxwell's demon explores the role of information in physical processes. Employing information about microscopic degrees of freedom, this "intelligent observer" is capable of compensating entropy production (or extracting work), apparently challenging the second law of thermodynamics. In a modern standpoint, it is regarded as a feedback control mechanism and the limits of thermodynamics are recast incorporating information-to-energy conversion. We derive a trade-off relation between information-theoretic quantities empowering the design of an efficient Maxwell's demon in a quantum system. The demon is experimentally implemented as a spin-1/2 quantum memory that acquires information, and employs it to control the dynamics of another spin-1/2 system, through a natural interaction. Noise and imperfections in this protocol are investigated by the assessment of its effectiveness. This realization provides experimental evidence that the irreversibility in a nonequilibrium dynamics can be mitigated by assessing microscopic information and applying a feed-forward strategy at the quantum scale.

  20. Physical and Biological Modification of Polycaprolactone Electrospun Nanofiber by Panax Ginseng Extract for Bone Tissue Engineering Application.

    PubMed

    Pajoumshariati, Seyedramin; Yavari, Seyedeh Kimia; Shokrgozar, Mohammad Ali

    2016-05-01

    Medicinal plants as a therapeutic agent with osteogenic properties can enhance the fracture-healing process. In this study, the osteo-inductive potential of Asian Panax ginseng root extract within electrospun polycaprolactone (PCL) based nanofibers has been investigated. Scanning electron microscopy images revealed that all nanofibers were highly porous and beadless, with average diameter ranging from 250 to 650 nm. The incorporation of ginseng extract improved the physical characteristics (i.e., hydrophilicity) of PCL nanofibers, as well as the mechanical properties. Although ginseng extract increased the degradation rate of pure PCL nanofibers, the porous structure and morphology of the fibers did not change significantly after 42 days. It was found that nanofibrous scaffolds containing ginseng extract supported higher proliferation (up to ~1.5 fold) compared to pristine PCL. The qRT-PCR analysis demonstrated that the addition of ginseng extract into PCL nanofibers induced significant expression of osteogenic genes (Osteocalcin, Runx-2 and Col-1) in MSCs in a concentration-dependent manner. Moreover, higher calcium content, alkaline phosphatase activity and higher mineralization of MSCs were observed compared to pristine PCL fibers. Our results indicate the promising potential of ginseng extract as an additive to enhance the osteo-inductivity, mechanical and physical properties of PCL nanofibers for bone tissue engineering applications.

  1. Distance biases in the estimation of the physical properties of Hi-GAL compact sources - I. Clump properties and the identification of high-mass star-forming candidates

    NASA Astrophysics Data System (ADS)

    Baldeschi, Adriano; Elia, D.; Molinari, S.; Pezzuto, S.; Schisano, E.; Gatti, M.; Serra, A.; Merello, M.; Benedettini, M.; Di Giorgio, A. M.; Liu, J. S.

    2017-04-01

    The degradation of spatial resolution in star-forming regions, observed at large distances (d ≳ 1 kpc) with Herschel, can lead to estimates of the physical parameters of the detected compact sources (clumps) that do not necessarily mirror the properties of the original population of cores. This paper aims at quantifying the bias introduced in the estimation of these parameters by the distance effect. To do so, we consider Herschel maps of nearby star-forming regions taken from the Herschel Gould Belt survey and simulate the effect of increased distance to understand what amount of information is lost when a distant star-forming region is observed with Herschel resolution. In the maps displaced to different distances we extract compact sources and derive their physical parameters as if they were original Herschel infrared Galactic Plane Survey maps of the extracted source samples. In this way, we are able to discuss how the main physical properties change with distance. In particular, we discuss the ability of clumps to form massive stars: we estimate the fraction of distant sources that are classified as high-mass star-forming objects from their position in the mass versus radius diagram but are only 'false positives'. We also give a threshold for high-mass star formation, M > 1282 (r/pc)^{1.42} M_⊙. In conclusion, this paper provides the astronomer dealing with Herschel maps of distant star-forming regions with a set of prescriptions to partially recover the character of the core population in unresolved clumps.

  2. Safety and efficacy of physical restraints for the elderly. Review of the evidence.

    PubMed Central

    Frank, C.; Hodgetts, G.; Puxty, J.

    1996-01-01

    OBJECTIVE: To critically review evidence on the safety and efficacy of physical restraints for the elderly and to provide family physicians with guidelines for rational use of restraints. DATA SOURCES: Articles cited on MEDLINE (from 1989 to November 1994) and Cinahl (from 1982 to 1994) under the MeSH heading "physical restraints." STUDY SELECTION: Articles that specifically dealt with the safety and efficacy of restraints and current patterns of use, including prevalence, risk factors, and indications, were selected. Eight original research articles were identified and critically appraised. DATA EXTRACTION: Data extracted concerned the negative sequelae of restraints and the association between restraint use and fall and injury rates. General data about current patterns of restraint use were related to safety and efficacy findings. DATA SYNTHESIS: No randomized, controlled trials of physical restraint use were found in the literature. A variety of study design, including retrospective chart review, prospective cohort studies, and case reports, found little evidence that restraints prevent injury. Some evidence suggested that restraints might increase risk of falls and injury. Restraint-reduction programs have not been shown to increase fall or injury rates. Numerous case reports document injuries or deaths resulting from restraint use or misuse. CONCLUSIONS: Although current evidence does not support the belief that restraints prevent falls and injuries and questions their safety, further prospective and controlled studies are needed to clarify these issues. Information from review and research articles was synthesized in this paper to produce guidelines for the safe and rational use of restraints. PMID:8969858

  3. Native Cellulose: Structure, Characterization and Thermal Properties

    PubMed Central

    Poletto, Matheus; Ornaghi Júnior, Heitor L.; Zattera, Ademir J.

    2014-01-01

    In this work, the role of cellulose crystallinity, the influence of extractive content on lignocellulosic fiber degradation, and the correlation between chemical composition and the physical properties of ten types of natural fibers were investigated by FTIR spectroscopy, X-ray diffraction and thermogravimetry techniques. The results showed that higher extractive contents associated with lower crystallinity and lower cellulose crystallite size can accelerate the degradation process and reduce the thermal stability of the lignocellulosic fibers studied. On the other hand, the thermal decomposition of natural fibers is shifted to higher temperatures with increasing cellulose crystallinity and crystallite size. These results indicate that the cellulose crystallite size affects the thermal degradation temperature of natural fibers. This study showed that, through the methods used, prior information about the structure and properties of lignocellulosic fibers can be obtained before use in composite formulations. PMID:28788179

  4. Channel-Based Key Generation for Encrypted Body-Worn Wireless Sensor Networks.

    PubMed

    Van Torre, Patrick

    2016-09-08

    Body-worn sensor networks are important for rescue-workers, medical and many other applications. Sensitive data are often transmitted over such a network, motivating the need for encryption. Body-worn sensor networks are deployed in conditions where the wireless communication channel varies dramatically due to fading and shadowing, which is considered a disadvantage for communication. Interestingly, these channel variations can be employed to extract a common encryption key at both sides of the link. Legitimate users share a unique physical channel and the variations thereof provide data series on both sides of the link, with highly correlated values. An eavesdropper, however, does not share this physical channel and cannot extract the same information when intercepting the signals. This paper documents a practical wearable communication system implementing channel-based key generation, including an implementation and a measurement campaign comprising indoor as well as outdoor measurements. The results provide insight into the performance of channel-based key generation in realistic practical conditions. Employing a process known as key reconciliation, error free keys are generated in all tested scenarios. The key-generation system is computationally simple and therefore compatible with the low-power micro controllers and low-data rate transmissions commonly used in wireless sensor networks.
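    A toy version of the quantization step (not the paper's exact scheme, and omitting the reconciliation and privacy-amplification stages) can be sketched as follows; the guard-band threshold, RSSI trace and function name are assumptions.

      import numpy as np

      def quantize_channel(rssi, guard=2.0):
          """Turn a series of channel measurements (e.g. RSSI in dB) into key bits.

          Values above (median + guard) map to 1, values below (median - guard) map to 0,
          and samples inside the guard band are discarded to reduce bit mismatches.
          Both link ends run the same procedure on their own, highly correlated traces;
          residual disagreements are removed afterwards by key reconciliation (not shown).
          """
          rssi = np.asarray(rssi, dtype=float)
          med = np.median(rssi)
          bits, kept = [], []
          for i, v in enumerate(rssi):
              if v > med + guard:
                  bits.append(1)
                  kept.append(i)
              elif v < med - guard:
                  bits.append(0)
                  kept.append(i)
          return np.array(bits), np.array(kept)   # kept indices can be exchanged publicly

      rng = np.random.default_rng(3)
      fading = rng.normal(-60.0, 6.0, size=64)    # hypothetical shared-channel RSSI trace
      print(quantize_channel(fading)[0])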

  5. Nootropic effect of meadowsweet (Filipendula vulgaris) extracts.

    PubMed

    Shilova, I V; Suslov, N I

    2015-03-01

    The effects of the extracts of the aboveground parts of Filipendula vulgaris Moench on the behavior and memory of mice after hypoxic injury and their physical performance in the open-field test were studied using the models of hypoxia in a sealed volume, conditioned passive avoidance response (CPAR), and forced swimming with a load. The extracts improved animal resistance to hypoxia, normalized orientation and exploration activities, promoted CPAR retention after hypoxic injury, and increased physical performance. Aqueous extract of meadowsweet had the most pronounced effect that corresponded to the effect of the reference drug piracetam. These effects were probably caused by modulation of hippocampal activity.

  6. A rapid extraction of landslide disaster information research based on GF-1 image

    NASA Astrophysics Data System (ADS)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity. They bring great harm to people's lives and have drawn close attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery can effectively improve the accuracy of information extraction with its rich texture and geometry information. Therefore, it is feasible to extract the information of earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan County as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation of scale parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, texture, geometric and landform features, extraction rules are established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, providing important technical support for post-disaster emergency investigation and disaster assessment.

  7. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    ERIC Educational Resources Information Center

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  8. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  9. Global 21 cm Signal Extraction from Foreground and Instrumental Effects. I. Pattern Recognition Framework for Separation Using Training Sets

    NASA Astrophysics Data System (ADS)

    Tauscher, Keith; Rapetti, David; Burns, Jack O.; Switzer, Eric

    2018-02-01

    The sky-averaged (global) highly redshifted 21 cm spectrum from neutral hydrogen is expected to appear in the VHF range of ∼20–200 MHz and its spectral shape and strength are determined by the heating properties of the first stars and black holes, by the nature and duration of reionization, and by the presence or absence of exotic physics. Measurements of the global signal would therefore provide us with a wealth of astrophysical and cosmological knowledge. However, the signal has not yet been detected because it must be seen through strong foregrounds weighted by a large beam, instrumental calibration errors, and ionospheric, ground, and radio-frequency-interference effects, which we collectively refer to as “systematics.” Here, we present a signal extraction method for global signal experiments which uses Singular Value Decomposition of “training sets” to produce systematics basis functions specifically suited to each observation. Instead of requiring precise absolute knowledge of the systematics, our method effectively requires precise knowledge of how the systematics can vary. After calculating eigenmodes for the signal and systematics, we perform a weighted least square fit of the corresponding coefficients and select the number of modes to include by minimizing an information criterion. We compare the performance of the signal extraction when minimizing various information criteria and find that minimizing the Deviance Information Criterion most consistently yields unbiased fits. The methods used here are built into our widely applicable, publicly available Python package, pylinex, which analytically calculates constraints on signals and systematics from given data, errors, and training sets.
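    A compact sketch of the training-set idea, using plain SVD, weighted least squares, and the Bayesian Information Criterion as a stand-in for the Deviance Information Criterion used in the paper (all shapes and the toy signal/systematics models below are invented), might look like this; pylinex itself performs the analytic version of this kind of fit.

      import numpy as np

      def fit_with_ic(data, noise_std, signal_training, systematics_training, max_modes=10):
          """SVD bases from training sets + weighted least squares; the number of modes
          is chosen by minimizing an information criterion (BIC here as a stand-in)."""
          sig_modes = np.linalg.svd(signal_training, full_matrices=False)[2]
          sys_modes = np.linalg.svd(systematics_training, full_matrices=False)[2]
          w = 1.0 / noise_std ** 2
          best = None
          for n_sig in range(1, max_modes + 1):
              for n_sys in range(1, max_modes + 1):
                  A = np.vstack([sig_modes[:n_sig], sys_modes[:n_sys]]).T
                  coeffs, *_ = np.linalg.lstsq(A * np.sqrt(w)[:, None],
                                               data * np.sqrt(w), rcond=None)
                  chi2 = np.sum(w * (data - A @ coeffs) ** 2)
                  bic = chi2 + (n_sig + n_sys) * np.log(len(data))
                  if best is None or bic < best[0]:
                      best = (bic, n_sig, n_sys, coeffs)
          return best

      # Toy example: Gaussian absorption "signals" plus smooth power-law "systematics"
      rng = np.random.default_rng(0)
      freq = np.linspace(40.0, 120.0, 81)
      sig_train = np.array([-0.2 * np.exp(-0.5 * ((freq - c) / 10.0) ** 2)
                            for c in rng.uniform(60.0, 100.0, 50)])
      sys_train = np.array([a + b * (freq / 80.0) ** -2.5
                            for a, b in rng.uniform(0.5, 2.0, (50, 2))])
      data = sig_train[0] + sys_train[0] + 0.01 * rng.standard_normal(freq.size)
      print(fit_with_ic(data, 0.01 * np.ones_like(freq), sig_train, sys_train)[:3])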

  10. A radiation scalar for numerical relativity.

    PubMed

    Beetle, Christopher; Burko, Lior M

    2002-12-30

    This Letter describes a scalar curvature invariant for general relativity with a certain, distinctive feature. While many such invariants exist, this one vanishes in regions of space-time which can be said unambiguously to contain no gravitational radiation. In more general regions which incontrovertibly support nontrivial radiation fields, it can be used to extract local, coordinate-independent information partially characterizing that radiation. While a clear, physical interpretation is possible only in such radiation zones, a simple algorithm can be given to extend the definition smoothly to generic regions of space-time.

  11. Electroweak radiative corrections to the top quark decay

    NASA Astrophysics Data System (ADS)

    Kuruma, Toshiyuki

    1993-12-01

    The top quark, once produced, should be an important window to the electroweak symmetry breaking sector. We compute electroweak radiative corrections to the decay process t→b+W + in order to extract information on the Higgs sector and to fix the background in searches for a possible new physics contribution. The large Yukawa coupling of the top quark induces a new form factor through vertex corrections and causes discrepancy from the tree-level longitudinal W-boson production fraction, but the effect is of order 1% or less for m H<1 TeV.

  12. Lattice field theory applications in high energy physics

    NASA Astrophysics Data System (ADS)

    Gottlieb, Steven

    2016-10-01

    Lattice gauge theory was formulated by Kenneth Wilson in 1974. In the ensuing decades, improvements in actions, algorithms, and computers have enabled tremendous progress in QCD, to the point where lattice calculations can yield sub-percent level precision for some quantities. Beyond QCD, lattice methods are being used to explore possible beyond the standard model (BSM) theories of dynamical symmetry breaking and supersymmetry. We survey progress in extracting information about the parameters of the standard model by confronting lattice calculations with experimental results and searching for evidence of BSM effects.

  13. Hydraulophones: Acoustic musical instruments and expressive user interfaces

    NASA Astrophysics Data System (ADS)

    Janzen, Ryan E.

    Fluid flow creates an expansive range of acoustic possibilities, particularly in the case of water, which has unique turbulence and vortex shedding properties as compared with the air of ordinary wind instruments. Sound from water flow is explained with reference to a new class of musical instruments, hydraulophones, in which oscillation originates directly from matter in its liquid state. Several hydraulophones which were realized in practical form are described. A unique user-interface consisting of a row of water jets is presented, in terms of its expressiveness, tactility, responsiveness to derivatives and integrals of displacement, and in terms of the direct physical interaction between a user and the physical process of sound production. Signal processing algorithms are introduced, which extract further information from turbulent water flow, for industrial applications as well as musical applications.

  14. On Organization of Information: Approach and Early Work

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Jorgensen, Charles C.; Iverson, David; Shafto, Michael; Olson, Leonard

    2009-01-01

    In this report we describe an approach for organizing information for presentation and display. The approach stems from the observation that there is a stepwise progression in the way signals (from the environment and the system under consideration) are extracted and transformed into data, and then analyzed and abstracted to form representations (e.g., indications and icons) on the user interface. In physical environments such as aerospace and process control, many system components and their corresponding data and information are interrelated (e.g., an increase in a chamber's temperature results in an increase in its pressure). These interrelationships, when presented clearly, allow users to understand linkages among system components and how they may affect one another. Organization of these interrelationships by means of an orderly structure provides for the so-called "big picture" that pilots, astronauts, and operators strive for.

  15. From Principal Component to Direct Coupling Analysis of Coevolution in Proteins: Low-Eigenvalue Modes are Needed for Structure Prediction

    PubMed Central

    Cocco, Simona; Monasson, Remi; Weigt, Martin

    2013-01-01

    Various approaches have explored the covariation of residues in multiple-sequence alignments of homologous proteins to extract functional and structural information. Among those are principal component analysis (PCA), which identifies the most correlated groups of residues, and direct coupling analysis (DCA), a global inference method based on the maximum entropy principle, which aims at predicting residue-residue contacts. In this paper, inspired by the statistical physics of disordered systems, we introduce the Hopfield-Potts model to naturally interpolate between these two approaches. The Hopfield-Potts model allows us to identify relevant ‘patterns’ of residues from the knowledge of the eigenmodes and eigenvalues of the residue-residue correlation matrix. We show how the computation of such statistical patterns makes it possible to accurately predict residue-residue contacts with a much smaller number of parameters than DCA. This dimensional reduction allows us to avoid overfitting and to extract contact information from multiple-sequence alignments of reduced size. In addition, we show that low-eigenvalue correlation modes, discarded by PCA, are important to recover structural information: the corresponding patterns are highly localized, that is, they are concentrated in few sites, which we find to be in close contact in the three-dimensional protein fold. PMID:23990764
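
    The observation about low-eigenvalue modes can be probed with a short sketch: eigendecompose a residue-residue correlation matrix and measure how localized each eigenvector is using the inverse participation ratio (IPR). The correlation matrix itself (and any pseudocount regularization) is assumed to be available; this is not the full Hopfield-Potts inference.

```python
import numpy as np

def mode_localization(corr):
    """corr: (L, L) residue-residue correlation matrix.

    Returns eigenvalues (ascending) and the inverse participation ratio of each
    eigenvector; a large IPR means the mode is concentrated on a few sites, as
    reported for the low-eigenvalue modes.
    """
    evals, evecs = np.linalg.eigh(corr)
    ipr = np.sum(evecs ** 4, axis=0)   # one value per eigenvector (column)
    return evals, ipr
```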

  16. Measuring nuclear reaction cross sections to extract information on neutrinoless double beta decay

    NASA Astrophysics Data System (ADS)

    Cavallaro, M.; Cappuzzello, F.; Agodi, C.; Acosta, L.; Auerbach, N.; Bellone, J.; Bijker, R.; Bonanno, D.; Bongiovanni, D.; Borello-Lewin, T.; Boztosun, I.; Branchina, V.; Bussa, M. P.; Calabrese, S.; Calabretta, L.; Calanna, A.; Calvo, D.; Carbone, D.; Chávez Lomelí, E. R.; Coban, A.; Colonna, M.; D'Agostino, G.; De Geronimo, G.; Delaunay, F.; Deshmukh, N.; de Faria, P. N.; Ferraresi, C.; Ferreira, J. L.; Finocchiaro, P.; Fisichella, M.; Foti, A.; Gallo, G.; Garcia, U.; Giraudo, G.; Greco, V.; Hacisalihoglu, A.; Kotila, J.; Iazzi, F.; Introzzi, R.; Lanzalone, G.; Lavagno, A.; La Via, F.; Lay, J. A.; Lenske, H.; Linares, R.; Litrico, G.; Longhitano, F.; Lo Presti, D.; Lubian, J.; Medina, N.; Mendes, D. R.; Muoio, A.; Oliveira, J. R. B.; Pakou, A.; Pandola, L.; Petrascu, H.; Pinna, F.; Reito, S.; Rifuggiato, D.; Rodrigues, M. R. D.; Russo, A. D.; Russo, G.; Santagati, G.; Santopinto, E.; Sgouros, O.; Solakci, S. O.; Souliotis, G.; Soukeras, V.; Spatafora, A.; Torresi, D.; Tudisco, S.; Vsevolodovna, R. I. M.; Wheadon, R. J.; Yildirin, A.; Zagatto, V. A. B.

    2018-02-01

    Neutrinoless double beta decay (0vββ) is considered the best potential resource to access the absolute neutrino mass scale. Moreover, if observed, it will signal that neutrinos are their own anti-particles (Majorana particles). Presently, this physics case is one of the most important research topics “beyond the Standard Model” and might guide the way towards a Grand Unified Theory of fundamental interactions. Since the 0vββ decay process involves nuclei, its analysis necessarily implies nuclear structure issues. In the NURE project, supported by a Starting Grant of the European Research Council (ERC), nuclear reactions of double charge-exchange (DCE) are used as a tool to extract information on the 0vββ Nuclear Matrix Elements. Indeed, in DCE reactions and ββ decay the initial and final nuclear states are the same and the transition operators have similar structure. Thus the measurement of the DCE absolute cross-sections can give crucial information on ββ matrix elements. In a wider view, the NUMEN international collaboration plans a major upgrade of the INFN-LNS facilities in the next years in order to increase the experimental production of nuclei by at least two orders of magnitude, thus making feasible a systematic study of all the cases of interest as candidates for 0vββ.

  17. Guidelines for Physical Activity during Pregnancy: Comparisons From Around the World

    PubMed Central

    Evenson, Kelly R.; Barakat, Ruben; Brown, Wendy J.; Dargent-Molina, Patricia; Haruna, Megumi; Mikkelsen, Ellen M.; Mottola, Michelle F.; Owe, Katrine M.; Rousham, Emily K.; Yeo, SeonAe

    2013-01-01

    Introduction Women attain numerous benefits from physical activity during pregnancy. However, due to physical changes that occur during pregnancy, special precautions are also needed. This review summarizes current guidelines for physical activity among pregnant women worldwide. Methods We searched PubMed (MedLINE) for country-specific governmental and clinical guidelines on physical activity during pregnancy through the year 2012. We cross-referenced with articles referring to guidelines, with only the most recent included. An abstraction form was used to extract key details and summarize. Results In total, 11 guidelines were identified from nine countries (Australia, Canada, Denmark, France, Japan, Norway, Spain, United Kingdom, United States). Most guidelines supported moderate intensity physical activity during pregnancy (10/11) and indicated specific frequency (9/11) and duration/time (9/11) recommendations. Most guidelines provided advice on initiating an exercise program during pregnancy (10/11). Six guidelines included absolute and relative contraindications to exercise. All guidelines generally ruled-out sports with risks of falls, trauma, or collisions. Six guidelines included indications for stopping exercise during pregnancy. Conclusion This review contrasted pregnancy-related physical activity guidelines from around the world, and can help to inform new guidelines as they are created or updated, and facilitate the development of a worldwide guideline. PMID:25346651

  18. Real-Time Joint Streaming Data Processing from Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M.

    2014-12-01

    The technological breakthroughs in computing that have taken place over the last few decades make it possible to achieve emergency management objectives that focus on saving human lives and decreasing economic effects. In particular, the integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better real-time seismic hazard analysis through distributed computing networks. The main goal of this work is to utilize innovative computational algorithms for better real-time seismic risk analysis by integrating different data sources and processing tools into streaming and cloud computing applications. The Geological Survey of Canada operates the Canadian National Seismograph Network (CNSN) with over 100 high-gain instruments and 60 low-gain or strong motion seismographs. The processing of the continuous data streams from each station of the CNSN provides the opportunity to detect possible earthquakes in near real-time. The information from physical sources is combined to calculate a location and magnitude for an earthquake. The automatically calculated results are not always precise and prompt enough, which can significantly delay the response to a felt or damaging earthquake. Social sensors, here represented by Twitter users, can provide information earlier to the general public and more rapidly to emergency planning and disaster relief agencies. We introduce joint streaming data processing from social and physical sensors in real time, based on the idea that social media observations serve as proxies for physical sensors. By using streams of data in the form of Twitter messages, each of which has an associated time and location, we can extract information related to a target event and perform enhanced analysis by combining it with physical sensor data. Results of this work suggest that the use of data from social media, in conjunction with the development of innovative computing algorithms and combined with sensor data, can provide a new paradigm for real-time earthquake detection that facilitates rapid and inexpensive natural risk reduction.

  19. A dental vision system for accurate 3D tooth modeling.

    PubMed

    Zhang, Li; Alemzadeh, K

    2006-01-01

    This paper describes an active vision system based reverse engineering approach to extract the three-dimensional (3D) geometric information from dental teeth and transfer this information into Computer-Aided Design/Computer-Aided Manufacture (CAD/CAM) systems to improve the accuracy of 3D teeth models and at the same time improve the quality of the construction units to help patient care. The vision system involves the development of a dental vision rig, edge detection, boundary tracing and fast & accurate 3D modeling from a sequence of sliced silhouettes of physical models. The rig is designed using engineering design methods such as a concept selection matrix and weighted objectives evaluation chart. Reconstruction results and accuracy evaluation are presented on digitizing different teeth models.

  20. Difficulty of distinguishing product states locally

    NASA Astrophysics Data System (ADS)

    Croke, Sarah; Barnett, Stephen M.

    2017-01-01

    Nonlocality without entanglement is a rather counterintuitive phenomenon in which information may be encoded entirely in product (unentangled) states of composite quantum systems in such a way that local measurement of the subsystems is not enough for optimal decoding. For simple examples of pure product states, the gap in performance is known to be rather small when arbitrary local strategies are allowed. Here we restrict to local strategies readily achievable with current technology: those requiring neither a quantum memory nor joint operations. We show that even for measurements on pure product states, there can be a large gap between such strategies and theoretically optimal performance. Thus, even in the absence of entanglement, physically realizable local strategies can be far from optimal for extracting quantum information.

  1. An extraction method for mountainous area settlement place information from GF-1 high resolution optical remote sensing images under semantic constraints

    NASA Astrophysics Data System (ADS)

    Guo, H., II

    2016-12-01

    Spatial distribution information on mountainous area settlement places is of great significance to earthquake emergency work because most of the key earthquake-hazardous areas of China are located in mountainous areas. Remote sensing has the advantages of large coverage and low cost, and it is an important way to obtain the spatial distribution information of mountainous area settlement places. At present, most studies apply object-oriented methods that fully consider the geometric, spectral and texture information to extract settlement place information; in this article, semantic constraints are added on top of the object-oriented method. The experimental data are a single scene from the domestic high-resolution satellite GF-1, with a resolution of 2 meters. The main processing consists of three steps: the first is pre-treatment, including orthorectification and image fusion; the second is object-oriented information extraction, including image segmentation and information extraction; the last step is removing erroneous elements under semantic constraints. To formulate these semantic constraints, the distribution characteristics of mountainous area settlement places must be analyzed and the spatial-logical relations between settlement places and other objects must be considered. The extraction accuracy calculation shows that the accuracy of the object-oriented method is 49% and rises to 86% after the use of semantic constraints. As can be seen from the extraction accuracy, the method under semantic constraints can effectively improve the accuracy of mountainous area settlement place information extraction. The results show that it is feasible to extract mountainous area settlement place information from GF-1 imagery, demonstrating that domestic high-resolution optical remote sensing images have practical value for earthquake emergency preparedness.

  2. Optimization of Physical Conditions for the Aqueous Extraction of Antioxidant Compounds from Ginger (Zingiber officinale) Applying a Box-Behnken Design.

    PubMed

    Ramírez-Godínez, Juan; Jaimez-Ordaz, Judith; Castañeda-Ovando, Araceli; Añorve-Morga, Javier; Salazar-Pereda, Verónica; González-Olivares, Luis Guillermo; Contreras-López, Elizabeth

    2017-03-01

    Since ancient times, ginger (Zingiber officinale) has been widely used for culinary and medicinal purposes. This rhizome possesses several chemical constituents; most of them present antioxidant capacity due mainly to the presence of phenolic compounds. Thus, the physical conditions for the optimal extraction of antioxidant components of ginger were investigated by applying a Box-Behnken experimental design. Extracts of ginger were prepared using water as solvent in a conventional solid-liquid extraction. The analyzed variables were time (5, 15 and 25 min), temperature (20, 55 and 90 °C) and sample concentration (2, 6 and 10 %). The antioxidant activity was measured using the 2,2-diphenyl-1-picrylhydrazyl method and a modified ferric reducing antioxidant power assay while total phenolics were measured by Folin & Ciocalteu's method. The suggested experimental design allowed the acquisition of aqueous extracts of ginger with diverse antioxidant activity (100-555 mg Trolox/100 g, 147-1237 mg Fe²⁺/100 g and 50-332 mg gallic acid/100 g). Temperature was the determining factor in the extraction of components with antioxidant activity, regardless of time and sample quantity. The optimal physical conditions that allowed the highest antioxidant activity were: 90 °C, 15 min and 2 % of the sample. The correlation value between the antioxidant activity by ferric reducing antioxidant power assay and the content of total phenolics was R² = 0.83. The experimental design applied allowed the determination of the physical conditions under which ginger aqueous extracts liberate compounds with antioxidant activity. Most of them are of the phenolic type as it was demonstrated through the correlation established between different methods used to measure antioxidant capacity.
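
    A sketch of the design step, under the assumption of three coded factors (time, temperature, sample concentration) and a standard full quadratic response-surface model; the measured antioxidant responses y are not reproduced here.

```python
import itertools
import numpy as np

def box_behnken(k=3, n_center=3):
    """Coded (-1, 0, +1) Box-Behnken design for k factors plus centre-point replicates."""
    runs = []
    for a, b in itertools.combinations(range(k), 2):
        for sa, sb in itertools.product((-1, 1), repeat=2):
            row = [0.0] * k
            row[a], row[b] = float(sa), float(sb)
            runs.append(row)
    runs += [[0.0] * k for _ in range(n_center)]
    return np.array(runs)

def quadratic_model_matrix(X):
    """Design matrix of a full quadratic response-surface model:
    intercept, linear, two-factor interaction and squared terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Usage sketch: X holds the coded levels of time, temperature and sample
# concentration; y would be the measured antioxidant responses (not shown here).
X = box_behnken(3)
# beta, *_ = np.linalg.lstsq(quadratic_model_matrix(X), y, rcond=None)
```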

  3. Study of ultrasonic cavitation during extraction of the peanut oil at varying frequencies.

    PubMed

    Zhang, Lei; Zhou, Cunshan; Wang, Bei; Yagoub, Abu El-Gasim A; Ma, Haile; Zhang, Xiao; Wu, Mian

    2017-07-01

    The ultrasonic extraction of oils is a typical physical processing technology. The extraction process was monitored from the standpoint of oil quality and extraction efficiency. In this study, the ultrasonic cavitation fields were measured by a polyvinylidene fluoride (PVDF) sensor. Waveforms of the ultrasonic cavitation fields were acquired and analyzed. The extraction yield and oxidation properties were compared, and the relationship between the cavitation fields and cavitation oxidation was established. A numerical calculation of the oscillation cycle was carried out for the cavitation bubbles. Results showed that the resonance frequency, fr, of the oil extraction was 40 kHz. At fr, the voltage amplitude was the highest and the time to reach the peak amplitude of the waveform was the shortest. Accordingly, the cavitation effect acted most rapidly, resulting in the strongest cavitation intensity. The extraction yield and oxidation properties were closely related to the cavitation effect, which controlled cavitation oxidation effectively from both chemical and physical viewpoints. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Autism, Context/Noncontext Information Processing, and Atypical Development

    PubMed Central

    Skoyles, John R.

    2011-01-01

    Autism has been attributed to a deficit in contextual information processing. Attempts to understand autism in terms of such a defect, however, do not include more recent computational work upon context. This work has identified that context information processing depends upon the extraction and use of the information hidden in higher-order (or indirect) associations. Higher-order associations underlie the cognition of context rather than that of situations. This paper starts by examining the differences between higher-order and first-order (or direct) associations. Higher-order associations link entities not directly (as with first-order ones) but indirectly through all the connections they have via other entities. Extracting this information requires the processing of past episodes as a totality. As a result, this extraction depends upon specialised extraction processes separate from cognition. This information is then consolidated. Due to this difference, the extraction/consolidation of higher-order information can be impaired whilst cognition remains intact. Although not directly impaired, cognition will be indirectly impaired by knock on effects such as cognition compensating for absent higher-order information with information extracted from first-order associations. This paper discusses the implications of this for the inflexible, literal/immediate, and inappropriate information processing of autistic individuals. PMID:22937255

  5. Identifying the Critical Time Period for Information Extraction when Recognizing Sequences of Play

    ERIC Educational Resources Information Center

    North, Jamie S.; Williams, A. Mark

    2008-01-01

    The authors attempted to determine the critical time period for information extraction when recognizing play sequences in soccer. Although efforts have been made to identify the perceptual information underpinning such decisions, no researchers have attempted to determine "when" this information may be extracted from the display. The authors…

  6. Can we replace curation with information extraction software?

    PubMed

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.Database URL. © The Author(s) 2016. Published by Oxford University Press.

  7. Back to the Future: Have Remotely Sensed Digital Elevation Models Improved Hydrological Parameter Extraction?

    NASA Astrophysics Data System (ADS)

    Jarihani, B.

    2015-12-01

    Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modeling of environmental processes. Pre-processing analysis of DEMs and extracting characteristics of the watershed (e.g., stream networks, catchment delineation, surface and subsurface flow paths) is essential for hydrological and geomorphic analysis and sediment transport. This study investigates the status of current remotely-sensed DEMs in providing advanced morphometric information on drainage basins, particularly in data-sparse regions. Here we assess the accuracy of three available DEMs: (i) the hydrologically corrected "H-DEM" of Geoscience Australia derived from the Shuttle Radar Topography Mission (SRTM) data; (ii) the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM) version 2 1-arc-second (~30 m) data; and (iii) the 9-arc-second national GEODATA DEM-9S ver3 from Geoscience Australia and the Australian National University. We used ESRI's geospatial data model, Arc Hydro and HEC-GeoHMS, designed for building hydrologic information systems to synthesize geospatial and temporal water resources data that support hydrologic modeling and analysis. A coastal catchment in northeast Australia was selected as the study site where very high resolution LiDAR data are available for parts of the area as reference data to assess the accuracy of the other lower resolution datasets. This study provides morphometric information for drainage basins as part of the broad research on sediment flux from coastal basins to the Great Barrier Reef, Australia. After applying geo-referencing and elevation corrections, streams and sub-basins were delineated for each DEM. Then physical characteristics for streams (i.e., length, upstream and downstream elevation, and slope) and sub-basins (i.e., longest flow lengths, area, relief and slopes) were extracted and compared with reference datasets from LiDAR. Results showed that, in the absence of high-precision and high resolution DEM data, ASTER GDEM or SRTM DEM can be used to extract common morphometric relationships, which are widely used for hydrological and geomorphological modelling.
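
    The stream and sub-basin delineation in Arc Hydro/HEC-GeoHMS starts from a flow-direction grid. A minimal numpy illustration of the underlying D8 rule (each cell drains to its steepest downslope neighbour) is sketched below; it is a teaching stand-in, not the toolchain used in the study.

```python
import numpy as np

def d8_flow_direction(dem, cellsize=30.0):
    """Assign each interior cell the index (0-7) of its steepest downslope neighbour.

    dem: 2-D elevation array. Cells with no downslope neighbour keep -1 (pits/flats;
    real toolchains fill these before delineation).
    """
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    nrows, ncols = dem.shape
    fdir = np.full(dem.shape, -1, dtype=int)
    for i in range(1, nrows - 1):
        for j in range(1, ncols - 1):
            drops = []
            for di, dj in offsets:
                dist = cellsize * (2 ** 0.5 if di and dj else 1.0)
                drops.append((dem[i, j] - dem[i + di, j + dj]) / dist)
            k_best = int(np.argmax(drops))
            if drops[k_best] > 0:              # only assign if there is a downslope neighbour
                fdir[i, j] = k_best
    return fdir
```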

  8. Blending Education and Polymer Science: Semi Automated Creation of a Thermodynamic Property Database.

    PubMed

    Tchoua, Roselyne B; Qin, Jian; Audus, Debra J; Chard, Kyle; Foster, Ian T; de Pablo, Juan

    2016-09-13

    Structured databases of chemical and physical properties play a central role in the everyday research activities of scientists and engineers. In materials science, researchers and engineers turn to these databases to quickly query, compare, and aggregate various properties, thereby allowing for the development or application of new materials. The vast majority of these databases have been generated manually, through decades of labor-intensive harvesting of information from the literature; yet, while there are many examples of commonly used databases, a significant number of important properties remain locked within the tables, figures, and text of publications. The question addressed in our work is whether, and to what extent, the process of data collection can be automated. Students of the physical sciences and engineering are often confronted with the challenge of finding and applying property data from the literature, and a central aspect of their education is to develop the critical skills needed to identify such data and discern their meaning or validity. To address shortcomings associated with automated information extraction, while simultaneously preparing the next generation of scientists for their future endeavors, we developed a novel course-based approach in which students develop skills in polymer chemistry and physics and apply their knowledge by assisting with the semi-automated creation of a thermodynamic property database.

  9. Blending Education and Polymer Science: Semiautomated Creation of a Thermodynamic Property Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tchoua, Roselyne B.; Qin, Jian; Audus, Debra J.

    Structured databases of chemical and physical properties play a central role in the everyday research activities of scientists and engineers. In materials science, researchers and engineers turn to these databases to quickly query, compare, and aggregate various properties, thereby allowing for the development or application of new materials. The vast majority of these databases have been generated manually, through decades of labor-intensive harvesting of information from the literature, yet while there are many examples of commonly used databases, a significant number of important properties remain locked within the tables, figures, and text of publications. The question addressed in our work is whether and to what extent the process of data collection can be automated. Students of the physical sciences and engineering are often confronted with the challenge of finding and applying property data from the literature, and a central aspect of their education is to develop the critical skills needed to identify such data and discern their meaning or validity. To address shortcomings associated with automated information extraction while simultaneously preparing the next generation of scientists for their future endeavors, we developed a novel course-based approach in which students develop skills in polymer chemistry and physics and apply their knowledge by assisting with the semiautomated creation of a thermodynamic property database.

  10. Nutrition quality of extraction mannan residue from palm kernel cake on broiler chicken

    NASA Astrophysics Data System (ADS)

    Tafsin, M.; Hanafi, N. D.; Kejora, E.; Yusraini, E.

    2018-02-01

    This study aims to determine the nutritional quality of the palm kernel cake residue left after mannan extraction for broiler chickens, by evaluating physical quality (specific gravity, bulk density and compacted bulk density), chemical quality (proximate analysis and Van Soest test) and a biological test (metabolizable energy). Treatments comprised T0: palm kernel cake extracted with aquadest (control), T1: palm kernel cake extracted with acetic acid (CH3COOH) 1%, T2: palm kernel cake extracted with aquadest + mannanase enzyme 100 u/l, and T3: palm kernel cake extracted with acetic acid (CH3COOH) 1% + mannanase enzyme 100 u/l. The results showed that mannan extraction had a significant effect (P<0.05) in improving physical quality and numerically increased the crude protein value and decreased the NDF (Neutral Detergent Fiber) value. Treatments had a highly significant influence (P<0.01) on the metabolizable energy value of palm kernel cake residue in broiler chickens. It can be concluded that extraction with aquadest + mannanase enzyme 100 u/l yields the best nutrient quality of palm kernel cake residue for broiler chicken.

  11. Prediction of SOFC Performance with or without Experiments: A Study on Minimum Requirements for Experimental Data

    DOE PAGES

    Yang, Tao; Sezer, Hayri; Celik, Ismail B.; ...

    2015-06-02

    In the present paper, a physics-based procedure combining experiments and multi-physics numerical simulations is developed for overall analysis of SOFC operational diagnostics and performance predictions. In this procedure, essential information for the fuel cell is extracted first by utilizing empirical polarization analysis in conjunction with experiments and refined by multi-physics numerical simulations via simultaneous analysis and calibration of polarization curve and impedance behavior. The performance at different utilization cases and operating currents is also predicted to confirm the accuracy of the proposed model. It is demonstrated that, with the present electrochemical model, three air/fuel flow conditions are needed to produce a set of complete data for better understanding of the processes occurring within SOFCs. After calibration against button cell experiments, the methodology can be used to assess the performance of a planar cell without further calibration. The proposed methodology would accelerate the calibration process and improve the efficiency of design and diagnostics.
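
    The empirical polarization analysis mentioned above can be sketched as fitting a generic loss model (activation, ohmic and concentration terms) to a measured voltage-current curve. The functional form and starting values below are assumptions for illustration, not the paper's multi-physics electrochemical model.

```python
import numpy as np
from scipy.optimize import curve_fit

def polarization(i, E0, b, i0, R, c, iL):
    """Generic polarization-curve model: open-circuit voltage minus activation,
    ohmic and concentration losses (a stand-in, not the paper's electrochemical model)."""
    return (E0
            - b * np.arcsinh(i / (2.0 * i0))   # activation loss (Butler-Volmer-like)
            - R * i                            # ohmic loss
            + c * np.log(1.0 - i / iL))        # concentration loss (negative as i -> iL)

# i_data, v_data: measured current densities and cell voltages for one flow condition.
# popt, _ = curve_fit(polarization, i_data, v_data,
#                     p0=[1.0, 0.05, 0.01, 0.2, 0.05, 2.0], maxfev=10000)
```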

  12. Stroke Survivors' Experiences of Physical Rehabilitation: A Systematic Review of Qualitative Studies.

    PubMed

    Luker, Julie; Lynch, Elizabeth; Bernhardsson, Susanne; Bennett, Leanne; Bernhardt, Julie

    2015-09-01

    To report and synthesize the perspectives, experiences, and preferences of stroke survivors undertaking inpatient physical rehabilitation through a systematic review of qualitative studies. MEDLINE, CINAHL, Embase, and PsycINFO were searched from database inception to February 2014. Reference lists of relevant publications were searched. All languages were included. Qualitative studies reporting stroke survivors' experiences of inpatient stroke rehabilitation were selected independently by 2 reviewers. The search yielded 3039 records; 95 full-text publications were assessed for eligibility, and 32 documents (31 studies) were finally included. Comprehensiveness and explicit reporting were assessed independently by 2 reviewers using the consolidated criteria for reporting qualitative research framework. Discrepancies were resolved by consensus. Data regarding characteristics of the included studies were extracted by 1 reviewer, tabled, and checked for accuracy by another reviewer. All text reported in the studies' results sections was entered into qualitative data management software for analysis. Extracted texts were inductively coded and analyzed in 3 phases using thematic synthesis. Nine interrelated analytical themes, with descriptive subthemes, were identified that related to issues of importance to stroke survivors: (1) physical activity is valued; (2) bored and alone; (3) patient-centered therapy; (4) recreation is also rehabilitation; (5) dependency and lack of control; (6) fostering autonomy; (7) power of communication and information; (8) motivation needs nurturing; and (9) fatigue can overwhelm. The thematic synthesis provides new insights into stroke survivors' experiences of inpatient rehabilitation. Negative experiences were reported in all studies and include disempowerment, boredom, and frustration. Rehabilitation could be improved by increasing activity within formal therapy and in free time, fostering patients' autonomy through genuinely patient-centered care, and more effective communication and information. Future stroke rehabilitation research should take into account the experiences and preferences of stroke survivors. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  13. Effects of preprocessing Landsat MSS data on derived features

    NASA Technical Reports Server (NTRS)

    Parris, T. M.; Cicone, R. C.

    1983-01-01

    Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions while extracting physically meaningful features from the data. In general, the approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and Normalized Difference.
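
    Two of the ingredients named above are easy to state in code: the Normalized Difference feature (here from the MSS red and near-infrared bands) and a crude dark-object subtraction, included only as a simple stand-in for the XSTAR haze correction discussed in the paper.

```python
import numpy as np

def normalized_difference(nir, red):
    """Normalized Difference feature, e.g. from Landsat MSS band 7 (NIR) and band 5 (red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)

def dark_object_subtraction(band):
    """Crude haze correction: subtract the scene's darkest value from every pixel
    (a simple stand-in for the physically based XSTAR correction)."""
    return band - band.min()
```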

  14. Modelling dental implant extraction by pullout and torque procedures.

    PubMed

    Rittel, D; Dorogoy, A; Shemtov-Yona, K

    2017-07-01

    Dental implant extraction, achieved by applying either torque or a pullout force, is used to estimate the bone-implant interfacial strength. A detailed description of the mechanical and physical aspects of the extraction process is still missing from the literature. This paper presents 3D nonlinear dynamic finite element simulations of a commercial implant extraction process from the mandible bone. Emphasis is put on the typical load-displacement and torque-angle relationships for various types of cortical and trabecular bone strengths. The simulations also study the influence of the osseointegration level on those relationships. This is done by simulating implant extraction right after insertion, when interfacial frictional contact exists between the implant and bone, and long after insertion, assuming that the implant is fully bonded to the bone. The model does not include a separate representation and model of the interfacial layer, for which available data are limited. The obtained relationships show that the higher the strength of the trabecular bone, the higher the peak extraction force, while for application of torque it is the cortical bone which might dictate the peak torque value. Information on the relative strength contrast of the cortical and trabecular components, as well as the progressive nature of the damage evolution, can be revealed from the obtained relations. It is shown that full osseointegration might multiply the peak and average load values by a factor of 3-12, although the calculated work of extraction varies only by a factor of 1.5. From a quantitative point of view, it is suggested that, as an alternative to reporting peak load or torque values, an average value derived from the extraction work be used to better characterize the bone-implant interfacial strength. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. [Technologies for Complex Intelligent Clinical Data Analysis].

    PubMed

    Baranov, A A; Namazova-Baranova, L S; Smirnov, I V; Devyatkin, D A; Shelmanov, A O; Vishneva, E A; Antonova, E V; Smirnov, V I

    2016-01-01

    The paper presents a system for intelligent analysis of clinical information. The authors describe methods implemented in the system for clinical information retrieval, intelligent diagnostics of chronic diseases, assessment of the importance of patients' features, and detection of hidden dependencies between features. Results of the experimental evaluation of these methods are also presented. Healthcare facilities generate a large flow of both structured and unstructured data which contain important information about patients. Test results are usually retained as structured data, but some data are retained in the form of natural language texts (medical history, the results of physical examination, and the results of other examinations, such as ultrasound, ECG or X-ray studies). Many tasks arising in clinical practice can be automated by applying methods for intelligent analysis of the accumulated structured and unstructured data, which leads to improvement of healthcare quality. The aim of this work was the creation of a complex system for intelligent data analysis in a multi-disciplinary pediatric center. The authors propose methods for information extraction from clinical texts in Russian. The methods are carried out on the basis of deep linguistic analysis. They retrieve terms for diseases, symptoms, areas of the body, and drugs. The methods can recognize additional attributes such as "negation" (indicates that the disease is absent), "no patient" (indicates that the disease refers to the patient's family member, but not to the patient), "severity of illness", "disease course", and "body region to which the disease refers". The authors use a set of hand-crafted templates and various techniques based on machine learning to retrieve information using a medical thesaurus. The extracted information is used to solve the problem of automatic diagnosis of chronic diseases. A machine learning method for classification of patients with similar nosology and a method for determining the most informative patient features are also proposed. The authors processed anonymized health records from the pediatric center to evaluate the proposed methods. The results show the applicability of the information extracted from the texts for solving practical problems. The records of patients with allergic, glomerular and rheumatic diseases were used for experimental assessment of the automatic diagnostic method. The authors also determined the most appropriate machine learning methods for classification of patients for each group of diseases, as well as the most informative disease signs. It has been found that using additional information extracted from clinical texts together with structured data helps to improve the quality of diagnosis of chronic diseases. The authors have also obtained characteristic pattern combinations of disease signs. The proposed methods have been implemented in the intelligent data processing system for a multidisciplinary pediatric center. The experimental results show the ability of the system to improve the quality of pediatric healthcare.

  16. Combined non-parametric and parametric approach for identification of time-variant systems

    NASA Astrophysics Data System (ADS)

    Dziedziech, Kajetan; Czop, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz

    2018-03-01

    Identification of systems, structures and machines with variable physical parameters is a challenging task especially when time-varying vibration modes are involved. The paper proposes a new combined, two-step - i.e. non-parametric and parametric - modelling approach in order to determine time-varying vibration modes based on input-output measurements. Single-degree-of-freedom (SDOF) vibration modes from multi-degree-of-freedom (MDOF) non-parametric system representation are extracted in the first step with the use of time-frequency wavelet-based filters. The second step involves time-varying parametric representation of extracted modes with the use of recursive linear autoregressive-moving-average with exogenous inputs (ARMAX) models. The combined approach is demonstrated using system identification analysis based on the experimental mass-varying MDOF frame-like structure subjected to random excitation. The results show that the proposed combined method correctly captures the dynamics of the analysed structure, using minimum a priori information on the model.
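
    The parametric half of the approach can be illustrated with a recursive least-squares estimator with a forgetting factor. An ARX structure is used here as a simplified stand-in for the recursive ARMAX models of the paper, and the wavelet-based mode isolation step is assumed to have been applied to y and u beforehand.

```python
import numpy as np

def rls_arx(y, u, na=2, nb=2, lam=0.98):
    """Recursive least-squares estimate of time-varying ARX parameters
    y[t] = a1*y[t-1] + ... + a_na*y[t-na] + b1*u[t-1] + ... + b_nb*u[t-nb] + e[t],
    with forgetting factor lam. Returns the parameter trajectory over time."""
    y, u = np.asarray(y, float), np.asarray(u, float)
    n_par = na + nb
    theta = np.zeros(n_par)
    P = np.eye(n_par) * 1e3                    # large initial covariance
    history = []
    for t in range(max(na, nb), len(y)):
        phi = np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
        K = P @ phi / (lam + phi @ P @ phi)    # gain vector
        theta = theta + K * (y[t] - phi @ theta)
        P = (P - np.outer(K, phi @ P)) / lam
        history.append(theta.copy())
    return np.array(history)
```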

  17. Capturing Revolute Motion and Revolute Joint Parameters with Optical Tracking

    NASA Astrophysics Data System (ADS)

    Antonya, C.

    2017-12-01

    Optical tracking of users and various technical systems is becoming more and more popular. It consists of analysing sequences of recorded images using video capturing devices and image processing algorithms. The returned data contain mainly point clouds, coordinates of markers, or coordinates of points of interest. These data can be used for retrieving information related to the geometry of the objects, but also for extracting parameters of the analytical model of the system, useful in a variety of computer aided engineering simulations. The parameter identification of joints deals with the extraction of physical parameters (mainly geometric parameters) for the purpose of constructing accurate kinematic and dynamic models. The input data are the time series of the markers' positions. The least squares method was used for fitting the data to different geometrical shapes (ellipse, circle, plane) and for obtaining the position and orientation of revolute joints.
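
    A common way to recover revolute-joint parameters from tracked marker positions is to fit the plane of motion and then a circle within it; the sketch below uses an SVD plane fit followed by a Kasa least-squares circle fit. This is one standard recipe, not necessarily the exact procedure of the paper.

```python
import numpy as np

def fit_revolute_axis(points):
    """points: (n, 3) marker positions tracked over time.

    Returns the circle centre, radius, and rotation-axis direction (plane normal)."""
    centroid = points.mean(axis=0)
    # 1) fit the plane of motion: normal = singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(points - centroid)
    e1, e2, normal = Vt[0], Vt[1], Vt[2]
    # 2) project the marker positions onto the plane
    uv = np.column_stack([(points - centroid) @ e1, (points - centroid) @ e2])
    # 3) Kasa least-squares circle fit: x^2 + y^2 + D x + E y + F = 0
    A = np.column_stack([uv, np.ones(len(uv))])
    b = -(uv ** 2).sum(axis=1)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cu, cv = -D / 2.0, -E / 2.0
    radius = np.sqrt(cu ** 2 + cv ** 2 - F)
    centre = centroid + cu * e1 + cv * e2
    return centre, radius, normal
```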

  18. Optimization of the scan protocols for CT-based material extraction in small animal PET/CT studies

    NASA Astrophysics Data System (ADS)

    Yang, Ching-Ching; Yu, Jhih-An; Yang, Bang-Hung; Wu, Tung-Hsin

    2013-12-01

    We investigated the effects of scan protocols on CT-based material extraction to minimize radiation dose while maintaining sufficient image information in small animal studies. The phantom simulation experiments were performed with the high dose (HD), medium dose (MD) and low dose (LD) protocols at 50, 70 and 80 kVp with varying mA s. The reconstructed CT images were segmented based on Hounsfield unit (HU)-physical density (ρ) calibration curves and the dual-energy CT-based (DECT) method. Compared to the (HU;ρ) method performed on CT images acquired with the 80 kVp HD protocol, a 2-fold improvement in segmentation accuracy and a 7.5-fold reduction in radiation dose were observed when the DECT method was performed on CT images acquired with the 50/80 kVp LD protocol, showing the possibility to reduce radiation dose while achieving high segmentation accuracy.
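
    Dual-energy segmentation can be illustrated by assigning each voxel to the nearest reference material in the two-dimensional (HU at low kVp, HU at high kVp) space. The nearest-neighbour rule and whatever reference values are supplied are illustrative assumptions, not the calibration used in the study.

```python
import numpy as np

def classify_materials(hu_low, hu_high, references):
    """Assign each voxel to the nearest reference material in (HU_low, HU_high) space.

    hu_low, hu_high: arrays of the same shape (e.g. 50 kVp and 80 kVp scans).
    references: dict mapping material name -> (expected HU at low kVp, at high kVp);
    the reference values are user-supplied assumptions.
    """
    names = list(references)
    ref = np.array([references[n] for n in names])            # (n_materials, 2)
    voxels = np.stack([hu_low, hu_high], axis=-1)             # (..., 2)
    d2 = ((voxels[..., None, :] - ref) ** 2).sum(axis=-1)     # (..., n_materials)
    return np.array(names)[np.argmin(d2, axis=-1)]
```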

  19. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    PubMed

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
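
    With a library such as pydicom, a simplified version of this idea reads exposure-related elements directly from the metadata. The keyword list below is an assumption about which elements are present (vendor-dependent); the study itself parsed the structured elements of the DICOM dose report, which requires walking the report's content sequence rather than reading top-level attributes.

```python
import pydicom

# Keywords that often carry exposure/dose information in CT objects; their
# presence is vendor- and object-dependent (an assumption, not a guarantee).
KEYWORDS = ["KVP", "XRayTubeCurrent", "ExposureTime", "Exposure", "CTDIvol"]

def extract_exposure(path):
    """Read one DICOM file (headers only) and return the available exposure elements."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return {kw: getattr(ds, kw, None) for kw in KEYWORDS}
```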

  20. IN SITU SOIL VAPOR EXTRACTION TREATMENT

    EPA Science Inventory

    Soil vapor extraction (SVE) is designed to physically remove volatile compounds, generally from the vadose or unsaturated zone. It is an in situ process employing vapor extraction wells alone or in combination with air injection wells. Vacuum blowers supply the motive force, induci...

  1. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    NASA Astrophysics Data System (ADS)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  2. Computer vision-based analysis of foods: a non-destructive colour measurement tool to monitor quality and safety.

    PubMed

    Mogol, Burçe Ataç; Gökmen, Vural

    2014-05-01

    Computer vision-based image analysis has been widely used in food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with acrylamide content of potato chips or cookies. Or, porosity index as an important physical property of breadcrumb can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
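
    A small sketch of the mean-colour measurement, assuming scikit-image is available: convert the image to CIE Lab and average the a* channel. The fixed a* threshold used for the "brownish" pixel fraction is an illustrative placeholder, not the browning-ratio definition used by the authors.

```python
import numpy as np
from skimage import io, color

def colour_features(image_path, a_star_threshold=10.0):
    """Mean CIE a* of a food image and the fraction of 'brownish' pixels.

    The threshold is illustrative only; real applications calibrate it against
    chemical measurements (e.g. acrylamide content)."""
    rgb = io.imread(image_path)[..., :3] / 255.0
    a_star = color.rgb2lab(rgb)[..., 1]
    return a_star.mean(), float((a_star > a_star_threshold).mean())
```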

  3. An Agent for Extracting Internet Information with Lead Order

    NASA Astrophysics Data System (ADS)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described to deal with the problems of extracting Internet information caused by the non-standard and disorganized structure of Chinese websites. The agent comprises three modules, each responsible for a separate stage of the information extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to transform the extracted natural-language information into a structured form is also discussed.

  4. Childhood socioeconomic position and adult leisure-time physical activity: a systematic review protocol.

    PubMed

    Elhakeem, Ahmed; Cooper, Rachel; Bann, David; Hardy, Rebecca

    2014-12-05

    Participation in leisure-time physical activity benefits health and is thought to be more prevalent in higher socioeconomic groups. Evidence indicates that childhood socioeconomic circumstances may have long-term influences on adult health and behaviour; however, it is unclear if this extends to an influence on adult physical activity. The aim of this review is to examine whether a lower childhood socioeconomic position is associated with lower levels of leisure-time physical activity during adulthood. Keywords will be used to systematically search five online databases, and additional studies will be located through a search of reference lists. At least two researchers working independently will screen search results, assess the quality of included studies, and extract all relevant data. Studies will be included if they are English-language publications that test the association between at least one indicator of childhood socioeconomic position and a leisure-time physical activity outcome measured during adulthood. Any disagreements and discrepancies arising during the conduct of the study will be resolved through discussion. This study will address the gap in evidence by systematically reviewing the published literature to establish whether childhood socioeconomic position is related to adult participation in leisure-time physical activity. The findings may be used to inform future research and policy. PROSPERO CRD42014007063.

  5. Highlights from the First Ever Demographic Study of Solar Physics, Space Physics, and Upper Atmospheric Physics

    NASA Astrophysics Data System (ADS)

    Moldwin, M.; Morrow, C. A.; White, S. C.; Ivie, R.

    2014-12-01

    Members of the Education & Workforce Working Group and the American Institute of Physics (AIP) conducted the first ever National Demographic Survey of working professionals for the 2012 National Academy of Sciences Solar and Space Physics Decadal Survey to learn about the demographics of this sub-field of space science. The instrument contained questions for participants on the type of workplace; basic demographic information regarding gender and minority status; educational pathways (discipline of undergraduate degree, field of their PhD); how their undergraduate and graduate student researchers are funded; participation in NSF- and NASA-funded spaceflight missions and suborbital programs; and barriers to career advancement. Using contact databases from AGU, the American Astronomical Society's Solar Physics Division (AAS-SPD), attendees of NOAA's Space Weather Week, and proposal submissions to NSF's Atmospheric and Geospace Sciences Division, the AIP's Statistical Research Center cross-correlated and culled these databases, resulting in 2776 unique email addresses of US-based working professionals. The survey received 1305 responses (51%) and generated 125 pages of single-spaced answers to a number of open-ended questions. This talk will summarize the highlights of this first-ever demographic survey, including findings extracted from the open-ended responses regarding barriers to career advancement, which showed significant gender differences.

  6. Recreational physical activity and the risk of preeclampsia: a prospective cohort of Norwegian women.

    PubMed

    Magnus, Per; Trogstad, Lill; Owe, Katrine M; Olsen, Sjurdur F; Nystad, Wenche

    2008-10-15

    Previous case-control studies suggest that recreational physical activity protects against preeclampsia. Using a prospective design, the authors estimated the risk of preeclampsia for pregnant women according to level of physical activity, taking other variables that influence risk into consideration. The data set comprised 59,573 pregnancies from the Norwegian Mother and Child Cohort Study (1999-2006). Information on physical activity and other exposures was extracted from questionnaire responses given in pregnancy weeks 14-22, whereas diagnosis of preeclampsia was retrieved from the Medical Birth Registry of Norway. Estimation and confounder control was performed with multiple logistic regression. About 24% of pregnant women reported no physical activity, and 7% reported more than 25 such activities per month. The adjusted odds ratio was 0.79 (95% confidence interval: 0.65, 0.96) for preeclampsia when comparing women who exercised 25 times or more per month with inactive women. The association appeared strongest among women whose body mass index was less than 25 kg/m(2) and was absent among women whose body mass index was higher than 30 kg/m(2). These results suggest that the preventive effect of recreational physical activity during pregnancy may be more limited than has been shown in case-control studies and may apply to nonobese women only.

  7. Recreational Physical Activity and the Risk of Preeclampsia: A Prospective Cohort of Norwegian Women

    PubMed Central

    Trogstad, Lill; Owe, Katrine M.; Olsen, Sjurdur F.; Nystad, Wenche

    2008-01-01

    Previous case-control studies suggest that recreational physical activity protects against preeclampsia. Using a prospective design, the authors estimated the risk of preeclampsia for pregnant women according to level of physical activity, taking other variables that influence risk into consideration. The data set comprised 59,573 pregnancies from the Norwegian Mother and Child Cohort Study (1999–2006). Information on physical activity and other exposures was extracted from questionnaire responses given in pregnancy weeks 14–22, whereas diagnosis of preeclampsia was retrieved from the Medical Birth Registry of Norway. Estimation and confounder control was performed with multiple logistic regression. About 24% of pregnant women reported no physical activity, and 7% reported more than 25 such activities per month. The adjusted odds ratio was 0.79 (95% confidence interval: 0.65, 0.96) for preeclampsia when comparing women who exercised 25 times or more per month with inactive women. The association appeared strongest among women whose body mass index was less than 25 kg/m2 and was absent among women whose body mass index was higher than 30 kg/m2. These results suggest that the preventive effect of recreational physical activity during pregnancy may be more limited than has been shown in case-control studies and may apply to nonobese women only. PMID:18701444

  8. Ocean Carbon States: Data Mining in Observations and Numerical Simulations Results

    NASA Astrophysics Data System (ADS)

    Latto, R.; Romanou, A.

    2017-12-01

    Advanced data mining techniques are rapidly becoming widely used in Climate and Earth Sciences with the purpose of extracting new meaningful information from increasingly larger and more complex datasets. This is particularly important in studies of the global carbon cycle, where any lack of understanding of its combined physical and biogeochemical drivers is detrimental to our ability to accurately describe, understand, and predict CO2 concentrations and their changes in the major carbon reservoirs. The analysis presented here evaluates the use of cluster analysis as a means of identifying and comparing spatial and temporal patterns extracted from observational and model datasets. As the observational data is organized into various regimes, which we will call "ocean carbon states", we gain insight into the physical and/or biogeochemical processes controlling the ocean carbon cycle as well as how well these processes are simulated by a state-of-the-art climate model. We find that cluster analysis effectively produces realistic, dynamic regimes that can be associated with specific processes at different temporal scales for both observations and the model. In addition, we show how these regimes can be used to illustrate and characterize the model biases in the model air-sea flux of CO2. These biases are attributed to biases in salinity, sea surface temperature, wind speed, and nitrate, which are then used to identify the physical processes that are inaccurately reproduced by the model. In this presentation, we provide a proof-of-concept application using simple datasets, and we expand to more complex ones, using several physical and biogeochemical variable pairs, thus providing considerable insight into the mechanisms and phases of the ocean carbon cycle over different temporal and spatial scales.
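
    A minimal version of the "ocean carbon states" idea: standardize several co-located physical and biogeochemical fields and cluster the grid points. The variable choice, the number of states, and the use of k-means (rather than whichever clustering method the authors applied) are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def ocean_states(fields, n_states=4):
    """fields: dict of equally shaped 2-D arrays, e.g. {'sst': ..., 'co2_flux': ...}.

    Returns an integer state label per grid point (-1 where any field is missing)."""
    stacked = np.column_stack([f.ravel() for f in fields.values()])
    ok = np.all(np.isfinite(stacked), axis=1)          # drop land / missing points
    X = StandardScaler().fit_transform(stacked[ok])
    labels = np.full(stacked.shape[0], -1, dtype=int)
    labels[ok] = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(X)
    return labels.reshape(next(iter(fields.values())).shape)
```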

  9. Rest and Return to Activity After Sport-Related Concussion: A Systematic Review of the Literature

    PubMed Central

    McLeod, Tamara C. Valovich; Lewis, Joy H.; Whelihan, Kate; Bacon, Cailee E. Welch

    2017-01-01

    Objective: To systematically review the literature regarding rest and return to activity after sport-related concussion. Data Sources: The search was conducted in the Cochrane Central Register of Controlled Trials, CINAHL, SPORTDiscus, Educational Resources Information Center, Ovid MEDLINE, and PubMed using terms related to concussion, mild traumatic brain injury, physical and cognitive rest, and return to activity. Study Selection: Studies were included if they were published in English; were original research; and evaluated the use of, compliance with, or effectiveness of physical or cognitive rest or provided empirical evidence supporting the graded return-to-activity progression. Data Extraction: The study design, patient or participant sample, interventions used, outcome measures, main results, and conclusions were extracted, as appropriate, from each article. Data Synthesis: Articles were categorized into groups based on their ability to address one of the primary clinical questions of interest: use of rest, rest effectiveness, compliance with recommendations, or outcome after graded return-to-activity progression. A qualitative synthesis of the results was provided, along with summary tables. Conclusions: Our main findings suggest that rest is underused by health care providers, recommendations for rest are broad and not specific to individual patients, an initial period of moderate physical and cognitive rest (eg, limited physical activity and light mental activity) may improve outcomes during the acute postinjury phase, significant variability in the use of assessment tools and compliance with recommended return-to-activity guidelines exists, and additional research is needed to empirically evaluate the effectiveness of graded return-to-activity progressions. Furthermore, there is a significant need to translate knowledge of best practices in concussion management to primary care providers. PMID:28387547

  10. Deterministic realization of collective measurements via photonic quantum walks.

    PubMed

    Hou, Zhibo; Tang, Jun-Feng; Shang, Jiangwei; Zhu, Huangjun; Li, Jian; Yuan, Yuan; Wu, Kang-Da; Xiang, Guo-Yong; Li, Chuan-Feng; Guo, Guang-Can

    2018-04-12

    Collective measurements on identically prepared quantum systems can extract more information than local measurements, thereby enhancing information-processing efficiency. Although this nonclassical phenomenon has been known for two decades, it has remained a challenging task to demonstrate the advantage of collective measurements in experiments. Here, we introduce a general recipe for performing deterministic collective measurements on two identically prepared qubits based on quantum walks. Using photonic quantum walks, we realize experimentally an optimized collective measurement with fidelity 0.9946 without post selection. As an application, we achieve the highest tomographic efficiency in qubit state tomography to date. Our work offers an effective recipe for beating the precision limit of local measurements in quantum state tomography and metrology. In addition, our study opens an avenue for harvesting the power of collective measurements in quantum information-processing and for exploring the intriguing physics behind this power.

  11. Handling of huge multispectral image data volumes from a spectral hole burning device (SHBD)

    NASA Astrophysics Data System (ADS)

    Graff, Werner; Rosselet, Armel C.; Wild, Urs P.; Gschwind, Rudolf; Keller, Christoph U.

    1995-06-01

    We use chlorin-doped polymer films at low temperatures as the primary imaging detector. Based on the principles of persistent spectral hole burning, this system is capable of storing spatial and spectral information simultaneously in one exposure with extremely high resolution. The sun as an extended light source has been imaged onto the film. The information recorded amounts to tens of GBytes. This data volume is read out by scanning the frequency of a tunable dye laser and reading the images with a digital CCD camera. For acquisition, archival, processing, and visualization, we use MUSIC (MUlti processor System with Intelligent Communication), a single instruction multiple data parallel processor system equipped with the necessary I/O facilities. The huge amount of data requires the development of sophisticated algorithms to efficiently calibrate the data and to extract useful and new information for solar physics.

  12. Probing hydrogen positions in hydrous compounds: information from parametric neutron powder diffraction studies.

    PubMed

    Ting, Valeska P; Henry, Paul F; Schmidtmann, Marc; Wilson, Chick C; Weller, Mark T

    2012-05-21

    We demonstrate the extent to which modern detector technology, coupled with a high flux constant wavelength neutron source, can be used to obtain high quality diffraction data from short data collections, allowing the refinement of the full structures (including hydrogen positions) of hydrous compounds from in situ neutron powder diffraction measurements. The in situ thermodiffractometry and controlled humidity studies reported here reveal that important information on the reorientations of structural water molecules with changing conditions can be easily extracted, providing insight into the effects of hydrogen bonding on bulk physical properties. Using crystalline BaCl2·2H2O as an example system, we analyse the structural changes in the compound and its dehydration intermediates with changing temperature and humidity levels to demonstrate the quality of the dynamic structural information on the hydrogen atoms and associated hydrogen bonding that can be obtained without resorting to sample deuteration.

  13. Characterizing the Fundamental Intellectual Steps Required in the Solution of Conceptual Problems

    NASA Astrophysics Data System (ADS)

    Stewart, John

    2010-02-01

    At some level, the performance of a science class must depend on what is taught, the information content of the materials and assignments of the course. The introductory calculus-based electricity and magnetism class at the University of Arkansas is examined using a catalog of the basic reasoning steps involved in the solution of problems assigned in the class. This catalog was developed by sampling popular physics textbooks for conceptual problems. The solution to each conceptual problem was decomposed into its fundamental reasoning steps. These fundamental steps are then used to quantify the distribution of conceptual content within the course. Using this characterization technique, an exceptionally detailed picture of the information flow and structure of the class can be produced. The intellectual structure of published conceptual inventories is compared with the information presented in the class and the dependence of conceptual performance on the details of coverage extracted.

  14. Inverse statistical physics of protein sequences: a key issues review.

    PubMed

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
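    The inverse-Potts inference mentioned above is often approximated at mean-field level, where couplings follow from the inverse of a regularized correlation matrix of the one-hot-encoded alignment. Below is a heavily simplified sketch of that idea; it is illustrative only, and production direct-coupling-analysis codes additionally use pseudocount-regularized pairwise frequencies, sequence reweighting, and gauge fixing.

```python
# Sketch: mean-field-style coupling inference from a multiple sequence alignment.
import numpy as np

def mean_field_couplings(msa, q=21, ridge=0.5):
    """msa: (n_sequences, L) integer array with entries in 0..q-1."""
    n, L = msa.shape
    onehot = np.zeros((n, L * q))
    onehot[np.arange(n)[:, None], np.arange(L) * q + msa] = 1.0
    # Regularized correlation matrix of the one-hot columns (ridge term keeps it invertible).
    C = (1 - ridge) * np.cov(onehot, rowvar=False, bias=True) + (ridge / q) * np.eye(L * q)
    J = -np.linalg.inv(C)                       # mean-field couplings
    return J.reshape(L, q, L, q)

msa = np.random.default_rng(0).integers(0, 21, size=(200, 30))   # toy alignment
J = mean_field_couplings(msa)
print(J.shape)   # (30, 21, 30, 21)
# Position pairs with large coupling (e.g. Frobenius) norm are the usual proxy
# for residue-residue contacts in the folded structure.
```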

  15. Inverse statistical physics of protein sequences: a key issues review

    NASA Astrophysics Data System (ADS)

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.

  16. Recovery of Near-Fault Ground Motion by Introducing Rotational Motions

    NASA Astrophysics Data System (ADS)

    Chiu, H. C.

    2014-12-01

    Near-fault ground motion is key data for seismologists seeking to reveal seismic faulting and earthquake physics, and strong-motion records are the only near-fault seismograms that remain on scale during a major earthquake. Unfortunately, this type of data can be contaminated by rotation-induced effects such as centrifugal acceleration and gravity effects. We analyze these effects using a set of collocated rotation-translation records of small to moderate earthquakes. Results show that these rotation effects are negligible for small ground motions but may grow rapidly for near-fault or extremely large ground motions. To extract more information from near-fault seismograms and improve our understanding of seismic faulting and earthquake physics, six-component collocated rotation-translation records are required to reduce or remove these effects.
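    As a schematic illustration of why collocated rotation records matter, the sketch below removes the gravity component that a tilting horizontal accelerometer senses. This is a small-angle, single-axis simplification with an assumed sign convention; the full correction also involves centrifugal and sensor-orientation terms.

```python
# Sketch: removing tilt-induced gravity contamination from a horizontal accelerogram.
import numpy as np

G = 9.81  # m/s^2

def remove_tilt_contamination(a_horizontal, tilt_angle_rad):
    """Subtract the gravity projection induced by tilt (small-angle, assumed sign)."""
    return a_horizontal - G * np.sin(tilt_angle_rad)

# With six-component (3 translation + 3 rotation) records, the tilt history would
# come from integrating the measured rotation rate about the horizontal axis.
t = np.linspace(0, 10, 2001)
tilt = 1e-3 * np.sin(2 * np.pi * 0.5 * t)                       # hypothetical tilt history (rad)
a_meas = 0.5 * np.sin(2 * np.pi * 1.0 * t) + G * np.sin(tilt)   # contaminated record
a_corr = remove_tilt_contamination(a_meas, tilt)                # corrected record
```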

  17. Understanding disciplinary vocabularies using a full-text enabled domain-independent term extraction approach.

    PubMed

    Yan, Erjia; Williams, Jake; Chen, Zheng

    2017-01-01

    Publication metadata help deliver rich analyses of scholarly communication. However, research concepts and ideas are more effectively expressed through unstructured fields such as full texts. Thus, the goals of this paper are to employ a full-text enabled method to extract terms relevant to disciplinary vocabularies, and through them, to understand the relationships between disciplines. This paper uses an efficient, domain-independent term extraction method to extract disciplinary vocabularies from a large multidisciplinary corpus of PLoS ONE publications. It finds a power-law pattern in the frequency distributions of terms present in each discipline, indicating a semantic richness potentially sufficient for further study and advanced analysis. The salient relationships amongst these vocabularies become apparent in application of a principal component analysis. For example, Mathematics and Computer and Information Sciences were found to have similar vocabulary use patterns along with Engineering and Physics; while Chemistry and the Social Sciences were found to exhibit contrasting vocabulary use patterns along with the Earth Sciences and Chemistry. These results have implications to studies of scholarly communication as scholars attempt to identify the epistemological cultures of disciplines, and as a full text-based methodology could lead to machine learning applications in the automated classification of scholarly work according to disciplinary vocabularies.
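    Below is a minimal sketch of the kind of analysis described, relating disciplines through their term-frequency profiles with a principal component analysis; the discipline texts are hypothetical stand-ins, and this is not the authors' pipeline.

```python
# Sketch: discipline-by-term frequency matrix projected with PCA.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical stand-in: one concatenated text per discipline.
docs = {
    "Mathematics": "theorem proof lattice graph operator bound",
    "Computer and Information Sciences": "algorithm graph network classifier bound",
    "Chemistry": "ligand synthesis spectra catalyst reaction",
    "Social Sciences": "survey participants interview policy cohort",
}

X = CountVectorizer().fit_transform(list(docs.values())).toarray().astype(float)
X = X / X.sum(axis=1, keepdims=True)               # relative term frequencies per discipline
coords = PCA(n_components=2).fit_transform(X)       # 2-D map of vocabulary-use similarity

for name, (pc1, pc2) in zip(docs, coords):
    print(f"{name:35s} {pc1:+.3f} {pc2:+.3f}")
```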

  18. Understanding disciplinary vocabularies using a full-text enabled domain-independent term extraction approach

    PubMed Central

    Williams, Jake; Chen, Zheng

    2017-01-01

    Publication metadata help deliver rich analyses of scholarly communication. However, research concepts and ideas are more effectively expressed through unstructured fields such as full texts. Thus, the goals of this paper are to employ a full-text enabled method to extract terms relevant to disciplinary vocabularies, and through them, to understand the relationships between disciplines. This paper uses an efficient, domain-independent term extraction method to extract disciplinary vocabularies from a large multidisciplinary corpus of PLoS ONE publications. It finds a power-law pattern in the frequency distributions of terms present in each discipline, indicating a semantic richness potentially sufficient for further study and advanced analysis. The salient relationships amongst these vocabularies become apparent in application of a principal component analysis. For example, Mathematics and Computer and Information Sciences were found to have similar vocabulary use patterns along with Engineering and Physics; while Chemistry and the Social Sciences were found to exhibit contrasting vocabulary use patterns along with the Earth Sciences and Chemistry. These results have implications to studies of scholarly communication as scholars attempt to identify the epistemological cultures of disciplines, and as a full text-based methodology could lead to machine learning applications in the automated classification of scholarly work according to disciplinary vocabularies. PMID:29186141

  19. Physiological effects of formulation containing tannase-converted green tea extract on skin care: physical stability, collagenase, elastase, and tyrosinase activities.

    PubMed

    Hong, Yang-Hee; Jung, Eun Young; Noh, Dong Ouk; Suh, Hyung Joo

    2014-03-01

    Green tea contains numerous polyphenols, which have health-promoting effects. The purpose of this study was to evaluate the effect of a tannase-converted green tea extract (TGE) formulation on physical stability and the activities of skin-related enzymes. Physical stability was evaluated by measuring the pH, precipitation, and color at 25 ± 2 °C/ambient humidity and at 40 ± 2 °C/70% ± 5% relative humidity for 4 months. Activities of collagenase, elastase, and tyrosinase as skin-related enzymes were assessed for the TGE formulation. The concentrations of epigallocatechin-3-gallate and epicatechin-3-gallate in green tea extract were greatly decreased, to a negligible level, when treated with tannase. The formulation containing 5% tannase-converted green tea extract showed relatively stable pH, precipitation, and color features for 16 weeks. When TGE was added to the formulation, there was a significant increase in the inhibition of elastase and tyrosinase activities (p < 0.05) compared with the formulation containing 5% normal green tea extract. The TGE could be used in cosmetics as a skin anti-wrinkle or depigmenting agent.

  20. Longitudinal Analysis of New Information Types in Clinical Notes

    PubMed Central

    Zhang, Rui; Pakhomov, Serguei; Melton, Genevieve B.

    2014-01-01

    It is increasingly recognized that redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous, significant, and may negatively impact the secondary use of these notes for research and patient care. We investigated several automated methods to identify redundant versus relevant new information in clinical reports. These methods may provide a valuable approach to extract clinically pertinent information and further improve the accuracy of clinical information extraction systems. In this study, we used UMLS semantic types to extract several types of new information, including problems, medications, and laboratory information. Automatically identified new information highly correlated with manual reference standard annotations. Methods to identify different types of new information can potentially help to build up more robust information extraction systems for clinical researchers as well as aid clinicians and researchers in navigating clinical notes more effectively and quickly identify information pertaining to changes in health states. PMID:25717418
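    A schematic sketch of the redundancy-versus-new-information idea follows; the study used UMLS semantic types and a full clinical NLP pipeline, whereas here extract_concepts is a hypothetical placeholder standing in for such a concept extractor.

```python
# Sketch: flag concepts in the current note that were not mentioned in prior notes.
def extract_concepts(note_text):
    # Placeholder: a real system would return UMLS concept identifiers restricted
    # to semantic types such as problems, medications, and laboratory findings.
    return set(note_text.lower().replace(";", " ").replace(":", " ").split())

def new_information(current_note, prior_notes):
    """Concepts in the current note not mentioned in any earlier note."""
    seen = set()
    for note in prior_notes:
        seen |= extract_concepts(note)
    return extract_concepts(current_note) - seen

prior = ["metformin 500 mg started for diabetes", "diabetes stable on metformin"]
print(new_information("diabetes; new problem: hypertension; started lisinopril", prior))
```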

  1. Status and trends of dam removal research in the United States

    USGS Publications Warehouse

    Bellmore, James; Duda, Jeff; Craig, Laura; Greene, Samantha L.; Torgersen, Christian E.; Collins, Mathias J.; Vittum, Katherine

    2017-01-01

    Aging infrastructure coupled with growing interest in river restoration has driven a dramatic increase in the practice of dam removal. With this increase, there has been a proliferation of studies that assess the physical and ecological responses of rivers to these removals. As more dams are considered for removal, scientific information from these dam-removal studies will increasingly be called upon to inform decisions about whether, and how best, to bring down dams. This raises a critical question: what is the current state of dam-removal science in the United States? To explore the status, trends, and characteristics of dam-removal research in the U.S., we searched the scientific literature and extracted basic information from studies on dam removal. Our literature review illustrates that although over 1200 dams have been removed in the U.S., fewer than 10% have been scientifically evaluated, and most of these studies were short in duration ( < 4 years) and had limited (1–2 years) or no pre-removal monitoring. The majority of studies focused on hydrologic and geomorphic responses to removal rather than biological and water-quality responses, and few studies were published on linkages between physical and ecological components. Our review illustrates the need for long-term, multidisciplinary case studies, with robust study designs, in order to anticipate the effects of dam removal and inform future decision making.

  2. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    NASA Astrophysics Data System (ADS)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to collocate the surface information of object. For various applications, we need to extract a good perceptual quality point cloud from the scanned points. To solve the problem, most of existing methods extract important points based on a fixed scale. However, geometric features of 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis function. For each scale, important points are extracted from the point cloud based on their importance. We apply a perception metric Just-Noticeable-Difference to measure degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effective of the proposed method, suggesting a reliable solution for optimal information extraction of object.
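    Below is a hedged, generic sketch of importance-based point-cloud reduction in the spirit of the abstract. The RBF multi-scale construction and the Just-Noticeable-Difference metric are not reproduced; the importance proxy used here, local deviation from a fitted neighbourhood plane, is an assumption.

```python
# Sketch: keep the most "important" points, where importance ~ local non-planarity.
import numpy as np
from scipy.spatial import cKDTree

def reduce_points(points, keep_fraction=0.3, k=16):
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    # Smallest singular value of the centred neighbourhood = deviation from its best-fit plane.
    importance = np.array([
        np.linalg.svd(points[nb] - points[nb].mean(axis=0), compute_uv=False)[-1]
        for nb in idx
    ])
    keep = np.argsort(importance)[::-1][: int(keep_fraction * len(points))]
    return points[keep]

pts = np.random.default_rng(0).normal(size=(2000, 3))
print(reduce_points(pts, keep_fraction=0.2).shape)   # (400, 3)
```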

  3. Effects of Eleutherococcus senticosus Cortex on Recovery from the Forced Swimming Test and Fatty Acid β-Oxidation in the Liver and Skeletal Muscle of mice.

    PubMed

    Sumiyoshi, Maho; Kimura, Yoshiyuki

    2016-03-01

    The root and stem barks of Eleutherococcus senticosus have been used to treat emotional and physical fatigue in China, Russia, Korea, and Japan. The effects of E. senticosus on recovery from physical fatigue and the expenditure of energy currently remain unclear. We herein examined the effects of E. senticosus extract on recovery from physical fatigue after the forced swimming test as well as fatty acid β-oxidation in the liver and skeletal muscle of mice. 1) Physical fatigue; E. senticosus extract (500 and 1000 mg/kg, twice daily) was administered orally to ICR male mice for 7 consecutive days. After swimming had been performed for 15 min, each mouse was placed on the cover of a 100-mm culture plate, and the time for each mouse to move away from the cover was measured. 2) Fatty acid β-oxidation in the liver and skeletal muscle; E. senticosus extract (500 and 1000 mg/kg) was administered orally twice daily to C57BL/6J male mice for 21 consecutive days. The initial and final body and liver weight were measured, and then fatty acid β-oxidation activity in the liver and skeletal muscle was measured by methods using [1- 14 C] palmitic acid. Recovery times after forced swimming were shorter in E. senticosus extract (500 and 1000 mg/kg)-treated mice than in vehicle-treated mice. The body and liver weight had no effect by the oral administration of E. senticosus extract, vitamin mixture and L-carnitine. Fatty acid β-oxidation activity in skeletal muscle was increased by E. senticosus extract (500 and 1000 mg/kg). E. senticosus may enhance recovery from physical fatigue induced by forced swimming by accelerating energy changes through fatty acid β-oxidation in skeletal muscle.

  4. Interpreting consumer preferences: physicohedonic and psychohedonic models yield different information in a coffee-flavored dairy beverage.

    PubMed

    Li, Bangde; Hayes, John E; Ziegler, Gregory R

    2014-09-01

    Designed experiments provide product developers feedback on the relationship between formulation and consumer acceptability. While actionable, this approach typically assumes a simple psychophysical relationship between ingredient concentration and perceived intensity. This assumption may not be valid, especially in cases where perceptual interactions occur. Additional information can be gained by considering the liking-intensity function, as single ingredients can influence more than one perceptual attribute. Here, 20 coffee-flavored dairy beverages were formulated using a fractional mixture design that varied the amount of coffee extract, fluid milk, sucrose, and water. Overall liking ( liking ) was assessed by 388 consumers using an incomplete block design (4 out of 20 prototypes) to limit fatigue; all participants also rated the samples for intensity of coffee flavor (coffee) , milk flavor (milk) , sweetness (sweetness) and thickness (thickness) . Across product means, the concentration variables explained 52% of the variance in liking in main effects multiple regression. The amount of sucrose (β = 0.46) and milk (β = 0.46) contributed significantly to the model (p's <0.02) while coffee extract (β = -0.17; p = 0.35) did not. A comparable model based on the perceived intensity explained 63% of the variance in mean liking ; sweetness (β = 0.53) and milk (β = 0.69) contributed significantly to the model (p's <0.04), while the influence of coffee flavor (β = 0.48) was positive but marginally (p = 0.09). Since a strong linear relationship existed between coffee extract concentration and coffee flavor, this discrepancy between the two models was unexpected, and probably indicates that adding more coffee extract also adds a negative attribute, e.g. too much bitterness. In summary, modeling liking as a function of both perceived intensity and physical concentration provides a richer interpretation of consumer data.

  5. Interpreting consumer preferences: physicohedonic and psychohedonic models yield different information in a coffee-flavored dairy beverage

    PubMed Central

    Li, Bangde; Hayes, John E.; Ziegler, Gregory R.

    2014-01-01

    Designed experiments provide product developers feedback on the relationship between formulation and consumer acceptability. While actionable, this approach typically assumes a simple psychophysical relationship between ingredient concentration and perceived intensity. This assumption may not be valid, especially in cases where perceptual interactions occur. Additional information can be gained by considering the liking-intensity function, as single ingredients can influence more than one perceptual attribute. Here, 20 coffee-flavored dairy beverages were formulated using a fractional mixture design that varied the amount of coffee extract, fluid milk, sucrose, and water. Overall liking (liking) was assessed by 388 consumers using an incomplete block design (4 out of 20 prototypes) to limit fatigue; all participants also rated the samples for intensity of coffee flavor (coffee), milk flavor (milk), sweetness (sweetness) and thickness (thickness). Across product means, the concentration variables explained 52% of the variance in liking in main effects multiple regression. The amount of sucrose (β = 0.46) and milk (β = 0.46) contributed significantly to the model (p’s <0.02) while coffee extract (β = −0.17; p = 0.35) did not. A comparable model based on the perceived intensity explained 63% of the variance in mean liking; sweetness (β = 0.53) and milk (β = 0.69) contributed significantly to the model (p’s <0.04), while the influence of coffee flavor (β = 0.48) was positive but marginally (p = 0.09). Since a strong linear relationship existed between coffee extract concentration and coffee flavor, this discrepancy between the two models was unexpected, and probably indicates that adding more coffee extract also adds a negative attribute, e.g. too much bitterness. In summary, modeling liking as a function of both perceived intensity and physical concentration provides a richer interpretation of consumer data. PMID:25024507

  6. Selected physical and chemical properties of Feverfew (Tanacetum parthenium) extracts important for formulated product quality and performance.

    PubMed

    Jin, Ping; Madieh, Shadi; Augsburger, Larry L

    2008-01-01

    The objectives of this research are: (1) to assess selected formulation-relevant physical properties of several commercial Feverfew extracts, including flowability, hygroscopicity, compressibility and compactibility; (2) to develop and validate a suitable extraction method and HPLC assay, and (3) to determine the parthenolide content of several commercial Feverfew extracts. Carr's index, minimum orifice diameter and particle-particle interaction were used to evaluate powder flowability. Hygroscopicity was evaluated by determining the equilibrium moisture content (EMC) after storage at various % relative humidities. Heckel analysis and the compression pressure-radial tensile strength relationship were used to represent compression and compaction properties of feverfew extracts. An adapted analytical method was developed based on literature methods and then validated for the determination of parthenolide in feverfew. The commercial extracts tested exhibited poor to very poor flowability. The comparatively low mean yield pressure suggested that feverfew extracts deformed mainly plastically. Hygroscopicity and compactibility varied greatly with source. No commercial feverfew extracts tested contained the label-claimed parthenolide content. Even different batches from the same manufacturer showed significantly different parthenolide content. Therefore, extract manufacturers should commit to proper quality control procedures that ensure accurate label claims, and supplement manufacturers should take into account possible differences in physico-chemical properties when using extracts from multiple suppliers.
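    For reference, Carr's compressibility index and the related Hausner ratio are simple functions of bulk and tapped density; a short worked example with made-up density values follows.

```python
# Sketch: Carr's index and Hausner ratio from bulk and tapped density (illustrative values).
def carr_index(bulk_density, tapped_density):
    """Carr's compressibility index (%) = 100 * (tapped - bulk) / tapped."""
    return 100.0 * (tapped_density - bulk_density) / tapped_density

def hausner_ratio(bulk_density, tapped_density):
    return tapped_density / bulk_density

bulk, tapped = 0.32, 0.48            # g/mL, hypothetical extract powder
print(carr_index(bulk, tapped))      # ~33% -> "very poor" flow on the usual scale
print(hausner_ratio(bulk, tapped))   # ~1.5
```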

  7. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    PubMed Central

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing special-subject information depends on this extraction. On the basis of WorldView-2 high-resolution data and an optimal-segmentation-parameter method for object-oriented image segmentation and high-resolution image information extraction, the following processes were conducted in this study. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and the combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
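    The paper's improved weighted mean-variance method is not spelled out in the abstract; the sketch below only illustrates the general idea of variance-based scale selection, i.e., scoring each candidate segmentation by the area-weighted mean of within-segment variance. All names, the synthetic data, and the selection rule are assumptions.

```python
# Sketch: area-weighted mean of within-segment variance, one score per candidate scale.
import numpy as np

def weighted_mean_variance(segment_labels, image):
    """Area-weighted mean of per-segment spectral variance for one segmentation."""
    values, weights = [], []
    for seg_id in np.unique(segment_labels):
        pixels = image[segment_labels == seg_id]
        values.append(pixels.var())
        weights.append(pixels.size)
    return np.average(values, weights=weights)

# Tiny synthetic demo: two candidate "segmentations" of one random band.
band = np.random.default_rng(0).random((64, 64))
rows, cols = np.indices(band.shape)
coarse = rows // 32                                  # 2 large segments
fine = (rows // 8) * 8 + cols // 8                   # 64 small segments
print(weighted_mean_variance(coarse, band), weighted_mean_variance(fine, band))
# The candidate scale would then be chosen by inspecting how this score changes with scale.
```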

  8. Information extraction during simultaneous motion processing.

    PubMed

    Rideaux, Reuben; Edwards, Mark

    2014-02-01

    When confronted with multiple moving objects the visual system can process them in two stages: an initial stage in which a limited number of signals are processed in parallel (i.e. simultaneously) followed by a sequential stage. We previously demonstrated that during the simultaneous stage, observers could discriminate between presentations containing up to 5 vs. 6 spatially localized motion signals (Edwards & Rideaux, 2013). Here we investigate what information is actually extracted during the simultaneous stage and whether the simultaneous limit varies with the detail of information extracted. This was achieved by measuring the ability of observers to extract varied information from low detail, i.e. the number of signals presented, to high detail, i.e. the actual directions present and the direction of a specific element, during the simultaneous stage. The results indicate that the resolution of simultaneous processing varies as a function of the information which is extracted, i.e. as the information extraction becomes more detailed, from the number of moving elements to the direction of a specific element, the capacity to process multiple signals is reduced. Thus, when assigning a capacity to simultaneous motion processing, this must be qualified by designating the degree of information extraction. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  9. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters.

    PubMed

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing special-subject information depends on this extraction. On the basis of WorldView-2 high-resolution data and an optimal-segmentation-parameter method for object-oriented image segmentation and high-resolution image information extraction, the following processes were conducted in this study. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and the combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme.

  10. Graph-theoretic analysis of discrete-phase-space states for condition change detection and quantification of information

    DOEpatents

    Hively, Lee M.

    2014-09-16

    Data collected from devices and from human physiological monitoring may be used to forewarn of critical events, such as machine or structural failure, or medical events such as stroke detected from brain or heart wave data. By monitoring the data and determining what values are indicative of a failure forewarning, one can provide adequate notice of the impending failure in order to take preventive measures. This disclosure teaches a computer-based method to convert dynamical numeric data representing physical objects (unstructured data) into discrete-phase-space states, and hence into a graph (structured data) for extraction of condition change.
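    A minimal, generic sketch of the pipeline the abstract describes follows: turning a numeric time series into discrete phase-space states and then into a state-transition graph. The embedding parameters, binning scheme, and synthetic signal are illustrative, not the patented method.

```python
# Sketch: time-delay embedding -> discrete states -> state-transition graph.
import numpy as np
from collections import Counter

def delay_embed(x, dim=3, lag=5):
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

def symbolize(vectors, n_bins=4):
    """Map each embedded vector to a discrete state by quantile-binning each coordinate."""
    edges = np.quantile(vectors, np.linspace(0, 1, n_bins + 1)[1:-1], axis=0)
    digits = np.stack([np.digitize(vectors[:, j], edges[:, j])
                       for j in range(vectors.shape[1])], axis=1)
    return [tuple(row) for row in digits]

def transition_graph(states):
    """Directed graph as a Counter of (state_i, state_{i+1}) edge weights."""
    return Counter(zip(states[:-1], states[1:]))

x = np.sin(np.linspace(0, 60, 3000)) + 0.1 * np.random.default_rng(1).normal(size=3000)
g_baseline = transition_graph(symbolize(delay_embed(x)))
print(len(g_baseline))   # number of distinct state transitions
# A condition change would be flagged by a dissimilarity measure between this
# baseline graph and the graph built from a later data window.
```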

  11. Effects of exercise training on fitness, mobility, fatigue, and health-related quality of life among adults with multiple sclerosis: a systematic review to inform guideline development.

    PubMed

    Latimer-Cheung, Amy E; Pilutti, Lara A; Hicks, Audrey L; Martin Ginis, Kathleen A; Fenuta, Alyssa M; MacKibbon, K Ann; Motl, Robert W

    2013-09-01

    To conduct a systematic review of evidence surrounding the effects of exercise training on physical fitness, mobility, fatigue, and health-related quality of life in adults with multiple sclerosis (MS). The databases included EMBASE, 1980 to 2011 (wk 12); Ovid MEDLINE and Ovid OLDMEDLINE, 1947 to March (wk 3) 2011; PsycINFO, 1967 to March (wk 4) 2011; CINAHL all-inclusive; SPORTDiscus all-inclusive; Cochrane Library all-inclusive; and Physiotherapy Evidence Database all-inclusive. The review was limited to English-language studies (published before December 2011) of people with MS that evaluated the effects of exercise training on outcomes of physical fitness, mobility, fatigue, and/or health-related quality of life. One research assistant extracted data and rated study quality. A second research assistant verified the extraction and quality assessment. From the 4362 studies identified, 54 studies were included in the review. The extracted data were analyzed using a descriptive approach. There was strong evidence that exercise performed 2 times per week at a moderate intensity increases aerobic capacity and muscular strength. The evidence was not consistent regarding the effects of exercise training on other outcomes. Among those with mild to moderate disability from MS, there is sufficient evidence that exercise training is effective for improving both aerobic capacity and muscular strength. Exercise may improve mobility, fatigue, and health-related quality of life. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. Symmetries in Physics

    NASA Astrophysics Data System (ADS)

    Brading, Katherine; Castellani, Elena

    2010-01-01

    Preface; Copyright acknowledgements; List of contributors; 1. Introduction; Part I. Continuous Symmetries: 2. Classic texts: extracts from Weyl and Wigner; 3. Review paper: On the significance of continuous symmetry to the foundations of physics C. Martin; 4. The philosophical roots of the gauge principle: Weyl and transcendental phenomenological idealism T. Ryckman; 5. Symmetries and Noether's theorems K. A. Brading and H. R. Brown; 6. General covariance, gauge theories, and the Kretschmann objection J. Norton; 7. The interpretation of gauge symmetry M. Redhead; 8. Tracking down gauge: an ode to the constrained Hamiltonian formalism J. Earman; 9. Time-dependent symmetries: the link between gauge symmetries and indeterminism D. Wallace; 10. A fourth way to the Aharanov-Bohm effect A. Nounou; Part II. Discrete Symmetries: 11. Classic texts: extracts from Lebniz, Kant and Black; 12. Review paper: Understanding permutation symmetry S. French and D. Rickles; 13. Quarticles and the identity of discernibles N. Hugget; 14. Review paper: Handedness, parity violation, and the reality of space O. Pooley; 15. Mirror symmetry: what is it for a relational space to be orientable? N. Huggett; 16. Physics and Leibniz's principles S. Saunders; Part III. Symmetry Breaking: 17: Classic texts: extracts from Curie and Weyl; 18. Extract from G. Jona-Lasinio: Cross-fertilization in theoretical physics: the case of condensed matter and particle physics G. Jona-Lasinio; 19. Review paper: On the meaning of symmetry breaking E. Castellani; 20. Rough guide to spontaneous symmetry breaking J. Earman; 21. Spontaneous symmetry breaking: theoretical arguments and philosophical problems M. Morrison; Part IV. General Interpretative Issues: 22. Classic texts: extracts from Wigner; 23. Symmetry as a guide to superfluous theoretical structure J. Ismael and B. van Fraassen; 24. Notes on symmetries G. Belot; 25. Symmetry, objectivity, and design P. Kosso; 26. Symmetry and equivalence E. Castellani.

  13. Symmetries in Physics

    NASA Astrophysics Data System (ADS)

    Brading, Katherine; Castellani, Elena

    2003-12-01

    Preface; Copyright acknowledgements; List of contributors; 1. Introduction; Part I. Continuous Symmetries: 2. Classic texts: extracts from Weyl and Wigner; 3. Review paper: On the significance of continuous symmetry to the foundations of physics C. Martin; 4. The philosophical roots of the gauge principle: Weyl and transcendental phenomenological idealism T. Ryckman; 5. Symmetries and Noether's theorems K. A. Brading and H. R. Brown; 6. General covariance, gauge theories, and the Kretschmann objection J. Norton; 7. The interpretation of gauge symmetry M. Redhead; 8. Tracking down gauge: an ode to the constrained Hamiltonian formalism J. Earman; 9. Time-dependent symmetries: the link between gauge symmetries and indeterminism D. Wallace; 10. A fourth way to the Aharanov-Bohm effect A. Nounou; Part II. Discrete Symmetries: 11. Classic texts: extracts from Lebniz, Kant and Black; 12. Review paper: Understanding permutation symmetry S. French and D. Rickles; 13. Quarticles and the identity of discernibles N. Hugget; 14. Review paper: Handedness, parity violation, and the reality of space O. Pooley; 15. Mirror symmetry: what is it for a relational space to be orientable? N. Huggett; 16. Physics and Leibniz's principles S. Saunders; Part III. Symmetry Breaking: 17: Classic texts: extracts from Curie and Weyl; 18. Extract from G. Jona-Lasinio: Cross-fertilization in theoretical physics: the case of condensed matter and particle physics G. Jona-Lasinio; 19. Review paper: On the meaning of symmetry breaking E. Castellani; 20. Rough guide to spontaneous symmetry breaking J. Earman; 21. Spontaneous symmetry breaking: theoretical arguments and philosophical problems M. Morrison; Part IV. General Interpretative Issues: 22. Classic texts: extracts from Wigner; 23. Symmetry as a guide to superfluous theoretical structure J. Ismael and B. van Fraassen; 24. Notes on symmetries G. Belot; 25. Symmetry, objectivity, and design P. Kosso; 26. Symmetry and equivalence E. Castellani.

  14. Examining the Self-Assembly of Rod-Coil Block Copolymers via Physics Based Polymer Models and Polarized X-Ray Scattering

    NASA Astrophysics Data System (ADS)

    Hannon, Adam; Sunday, Daniel; Windover, Donald; Liman, Christopher; Bowen, Alec; Khaira, Gurdaman; de Pablo, Juan; Delongchamp, Dean; Kline, R. Joseph

    Photovoltaics, flexible electronics, and stimuli-responsive materials all require enhanced methodology to examine their nanoscale molecular orientation. The mechanical, electronic, optical, and transport properties of devices made from these materials are all a function of this orientation. The polymer chains in these materials are best modeled as semi-flexible to rigid rods. Characterizing the rigidity and molecular orientation of these polymers non-invasively is currently being pursued by using polarized resonant soft X-ray scattering (P-RSoXS). In this presentation, we show recent work on implementing such a characterization process using a rod-coil block copolymer system in the rigid-rod limit. We first demonstrate how we have used physics-based models such as self-consistent field theory (SCFT) in non-polarized RSoXS work to fit scattering profiles for thin-film coil-coil PS-b-PMMA block copolymer systems. We then show that, by using a wormlike chain partition function in the SCFT formalism to model the rigid-rod block, the methodology can be used there as well to extract the molecular orientation of the rod block from a simulated P-RSoXS experiment. The results from the work show the potential of the technique to extract thermodynamic and morphological sample information.

  15. What is covered by "cancer rehabilitation" in PubMed? A review of randomized controlled trials 1990-2011.

    PubMed

    Gudbergsson, Sævar Berg; Dahl, Alv A; Loge, Jon Håvard; Thorsen, Lene; Oldervoll, Line M; Grov, Ellen K

    2015-02-01

    This focused review examines randomized controlled studies included by the term "cancer rehabilitation" in PubMed. The research questions concern the type of interventions performed and their methodological quality. Using the Medical Subject Headings (MeSH) terms: neoplasm AND rehabilitation, all articles with randomized controlled studies that included adult cancer patients, written in English, were extracted from PubMed. Papers covering physical exercise, psychiatric/psychological treatment or social support only were excluded as they had been reviewed recently. Abstracts and papers were assessed by 3 pairs of reviewers, and descriptive information was extracted systematically. Methodological quality was rated on a 10-item index scale, and the cut-off for acceptable quality was set at ≥ 8. A total of 132 (19%) of the 683 identified papers met the eligibility criteria and were assessed in detail. The papers were grouped into 5 thematic categories: 44 physical; 15 art and expressive; 47 psycho-educative; 21 emotionally supportive; and 5 others. Good quality of design was observed in 32 studies, 18 of them uni-dimensional and 14 multi-dimensional. Published randomized controlled studies on cancer rehabilitation are heterogeneous in terms of content and samples, and are mostly characterized by suboptimal design quality. Future studies should be more specific and well-designed with sufficient statistical strength.

  16. Storage Thresholds for Relative Sea Level Signals in the Stratigraphic Record

    NASA Astrophysics Data System (ADS)

    Li, Q.; Yu, L.; Straub, K. M.

    2015-12-01

    Many argue that the tug of Relative Sea Level (RSL) change represents the most important allogenic forcing affecting deltas and is the primary control on stratigraphic architecture of deltas. However, the range of amplitudes and periodicities of RSL cycles stored in stratigraphy remains unknown. Here we use a suite of physical experiments to show that RSL cycles with magnitudes and periodicities less than the spatial and temporal scales of deltaic internal (autogenic) dynamics cannot confidently be extracted from the physical stratigraphic record. Additional analysis of deltaic morphodynamics also suggest no significant differences between an experiment with constant boundary conditions (control experiment) and an experiment with small magnitude and short periodicity RSL cycles, relative to the autogenic dynamics. Significant differences in the aspect ratio of channel bodies and deposit sand fractions do exists between our control experiment and those experiments with either large magnitudes or long periodicities RSL cycles. Using a compilation of data from major river delta systems, we show that our predicted thresholds for RSL signal storage often overlap with the magnitudes and periodicities of commonly discussed drivers of global sea level. This theory defines quantitative limits on the range of paleo-RSL information that can be extracted from the stratigraphic record, which could aid stratigraphic prediction and the inversion of stratigraphy for paleo- deltaic response to climate change.

  17. Factors associated with physical activity promotion by allied and other non-medical health professionals: A systematic review.

    PubMed

    Crisford, Paul; Winzenberg, Tania; Venn, Alison; Schultz, Martin; Aitken, Dawn; Cleland, Verity

    2018-05-21

    To identify factors associated with non-medical health professionals' engagement in physical activity (PA) promotion. Five electronic databases were searched for studies including practising health professionals (excluding medical doctors), a PA promotion practice measure, a test of association between potential influencing factors and PA promotion practice, and written in English. Two researchers independently screened studies and extracted data. Extracted data were synthesized in a tabular format with a narrative summary (thematic analysis). Thirty studies involving 7734 non-medical health professionals were included. Self-efficacy in PA promotion, positive beliefs in the benefits of PA, assessing patients' PA, and PA promotion training were the main factors associated with engaging in PA promotion. Lack of remuneration was not associated. Common study limitations included a lack of information on non-responders, data collection by survey only and limited reliability or validity testing of measurements. There are common factors influencing PA promotion, but the absence of studies from some health professions, limitations related to study measures, and the lack of randomised controlled intervention trials highlights the need for further research. The factors identified may prove useful for guiding the development of strategies to encourage greater engagement in PA promotion by health professionals. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Implementation of generalized quantum measurements: Superadditive quantum coding, accessible information extraction, and classical capacity limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun

    2004-05-01

    Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques.
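    For orientation, the accessible information per channel use is bounded by the Holevo quantity, a standard textbook expression included here only as background, not as a result of this paper:

```latex
\[
\chi\bigl(\{p_i,\rho_i\}\bigr) \;=\; S\!\Bigl(\sum_i p_i\,\rho_i\Bigr) \;-\; \sum_i p_i\, S(\rho_i),
\qquad S(\rho) = -\mathrm{Tr}\bigl(\rho\log\rho\bigr).
\]
```

    Superadditivity means that the information accessible per letter with collective decoding of length-n code words can exceed n times the best single-letter value, while always remaining bounded by the Holevo quantity.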

  19. Document Examination: Applications of Image Processing Systems.

    PubMed

    Kopainsky, B

    1989-12-01

    Dealing with images is a familiar business for an expert in questioned documents: microscopic, photographic, infrared, and other optical techniques generate images containing the information he or she is looking for. A recent method for extracting most of this information is digital image processing, ranging from the simple contrast and contour enhancement to the advanced restoration of blurred texts. When combined with a sophisticated physical imaging system, an image processing system has proven to be a powerful and fast tool for routine non-destructive scanning of suspect documents. This article reviews frequent applications, comprising techniques to increase legibility, two-dimensional spectroscopy (ink discrimination, alterations, erased entries, etc.), comparison techniques (stamps, typescript letters, photo substitution), and densitometry. Computerized comparison of handwriting is not included. Copyright © 1989 Central Police University.
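    At the "simple contrast enhancement" end of the spectrum the article mentions, plain histogram equalization illustrates the idea; this is a generic sketch on synthetic data, not a forensic-grade procedure.

```python
# Sketch: histogram equalization of an 8-bit grayscale scan.
import numpy as np

def equalize_histogram(img_u8):
    """img_u8: 2-D uint8 array; returns a contrast-stretched uint8 image."""
    hist = np.bincount(img_u8.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())    # normalize the CDF to 0..1
    return (cdf[img_u8] * 255).astype(np.uint8)          # remap intensities through the CDF

# Synthetic low-contrast "document" patch.
img = (np.random.default_rng(0).normal(120, 10, size=(64, 64))
       .clip(0, 255).astype(np.uint8))
print(img.std(), equalize_histogram(img).std())          # contrast (spread) increases
```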

  20. Development of tea extracts and chitosan composite films for active packaging materials.

    PubMed

    Peng, Yong; Wu, Yan; Li, Yunfei

    2013-08-01

    The effects of 0.5%, 1% and 2% green tea extracts (GTE) and black tea extracts (BTE) on the physical, structural and antioxidant properties of chitosan films were investigated. Results showed that the addition of tea extracts significantly decreased water vapour permeability and increased the antioxidant ability of films. The DPPH radical scavenging ability of GTE films was stronger than that of BTE films in all food simulants (0%, 20%, 75% and 95% ethanol). The equilibration time in different food simulants decreased with the increased ethanol concentration. DSC and FTIR spectra analysis indicated that there was strong interaction in film matrix, which could be reflected by the physical and mechanical properties of composite films. This study revealed that an active chitosan film could be obtained by incorporation of tea extracts, which may provide new formulation options for developing an antioxidant active packaging. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Physically incorporated extraction phase of solid-phase microextraction by sol-gel technology.

    PubMed

    Liu, Wenmin; Hu, Yuan; Zhao, Jinghong; Xu, Yuan; Guan, Yafeng

    2006-01-13

    A sol-gel method for the preparation of solid-phase microextraction (SPME) fiber was described and evaluated. The extraction phase of poly(dimethysiloxane) (PDMS) containing 3% vinyl group was physically incorporated into the sol-gel network without chemical bonding. The extraction phase itself is then partly crosslinked at 320 degrees C, forming an independent polymer network and can withstand desorption temperature of 290 degrees C. The headspace extraction of BTX by the fiber SPME was evaluated and the detection limit of o-xylene was down to 0.26 ng/l. Extraction and determination of organophosphorus pesticides (OPPs) in water, orange juice and red wine by the SPME-GC thermionic specified detector (TSD) was validated. Limits of detection of the method for OPPs were below 10 ng/l except methidathion. Relative standard deviations (RSDs) were in the range of 1-20% for pesticides being tested.

  2. Research on Optimal Observation Scale for Damaged Buildings after Earthquake Based on Optimal Feature Space

    NASA Astrophysics Data System (ADS)

    Chen, J.; Chen, W.; Dou, A.; Li, W.; Sun, Y.

    2018-04-01

    A new information extraction method for damaged buildings, rooted in an optimal feature space, is put forward on the basis of the traditional object-oriented method. In this new method, the ESP (estimate of scale parameter) tool is used to optimize the segmentation of the image. Then the distance matrix and minimum separation distance of all kinds of surface features are calculated through sample selection to find the optimal feature space, which is finally applied to extract damaged buildings from post-earthquake imagery. The overall extraction accuracy reaches 83.1%, with a kappa coefficient of 0.813. Compared with the traditional object-oriented method, the new information extraction method greatly improves extraction accuracy and efficiency, and shows good potential for wider use in the information extraction of damaged buildings. In addition, the new method can be applied to images of damaged buildings at different resolutions after an earthquake and then used to seek the optimal observation scale of damaged buildings through accuracy evaluation. The results suggest that the optimal observation scale of damaged buildings is between 1 m and 1.2 m, which provides a reference for future information extraction of damaged buildings.
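    A generic sketch of the separation measure such a method relies on follows: the minimum distance between class centroids in a candidate feature space, which can be compared across feature subsets. The class names and sample values are illustrative, not the paper's data.

```python
# Sketch: minimum class-separation distance in a candidate feature space.
import numpy as np
from itertools import combinations

def min_class_separation(samples_by_class):
    """samples_by_class: dict mapping class name -> (n_i, n_features) array."""
    means = {c: s.mean(axis=0) for c, s in samples_by_class.items()}
    return min(np.linalg.norm(means[a] - means[b])
               for a, b in combinations(means, 2))

rng = np.random.default_rng(0)
samples = {
    "collapsed_building": rng.normal(0.2, 0.05, size=(40, 3)),
    "intact_building":    rng.normal(0.6, 0.05, size=(40, 3)),
    "bare_soil":          rng.normal(0.9, 0.05, size=(40, 3)),
}
print(min_class_separation(samples))
# A larger minimum separation suggests a feature space in which the classes
# are easier to distinguish during extraction.
```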

  3. Analysis of Technique to Extract Data from the Web for Improved Performance

    NASA Astrophysics Data System (ADS)

    Gupta, Neena; Singh, Manish

    2010-11-01

    The World Wide Web is rapidly leading the world into a remarkable electronic era in which anyone can publish anything in electronic form and extract almost any information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts records from HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a method for data extraction, OBDE (Ontology-Based Data Extraction), which automatically extracts query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.

  4. Robust real-time extraction of respiratory signals from PET list-mode data.

    PubMed

    Salomon, Andre; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas

    2018-05-01

    Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesion detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting ("binning") of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method "Combined Local Motion Detection" (CLMD). It uses the Time-of-Flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using 7 measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion-affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similarly high correlation coefficients, which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard multi-core CPU and thus provides a robust and accurate approach enabling real-time processing capabilities using standard PC hardware. © 2018 Institute of Physics and Engineering in Medicine.
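    Below is a schematic sketch, not the CLMD implementation, of the core center-of-mass idea: bin back-projected TOF event positions into short time frames and track the mean axial position, which oscillates with respiration. The array names, frame length, and synthetic event stream are assumptions.

```python
# Sketch: respiratory surrogate from the axial centre of mass of TOF event positions.
import numpy as np

def center_of_mass_signal(event_times_s, event_z_mm, frame_s=0.5):
    """Return (frame centres, mean axial position per frame)."""
    t_edges = np.arange(event_times_s.min(), event_times_s.max() + frame_s, frame_s)
    frame_idx = np.digitize(event_times_s, t_edges) - 1
    signal = np.array([event_z_mm[frame_idx == i].mean() if np.any(frame_idx == i) else np.nan
                       for i in range(len(t_edges) - 1)])
    return 0.5 * (t_edges[:-1] + t_edges[1:]), signal

# Synthetic event stream with a 0.25 Hz breathing modulation of axial position.
rng = np.random.default_rng(0)
t = rng.uniform(0, 60, 200000)                                        # event times (s)
z = 300 + 8 * np.sin(2 * np.pi * 0.25 * t) + rng.normal(0, 40, t.size)  # axial positions (mm)
centres, sig = center_of_mass_signal(t, z)
# The raw signal would then be band-pass filtered around typical breathing
# frequencies (~0.1-0.5 Hz) and quality-checked before gating.
```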

  5. A new green chemistry method based on plant extracts to synthesize gold nanoparticles

    NASA Astrophysics Data System (ADS)

    Montes Castillo, Milka Odemariz

    Extraordinary chemical and physical properties exhibited by nanomaterials, as compared to their bulk counterparts, have made the area of nanotechnology a growing realm in the past three decades. It is the nanoscale size (from 1 to 100 nm) and the morphologies of nanomaterials that provide several properties and applications not possible for the same material in the bulk. Magnetic and optical properties, as well as surface reactivity are highly dependent on the size and morphology of the nanomaterial. Diverse nanomaterials are being widely used in molecular diagnostics as well as in medicine, electronic and optical devices. Among the most studied nanomaterials, gold nanoparticles are of special interest due to their multifunctional capabilities. For instance, spherical gold nanoparticles measuring 15-20 nm in diameter have been studied due to their insulin binding properties. Also, thiol functionalized gold nanoparticles between 5 and 30 nm are used in the detection of DNA. Thus, harnessing the shape and size of gold nanoparticles plays an important role in science and technology. The synthesis of gold nanoparticles via the reduction of gold salts, using citrate or other reducing agents, has been widely studied. In recent years, algae, fungi, bacteria, and living plants have been used to reduce trivalent gold (Au3+) to its zero oxidation state (Au 0) forming gold nanoparticles of different sizes and shapes. In addition, plant biomasses have also been studied for their gold-reducing power and nanoparticle formation. Although there is information about the synthesis of the gold nanoparticles by biologically based materials; to our knowledge, the study of the use of alfalfa extracts has not been reported. This innovation represents a significant improvement; that is an environmentally friendly method that does not use toxic chemicals. Also, the problem of extracting the formed gold nanoparticles from biomaterials is addressed in this research but still remains to be solved. In this work, secondary metabolites were extracted from alfalfa biomass in liquid phase by hot water, isopropanol, and methanol, and used to reduce tetrachloroaurate ion (AuCl4-) for the synthesis of gold nanoparticles. Biosyntheses of gold nanoparticles were performed by mixing 0.75, 1.5 and 3.0 mM Au3+ solutions with each one of the extracts at a ratio of 3:1 respectively, and shaken at room temperature for 1h. Resulting gold colloids were characterized by UV-Vis spectrophotometry and electron microscopy techniques, showing size and morphology dependency on the reaction conditions. Isopropanol alfalfa extracts reacted with Au 3+ produced gold nanoparticles with a size range of 15-60 nm. The most abundant were from 40-50 nm, and the morphologies found were polygons, decahedra and icosahedra. Methanol alfalfa extracts produced monodisperse 50 nm decahedral and icosahedral gold nanoparticles. Lastly, water alfalfa extracts reacted with Au3+ produced triangular, truncated triangular and hexagonal nanoplates with diameters ranging from 500 nm to 4 mum and thicknesses of ˜15-40 nm. The production of gold nanoplates by alfalfa extracts has never been reported before. In order to extract the formed gold nanoparticles from the biomass, physical and chemical extractions were used. For the chemical extraction, NaCl, dilute H2SO4, Triton X and DI water were tested. In these cases, the best results were obtained with DI water, followed by NaCl. The extracted nanoparticles had an absorption band at about 539 nm. 
For the physical extractions, alfalfa biomass containing gold nanoparticles was exposed to 400°C, 500°C, 550°C and 600°C to recover the gold nanoparticles. X-ray diffractograms taken after pyrolysis of the biomass showed that the recovered nanoparticles kept their crystal structure.

  6. Insufficient evidence for the use of a physical examination to detect maltreatment in children without prior suspicion: a systematic review

    PubMed Central

    2013-01-01

    Background Although it is often performed in clinical practice, the diagnostic value of a screening physical examination to detect maltreatment in children without prior suspicion has not been reviewed. This article aims to evaluate the diagnostic value of a complete physical examination as a screening instrument to detect maltreatment in children without prior suspicion. Methods We systematically searched the databases of MEDLINE, EMBASE, PsychINFO, CINAHL, and ERIC, using a sensitive search strategy. Studies that i) presented medical findings of a complete physical examination for screening purposes in children 0–18 years, ii) specifically recorded the presence or absence of signs of child maltreatment, and iii) recorded child maltreatment confirmed by a reference standard, were included. Two reviewers independently performed study selection, data extraction, and quality appraisal using the QUADAS-2 tool. Results The search yielded 4,499 titles, of which three studies met the eligibility criteria. The prevalence of confirmed signs of maltreatment during screening physical examination varied between 0.8% and 13.5%. The designs of the studies were inadequate to assess the diagnostic accuracy of a screening physical examination for child maltreatment. Conclusions Because of the lack of informative studies, we could not draw conclusions about the diagnostic value of a screening physical examination in children without prior suspicion of child maltreatment. PMID:24313949

  7. Insufficient evidence for the use of a physical examination to detect maltreatment in children without prior suspicion: a systematic review.

    PubMed

    Hoytema van Konijnenburg, Eva Mm; Teeuw, Arianne H; Sieswerda-Hoogendoorn, Tessa; Leenders, Arnold G E; van der Lee, Johanna H

    2013-12-06

    Although it is often performed in clinical practice, the diagnostic value of a screening physical examination to detect maltreatment in children without prior suspicion has not been reviewed. This article aims to evaluate the diagnostic value of a complete physical examination as a screening instrument to detect maltreatment in children without prior suspicion. We systematically searched the databases of MEDLINE, EMBASE, PsychINFO, CINAHL, and ERIC, using a sensitive search strategy. Studies that i) presented medical findings of a complete physical examination for screening purposes in children 0-18 years, ii) specifically recorded the presence or absence of signs of child maltreatment, and iii) recorded child maltreatment confirmed by a reference standard, were included. Two reviewers independently performed study selection, data extraction, and quality appraisal using the QUADAS-2 tool. The search yielded 4,499 titles, of which three studies met the eligibility criteria. The prevalence of confirmed signs of maltreatment during screening physical examination varied between 0.8% and 13.5%. The designs of the studies were inadequate to assess the diagnostic accuracy of a screening physical examination for child maltreatment. Because of the lack of informative studies, we could not draw conclusions about the diagnostic value of a screening physical examination in children without prior suspicion of child maltreatment.

  8. A randomized control trial comparing the visual and verbal communication methods for reducing fear and anxiety during tooth extraction.

    PubMed

    Gazal, Giath; Tola, Ahmed W; Fareed, Wamiq M; Alnazzawi, Ahmad A; Zafar, Muhammad S

    2016-04-01

    To evaluate the value of visual information for reducing the level of dental fear and anxiety in patients undergoing tooth extraction under local anesthesia (LA). A total of 64 patients were randomly allocated to one of the study groups after reading the information sheet and signing the formal consent. Patients in the control group received only verbal information and routine warnings; patients in the study group were shown a tooth extraction video. The level of dental fear and anxiety was reported by the patients on standard 100 mm visual analogue scales (VAS), anchored at "no dental fear and anxiety" (0 mm) and "severe dental fear and anxiety" (100 mm). Dental fear and anxiety were evaluated pre-operatively, after the visual/verbal information, and post-extraction. There was a significant difference between the mean dental fear and anxiety scores of the two groups post-extraction (p < 0.05): patients in the tooth extraction video group were more comfortable after dental extraction than those in the verbal information and routine warning group. In the tooth extraction video group, there were significant decreases in dental fear and anxiety scores between the pre-operative assessment and both the post-video information and postoperative assessments (p < 0.05). Younger patients recorded higher dental fear and anxiety scores than older ones (p < 0.05). Dental fear and anxiety associated with dental extractions under local anesthesia can be reduced by showing a tooth extraction video to the patients preoperatively.

  9. Design criteria for extraction with chemical reaction and liquid membrane permeation

    NASA Technical Reports Server (NTRS)

    Bart, H. J.; Bauer, A.; Lorbach, D.; Marr, R.

    1988-01-01

    The design criteria for heterogeneous chemical reactions in liquid/liquid systems formally correspond to those of classical physical extraction. More complex models are presented which describe the material exchange at the individual droplets in an extraction with chemical reaction and in liquid membrane permeation.

  10. Information Extraction from Unstructured Text for the Biodefense Knowledge Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samatova, N F; Park, B; Krishnamurthy, R

    2005-04-29

    The Bio-Encyclopedia at the Biodefense Knowledge Center (BKC) is being constructed to allow early detection of emerging biological threats to homeland security. It requires highly structured information extracted from a variety of data sources. However, the quantity of new and vital information available from everyday sources cannot be assimilated by hand, and therefore reliable high-throughput information extraction techniques are much anticipated. In support of the BKC, Lawrence Livermore National Laboratory and Oak Ridge National Laboratory, together with the University of Utah, are developing an information extraction system built around the bioterrorism domain. This paper reports two important pieces of our effort integrated in the system: key phrase extraction and semantic tagging. Whereas the two key phrase extraction technologies developed during the course of the project help identify relevant texts, our state-of-the-art semantic tagging system can pinpoint phrases related to emerging biological threats. We are also enhancing and tailoring the Bio-Encyclopedia by augmenting semantic dictionaries and extracting details of important events, such as suspected disease outbreaks. Some of these technologies have already been applied to large corpora of free text sources vital to the BKC mission, including ProMED-mail, PubMed abstracts, and the DHS's Information Analysis and Infrastructure Protection (IAIP) news clippings. In order to address the challenges involved in incorporating such large amounts of unstructured text, the overall system is focused on precise extraction of the most relevant information for inclusion in the BKC.

  11. Monitoring evolving urban cluster systems using DMSP/OLS nighttime light data: a case study of the Yangtze River Delta region, China

    NASA Astrophysics Data System (ADS)

    Wang, Zhao; Yang, Shan; Wang, Shuguang; Shen, Yan

    2017-10-01

    The assessment of the dynamic urban structure has long been hampered by a lack of timely and accurate spatial information, which has hindered the measurement of structural continuity at the macroscale. Defense Meteorological Satellite Program's Operational Linescan System (DMSP/OLS) nighttime light (NTL) data provide an ideal source for urban information detection, with a long time span, short time intervals, and wide coverage. In this study, we extracted the physical boundaries of urban clusters from corrected NTL images and quantitatively analyzed the structure of the urban cluster system based on rank-size distribution, spatial metrics, and the Mann-Kendall trend test. Two levels of urban cluster systems in the Yangtze River Delta region (YRDR) were examined. We found that (1) in the entire YRDR, the urban cluster system showed a periodic process, with a significant trend toward even distribution before 2007 but an unequal growth pattern after 2007, and (2) at the metropolitan level, vast disparities exist among the four metropolitan areas in the fluctuations of the Pareto exponent, the speed of cluster expansion, and the dominance of the core cluster. The results suggest that the urban cluster information extracted from NTL data effectively reflects the evolving nature of regional urbanization, which in turn can aid in the planning of cities and help achieve more sustainable regional development.
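
    The rank-size (Pareto) analysis and Mann-Kendall trend test used in the study above are standard computations that are easy to reproduce. The following Python sketch illustrates both on made-up numbers; the cluster areas, the yearly Pareto-exponent series, and all variable names are hypothetical and are not taken from the study.

```python
import numpy as np
from scipy import stats

def pareto_exponent(sizes):
    """Estimate the Pareto (rank-size) exponent by regressing
    log(rank) on log(size), i.e. rank ~ size^(-q)."""
    sizes = np.sort(np.asarray(sizes, dtype=float))[::-1]
    ranks = np.arange(1, len(sizes) + 1)
    slope, intercept, r, p, se = stats.linregress(np.log(sizes), np.log(ranks))
    return -slope  # Pareto exponent q

def mann_kendall(x):
    """Plain Mann-Kendall trend test (no tie correction); returns S and z."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Illustrative only: cluster areas (km^2) for one year and a yearly exponent series.
cluster_areas = [850, 420, 300, 210, 150, 90, 70, 55, 40, 30]
yearly_q = [1.02, 1.05, 1.08, 1.10, 1.07, 1.04, 1.01]

print("Pareto exponent:", round(pareto_exponent(cluster_areas), 3))
print("Mann-Kendall S, z:", mann_kendall(yearly_q))
```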

  12. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    ERIC Educational Resources Information Center

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  13. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    ERIC Educational Resources Information Center

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  14. Rare tradition of the folk medicinal use of Aconitum spp. is kept alive in Solčavsko, Slovenia.

    PubMed

    Povšnar, Marija; Koželj, Gordana; Kreft, Samo; Lumpert, Mateja

    2017-08-08

    Aconitum species are poisonous plants that have been used in Western medicine for centuries. In the nineteenth century, these plants were part of official and folk medicine in the Slovenian territory. According to current ethnobotanical studies, folk use of Aconitum species is rarely reported in Europe. The purpose of this study was to research the folk medicinal use of Aconitum species in Solčavsko, Slovenia; to collect recipes for the preparation of Aconitum spp., indications for use, and dosing; and to investigate whether the folk use of aconite was connected to poisoning incidents. In Solčavsko, a remote alpine area in northern Slovenia, we performed semi-structured interviews with 19 informants, as well as with 3 informants in Luče and two retired physicians who had worked in that area. Three samples of homemade ethanolic extracts were obtained from informants, and the concentration of aconitine was measured. In addition, four extracts were prepared according to the reported recipes. All 22 informants knew of Aconitum spp. and their therapeutic use, and 5 of them provided a detailed description of the preparation and use of "voukuc", an ethanolic extract made from aconite roots. Seven informants were unable to describe the preparation in detail, since they knew of the extract only from the narration of others or remembered it from childhood. Most likely, the roots of Aconitum tauricum and Aconitum napellus were used for the preparation of the extract, and the solvent was homemade spirits. Four informants kept the extract at home; two extracts had been prepared recently (1998 and 2015). Three extracts were analyzed, and 2 contained aconitine. Informants reported many indications for the use of the extract; it was used internally and, in some cases, externally as well. The extract was also used in animals. The extract was measured in drops, but the number of drops differed among the informants. The informants reported nine poisonings with Aconitum spp., but none of them occurred as a result of medicinal use of the extract. In this study, we determined that folk knowledge of the medicinal use of Aconitum spp. is still present in Solčavsko, but Aconitum preparations are used only infrequently.

  15. New Method for Knowledge Management Focused on Communication Pattern in Product Development

    NASA Astrophysics Data System (ADS)

    Noguchi, Takashi; Shiba, Hajime

    In the field of manufacturing, the importance of utilizing knowledge and know-how has been growing. To meet this background, there is a need for new methods to efficiently accumulate and extract effective knowledge and know-how. To facilitate the extraction of knowledge and know-how needed by engineers, we first defined business process information which includes schedule/progress information, document data, information about communication among parties concerned, and information which corresponds to these three types of information. Based on our definitions, we proposed an IT system (FlexPIM: Flexible and collaborative Process Information Management) to register and accumulate business process information with the least effort. In order to efficiently extract effective information from huge volumes of accumulated business process information, focusing attention on “actions” and communication patterns, we propose a new extraction method using communication patterns. And the validity of this method has been verified for some communication patterns.

  16. Effect of hydroalcoholic extract of Aegle marmelos fruit on radical scavenging activity and exercise-endurance capacity in mice.

    PubMed

    Nallamuthu, Ilaiyaraja; Tamatam, Anand; Khanum, Farhath

    2014-05-01

    Aegle marmelos L. Corr (Rutaceae) is an important Indian Ayurvedic medicinal plant used for the treatment of various ailments; however, little information is available on the anti-fatigue properties of its fruit. This study evaluated the physical endurance and exercise-induced oxidative stress modulating properties of A. marmelos fruit in mice. Radical scavenging activity of the fruit hydroalcoholic extract was evaluated using in vitro systems. The extract was further evaluated for its endurance-enhancing properties at three oral doses (100, 200 and 400 mg/kg b.wt) in BALB/c mice for 21 d using a swimming test. The extract exhibited significant scavenging activity against DPPH (IC₅₀, 351 ± 37 µg/ml) and ABTS radicals (IC₅₀, 228 ± 25 µg/ml), with a polyphenol content of 95 µg/mg extract. It also inhibited AAPH radical-induced oxidation of biomolecules such as BSA protein (63%), plasmid DNA (81%) and lipids (80.5%). Administration of the extract increased the duration of swimming time to exhaustion by 23.4 and 47.5% for the medium and higher doses, respectively. The extract significantly normalized the fatigue-related biochemical parameters, down-regulated the swim stress-induced over-expression of heat shock protein-70, and up-regulated the skeletal muscle metabolic regulators (GLUT-4 and AMPK1-α) by 2- and 3-fold, respectively, at the higher dose in muscle tissues. Our study demonstrates the anti-fatigue properties of A. marmelos fruit, most probably manifested by delaying the accumulation of serum lactic acid, increasing fat utilization and up-regulating the skeletal muscle metabolic regulators.

  17. Information Extraction Using Controlled English to Support Knowledge-Sharing and Decision-Making

    DTIC Science & Technology

    2012-06-01

    ...terminology or language variants. CE-based information extraction will greatly facilitate the processes in the cognitive and social domains that enable forces... A processor is run to turn the atomic CE into a more "stylistically felicitous" CE, using techniques such as aggregating all information about an entity...

  18. Hot Water Extract of Leather Carp (Cyprinus carpio nudus) Improves Exercise Performance in Mice

    PubMed Central

    Lee, Gong-Hyeon; Harwanto, Dicky; Park, Sun-Mee; Choi, Jae-Suk; Kim, Mi-Ryung; Hong, Yong-Ki

    2015-01-01

    The hot water extract of leather carp (Cyprinus carpio nudus) has been used as a nourishing tonic soup and as an aid for recovery from physical fatigue. In this study, we investigated the effect of leather carp extract on exercise performance in mice. Swimming endurance and forelimb grip strength were assessed following oral administration of the extract (once per day for 7 days) at a dose of 0.5 mg/10 μL/g body weight. After 7 days, mice given the leather carp extract had significantly greater swimming endurance [105±18 s (P<0.05); 52% longer than day 0] and forelimb grip strength [1.18±0.05 Newton (P<0.01); 17% greater than day 0]. The extract increased muscle mass, but had little effect on body weight. Following the swimming exercise, blood glucose, glutathione peroxidase, and superoxide dismutase levels in extract-fed mice were significantly higher (145%, 131%, and 106%, respectively) than in the saline control group. Blood levels of high-density lipoprotein cholesterol were also significantly increased (128%) in mice given the extract compared to the controls. These results suggest that leather carp extract can improve physical exercise performance and prevent oxidative stress caused by exhaustive workouts. PMID:26770911

  19. Real-Time Information Extraction from Big Data

    DTIC Science & Technology

    2015-10-01

    Institute for Defense Analyses. Real-Time Information Extraction from Big Data. Jagdeep Shah, Robert M. Rolfe, Francisco L. Loaiza-Lemos. October 7, 2015. Abstract: We are drowning under the 3 Vs (volume, velocity and variety) of big data. Real-time information extraction from big...

  20. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC MATERIALS IN UNSATURATED GEOLOGICAL MATERIAL

    EPA Science Inventory

    This report describes the formulation, numerical development, and use of a multiphase, multicomponent, biodegradation model designed to simulate physical, chemical, and biological interactions occurring primarily in field scale soil vapor extraction (SVE) and bioventing (B...

  1. Extraction of Data from a Hospital Information System to Perform Process Mining.

    PubMed

    Neira, Ricardo Alfredo Quintano; de Vries, Gert-Jan; Caffarel, Jennifer; Stretton, Erin

    2017-01-01

    The aim of this work is to share our experience with extracting relevant data from a hospital information system in preparation for a research study using process mining techniques. The steps performed were: research definition, mapping the normative processes, identification of the table and field names in the database, and extraction of the data. We then offer lessons learned during the data extraction phase. Any errors made in the extraction phase will propagate and have implications for subsequent analyses. Thus, it is essential to take the time needed and devote sufficient attention to detail to perform all activities with the goal of ensuring high quality of the extracted data. We hope this work will be informative for other researchers planning and executing data extraction for process mining research studies.
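
    To make the extraction step above concrete, the sketch below flattens two hypothetical hospital-database tables into the case/activity/timestamp event-log layout that most process mining tools expect. The table and column names are assumptions chosen for illustration; they are not the schema of the hospital information system used by the authors.

```python
import pandas as pd

# Hypothetical extracts from a hospital information system (illustrative only).
admissions = pd.DataFrame({
    "patient_id": [1, 2],
    "admit_time": ["2017-01-03 08:10", "2017-01-04 09:30"],
})
lab_orders = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "order_time": ["2017-01-03 09:00", "2017-01-03 14:20", "2017-01-04 10:05"],
    "test": ["CBC", "CRP", "CBC"],
})

# Normalise each table into (case_id, activity, timestamp) rows.
events = pd.concat([
    admissions.rename(columns={"patient_id": "case_id", "admit_time": "timestamp"})
              .assign(activity="admission")[["case_id", "activity", "timestamp"]],
    lab_orders.rename(columns={"patient_id": "case_id", "order_time": "timestamp"})
              .assign(activity=lambda d: "lab order: " + d["test"])
              [["case_id", "activity", "timestamp"]],
])
events["timestamp"] = pd.to_datetime(events["timestamp"])
event_log = events.sort_values(["case_id", "timestamp"]).reset_index(drop=True)
print(event_log)
```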

  2. Data extraction from electronic health records (EHRs) for quality measurement of the physical therapy process: comparison between EHR data and survey data.

    PubMed

    Scholte, Marijn; van Dulmen, Simone A; Neeleman-Van der Steen, Catherina W M; van der Wees, Philip J; Nijhuis-van der Sanden, Maria W G; Braspenning, Jozé

    2016-11-08

    With the emergence of the electronic health record (EHR) as a pervasive healthcare information technology, new opportunities and challenges arise for the use of clinical data for quality measurements, with respect to data quality, data availability and comparability. The objective of this study was to test whether data extracted from electronic health records (EHRs) were of comparable quality to survey data for the calculation of quality indicators. Data from surveys describing patient cases, filled out by physiotherapists in 2009-2010, were used to calculate scores on eight quality indicators (QIs) to measure the quality of physiotherapy care. In 2011, data were extracted directly from EHRs. The data collection methods were evaluated for comparability, and EHR data were compared to survey data on completeness and correctness. Five of the eight QIs could be extracted from the EHRs. Three were omitted from the indicator set, as they proved too difficult to extract from the EHRs. Another QI proved incomparable due to errors in the extraction software of some of the EHRs. Three out of four comparable QIs performed better (p < 0.001) in EHR data on completeness. EHR data also proved to be correct; the relative change in indicator scores between EHR and survey data was small (<5 %) in three out of four QIs. The data quality of EHRs was sufficient for the calculation of QIs, although comparability to survey data was problematic. Standardization is needed, not only to compare different data collection methods properly, but also to compare practices with different EHRs. EHRs can also store narrative data, but natural language processing tools are needed to quantify these text fields; such developments can narrow the comparability gap between scoring QIs based on EHR data and based on survey data. EHRs have the potential to provide real-time feedback to professionals and quality measurements for research, but more effort is needed to create unambiguous and uniform information and to unlock written text in a standardized manner.

  3. Place in Perspective: Extracting Online Information about Points of Interest

    NASA Astrophysics Data System (ADS)

    Alves, Ana O.; Pereira, Francisco C.; Rodrigues, Filipe; Oliveirinha, João

    During the last few years, the amount of online descriptive information about places has reached reasonable dimensions for many cities in the world. Since such information is mostly natural language text, Information Extraction techniques are needed to obtain the meaning of places that underlies these massive amounts of commonsense and user-made sources. In this article, we show how we automatically label places using Information Extraction techniques applied to online resources such as Wikipedia, Yellow Pages and Yahoo!.

  4. Residual and Destroyed Accessible Information after Measurements

    NASA Astrophysics Data System (ADS)

    Han, Rui; Leuchs, Gerd; Grassl, Markus

    2018-04-01

    When quantum states are used to send classical information, the receiver performs a measurement on the signal states. The amount of information extracted is often not optimal due to the receiver's measurement scheme and experimental apparatus. For quantum nondemolition measurements, there is potentially some residual information in the postmeasurement state, while part of the information has been extracted and the rest is destroyed. Here, we propose a framework to characterize a quantum measurement by how much information it extracts and destroys, and how much information it leaves in the residual postmeasurement state. The concept is illustrated for several receivers discriminating coherent states.

  5. Question analysis for Indonesian comparative question

    NASA Astrophysics Data System (ADS)

    Saelan, A.; Purwarianti, A.; Widyantoro, D. H.

    2017-01-01

    Information seeking is one of today's basic human needs, and comparing things using a search engine surely takes more time than searching for a single thing. In this paper, we analyzed comparative questions for a comparative question answering system. A comparative question is a question that compares two or more entities. We grouped comparative questions into 5 types: selection between mentioned entities, selection between unmentioned entities, selection between any entity, comparison, and yes or no question. We then extracted 4 types of information from comparative questions: entity, aspect, comparison, and constraint. We built classifiers for the classification task and the information extraction task. The features used for the classification task are bag of words, whereas for information extraction we used the word's lexical form, the lexical forms of the 2 previous and 2 following words, and the previous label as features. We tried 2 scenarios: classification first and extraction first. For classification first, we used the classification result as a feature for extraction; for extraction first, we used the extraction results as features for classification. We found that the results were better when extraction was done before classification. For the extraction task, SMO gave the best result (88.78%), while for classification it was better to use naïve Bayes (82.35%).
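
    The "extraction first" scenario described above can be illustrated with a minimal sketch: a toy dictionary-based extractor stands in for the lexical-feature sequence labeler, and its output tags are appended to the question text before a bag-of-words naïve Bayes classifier is trained. The English example questions, labels, and the scikit-learn pipeline are illustrative assumptions, not the Indonesian data set or the SMO-based extractor used in the paper.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy English stand-ins for Indonesian comparative questions (illustrative only).
questions = [
    "which is cheaper phone A or phone B",
    "is laptop X better than laptop Y",
    "compare camera A and camera B",
    "which city is the hottest",
]
labels = ["selection_mentioned", "yes_no", "comparison", "selection_any"]

def toy_extract(question):
    """Very small dictionary-based stand-in for the extraction step:
    tags aspect words so they can feed the classifier as extra tokens."""
    aspects = {"cheaper": "ASPECT_price", "better": "ASPECT_quality",
               "camera": "ASPECT_camera", "hottest": "ASPECT_temperature"}
    tags = [tag for word, tag in aspects.items() if word in question]
    return question + " " + " ".join(tags)

# "Extraction first": append extraction tags to the text before bag-of-words.
augmented = [toy_extract(q) for q in questions]
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(augmented, labels)
print(clf.predict([toy_extract("which is cheaper tablet C or tablet D")]))
```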

  6. Semantic Preview Benefit in English: Individual Differences in the Extraction and Use of Parafoveal Semantic Information

    ERIC Educational Resources Information Center

    Veldre, Aaron; Andrews, Sally

    2016-01-01

    Although there is robust evidence that skilled readers of English extract and use orthographic and phonological information from the parafovea to facilitate word identification, semantic preview benefits have been elusive. We sought to establish whether individual differences in the extraction and/or use of parafoveal semantic information could…

  7. Perceived barriers and facilitators to physical activity for children with disability: a systematic review.

    PubMed

    Shields, Nora; Synnot, Anneliese Jane; Barr, Megan

    2012-11-01

    The aim of this systematic review was to investigate the perceived barriers and facilitators to physical activity among children with disability. 10 electronic databases were searched from the earliest time available to September 2010 to identify relevant articles. Articles were included if they examined the barriers or facilitators to physical activity for children with disability and were written in English. Articles were excluded if they included children with an acute, transient or chronic medical condition, examined sedentary leisure activities, or societal participation in general. Two reviewers independently assessed the search yields, extracted the data and assessed trial quality. Data were analysed descriptively. 14 articles met the inclusion criteria. Barriers included lack of knowledge and skills, the child's preferences, fear, parental behaviour, negative attitudes to disability, inadequate facilities, lack of transport, programmes and staff capacity, and cost. Facilitators included the child's desire to be active, practising skills, involvement of peers, family support, accessible facilities, proximity of location, better opportunities, skilled staff and information. Personal, social, environmental, and policy and programme-related barriers and facilitators influence the amount of activity children with disability undertake. The barriers to physical activity have been studied more comprehensively than the facilitators.

  8. Thermal machines beyond the weak coupling regime

    NASA Astrophysics Data System (ADS)

    Gallego, R.; Riera, A.; Eisert, J.

    2014-12-01

    How much work can be extracted from a heat bath using a thermal machine? The study of this question has a very long history in statistical physics in the weak-coupling limit, when applied to macroscopic systems. However, the assumption that thermal heat baths remain uncorrelated with associated physical systems is less reasonable on the nano-scale and in the quantum setting. In this work, we establish a framework of work extraction in the presence of quantum correlations. We show in a mathematically rigorous and quantitative fashion that quantum correlations and entanglement emerge as limitations to work extraction compared to what would be allowed by the second law of thermodynamics. At the heart of the approach are operations that capture the naturally non-equilibrium dynamics encountered when putting physical systems into contact with each other. We discuss various limits that relate to known results and put our work into the context of approaches to finite-time quantum thermodynamics.

  9. Phases and interfaces from real space atomically resolved data: Physics-based deep data image analysis

    DOE PAGES

    Vasudevan, Rama K.; Ziatdinov, Maxim; Jesse, Stephen; ...

    2016-08-12

    Advances in electron and scanning probe microscopies have led to a wealth of atomically resolved structural and electronic data, often with ~1–10 pm precision. However, knowledge generation from such data requires the development of a physics-based robust framework to link the observed structures to macroscopic chemical and physical descriptors, including single phase regions, order parameter fields, interfaces, and structural and topological defects. Here, we develop an approach based on a synergy of sliding window Fourier transform to capture the local analog of traditional structure factors combined with blind linear unmixing of the resultant 4D data set. This deep data analysis is ideally matched to the underlying physics of the problem and allows reconstruction of the a priori unknown structure factors of individual components and their spatial localization. We demonstrate the principles of this approach using a synthetic data set and further apply it for extracting chemical and physically relevant information from electron and scanning tunneling microscopy data. Furthermore, this method promises to dramatically speed up crystallographic analysis in atomically resolved data, paving the road toward automatic local structure–property determinations in crystalline and quasi-ordered systems, as well as systems with competing structural and electronic order parameters.
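
    A minimal sketch of the sliding window Fourier transform plus blind linear unmixing idea is given below, assuming a synthetic 2D image with two regions of different periodicity. Non-negative matrix factorization is used here as one convenient choice of linear unmixing; it, along with the window size, step, and variable names, is an assumption for illustration and may differ from the unmixing algorithm used by the authors.

```python
import numpy as np
from sklearn.decomposition import NMF

def sliding_fft_features(image, win=16, step=8):
    """Local |FFT| of overlapping windows -> (n_windows, win*win) matrix."""
    feats, positions = [], []
    for i in range(0, image.shape[0] - win + 1, step):
        for j in range(0, image.shape[1] - win + 1, step):
            patch = image[i:i + win, j:j + win]
            mag = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
            feats.append(mag.ravel())
            positions.append((i, j))
    return np.array(feats), positions

# Synthetic "atomically resolved" image: two regions with different periodicity.
x = np.arange(128)
X, Y = np.meshgrid(x, x)
image = np.where(X < 64, np.sin(2 * np.pi * X / 4), np.sin(2 * np.pi * (X + Y) / 6))

features, positions = sliding_fft_features(image)
unmix = NMF(n_components=2, init="nndsvda", max_iter=500)
abundances = unmix.fit_transform(features)      # per-window component loadings
endmembers = unmix.components_                  # component "structure factors"
print(abundances.shape, endmembers.shape)
```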

  10. Deep learning in color: towards automated quark/gluon jet discrimination

    DOE PAGES

    Komiske, Patrick T.; Metodiev, Eric M.; Schwartz, Matthew D.

    2017-01-25

    Artificial intelligence offers the potential to automate challenging data-processing tasks in collider physics. Here, to establish its prospects, we explore to what extent deep learning with convolutional neural networks can discriminate quark and gluon jets better than observables designed by physicists. Our approach builds upon the paradigm that a jet can be treated as an image, with intensity given by the local calorimeter deposits. We supplement this construction by adding color to the images, with red, green and blue intensities given by the transverse momentum in charged particles, transverse momentum in neutral particles, and pixel-level charged particle counts. Overall, the deep networks match or outperform traditional jet variables. We also find that, while various simulations produce different quark and gluon jets, the neural networks are surprisingly insensitive to these differences, similar to traditional observables. This suggests that the networks can extract robust physical information from imperfect simulations.

  11. The effects of workplace physical activity interventions in men: a systematic review.

    PubMed

    Wong, Jason Y L; Gilson, Nicholas D; van Uffelen, Jannique G Z; Brown, Wendy J

    2012-07-01

    The workplace is cited as a promising setting for physical activity (PA) promotion, but workplace PA interventions tend not to specifically target men. The aim of this article was to review the literature on workplace PA interventions for men and to identify key issues for future intervention development. Articles targeting PA at the workplace were located through a structured database search, and information on intervention strategies and PA outcomes was extracted. Only 13 (10.5%) of the studies reviewed focused on men, of which 5 showed significant increases in PA. These studies used generic, multicomponent, health promotion strategies with a variety of timeframes, self-report PA measures, and PA outcomes. The systematic review identified that evidence on the effectiveness of workplace PA interventions for men is equivocal and highlighted methodological concerns. Future research should use reliable and valid measures of PA and interventions that focus specifically on men's needs and PA preferences.

  12. Deep learning in color: towards automated quark/gluon jet discrimination

    NASA Astrophysics Data System (ADS)

    Komiske, Patrick T.; Metodiev, Eric M.; Schwartz, Matthew D.

    2017-01-01

    Artificial intelligence offers the potential to automate challenging data-processing tasks in collider physics. To establish its prospects, we explore to what extent deep learning with convolutional neural networks can discriminate quark and gluon jets better than observables designed by physicists. Our approach builds upon the paradigm that a jet can be treated as an image, with intensity given by the local calorimeter deposits. We supplement this construction by adding color to the images, with red, green and blue intensities given by the transverse momentum in charged particles, transverse momentum in neutral particles, and pixel-level charged particle counts. Overall, the deep networks match or outperform traditional jet variables. We also find that, while various simulations produce different quark and gluon jets, the neural networks are surprisingly insensitive to these differences, similar to traditional observables. This suggests that the networks can extract robust physical information from imperfect simulations.
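
    The jet-as-image construction with three color channels lends itself to a short sketch. The code below builds a 3-channel image (charged-particle pT, neutral-particle pT, charged-particle counts) from a toy particle list and feeds it to a small convolutional network; the binning, image extent, and network architecture are illustrative assumptions, not the configuration reported in the paper.

```python
import numpy as np
import torch
import torch.nn as nn

def jet_image(particles, bins=33, extent=0.8):
    """Build a 3-channel (charged pT, neutral pT, charged multiplicity) image
    from (eta, phi, pT, is_charged) tuples, following the 'jet as image' idea."""
    edges = np.linspace(-extent, extent, bins + 1)
    img = np.zeros((3, bins, bins), dtype=np.float32)
    for eta, phi, pt, charged in particles:
        ch = 0 if charged else 1
        h, _, _ = np.histogram2d([eta], [phi], bins=[edges, edges], weights=[pt])
        img[ch] += h.astype(np.float32)
        if charged:
            img[2] += (h > 0).astype(np.float32)  # pixel-level charged counts
    return img

# A tiny CNN classifier; the architecture is illustrative, not the paper's network.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 2),   # quark vs gluon logits
)

fake_jet = jet_image([(0.1, 0.2, 30.0, True), (-0.3, 0.1, 12.0, False)])
logits = model(torch.from_numpy(fake_jet).unsqueeze(0))
print(logits.shape)  # torch.Size([1, 2])
```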

  13. The nitrogen-vacancy colour centre in diamond

    NASA Astrophysics Data System (ADS)

    Doherty, Marcus W.; Manson, Neil B.; Delaney, Paul; Jelezko, Fedor; Wrachtrup, Jörg; Hollenberg, Lloyd C. L.

    2013-07-01

    The nitrogen-vacancy (NV) colour centre in diamond is an important physical system for emergent quantum technologies, including quantum metrology, information processing and communications, as well as for various nanotechnologies, such as biological and sub-diffraction limit imaging, and for tests of entanglement in quantum mechanics. Given this array of existing and potential applications and the almost 50 years of NV research, one would expect that the physics of the centre is well understood, however, the study of the NV centre has proved challenging, with many early assertions now believed false and many remaining issues yet to be resolved. This review represents the first time that the key empirical and ab initio results have been extracted from the extensive NV literature and assembled into one consistent picture of the current understanding of the centre. As a result, the key unresolved issues concerning the NV centre are identified and the possible avenues for their resolution are examined.

  14. Deep learning in color: towards automated quark/gluon jet discrimination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komiske, Patrick T.; Metodiev, Eric M.; Schwartz, Matthew D.

    Artificial intelligence offers the potential to automate challenging data-processing tasks in collider physics. Here, to establish its prospects, we explore to what extent deep learning with convolutional neural networks can discriminate quark and gluon jets better than observables designed by physicists. Our approach builds upon the paradigm that a jet can be treated as an image, with intensity given by the local calorimeter deposits. We supplement this construction by adding color to the images, with red, green and blue intensities given by the transverse momentum in charged particles, transverse momentum in neutral particles, and pixel-level charged particle counts. Overall, the deep networks match or outperform traditional jet variables. We also find that, while various simulations produce different quark and gluon jets, the neural networks are surprisingly insensitive to these differences, similar to traditional observables. This suggests that the networks can extract robust physical information from imperfect simulations.

  15. Solar physics applications of computer graphics and image processing

    NASA Technical Reports Server (NTRS)

    Altschuler, M. D.

    1985-01-01

    Computer graphics devices coupled with computers and carefully developed software provide new opportunities to achieve insight into the geometry and time evolution of scalar, vector, and tensor fields and to extract more information quickly and cheaply from the same image data. Two or more different fields which overlay in space can be calculated from the data (and the physics), then displayed from any perspective, and compared visually. The maximum regions of one field can be compared with the gradients of another. Time changing fields can also be compared. Images can be added, subtracted, transformed, noise filtered, frequency filtered, contrast enhanced, color coded, enlarged, compressed, parameterized, and histogrammed, in whole or section by section. Today it is possible to process multiple digital images to reveal spatial and temporal correlations and cross correlations. Data from different observatories taken at different times can be processed, interpolated, and transformed to a common coordinate system.
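
    Most of the image operations listed above (subtraction, noise and frequency filtering, contrast enhancement, histogramming) map directly onto a few lines of array code. The sketch below shows them on synthetic data with NumPy and SciPy; the images, filter sizes, and percentile stretch are arbitrary choices for illustration.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
# Two synthetic co-registered "solar" images (illustrative only).
img_a = rng.normal(100, 5, size=(256, 256))
img_b = img_a + rng.normal(0, 2, size=(256, 256))

difference = img_b - img_a                          # image subtraction
denoised = ndimage.median_filter(img_b, size=3)     # noise filtering
low_pass = ndimage.gaussian_filter(img_b, sigma=2)  # frequency (blur) filtering

# Simple linear contrast enhancement to the 1st-99th percentile range.
lo, hi = np.percentile(denoised, [1, 99])
stretched = np.clip((denoised - lo) / (hi - lo), 0, 1)

hist, edges = np.histogram(stretched, bins=64)      # histogramming
print(difference.std(), hist.sum())
```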

  16. Forensic Identification of Gender from Fingerprints.

    PubMed

    Huynh, Crystal; Brunelle, Erica; Halámková, Lenka; Agudelo, Juliana; Halámek, Jan

    2015-11-17

    In the past century, forensic investigators have universally accepted fingerprinting as a reliable identification method, which relies mainly on pictorial comparisons. Despite developments to software systems that increase the probability and speed of identification, there has been limited success in efforts to move away from the discipline's absolute dependence on the existence of a prerecorded matching fingerprint. Here, we reveal that an information-rich latent fingerprint has not been used to its full potential. In our approach, the content present in the sweat left behind, namely the amino acids, can be used to determine physical traits such as the gender of the originator. As a result, we were able to focus on the biochemical content of the fingerprint using a biocatalytic assay, coupled with a specially designed extraction protocol, to determine gender rather than focusing solely on the physical image.

  17. Extracting laboratory test information from biomedical text

    PubMed Central

    Kang, Yanna Shen; Kayaalp, Mehmet

    2013-01-01

    Background: No previous study reported the efficacy of current natural language processing (NLP) methods for extracting laboratory test information from narrative documents. This study investigates the pathology informatics question of how accurately such information can be extracted from text with the current tools and techniques, especially machine learning and symbolic NLP methods. The study data came from a text corpus maintained by the U.S. Food and Drug Administration, containing a rich set of information on laboratory tests and test devices. Methods: The authors developed a symbolic information extraction (SIE) system to extract device and test specific information about four types of laboratory test entities: Specimens, analytes, units of measures and detection limits. They compared the performance of SIE and three prominent machine learning based NLP systems, LingPipe, GATE and BANNER, each implementing a distinct supervised machine learning method, hidden Markov models, support vector machines and conditional random fields, respectively. Results: Machine learning systems recognized laboratory test entities with moderately high recall, but low precision rates. Their recall rates were relatively higher when the number of distinct entity values (e.g., the spectrum of specimens) was very limited or when lexical morphology of the entity was distinctive (as in units of measures), yet SIE outperformed them with statistically significant margins on extracting specimen, analyte and detection limit information in both precision and F-measure. Its high recall performance was statistically significant on analyte information extraction. Conclusions: Despite its shortcomings against machine learning methods, a well-tailored symbolic system may better discern relevancy among a pile of information of the same type and may outperform a machine learning system by tapping into lexically non-local contextual information such as the document structure. PMID:24083058
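
    A flavor of the symbolic (rule-based) extraction approach can be conveyed with a few hand-written patterns. The sketch below extracts units of measure, detection limits, and specimens from a made-up sentence; the patterns and the example text are illustrative assumptions and are not the SIE system or the FDA corpus described in the study.

```python
import re

TEXT = ("The assay reports creatinine in mg/dL with a detection limit of "
        "0.05 mg/dL in serum specimens; glucose is reported in mmol/L.")

# Tiny, hand-written patterns in the spirit of a symbolic extractor
# (illustrative only; not the SIE system described in the study).
UNIT = r"(?:mg/dL|mmol/L|g/L|IU/mL)"
patterns = {
    "unit_of_measure": re.compile(UNIT),
    "detection_limit": re.compile(r"detection limit of\s+([\d.]+\s*" + UNIT + ")"),
    "specimen": re.compile(r"\b(serum|plasma|urine|whole blood)\b", re.I),
}

for entity, pattern in patterns.items():
    print(entity, "->", pattern.findall(TEXT))
```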

  18. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    PubMed

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract the three-dimensional information of urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software has the advantages of a low demand for specialist expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved if the digital elevation model (DEM) and sensor orientation model had sufficiently high precision and the off-nadir view angle was favorable.

  19. Evaluation of Physicochemical Properties of South African Cashew Apple Juice as a Biofuel Feedstock

    PubMed Central

    Deenanath, Evanie Devi; Daramola, Michael; Falcon, Rosemary; Iyuke, Sunny

    2015-01-01

    Cashew apple juice (CAJ) is one of the feedstocks used for biofuel production, and the ethanol yield depends on the physical and chemical properties of the extracted juice. As far as can be ascertained, information on the physical and chemical properties of South African cashew apple juice is limited in the open literature. Therefore, this study provides information on the physical and chemical properties of South African cashew apple juice. Physicochemical characteristics of the juice, such as specific gravity, pH, sugars, condensed tannins, Vitamin C, minerals, and total protein, were measured from a mixed variety of cashew apples. Analytical results showed that the CAJ possesses a specific gravity of 1.050 and a pH of 4.52. The most abundant sugars were glucose (40.56 g L−1) and fructose (57.06 g L−1). Other chemical constituents of the juice were condensed tannins (55.34 mg L−1), Vitamin C (112 mg/100 mL), and total protein (1.78 g L−1). The mineral content was as follows: zinc (1.39 ppm), copper (2.18 ppm), magnesium (4.32 ppm), iron (1.32 ppm), sodium (5.44 ppm), and manganese (1.24 ppm). With these findings, South African CAJ is a suitable biomass feedstock for ethanol production. PMID:26345160

  20. Parameter estimation for compact binary coalescence signals with the first generation gravitational-wave detector network

    NASA Astrophysics Data System (ADS)

    Aasi, J.; Abadie, J.; Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M.; Accadia, T.; Acernese, F.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Ajith, P.; Allen, B.; Allocca, A.; Amador Ceron, E.; Amariutei, D.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Ast, S.; Aston, S. M.; Astone, P.; Atkinson, D.; Aufmuth, P.; Aulbert, C.; Aylott, B. E.; Babak, S.; Baker, P.; Ballardin, G.; Ballmer, S.; Bao, Y.; Barayoga, J. C. B.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barton, M. A.; Bartos, I.; Bassiri, R.; Bastarrika, M.; Basti, A.; Batch, J.; Bauchrowitz, J.; Bauer, Th. S.; Bebronne, M.; Beck, D.; Behnke, B.; Bejger, M.; Beker, M. G.; Bell, A. S.; Bell, C.; Belopolski, I.; Benacquista, M.; Berliner, J. M.; Bertolini, A.; Betzwieser, J.; Beveridge, N.; Beyersdorf, P. T.; Bhadbade, T.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Biswas, R.; Bitossi, M.; Bizouard, M. A.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Bland, B.; Blom, M.; Bock, O.; Bodiya, T. P.; Bogan, C.; Bond, C.; Bondarescu, R.; Bondu, F.; Bonelli, L.; Bonnand, R.; Bork, R.; Born, M.; Boschi, V.; Bose, S.; Bosi, L.; Bouhou, B.; Braccini, S.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Breyer, J.; Briant, T.; Bridges, D. O.; Brillet, A.; Brinkmann, M.; Brisson, V.; Britzger, M.; Brooks, A. F.; Brown, D. A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Burguet–Castell, J.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Calloni, E.; Camp, J. B.; Campsie, P.; Cannon, K.; Canuel, B.; Cao, J.; Capano, C. D.; Carbognani, F.; Carbone, L.; Caride, S.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C.; Cesarini, E.; Chalermsongsak, T.; Charlton, P.; Chassande-Mottin, E.; Chen, W.; Chen, X.; Chen, Y.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Chow, J.; Christensen, N.; Chua, S. S. Y.; Chung, C. T. Y.; Chung, S.; Ciani, G.; Clara, F.; Clark, D. E.; Clark, J. A.; Clayton, J. H.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colacino, C. N.; Colla, A.; Colombini, M.; Conte, A.; Conte, R.; Cook, D.; Corbitt, T. R.; Cordier, M.; Cornish, N.; Corsi, A.; Costa, C. A.; Coughlin, M.; Coulon, J.-P.; Couvares, P.; Coward, D. M.; Cowart, M.; Coyne, D. C.; Creighton, J. D. E.; Creighton, T. D.; Cruise, A. M.; Cumming, A.; Cunningham, L.; Cuoco, E.; Cutler, R. M.; Dahl, K.; Damjanic, M.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dattilo, V.; Daudert, B.; Daveloza, H.; Davier, M.; Daw, E. J.; Dayanga, T.; De Rosa, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; Del Pozzo, W.; Dent, T.; Dergachev, V.; DeRosa, R.; Dhurandhar, S.; Di Fiore, L.; Di Lieto, A.; Di Palma, I.; Di Paolo Emilio, M.; Di Virgilio, A.; Díaz, M.; Dietz, A.; Donovan, F.; Dooley, K. L.; Doravari, S.; Dorsher, S.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Dumas, J.-C.; Dwyer, S.; Eberle, T.; Edgar, M.; Edwards, M.; Effler, A.; Ehrens, P.; Endrőczi, G.; Engel, R.; Etzel, T.; Evans, K.; Evans, M.; Evans, T.; Factourovich, M.; Fafone, V.; Fairhurst, S.; Farr, B. F.; Farr, W. M.; Favata, M.; Fazi, D.; Fehrmann, H.; Feldbaum, D.; Feroz, F.; Ferrante, I.; Ferrini, F.; Fidecaro, F.; Finn, L. S.; Fiori, I.; Fisher, R. P.; Flaminio, R.; Foley, S.; Forsi, E.; Forte, L. A.; Fotopoulos, N.; Fournier, J.-D.; Franc, J.; Franco, S.; Frasca, S.; Frasconi, F.; Frede, M.; Frei, M. A.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Friedrich, D.; Fritschel, P.; Frolov, V. V.; Fujimoto, M.-K.; Fulda, P. 
J.; Fyffe, M.; Gair, J.; Galimberti, M.; Gammaitoni, L.; Garcia, J.; Garufi, F.; Gáspár, M. E.; Gelencser, G.; Gemme, G.; Genin, E.; Gennai, A.; Gergely, L. Á.; Ghosh, S.; Giaime, J. A.; Giampanis, S.; Giardina, K. D.; Giazotto, A.; Gil-Casanova, S.; Gill, C.; Gleason, J.; Goetz, E.; González, G.; Gorodetsky, M. L.; Goßler, S.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. M.; Griffo, C.; Grote, H.; Grover, K.; Grunewald, S.; Guidi, G. M.; Guido, C.; Gupta, R.; Gustafson, E. K.; Gustafson, R.; Hallam, J. M.; Hammer, D.; Hammond, G.; Hanks, J.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. W.; Harstad, E. D.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Hayama, K.; Hayau, J.-F.; Heefner, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M. A.; Heng, I. S.; Heptonstall, A. W.; Herrera, V.; Heurs, M.; Hewitson, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Holt, K.; Holtrop, M.; Hong, T.; Hooper, S.; Hough, J.; Howell, E. J.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Ingram, D. R.; Inta, R.; Isogai, T.; Ivanov, A.; Izumi, K.; Jacobson, M.; James, E.; Jang, Y. J.; Jaranowski, P.; Jesse, E.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; Kalmus, P.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kasprzack, M.; Kasturi, R.; Katsavounidis, E.; Katzman, W.; Kaufer, H.; Kaufman, K.; Kawabe, K.; Kawamura, S.; Kawazoe, F.; Keitel, D.; Kelley, D.; Kells, W.; Keppel, D. G.; Keresztes, Z.; Khalaidovski, A.; Khalili, F. Y.; Khazanov, E. A.; Kim, B. K.; Kim, C.; Kim, H.; Kim, K.; Kim, N.; Kim, Y. M.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Klimenko, S.; Kline, J.; Kokeyama, K.; Kondrashov, V.; Koranda, S.; Korth, W. Z.; Kowalska, I.; Kozak, D.; Kringel, V.; Krishnan, B.; Królak, A.; Kuehn, G.; Kumar, P.; Kumar, R.; Kurdyumov, R.; Kwee, P.; Lam, P. K.; Landry, M.; Langley, A.; Lantz, B.; Lastzka, N.; Lawrie, C.; Lazzarini, A.; Le Roux, A.; Leaci, P.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Leong, J. R.; Leonor, I.; Leroy, N.; Letendre, N.; Lhuillier, V.; Li, J.; Li, T. G. F.; Lindquist, P. E.; Litvine, V.; Liu, Y.; Liu, Z.; Lockerbie, N. A.; Lodhia, D.; Logue, J.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J.; Lubinski, M.; Lück, H.; Lundgren, A. P.; Macarthur, J.; Macdonald, E.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Mageswaran, M.; Mailand, K.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A.; Maros, E.; Marque, J.; Martelli, F.; Martin, I. W.; Martin, R. M.; Marx, J. N.; Mason, K.; Masserot, A.; Matichard, F.; Matone, L.; Matzner, R. A.; Mavalvala, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McGuire, S. C.; McIntyre, G.; McIver, J.; Meadors, G. D.; Mehmet, M.; Meier, T.; Melatos, A.; Melissinos, A. C.; Mendell, G.; Menéndez, D. F.; Mercer, R. A.; Meshkov, S.; Messenger, C.; Meyer, M. S.; Miao, H.; Michel, C.; Milano, L.; Miller, J.; Minenkov, Y.; Mingarelli, C. M. F.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moe, B.; Mohan, M.; Mohapatra, S. R. P.; Moraru, D.; Moreno, G.; Morgado, N.; Morgia, A.; Mori, T.; Morriss, S. R.; Mosca, S.; Mossavi, K.; Mours, B.; Mow–Lowry, C. M.; Mueller, C. L.; Mueller, G.; Mukherjee, S.; Mullavey, A.; Müller-Ebhardt, H.; Munch, J.; Murphy, D.; Murray, P. 
G.; Mytidis, A.; Nash, T.; Naticchioni, L.; Necula, V.; Nelson, J.; Neri, I.; Newton, G.; Nguyen, T.; Nishizawa, A.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Oldenberg, R. G.; O'Reilly, B.; O'Shaughnessy, R.; Osthelder, C.; Ott, C. D.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Page, A.; Palladino, L.; Palomba, C.; Pan, Y.; Pankow, C.; Paoletti, F.; Paoletti, R.; Papa, M. A.; Parisi, M.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Pedraza, M.; Penn, S.; Perreca, A.; Persichetti, G.; Phelps, M.; Pichot, M.; Pickenpack, M.; Piergiovanni, F.; Pierro, V.; Pihlaja, M.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Pletsch, H. J.; Plissi, M. V.; Poggiani, R.; Pöld, J.; Postiglione, F.; Poux, C.; Prato, M.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L. G.; Puncken, O.; Punturo, M.; Puppo, P.; Quetschke, V.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Rácz, I.; Radkins, H.; Raffai, P.; Rakhmanov, M.; Ramet, C.; Rankins, B.; Rapagnani, P.; Raymond, V.; Re, V.; Reed, C. M.; Reed, T.; Regimbau, T.; Reid, S.; Reitze, D. H.; Ricci, F.; Riesen, R.; Riles, K.; Roberts, M.; Robertson, N. A.; Robinet, F.; Robinson, C.; Robinson, E. L.; Rocchi, A.; Roddy, S.; Rodriguez, C.; Rodruck, M.; Rolland, L.; Rollins, J. G.; Romano, R.; Romie, J. H.; Rosińska, D.; Röver, C.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Salemi, F.; Sammut, L.; Sandberg, V.; Sankar, S.; Sannibale, V.; Santamaría, L.; Santiago-Prieto, I.; Santostasi, G.; Saracco, E.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Savage, R. L.; Schilling, R.; Schnabel, R.; Schofield, R. M. S.; Schulz, B.; Schutz, B. F.; Schwinberg, P.; Scott, J.; Scott, S. M.; Seifert, F.; Sellers, D.; Sentenac, D.; Sergeev, A.; Shaddock, D. A.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Shoemaker, D. H.; Sidery, T. L.; Siemens, X.; Sigg, D.; Simakov, D.; Singer, A.; Singer, L.; Sintes, A. M.; Skelton, G. R.; Slagmolen, B. J. J.; Slutsky, J.; Smith, J. R.; Smith, M. R.; Smith, R. J. E.; Smith-Lefebvre, N. D.; Somiya, K.; Sorazu, B.; Speirits, F. C.; Sperandio, L.; Stefszky, M.; Steinert, E.; Steinlechner, J.; Steinlechner, S.; Steplewski, S.; Stochino, A.; Stone, R.; Strain, K. A.; Strigin, S. E.; Stroeer, A. S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sung, M.; Susmithan, S.; Sutton, P. J.; Swinkels, B.; Szeifert, G.; Tacca, M.; Taffarello, L.; Talukder, D.; Tanner, D. B.; Tarabrin, S. P.; Taylor, R.; ter Braack, A. P. M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Thüring, A.; Titsler, C.; Tokmakov, K. V.; Tomlinson, C.; Toncelli, A.; Tonelli, M.; Torre, O.; Torres, C. V.; Torrie, C. I.; Tournefier, E.; Travasso, F.; Traylor, G.; Tse, M.; Ugolini, D.; Vahlbruch, H.; Vajente, G.; van den Brand, J. F. J.; Van Den Broeck, C.; van der Putten, S.; van Veggel, A. A.; Vass, S.; Vasuth, M.; Vaulin, R.; Vavoulidis, M.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Villar, A. E.; Vinet, J.-Y.; Vitale, S.; Vocca, H.; Vorvick, C.; Vyatchanin, S. P.; Wade, A.; Wade, L.; Wade, M.; Waldman, S. J.; Wallace, L.; Wan, Y.; Wang, M.; Wang, X.; Wanner, A.; Ward, R. L.; Was, M.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wessels, P.; West, M.; Westphal, T.; Wette, K.; Whelan, J. T.; Whitcomb, S. E.; White, D. J.; Whiting, B. F.; Wiesner, K.; Wilkinson, C.; Willems, P. 
A.; Williams, L.; Williams, R.; Willke, B.; Wimmer, M.; Winkelmann, L.; Winkler, W.; Wipf, C. C.; Wiseman, A. G.; Wittel, H.; Woan, G.; Wooley, R.; Worden, J.; Yablon, J.; Yakushin, I.; Yamamoto, H.; Yamamoto, K.; Yancey, C. C.; Yang, H.; Yeaton-Massey, D.; Yoshida, S.; Yvert, M.; Zadrożny, A.; Zanolin, M.; Zendri, J.-P.; Zhang, F.; Zhang, L.; Zhao, C.; Zotov, N.; Zucker, M. E.; Zweizig, J.

    2013-09-01

    Compact binary systems with neutron stars or black holes are one of the most promising sources for ground-based gravitational-wave detectors. Gravitational radiation encodes rich information about source physics; thus parameter estimation and model selection are crucial analysis steps for any detection candidate events. Detailed models of the anticipated waveforms enable inference on several parameters, such as component masses, spins, sky location and distance, that are essential for new astrophysical studies of these sources. However, accurate measurements of these parameters and discrimination of models describing the underlying physics are complicated by artifacts in the data, uncertainties in the waveform models and in the calibration of the detectors. Here we report such measurements on a selection of simulated signals added either in hardware or software to the data collected by the two LIGO instruments and the Virgo detector during their most recent joint science run, including a “blind injection” where the signal was not initially revealed to the collaboration. We exemplify the ability to extract information about the source physics on signals that cover the neutron-star and black-hole binary parameter space over the component mass range 1M⊙-25M⊙ and the full range of spin parameters. The cases reported in this study provide a snapshot of the status of parameter estimation in preparation for the operation of advanced detectors.

  1. a Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    NASA Astrophysics Data System (ADS)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important means of extracting post-disaster collapsed-building information, owing to its versatility and almost all-weather, day-and-night imaging capability. Because the inherent statistical distribution of speckle in SAR images is generally not exploited when extracting collapsed-building information, this paper proposes a novel texture feature based on statistical models of SAR images. In the proposed feature, the texture parameter of the G0 distribution is used to reflect the uniformity of the target and thereby identify collapsed buildings. The feature not only takes the statistical distribution of SAR images into account, providing a more accurate description of object texture, but can also be applied to single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyze the performance of the proposed method. In addition, the applicability of the feature to SAR data with different polarizations is analysed, which provides decision support for data selection in collapsed-building information extraction.
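
    As a rough sketch of how a roughness (texture) parameter of the G0 intensity model can be estimated per window, the following uses a method-of-moments relation, assuming the number of looks is known. The moment relation, window size and synthetic data are assumptions for illustration, not the authors' exact estimator.

    ```python
    import numpy as np

    def g0_alpha_mom(window, looks):
        """Method-of-moments estimate of the G0 roughness parameter alpha from a
        window of SAR intensity samples, assuming the number of looks is known.
        Uses E[Z^2]/E[Z]^2 = (looks+1)/looks * (alpha+1)/(alpha+2) for the G_I^0 model."""
        z = np.asarray(window, dtype=float).ravel()
        m1, m2 = z.mean(), (z ** 2).mean()
        t = (m2 / m1 ** 2) * looks / (looks + 1.0)   # estimate of (alpha+1)/(alpha+2)
        if abs(t - 1.0) < 1e-9:
            return -np.inf                           # perfectly homogeneous window
        return (1.0 - 2.0 * t) / (t - 1.0)           # alpha close to -1 means strongly textured

    # Synthetic demo: inverse-gamma backscatter (alpha = -4) times 4-look speckle
    rng = np.random.default_rng(0)
    backscatter = 1.0 / rng.gamma(shape=4.0, scale=1.0 / 3.0, size=(64, 64))
    img = backscatter * rng.gamma(shape=4.0, scale=1.0 / 4.0, size=(64, 64))
    print(g0_alpha_mom(img[:32, :32], looks=4))      # should come out roughly near -4
    ```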

  2. Collective Influence of Multiple Spreaders Evaluated by Tracing Real Information Flow in Large-Scale Social Networks.

    PubMed

    Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A

    2016-10-26

    Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called "Collective Influence (CI)" has been put forward for collective influence maximization. In contrast to heuristic methods that evaluate nodes' significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI was developed for the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from the data to construct "virtual" information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce larger-scale information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes' importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
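
    A minimal sketch of the Collective Influence score at level l, CI_l(i) = (k_i - 1) * sum over the ball boundary of (k_j - 1), computed with networkx. The adaptive removal loop used in full CI maximization is omitted, and the example graph is hypothetical.

    ```python
    import networkx as nx

    def collective_influence(G, node, ell=2):
        """CI_l(i) = (k_i - 1) * sum of (k_j - 1) over nodes j exactly ell hops from i."""
        ki = G.degree(node)
        lengths = nx.single_source_shortest_path_length(G, node, cutoff=ell)
        frontier = [j for j, d in lengths.items() if d == ell]
        return (ki - 1) * sum(G.degree(j) - 1 for j in frontier)

    # Hypothetical usage: rank nodes of a random graph by CI at l = 2
    G = nx.erdos_renyi_graph(200, 0.03, seed=1)
    ranking = sorted(G.nodes, key=lambda n: collective_influence(G, n), reverse=True)
    print(ranking[:5])
    ```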

  3. Extraction of organic compounds with room temperature ionic liquids.

    PubMed

    Poole, Colin F; Poole, Salwa K

    2010-04-16

    Room temperature ionic liquids are novel solvents with a rather specific blend of physical and solution properties that makes them of interest for applications in separation science. They are good solvents for a wide range of compounds, toward which they behave as polar solvents. Their notable physical properties that distinguish them from conventional organic solvents are a negligible vapor pressure, high thermal stability, and relatively high viscosity. They can form biphasic systems with water or low-polarity organic solvents and gases, suitable for use in liquid-liquid and gas-liquid partition systems. An analysis of partition coefficients for varied compounds in these systems allows characterization of solvent selectivity using the solvation parameter model, which, together with spectroscopic studies of solvent effects on probe substances, results in a detailed picture of solvent behavior. These studies indicate that the solution properties of ionic liquids are similar to those of polar organic solvents. Practical applications of ionic liquids in sample preparation include extractive distillation, aqueous biphasic systems, liquid-liquid extraction, liquid-phase microextraction, supported liquid membrane extraction, matrix solvents for headspace analysis, and micellar extraction. The specific advantages and limitations of ionic liquids in these applications are discussed with a view to defining future uses and the continuing need to identify new room temperature ionic liquids with physical and solution properties tailored to specific sample preparation techniques. The defining feature of the special nature of ionic liquids is not their solution or physical properties viewed separately but their unique combinations when taken together, compared with traditional organic solvents. Copyright 2009 Elsevier B.V. All rights reserved.

  4. OpenDMAP: An open source, ontology-driven concept analysis engine, with applications to capturing knowledge regarding protein transport, protein interactions and cell-type-specific gene expression

    PubMed Central

    Hunter, Lawrence; Lu, Zhiyong; Firby, James; Baumgartner, William A; Johnson, Helen L; Ogren, Philip V; Cohen, K Bretonnel

    2008-01-01

    Background Information extraction (IE) efforts are widely acknowledged to be important in harnessing the rapid advance of biomedical knowledge, particularly in areas where important factual information is published in a diverse literature. Here we report on the design, implementation and several evaluations of OpenDMAP, an ontology-driven, integrated concept analysis system. It significantly advances the state of the art in information extraction by leveraging knowledge in ontological resources, integrating diverse text processing applications, and using an expanded pattern language that allows the mixing of syntactic and semantic elements and variable ordering. Results OpenDMAP information extraction systems were produced for extracting protein transport assertions (transport), protein-protein interaction assertions (interaction) and assertions that a gene is expressed in a cell type (expression). Evaluations were performed on each system, resulting in F-scores ranging from .26 – .72 (precision .39 – .85, recall .16 – .85). Additionally, each of these systems was run over all abstracts in MEDLINE, producing a total of 72,460 transport instances, 265,795 interaction instances and 176,153 expression instances. Conclusion OpenDMAP advances the performance standards for extracting protein-protein interaction predications from the full texts of biomedical research articles. Furthermore, this level of performance appears to generalize to other information extraction tasks, including extracting information about predicates of more than two arguments. The output of the information extraction system is always constructed from elements of an ontology, ensuring that the knowledge representation is grounded with respect to a carefully constructed model of reality. The results of these efforts can be used to increase the efficiency of manual curation efforts and to provide additional features in systems that integrate multiple sources for information extraction. The open source OpenDMAP code library is freely available at PMID:18237434

  5. Extracting remaining information from an inconclusive result in optimal unambiguous state discrimination

    NASA Astrophysics Data System (ADS)

    Zhang, Gang; Yu, Long-Bao; Zhang, Wen-Hai; Cao, Zhuo-Liang

    2014-12-01

    In unambiguous state discrimination, the measurement results consist of error-free results and an inconclusive result, and the inconclusive result is conventionally regarded as a useless remainder from which no information about the initial states can be extracted. In this paper, we investigate the problem of extracting remaining information from an inconclusive result, provided that the optimal total success probability is attained. We present three simple examples: in the first two, partial information can still be extracted from an inconclusive answer, whereas in the third it cannot. The initial states in the third example are the highly symmetric states.

  6. Construction of Green Tide Monitoring System and Research on its Key Techniques

    NASA Astrophysics Data System (ADS)

    Xing, B.; Li, J.; Zhu, H.; Wei, P.; Zhao, Y.

    2018-04-01

    As a kind of marine natural disaster, green tide has appeared every year along the Qingdao coast since the large-scale bloom in 2008, bringing great losses to the region. It is therefore of great value to obtain real-time, dynamic information about green tide distribution. In this study, both optical and microwave remote sensing are employed for green tide monitoring. A dedicated remote sensing data processing flow and a green tide information extraction algorithm are designed according to the different characteristics of optical and microwave data. For extraction of the spatial distribution of green tide, an automatic algorithm for delineating distribution boundaries is designed based on the principle of mathematical morphological dilation/erosion, and key issues in the extraction, including the division of green tide regions, the derivation of basic distributions, the limitation of distribution boundaries, and the elimination of islands, are solved. Automatic generation of green tide distribution boundaries from the results of remote sensing information extraction is thus realized. Finally, a green tide monitoring system is built through IDL/GIS secondary development in an integrated RS and GIS environment, achieving the integration of RS monitoring and information extraction.
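
    A minimal sketch of deriving a distribution boundary from a binary green-tide mask using morphological dilation/erosion, as the abstract describes. The structuring-element size, the island-size threshold and the mask itself are assumptions for illustration.

    ```python
    import numpy as np
    from scipy import ndimage

    def distribution_boundary(mask, iterations=1):
        """Boundary pixels of a binary green-tide mask: dilation XOR erosion."""
        mask = mask.astype(bool)
        dilated = ndimage.binary_dilation(mask, iterations=iterations)
        eroded = ndimage.binary_erosion(mask, iterations=iterations)
        return dilated & ~eroded

    def drop_small_islands(mask, min_pixels=50):
        """Remove small isolated patches before boundary extraction."""
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        return np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))

    # Hypothetical usage on a classified scene
    mask = np.zeros((100, 100), bool); mask[30:60, 40:80] = True
    boundary = distribution_boundary(drop_small_islands(mask))
    ```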

  7. Automatic information extraction from unstructured mammography reports using distributed semantics.

    PubMed

    Gupta, Anupama; Banerjee, Imon; Rubin, Daniel L

    2018-02-01

    To date, the methods developed for automated extraction of information from radiology reports are mainly rule-based or dictionary-based and, therefore, require substantial manual effort to build. Recent efforts to develop automated systems for entity detection have been undertaken, but little work has been done to automatically extract relations and their associated named entities in narrative radiology reports with accuracy comparable to rule-based methods. Our goal is to extract relations in an unsupervised way from radiology reports without specifying prior domain knowledge. We propose a hybrid approach for information extraction that combines a dependency-based parse tree with distributed semantics for generating structured information frames about particular findings/abnormalities from free-text mammography reports. The proposed IE system obtains an F1-score of 0.94 in terms of completeness of the content in the information frames, which outperforms a state-of-the-art rule-based system in this domain by a significant margin. The proposed system can be leveraged in a variety of applications, such as decision support and information retrieval, and may also easily scale to other radiology domains, since there is no need to tune the system with hand-crafted information extraction rules. Copyright © 2018 Elsevier Inc. All rights reserved.
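
    The authors' hybrid system is not reproduced here; the sketch below only illustrates the general idea as a stand-in, pairing a finding term with nearby descriptors via a spaCy dependency parse and using word-vector similarity as a crude distributed-semantics signal. The model name, the cue word "mass" and the similarity threshold are all assumptions.

    ```python
    import spacy

    nlp = spacy.load("en_core_web_md")          # medium model: provides word vectors
    FINDING_CUE = nlp("mass")[0]                # crude semantic anchor for "finding"

    def finding_frames(report_text, sim_threshold=0.5):
        """Return {finding, modifiers} frames: nouns semantically close to the
        finding cue, together with their adjectival/compound dependents."""
        frames = []
        for token in nlp(report_text):
            if token.pos_ == "NOUN" and token.similarity(FINDING_CUE) > sim_threshold:
                mods = [c.text for c in token.children if c.dep_ in ("amod", "compound", "nummod")]
                frames.append({"finding": token.text, "modifiers": mods})
        return frames

    print(finding_frames("There is a spiculated mass in the upper outer quadrant."))
    ```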

  8. Removal of arsenic from Janghang smelter site and energy crops-grown soil with soil washing using magnetic iron oxide

    NASA Astrophysics Data System (ADS)

    Han, Jaemaro; Zhao, Xin; Lee, Jong Keun; Kim, Jae Young

    2014-05-01

    Arsenic compounds are considered carcinogenic and, given their natural abundance, easily enter drinking water supplies. In 2001 the US Environmental Protection Agency finalized a regulation to reduce the public health risks from arsenic in drinking water by revising the drinking water standard for arsenic from 50 ppb to 10 ppb (USEPA, 2001). Soil remediation is therefore a growing field, aimed at preventing contamination of groundwater as well as of cultivated crops. Soil washing is an ex-situ soil remediation technique that reduces the volume of contaminated soil; it combines physical separation and chemical extraction to remove the target metal contamination. Chemical extraction methods solubilize contaminants with reagents such as acids or chelating agents, and acid extraction is the most commonly used technology to treat heavy metals in soil, sediment, and sludge (FRTR, 2007). Owing to their unique physical and chemical properties, magnetic iron oxides have been used in diverse areas including information technology and biomedicine; they can also serve as adsorbents for heavy metals, enhancing the removal of arsenic. In this study, magnetite is used as the washing agent under acid extraction conditions so that the injected oxide can be separated by a magnetic field. Soil samples were collected from three separate areas of the Janghang smelter site and from energy-crop-grown soil, to seek a synergistic effect with phytoremediation. Each sample was air-dried and sieved (2 mm). The soil washing condition was adjusted over pH 0-12 with hydrogen chloride and sodium hydroxide. After soil washing, the arsenic-extracted samples were analyzed for arsenic concentration by inductively coupled plasma optical emission spectrometry (ICP-OES). All the soils exceeded the level of concern for soil contamination for region 1 (25 mg/kg), so soil remediation techniques need to be applied. The objective of this study is to investigate soil washing efficiency using magnetic iron oxide and to assess the applicability of the washing technique to arsenic-contaminated field soils. Acknowledgement: This study was supported by the Korea Ministry of Environment as part of the 'Knowledge-based environmental service (Waste to Energy) Human Resource Development Project'.

  9. Studies of extraction and transport system for highly charged ion beam of 18 GHz superconducting electron cyclotron resonance ion source at Research Center for Nuclear Physics.

    PubMed

    Yorita, T; Hatanaka, K; Fukuda, M; Ueda, H; Yasuda, Y; Morinobu, S; Tamii, A; Kamakura, K

    2014-02-01

    An 18 GHz superconducting electron cyclotron resonance ion source has been installed to increase beam currents and to extend the variety of ions, especially highly charged heavy ions, which can be accelerated by the cyclotrons of the Research Center for Nuclear Physics (RCNP), Osaka University. Beam production of several ion species from B to Xe has already been developed [T. Yorita, K. Hatanaka, M. Fukuda, M. Kibayashi, S. Morinobu, H. Okamura, and A. Tamii, Rev. Sci. Instrum. 79, 02A311 (2008) and T. Yorita, K. Hatanaka, M. Fukuda, M. Kibayashi, S. Morinobu, H. Okamura, and A. Tamii, Rev. Sci. Instrum. 81, 02A332 (2010)], and further studies of beam extraction and transport have been carried out to increase the beam current further. The plasma electrode, extraction electrode, and einzel lens have been modified. In particular, a negative voltage can now be applied to the extraction electrode, which works well to improve the extracted beam current. The extraction-voltage dependences of transmission and emittance have also been studied to improve the current of the beam injected into the azimuthally varying field cyclotron at RCNP.

  10. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    NASA Astrophysics Data System (ADS)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information of lanes is very important. This paper proposes a method of automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method firstly detects the edges of lanes by the grayscale gradient direction, and improves the Probabilistic Hough transform to fit them; then, it uses the vanishing point principle to calculate the lane geometrical position, and uses lane characteristics to extract lane semantic information by the classification of decision trees. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
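
    A minimal, generic sketch of the edge-detection plus probabilistic Hough step, using standard OpenCV calls rather than the authors' improved transform; the thresholds and line-length parameters are assumptions.

    ```python
    import cv2
    import numpy as np

    def detect_lane_segments(frame_bgr):
        """Edge detection followed by probabilistic Hough line fitting."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)                 # gradient-based edge map
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                                minLineLength=40, maxLineGap=20)
        return [] if lines is None else [tuple(l[0]) for l in lines]  # (x1, y1, x2, y2)

    # Hypothetical usage on one video frame
    # frame = cv2.imread("frame_0001.png")
    # segments = detect_lane_segments(frame)
    ```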

  11. Weak characteristic information extraction from early fault of wind turbine generator gearbox

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoli; Liu, Xiuli

    2017-09-01

    Given the weak degradation characteristic information during early fault evolution in the gearbox of a wind turbine generator, traditional singular value decomposition (SVD)-based denoising may result in loss of useful information. A weak characteristic information extraction method based on μ-SVD and local mean decomposition (LMD) is developed to address this problem. The basic principle of the method is as follows: determine the denoising order based on the cumulative contribution rate, perform signal reconstruction, subject the noisy part of the signal to LMD and μ-SVD denoising, and obtain the denoised signal through superposition. Experimental results show that this method can significantly suppress signal noise, effectively extract the weak characteristic information of early faults, and facilitate early fault warning and dynamic predictive maintenance.
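
    A minimal sketch of the SVD part of such denoising: embed the signal in a Hankel-like matrix, keep singular components up to a cumulative contribution rate, and reconstruct by anti-diagonal averaging. The LMD stage and the μ-SVD weighting are omitted, and the embedding length and threshold are assumptions.

    ```python
    import numpy as np

    def svd_denoise(x, embed=64, energy=0.9):
        """Keep the leading singular components whose cumulative contribution
        rate reaches `energy`, then average back to a 1-D signal."""
        x = np.asarray(x, float)
        n = len(x)
        rows = n - embed + 1
        H = np.lib.stride_tricks.sliding_window_view(x, embed)       # (rows, embed)
        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        k = int(np.searchsorted(np.cumsum(s) / s.sum(), energy)) + 1  # denoising order
        Hk = (U[:, :k] * s[:k]) @ Vt[:k]
        out = np.zeros(n); cnt = np.zeros(n)
        for i in range(rows):                                          # anti-diagonal averaging
            out[i:i + embed] += Hk[i]
            cnt[i:i + embed] += 1
        return out / cnt

    # Hypothetical usage: a noisy tone standing in for a vibration signal
    t = np.linspace(0, 1, 1024)
    noisy = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)
    clean = svd_denoise(noisy)
    ```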

  12. An information extraction framework for cohort identification using electronic health records.

    PubMed

    Liu, Hongfang; Bielinski, Suzette J; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B; Jonnalagadda, Siddhartha R; Ravikumar, K E; Wu, Stephen T; Kullo, Iftikhar J; Chute, Christopher G

    2013-01-01

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework.

  13. The Extraction of Post-Earthquake Building Damage Information Based on Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Chen, M.; Wang, X.; Dou, A.; Wu, X.

    2018-04-01

    The seismic damage information of buildings extracted from remote sensing (RS) imagery is valuable for supporting relief efforts and for effectively reducing losses caused by earthquakes. Both traditional pixel-based and object-oriented methods have shortcomings in extracting object-level information: pixel-based methods cannot make full use of the contextual information of objects, while object-oriented methods face the problems that image segmentation is rarely ideal and that choosing the feature space is difficult. In this paper, a new strategy is proposed that combines a Convolutional Neural Network (CNN) with image segmentation to extract building damage information from remote sensing imagery. The key idea comprises two steps: first, use the CNN to predict the damage probability of each pixel; second, integrate the probabilities within each segmentation spot. The method is tested by extracting collapsed and uncollapsed buildings from an aerial image acquired over Longtoushan Town after the Ms 6.5 Ludian County, Yunnan Province earthquake. The results show the effectiveness of the proposed method in extracting building damage information after an earthquake.
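
    A minimal sketch of the second step described above (integrating per-pixel CNN probabilities within each segmentation spot). The CNN and the segmentation are assumed to exist already, so both inputs here are hypothetical arrays.

    ```python
    import numpy as np

    def classify_segments(prob_map, segment_labels, threshold=0.5):
        """Average the CNN 'collapsed' probability inside each segment and
        label the whole segment by thresholding that mean."""
        out = {}
        for seg_id in np.unique(segment_labels):
            mean_p = prob_map[segment_labels == seg_id].mean()
            out[int(seg_id)] = ("collapsed" if mean_p >= threshold else "uncollapsed", float(mean_p))
        return out

    # Hypothetical inputs: per-pixel probabilities and a segment-label image
    prob_map = np.random.rand(128, 128)
    segments = np.random.randint(0, 20, size=(128, 128))
    print(list(classify_segments(prob_map, segments).items())[:3])
    ```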

  14. Object-Based Arctic Sea Ice Feature Extraction through High Spatial Resolution Aerial photos

    NASA Astrophysics Data System (ADS)

    Miao, X.; Xie, H.

    2015-12-01

    High resolution aerial photographs used to detect and classify sea ice features can provide accurate physical parameters to refine, validate, and improve climate models. However, manually delineating sea ice features, such as melt ponds, submerged ice, water, ice/snow, and pressure ridges, is time-consuming and labor-intensive. An object-based classification algorithm is developed to automatically and efficiently extract sea ice features from aerial photographs taken during the Chinese National Arctic Research Expedition in summer 2010 (CHINARE 2010) in the marginal ice zone (MIZ) near the Alaska coast. The algorithm includes four steps: (1) image segmentation groups neighboring pixels into objects based on the similarity of spectral and textural information; (2) a random forest classifier distinguishes four general classes: water, general submerged ice (GSI, including melt ponds and submerged ice), shadow, and ice/snow; (3) a polygon neighbor analysis separates melt ponds and submerged ice based on spatial relationships; and (4) pressure ridge features are extracted from shadow based on local illumination geometry. A producer's accuracy of 90.8% and user's accuracy of 91.8% are achieved for melt pond detection, and shadow shows a user's accuracy of 88.9% and a producer's accuracy of 91.4%. Finally, pond density, pond fraction, ice floes, mean ice concentration, average ridge height, ridge profile, and ridge frequency are extracted from batch processing of aerial photos, and their uncertainties are estimated.
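
    A minimal sketch of step (2), a random forest separating water / general submerged ice / shadow / ice-snow from per-segment features; the feature table and labels below are placeholders, not the expedition data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical per-segment features: mean R, G, B and one texture statistic
    rng = np.random.default_rng(0)
    X = rng.random((500, 4))
    y = rng.integers(0, 4, 500)          # 0=water, 1=GSI, 2=shadow, 3=ice/snow

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("overall accuracy:", clf.score(X_te, y_te))
    ```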

  15. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics over raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed; these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  16. The analysis of forest policy using Landsat multi-spectral scanner data and geographic information systems

    NASA Technical Reports Server (NTRS)

    Peterson, D. L.; Brass, J. A.; Norman, S. D.; Tosta-Miller, N.

    1984-01-01

    The role of Landsat multi-spectral scanner (MSS) data for forest policy analysis in the state of California has been investigated. The combined requirements for physical, socio-economic, and institutional data in policy analysis were studied to explain potential data needs. A statewide MSS data set and general land cover classification were created, from which county-wide data sets could be extracted for detailed analyses. The potential to combine point sample data with MSS data was examined as a means to improve specificity in estimations. MSS data were incorporated into geographic information systems to demonstrate modeling techniques using abiotic, biotic, and socio-economic data layers. A review of system configurations that would allow the California Department of Forestry (CDF) to acquire the demonstrated capability resulted in a sequence of options for implementation.

  17. Geomatic Methods for the Analysis of Data in the Earth Sciences: Lecture Notes in Earth Sciences, Vol. 95

    NASA Astrophysics Data System (ADS)

    Pavlis, Nikolaos K.

    Geomatics is a trendy term that has been used in recent years to describe academic departments that teach and research theories, methods, algorithms, and practices used in processing and analyzing data related to the Earth and other planets. Naming trends aside, geomatics could be considered as the mathematical and statistical “toolbox” that allows Earth scientists to extract information about physically relevant parameters from the available data and accompany such information with some measure of its reliability. This book is an attempt to present the mathematical-statistical methods used in data analysis within various disciplines—geodesy, geophysics, photogrammetry and remote sensing—from a unifying perspective that inverse problem formalism permits. At the same time, it allows us to stretch the relevance of statistical methods in achieving an optimal solution.

  18. Lipid extraction methods from microalgal biomass harvested by two different paths: screening studies toward biodiesel production.

    PubMed

    Ríos, Sergio D; Castañeda, Joandiet; Torras, Carles; Farriol, Xavier; Salvadó, Joan

    2013-04-01

    Microalgae can grow rapidly and capture CO2 from the atmosphere, converting it into complex organic molecules such as lipids (biodiesel feedstock). Economically feasible, large-scale microalgae-based oil depends on optimizing the entire production process, which can be divided into distinct but directly related steps (production, concentration, lipid extraction and transesterification). The aim of this study is to identify the best lipid extraction method to assess the potential of microalgal biomass obtained from two different harvesting paths. The first path used only physical concentration steps, while the second combined chemical and physical concentration steps. Three microalgae species were tested: Phaeodactylum tricornutum, Nannochloropsis gaditana, and Chaetoceros calcitrans. One-step lipid extraction-transesterification reached the same fatty acid methyl ester yield as the Bligh and Dyer and Soxhlet (n-hexane) extraction methods, with corresponding savings in time, cost and solvent. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. X-ray diagnostic development for measurement of electron deposition to the SABRE anode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lash, J.S.; Derzon, M.S.; Cuneo, M.E.

    Extraction applied-B ion diodes are under development on the SABRE (6 MV, 250 kA) accelerator at Sandia. The authors are assessing this technology for the production of high brightness lithium ion beams for inertial confinement fusion. Electron loss physics is a focus of effort since electron sheath physics affects ion beam divergence, ion beam purity, and diode impedance. An x-ray slit-imaging diagnostic is under development for detection of x-rays produced during electron deposition to the anode. This diagnostic will aid in the correlation of electron deposition to ion production to better understand the ion diode physics. The x-ray detector consists of a filter pack, scintillator and optical fiber array that is streaked onto a CCD camera. Current orientation of the diagnostic provides spatial information across the anode radius at three different azimuths or at three different x-ray energy cuts. The observed x-ray emission spectrum can then be compared to current modeling efforts examining electron deposition to the anode.

  20. Evaluation of multiple-scale 3D characterization for coal physical structure with DCM method and synchrotron X-ray CT.

    PubMed

    Wang, Haipeng; Yang, Yushuang; Yang, Jianli; Nie, Yihang; Jia, Jing; Wang, Yudan

    2015-01-01

    Multiscale nondestructive characterization of coal microscopic physical structure can provide important information for coal conversion and coal-bed methane extraction. In this study, the physical structure of a coal sample was investigated by synchrotron-based multiple-energy X-ray CT at three beam energies and two different spatial resolutions. A data-constrained modeling (DCM) approach was used to quantitatively characterize the multiscale compositional distributions at the two resolutions. The volume fractions of each voxel for four different composition groups were obtained at the two resolutions. Between the two resolutions, the difference for DCM computed volume fractions of coal matrix and pores is less than 0.3%, and the difference for mineral composition groups is less than 0.17%. This demonstrates that the DCM approach can account for compositions beyond the X-ray CT imaging resolution with adequate accuracy. By using DCM, it is possible to characterize a relatively large coal sample at a relatively low spatial resolution with minimal loss of the effect due to subpixel fine length scale structures.

  1. Perceptual and affective mechanisms in facial expression recognition: An integrative review.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2016-09-01

    Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.

  2. Physical properties evaluation of roselle extract-egg white mixture under various drying temperatures

    NASA Astrophysics Data System (ADS)

    Triyastuti, M. S.; Kumoro, A. C.; Djaeni, M.

    2017-03-01

    Roselle contains anthocyanin, which is a potential food colorant. Roselle extract is often provided as a dry powder prepared at high temperature, in which case the anthocyanin color degrades because of the heat. Foam-mat drying with egg white is a potential method to speed up the drying process as well as to minimize color degradation. This research studies the physical properties of roselle extract under foam-mat drying, using powder size and color intensity as indicators. The results show that, at high temperatures, foam-mat-dried roselle powder has a fine size with a porous structure; however, the higher the drying temperature, the lower the color retention.

  3. Zener Diode Compact Model Parameter Extraction Using Xyce-Dakota Optimization.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buchheit, Thomas E.; Wilcox, Ian Zachary; Sandoval, Andrew J

    This report presents a detailed process for compact model parameter extraction for DC circuit Zener diodes. Following the traditional approach to Zener diode parameter extraction, a circuit model representation is defined and then used to capture the different operational regions of a real diode's electrical behavior. The circuit model contains 9 parameters represented by resistors and characteristic diodes as circuit model elements. The process of initial parameter extraction, i.e. the identification of parameter values for the circuit model elements, is presented in a way that isolates the dependencies between certain electrical parameters and highlights both the empirical nature of the extraction and the portions of the real diode's physical behavior that the parameters are intended to represent. Optimization of the parameters, a necessary part of a robust parameter extraction process, is demonstrated using a 'Xyce-Dakota' workflow, discussed in more detail in the report. Among other realizations during this systematic approach to electrical model parameter extraction, non-physical solutions are possible and can be difficult to avoid because of the interdependencies between the different parameters. The process steps described are fairly general and can be leveraged for other types of semiconductor device model extractions. Also included in the report are recommendations for experimental setups for generating optimum datasets for model extraction, and the Parameter Identification and Ranking Table (PIRT) for Zener diodes.
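
    The report's workflow couples Xyce simulations to Dakota optimization; purely as an illustration of fitting a handful of diode parameters to measured I-V data, the sketch below performs a least-squares fit of a simple Shockley-diode-plus-series-resistance model with SciPy. The model form, parameter set and data are assumptions, not the report's 9-parameter Zener model.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    VT = 0.02585  # thermal voltage at ~300 K (volts)

    def v_model(i, i_s, n, r_s):
        """Forward-bias diode: V = n*Vt*ln(I/Is + 1) + I*Rs."""
        return n * VT * np.log(i / i_s + 1.0) + i * r_s

    def fit_diode(i_meas, v_meas):
        resid = lambda p: v_model(i_meas, *np.exp(p)) - v_meas   # log-params keep Is, n, Rs > 0
        sol = least_squares(resid, np.log([1e-12, 1.5, 1.0]))
        return dict(zip(("Is", "n", "Rs"), np.exp(sol.x)))

    # Hypothetical measured data generated from known parameters plus noise
    i = np.logspace(-6, -2, 40)
    v = v_model(i, 1e-11, 1.8, 2.0) + 1e-3 * np.random.randn(i.size)
    print(fit_diode(i, v))
    ```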

  4. Accurate electrical prediction of memory array through SEM-based edge-contour extraction using SPICE simulation

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan; Rotstein, Israel; Peltinov, Ram; Latinski, Sergei; Adan, Ofer; Levi, Shimon; Menadeva, Ovadya

    2009-03-01

    The continued scaling of transistors toward smaller devices, similar (or larger) drive current per micron and faster operation increases the challenge of predicting and controlling the transistor off-state current. Typically, electrical simulators such as SPICE use the design intent (as-drawn GDS data); in more sophisticated cases, the simulators are fed with the pattern after lithography and etch process simulations. As the importance of electrical simulation accuracy increases and leakage becomes more dominant, there is a need to feed these simulators with more accurate information extracted from physical on-silicon transistors. Our methodology to predict changes in device performance due to systematic lithography and etch effects is used in this paper. In general, the methodology consists of using OPCCmaxTM for systematic Edge-Contour Extraction (ECE) from transistors as manufactured, including image distortions such as line-end shortening, corner rounding and line-edge roughness. These measurements are used for SPICE modeling. A possible application of this new metrology is to provide, ahead of time, physical and electrical statistical data, improving time to market. In this work, we applied the methodology to analyze small and large arrays of 2.14 um2 6T-SRAM, manufactured using the Tower Standard Logic for General Purposes Platform. Four of the six transistors used a "U-Shape AA", known to have higher variability. The predicted electrical performance of the transistors, drive current and leakage current, in terms of nominal values and variability, is presented. We also used the methodology to analyze an entire SRAM block array; a study of isolation leakage and variability is presented.

  5. The research of road and vehicle information extraction algorithm based on high resolution remote sensing image

    NASA Astrophysics Data System (ADS)

    Zhou, Tingting; Gu, Lingjia; Ren, Ruizhi; Cao, Qiong

    2016-09-01

    With the rapid development of remote sensing technology, the spatial and temporal resolution of satellite imagery has increased greatly, and high-spatial-resolution images are becoming increasingly popular for commercial applications. Remote sensing imagery thus has broad application prospects in intelligent transportation. Compared with traditional traffic information collection methods, vehicle information extraction using high-resolution remote sensing images has the advantages of high resolution and wide coverage, which is of great guiding significance for urban planning, transportation management, travel route choice and so on. This paper first preprocesses the acquired high-resolution multi-spectral and panchromatic remote sensing images. Then, on the one hand, histogram equalization and linear enhancement are applied to the preprocessing results to obtain the optimal threshold for image segmentation; on the other hand, considering the distribution characteristics of roads, the normalized difference vegetation index (NDVI) and normalized difference water index (NDWI) are used to suppress water and vegetation information. The two processing results are then combined, and geometric characteristics are used to complete road information extraction. The extracted road vector is used to limit the target vehicle area. Target vehicle extraction is divided into bright-vehicle extraction and dark-vehicle extraction, and the results for the two kinds of vehicles are combined to obtain the final result. The experimental results demonstrate that the proposed algorithm achieves high precision for vehicle information extraction from different high-resolution remote sensing images: the average false detection rate was about 5.36%, the average residual rate about 13.60%, and the average accuracy approximately 91.26%.
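
    A minimal sketch of the NDVI/NDWI suppression step described above; the band arrays and thresholds are hypothetical, and the segmentation and geometric screening steps are not shown.

    ```python
    import numpy as np

    def ndvi(nir, red):
        return (nir - red) / (nir + red + 1e-9)

    def ndwi(green, nir):
        return (green - nir) / (green + nir + 1e-9)

    def suppress_vegetation_and_water(pan, nir, red, green,
                                      ndvi_thresh=0.3, ndwi_thresh=0.2):
        """Zero out likely vegetation and water pixels before road extraction."""
        mask = (ndvi(nir, red) > ndvi_thresh) | (ndwi(green, nir) > ndwi_thresh)
        out = pan.copy()
        out[mask] = 0
        return out

    # Hypothetical 4-band scene
    pan, nir, red, green = (np.random.rand(256, 256) for _ in range(4))
    candidates = suppress_vegetation_and_water(pan, nir, red, green)
    ```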

  6. An introduction to chaotic and random time series analysis

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1989-01-01

    The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedded techniques.
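
    A minimal sketch of the embedding idea mentioned above (reconstructing state-space vectors from a single observable by time delays); the delay and embedding dimension are assumptions to be tuned for the system at hand.

    ```python
    import numpy as np

    def delay_embed(x, dim=3, tau=10):
        """Return delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
        x = np.asarray(x, float)
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    # Hypothetical observable: a single scalar time series
    t = np.linspace(0, 60, 6000)
    obs = np.sin(t) + 0.6 * np.sin(2.3 * t + 1.0)
    states = delay_embed(obs, dim=3, tau=25)   # shape (n, 3) state-space reconstruction
    ```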

  7. Eclipsing binary stars in the era of massive surveys: First results and future prospects

    NASA Astrophysics Data System (ADS)

    Papageorgiou, Athanasios; Catelan, Márcio; Ramos, Rodrigo Contreras; Drake, Andrew J.

    2017-09-01

    Our thinking about eclipsing binary stars has undergone a tremendous change in the last decade. Eclipsing binary stars are one of nature's best laboratories for determining the fundamental physical properties of stars and thus for testing the predictions of theoretical models. Some of the largest ongoing variable star surveys include the Catalina Real-time Transient Survey (CRTS) and the VISTA Variables in the Vía Láctea survey (VVV). They both contain a large amount of photometric data and plenty of information about eclipsing binaries that waits to be extracted and exploited. Here we briefly describe our efforts in this direction.

  8. Research on Crowdsourcing Emergency Information Extraction Based on Event Frames

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither approach provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework to address the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event-frame technique can solve the problem of emergency information extraction and provides reference cases for other applications. When a disaster is about to occur, the relevant departments can query data on similar past emergencies and make defensive and disaster-reduction arrangements ahead of schedule, decreasing casualties and property damage. This is of great significance to the state and society.

  9. Chitosan films incorporated with nettle (Urtica dioica L.) extract-loaded nanoliposomes: I. Physicochemical characterisation and antimicrobial properties.

    PubMed

    Haghju, Sara; Beigzadeh, Sara; Almasi, Hadi; Hamishehkar, Hamed

    2016-07-17

    The objective of this study was to characterise and compare the physical, mechanical and antimicrobial properties of chitosan-based films containing free or nanoencapsulated nettle (Urtica dioica L.) extract (NE) at concentrations of 0, 0.5, 1 and 1.5% w/w. Nanoliposomes were prepared from soy lecithin by the thin-film hydration and sonication method, giving an average size of 107-136 nm and 70% encapsulation efficiency. FT-IR spectra indicated that new interactions had occurred between chitosan and the nanoliposomes. Despite increasing the yellowness index and decreasing the whiteness index, nanoliposome incorporation improved the thermal properties and mechanical stiffness and decreased the water vapour permeability (WVP), moisture uptake and water solubility. The antimicrobial activity of the films containing NE-loaded nanoliposomes against Staphylococcus aureus was lower than that of the free-NE films, which could be due to the encapsulation inhibiting the release of NE from the matrix.

  10. Biometric Authentication for Gender Classification Techniques: A Review

    NASA Astrophysics Data System (ADS)

    Mathivanan, P.; Poornima, K.

    2017-12-01

    One of the challenging biometric authentication applications is gender identification and age classification, which captures gait from a far distance and analyzes physical information about the subject such as gender, race and emotional state. Most gender identification techniques have focused only on the frontal pose, and differ in image size and the type of database used. The study also classifies different feature extraction processes, such as Principal Component Analysis (PCA) and Local Directional Pattern (LDP), that are used to extract the authentication features of a person. This paper aims to analyze different gender classification techniques, which helps in evaluating the strengths and weaknesses of existing gender identification algorithms and thereby in developing a novel gender classification algorithm with lower computational cost and higher accuracy. An overview and classification of different gender identification techniques is first presented and then compared with other existing human identification systems in terms of performance.

  11. Using a GIS to link digital spatial data and the precipitation-runoff modeling system, Gunnison River Basin, Colorado

    USGS Publications Warehouse

    Battaglin, William A.; Kuhn, Gerhard; Parker, Randolph S.

    1993-01-01

    The U.S. Geological Survey Precipitation-Runoff Modeling System, a modular, distributed-parameter, watershed-modeling system, is being applied to 20 smaller watersheds within the Gunnison River basin. The model is used to derive a daily water balance for subareas in a watershed, ultimately producing simulated streamflows that can be input into routing and accounting models used to assess downstream water availability under current conditions, and to assess the sensitivity of water resources in the basin to alterations in climate. A geographic information system (GIS) is used to automate a method for extracting physically based hydrologic response unit (HRU) distributed parameter values from digital data sources, and for the placement of those estimates into GIS spatial datalayers. The HRU parameters extracted are: area, mean elevation, average land-surface slope, predominant aspect, predominant land-cover type, predominant soil type, average total soil water-holding capacity, and average water-holding capacity of the root zone.
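
    A minimal sketch of deriving two of the listed HRU parameters (mean elevation and average land-surface slope) by zonal statistics over an HRU label grid; the rasters below are synthetic placeholders, whereas the study extracted these values from digital data sources within a GIS.

    ```python
    import numpy as np

    def hru_zonal_means(hru_labels, *rasters):
        """Mean of each input raster within every HRU zone."""
        stats = {}
        for hru in np.unique(hru_labels):
            sel = hru_labels == hru
            stats[int(hru)] = tuple(float(r[sel].mean()) for r in rasters)
        return stats

    # Hypothetical elevation grid, slope from its gradient, and an HRU map
    elev = np.cumsum(np.random.rand(200, 200), axis=0)        # synthetic DEM
    gy, gx = np.gradient(elev)                                 # unit cell size assumed
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    hrus = np.random.randint(1, 6, elev.shape)
    print(hru_zonal_means(hrus, elev, slope))                  # {hru: (mean elev, mean slope)}
    ```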

  12. Extreme Threshold Failures Within a Heterogeneous Elastic Thin Sheet and the Spatial-Temporal Development of Induced Seismicity Within the Groningen Gas Field

    NASA Astrophysics Data System (ADS)

    Bourne, S. J.; Oates, S. J.

    2017-12-01

    Measurements of the strains and earthquakes induced by fluid extraction from a subsurface reservoir reveal a transient, exponential-like increase in seismicity relative to the volume of fluids extracted. If the frictional strength of these reactivating faults is heterogeneously and randomly distributed, then progressive failures of the weakest fault patches account in a general manner for this initial exponential-like trend. Allowing for the observable elastic and geometric heterogeneity of the reservoir, the spatiotemporal evolution of induced seismicity over 5 years is predictable without significant bias using a statistical physics model of poroelastic reservoir deformations inducing extreme threshold frictional failures of previously inactive faults. This model is used to forecast the temporal and spatial probability density of earthquakes within the Groningen natural gas reservoir, conditional on future gas production plans. Probabilistic seismic hazard and risk assessments based on these forecasts inform the current gas production policy and building strengthening plans.

  13. An Information Extraction Framework for Cohort Identification Using Electronic Health Records

    PubMed Central

    Liu, Hongfang; Bielinski, Suzette J.; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B.; Jonnalagadda, Siddhartha R.; Ravikumar, K.E.; Wu, Stephen T.; Kullo, Iftikhar J.; Chute, Christopher G

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework. PMID:24303255

  14. Table Extraction from Web Pages Using Conditional Random Fields to Extract Toponym Related Data

    NASA Astrophysics Data System (ADS)

    Luthfi Hanifah, Hayyu'; Akbar, Saiful

    2017-01-01

    Tables are one of the ways information is visualized on web pages. The abundant number of web pages that compose the World Wide Web has motivated research on information extraction and information retrieval, including table extraction. In addition, there is a need for a system designed specifically to handle location-related information. Against this background, this research provides a way to extract location-related data from web tables so that it can be used in the development of a Geographic Information Retrieval (GIR) system. The location-related data are identified by the toponym (location name). A rule-based approach with a gazetteer is used to recognize toponyms in web tables, while a combination of rule-based and statistical approaches is used to extract data from a table. In the statistical approach, a Conditional Random Fields (CRF) model is used to understand the schema of the table. The result of table extraction is presented in JSON format; if a web table contains a toponym, a field is added to the JSON document to store the toponym values, which can then be used to index the table data by toponym in the development of the GIR system.
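
    A minimal sketch of the rule-based toponym step and the JSON output described above (the CRF schema-labeling stage is omitted); the gazetteer entries and field names are hypothetical.

    ```python
    import json

    GAZETTEER = {"Bandung", "Jakarta", "Surabaya"}   # hypothetical toponym list

    def extract_table(rows):
        """Turn a list of table rows (lists of cell strings) into a JSON record,
        adding a 'toponym' field when any cell matches the gazetteer."""
        header, *body = rows
        record = {"columns": header, "rows": body}
        toponyms = sorted({cell for row in body for cell in row if cell in GAZETTEER})
        if toponyms:
            record["toponym"] = toponyms
        return json.dumps(record, ensure_ascii=False)

    rows = [["City", "Population"], ["Bandung", "2500000"], ["Depok", "1800000"]]
    print(extract_table(rows))
    ```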

  15. Effects of oil extraction methods on physical and chemical properties of red salmon oils (Oncorhynchus nerka)

    USDA-ARS?s Scientific Manuscript database

    Four different red salmon oil extraction processes were used to extract oil from red salmon heads: RS1 involved a mixture of ground red salmon heads and water, no heat treatment, and centrifugation; RS2 involved ground red salmon heads (no water added), heat treatment, and centrifugation; RS3 involv...

  16. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC CHEMICALS IN UNSATURATED GEOLOGICAL MATERIAL

    EPA Science Inventory

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  17. [A customized method for information extraction from unstructured text data in the electronic medical records].

    PubMed

    Bao, X Y; Huang, W J; Zhang, K; Jin, M; Li, Y; Niu, C Z

    2018-04-18

    There is a huge amount of diagnostic and treatment information in electronic medical records (EMRs), which is a concrete manifestation of clinicians' actual diagnosis and treatment details. Many episodes in EMRs, such as chief complaints, history of present illness, past history, differential diagnoses, diagnostic imaging and surgical records, reflect the details of the clinical process and are written as Chinese natural-language narrative. How to extract effective information from these Chinese narrative text data and organize it in tabular form for medical research, so that real-world clinical data can be put to practical use, is a difficult problem in Chinese medical data processing. Based on the narrative EMR text data of a tertiary hospital in China, a customized method of extraction-rule learning and rule-based information extraction is proposed. The overall method consists of three steps. (1) A random sample of 600 records (including history of present illness, past history, personal history, family history, etc.) was extracted as the raw corpus; using our Chinese clinical narrative text annotation platform, trained clinicians and nurses marked the tokens and phrases to be extracted (with a history of diabetes as the example). (2) Based on the annotated corpus, extraction templates were summarized and induced, and these templates were then rewritten as regular expressions in the Perl programming language to serve as extraction rules. Using these rules as the basic knowledge base, we developed extraction packages in Perl for extracting data from the EMR text; the extracted data items were organized in tabular format for later use in clinical research or hospital surveillance. (3) Finally, the method was evaluated and validated on the National Clinical Service Data Integration Platform, checking the extraction results with a combination of manual and automated verification, which demonstrated the effectiveness of the method. For all patients with diabetes as the diagnosed disease in the Department of Endocrinology of the hospital, altogether 1 436 patients were discharged in 2015; extraction of the history of diabetes from their records gave a recall of 87.6%, an accuracy of 99.5%, and an F-score of 0.93. For the 10% of patients (1 223 patients in total) with diabetes discharged by August 2017 in the same department, extraction of the diabetes history gave a recall of 89.2%, an accuracy of 99.2%, and an F-score of 0.94. This study mainly adopts a combination of natural language processing and rule-based information extraction, and designs and implements an algorithm for extracting customized information from unstructured Chinese electronic medical record text, with better results than existing work.
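
    The study wrote its extraction rules as Perl regular expressions; purely as an illustration of the same idea in Python, the sketch below applies a hypothetical rule for a history of diabetes to a free-text history field and emits a tabular record. The pattern, field names and example texts are illustrative, not the study's actual rule base.

    ```python
    import re
    import csv, sys

    # Hypothetical rule: a mention of diabetes ("糖尿病") in the past-history text,
    # optionally with a duration such as "10年" (10 years).
    DIABETES_RULE = re.compile(r"糖尿病(?:病史|史)?(?:(\d+)\s*年)?")

    def extract_diabetes_history(patient_id, past_history_text):
        m = DIABETES_RULE.search(past_history_text)
        return {"patient_id": patient_id,
                "diabetes_history": bool(m),
                "duration_years": m.group(1) if m and m.group(1) else ""}

    rows = [extract_diabetes_history("P001", "既往糖尿病史10年，高血压史5年。"),
            extract_diabetes_history("P002", "既往体健，否认高血压史。")]
    writer = csv.DictWriter(sys.stdout, fieldnames=rows[0].keys())
    writer.writeheader(); writer.writerows(rows)
    ```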

  18. Representational analysis of extended disorder in atomistic ensembles derived from total scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neilson, James R.; McQueen, Tyrel M.

    With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.

  19. Representational analysis of extended disorder in atomistic ensembles derived from total scattering data

    DOE PAGES

    Neilson, James R.; McQueen, Tyrel M.

    2015-09-20

    With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.

  20. Simulating the Mg II NUV Spectra & C II Resonance Lines During Solar Flares

    NASA Astrophysics Data System (ADS)

    Kerr, Graham Stewart; Allred, Joel C.; Leenaarts, Jorrit; Butler, Elizabeth; Kowalski, Adam

    2017-08-01

    The solar chromosphere is the origin of the bulk of the enhanced radiative output during solar flares, and so a comprehensive understanding of this region is important if we wish to understand energy transport in solar flares. It is only relatively recently, however, with the launch of IRIS, that we have routine spectroscopic flare observations of the chromosphere and transition region. Since several of the spectral lines observed by IRIS are optically thick, it is necessary to use forward modelling to extract the useful information that these lines carry about the flaring chromosphere and transition region. We present the results of modelling the formation properties of the Mg II resonance and subordinate lines, and the C II resonance lines, during solar flares. We focus on understanding their relation to the physical structure of the flaring atmosphere, exploiting formation height differences to determine if we can extract information about gradients in the atmosphere. We show the effect of degrading the profiles to the resolution of IRIS, and that the usual observational techniques used to identify the line centroid do a poor job in the early stages of the flare (partly due to multiple optically thick line components). Finally, we will tentatively comment on the effects that 3D radiation transfer may have on these lines.
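
    The point that standard centroid techniques misbehave when an optically thick line has multiple components is easy to see with a synthetic profile. The short sketch below is purely illustrative and not part of the radiative-transfer modelling described in the abstract; it compares an intensity-weighted centroid with the peak position for a hypothetical two-component line.

```python
import numpy as np

# Wavelength grid (Angstrom offsets from rest wavelength) and a synthetic
# two-component emission profile: a bright stationary core plus a weaker
# redshifted component, as can occur in flaring chromospheric lines.
wl = np.linspace(-1.0, 1.0, 400)
profile = (1.0 * np.exp(-0.5 * (wl / 0.08) ** 2)
           + 0.6 * np.exp(-0.5 * ((wl - 0.35) / 0.10) ** 2))

# Intensity-weighted centroid vs. simple peak position.
centroid = np.sum(wl * profile) / np.sum(profile)
peak = wl[np.argmax(profile)]

# The centroid is dragged toward the secondary component even though the
# strongest emission is unshifted -- one reason centroid-based Doppler
# velocities can be misleading early in a flare.
print(f"centroid offset: {centroid:+.3f} A, peak offset: {peak:+.3f} A")
```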

  1. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
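
    The factor analysis described here splits a spectral image D (pixels by channels) into abundance maps C and component spectra S, with the spatial-simplicity idea enforced by pushing each pixel toward only a few non-zero abundances. The following toy NumPy sketch is written under those assumptions and is not the patented algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectral image: 200 pixels x 64 channels built from 3 components.
true_S = np.abs(rng.standard_normal((3, 64)))
true_C = np.zeros((200, 3))
true_C[np.arange(200), rng.integers(0, 3, 200)] = 1.0   # one phase per pixel
D = true_C @ true_S + 0.01 * rng.standard_normal((200, 64))

# Alternating least squares with non-negativity, plus a crude "spatial
# simplicity" step that keeps only the largest abundance in each pixel.
C = np.abs(rng.standard_normal((200, 3)))
for _ in range(50):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)          # update spectra
    C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)    # update abundances
    keep = C.argmax(axis=1)
    simple = np.zeros_like(C)
    simple[np.arange(C.shape[0]), keep] = C[np.arange(C.shape[0]), keep]
    C = simple

print("relative reconstruction error:", np.linalg.norm(D - C @ S) / np.linalg.norm(D))
```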

  2. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  3. [The application of spectral geological profile in the alteration mapping].

    PubMed

    Li, Qing-Ting; Lin, Qi-Zhong; Zhang, Bing; Lu, Lin-Lin

    2012-07-01

    Geological sections can help validate and interpret the alteration information extracted from remote sensing images. In this paper, the concept of the spectral geological profile is introduced, based on the principle of the geological section and on methods of spectral information extraction. A spectral profile stores and visualizes spectra along a geological section, but the spectral geological profile carries additional information beyond the spectra themselves. Its main purpose is to obtain the distribution of alteration types and mineral contents along the profile, extracted from spectra measured with a field spectrometer, and in particular the spatial distribution and mode of the alteration associations. The technical method and workflow of alteration information extraction were studied for the spectral geological profile. The spectral geological profile was built from ground reflectance spectra, and the alteration information was extracted from the remote sensing image with the help of typical spectral geological profiles. Finally, the significance and utility of the spectral geological profile are discussed.

  4. Leaf Extraction and Analysis Framework Graphical User Interface: Segmenting and Analyzing the Structure of Leaf Veins and Areoles

    PubMed Central

    Price, Charles A.; Symonova, Olga; Mileyko, Yuriy; Hilley, Troy; Weitz, Joshua S.

    2011-01-01

    Interest in the structure and function of physical biological networks has spurred the development of a number of theoretical models that predict optimal network structures across a broad array of taxonomic groups, from mammals to plants. In many cases, direct tests of predicted network structure are impossible given the lack of suitable empirical methods to quantify physical network geometry with sufficient scope and resolution. There is a long history of empirical methods to quantify the network structure of plants, from roots, to xylem networks in shoots and within leaves. However, with few exceptions, current methods emphasize the analysis of portions of, rather than entire networks. Here, we introduce the Leaf Extraction and Analysis Framework Graphical User Interface (LEAF GUI), a user-assisted software tool that facilitates improved empirical understanding of leaf network structure. LEAF GUI takes images of leaves where veins have been enhanced relative to the background, and following a series of interactive thresholding and cleaning steps, returns a suite of statistics and information on the structure of leaf venation networks and areoles. Metrics include the dimensions, position, and connectivity of all network veins, and the dimensions, shape, and position of the areoles they surround. Available for free download, the LEAF GUI software promises to facilitate improved understanding of the adaptive and ecological significance of leaf vein network structure. PMID:21057114

  5. Machine Learning and Experimental Design for Hydrogen Cosmology

    NASA Astrophysics Data System (ADS)

    Rapetti, David; Tauscher, Keith A.; Burns, Jack O.; Mirocha, Jordan; Switzer, Eric; Monsalve, Raul A.; Furlanetto, Steven R.; Bowman, Judd D.

    2018-06-01

    Based on two powerful innovations, we present a new pipeline to analyze the redshifted sky-averaged 21-cm spectrum (~10-200 MHz) of neutral hydrogen from the first stars, galaxies and black holes. First, we combine machine learning and model selection techniques to extract the global 21-cm signal from foreground and instrumental systematics. Second, we employ experimental designs to increase our ability to separate these two components in data sets. For measurements with foreground polarization induced by rotation about the anisotropic low-frequency radio sky on a large beam, we incorporate this information into the likelihood to distinguish the unpolarized 21-cm signal from the rest of the data. For experiments with a drift scan strategy, we take advantage of the varying foreground in time to identify the constant 21-cm signal. This pipeline can be applied to either lunar orbit/surface instruments shielded from terrestrial and solar radio contamination, or existing ground-based observations, such as those from the EDGES collaboration that recently observed an absorption trough potentially consistent with the global 21-cm signal of Cosmic Dawn. Finally, this pipeline allows us to constrain physical parameters for a given model of the first luminous objects plus exotic physics in the early universe, from e.g. dark matter, through an MCMC analysis that uses the extracted signal as a starting point, providing key efficiency for unexplored cosmologies.
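
    The separation task described here, pulling a faint global 21-cm signal out of bright but spectrally smooth foregrounds, can be illustrated in a drastically simplified form: fit the foreground with a low-order polynomial in log-frequency and inspect the residual. This toy sketch is not the machine-learning pipeline of the abstract, just a minimal illustration, with an invented foreground and trough, of why spectral smoothness is the handle.

```python
import numpy as np

# Frequency grid (MHz), a power-law-like foreground, and a Gaussian
# absorption trough standing in for the global 21-cm signal.
nu = np.linspace(50.0, 100.0, 256)
foreground = 1e3 * (nu / 75.0) ** -2.5                      # Kelvin, smooth
signal = -0.5 * np.exp(-0.5 * ((nu - 78.0) / 5.0) ** 2)     # ~ -500 mK trough
sky = foreground + signal + 0.01 * np.random.default_rng(2).standard_normal(nu.size)

# Fit the smooth foreground with a low-order polynomial in log(nu)-log(T),
# then look at what is left over.
coeffs = np.polyfit(np.log(nu), np.log(sky), 4)
fit = np.exp(np.polyval(coeffs, np.log(nu)))
residual = sky - fit

print("residual rms (K):", residual.std())
print("deepest residual (K):", residual.min(), "at", nu[residual.argmin()], "MHz")
```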

  6. Where the bugs are: analyzing distributions of bacterial phyla by descriptor keyword search in the nucleotide database.

    PubMed

    Squartini, Andrea

    2011-07-26

    The associations between bacteria and environment underlie their preferential interactions with given physical or chemical conditions. Microbial ecology aims at extracting conserved patterns of occurrence of bacterial taxa in relation to defined habitats and contexts. In the present report the NCBI nucleotide sequence database is used as dataset to extract information relative to the distribution of each of the 24 phyla of the bacteria superkingdom and of the Archaea. Over two and a half million records are filtered in their cross-association with each of 48 sets of keywords, defined to cover natural or artificial habitats, interactions with plant, animal or human hosts, and physical-chemical conditions. The results are processed showing: (a) how the different descriptors enrich or deplete the proportions at which the phyla occur in the total database; (b) in which order of abundance do the different keywords score for each phylum (preferred habitats or conditions), and to which extent are phyla clustered to few descriptors (specific) or spread across many (cosmopolitan); (c) which keywords individuate the communities ranking highest for diversity and evenness. A number of cues emerge from the results, contributing to sharpen the picture on the functional systematic diversity of prokaryotes. Suggestions are given for a future automated service dedicated to refining and updating such kind of analyses via public bioinformatic engines.
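
    The core computations sketched in (a) and (c) are a comparison between how often each phylum occurs under a given keyword and how often it occurs in the whole database, plus simple diversity and evenness scores per keyword. A hedged Python sketch with made-up counts (the real analysis runs over millions of NCBI records) is:

```python
import math

# Hypothetical record counts: phylum -> count in the whole database,
# and phylum -> count among records matching one habitat keyword.
total = {"Proteobacteria": 900_000, "Firmicutes": 600_000,
         "Actinobacteria": 400_000, "Cyanobacteria": 100_000}
keyword_hits = {"Proteobacteria": 1_200, "Firmicutes": 300,
                "Actinobacteria": 150, "Cyanobacteria": 350}

n_total = sum(total.values())
n_hits = sum(keyword_hits.values())

# (a) enrichment/depletion: ratio of a phylum's share under the keyword
# to its share in the whole database (>1 enriched, <1 depleted).
enrichment = {p: (keyword_hits[p] / n_hits) / (total[p] / n_total) for p in total}

# (c) Shannon diversity and Pielou evenness of the keyword-specific community.
props = [c / n_hits for c in keyword_hits.values() if c > 0]
shannon = -sum(p * math.log(p) for p in props)
evenness = shannon / math.log(len(props))

print(enrichment)
print(f"Shannon H = {shannon:.3f}, evenness J = {evenness:.3f}")
```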

  7. Leaf extraction and analysis framework graphical user interface: segmenting and analyzing the structure of leaf veins and areoles.

    PubMed

    Price, Charles A; Symonova, Olga; Mileyko, Yuriy; Hilley, Troy; Weitz, Joshua S

    2011-01-01

    Interest in the structure and function of physical biological networks has spurred the development of a number of theoretical models that predict optimal network structures across a broad array of taxonomic groups, from mammals to plants. In many cases, direct tests of predicted network structure are impossible given the lack of suitable empirical methods to quantify physical network geometry with sufficient scope and resolution. There is a long history of empirical methods to quantify the network structure of plants, from roots, to xylem networks in shoots and within leaves. However, with few exceptions, current methods emphasize the analysis of portions of, rather than entire networks. Here, we introduce the Leaf Extraction and Analysis Framework Graphical User Interface (LEAF GUI), a user-assisted software tool that facilitates improved empirical understanding of leaf network structure. LEAF GUI takes images of leaves where veins have been enhanced relative to the background, and following a series of interactive thresholding and cleaning steps, returns a suite of statistics and information on the structure of leaf venation networks and areoles. Metrics include the dimensions, position, and connectivity of all network veins, and the dimensions, shape, and position of the areoles they surround. Available for free download, the LEAF GUI software promises to facilitate improved understanding of the adaptive and ecological significance of leaf vein network structure.

  8. Considering context: reliable entity networks through contextual relationship extraction

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.

    2016-05-01

    Existing information extraction techniques can only partially address the problem of exploiting unreadably large amounts of text. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on what is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) more precise event and relationship interpretation, 2) more detailed information about extracted events and relationships, and 3) more reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.

  9. Postmastectomy Information Needs and Information-seeking Motives for Women with Breast Cancer

    PubMed Central

    Latifi, Masoome; Salimi, Sohrab; Barahmand, Nilofar; Fahimnia, Fateme; Allahbakhshian Farsani, Leili

    2018-01-01

    Background: Health information-seeking behavior is a key concept in the empowerment of women with breast cancer after mastectomy for self-care management. Thus, a real understanding of their information needs and their information-seeking behavior may open up new opportunities for their postsurgery care. The current research was conducted to identify the information needs and information-seeking motives of women with breast cancer after mastectomy. Materials and Methods: This is an applied qualitative study. The sample included 17 women with breast cancer after mastectomy, selected from two hospitals, Shahid Mohammadi and Persian Gulf, and Omid Central Chemotherapy in Bandar Abbas. Data were collected using semi-structured interviews in winter 2014 and analyzed using qualitative content analysis. Results: Three basic content categories were extracted: information needs related to mental health, to disease-related physical health, and to personal daily activities, along with their subcategories, representing the common experience and perception of mastectomized women seeking health information. Furthermore, hope, self-esteem, return to life, and available social support resources were expressed as the main information-seeking motives. Conclusion: Considering the research findings, mastectomized women need information across a wide range of health topics and pursue it purposefully. Hence, it is necessary that health-care authorities, especially institutions responsible for women's health, take the required actions and measures to support and meet the information needs of these patients, taking their information-seeking motives into account. PMID:29862224

  10. Postmastectomy Information Needs and Information-seeking Motives for Women with Breast Cancer.

    PubMed

    Latifi, Masoome; Salimi, Sohrab; Barahmand, Nilofar; Fahimnia, Fateme; Allahbakhshian Farsani, Leili

    2018-01-01

    Health information-seeking behavior is a key concept in the empowerment of women with breast cancer after mastectomy for self-care management. Thus, a real understanding of their information needs and their information-seeking behavior may open up new opportunities for their postsurgery care. The current research was conducted to identify the information needs and information-seeking motives of women with breast cancer after mastectomy. This is an applied qualitative study. The sample included 17 women with breast cancer after mastectomy, selected from two hospitals, Shahid Mohammadi and Persian Gulf, and Omid Central Chemotherapy in Bandar Abbas. Data were collected using semi-structured interviews in winter 2014 and analyzed using qualitative content analysis. Three basic content categories were extracted: information needs related to mental health, to disease-related physical health, and to personal daily activities, along with their subcategories, representing the common experience and perception of mastectomized women seeking health information. Furthermore, hope, self-esteem, return to life, and available social support resources were expressed as the main information-seeking motives. Considering the research findings, mastectomized women need information across a wide range of health topics and pursue it purposefully. Hence, it is necessary that health-care authorities, especially institutions responsible for women's health, take the required actions and measures to support and meet the information needs of these patients, taking their information-seeking motives into account.

  11. Automated Information Extraction on Treatment and Prognosis for Non-Small Cell Lung Cancer Radiotherapy Patients: Clinical Study.

    PubMed

    Zheng, Shuai; Jabbour, Salma K; O'Reilly, Shannon E; Lu, James J; Dong, Lihua; Ding, Lijuan; Xiao, Ying; Yue, Ning; Wang, Fusheng; Zou, Wei

    2018-02-01

    In outcome studies of oncology patients undergoing radiation, researchers extract valuable information from medical records generated before, during, and after radiotherapy visits, such as survival data, toxicities, and complications. Clinical studies rely heavily on these data to correlate the treatment regimen with the prognosis to develop evidence-based radiation therapy paradigms. These data are available mainly in forms of narrative texts or table formats with heterogeneous vocabularies. Manual extraction of the related information from these data can be time consuming and labor intensive, which is not ideal for large studies. The objective of this study was to adapt the interactive information extraction platform Information and Data Extraction using Adaptive Learning (IDEAL-X) to extract treatment and prognosis data for patients with locally advanced or inoperable non-small cell lung cancer (NSCLC). We transformed patient treatment and prognosis documents into normalized structured forms using the IDEAL-X system for easy data navigation. The adaptive learning and user-customized controlled toxicity vocabularies were applied to extract categorized treatment and prognosis data, so as to generate structured output. In total, we extracted data from 261 treatment and prognosis documents relating to 50 patients, with overall precision and recall more than 93% and 83%, respectively. For toxicity information extractions, which are important to study patient posttreatment side effects and quality of life, the precision and recall achieved 95.7% and 94.5% respectively. The IDEAL-X system is capable of extracting study data regarding NSCLC chemoradiation patients with significant accuracy and effectiveness, and therefore can be used in large-scale radiotherapy clinical data studies. ©Shuai Zheng, Salma K Jabbour, Shannon E O'Reilly, James J Lu, Lihua Dong, Lijuan Ding, Ying Xiao, Ning Yue, Fusheng Wang, Wei Zou. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 01.02.2018.

  12. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus

    PubMed Central

    2015-01-01

    Background Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. Methods To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Results Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. Conclusions PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single disease, the promising results achieved can stimulate further work into the extraction of phenotypic information for other diseases. The PhenoCHF annotation guidelines and annotations are publicly available at https://code.google.com/p/phenochf-corpus. PMID:26099853
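
    The best-performing systems mentioned here are conditional random fields over token sequences with a rich feature set. The snippet below sketches the kind of per-token feature function such a tagger consumes; the feature names, mini-lexicon, and example sentence are illustrative assumptions, and a real system would pass these dictionaries to a CRF library (for example sklearn-crfsuite) trained on the PhenoCHF annotations.

```python
def token_features(tokens, i):
    """Rich feature dictionary for token i, as commonly fed to a CRF tagger."""
    tok = tokens[i]
    feats = {
        "lower": tok.lower(),
        "is_title": tok.istitle(),
        "is_digit": tok.isdigit(),
        "suffix3": tok[-3:].lower(),
        "prefix3": tok[:3].lower(),
        # Domain lexicon flag (hypothetical mini-lexicon of CHF-related terms).
        "in_phenotype_lexicon": tok.lower() in {"edema", "dyspnea", "orthopnea",
                                                "fatigue", "congestion"},
    }
    # Context features from the neighbouring tokens.
    feats["prev_lower"] = tokens[i - 1].lower() if i > 0 else "<BOS>"
    feats["next_lower"] = tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>"
    return feats

sentence = "Patient reports worsening dyspnea and bilateral ankle edema .".split()
features = [token_features(sentence, i) for i in range(len(sentence))]
print(features[3])   # features for the token "dyspnea"
```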

  13. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    PubMed

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single disease, the promising results achieved can stimulate further work into the extraction of phenotypic information for other diseases. The PhenoCHF annotation guidelines and annotations are publicly available at https://code.google.com/p/phenochf-corpus.

  14. Houttuynia cordata Extract Improves Physical Endurance Performance by Regulating Endothelial Production of Nitric Oxide.

    PubMed

    Yang, Ui-Jeong; Maeng, Hyojin; Park, Tae-Sik; Shim, Soon-Mi

    2015-09-01

    Vascular function is mediated by various regulatory molecules, including endothelial nitric oxide (NO), which regulates the vasodilation of smooth muscle cells. We investigated whether standardized Houttuynia cordata extract (SHCE) could improve physical endurance performance by regulating the endothelial production of NO. For the standardization of Houttuynia cordata (HC) extract, its bioactive components were identified and quantified using ultraperformance liquid chromatography-mass spectrometry. Bioaccessibility and biological activity were measured by the in vitro digestion model system and free radical scavenging capacity, respectively. The vascular function in the endothelium was assessed by the phosphorylation of endothelial nitric oxide synthase (eNOS). A preliminary clinical trial was carried out to assess the physical endurance performance. HC extract was standardized to bioactive components, including chlorogenic acid, rutin, and quercitrin, with the concentration of 5.53, 6.09, and 16.15 mg from 1 g of dry weight, respectively. Bioaccessibility was 33.17%, 31.67%, and 11.18% for chlorogenic acid, rutin, and quercitrin, respectively. Antioxidant activities of SHCE were expressed as vitamin C equivalent antioxidant capacity in 55.81 and 17.23 mg/g of HC extract using ABTS and DPPH scavenging assay, respectively. In human aortic endothelial cells, insulin-mediated phosphorylation of eNOS was increased by SHCE in the presence of palmitate. However, the expression of blood pressure-regulating genes was not altered. The level of blood lactate concentration and the heart rate of subjects who drank SHCE were lower than those of subjects who drank plain water. Oxygen uptake from subjects drinking SHCE was slightly higher than that from those who drank plain water. This study demonstrated that SHCE decreased heart rate and blood lactate, increased oxygen uptake, and improved physical performance, presumably due to the increased NO production.

  15. Adaptive Texture Synthesis for Large Scale City Modeling

    NASA Astrophysics Data System (ADS)

    Despine, G.; Colleu, T.

    2015-02-01

    Large scale city models textured with aerial images are well suited for bird's-eye navigation, but generally the image resolution does not allow pedestrian navigation. One solution to this problem is to use high resolution terrestrial photos, but this requires a huge amount of manual work to remove occlusions. Another solution is to synthesize generic textures with a set of procedural rules and elementary patterns like bricks, roof tiles, doors and windows. This solution may give realistic textures but with no correlation to the ground truth. Instead of using pure procedural modelling, we present a method to extract information from aerial images and adapt the texture synthesis to each building. We describe a workflow allowing the user to drive the information extraction and to select the appropriate texture patterns. We also emphasize the importance of organizing knowledge about elementary patterns in a texture catalogue that allows physical information and semantic attributes to be attached and selection requests to be executed. Roofs are processed according to the detected building material. Façades are first described in terms of principal colours, then opening positions are detected and some window features are computed. These features allow selecting the most appropriate patterns from the texture catalogue. We tested this workflow on two samples with 20 cm and 5 cm resolution images. The roof texture synthesis and opening detection were successfully conducted on hundreds of buildings. The window characterization is still sensitive to the distortions inherent to the projection of aerial images onto the facades.
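
    One concrete step in the described workflow, describing each façade in terms of its principal colours before selecting catalogue patterns, can be approximated with a simple k-means clustering of pixel colours. The sketch below is a hedged stand-in (using scikit-learn and a synthetic image in place of a real rectified façade photo), not the paper's actual colour-description method.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic "facade" image: mostly a beige wall with darker window-like pixels.
wall = np.array([205, 190, 160])
window = np.array([60, 70, 90])
image = np.where(rng.random((120, 80, 1)) < 0.15, window, wall)
image = np.clip(image + rng.normal(0, 8, image.shape), 0, 255)

# Principal colours = k-means cluster centres of the pixel cloud; their
# weights (cluster sizes) indicate how much of the facade each colour covers.
pixels = image.reshape(-1, 3)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
weights = np.bincount(km.labels_) / len(km.labels_)

for centre, w in zip(km.cluster_centers_, weights):
    print(f"colour RGB ~ {centre.round(0)}, share = {w:.2f}")
```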

  16. Fine-grained information extraction from German transthoracic echocardiography reports.

    PubMed

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that get structured representations out of unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide this process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90 % of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with F1 = .989 (micro average) and F1 = .963 (macro average). As a result of keyword matching and restraint concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and on documents with uncommon layout. The developed terminology and the proposed information extraction system allow fine-grained information to be extracted from German semi-structured transthoracic echocardiography reports with very high precision and high recall on the majority of documents at the University Hospital of Würzburg. Extracted results populate a clinical data warehouse which supports clinical research.

  17. Missing binary data extraction challenges from Cochrane reviews in mental health and Campbell reviews with implications for empirical research.

    PubMed

    Spineli, Loukia M

    2017-12-01

    To report challenges encountered during the extraction process from Cochrane reviews in mental health and Campbell reviews, and to indicate their implications for the empirical performance of different methods to handle missingness. We used a collection of meta-analyses on binary outcomes collated from a previous work on missing outcome data. To evaluate the accuracy of their extraction, we developed specific criteria pertaining to the reporting of missing outcome data in systematic reviews. Using the most popular methods to handle missing binary outcome data, we investigated the implications of the accuracy of the extracted meta-analysis on the random-effects meta-analysis results. Of 113 meta-analyses from Cochrane reviews, 60 (53%) were judged as "unclearly" extracted (ie, no information on the outcome of completers but available information on how missing participants were handled) and 42 (37%) as "unacceptably" extracted (ie, no information on the outcome of completers as well as no information on how missing participants were handled). For the remaining meta-analyses, it was judged that data were "acceptably" extracted (ie, information on the completers' outcome was provided for all trials). Overall, "unclear" extraction overestimated the magnitude of the summary odds ratio and the between-study variance and additionally inflated the uncertainty of both meta-analytical parameters. The only eligible Campbell review was judged as "unclear." Depending on the extent of missingness, the reporting quality of the systematic reviews can greatly affect the accuracy of the extracted meta-analyses and, by extension, the empirical performance of different methods to handle missingness. Copyright © 2017 John Wiley & Sons, Ltd.
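
    The empirical comparison described here rests on standard random-effects meta-analysis of log odds ratios, where the accuracy of the extracted 2x2 counts feeds directly into the pooled estimate and the between-study variance. A minimal DerSimonian-Laird sketch in Python follows; the counts are invented for illustration, not taken from the Cochrane or Campbell reviews.

```python
import numpy as np

# Hypothetical 2x2 counts per trial: events/non-events in treatment and control.
trials = [(12, 88, 20, 80), (5, 45, 9, 41), (30, 170, 42, 158)]

y, v = [], []
for a, b, c, d in trials:                      # a/b: treatment, c/d: control
    y.append(np.log((a * d) / (b * c)))        # log odds ratio
    v.append(1 / a + 1 / b + 1 / c + 1 / d)    # its approximate variance
y, v = np.array(y), np.array(v)

# DerSimonian-Laird estimate of the between-study variance tau^2.
w = 1 / v
theta_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - theta_fixed) ** 2)
tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooled log odds ratio and its standard error.
w_star = 1 / (v + tau2)
theta_re = np.sum(w_star * y) / np.sum(w_star)
se_re = np.sqrt(1 / np.sum(w_star))

print(f"pooled OR = {np.exp(theta_re):.2f}, "
      f"95% CI = ({np.exp(theta_re - 1.96 * se_re):.2f}, "
      f"{np.exp(theta_re + 1.96 * se_re):.2f}), tau^2 = {tau2:.3f}")
```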

  18. Quality assessment of osteoporosis clinical practice guidelines for physical activity and safe movement: an AGREE II appraisal.

    PubMed

    Armstrong, James Jacob; Rodrigues, Isabel Braganca; Wasiuta, Tom; MacDermid, Joy C

    2016-01-01

    Many osteoporosis clinical practice guidelines are published, and the extent to which physical activity and safe movement are addressed varies. To better inform clinical decision-making, a quality assessment and structured analysis of recommendations was undertaken. Guideline quality varied substantially, and improvement is necessary in physical activity and safe movement recommendations. The purpose of the present study is to survey available osteoporosis clinical practice guidelines (CPGs) containing physical activity and safe movement recommendations in order to assess the methodological quality with which they were developed. An analysis of the various physical activity and safe movement recommendations was conducted to determine variability between CPGs. An online literature search revealed 19 CPGs meeting our inclusion criteria. Three independent scorers evaluated CPG quality using the Appraisal of Guidelines for Research and Evaluation version II (AGREE II) instrument. Two separate individuals used a standard table to extract relevant recommendations. Intra-reviewer AGREE II score agreement ranged from fair to good (intra-class correlation coefficient (ICC) = 0.34 to 0.65). The quality of the 19 included CPGs was variable (AGREE sub-scores: 14 to 100%). CPGs scored higher in the "scope and purpose" and "clarity of presentation" domains. They scored the lowest in "applicability" and "editorial independence." Four CPGs were classified as high quality, ten average quality, and five low quality. Most CPGs recommended weight-bearing, muscle-strengthening, and resistance exercises. Information on exercise dosage, progression, and contraindications was often absent. Immobility and movements involving spinal flexion and/or torsion were discouraged. There were several high-quality CPGs; however, variability in quality and lack of specific parameters for implementation necessitate caution and critical examination by readers. CPG development groups should pay special attention to the clinical applicability of their CPGs as well as fully disclosing conflicts of interest. CPGs were in general agreement regarding physical activity and safe movement recommendations. However, recommendations were often vague, and the more specific recommendations were inconsistent between CPGs.

  19. On extracting hadron multiplicities and unpolarized nucleon structure ratios from SIDIS data at the HERMES experiment

    NASA Astrophysics Data System (ADS)

    Linden-Levy, Loren Alexander

    2008-10-01

    We present an analysis using the world's largest data set of semi-inclusive deep inelastic scattering (SIDIS) in the kinematic range 0.1 < x < 0.6 at an average Q^2 of 2.5 GeV^2. These data were collected at the HERMES experiment, located in the east hall of the HERA accelerator, between the years 2000 and 2006. The hadron multiplicity from these scattering events is extracted for identified charged pions, kaons and protons from two different gaseous targets (H and D). For the hydrogen (deuterium) target 12.5 (16.68) million events were recorded. Using these hadron multiplicities, an attempt is made to extract unpolarized information about the parton momentum distribution functions (PDFs) inside the nucleon via the flavor tagging technique within the quark-parton model. In particular, one can exploit certain factorization assumptions and fragmentation symmetries to extract the valence quark ratio d_v/u_v and the light sea asymmetry (d̄ - ū)/(u - d) from the measured pion multiplicities on hydrogen and deuterium targets. The excellent particle identification available in the HERMES spectrometer, coupled with the overwhelming statistics available from the high-density end-of-fill running (especially in 2002 and 2004), makes the HERMES data invaluable for reinforcing the E866/NuSea Drell-Yan result on d̄/ū at a different experiment and from an entirely different physical process. These PDF extractions are also an important test of many typical assumptions made in SIDIS analyses and must be taken into consideration in light of the future facilities that propose to use this technique.

  20. Considerations on the Optimal and Efficient Processing of Information-Bearing Signals

    ERIC Educational Resources Information Center

    Harms, Herbert Andrew

    2013-01-01

    Noise is a fundamental hurdle that impedes the processing of information-bearing signals, specifically the extraction of salient information. Processing that is both optimal and efficient is desired; optimality ensures the extracted information has the highest fidelity allowed by the noise, while efficiency ensures limited resource usage. Optimal…

  1. Lattice QCD and physics beyond the Standard Model: an experimentalist perspective

    NASA Astrophysics Data System (ADS)

    Artuso, Marina

    2017-01-01

    The new frontier in elementary particle physics is to find evidence for new physics that may lead to a deeper understanding of observations such as the baryon-antibaryon asymmetry of the universe, mass hierarchy, dark matter, or dark energy, to name a few. Flavor physics provides a wealth of opportunities to find such signatures, and a vast body of data taken at e+e- b-factories and at hadron machines has provided valuable information, and a few tantalizing "tensions" with respect to the Standard Model predictions. While the window for new physics is still open, the chance that its manifestations will be subtle is very real. A vibrant experimental program is ongoing, and significant upgrades, such as the upgraded LHCb experiment at the LHC and Belle 2 at KEKb, are imminent. One of the challenges in extracting new physics from flavor physics data is the need to relate observed hadron decays to fundamental particles and interactions. The continuous improvement of Lattice QCD predictions is a key element to achieve success in this quest. Improvements in algorithms and hardware have led to predictions of increasing precision on several fundamental matrix elements, and to the continuous breaking of new ground, thus allowing a broader spectrum of measurements to become relevant to this quest. An important aspect of the experiment-lattice synergy is a comparison between lattice predictions and experiment for a variety of hadronic quantities. This talk summarizes current synergies between lattice QCD theory and flavor physics experiments, and gives some highlights of expectations from future upgrades. This work was supported by NSF.

  2. The extraction of N,N-dialkylamides III. A thermodynamical approach of the multicomponent extraction organic media by a statistical mechanic theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Condamines, N.; Musikas, C.; Turq, P.

    1993-04-01

    The non-ideality of multicomponent media is difficult to describe, especially for situations as complex as the extraction of metals into organic media. We present a simplified model which takes into account 'hard-sphere' effects and physical interactions between some solutes of the studied media in the case of actinide ion liquid-liquid extraction. We focus our interest on N,N-dialkylamide extractants, which have a strongly non-ideal behaviour. 24 refs., 10 figs., 6 tabs.

  3. Physical activity text messaging interventions in adults: a systematic review.

    PubMed

    Buchholz, Susan Weber; Wilbur, JoEllen; Ingram, Diana; Fogg, Louis

    2013-08-01

    Physical inactivity is a leading health risk factor for mortality worldwide. Researchers are examining innovative techniques including the use of mobile technology to promote physical activity. One such technology, text messaging, is emerging internationally as a method to communicate with and motivate individuals to engage in healthy behaviors, including physical activity. Review the existing scientific literature on adult physical activity text messaging interventions. This systematic review examined research papers that addressed physical activity text messaging intervention studies in adults. Using multiple databases, the search strategy included published English language studies through October 1, 2011. An author-developed data collection tool was used independently by two reviewers to extract and examine the selected study variables. The initial search resulted in the identification of 200 publications. Eleven publications representing 10 studies were included in the final review. Studies were conducted in seven countries with over half the studies being randomized controlled trials. Participants of the studies were predominantly young to middle aged women. Physical activity data were mainly obtained by self-report although three studies used pedometers or accelerometers. Interventions ranged from only sending out text messages to combining text messages with educational materials, staff support, and/or Internet technology. Minimal information was given regarding development or number of text messages used. The median effect size for the studies was 0.50. To date, using text messaging as a method to promote physical activity has only been studied by a small group of researchers. Current physical activity text messaging literature is characterized by small sample sizes, heterogeneous but positive effect sizes, and a lack of specificity as to the development of the text messages used in these studies. Further research in this area is imperative to facilitate the expansion of mobile technology to promote physical activity. © 2013 Sigma Theta Tau International.

  4. Studies of extraction and transport system for highly charged ion beam of 18 GHz superconducting electron cyclotron resonance ion source at Research Center for Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yorita, T., E-mail: yorita@rcnp.osaka-u.ac.jp; Hatanaka, K.; Fukuda, M.

    2014-02-15

    An 18 GHz superconducting electron cyclotron resonance ion source is installed to increase beam currents and to extend the variety of ions, especially for highly charged heavy ions, which can be accelerated by the cyclotrons of the Research Center for Nuclear Physics (RCNP), Osaka University. Beam production developments for several ions from B to Xe have already been done [T. Yorita, K. Hatanaka, M. Fukuda, M. Kibayashi, S. Morinobu, H. Okamura, and A. Tamii, Rev. Sci. Instrum. 79, 02A311 (2008) and T. Yorita, K. Hatanaka, M. Fukuda, M. Kibayashi, S. Morinobu, H. Okamura, and A. Tamii, Rev. Sci. Instrum. 81, 02A332 (2010)], and further studies of their beam extraction and transport have been done in order to increase the beam current more. The plasma electrode, extraction electrode, and einzel lens were modified. In particular, a negative voltage can be applied to the extraction electrode for beam extraction, and this works well to improve the extracted beam current. The extraction voltage dependences of transmission and emittance have also been studied to improve the current of the beam injected into the azimuthally varying field cyclotron at RCNP.

  5. HELIOGate, a Portal for the Heliophysics Community

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin

    2014-10-01

    Heliophysics is the branch of physics that investigates the interactions between the Sun and the other bodies of the solar system. Heliophysicists rely on data collected from numerous sources scattered across the Solar System. The data collected from these sources is processed to extract metadata and the metadata extracted in this fashion is then used to build indexes of features and events called catalogues. Heliophysicists also develop conceptual and mathematical models of the phenomena and the environment of the Solar System. More specifically, they investigate the physical characteristics of the phenomena and they simulate how they propagate throughout the Solar System with mathematical and physical abstractions called propagation models. HELIOGate aims at addressing the need to combine and orchestrate existing web services in a flexible and easily configurable fashion to tackle different scientific questions. HELIOGate also offers a tool capable of connecting to sizeable computation and storage infrastructures to execute data processing codes that are needed to calibrate raw data and to extract metadata.

  6. CRL/Brandeis: Description of the DIDEROT System as Used for MUC-5

    DTIC Science & Technology

    1993-01-01

    been evaluated in the 4th Message Understanding Conference (MUC-4), where it was required to extract information from 200 texts on South American... Email: jamesp@cs.brandeis.edu Abstract: This report describes the major developments over the last six months in completing the Diderot information extraction system for the MUC-5 evaluation. Diderot is an information extraction system built at CRL and Brandeis University over the past two

  7. Extracting important information from Chinese Operation Notes with natural language processing methods.

    PubMed

    Wang, Hui; Zhang, Weide; Zeng, Qiang; Li, Zuofeng; Feng, Kaiyan; Liu, Lei

    2014-04-01

    Extracting information from unstructured clinical narratives is valuable for many clinical applications. Although Natural Language Processing (NLP) methods have been studied extensively for electronic medical records (EMR), few studies have explored NLP for extracting information from Chinese clinical narratives. In this study, we report the development and evaluation of methods for extracting tumor-related information from operation notes of hepatic carcinomas written in Chinese. Using 86 operation notes manually annotated by physicians as the training set, we explored both rule-based and supervised machine-learning approaches. Evaluated on 29 unseen operation notes, our best approach yielded 69.6% precision, 58.3% recall and a 63.5% F-score. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC MATERIALS IN UNSATURATED GEOLOGICAL MATERIAL (EPA/600/SR-97/099)

    EPA Science Inventory

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  9. Effect of basic physical parameters to control plasma meniscus and beam halo formation in negative ion sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyamoto, K.; Okuda, S.; Nishioka, S.

    2013-09-14

    Our previous study shows that the curvature of the plasma meniscus causes the beam halo in negative ion sources: the negative ions extracted from the periphery of the meniscus are over-focused in the extractor due to the electrostatic lens effect, and consequently become the beam halo. In this article, the detailed physics of plasma meniscus and beam halo formation is investigated with a two-dimensional particle-in-cell simulation. It is shown that basic physical parameters such as the H⁻ extraction voltage and the effective electron confinement time significantly affect the formation of the plasma meniscus and the resultant beam halo, since the penetration of the electric field for negative ion extraction depends on these physical parameters. In particular, the electron confinement time depends on the characteristic time of electron escape along the magnetic field as well as the characteristic time of electron diffusion across the magnetic field. The plasma meniscus penetrates deeply into the source plasma region when the effective electron confinement time is short. In this case, the curvature of the plasma meniscus becomes large, and consequently the fraction of the beam halo increases.

  10. Quality characteristics of oil extracted from gamma irradiated peanut (Arachis hypogea L.)

    NASA Astrophysics Data System (ADS)

    Al-Bachir, Mahfouz

    2015-01-01

    The effect of gamma radiation and storage on the characteristics of oil extracted from peanut seeds has been investigated in this study. Peanut seeds underwent gamma irradiation at doses of 1, 2 and 3 kGy. The changes in chemical and physical attributes were observed immediately after irradiation and after 12 months of storage. The data obtained from the experiments showed that the irradiation process had no effect on chemical and physical qualities such as fatty acid composition, peroxide value, iodine value, saponification number, TBA value and color of the oil extracted from peanut seeds. In contrast, the peroxide, acidity and TBA values of the peanut oil decreased with storage time.

  11. Electron current extraction from a permanent magnet waveguide plasma cathode.

    PubMed

    Weatherford, B R; Foster, J E; Kamhawi, H

    2011-09-01

    An electron cyclotron resonance plasma produced in a cylindrical waveguide with external permanent magnets was investigated as a possible plasma cathode electron source. The configuration is desirable in that it eliminates the need for a physical antenna inserted into the plasma, the erosion of which limits operating lifetime. Plasma bulk density was found to be overdense in the source. Extraction currents over 4 A were achieved with the device. Measurements of extracted electron currents were similar to calculated currents, which were estimated using Langmuir probe measurements at the plasma cathode orifice and along the length of the external plume. The influence of facility effects and trace ionization in the anode-cathode gap are also discussed. © 2011 American Institute of Physics
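
    The comparison mentioned here, extracted current versus a current estimated from Langmuir probe measurements, is essentially the random thermal electron flux through the extraction orifice, I ≈ (1/4) e n_e v̄_e A with v̄_e = sqrt(8 k T_e / (π m_e)). A back-of-the-envelope sketch with assumed, not measured, plasma parameters:

```python
import math

# Physical constants (SI).
e = 1.602e-19        # elementary charge, C
k_B = 1.381e-23      # Boltzmann constant, J/K
m_e = 9.109e-31      # electron mass, kg

# Assumed plasma parameters near the extraction orifice (illustrative only).
n_e = 5e17           # electron density, m^-3
T_e_eV = 5.0         # electron temperature, eV
r_orifice = 5e-3     # orifice radius, m

T_e = T_e_eV * e / k_B                               # convert eV to Kelvin
v_bar = math.sqrt(8 * k_B * T_e / (math.pi * m_e))   # mean electron speed
area = math.pi * r_orifice ** 2

# Random thermal electron flux through the orifice area.
current = 0.25 * e * n_e * v_bar * area
print(f"estimated extractable electron current ~ {current:.2f} A")
```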

  12. A rule-based named-entity recognition method for knowledge extraction of evidence-based dietary recommendations

    PubMed Central

    2017-01-01

    Evidence-based dietary information represented as unstructured text is crucial information that needs to be accessible in order to help dietitians keep up with the new knowledge that arrives daily in newly published scientific reports. Different named-entity recognition (NER) methods have been introduced previously to extract useful information from the biomedical literature. They are focused on, for example, extracting gene mentions, protein mentions, relationships between genes and proteins, chemical concepts, and relationships between drugs and diseases. In this paper, we present a novel NER method, called drNER, for knowledge extraction of evidence-based dietary information. To the best of our knowledge this is the first attempt at extracting dietary concepts. DrNER is a rule-based NER that consists of two phases. The first involves the detection and determination of entity mentions, and the second involves the selection and extraction of the entities. We evaluate the method by using text corpora from heterogeneous sources, including text from several scientifically validated web sites and text from scientific publications. Evaluation of the method showed that drNER gives good results and can be used for knowledge extraction of evidence-based dietary recommendations. PMID:28644863

  13. Chaotic itinerancy within the coupled dynamics between a physical body and neural oscillator networks

    PubMed Central

    Mori, Hiroki; Okuyama, Yuji; Asada, Minoru

    2017-01-01

    Chaotic itinerancy is a phenomenon in which the state of a nonlinear dynamical system spontaneously explores and attracts certain states in a state space. From this perspective, the diverse behavior of animals and its spontaneous transitions lead to a complex coupled dynamical system, including a physical body and a brain. Herein, a series of simulations using different types of non-linear oscillator networks (i.e., regular, small-world, scale-free, random) with a musculoskeletal model (i.e., a snake-like robot) as a physical body are conducted to understand how the chaotic itinerancy of bodily behavior emerges from the coupled dynamics between the body and the brain. A behavior analysis (behavior clustering) and network analysis for the classified behavior are then applied. The former consists of feature vector extraction from the motions and classification of the movement patterns that emerged from the coupled dynamics. The network structures behind the classified movement patterns are revealed by estimating "information networks", distinct from the given non-linear oscillator networks, based on transfer entropy, which finds the information flow among neurons. The experimental results show that: (1) the number of movement patterns and their duration depend on the sensor ratio to control the balance of strength between the body and the brain dynamics and on the type of the given non-linear oscillator networks; and (2) two kinds of information networks are found behind two kinds of movement patterns with different durations by utilizing the complex network measures of clustering coefficient and shortest path length, which show a negative and a positive relationship, respectively, with the duration of the movement patterns. The current results seem promising for a future extension of the method to a more complicated body and environment. Several requirements are also discussed. PMID:28796797
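
    The "information networks" described here are built by estimating, for each ordered pair of neurons, the transfer entropy from one activity time series to another and keeping the strong links. A compact plug-in estimator for discretized series, offered as a simplified stand-in for the analysis in the paper, looks like this:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate of TE(X -> Y) for 1-step histories, in nats."""
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))   # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))
    singles_y = Counter(yd[:-1])
    n = len(yd) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / pairs_yx[(y0, x0)]
        p_y1_given_y = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * np.log(p_y1_given_yx / p_y1_given_y)
    return te

# Toy example: y partially copies x with a one-step delay, so TE(x->y) > TE(y->x).
rng = np.random.default_rng(4)
x = rng.standard_normal(5000)
y = np.roll(x, 1) * 0.8 + 0.2 * rng.standard_normal(5000)
print("TE(x->y) =", round(transfer_entropy(x, y), 3),
      " TE(y->x) =", round(transfer_entropy(y, x), 3))
```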

  14. Chaotic itinerancy within the coupled dynamics between a physical body and neural oscillator networks.

    PubMed

    Park, Jihoon; Mori, Hiroki; Okuyama, Yuji; Asada, Minoru

    2017-01-01

    Chaotic itinerancy is a phenomenon in which the state of a nonlinear dynamical system spontaneously explores and attracts certain states in a state space. From this perspective, the diverse behavior of animals and its spontaneous transitions lead to a complex coupled dynamical system, including a physical body and a brain. Herein, a series of simulations using different types of non-linear oscillator networks (i.e., regular, small-world, scale-free, random) with a musculoskeletal model (i.e., a snake-like robot) as a physical body are conducted to understand how the chaotic itinerancy of bodily behavior emerges from the coupled dynamics between the body and the brain. A behavior analysis (behavior clustering) and network analysis for the classified behavior are then applied. The former consists of feature vector extraction from the motions and classification of the movement patterns that emerged from the coupled dynamics. The network structures behind the classified movement patterns are revealed by estimating "information networks", distinct from the given non-linear oscillator networks, based on transfer entropy, which finds the information flow among neurons. The experimental results show that: (1) the number of movement patterns and their duration depend on the sensor ratio to control the balance of strength between the body and the brain dynamics and on the type of the given non-linear oscillator networks; and (2) two kinds of information networks are found behind two kinds of movement patterns with different durations by utilizing the complex network measures of clustering coefficient and shortest path length, which show a negative and a positive relationship, respectively, with the duration of the movement patterns. The current results seem promising for a future extension of the method to a more complicated body and environment. Several requirements are also discussed.

  15. Forensic drug intelligence and the rise of cryptomarkets. Part I: Studying the Australian virtual market.

    PubMed

    Broséus, Julian; Morelato, Marie; Tahtouh, Mark; Roux, Claude

    2017-10-01

    Analysing and understanding cryptomarkets is essential to become proactive in the fight against the illicit drug trade. Such research seeks to combine a diversity of indicators related to the virtual (darknet markets) and physical (the traditional "offline" market) aspects of the illicit drug trade to provide information on the distribution and consumption as well as to assess similarities/differences between the virtual and physical markets. This study analysed data that had previously been collected on cryptomarkets from December 2013 to March 2015. In this article, the data was extracted from two marketplaces, Evolution and Silk Road 2, and analysed to evaluate the illicit drug trade of the Australian virtual market (e.g. information about the supply and demand, trafficking flows, prices of illicit drugs and market share) and highlight its specificities. The results revealed the domestic nature of the virtual Australian illicit drug trade (i.e. Australian sellers essentially ship their products to local customers). This may explain the coherence between supply and demand. In particular, the virtual Australian illicit drug trade is dominated by amphetamine-type substances (ATS), mainly methamphetamine and 3,4-methylenedioxymethamphetamine (MDMA), and cannabis. Australia, as a shipping country, accounts for half of the methamphetamine offered and purchased on Silk Road 2. Moreover, it was observed that the online price fixed by Australian sellers for the considered illicit drugs is higher than for any other shipping country, which is in line with previous studies. Understanding the virtual and physical drug market necessitates the integration and fusion of different perspectives to capture the dynamic nature of drug trafficking, monitor its evolution and finally improve our understanding of the phenomenon so policy makers can make informed decisions. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Smart Extraction and Analysis System for Clinical Research.

    PubMed

    Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung

    2017-05-01

    With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issue of data preparation, structuring, and sharing for smooth clinical research. We developed a robust analysis system called the smart extraction and analysis system (SEAS) that consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for a descriptive and predictive analysis to compile the survival statistics and predict the future chance of survivability. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS is evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system for semistructured text is recorded at 99%, while that for unstructured text is 97%. Furthermore, the automated, unstructured information extraction has reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients are found in a terminal or dead state for the highest clinical stage of disease (level IV). Similarly, there is an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic extraction of information and survival prediction methods. SEAS has reduced the time and energy of human resources spent unnecessarily on manual tasks.
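
    As a minimal sketch of the survival-prediction half of such a pipeline (not the SEAS implementation itself), the snippet below fits a CART model with scikit-learn on placeholder clinical features; the feature layout and labels are hypothetical stand-ins.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        # Placeholder feature matrix: e.g. clinical stage, age and two lifestyle
        # risk flags; labels 0 = deceased, 1 = alive at follow-up (hypothetical).
        rng = np.random.default_rng(0)
        X = rng.random((200, 4))
        y = rng.integers(0, 2, size=200)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
        cart = DecisionTreeClassifier(criterion="gini", max_depth=4, random_state=0)
        cart.fit(X_train, y_train)
        print("held-out accuracy:", cart.score(X_test, y_test))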

  17. Forecasting induced seismicity rate and Mmax using calibrated numerical models

    NASA Astrophysics Data System (ADS)

    Dempsey, D.; Suckale, J.

    2016-12-01

    At Groningen, The Netherlands, several decades of induced seismicity from gas extraction has culminated in a M 3.6 event (mid 2012). From a public safety and commercial perspective, it is desirable to anticipate future seismicity outcomes at Groningen. One way to quantify earthquake risk is Probabilistic Seismic Hazard Analysis (PSHA), which requires an estimate of the future seismicity rate and its magnitude frequency distribution (MFD). This approach is effective at quantifying risk from tectonic events because the seismicity rate, once measured, is almost constant over timescales of interest. In contrast, rates of induced seismicity vary significantly over building lifetimes, largely in response to changes in injection or extraction. Thus, the key to extending PSHA to induced earthquakes is to estimate future changes of the seismicity rate in response to some proposed operating schedule. Numerical models can describe the physical link between fluid pressure, effective stress change, and the earthquake process (triggering and propagation). However, models with predictive potential of individual earthquakes face the difficulty of characterizing specific heterogeneity - stress, strength, roughness, etc. - at locations of interest. Modeling catalogs of earthquakes provides a means of averaging over this uncertainty, focusing instead on the collective features of the seismicity, e.g., its rate and MFD. The model we use incorporates fluid pressure and stress changes to describe nucleation and crack-like propagation of earthquakes on stochastically characterized 1D faults. This enables simulation of synthetic catalogs of induced seismicity from which the seismicity rate, location and MFD are extracted. A probability distribution for Mmax - the largest event in some specified time window - is also computed. Because the model captures the physics linking seismicity to changes in the reservoir, earthquake observations and operating information can be used to calibrate a model at a specific site (or, ideally, many models). This restricts analysis of future seismicity to likely parameter sets and provides physical justification for linking operational changes to subsequent seismicity. To illustrate these concepts, a recent study of prior and forecast seismicity at Groningen will be presented.
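
    The Mmax distribution mentioned above can be illustrated generically: given many synthetic catalogs, one tabulates the largest magnitude falling inside the time window of interest. The sketch below only shows that bookkeeping step, with randomly generated stand-in catalogs rather than output from the physics-based simulator described in the abstract.

        import numpy as np

        def mmax_samples(catalogs, t_start, t_end):
            """Largest magnitude per synthetic catalog within [t_start, t_end]."""
            out = []
            for times, mags in catalogs:
                in_window = (times >= t_start) & (times <= t_end)
                if in_window.any():
                    out.append(mags[in_window].max())
            return np.array(out)

        # Stand-in catalogs: uniform event times, exponential (GR-like) magnitudes.
        rng = np.random.default_rng(1)
        catalogs = [(rng.uniform(0.0, 10.0, 300), 1.0 + rng.exponential(0.5, 300))
                    for _ in range(1000)]
        samples = mmax_samples(catalogs, t_start=5.0, t_end=10.0)
        print("P(Mmax > 3.6) over the window ≈", (samples > 3.6).mean())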

  18. 30 CFR 702.10 - Information collection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Information collection. 702.10 Section 702.10... EXEMPTION FOR COAL EXTRACTION INCIDENTAL TO THE EXTRACTION OF OTHER MINERALS § 702.10 Information collection. The collections of information contained in §§ 702.11, 702.12, 702.13, 702.15 and 702.18 of this part...

  19. 30 CFR 702.10 - Information collection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Information collection. 702.10 Section 702.10... EXEMPTION FOR COAL EXTRACTION INCIDENTAL TO THE EXTRACTION OF OTHER MINERALS § 702.10 Information collection. The collections of information contained in §§ 702.11, 702.12, 702.13, 702.15 and 702.18 of this part...

  20. 30 CFR 702.10 - Information collection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 3 2010-07-01 2010-07-01 false Information collection. 702.10 Section 702.10... EXEMPTION FOR COAL EXTRACTION INCIDENTAL TO THE EXTRACTION OF OTHER MINERALS § 702.10 Information collection. The collections of information contained in §§ 702.11, 702.12, 702.13, 702.15 and 702.18 of this part...

  1. Integrating Information Extraction Agents into a Tourism Recommender System

    NASA Astrophysics Data System (ADS)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, their information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be desirable to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques in order to automatically extract and classify information from the Web. Its goal is to keep the system updated and obtain information about third-party services that are not offered by service providers inside the system.

  2. SU-E-J-252: A Motion Algorithm to Extract Physical and Motion Parameters of a Mobile Target in Cone-Beam Computed Tomographic Imaging Retrospective to Image Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, I; Ahmad, S; Alsbou, N

    Purpose: A motion algorithm was developed to extract the actual length, CT numbers and motion amplitude of a mobile target imaged with cone-beam CT (CBCT), retrospective to image reconstruction. Methods: The motion model considered a mobile target moving sinusoidally and employed three measurable parameters obtained from CBCT images (apparent length, CT-number level and gradient of the mobile target) to extract information about the actual length and CT-number value of the stationary target and the motion amplitude. The algorithm was verified experimentally with a mobile phantom setup that has three targets of different sizes manufactured from homogeneous tissue-equivalent gel material embedded in a thorax phantom. The phantom moved sinusoidally in one direction using eight amplitudes (0-20 mm) and a frequency of 15 cycles per minute. The model required imaging parameters such as slice thickness and imaging time. Results: The motion algorithm extracted three unknown parameters (length of the target, CT-number level and motion amplitude) for a mobile target retrospective to CBCT image reconstruction. The algorithm relates the three unknown parameters to the measurable apparent length, CT-number level and gradient for well-defined mobile targets obtained from CBCT images. The motion model agreed with measured apparent lengths, which were dependent on the actual length of the target and the motion amplitude. The cumulative CT number for a mobile target was dependent on the CT-number level of the stationary target and the motion amplitude. The gradient of the CT distribution of a mobile target depends on the stationary CT-number level, the actual target length along the direction of motion, and the motion amplitude. Motion frequency and phase did not affect the elongation and CT-number distributions of mobile targets when the imaging time included several motion cycles. Conclusion: The motion algorithm developed in this study has potential applications in diagnostic CT imaging and radiotherapy to extract the actual length, size and CT numbers distorted by motion in CBCT imaging. The model also provides further information about the motion of the target.

  3. Law for the Welfare of Physically Disabled Persons, 1949 (The Latest Amendment Was in 1986).

    ERIC Educational Resources Information Center

    Japanese Society for Rehabilitation of the Disabled, Tokyo.

    This document presents the text of the 1949 Japanese Law for the Welfare of Physically Disabled Persons and brief extracts of later amendments. Sections in Chapter 1 cover definitions, the Advisory Council on Welfare of Physically Disabled Persons, and service providers. Chapter II covers welfare measures such as the physically disabled person's…

  4. Conception of Self-Construction Production Scheduling System

    NASA Astrophysics Data System (ADS)

    Xue, Hai; Zhang, Xuerui; Shimizu, Yasuhiro; Fujimura, Shigeru

    With the high-speed innovation of information technology, many production scheduling systems have been developed. However, they require extensive customization for each individual production environment, so a large investment in development and maintenance is indispensable. The way scheduling systems are constructed should therefore change. The final objective of this research is to develop a system that builds itself by extracting scheduling techniques automatically from daily production scheduling work, so that the required investment is reduced. This extraction mechanism should be applicable to various production processes to ensure interoperability. Using the master information extracted by the system, production scheduling operators can be supported so that scheduling work is accelerated easily and accurately, without any restriction on scheduling operations. By installing this extraction mechanism, a scheduling system can be introduced without large customization costs. In this paper, a model for expressing a scheduling problem is first proposed. A guideline for extracting the scheduling information and using it is then presented, and some applied functions based on it are also proposed.

  5. Extracting information from 0νββ decay and LHC pp-cross sections: Limits on the left-right mixing angle and right-handed boson mass

    NASA Astrophysics Data System (ADS)

    Civitarese, O.; Suhonen, J.; Zuber, K.

    2015-10-01

    The existence of massive neutrinos forces the extension of the Standard Model of electroweak interactions to accommodate them and/or right-handed currents. This is one of the fundamental questions in today's physics. Its consequences would be reflected in several decay processes, such as the very exotic nuclear double-beta decay. On the other hand, high-energy proton-proton reactions of the type performed at the LHC accelerator can provide information about the existence of a right-handed generation of the W and Z bosons. Here we address the possibility of performing a joint analysis of the results reported by the ATLAS and CMS collaborations (σ(pp → 2l + jets)) and the latest measurements of nuclear double-beta decays reported by the GERDA and EXO collaborations.

  6. Sequential visibility-graph motifs

    NASA Astrophysics Data System (ADS)

    Iacovacci, Jacopo; Lacasa, Lucas

    2016-04-01

    Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
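
    As a hedged, self-contained illustration of the concept (not the authors' implementation), the sketch below builds a natural visibility graph from a scalar series and tallies the adjacency patterns of n consecutive nodes, i.e. a sequential motif profile of size n.

        from collections import Counter
        import numpy as np

        def visibility_edges(y):
            """Natural visibility graph edges of a 1-D series: node i sees node j
            if every intermediate sample lies strictly below the line i-j."""
            y = np.asarray(y, dtype=float)
            edges = set()
            for i in range(len(y) - 1):
                for j in range(i + 1, len(y)):
                    k = np.arange(i + 1, j)
                    if k.size == 0 or np.all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)):
                        edges.add((i, j))
            return edges

        def sequential_motif_profile(y, size=4):
            """Count the visibility patterns of windows of `size` consecutive nodes."""
            edges = visibility_edges(y)
            profile = Counter()
            for start in range(len(y) - size + 1):
                key = tuple(sorted((a - start, b - start)
                                   for a in range(start, start + size)
                                   for b in range(start, start + size)
                                   if a < b and (a, b) in edges))
                profile[key] += 1
            return profile

        series = np.random.rand(200)
        print(sequential_motif_profile(series, size=4).most_common(3))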

  7. Towards NIRS-based hand movement recognition.

    PubMed

    Paleari, Marco; Luciani, Riccardo; Ariano, Paolo

    2017-07-01

    This work reports preliminary results on hand movement recognition with Near InfraRed Spectroscopy (NIRS) and surface ElectroMyoGraphy (sEMG). Whether based on physical contact (touchscreens, data-gloves, etc.), vision techniques (Microsoft Kinect, Sony PlayStation Move, etc.), or other modalities, hand movement recognition is a pervasive function in today's environment and is at the base of many gaming, social, and medical applications. Although, in recent years, the use of muscle information extracted by sEMG has spread from medical applications to the consumer world, this technique still falls short when dealing with movements of the hand. We tested NIRS as a technique to obtain another point of view on the muscle phenomena and showed that, within a specific selection of movements, NIRS can be used to recognize movements and return information regarding muscles at different depths. Furthermore, we propose three different multimodal movement recognition approaches and compare their performances.

  8. Quasi-Epipolar Resampling of High Resolution Satellite Stereo Imagery for Semi Global Matching

    NASA Astrophysics Data System (ADS)

    Tatar, N.; Saadatseresht, M.; Arefi, H.; Hadavand, A.

    2015-12-01

    Semi-global matching is a well-known stereo matching algorithm in the photogrammetric and computer vision communities. Epipolar images are assumed as input to this algorithm. The epipolar geometry of linear array scanners does not follow straight lines, as it does in the case of frame cameras. Traditional epipolar resampling algorithms demand rational polynomial coefficients (RPCs), a physical sensor model or ground control points. In this paper we propose a new epipolar resampling method that works without the need for this information. In the proposed method, automatic feature extraction algorithms are employed to generate corresponding features for registering the stereo pairs. The original images are also divided into small tiles. In this way, by omitting the need for extra information, the speed of the matching algorithm is increased and the memory demand is decreased. Our experiments on a GeoEye-1 stereo pair captured over Qom city in Iran demonstrate that the epipolar images are generated with sub-pixel accuracy.

  9. Systematics of the electric dipole response in stable tin isotopes

    NASA Astrophysics Data System (ADS)

    Bassauer, Sergej; von Neumann-Cosel, Peter; Tamii, Atsushi

    2018-05-01

    The electric dipole response is an important property of heavy nuclei. Precise information on the electric dipole response yields the electric dipole polarisability, which in turn allows important constraints on the neutron-skin thickness in heavy nuclei and on parameters of the symmetry energy to be extracted. The tin isotope chain is particularly suited for a systematic study of the dependence of the electric dipole response on neutron excess, as it provides a wide mass range of accessible isotopes with little change of the underlying structure. Recently an inelastic proton scattering experiment at forward angles including 0º on 112,116,124Sn was performed at the Research Centre for Nuclear Physics (RCNP), Japan, with a focus on the low-energy dipole strength and the polarisability. First results are presented here. Using data from an earlier proton scattering experiment on 120Sn, the gamma strength function and level density are determined for this nucleus.
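
    For reference, the dipole polarisability is commonly obtained from the measured photoabsorption cross section through the inverse-energy-weighted sum rule; a standard form of this relation (assumed here as background, not quoted from the paper) is

        \alpha_D \;=\; \frac{\hbar c}{2\pi^{2}} \int_{0}^{\infty} \frac{\sigma_{\mathrm{abs}}(E)}{E^{2}}\, dE ,

    so that precise (p,p') data at forward angles, which constrain σ_abs(E) down to low excitation energies, directly constrain α_D.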

  10. Magnetosensitive e-skins with directional perception for augmented reality

    PubMed Central

    Cañón Bermúdez, Gilbert Santiago; Karnaushenko, Dmitriy D.; Karnaushenko, Daniil; Lebanov, Ana; Bischoff, Lothar; Kaltenbrunner, Martin; Fassbender, Jürgen; Schmidt, Oliver G.; Makarov, Denys

    2018-01-01

    Electronic skins equipped with artificial receptors are able to extend our perception beyond the modalities that have naturally evolved. These synthetic receptors offer complementary information on our surroundings and endow us with novel means of manipulating physical or even virtual objects. We realize highly compliant magnetosensitive skins with directional perception that enable magnetic cognition, body position tracking, and touchless object manipulation. Transfer printing of eight high-performance spin valve sensors arranged into two Wheatstone bridges onto 1.7-μm-thick polyimide foils ensures mechanical imperceptibility. This represents a new class of interactive devices extracting information from the surroundings through magnetic tags. We demonstrate this concept in augmented reality systems with virtual knob-turning functions and the operation of virtual dialing pads, based on the interaction with magnetic fields. This technology will enable a cornucopia of applications from navigation, motion tracking in robotics, regenerative medicine, and sports and gaming to interaction in supplemented reality. PMID:29376121

  11. Development of user-friendly and interactive data collection system for cerebral palsy.

    PubMed

    Raharjo, I; Burns, T G; Venugopalan, J; Wang, M D

    2016-02-01

    Cerebral palsy (CP) is a permanent motor disorder that appears at an early age and requires multiple tests to assess the physical and mental capabilities of the patients. Current medical record data collection systems, e.g., EPIC, employed for CP are very general, difficult to navigate, and prone to errors. The data cannot easily be extracted, which limits data analysis on this rich source of information. To overcome these limitations, we designed and prototyped a database with a graphical user interface geared towards clinical research specifically in CP. The platform, with a MySQL and Java framework, is reliable, secure, and can be easily integrated with other programming languages for data analysis such as MATLAB. This database with GUI design is a promising tool for data collection and can be applied in many different fields aside from CP to infer useful information out of the vast amount of data being collected.

  12. Development of user-friendly and interactive data collection system for cerebral palsy

    PubMed Central

    Raharjo, I.; Burns, T. G.; Venugopalan, J.; Wang., M. D.

    2016-01-01

    Cerebral palsy (CP) is a permanent motor disorder that appears at an early age and requires multiple tests to assess the physical and mental capabilities of the patients. Current medical record data collection systems, e.g., EPIC, employed for CP are very general, difficult to navigate, and prone to errors. The data cannot easily be extracted, which limits data analysis on this rich source of information. To overcome these limitations, we designed and prototyped a database with a graphical user interface geared towards clinical research specifically in CP. The platform, with a MySQL and Java framework, is reliable, secure, and can be easily integrated with other programming languages for data analysis such as MATLAB. This database with GUI design is a promising tool for data collection and can be applied in many different fields aside from CP to infer useful information out of the vast amount of data being collected. PMID:28133638

  13. Can quantum probes satisfy the weak equivalence principle?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seveso, Luigi, E-mail: luigi.seveso@unimi.it; Paris, Matteo G.A.; INFN, Sezione di Milano, I-20133 Milano

    We address the question whether quantum probes in a gravitational field can be considered as test particles obeying the weak equivalence principle (WEP). A formulation of the WEP is proposed which applies also in the quantum regime, while maintaining the physical content of its classical counterpart. Such a formulation requires that the introduction of a gravitational field does not modify the Fisher information about the mass of a freely falling probe that is extractable through measurements of its position. We discover that, while in a uniform field quantum probes satisfy our formulation of the WEP exactly, gravity gradients can encode nontrivial information about the particle's mass in its wavefunction, leading to violations of the WEP. - Highlights: • Can quantum probes under gravity be approximated as test-bodies? • A formulation of the weak equivalence principle for quantum probes is proposed. • Quantum probes are found to violate it as a matter of principle.
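
    For context, the (classical) Fisher information referred to above is the standard quantity: for position measurements distributed as p(x|m), the information about the mass m is

        F(m) \;=\; \int \frac{\left[\partial_m\, p(x\mid m)\right]^{2}}{p(x\mid m)}\, dx ,

    and the formulation sketched in the abstract asks whether switching on a gravitational field changes F(m). This rendering is the generic textbook definition, not an equation quoted from the paper.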

  14. Extracting information from 0νββ decay and LHC pp-cross sections: Limits on the left-right mixing angle and right-handed boson mass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Civitarese, O., E-mail: osvaldo.civitarese@fisica.unlp.edu.ar; Suhonen, J.; Zuber, K.

    2015-10-28

    The existence of massive neutrinos forces the extension of the Standard Model of electroweak interactions to accommodate them and/or right-handed currents. This is one of the fundamental questions in today's physics. Its consequences would be reflected in several decay processes, such as the very exotic nuclear double-beta decay. On the other hand, high-energy proton-proton reactions of the type performed at the LHC accelerator can provide information about the existence of a right-handed generation of the W and Z bosons. Here we address the possibility of performing a joint analysis of the results reported by the ATLAS and CMS collaborations (σ(pp → 2l + jets)) and the latest measurements of nuclear double-beta decays reported by the GERDA and EXO collaborations.

  15. Model for Semantically Rich Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Poux, F.; Neuville, R.; Hallot, P.; Billen, R.

    2017-10-01

    This paper proposes an interoperable model for managing high dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing a 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast, but processing routines and data models lack the knowledge needed to reason from information extraction rather than interpretation. The enhanced Smart Point Cloud model developed here brings intelligence to point clouds via three connected meta-models, while linking available knowledge and classification procedures that permit semantic injection. Interoperability drives the adaptation of the model to potentially many applications through specialized domain ontologies. A first prototype is implemented in Python with a PostgreSQL database and allows semantic and spatial concepts to be combined for basic hybrid queries on different point clouds.
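
    A minimal sketch of what such a hybrid query could look like in the Python/PostgreSQL setting mentioned above, assuming a PostGIS-enabled database; the table and column names ("points", "semantic_class", "geom") are hypothetical placeholders, not taken from the paper.

        import psycopg2

        conn = psycopg2.connect("dbname=pointclouds user=reader")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT id FROM points
                WHERE semantic_class = %s
                  AND ST_3DDWithin(geom, ST_MakePoint(%s, %s, %s), %s)
                """,
                ("window", 1.0, 2.0, 0.5, 2.0),  # class label, x, y, z, radius in metres
            )
            print(cur.fetchall())
        conn.close()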

  16. On the three primordial numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gobbetti, Roberto; Pajer, Enrico; Roest, Diederik, E-mail: r.gobbetti@uu.nl, E-mail: enrico.pajer@gmail.com, E-mail: d.roest@rug.nl

    2015-09-01

    Cosmological observations have provided us with the measurement of just three numbers that characterize the very early universe: 1−n_s, N and ln Δ_R^2. Although each of the three numbers individually carries limited information about the physics of inflation, one may hope to extract non-trivial information from relations among them. Invoking minimality, namely the absence of ad hoc large numbers, we find two viable and mutually exclusive inflationary scenarios. The first is the well-known inverse relation between 1−n_s and N. The second implies a new relation between 1−n_s and ln Δ_R^2, which might provide us with a handle on the beginning of inflation and predicts the intriguing lower bound on the tensor-to-scalar ratio r > 0.006 (95% CL).

  17. The Unified Database for BM@N experiment data handling

    NASA Astrophysics Data System (ADS)

    Gertsenberger, Konstantin; Rogachevsky, Oleg

    2018-04-01

    The article describes the developed Unified Database designed as a comprehensive relational data storage for the BM@N experiment at the Joint Institute for Nuclear Research in Dubna. The BM@N experiment, which is one of the main elements of the first stage of the NICA project, is a fixed target experiment at extracted Nuclotron beams of the Laboratory of High Energy Physics (LHEP JINR). The structure and purposes of the BM@N setup are briefly presented. The article considers the scheme of the Unified Database, its attributes and implemented features in detail. The use of the developed BM@N database provides correct multi-user access to actual information of the experiment for data processing. It stores information on the experiment runs, detectors and their geometries, and the different configuration, calibration and algorithm parameters used in offline data processing. User interfaces, an important part of any database, are also presented.

  18. [Construction of chemical information database based on optical structure recognition technique].

    PubMed

    Lv, C Y; Li, M N; Zhang, L R; Liu, Z M

    2018-04-18

    To create a protocol that can be used to construct a chemical information database from scientific literature quickly and automatically. Scientific literature, patents and technical reports from different chemical disciplines were collected and stored in PDF format as the fundamental dataset. Chemical structures were transformed from published documents and images into machine-readable data by using name conversion technology and the optical structure recognition tool CLiDE. In the process of molecular structure information extraction, Markush structures were enumerated into well-defined monomer molecules by means of the QueryTools in the molecule editor ChemDraw. The document management software EndNote X8 was applied to acquire bibliographical references including title, author, journal and year of publication. The text mining toolkit ChemDataExtractor was adopted to retrieve information that could be used to populate a structured chemical database from figures, tables, and textual paragraphs. After this step, detailed manual revision and annotation were conducted in order to ensure the accuracy and completeness of the data. In addition to the literature data, the computing simulation platform Pipeline Pilot 7.5 was utilized to calculate physical and chemical properties and predict molecular attributes. Furthermore, the open database ChEMBL was linked to fetch known bioactivities, such as indications and targets. After information extraction and data expansion, five separate metadata files were generated, including the molecular structure data file, molecular information, bibliographical references, predictable attributes and known bioactivities. Using the canonical simplified molecular-input line-entry specification (SMILES) as the primary key, the metadata files were associated through common key nodes, including molecule number and PDF number, to construct an integrated chemical information database. A reasonable construction protocol for a chemical information database was created successfully. A total of 174 research articles and 25 reviews published in Marine Drugs from January 2015 to June 2016 were collected as the essential data source, and an elementary marine natural product database named PKU-MNPD was built in accordance with this protocol, which contained 3 262 molecules and 19 821 records. This data aggregation protocol is of great help for constructing chemical information databases from original documents with accuracy, comprehensiveness and efficiency. The structured chemical information database can facilitate access to medical intelligence and accelerate the transformation of scientific research achievements.
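
    The canonical-SMILES-as-primary-key idea can be illustrated with an open-source analogue; the protocol itself relies on ChemDraw, Pipeline Pilot and ChEMBL, so the RDKit-based sketch below is only a stand-in for that step, with an arbitrary example molecule.

        from rdkit import Chem
        from rdkit.Chem import Crippen, Descriptors

        def make_record(smiles):
            """Return a minimal metadata record keyed by canonical SMILES."""
            mol = Chem.MolFromSmiles(smiles)
            if mol is None:
                return None
            return {
                "canonical_smiles": Chem.MolToSmiles(mol, canonical=True),  # primary key
                "mol_weight": Descriptors.MolWt(mol),
                "clogp": Crippen.MolLogP(mol),
            }

        print(make_record("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin as a stand-in molecule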

  19. Extraction of cellulose from pistachio shell and physical and mechanical characterisation of cellulose-based nanocomposites

    NASA Astrophysics Data System (ADS)

    Movva, Mounika; Kommineni, Ravindra

    2017-04-01

    Cellulose is an important nanoentity that has been used for the preparation of composites. The present work focuses on the extraction of cellulose from pistachio shell and the preparation of a partially degradable nanocomposite with the extracted cellulose. Physical and microstructural characteristics of nanocellulose extracted from pistachio shell powder (PSP) through various stages of chemical treatment are identified by scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), X-ray powder diffraction (XRD), and thermogravimetric analysis (TGA). The characterized nanocellulose is then reinforced in a polyester matrix to fabricate nanocellulose-based composites according to the ASTM standard. The performance of the resulting nanocellulose composite is evaluated from a mechanical perspective through tensile and flexural loading. SEM, FTIR, and XRD showed that the extraction process is efficient in obtaining 95% crystalline cellulose. The cellulose also showed good thermal stability, with a peak thermal degradation temperature of 361 °C. When reinforced in the matrix material at a weight fraction of 5%, such cellulose showed a noteworthy rise in tensile and flexural strengths of 43 MPa and 127 MPa, respectively.

  20. Spent brewer's yeast extract as an ingredient in cooked hams.

    PubMed

    Pancrazio, Gaston; Cunha, Sara C; de Pinho, Paula Guedes; Loureiro, Mónica; Meireles, Sónia; Ferreira, Isabel M P L V O; Pinho, Olívia

    2016-11-01

    This work describes the effect of the incorporation of 1% spent yeast extract into cooked hams. Physical, chemical and sensorial characteristics and changes during 12 and 90 days of storage were evaluated on control and treated cooked hams processed for 1.5, 2.0, 2.5 or 3 h. Spent yeast extract addition increased hardness, chewiness, ash, protein and free amino acid content. Similar volatile profiles were obtained, although there were some quantitative differences. No advantages were observed for increased cooking time. No significant differences were observed for physical and sensorial parameters of cooked hams with spent yeast extract at 12 and 90 days post production, but His, aldehydes and esters increased at the end of storage. This behaviour was similar to that observed for control hams. The higher hardness of cooked ham with 1% yeast extract was due to the stronger gel formed during cooking and was maintained during storage. This additive acts as a gel stabilizer for cooked ham production and could potentially improve other processing characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource

    NASA Astrophysics Data System (ADS)

    Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.

    2015-12-01

    Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers, the secure storage of original documents, and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS, and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for complex, and continually improving, information integration and data extraction applications. We have developed one such application, which we present as an example, and invite new collaborations to develop other such applications.
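
    As a hedged sketch of how a downstream application might query such an infrastructure (the abstract names ElasticSearch for discovery and MongoDB for the derived text/XML products; the index, collection and field names below are hypothetical placeholders, not GeoDeepDive's actual schema):

        from elasticsearch import Elasticsearch
        from pymongo import MongoClient

        # 1) Discovery: find documents of potential relevance to a data-extraction task.
        es = Elasticsearch("http://localhost:9200")
        hits = es.search(index="articles", query={"match": {"text": "stromatolite"}})

        # 2) Retrieval: fetch the pre-processed NLP products for each hit from MongoDB.
        products = MongoClient("mongodb://localhost:27017")["library"]["nlp_products"]
        for hit in hits["hits"]["hits"]:
            doc = products.find_one({"_id": hit["_id"]})
            if doc is not None:
                print(doc["_id"], len(doc.get("sentences", [])))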

  2. Ethical experiential learning in medical, nursing and allied health education: A narrative review.

    PubMed

    Grace, Sandra; Innes, Ev; Patton, Narelle; Stockhausen, Lynette

    2017-04-01

    Students enrolled in medical, nursing and health science programs often participate in experiential learning in their practical classes. Experiential learning includes peer physical examination and peer-assisted learning where students practise clinical skills on each other. To identify effective strategies that enable ethical experiential learning for health students during practical classes. A narrative review of the literature. Pubmed, Cinahl and Scopus databases were searched because they include most of the health education journals where relevant articles would be published. A data extraction framework was developed to extract information from the included papers. Data were entered into a fillable form in Google Docs. Findings from identified studies were extracted to a series of tables (e.g. strategies for fostering ethical conduct; facilitators and barriers to peer-assisted learning). Themes were identified from these findings through a process of line-by-line coding and organisation of codes into descriptive themes using a constant comparative method. Finally, understandings and hypotheses of relevance to our research question were generated from the descriptive themes. A total of 35 articles were retrieved that met the inclusion criteria. A total of 13 strategies for ethical experiential learning were identified and one evaluation was reported. The most frequently reported strategies were gaining written informed consent from students, providing information about the benefits of experiential learning and what to expect in practical classes, and facilitating discussions in class about potential issues. Contexts that facilitated participation in experiential learning included allowing students to choose their own groups, making participation voluntary, and providing adequate supervision, feedback and encouragement. A total of 13 strategies for ethical experiential learning were identified in the literature. A formal process for written consent was evaluated as effective; the effectiveness of other strategies remains to be determined. A comprehensive framework that integrates all recommendations from the literature is needed to guide future research and practice of ethical experiential learning in health courses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Exercise and nutrition interventions in advanced lung cancer: a systematic review

    PubMed Central

    Payne, C.; Larkin, P.J.; McIlfatrick, S.; Dunwoody, L.; Gracey, J.H.

    2013-01-01

    In this systematic review, we sought to evaluate the effect of physical activity or nutrition interventions (or both) in adults with advanced non-small-cell lung cancer (nsclc). Methods A systematic search for relevant clinical trials was conducted in 6 electronic databases, by hand searching, and by contacting key investigators. No limits were placed on study language. Information about recruitment rates, protocol adherence, patient-reported and clinical outcome measures, and study conclusions was extracted. Methodologic quality and risk of bias in each study was assessed using validated tools. Main Results Six papers detailing five studies involving 203 participants met the inclusion criteria. Two of the studies were single-cohort physical activity studies (54 participants), and three were controlled nutrition studies (149 participants). All were conducted in an outpatient setting. None of the included studies combined physical activity with nutrition interventions. Conclusions Our systematic review suggests that exercise and nutrition interventions are not harmful and may have beneficial effects on unintentional weight loss, physical strength, and functional performance in patients with advanced nsclc. However, the observed improvements must be interpreted with caution, because findings were not consistent across the included studies. Moreover, the included studies were small and at significant risk of bias. More research is required to ascertain the optimal physical activity and nutrition interventions in advanced inoperable nsclc. Specifically, the potential benefits of combining physical activity with nutrition counselling have yet to be adequately explored in this population. PMID:23904771

  4. User-centered evaluation of Arizona BioPathway: an information extraction, integration, and visualization system.

    PubMed

    Quiñones, Karin D; Su, Hua; Marshall, Byron; Eggers, Shauna; Chen, Hsinchun

    2007-09-01

    Explosive growth in biomedical research has made automated information extraction, knowledge integration, and visualization increasingly important and critically needed. The Arizona BioPathway (ABP) system extracts and displays biological regulatory pathway information from the abstracts of journal articles. This study uses relations extracted from more than 200 PubMed abstracts presented in a tabular and graphical user interface with built-in search and aggregation functionality. This paper presents a task-centered assessment of the usefulness and usability of the ABP system focusing on its relation aggregation and visualization functionalities. Results suggest that our graph-based visualization is more efficient in supporting pathway analysis tasks and is perceived as more useful and easier to use as compared to a text-based literature-viewing method. Relation aggregation significantly contributes to knowledge-acquisition efficiency. Together, the graphic and tabular views in the ABP Visualizer provide a flexible and effective interface for pathway relation browsing and analysis. Our study contributes to pathway-related research and biological information extraction by assessing the value of a multiview, relation-based interface that supports user-controlled exploration of pathway information across multiple granularities.

  5. 78 FR 39001 - 30-Day Notice of Proposed Information Collection: Uniform Physical Standards and Physical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... Information Collection: Uniform Physical Standards and Physical Inspection Requirements AGENCY: Office of the... Information Collection Title of Information Collection: Uniform Physical Standards and Physical Inspection... for conducting physical inspections of the properties are HUD, the lender or the owner. Owners/Agents...

  6. Collective Influence of Multiple Spreaders Evaluated by Tracing Real Information Flow in Large-Scale Social Networks

    NASA Astrophysics Data System (ADS)

    Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A.

    2016-10-01

    Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called “Collective Influence (CI)” has been put forward through collective influence maximization. In contrast to heuristic methods evaluating nodes’ significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI applies to the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from data to construct “virtual” information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce a larger scale of information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes’ importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
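
    For orientation, the collective influence score is usually defined in the CI literature as CI_l(i) = (k_i − 1) Σ_{j∈∂Ball(i,l)} (k_j − 1), the sum running over nodes on the frontier of the ball of radius l around node i. The sketch below ranks nodes by this score on a synthetic graph; it is a simplified single-pass ranking (the full CI algorithm removes the top node and recomputes adaptively) and is not the authors' code.

        import networkx as nx

        def collective_influence(g, node, radius=2):
            """CI_l(i) = (k_i - 1) * sum of (k_j - 1) over the ball frontier."""
            distances = nx.single_source_shortest_path_length(g, node, cutoff=radius)
            frontier = [j for j, d in distances.items() if d == radius]
            return (g.degree(node) - 1) * sum(g.degree(j) - 1 for j in frontier)

        g = nx.barabasi_albert_graph(1000, 3, seed=0)
        ranked = sorted(g.nodes, key=lambda n: collective_influence(g, n), reverse=True)
        print("candidate top spreaders:", ranked[:5])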

  7. Collective Influence of Multiple Spreaders Evaluated by Tracing Real Information Flow in Large-Scale Social Networks

    PubMed Central

    Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A.

    2016-01-01

    Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called “Collective Influence (CI)” has been put forward through collective influence maximization. In contrast to heuristic methods evaluating nodes’ significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI applies to the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from data to construct “virtual” information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce a larger scale of information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes’ importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community. PMID:27782207

  8. Fusing Sensor Paradigms to Acquire Chemical Information: An Integrative Role for Smart Biopolymeric Hydrogels

    PubMed Central

    Kim, Eunkyoung; Liu, Yi; Ben-Yoav, Hadar; Winkler, Thomas E.; Yan, Kun; Shi, Xiaowen; Shen, Jana; Kelly, Deanna L.; Ghodssi, Reza; Bentley, William E.

    2017-01-01

    The Information Age transformed our lives but it has had surprisingly little impact on the way chemical information (e.g., from our biological world) is acquired, analyzed and communicated. Sensor systems are poised to change this situation by providing rapid access to chemical information. This access will be enabled by technological advances from various fields: biology enables the synthesis, design and discovery of molecular recognition elements as well as the generation of cell-based signal processors; physics and chemistry are providing nano-components that facilitate the transmission and transduction of signals rich with chemical information; microfabrication is yielding sensors capable of receiving these signals through various modalities; and signal processing analysis enhances the extraction of chemical information. The authors contend that integral to the development of functional sensor systems will be materials that (i) enable the integrative and hierarchical assembly of various sensing components (for chemical recognition and signal transduction) and (ii) facilitate meaningful communication across modalities. It is suggested that stimuli-responsive self-assembling biopolymers can perform such integrative functions, and redox provides modality-spanning communication capabilities. Recent progress toward the development of electrochemical sensors to manage schizophrenia is used to illustrate the opportunities and challenges for enlisting sensors for chemical information processing. PMID:27616350

  9. Acquiring 3-D information about thick objects from differential interference contrast images using texture extraction

    NASA Astrophysics Data System (ADS)

    Sierra, Heidy; Brooks, Dana; Dimarzio, Charles

    2010-07-01

    The extraction of 3-D morphological information about thick objects is explored in this work. We extract this information from 3-D differential interference contrast (DIC) images by applying a texture detection method. Texture extraction methods have been successfully used in different applications to study biological samples. A 3-D texture image is obtained by applying a local entropy-based texture extraction method. The use of this method to detect regions of blastocyst mouse embryos that are used in assisted reproduction techniques such as in vitro fertilization is presented as an example. Results demonstrate the potential of using texture detection methods to improve morphological analysis of thick samples, which is relevant to many biomedical and biological studies. Fluorescence and optical quadrature microscope phase images are used for validation.
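
    A rough, hedged sketch of a local-entropy texture map of the kind described (not the authors' pipeline), applied to a single 2-D image plane with scikit-image; the disk radius and threshold are arbitrary illustrative choices, and the random array merely stands in for one DIC image plane.

        import numpy as np
        from skimage.filters.rank import entropy
        from skimage.morphology import disk
        from skimage.util import img_as_ubyte

        plane = np.random.rand(256, 256)                 # stand-in for one DIC image plane
        texture = entropy(img_as_ubyte(plane), disk(5))  # local entropy within a 5-px disk
        mask = texture > texture.mean()                  # crude split into textured regions
        print("fraction of pixels flagged as textured:", mask.mean())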

  10. Study on identifying deciduous forest by the method of feature space transformation

    NASA Astrophysics Data System (ADS)

    Zhang, Xuexia; Wu, Pengfei

    2009-10-01

    Thematic information extraction from remotely sensed data has long been one of the puzzling problems facing remote sensing science, so many remote sensing scientists devote themselves to research in this domain. The methods of thematic information extraction include visual interpretation and computer interpretation, whose development is heading towards intelligent and comprehensively modular approaches. This paper develops an intelligent extraction method based on feature space transformation for deciduous forest thematic information in Changping district of Beijing city. Whole-scene China-Brazil Earth Resources Satellite (CBERS) images received in 2005 are used to extract the deciduous forest coverage area by the feature space transformation method and a linear spectral decomposition method, and the result from remote sensing is similar to the woodland resource census data of the Chinese forestry bureau in 2004.
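
    The linear spectral decomposition step mentioned above can be illustrated generically: each pixel spectrum is modelled as a non-negative mixture of endmember spectra and the abundances are solved by least squares. The sketch below uses random placeholder spectra and is not the study's actual processing chain.

        import numpy as np
        from scipy.optimize import nnls

        bands, n_endmembers = 4, 3                        # e.g. multispectral bands; forest, soil, water
        endmembers = np.random.rand(bands, n_endmembers)  # placeholder endmember spectra (columns)
        pixel = np.random.rand(bands)                     # observed pixel spectrum

        abundances, _ = nnls(endmembers, pixel)           # non-negative least-squares unmixing
        abundances /= abundances.sum() or 1.0             # normalise towards sum-to-one
        print("estimated deciduous-forest fraction:", abundances[0])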

  11. [Study on infrared spectrum change of Ganoderma lucidum and its extracts].

    PubMed

    Chen, Zao-Xin; Xu, Yong-Qun; Chen, Xiao-Kang; Huang, Dong-Lan; Lu, Wen-Guan

    2013-05-01

    From the determination of the infrared spectra of four substances (original Ganoderma lucidum and its water extract, 95% ethanol extract and petroleum ether extract), it was found that the infrared spectrum carries systematic chemical information and basically reflects the distribution of each component of the analyte. Ganoderma lucidum and its extracts can be distinguished according to the ratios of the absorption peak areas at 3 416-3 279, 1 541 and 723 cm(-1) to that at 2 935-2 852 cm(-1). A method of calculating the information entropy of a sample set using Euclidean distance was proposed, the relationship between the information entropy and the amount of chemical information carried by the sample set was discussed, and the authors conclude that the sample set of original Ganoderma lucidum carries the most abundant chemical information. In hierarchical cluster analysis of the four sample sets, the infrared spectrum set of original Ganoderma lucidum gives a better clustering of Ganoderma atrum, cyan ganoderma, Ganoderma multiplicatum and Ganoderma lucidum. The results show that the infrared spectrum carries chemical information about the material structure and is closely related to the chemical composition of the system. The higher the value of the information entropy, the richer the chemical information and the greater the benefit for pattern recognition. This study provides guidance for the construction of sample sets in pattern recognition.
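
    The hierarchical cluster analysis described can be sketched generically in Python; the random vectors below merely stand in for baseline-corrected IR absorbance spectra, and the four-cluster cut is an illustrative choice rather than the authors' setting.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        spectra = np.random.rand(12, 1800)               # 12 samples x 1800 wavenumber points
        distances = pdist(spectra, metric="euclidean")   # pairwise Euclidean distances
        tree = linkage(distances, method="average")      # agglomerative clustering
        labels = fcluster(tree, t=4, criterion="maxclust")
        print("cluster assignment per sample:", labels)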

  12. Synergies between optical and physical variables in intercepting parabolic targets

    PubMed Central

    Gómez, José; López-Moliner, Joan

    2013-01-01

    Interception requires precise estimation of time-to-contact (TTC) information. A long-standing view posits that all relevant information for extracting TTC is available in the angular variables, which result from the projection of distal objects onto the retina. The different timing models rooted in this tradition have consequently relied on combining visual angle and its rate of expansion in different ways with tau being the most well-known solution for TTC. The generalization of these models to timing parabolic trajectories is not straightforward. For example, these different combinations rely on isotropic expansion and usually assume first-order information only, neglecting acceleration. As a consequence no optical formulations have been put forward so far to specify TTC of parabolic targets with enough accuracy. It is only recently that context-dependent physical variables have been shown to play an important role in TTC estimation. Known physical size and gravity can adequately explain observed data of linear and free-falling trajectories, respectively. Yet, a full timing model for specifying parabolic TTC has remained elusive. We here derive two formulations that specify TTC for parabolic ball trajectories. The first specification extends previous models in which known size is combined with thresholding visual angle or its rate of expansion to the case of fly balls. To efficiently use this model, observers need to recover the 3D radial velocity component of the trajectory which conveys the isotropic expansion. The second one uses knowledge of size and gravity combined with ball visual angle and elevation angle. Taking into account the noise due to sensory measurements, we simulate the expected performance of these models in terms of accuracy and precision. While the model that combines expansion information and size knowledge is more efficient during the late trajectory, the second one is shown to be efficient along all the flight. PMID:23720614
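
    For reference, the classical tau specification mentioned above (the standard first-order form, stated here as background rather than quoted from the paper) estimates time-to-contact from the optical variables alone:

        \tau \;=\; \frac{\theta}{\dot{\theta}} \;\approx\; \mathrm{TTC},

    where θ is the target's visual angle and θ̇ its rate of expansion; the approximation holds for constant-velocity approach, which is exactly what becomes insufficient for parabolic trajectories and motivates the two models derived in the paper.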

  13. Research on Remote Sensing Geological Information Extraction Based on Object Oriented Classification

    NASA Astrophysics Data System (ADS)

    Gao, Hui

    2018-04-01

    The northern Tibet region belongs to the sub-cold arid climate zone of the plateau. It is rarely visited by people and the geological working conditions are very poor. However, the stratum exposures are good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using WorldView-2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various kinds of geological information are mined. By setting thresholds, based on hierarchical classification, eight kinds of geological information were classified and extracted. Compared with existing geological maps, the accuracy analysis shows that the overall accuracy reached 87.8561 %, indicating that the object-oriented classification method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  14. Design of extraction system in BRing at HIAF

    NASA Astrophysics Data System (ADS)

    Ruan, Shuang; Yang, Jiancheng; Zhang, Jinquan; Shen, Guodong; Ren, Hang; Liu, Jie; Shangguan, Jingbing; Zhang, Xiaoying; Zhang, Jingjing; Mao, Lijun; Sheng, Lina; Yin, Dayu; Wang, Geng; Wu, Bo; Yao, Liping; Tang, Meitang; Cai, Fucheng; Chen, Xiaoqiang

    2018-06-01

    The Booster Ring (BRing), which is the key part of the HIAF (High Intensity heavy ion Accelerator Facility) complex at IMP (Institute of Modern Physics, Chinese Academy of Sciences), can provide uranium (A/q = 7) beams over a wide extraction energy range of 200-800 MeV/u. To provide flexible beam extraction for multi-purpose experiments, both fast and slow extraction systems will be accommodated in the BRing. The fast extraction system is used for extracting a short bunched beam horizontally in a single turn. The slow extraction system is used to provide quasi-continuous beam by the third-order resonance and RF-knockout scheme. To achieve a compact structure, the two extraction systems are designed to share the same extraction channel. The general design of the fast and slow extraction systems and simulation results are discussed in this paper.

  15. HitPredict version 4: comprehensive reliability scoring of physical protein-protein interactions from more than 100 species.

    PubMed

    López, Yosvany; Nakai, Kenta; Patil, Ashwini

    2015-01-01

    HitPredict is a consolidated resource of experimentally identified, physical protein-protein interactions with confidence scores to indicate their reliability. The study of genes and their inter-relationships using methods such as network and pathway analysis requires high quality protein-protein interaction information. Extracting reliable interactions from most of the existing databases is challenging because they either contain only a subset of the available interactions, or a mixture of physical, genetic and predicted interactions. Automated integration of interactions is further complicated by varying levels of accuracy of database content and lack of adherence to standard formats. To address these issues, the latest version of HitPredict provides a manually curated dataset of 398 696 physical associations between 70 808 proteins from 105 species. Manual confirmation was used to resolve all issues encountered during data integration. For improved reliability assessment, this version combines a new score derived from the experimental information of the interactions with the original score based on the features of the interacting proteins. The combined interaction score performs better than either of the individual scores in HitPredict as well as the reliability score of another similar database. HitPredict provides a web interface to search proteins and visualize their interactions, and the data can be downloaded for offline analysis. Data usability has been enhanced by mapping protein identifiers across multiple reference databases. Thus, the latest version of HitPredict provides a significantly larger, more reliable and usable dataset of protein-protein interactions from several species for the study of gene groups. Database URL: http://hintdb.hgc.jp/htp. © The Author(s) 2015. Published by Oxford University Press.

  16. Highly efficient enantioselective liquid–liquid extraction of 1,2-amino-alcohols using SPINOL based phosphoric acid hosts

    PubMed Central

    Pinxterhuis, Erik B.; Gualtierotti, Jean-Baptiste; Heeres, Hero J.

    2017-01-01

    Access to enantiopure compounds on large scale in an environmentally friendly and cost-efficient manner remains one of the greatest challenges in chemistry. Resolution of racemates using enantioselective liquid–liquid extraction has great potential to meet that challenge. However, a relatively feeble understanding of the chemical principles and physical properties behind this technique has hampered the development of hosts possessing sufficient resolving power for their application to large scale processes. Herein we present, employing the previously untested SPINOL based phosphoric acid host family, an in-depth study of the parameters affecting the efficiency of the resolution of amino-alcohols, with the aim of further understanding the core principles behind ELLE. We have systematically investigated the dependence of the enantioselection on parameters such as the choice of solvent, the temperature and the pH, and bring to light many previously unsuspected and highly intriguing interactions. Furthermore, utilizing these new insights to our advantage, we developed novel, highly efficient extraction and resolution protocols which provide remarkable levels of enantioselectivity. The extraction was shown to be catalytic in host by demonstrating transport in a U-tube, and finally it was demonstrated how the solvent dependency could be exploited in an unprecedented triphasic resolution system. PMID:28989671

  17. DNA and bone structure preservation in medieval human skeletons.

    PubMed

    Coulson-Thomas, Yvette M; Norton, Andrew L; Coulson-Thomas, Vivien J; Florencio-Silva, Rinaldo; Ali, Nadir; Elmrghni, Samir; Gil, Cristiane D; Sasso, Gisela R S; Dixon, Ronald A; Nader, Helena B

    2015-06-01

    Morphological and ultrastructural data from archaeological human bones are scarce, particularly data that have been correlated with information on the preservation of molecules such as DNA. Here we examine the bone structure of macroscopically well-preserved medieval human skeletons by transmission electron microscopy and immunohistochemistry, and the quantity and quality of DNA extracted from these skeletons. DNA technology has been increasingly used for analyzing physical evidence in archaeological forensics; however, the isolation of ancient DNA is difficult since it is highly degraded, extraction yields are low and the co-extraction of PCR inhibitors is a problem. We adapted and optimised a method that is frequently used for isolating DNA from modern samples, Chelex(®) 100 (Bio-Rad) extraction, for isolating DNA from archaeological human bones and teeth. The isolated DNA was analysed by real-time PCR using primers targeting the sex determining region on the Y chromosome (SRY) and STR typing using the AmpFlSTR(®) Identifiler PCR Amplification kit. Our results clearly show the preservation of bone matrix in medieval bones and the presence of intact osteocytes with well preserved encapsulated nuclei. In addition, we show how effective Chelex(®) 100 is for isolating ancient DNA from archaeological bones and teeth. This optimised method is suitable for STR typing using kits aimed specifically at degraded and difficult DNA templates since amplicons of up to 250bp were successfully amplified. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. Development of Phenol-Enriched Olive Oil with Phenolic Compounds Extracted from Wastewater Produced by Physical Refining.

    PubMed

    Venturi, Francesca; Sanmartin, Chiara; Taglieri, Isabella; Nari, Anita; Andrich, Gianpaolo; Terzuoli, Erika; Donnini, Sandra; Nicolella, Cristiano; Zinnai, Angela

    2017-08-22

    While in the last few years the use of olive cake and mill wastewater as natural sources of phenolic compounds has been widely considered and several studies have focused on the development of new extraction methods and on the production of functional foods enriched with natural antioxidants, no data has been available on the production of a phenol-enriched refined olive oil with its own phenolic compounds extracted from wastewater produced during physical refining. In this study, we aimed to: (i) verify the effectiveness of a multi-step extraction process to recover the high-added-value phenolic compounds contained in wastewater derived from the preliminary washing degumming step of the physical refining of vegetal oils; (ii) evaluate their potential application for the stabilization of olive oil obtained with refined olive oils; and (iii) evaluate their antioxidant activity in an in vitro model of endothelial cells. The results obtained demonstrate the potential of using the refining wastewater as a source of bioactive compounds to improve the nutraceutical value as well as the antioxidant capacity of commercial olive oils. In the conditions adopted, the phenolic content significantly increased in the prototypes of phenol-enriched olive oils when compared with the control oil.

  19. Development of Phenol-Enriched Olive Oil with Phenolic Compounds Extracted from Wastewater Produced by Physical Refining

    PubMed Central

    Taglieri, Isabella; Nari, Anita; Andrich, Gianpaolo; Terzuoli, Erika; Donnini, Sandra; Nicolella, Cristiano; Zinnai, Angela

    2017-01-01

    While in the last few years the use of olive cake and mill wastewater as natural sources of phenolic compounds has been widely considered and several studies have focused on the development of new extraction methods and on the production of functional foods enriched with natural antioxidants, no data has been available on the production of a phenol-enriched refined olive oil with its own phenolic compounds extracted from wastewater produced during physical refining. In this study, we aimed to: (i) verify the effectiveness of a multi-step extraction process to recover the high-added-value phenolic compounds contained in wastewater derived from the preliminary washing degumming step of the physical refining of vegetal oils; (ii) evaluate their potential application for the stabilization of olive oil obtained with refined olive oils; and (iii) evaluate their antioxidant activity in an in vitro model of endothelial cells. The results obtained demonstrate the potential of using the refining wastewater as a source of bioactive compounds to improve the nutraceutical value as well as the antioxidant capacity of commercial olive oils. In the conditions adopted, the phenolic content significantly increased in the prototypes of phenol-enriched olive oils when compared with the control oil. PMID:28829365

  20. Geoparsing text for characterizing urban operational environments through machine learning techniques

    NASA Astrophysics Data System (ADS)

    Garfinkle, Noah W.; Selig, Lucas; Perkins, Timothy K.; Calfas, George W.

    2017-05-01

    Increasing worldwide internet connectivity and access to sources of print and open social media have increased the near-real-time availability of textual information. Capabilities to structure and integrate textual data streams can contribute to more meaningful representations of operational environment factors (i.e., Political, Military, Economic, Social, Infrastructure, Information, Physical Environment, and Time [PMESII-PT]) and tactical civil considerations (i.e., Areas, Structures, Capabilities, Organizations, People and Events [ASCOPE]). However, relying upon human analysts to encode this information as it arrives quickly proves intractable. While human analysts possess an ability to comprehend context in unstructured text far beyond that of computers, automated geoparsing (the extraction of locations from unstructured text) can empower analysts to automate sifting through datasets for areas of interest. This research evaluates existing approaches to geoparsing and initiates the research and development of locally-improved methods of tagging parts of text as possible locations, resolving possible locations into coordinates, and interfacing such results with human analysts. The objective of this ongoing research is to develop a more contextually-complete picture of an area of interest (AOI) including human-geographic context for events. In particular, our research is working to make improvements to geoparsing (i.e., the extraction of spatial context from documents), which requires development, integration, and validation of named-entity recognition (NER) tools, gazetteers, and entity-attribution. This paper provides an overview of NER models and methodologies as applied to geoparsing, explores several challenges encountered, presents preliminary results from the creation of a flexible geoparsing research pipeline, and introduces ongoing and future work with the intention of contributing to the efficient geocoding of information containing valuable insights into human activities in space.
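
    As a hedged illustration of the pipeline described above (named-entity recognition followed by gazetteer resolution), the sketch below uses spaCy's pretrained NER to tag candidate place names and resolves them against a toy gazetteer. The model name, entity labels retained, and the gazetteer entries are illustrative assumptions, not the tools or data used in the study.

        # Minimal geoparsing sketch: NER tagging + gazetteer resolution.
        # Assumes spaCy and its small English model are installed:
        #   pip install spacy && python -m spacy download en_core_web_sm
        import spacy

        # Toy gazetteer (illustrative only): place name -> (lat, lon)
        GAZETTEER = {
            "Baghdad": (33.3152, 44.3661),
            "Mosul": (36.3566, 43.1640),
        }

        def geoparse(text: str):
            """Tag possible locations in free text and resolve them to coordinates."""
            nlp = spacy.load("en_core_web_sm")
            doc = nlp(text)
            results = []
            for ent in doc.ents:
                if ent.label_ in ("GPE", "LOC", "FAC"):   # location-like entity types
                    coords = GAZETTEER.get(ent.text)       # None if not in gazetteer
                    results.append((ent.text, coords))
            return results

        if __name__ == "__main__":
            report = "Crowds gathered near the market in Mosul before moving toward Baghdad."
            for name, coords in geoparse(report):
                print(f"{name}: {coords}")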

  1. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    ERIC Educational Resources Information Center

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  2. Ontology-Based Information Extraction for Business Intelligence

    NASA Astrophysics Data System (ADS)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  3. CMS-2 Reverse Engineering and ENCORE/MODEL Integration

    DTIC Science & Technology

    1992-05-01

    Automated extraction of design information from an existing software system written in CMS-2 can be used to document that system as-built. The extracted information is provided by a commercially available CASE tool. Information describing software system design is automatically extracted...the displays in Figures 1, 2, and 3.

  4. DTIC (Defense Technical Information Center) Model Action Plan for Incorporating DGIS (DOD Gateway Information System) Capabilities.

    DTIC Science & Technology

    1986-05-01

    Information System (DGIS) is being developed to provide the DoD community with a modern tool to access diverse databases and extract information products...this community with a modern tool for accessing these databases and extracting information products from them. Since the Defense Technical Information...adjunct to DROLS results. The study, therefore, centered around obtaining background information inside the unit on that unit's users who request DROLS

  5. Patient experiences of colonoscopy, barium enema and CT colonography: a qualitative study.

    PubMed

    Von Wagner, C; Knight, K; Halligan, S; Atkin, W; Lilford, R; Morton, D; Wardle, J

    2009-01-01

    Previous studies of patient experience with bowel screening tests, in particular CT colonography (CTC), have superimposed global rating scales and not explored individual experience in detail. To redress this, we performed qualitative interviews in order to characterize patient expectations and experiences in depth. Following ethical permission, 16 patients undergoing CTC, 18 undergoing colonoscopy and 15 undergoing barium enema agreed to a semi-structured interview by a health psychologist. Interviews were recorded, responses transcribed and themes extracted with the aim of assimilating individual experiences to facilitate subsequent development and interpretation of quantitative surveys of overall satisfaction with each diagnostic test. Transcript analysis identified three principal themes: physical sensations, social interactions and information provision. Physical sensations differed for each test but were surprisingly well tolerated overall. Social interactions with staff were perceived as very important in colouring the whole experience, particularly in controlling the feelings of embarrassment, which was critical for all procedures. Information provision was also an important determinant of experience. Verbal feedback was most common during colonoscopy and invariably reassuring. However, patients undergoing CTC received little visual or verbal feedback and were often confused regarding the test outcome. Barium enema had no specific advantage over other tests. Qualitative interviews provided important perspectives on patient experience. Our data demonstrated that models describing the quality of medical encounters are applicable to single diagnostic episodes. Staff interactions and information provision were particularly important. We found advantages specific to both CTC and colonoscopy but none for barium enema. CTC could benefit greatly from improved information provision following examination.

  6. Associations between sedentary behaviour and physical activity in children and adolescents: a meta-analysis

    PubMed Central

    Pearson, N; Braithwaite, R E; Biddle, S J H; van Sluijs, E M F; Atkin, A J

    2014-01-01

    Physical activity and sedentary behaviour are associated with metabolic and mental health during childhood and adolescence. Understanding the inter-relationships between these behaviours will help to inform intervention design. This systematic review and meta-analysis synthesized evidence from observational studies describing the association between sedentary behaviour and physical activity in young people (<18 years). English-language publications up to August 2013 were located through electronic and manual searches. Included studies presented statistical associations between at least one measure of sedentary behaviour and one measure of physical activity. One hundred sixty-three papers were included in the meta-analysis, from which data on 254 independent samples were extracted. In the summary meta-analytic model (k = 230), a small, but significant, negative association between sedentary behaviour and physical activity was observed (r = −0.108, 95% confidence interval [CI] = −0.128, −0.087). In moderator analyses, studies that recruited smaller samples (n < 100; r = −0.193, 95% CI = −0.276, −0.109), employed objective methods of measurement (objectively measured physical activity; r = −0.233, 95% CI = −0.330, −0.137), or were assessed to be of higher methodological quality (r = −0.176, 95% CI = −0.215, −0.138) reported stronger associations, although effect sizes remained small. The association between sedentary behaviour and physical activity in young people is negative, but small, suggesting that these behaviours do not directly displace one another. PMID:24844784
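
    For readers unfamiliar with how a pooled correlation of this kind is obtained, the following sketch shows a basic fixed-effect Fisher-z pooling of per-study correlations. It is a generic textbook illustration with made-up numbers, not the (more elaborate) meta-analytic model or data used in the review.

        # Fixed-effect pooling of correlation coefficients via Fisher's z transform.
        import numpy as np

        def pooled_correlation(r: np.ndarray, n: np.ndarray):
            """Return the pooled r and its 95% CI from per-study correlations and sample sizes."""
            z = np.arctanh(r)               # Fisher z transform of each correlation
            w = n - 3.0                     # inverse-variance weights, since var(z) = 1/(n-3)
            z_bar = np.sum(w * z) / np.sum(w)
            se = 1.0 / np.sqrt(np.sum(w))
            lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
            return np.tanh(z_bar), (np.tanh(lo), np.tanh(hi))

        # Illustrative values only (not the review's data)
        r = np.array([-0.15, -0.08, -0.20, -0.05])
        n = np.array([120, 450, 80, 300])
        print(pooled_correlation(r, n))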

  7. [Physical fitness and motor ability in obese boys 12 through 14 years of age].

    PubMed

    Kim, H K; Matsuura, Y; Tanaka, K; Inagaki, A

    1993-01-01

    Excess body fat has generally been considered to be an influential factor in physical fitness and motor ability in obese boys. However, little information is available on physical fitness and motor ability in obese boys. The purpose of this study was to clarify characteristics of physical fitness and motor ability in obese boys. The subjects were three hundred and five boys aged 12-14 years. Nineteen physical fitness and motor ability items were tested and skinfold thickness was measured at six sites. Bioelectrical impedance was measured using a tetrapolar impedance plethysmograph (Selco SIF-891). Body density was calculated from the formula of Kim et al. The results of the comparison clearly indicated that the obese group was significantly poorer in the 1,500-m run, 5-min run, 50-m run, running long jump and many other variables, but was superior only in back strength. To analyze the factorial structure in boys, principal factor analysis was applied to the correlation matrix calculated from the 19 variables, and five factors were extracted. The obese group was significantly poorer in total body endurance and muscular endurance than the non-obese group. From these results, it was confirmed that excess body fat could be one of the most important factors affecting many physical fitness and motor ability elements in obese boys. However, the relationships between physical fitness, motor ability and the degree of fatness seem to be rather complicated. A great deal of data should be accumulated for more detailed analysis of the influence of excess body fat in obese boys.

  8. Associations between sedentary behaviour and physical activity in children and adolescents: a meta-analysis.

    PubMed

    Pearson, N; Braithwaite, R E; Biddle, S J H; van Sluijs, E M F; Atkin, A J

    2014-08-01

    Physical activity and sedentary behaviour are associated with metabolic and mental health during childhood and adolescence. Understanding the inter-relationships between these behaviours will help to inform intervention design. This systematic review and meta-analysis synthesized evidence from observational studies describing the association between sedentary behaviour and physical activity in young people (<18 years). English-language publications up to August 2013 were located through electronic and manual searches. Included studies presented statistical associations between at least one measure of sedentary behaviour and one measure of physical activity. One hundred sixty-three papers were included in the meta-analysis, from which data on 254 independent samples were extracted. In the summary meta-analytic model (k = 230), a small, but significant, negative association between sedentary behaviour and physical activity was observed (r = -0.108, 95% confidence interval [CI] = -0.128, -0.087). In moderator analyses, studies that recruited smaller samples (n < 100; r = -0.193, 95% CI = -0.276, -0.109), employed objective methods of measurement (objectively measured physical activity; r = -0.233, 95% CI = -0.330, -0.137), or were assessed to be of higher methodological quality (r = -0.176, 95% CI = -0.215, -0.138) reported stronger associations, although effect sizes remained small. The association between sedentary behaviour and physical activity in young people is negative, but small, suggesting that these behaviours do not directly displace one another. © 2014 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of International Association for the Study of Obesity.

  9. Tools and Data Services from the NASA Earth Satellite Observations for Remote Sensing Commercial Applications

    NASA Technical Reports Server (NTRS)

    Vicente, Gilberto

    2005-01-01

    Several commercial applications of remote sensing data, such as water resources management, environmental monitoring, climate prediction, agriculture, forestry, and preparation for and mitigation of extreme weather events, require access to vast amounts of archived high quality data, software tools and services for data manipulation and information extraction. These, in turn, require gaining a detailed understanding of the data's internal structure and the physical implementation of data reduction, combination and data product production. This time-consuming task must be undertaken before the core investigation can begin and is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets of different formats, structures, and resolutions.

  10. Automated synthetic scene generation

    NASA Astrophysics Data System (ADS)

    Givens, Ryan N.

    Physics-based simulations generate synthetic imagery to help organizations anticipate system performance of proposed remote sensing systems. However, manually constructing synthetic scenes which are sophisticated enough to capture the complexity of real-world sites can take days to months depending on the size of the site and desired fidelity of the scene. This research, sponsored by the Air Force Research Laboratory's Sensors Directorate, successfully developed an automated approach to fuse high-resolution RGB imagery, lidar data, and hyperspectral imagery and then extract the necessary scene components. The method greatly reduces the time and money required to generate realistic synthetic scenes and developed new approaches to improve material identification using information from all three of the input datasets.

  11. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components (a stronger constraint that uses higher-order statistics), instead of the classical decorrelation (a weaker constraint that uses only second-order statistics). Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
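
    A minimal sketch of the point made above, using scikit-learn's FastICA and PCA on a synthetic two-source linear mixture; the sources, mixing matrix and noise level are illustrative assumptions, not the authors' simulation experiment.

        # ICA vs PCA on a synthetic linear mixture of two independent sources.
        import numpy as np
        from sklearn.decomposition import PCA, FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 2000)
        s1 = np.sin(2 * np.pi * 1.0 * t)                  # periodic "physical mode"
        s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))         # square-wave "physical mode"
        S = np.c_[s1, s2]
        A = np.array([[1.0, 0.6], [0.4, 1.0]])            # mixing matrix
        X = S @ A.T + 0.05 * rng.standard_normal((t.size, 2))

        pca_comps = PCA(n_components=2).fit_transform(X)  # decorrelated, but still mixed
        ica_comps = FastICA(n_components=2, random_state=0).fit_transform(X)  # ~unmixed

        # Correlation of each recovered component with the true sources
        for name, comps in [("PCA", pca_comps), ("ICA", ica_comps)]:
            c = np.corrcoef(np.c_[S, comps].T)[:2, 2:]
            print(name, np.round(np.abs(c), 2))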

  12. Performance evaluation of BPM system in SSRF using PCA method

    NASA Astrophysics Data System (ADS)

    Chen, Zhi-Chu; Leng, Yong-Bin; Yan, Ying-Bing; Yuan, Ren-Xian; Lai, Long-Wei

    2014-07-01

    The beam position monitor (BPM) system is of great importance in a light source, and its capability depends on the resolution of the system. The traditional method of taking the standard deviation of the raw data merely gives an upper limit on the resolution. Principal component analysis (PCA) has been introduced into accelerator physics and can be used to remove the actual beam signals: beam-related information is extracted before the evaluation of the BPM performance. A series of studies was carried out at the Shanghai Synchrotron Radiation Facility (SSRF), and PCA proved to be an effective and robust method for the performance evaluation of our BPM system.
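
    A hedged sketch of the general idea (remove the beam-correlated principal modes from turn-by-turn BPM readings and take the residual spread as a resolution estimate). The matrix shape, the number of retained modes, and the signal/noise amplitudes below are illustrative assumptions, not SSRF parameters.

        # Model-independent BPM resolution estimate: strip the leading (beam-correlated)
        # singular modes from a turns-by-BPMs matrix and use the residual RMS per BPM.
        import numpy as np

        rng = np.random.default_rng(1)
        n_turns, n_bpms = 4000, 40
        turn = np.arange(n_turns)

        # Synthetic common beam motion seen by all BPMs, plus independent electronic
        # noise per BPM (the quantity we actually want to recover).
        beam = 50e-6 * np.sin(2 * np.pi * 0.22 * turn)           # 50 um oscillation
        coupling = rng.uniform(0.5, 1.5, n_bpms)                  # per-BPM response
        noise_true = 1e-6                                         # 1 um electronic noise
        X = np.outer(beam, coupling) + noise_true * rng.standard_normal((n_turns, n_bpms))

        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        n_modes = 2                                               # beam-related modes to remove
        residual = Xc - (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes, :]
        print("estimated resolution per BPM (m):", residual.std(axis=0).mean())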

  13. Convergent close coupling versus the generalized Sturmian function approach: Wave-function analysis

    NASA Astrophysics Data System (ADS)

    Ambrosio, M.; Mitnik, D. M.; Gasaneo, G.; Randazzo, J. M.; Kadyrov, A. S.; Fursa, D. V.; Bray, I.

    2015-11-01

    We compare the physical information contained in the Temkin-Poet (TP) scattering wave function representing electron-impact ionization of hydrogen, calculated by the convergent close-coupling (CCC) and generalized Sturmian function (GSF) methodologies. The idea is to show that the ionization cross section can be extracted from the wave functions themselves. Using two different procedures based on hyperspherical Sturmian functions we show that the transition amplitudes contained in both GSF and CCC scattering functions lead to similar single-differential cross sections. The single-continuum channels were also a subject of the present studies, and we show that the elastic and excitation amplitudes are essentially the same as well.

  14. Integrating semantic information into multiple kernels for protein-protein interaction extraction from biomedical literatures.

    PubMed

    Li, Lishuang; Zhang, Panpan; Zheng, Tianfu; Zhang, Hongying; Jiang, Zhenchao; Huang, Degen

    2014-01-01

    Protein-Protein Interaction (PPI) extraction is an important task in the biomedical information extraction. Presently, many machine learning methods for PPI extraction have achieved promising results. However, the performance is still not satisfactory. One reason is that the semantic resources were basically ignored. In this paper, we propose a multiple-kernel learning-based approach to extract PPIs, combining the feature-based kernel, tree kernel and semantic kernel. Particularly, we extend the shortest path-enclosed tree kernel (SPT) by a dynamic extended strategy to retrieve the richer syntactic information. Our semantic kernel calculates the protein-protein pair similarity and the context similarity based on two semantic resources: WordNet and Medical Subject Heading (MeSH). We evaluate our method with Support Vector Machine (SVM) and achieve an F-score of 69.40% and an AUC of 92.00%, which show that our method outperforms most of the state-of-the-art systems by integrating semantic information.
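
    To make the kernel-combination step concrete, here is a hedged sketch using precomputed Gram matrices with scikit-learn's SVC. The three feature views and their weights stand in for the feature-based, tree and semantic kernels of the paper and are purely illustrative.

        # Combining multiple precomputed kernels as a weighted sum for SVM classification.
        import numpy as np
        from sklearn.svm import SVC

        def linear_kernel(A, B):
            return A @ B.T

        # Illustrative feature views (stand-ins for feature-, tree- and semantic-based kernels)
        rng = np.random.default_rng(0)
        n = 200
        y = rng.integers(0, 2, n)
        views = [rng.standard_normal((n, d)) + y[:, None] * shift
                 for d, shift in [(30, 0.8), (10, 0.3), (5, 0.5)]]

        weights = [0.5, 0.3, 0.2]                      # kernel weights (sum to 1)
        K = sum(w * linear_kernel(V, V) for w, V in zip(weights, views))

        train, test = np.arange(150), np.arange(150, n)
        clf = SVC(kernel="precomputed").fit(K[np.ix_(train, train)], y[train])
        pred = clf.predict(K[np.ix_(test, train)])     # rows: test samples, cols: train samples
        print("accuracy:", (pred == y[test]).mean())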

  15. Extracting Social Information from Chemosensory Cues: Consideration of Several Scenarios and Their Functional Implications

    PubMed Central

    Ben-Shaul, Yoram

    2015-01-01

    Across all sensory modalities, stimuli can vary along multiple dimensions. Efficient extraction of information requires sensitivity to those stimulus dimensions that provide behaviorally relevant information. To derive social information from chemosensory cues, sensory systems must embed information about the relationships between behaviorally relevant traits of individuals and the distributions of the chemical cues that are informative about these traits. In simple cases, the mere presence of one particular compound is sufficient to guide appropriate behavior. However, more generally, chemosensory information is conveyed via relative levels of multiple chemical cues, in non-trivial ways. The computations and networks needed to derive information from multi-molecule stimuli are distinct from those required by single molecule cues. Our current knowledge about how socially relevant information is encoded by chemical blends, and how it is extracted by chemosensory systems is very limited. This manuscript explores several scenarios and the neuronal computations required to identify them. PMID:26635515

  16. Peripheral vasomotor activity assessment using a continuous wavelet analysis on webcam photoplethysmographic signals.

    PubMed

    Bousefsaf, F; Maaoui, C; Pruski, A

    2016-11-25

    Vasoconstriction and vasodilation phenomena reflect the relative changes in the vascular bed, and they induce particular modifications in the pulse wave magnitude. Webcams are remote sensors that can be employed to measure the pulse wave in order to compute the pulse frequency. The aim of this work was to record and analyze the pulse wave signal with a low-cost webcam in order to extract the amplitude information and assess the vasomotor activity of the participant. Photoplethysmographic signals obtained from a webcam are analyzed through a continuous wavelet transform. The performance of the proposed filtering technique was evaluated against approved contact probes on a set of 12 healthy subjects after they performed a short but intense physical exercise. During the rest period, a cutaneous vasodilation is observable. High degrees of correlation between the webcam and a reference sensor were obtained. Webcams are low-cost and non-contact devices that can be used to reliably estimate both heart rate and peripheral vasomotor activity, notably during physical exertion.
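
    A minimal sketch of the wavelet step described above, applied to a synthetic pulse-like signal with PyWavelets; the wavelet choice, scales, frame rate and signal model are illustrative assumptions rather than the authors' processing chain.

        # Continuous wavelet transform of a synthetic PPG-like signal: track the
        # amplitude of the cardiac band over time as a proxy for vasomotor changes.
        import numpy as np
        import pywt

        fs = 30.0                                       # webcam frame rate (Hz), illustrative
        t = np.arange(0, 60, 1 / fs)
        pulse_hz = 1.2                                  # ~72 bpm
        amplitude = 1.0 + 0.5 * np.tanh((t - 30) / 5)   # slow amplitude rise (vasodilation-like)
        ppg = amplitude * np.sin(2 * np.pi * pulse_hz * t) + 0.3 * np.random.randn(t.size)

        scales = np.arange(5, 60)
        coeffs, freqs = pywt.cwt(ppg, scales, "morl", sampling_period=1 / fs)

        # Average |CWT| over scales whose pseudo-frequency falls in the cardiac band (0.8-2 Hz)
        band = (freqs > 0.8) & (freqs < 2.0)
        pulse_amplitude = np.abs(coeffs[band]).mean(axis=0)
        print("early vs late mean pulse amplitude:",
              pulse_amplitude[: t.size // 2].mean(), pulse_amplitude[t.size // 2 :].mean())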

  17. New developments in FEYNRULES

    NASA Astrophysics Data System (ADS)

    Alloul, Adam; Christensen, Neil D.; Degrande, Céline; Duhr, Claude; Fuks, Benjamin

    2014-06-01

    The program FEYNRULES is a MATHEMATICA package developed to facilitate the implementation of new physics theories into high-energy physics tools. Starting from a minimal set of information such as the model gauge symmetries, its particle content, parameters and Lagrangian, FEYNRULES provides all necessary routines to extract automatically from the Lagrangian (that can also be computed semi-automatically for supersymmetric theories) the associated Feynman rules. These can be further exported to several Monte Carlo event generators through dedicated interfaces, as well as translated into a PYTHON library, under the so-called UFO model format, agnostic of the model complexity, especially in terms of Lorentz and/or color structures appearing in the vertices or of number of external legs. In this work, we briefly report on the most recent new features that have been added to FEYNRULES, including full support for spin-3/2 fermions, a new module allowing for the automated diagonalization of the particle spectrum and a new set of routines dedicated to decay width calculations.

  18. The dynamic and steady state behavior of a PEM fuel cell as an electric energy source

    NASA Astrophysics Data System (ADS)

    Costa, R. A.; Camacho, J. R.

    The main objective of this work is to extract information on the internal behavior of three small polymer electrolyte membrane fuel cells under static and dynamic load conditions. A computational model was developed using Scilab [SCILAB 4, Scilab-a free scientific software package, http://www.scilab.org/, INRIA, France, December, 2005] to simulate the static and dynamic performance [J.M. Correa, A.F. Farret, L.N. Canha, An analysis of the dynamic performance of proton exchange membrane fuel cells using an electrochemical model, in: 27th Annual Conference of IEEE Industrial Electronics Society, 2001, pp. 141-146] of this particular type of fuel cell. This dynamic model is based on electrochemical equations and takes into consideration most of the chemical and physical characteristics of the device in order to generate electric power. The model takes into consideration the operating, design parameters and physical material properties. The results show the internal losses and concentration effects behavior, which are of interest for power engineers and researchers.
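
    For orientation, a hedged sketch of a generic steady-state PEM polarization curve (Nernst potential minus activation, ohmic and concentration losses). The coefficients below are textbook-style illustrative values, not the parameters of the cells modelled in the paper, and the paper's Scilab model is considerably more detailed.

        # Generic steady-state PEM fuel cell polarization curve:
        # V(i) = E_nernst - activation loss - ohmic loss - concentration loss.
        import numpy as np

        def cell_voltage(i, E0=1.2, i0=1e-3, iL=1.4, A=0.06, R=0.15, B=0.05):
            """i: current density (A/cm^2). All constants are illustrative."""
            i = np.clip(i, 1e-6, iL - 1e-6)          # keep the logarithms well defined
            v_act = A * np.log(i / i0)                # Tafel-type activation loss
            v_ohm = R * i                             # ohmic loss (membrane + contacts)
            v_conc = -B * np.log(1.0 - i / iL)        # concentration (mass transport) loss
            return E0 - v_act - v_ohm - v_conc

        i = np.linspace(0.01, 1.3, 10)
        for ii, vv in zip(i, cell_voltage(i)):
            print(f"i = {ii:5.2f} A/cm^2  ->  V = {vv:5.3f} V")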

  19. Secure communications using nonlinear silicon photonic keys.

    PubMed

    Grubel, Brian C; Bosworth, Bryan T; Kossey, Michael R; Cooper, A Brinton; Foster, Mark A; Foster, Amy C

    2018-02-19

    We present a secure communication system constructed using pairs of nonlinear photonic physical unclonable functions (PUFs) that harness physical chaos in integrated silicon micro-cavities. Compared to a large, electronically stored one-time pad, our method provisions large amounts of information within the intrinsically complex nanostructure of the micro-cavities. By probing a micro-cavity with a rapid sequence of spectrally-encoded ultrafast optical pulses and measuring the lightwave responses, we experimentally demonstrate the ability to extract 2.4 Gb of key material from a single micro-cavity device. Subsequently, in a secure communication experiment with pairs of devices, we achieve bit error rates below 10^-5 at code rates of up to 0.1. The PUFs' responses are never transmitted over the channel or stored in digital memory, thus enhancing the security of the system. Additionally, the micro-cavity PUFs are extremely small, inexpensive, robust, and fully compatible with telecommunications infrastructure, components, and electronic fabrication. This approach can serve one-time pad or public key exchange applications where high security is required.

  20. Physical and chemical stratigraphy suggest small or absent glacioeustatic variation during formation of the Paradox Basin cyclothems

    NASA Astrophysics Data System (ADS)

    Dyer, Blake; Maloof, Adam C.

    2015-06-01

    The Paradox Basin cyclothems previously have been interpreted as Milankovitch style glacial-interglacial cycles from the Late Paleozoic Ice Age, but an unambiguous test for a glacioeustatic origin has not been conducted. A high resolution coupled chemical and physical stratigraphic analysis of two outcrop sections and three core segments provides new evidence that supports either minor sea level change of several meters or an autocyclic mechanism for parasequence formation. High amplitude sea level change is ruled out by the scale of thin top-negative isotopic meteoric diagenesis trends associated with parasequence tops and subaerial exposure fabrics. Isotopic gradients from shelf (light) to basin (heavy) indicate that parasequences are deposited diachronously, with isotopes of more distal sections recording increased basin restriction. These results support the idea that the late Pennsylvanian was a prolonged period of relatively static eustasy, agreeing with recent studies in the western USA. The methods provide a new set of tools and context for extracting environmental information from cyclic upward shallowing carbonate parasequences.

  1. Diagnosis and treatment of bone metastasis: comprehensive guideline of the Japanese Society of Medical Oncology, Japanese Orthopedic Association, Japanese Urological Association, and Japanese Society for Radiation Oncology.

    PubMed

    Shibata, H; Kato, S; Sekine, I; Abe, K; Araki, N; Iguchi, H; Izumi, T; Inaba, Y; Osaka, I; Kato, S; Kawai, A; Kinuya, S; Kodaira, M; Kobayashi, E; Kobayashi, T; Sato, J; Shinohara, N; Takahashi, S; Takamatsu, Y; Takayama, K; Takayama, K; Tateishi, U; Nagakura, H; Hosaka, M; Morioka, H; Moriya, T; Yuasa, T; Yurikusa, T; Yomiya, K; Yoshida, M

    2016-01-01

    Diagnosis and treatment of bone metastasis requires various types of measures, specialists and caregivers. To provide better diagnosis and treatment, a multidisciplinary team approach is required. The members of this multidisciplinary team include doctors of primary cancers, radiologists, pathologists, orthopaedists, radiotherapists, clinical oncologists, palliative caregivers, rehabilitation doctors, dentists, nurses, pharmacists, physical therapists, occupational therapists, medical social workers, etc. Medical evidence was extracted from published articles describing meta-analyses or randomised controlled trials concerning patients with bone metastases mainly from 2003 to 2013, and a guideline was developed according to the Medical Information Network Distribution Service Handbook for Clinical Practice Guideline Development 2014. Multidisciplinary team meetings are helpful in diagnosis and treatment. Clinical benefits such as physical or psychological palliation obtained using the multidisciplinary team approaches are apparent. We established a guideline describing each specialty field, to improve understanding of the different fields among the specialists, who can further provide appropriate treatment, and to improve patients' outcomes.

  2. High-throughput determination of structural phase diagram and constituent phases using GRENDEL

    NASA Astrophysics Data System (ADS)

    Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.

    2015-11-01

    Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the Graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. Also the Sunburst radial tree map is demonstrated as a tool to visualize material structure-property relationships found through graph based analysis.

  3. Pore network extraction from pore space images of various porous media systems

    NASA Astrophysics Data System (ADS)

    Yi, Zhixing; Lin, Mian; Jiang, Wenbin; Zhang, Zhaobin; Li, Haishan; Gao, Jian

    2017-04-01

    Pore network extraction, which is defined as the transformation from irregular pore space to a simplified network in the form of pores connected by throats, is significant to microstructure analysis and network modeling. A physically realistic pore network is not only a representation of the pore space in the sense of topology and morphology, but also a good tool for predicting transport properties accurately. We present a method to extract pore network by employing the centrally located medial axis to guide the construction of maximal-balls-like skeleton where the pores and throats are defined and parameterized. To validate our method, various rock samples including sand pack, sandstones, and carbonates were used to extract pore networks. The pore structures were compared quantitatively with the structures extracted by medial axis method or maximal ball method. The predicted absolute permeability and formation factor were verified against the theoretical solutions obtained by lattice Boltzmann method and finite volume method, respectively. The two-phase flow was simulated through the networks extracted from homogeneous sandstones, and the generated relative permeability curves were compared with the data obtained from experimental method and other numerical models. The results show that the accuracy of our network is higher than that of other networks for predicting transport properties, so the presented method is more reliable for extracting physically realistic pore network.

  4. The Changes of Gene Expression on Human Hair during Long-Spaceflight

    NASA Astrophysics Data System (ADS)

    Terada, Masahiro; Mukai, Chiaki; Ishioka, Noriaki; Majima, Hideyuki J.; Yamada, Shin; Seki, Masaya; Takahashi, Rika; Higashibata, Akira; Ohshima, Hiroshi; Sudoh, Masamichi; Minamisawa, Susumu

    Hair has many advantages as an experimental sample. In a hair follicle, hair matrix cells actively divide, and these changes sensitively reflect the physical condition of the human body. The hair shaft records the metabolic conditions of mineral elements in our body. From human hair, we can obtain physiological information about human health. Therefore, we focused on using hair root analysis to understand the effects of spaceflight on astronauts. In 2009, we started a research program focusing on the analysis of astronauts' hairs to examine the effects of long-term spaceflight on gene expression in the human body. We aim to obtain basic information for developing effective diagnostic methods to assess the health of astronauts during spaceflight by analyzing human hair. We extracted RNA from the collected samples and amplified it. The amplified RNA was processed and hybridized to the Whole Human Genome (4×44K) Oligo Microarray (Agilent Technologies) according to the manufacturer's protocol. Slide scanning was performed using the Agilent DNA Microarray Scanner. Scanning data were normalized with Agilent's Feature Extraction software. Data preprocessing and analysis were performed using GeneSpring software 11.0.1. Next, synthesis of cDNA (1 mg) was carried out using the PrimeScript RT reagent Kit (TaKaRa Bio) following the manufacturer's instructions. The qRT-PCR experiment was performed with SYBR Premix Ex Taq (TaKaRa Bio) using the 7500 Real-Time PCR system (Applied Biosystems). We detected changes in the expression of some genes during spaceflight from both the microarray and qRT-PCR data. These genes seem to be related to hair proliferation. We believe that these results will lead to the discovery of important factors affecting hair during spaceflight.

  5. Extracting information from the text of electronic medical records to improve case detection: a systematic review

    PubMed Central

    Carroll, John A; Smith, Helen E; Scott, Donia; Cassell, Jackie A

    2016-01-01

    Background Electronic medical records (EMRs) are revolutionizing health-related research. One key issue for study quality is the accurate identification of patients with the condition of interest. Information in EMRs can be entered as structured codes or unstructured free text. The majority of research studies have used only coded parts of EMRs for case-detection, which may bias findings, miss cases, and reduce study quality. This review examines whether incorporating information from text into case-detection algorithms can improve research quality. Methods A systematic search returned 9659 papers, 67 of which reported on the extraction of information from free text of EMRs with the stated purpose of detecting cases of a named clinical condition. Methods for extracting information from text and the technical accuracy of case-detection algorithms were reviewed. Results Studies mainly used US hospital-based EMRs, and extracted information from text for 41 conditions using keyword searches, rule-based algorithms, and machine learning methods. There was no clear difference in case-detection algorithm accuracy between rule-based and machine learning methods of extraction. Inclusion of information from text resulted in a significant improvement in algorithm sensitivity and area under the receiver operating characteristic in comparison to codes alone (median sensitivity 78% (codes + text) vs 62% (codes), P = .03; median area under the receiver operating characteristic 95% (codes + text) vs 88% (codes), P = .025). Conclusions Text in EMRs is accessible, especially with open source information extraction algorithms, and significantly improves case detection when combined with codes. More harmonization of reporting within EMR studies is needed, particularly standardized reporting of algorithm accuracy metrics like positive predictive value (precision) and sensitivity (recall). PMID:26911811

  6. Automated extraction of radiation dose information for CT examinations.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan; Maidment, Andrew D A; Kim, Woojin; Boonn, William W

    2010-11-01

    Exposure to radiation as a result of medical imaging is currently in the spotlight, receiving attention from Congress as well as the lay press. Although scanner manufacturers are moving toward including effective dose information in the Digital Imaging and Communications in Medicine headers of imaging studies, there is a vast repository of retrospective CT data at every imaging center that stores dose information in an image-based dose sheet. As such, it is difficult for imaging centers to participate in the ACR's Dose Index Registry. The authors have designed an automated extraction system to query their PACS archive and parse CT examinations to extract the dose information stored in each dose sheet. First, an open-source optical character recognition program processes each dose sheet and converts the information to American Standard Code for Information Interchange (ASCII) text. Each text file is parsed, and radiation dose information is extracted and stored in a database which can be queried using an existing pathology and radiology enterprise search tool. Using this automated extraction pipeline, it is possible to perform dose analysis on the >800,000 CT examinations in the PACS archive and generate dose reports for all of these patients. It is also possible to more effectively educate technologists, radiologists, and referring physicians about exposure to radiation from CT by generating report cards for interpreted and performed studies. The automated extraction pipeline enables compliance with the ACR's reporting guidelines and greater awareness of radiation dose to patients, thus resulting in improved patient care and management. Copyright © 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
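
    A hedged sketch of the extraction step described above: run open-source OCR on a dose-sheet image and parse CTDIvol/DLP values with regular expressions. The regular expression and field layout are illustrative assumptions (actual dose sheets differ by scanner vendor), and the paper's pipeline also includes PACS queries and database storage not shown here.

        # OCR a CT dose sheet image and pull out CTDIvol / DLP values with regexes.
        # Requires: pip install pillow pytesseract  (plus the tesseract binary on PATH)
        import re
        from PIL import Image
        import pytesseract

        def parse_dose_sheet(image_path: str):
            """Return a list of (CTDIvol, DLP) pairs found on the dose sheet image."""
            text = pytesseract.image_to_string(Image.open(image_path))
            # Illustrative pattern: a CTDIvol value followed on the same line by a DLP value.
            pattern = re.compile(r"(\d+\.\d+)\s+(\d+\.\d+)")
            results = []
            for line in text.splitlines():
                if "Total" in line or not line.strip():
                    continue
                m = pattern.search(line)
                if m:
                    ctdi_vol, dlp = map(float, m.groups())
                    results.append((ctdi_vol, dlp))
            return results

        if __name__ == "__main__":
            for ctdi, dlp in parse_dose_sheet("dose_sheet.png"):
                print(f"CTDIvol = {ctdi} mGy, DLP = {dlp} mGy*cm")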

  7. Automated generation of individually customized visualizations of diagnosis-specific medical information using novel techniques of information extraction

    NASA Astrophysics Data System (ADS)

    Chen, Andrew A.; Meng, Frank; Morioka, Craig A.; Churchill, Bernard M.; Kangarloo, Hooshang

    2005-04-01

    Managing pediatric patients with neurogenic bladder (NGB) involves regular laboratory, imaging, and physiologic testing. Using input from domain experts and current literature, we identified specific data points from these tests to develop the concept of an electronic disease vector for NGB. An information extraction engine was used to extract the desired data elements from free-text and semi-structured documents retrieved from the patient"s medical record. Finally, a Java-based presentation engine created graphical visualizations of the extracted data. After precision, recall, and timing evaluation, we conclude that these tools may enable clinically useful, automatically generated, and diagnosis-specific visualizations of patient data, potentially improving compliance and ultimately, outcomes.

  8. Road Damage Extraction from Post-Earthquake Uav Images Assisted by Vector Data

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Dou, A.

    2018-04-01

    Extraction of road damage information after an earthquake is regarded as an urgent mission. To collect information about stricken areas, Unmanned Aerial Vehicles can be used to obtain images rapidly. This paper puts forward a novel method to detect road damage and proposes a coefficient to assess road accessibility. With the assistance of vector road data, image data from the Jiuzhaigou Ms7.0 Earthquake are tested. First, the image is clipped according to a vector buffer. Then a large-scale segmentation is applied to remove irrelevant objects. Thirdly, statistics of road features are analysed and damage information is extracted. Combined with the field investigation, the extraction result proves effective.
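
    A hedged sketch of the first step (clipping the UAV image to a buffer around the vector road data) using geopandas and rasterio; the file names, buffer width and CRS handling are illustrative assumptions, and the segmentation and damage statistics of the paper are not reproduced here.

        # Clip a UAV orthophoto to a corridor buffered around vector road centrelines.
        # Requires: pip install geopandas rasterio shapely
        import geopandas as gpd
        import rasterio
        import rasterio.mask

        ROADS_FILE = "roads.shp"           # illustrative file names
        IMAGE_FILE = "uav_ortho.tif"
        BUFFER_M = 15.0                    # half-width of the road corridor, in map units

        roads = gpd.read_file(ROADS_FILE)
        with rasterio.open(IMAGE_FILE) as src:
            roads = roads.to_crs(src.crs)                     # align coordinate systems
            corridor = roads.geometry.buffer(BUFFER_M)        # buffer each road centreline
            clipped, transform = rasterio.mask.mask(src, corridor, crop=True)
            meta = src.meta.copy()
            meta.update(height=clipped.shape[1], width=clipped.shape[2], transform=transform)

        with rasterio.open("road_corridor.tif", "w", **meta) as dst:
            dst.write(clipped)
        print("clipped corridor shape:", clipped.shape)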

  9. Holistic Context-Sensitivity for Run-Time Optimization of Flexible Manufacturing Systems.

    PubMed

    Scholze, Sebastian; Barata, Jose; Stokic, Dragan

    2017-02-24

    Highly flexible manufacturing systems require continuous run-time (self-) optimization of processes with respect to diverse parameters, e.g., efficiency, availability, energy consumption etc. A promising approach for achieving (self-) optimization in manufacturing systems is the usage of the context sensitivity approach based on data streaming from a large number of sensors and other data sources. Cyber-physical systems play an important role as sources of information to achieve context sensitivity. Cyber-physical systems can be seen as complex intelligent sensors providing data needed to identify the current context under which the manufacturing system is operating. In this paper, it is demonstrated how context sensitivity can be used to realize a holistic solution for (self-) optimization of discrete flexible manufacturing systems, by making use of cyber-physical systems integrated in manufacturing systems/processes. A generic approach for context sensitivity, based on self-learning algorithms, is proposed, aimed at a variety of manufacturing systems. The new solution encompasses a run-time context extractor and optimizer. Based on the self-learning module, both the context extractor and the optimizer continuously learn and improve their performance. The solution follows Service Oriented Architecture principles. The generic solution is developed and then applied to two very different manufacturing processes.

  10. Holistic Context-Sensitivity for Run-Time Optimization of Flexible Manufacturing Systems

    PubMed Central

    Scholze, Sebastian; Barata, Jose; Stokic, Dragan

    2017-01-01

    Highly flexible manufacturing systems require continuous run-time (self-) optimization of processes with respect to diverse parameters, e.g., efficiency, availability, energy consumption etc. A promising approach for achieving (self-) optimization in manufacturing systems is the usage of the context sensitivity approach based on data streaming from a large number of sensors and other data sources. Cyber-physical systems play an important role as sources of information to achieve context sensitivity. Cyber-physical systems can be seen as complex intelligent sensors providing data needed to identify the current context under which the manufacturing system is operating. In this paper, it is demonstrated how context sensitivity can be used to realize a holistic solution for (self-) optimization of discrete flexible manufacturing systems, by making use of cyber-physical systems integrated in manufacturing systems/processes. A generic approach for context sensitivity, based on self-learning algorithms, is proposed, aimed at a variety of manufacturing systems. The new solution encompasses a run-time context extractor and optimizer. Based on the self-learning module, both the context extractor and the optimizer continuously learn and improve their performance. The solution follows Service Oriented Architecture principles. The generic solution is developed and then applied to two very different manufacturing processes. PMID:28245564

  11. Children's Out-of-School Independently Mobile Trips, Active Travel, and Physical Activity: A Cross-Sectional Examination from the Kids in the City Study.

    PubMed

    Oliver, Melody; Parker, Karl; Witten, Karen; Mavoa, Suzanne; Badland, Hannah M; Donovan, Phil; Chaudhury, Moushumi; Kearn, Robin

    2016-03-01

    The study aim was to determine the association between children's objectively assessed moderate-to-vigorous physical activity (MVPA) and active trips (AT) and independently mobile trips (IM) during out-of-school hours. Children aged 9 to 13 years (n = 254) were recruited from 9 schools in Auckland, New Zealand between 2011 and 2012. Children completed travel diaries and wore accelerometers for 7 days. Parents provided demographic information. Geographic information systems-derived distance to school was calculated. Accelerometer data were extracted for out of school hours only. Percentage of time spent in MVPA (%MVPA), AT, and IM were calculated. Generalized estimating equations were used to determine the relationship between daily %MVPA and AT and between daily %MVPA and IM, accounting for age, sex, ethnicity, distance to school, day of the week, and numeric day of data collection. A significant positive relationship was observed between %MVPA and both AT and IM. For every unit increase in the daily percentage of trips made that were AT or IM, we found an average increase of 1.28% (95% CI 0.87%, 1.70%) and 1.15% (95% CI 0.71%, 1.59%) time in MVPA, respectively. Children's AT and IM are associated with increased MVPA during out-of-school hours.

  12. Resolution-Enhanced Harmonic and Interharmonic Measurement for Power Quality Analysis in Cyber-Physical Energy System.

    PubMed

    Liu, Yanchi; Wang, Xue; Liu, Youda; Cui, Sujin

    2016-06-27

    Power quality analysis issues, especially the measurement of harmonic and interharmonic in cyber-physical energy systems, are addressed in this paper. As new situations are introduced to the power system, the impact of electric vehicles, distributed generation and renewable energy has introduced extra demands to distributed sensors, waveform-level information and power quality data analytics. Harmonics and interharmonics, as the most significant disturbances, require carefully designed detection methods for an accurate measurement of electric loads whose information is crucial to subsequent analyzing and control. This paper gives a detailed description of the power quality analysis framework in networked environment and presents a fast and resolution-enhanced method for harmonic and interharmonic measurement. The proposed method first extracts harmonic and interharmonic components efficiently using the single-channel version of Robust Independent Component Analysis (RobustICA), then estimates the high-resolution frequency from three discrete Fourier transform (DFT) samples with little additional computation, and finally computes the amplitudes and phases with the adaptive linear neuron network. The experiments show that the proposed method is time-efficient and leads to a better accuracy of the simulated and experimental signals in the presence of noise and fundamental frequency deviation, thus providing a deeper insight into the (inter)harmonic sources or even the whole system.
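
    As a hedged illustration of the frequency-refinement idea (estimating an off-bin frequency from three DFT samples around the peak), the sketch below uses Jacobsen's complex-ratio estimator; this is a standard textbook interpolator for a rectangular window and not necessarily the exact formula used in the paper, and the RobustICA separation and neural amplitude/phase estimation stages are not shown.

        # Refine a DFT peak frequency using three bins around the maximum (Jacobsen estimator).
        import numpy as np

        fs, n = 5000.0, 1024
        true_f = 151.7                                   # deliberately off-bin (bin width ~4.88 Hz)
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * true_f * t) + 0.05 * np.random.randn(n)

        X = np.fft.rfft(x)
        k = np.argmax(np.abs(X[1:-1])) + 1               # coarse peak bin (avoid the edges)

        # Fractional bin offset estimated from three complex samples around the peak
        delta = -np.real((X[k + 1] - X[k - 1]) / (2 * X[k] - X[k - 1] - X[k + 1]))
        f_coarse = k * fs / n
        f_refined = (k + delta) * fs / n
        print(f"coarse: {f_coarse:.2f} Hz, refined: {f_refined:.2f} Hz, true: {true_f} Hz")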

  13. Resolution-Enhanced Harmonic and Interharmonic Measurement for Power Quality Analysis in Cyber-Physical Energy System

    PubMed Central

    Liu, Yanchi; Wang, Xue; Liu, Youda; Cui, Sujin

    2016-01-01

    Power quality analysis issues, especially the measurement of harmonic and interharmonic in cyber-physical energy systems, are addressed in this paper. As new situations are introduced to the power system, the impact of electric vehicles, distributed generation and renewable energy has introduced extra demands to distributed sensors, waveform-level information and power quality data analytics. Harmonics and interharmonics, as the most significant disturbances, require carefully designed detection methods for an accurate measurement of electric loads whose information is crucial to subsequent analyzing and control. This paper gives a detailed description of the power quality analysis framework in networked environment and presents a fast and resolution-enhanced method for harmonic and interharmonic measurement. The proposed method first extracts harmonic and interharmonic components efficiently using the single-channel version of Robust Independent Component Analysis (RobustICA), then estimates the high-resolution frequency from three discrete Fourier transform (DFT) samples with little additional computation, and finally computes the amplitudes and phases with the adaptive linear neuron network. The experiments show that the proposed method is time-efficient and leads to a better accuracy of the simulated and experimental signals in the presence of noise and fundamental frequency deviation, thus providing a deeper insight into the (inter)harmonic sources or even the whole system. PMID:27355946

  14. Ground-state-entanglement bound for quantum energy teleportation of general spin-chain models

    NASA Astrophysics Data System (ADS)

    Hotta, Masahiro

    2013-03-01

    Many-body quantum systems in the ground states have zero-point energy due to the uncertainty relation. In many cases, the system in the ground state exhibits spatially entangled energy density fluctuations via the noncommutativity of the energy density operators, though the total energy takes a fixed value, i.e., the lowest eigenvalue of the Hamiltonian. Quantum energy teleportation (QET) is a protocol for the extraction of the zero-point energy out of one subsystem using information of a remote measurement of another subsystem. From an operational viewpoint of protocol users, QET can be regarded as effective rapid energy transportation that breaks no physical laws, including causality and local energy conservation. In the protocol, the ground-state entanglement plays a crucial role. In this paper, we show analytically for a general class of spin-chain systems that the entanglement entropy is lower bounded by a positive quadratic function of the teleported energy between the regions of a QET protocol. This supports a general conjecture that ground-state entanglement is an evident physical resource for energy transportation in the context of QET. The result may also deepen our understanding of the energy density fluctuation in condensed-matter systems from a perspective of quantum information theory.

  15. Oil and Gas Extraction Sector (NAICS 211)

    EPA Pesticide Factsheets

    Environmental regulatory information for oil and gas extraction sectors, including oil and natural gas drilling. Includes information about NESHAPs for RICE and stationary combustion engines, and effluent guidelines for synthetic-based drilling fluids

  16. Note: extraction of temperature-dependent interfacial resistance of thermoelectric modules.

    PubMed

    Chen, Min

    2011-11-01

    This article discusses an approach for extracting the temperature dependency of the electrical interfacial resistance associated with thermoelectric devices. The method combines a traditional module-level test rig and a nonlinear numerical model of thermoelectricity to minimize measurement errors on the interfacial resistance. The extracted results represent useful data for investigating the characteristics of thermoelectric module resistance and comparing the performance of various modules.

  17. Comparison studies on catalytic properties of silver nanoparticles biosynthesized via aqueous leaves extract of Hibiscus rosa sinensis and Imperata cylindrica

    NASA Astrophysics Data System (ADS)

    Fairuzi, Afiza Ahmad; Bonnia, Noor Najmi; Akhir, Rabiatuladawiyah Md.; Akil, Hazizan Md; Yahya, Sabrina M.; Rahman, Norafifah A.

    2018-05-01

    Synthesis of silver nanoparticles has been developed using aqueous leaves extract (ALE) of Hibiscus rosa sinensis (H. rosa sinensis) and Imperata cylindrica (I. cylindrica). Both plant extracts act as reducing and capping agents. A colour change in the reaction mixture (pale yellow to dark brown) was observed during the synthesis process. The formation of silver nanoparticles was confirmed by the surface plasmon resonance (SPR) band in the range 300-700 nm for both leaves using UV-Vis spectroscopy. The reduction of silver ions to silver nanoparticles was completed within 2 hours for H. rosa sinensis and 30 minutes for I. cylindrica extract. The synthesized nanoparticles were characterized using UV-Vis spectroscopy, field emission scanning electron microscopy (FESEM) and Fourier transform infrared (FTIR) spectroscopy. The morphology of the silver nanoparticles was found to differ when synthesized using different plant extracts. In addition, this study also reports the effect of silver nanoparticles on the degradation of organic dye by sodium borohydride (NaBH4). Silver nanoparticle synthesis by aqueous leaf extract is a rapid, simple and inexpensive method compared to conventional physical and chemical methods. The efficiency of silver nanoparticles as a promising candidate for the catalysis of organic dyes by NaBH4 through electron transfer is established in the present study.

  18. Multi-Hadron Observables from Lattice Quantum Chromodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Maxwell

    2014-01-01

    We describe formal work that relates the finite-volume spectrum in a quantum field theory to scattering and decay amplitudes. This is of particular relevance to numerical calculations performed using Lattice Quantum Chromodynamics (LQCD). Correlators calculated using LQCD can only be determined on the Euclidean time axis. For this reason the standard method of determining scattering amplitudes via the Lehmann-Symanzik-Zimmermann reduction formula cannot be employed. By contrast, the finite-volume spectrum is directly accessible in LQCD calculations. Formalism for relating the spectrum to physical scattering observables is thus highly desirable. In this thesis we develop tools for extracting physical information from LQCD for four types of observables. First we analyze systems with multiple, strongly-coupled two-scalar channels. Here we accommodate both identical and nonidentical scalars, and in the latter case allow for degenerate as well as nondegenerate particle masses. Using relativistic field theory, and summing to all orders in perturbation theory, we derive a result relating the finite-volume spectrum to the two-to-two scattering amplitudes of the coupled-channel theory. This generalizes the formalism of Martin Lüscher for the case of single-channel scattering. Second we consider the weak decay of a single particle into multiple, coupled two-scalar channels. We show how the finite-volume matrix element extracted in LQCD is related to matrix elements of asymptotic two-particle states, and thus to decay amplitudes. This generalizes work by Laurent Lellouch and Martin Lüscher. Third we extend the method for extracting matrix elements by considering currents which insert energy, momentum and angular momentum. This allows one to extract transition matrix elements and form factors from LQCD. Finally we look beyond two-particle systems to those with three particles in asymptotic states. Working again to all orders in relativistic field theory, we derive a relation between the spectrum and an infinite-volume three-to-three scattering quantity. This final analysis is the most complicated of the four, because the all-orders summation is more difficult for this system, and also because a number of new technical issues arise in analyzing the contributing diagrams.

  19. Extracting and standardizing medication information in clinical text - the MedEx-UIMA system.

    PubMed

    Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C; Xu, Hua

    2014-01-01

    Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduced the open source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which mapped an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translated drug frequency information to the ISO standard. We processed 826 documents by both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing both results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5% respectively for encoding drug names to corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1% respectively for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/.
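
    As a rough illustration of the frequency-normalization idea, the hypothetical Python sketch below maps a few common free-text dosing frequencies to ISO 8601 duration strings; the lookup table and function are invented for illustration and are not part of the MedEx-UIMA code base.

    from typing import Optional

    # Hypothetical mapping from free-text dose frequencies to ISO 8601
    # dosing intervals; illustrative only, not the MedEx-UIMA tables.
    FREQ_TO_ISO = {
        "qd": "P1D", "daily": "P1D", "once daily": "P1D",
        "bid": "PT12H", "twice daily": "PT12H",
        "tid": "PT8H", "three times daily": "PT8H",
        "qid": "PT6H",
    }

    def normalize_frequency(text: str) -> Optional[str]:
        """Return an ISO 8601 dosing interval for a free-text frequency, if known."""
        return FREQ_TO_ISO.get(text.strip().lower())

    print(normalize_frequency("BID"))            # PT12H
    print(normalize_frequency("every 4 weeks"))  # None (not covered by this toy table)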

  20. Behavioral response to contamination risk information in a spatially explicit groundwater environment: Experimental evidence

    NASA Astrophysics Data System (ADS)

    Li, Jingyuan; Michael, Holly A.; Duke, Joshua M.; Messer, Kent D.; Suter, Jordan F.

    2014-08-01

    This paper assesses the effectiveness of aquifer monitoring information in achieving more sustainable use of a groundwater resource in the absence of management policy. Groundwater user behavior in the face of an irreversible contamination threat is studied by applying methods of experimental economics to scenarios that combine a physics-based, spatially explicit, numerical groundwater model with different representations of information about an aquifer and its risk of contamination. The results suggest that the threat of catastrophic contamination affects pumping decisions: pumping is significantly reduced in experiments where contamination is possible compared to those where pumping cost is the only factor discouraging groundwater use. The level of information about the state of the aquifer also affects extraction behavior. Pumping rates differ when information that synthesizes data on aquifer conditions (a "risk gauge") is provided, despite invariant underlying economic incentives, and this result does not depend on whether the risk information is location-specific or from a whole aquifer perspective. Interestingly, users increase pumping when the risk gauge signals good aquifer status compared to a no-gauge treatment. When the gauge suggests impending contamination, however, pumping declines significantly, resulting in a lower probability of contamination. The study suggests that providing relatively simple aquifer condition guidance derived from monitoring data can lead to more sustainable use of groundwater resources.

  1. Fully Convolutional Network Based Shadow Extraction from GF-2 Imagery

    NASA Astrophysics Data System (ADS)

    Li, Z.; Cai, G.; Ren, H.

    2018-04-01

    There are many shadows on high spatial resolution satellite images, especially in urban areas. Although shadows on imagery severely affect the information extraction of land cover or land use, they provide auxiliary information for building extraction, which is hard to achieve with satisfactory accuracy through image classification alone. This paper focused on a method of building shadow extraction by designing a fully convolutional network and training samples collected from GF-2 satellite imagery in the urban region of Changchun city. By means of spatial filtering and calculation of the adjacency relationship along the sunlight direction, small patches from vegetation or bridges were eliminated from the preliminarily extracted shadows. Finally, the building shadows were separated. The building shadow information extracted with the proposed method was compared with the results from traditional object-oriented supervised classification algorithms. The comparison showed that the deep learning approach can improve the accuracy to a large extent.
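
    For readers unfamiliar with fully convolutional segmentation, the PyTorch sketch below shows the general encoder-decoder shape of such a network producing a per-pixel shadow score; the channel counts, depth, and four-band input are assumptions for illustration and do not reproduce the architecture trained on the GF-2 samples.

    import torch
    import torch.nn as nn

    class TinyFCN(nn.Module):
        """Minimal fully convolutional network for binary (shadow / non-shadow)
        segmentation; layer sizes are illustrative assumptions only."""
        def __init__(self, in_ch=4):                  # 4 multispectral bands assumed
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
                nn.ConvTranspose2d(16, 1, 2, stride=2),  # 1-channel shadow score map
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))         # logits, same H x W as input

    # Example forward pass on a dummy 256 x 256 patch
    model = TinyFCN()
    logits = model(torch.randn(1, 4, 256, 256))
    print(logits.shape)   # torch.Size([1, 1, 256, 256])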

  2. An evaluation of the hypolipidemic effect of an extract of Hibiscus Sabdariffa leaves in hyperlipidemic Indians: a double blind, placebo controlled trial.

    PubMed

    Kuriyan, Rebecca; Kumar, Divya R; R, Rajendran; Kurpad, Anura V

    2010-06-17

    Hibiscus sabdariffa is used regularly in folk medicine to treat various conditions. The study was a double blind, placebo controlled, randomized trial. Sixty subjects with serum LDL values in the range of 130-190 mg/dl and with no history of coronary heart disease were randomized into experimental and placebo groups. The experimental group received 1 gm of the extract for 90 days while the placebo group received a similar amount of maltodextrin, in addition to dietary and physical activity advice for the control of their blood lipids. Anthropometry, blood biochemistry, diet and physical activity were assessed at baseline, day 45 and day 90. While body weight, serum LDL cholesterol and triglyceride levels decreased in both groups, there were no significant differences between the experimental and placebo groups. It is likely that the observed effects were a result of the patients following the standard dietary and physical activity advice. At a dose of 1 gm/day, Hibiscus sabdariffa leaf extract did not appear to have a blood lipid lowering effect. REFCTRI2009000472.

  3. Models Extracted from Text for System-Software Safety Analyses

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2010-01-01

    This presentation describes extraction and integration of requirements information and safety information in visualizations to support early review of completeness, correctness, and consistency of lengthy and diverse system safety analyses. Software tools have been developed and extended to perform the following tasks: 1) extract model parts and safety information from text in interface requirements documents, failure modes and effects analyses and hazard reports; 2) map and integrate the information to develop system architecture models and visualizations for safety analysts; and 3) provide model output to support virtual system integration testing. This presentation illustrates the methods and products with a rocket motor initiation case.

  4. Tissue cell assisted fabrication of tubular catalytic platinum microengines

    NASA Astrophysics Data System (ADS)

    Wang, Hong; Moo, James Guo Sheng; Pumera, Martin

    2014-09-01

    We report a facile platform for mass production of robust self-propelled tubular microengines. Tissue cells extracted from fruits of banana and apple, Musa acuminata and Malus domestica, are used as the support on which a thin platinum film is deposited by means of physical vapor deposition. Upon sonication of the cells/Pt-coated substrate in water, microscrolls of highly uniform sizes are spontaneously formed. Tubular microengines fabricated with the fruit cell assisted method exhibit a fast motion of ~100 bodylengths per s (~1 mm s-1). An extremely simple and affordable platform for mass production of the micromotors is crucial for the envisioned swarms of thousands and millions of autonomous micromotors performing biomedical and environmental remediation tasks. Electronic supplementary information (ESI) available: Related video. See DOI: 10.1039/c4nr03720k

  5. Health-related rehabilitation services: assessing the global supply of and need for human resources

    PubMed Central

    2011-01-01

    Background Human resources for rehabilitation are often a neglected component of health services strengthening and health workforce development. This may be partly related to weaknesses in the available research and evidence to inform advocacy and programmatic strategies. The objective of this study was to quantitatively describe the global situation in terms of supply of and need for human resources for health-related rehabilitation services, as a basis for strategy development of the workforce in physical and rehabilitation medicine. Methods Data for assessing supply of and need for rehabilitative personnel were extracted and analyzed from statistical databases maintained by the World Health Organization and other national and international health information sources. Standardized classifications were used to enhance cross-national comparability of findings. Results Large differences were found across countries and regions between assessed need for services requiring health workers associated to physical and rehabilitation medicine against estimated supply of health personnel skilled in rehabilitation services. Despite greater need, low- and middle-income countries tended to report less availability of skilled health personnel, although the strength of the supply-need relationship varied across geographical and economic country groupings. Conclusion The evidence base on human resources for health-related rehabilitation services remains fragmented, the result of limited availability and use of quality, comparable data and information within and across countries. This assessment offered the first global baseline, intended to catalyze further research that can be translated into evidence to support human resources for rehabilitation policy and practice. PMID:22004560

  6. A real time sorting algorithm to time sort any deterministic time disordered data stream

    NASA Astrophysics Data System (ADS)

    Saini, J.; Mandal, S.; Chakrabarti, A.; Chattopadhyay, S.

    2017-12-01

    In new-generation high-intensity high-energy physics experiments, millions of free-streaming high-rate data sources are to be read out. Free-streaming data with associated time-stamps can only be controlled by thresholds, as there is no trigger information available for the readout. Therefore, these readouts are prone to collecting a large amount of noise and unwanted data. For this reason, such experiments can have an output data rate several orders of magnitude higher than the useful signal data rate. It is therefore necessary to perform online processing of the data to extract useful information from the full data set. Without trigger information, pre-processing of the free-streaming data can only be done through time-based correlation among the data set. Multiple data sources have different path delays and bandwidth utilizations, and therefore the unsorted merged data require significant computational effort for real-time sorting before analysis. The present work reports a new high-speed scalable data stream sorting algorithm with its architectural design, verified through Field Programmable Gate Array (FPGA) based hardware simulation. Realistic time-based simulated data, likely to be collected in a high energy physics experiment, have been used to study the performance of the algorithm. The proposed algorithm uses parallel read-write blocks with added memory management and zero suppression features to make it efficient for high-rate data streams. This algorithm is best suited for online data streams with deterministic time disorder on FPGA-like hardware.
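
    The core idea of time-sorting a stream with a bounded disorder window can be illustrated in software. The Python sketch below buffers records in a min-heap and releases them once they can no longer be overtaken, assuming a known maximum disorder; it is a generic software analogue, not the FPGA parallel read/write architecture with memory management and zero suppression described in the paper.

    import heapq
    from typing import Iterable, Iterator, Tuple

    def time_sort_stream(stream: Iterable[Tuple[int, bytes]],
                         max_disorder: int) -> Iterator[Tuple[int, bytes]]:
        """Emit (timestamp, payload) records in time order, assuming no record
        arrives more than `max_disorder` ticks later than an already-seen one."""
        heap = []
        for ts, payload in stream:
            heapq.heappush(heap, (ts, payload))
            # Anything older than (newest timestamp - window) can be released safely.
            while heap and heap[0][0] <= ts - max_disorder:
                yield heapq.heappop(heap)
        while heap:
            yield heapq.heappop(heap)

    # Example: records arrive slightly out of order with disorder bound 3
    records = [(0, b"a"), (2, b"b"), (1, b"c"), (5, b"d"), (4, b"e"), (3, b"f")]
    print([ts for ts, _ in time_sort_stream(records, max_disorder=3)])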

  7. EXAMINATION OF THE ROLE OF PHYSICAL RESOLUTION AND SCALE ON SEDIMENT AND NUTRIENT YIELDS

    EPA Science Inventory

    Currently, watershed delineation and extraction of stream networks are accomplished with GIS databases of digital elevation maps (DEMs). The most common method for extracting channel networks requires the a-priori specification of a critical source area that is required for chann...

  8. Association of physical activity with cognition, metacognition and academic performance in children and adolescents: a protocol for systematic review and meta-analysis.

    PubMed

    Álvarez-Bueno, Celia; Pesce, Caterina; Cavero-Redondo, Iván; Sánchez-López, Mairena; Pardo-Guijarro, María Jesús; Martínez-Vizcaíno, Vicente

    2016-06-28

    Schools provide a relevant context for improving children's and adolescents' physical and mental health by increasing physical activity during school hours and/or beyond. The interest in the relationship between physical activity programmes and cognition during development has recently increased, with evidence suggesting a positive association. We present a protocol of systematic reviews and meta-analysis of intervention studies that, by determining the effects of chronic physical exercise on children's and adolescents' cognitive and metacognitive functions, cognitive life skills, academic behaviours and achievement, aims to ensure procedural objectivity and transparency, and maximise the extraction of relevant information to inform policy development. This protocol is guided by Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) and by the Cochrane Collaboration Handbook. Databases to be utilised for a thorough selection of the pertinent literature are MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Web of Science, PsycINFO and ERIC. Selection is proposed to encompass an international and a national publication level, with inclusion of experimental studies written in English or in Spanish, respectively. Also, relevant references included in the selected studies will be considered suitable for review as supplemental sources. We present an integrated approach to the methodological quality assessment of the selected studies, including the Jadad Scale for the assessment of the quality of randomised controlled trials and the Quality Assessment Tool for Quantitative Studies for pre-post studies and non-randomised controlled trials. The pre-post intervention mean differences will be the primary indicator of the intervention outcome. A subgroup analysis is proposed based on cognitive functions and their neural correlates, metacognitive functions and cognitive life skills, academic achievement areas and academic behaviours. PROSPERO CRD42015029913.

  9. Publishing Linked Open Data for Physical Samples - Lessons Learned

    NASA Astrophysics Data System (ADS)

    Ji, P.; Arko, R. A.; Lehnert, K.; Bristol, S.

    2016-12-01

    Most data and information about physical samples and associated sampling features currently reside in relational databases. Integrating common concepts from various databases has motivated us to publish Linked Open Data for collections of physical samples, using Semantic Web technologies including the Resource Description Framework (RDF), RDF Query Language (SPARQL), and Web Ontology Language (OWL). The goal of our work is threefold: To evaluate and select ontologies in different granularities for common concepts; to establish best practices and develop a generic methodology for publishing physical sample data stored in relational database as Linked Open Data; and to reuse standard community vocabularies from the International Commission on Stratigraphy (ICS), Global Volcanism Program (GVP), General Bathymetric Chart of the Oceans (GEBCO), and others. Our work leverages developments in the EarthCube GeoLink project and the Interdisciplinary Earth Data Alliance (IEDA) facility for modeling and extracting physical sample data stored in relational databases. Reusing ontologies developed by GeoLink and IEDA has facilitated discovery and integration of data and information across multiple collections including the USGS National Geochemical Database (NGDB), System for Earth Sample Registration (SESAR), and Index to Marine & Lacustrine Geological Samples (IMLGS). We have evaluated, tested, and deployed Linked Open Data tools including Morph, Virtuoso Server, LodView, LodLive, and YASGUI for converting, storing, representing, and querying data in a knowledge base (RDF triplestore). Using persistent identifiers such as Open Researcher & Contributor IDs (ORCIDs) and International Geo Sample Numbers (IGSNs) at the record level makes it possible for other repositories to link related resources such as persons, datasets, documents, expeditions, awards, etc. to samples, features, and collections. This work is supported by the EarthCube "GeoLink" project (NSF# ICER14-40221 and others) and the "USGS-IEDA Partnership to Support a Data Lifecycle Framework and Tools" project (USGS# G13AC00381).
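
    To make the triplestore-and-SPARQL workflow concrete, the Python/rdflib sketch below builds a two-triple in-memory graph for a sample and queries it; the namespace and predicate names are placeholders for illustration, not the GeoLink or IEDA ontology terms.

    from rdflib import Graph, Literal, Namespace, URIRef

    # Hypothetical namespace and predicates, for illustration only.
    EX = Namespace("http://example.org/samples/")

    g = Graph()
    sample = URIRef("http://example.org/samples/IGSN-DEMO0001")
    g.add((sample, EX.hasIGSN, Literal("IGSN-DEMO0001")))
    g.add((sample, EX.collectedBy, Literal("0000-0000-0000-0000")))  # placeholder ORCID

    # SPARQL over the in-memory knowledge base: find samples and their collectors.
    q = """
    PREFIX ex: <http://example.org/samples/>
    SELECT ?sample ?orcid WHERE {
        ?sample ex:hasIGSN ?igsn ;
                ex:collectedBy ?orcid .
    }
    """
    for row in g.query(q):
        print(row.sample, row.orcid)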

  10. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
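
    As a schematic of the modelling idea, a generic one-dimensional Ising joint distribution over binary methylation states x_n ∈ {-1, +1} can be written as below; the actual parameterization used by the method (for example, how the parameters depend on CpG density and spacing) is not reproduced here.

    P(x_1,\dots,x_N) \;=\; \frac{1}{Z}\,
      \exp\!\left( \sum_{n=1}^{N} a_n x_n \;+\; \sum_{n=1}^{N-1} c_n x_n x_{n+1} \right),
    \qquad
    Z \;=\; \sum_{x \in \{-1,+1\}^N}
      \exp\!\left( \sum_{n=1}^{N} a_n x_n + \sum_{n=1}^{N-1} c_n x_n x_{n+1} \right)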

  11. Mining of the social network extraction

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Hardi, M.; Syah, R.

    2017-01-01

    The use of the Web as social media is steadily gaining ground in the study of social actor behaviour. However, information on the Web can only be interpreted within the limits of the extraction method used, as with superficial methods for extracting social networks: such a method cannot reveal the behaviour of social actors directly, yet the extracted network still carries hidden information about them. Therefore, this paper aims to reveal such information through mining of the extracted social network. Social behaviour can be expressed through a set of words extracted from the list of snippets.

  12. Lexical and sublexical semantic preview benefits in Chinese reading.

    PubMed

    Yan, Ming; Zhou, Wei; Shu, Hua; Kliegl, Reinhold

    2012-07-01

    Semantic processing from parafoveal words is an elusive phenomenon in alphabetic languages, but it has been demonstrated only for a restricted set of noncompound Chinese characters. Using the gaze-contingent boundary paradigm, this experiment examined whether parafoveal lexical and sublexical semantic information was extracted from compound preview characters. Results generalized parafoveal semantic processing to this representative set of Chinese characters and extended the parafoveal processing to radical (sublexical) level semantic information extraction. Implications for notions of parafoveal information extraction during Chinese reading are discussed.

  13. Application of the medical data warehousing architecture EPIDWARE to epidemiological follow-up: data extraction and transformation.

    PubMed

    Kerkri, E; Quantin, C; Yetongnon, K; Allaert, F A; Dusserre, L

    1999-01-01

    In this paper, we present an application of EPIDWARE, a medical data warehousing architecture, to our epidemiological follow-up project. The aim of this project is to extract and regroup information from various information systems for epidemiological studies. We give a description of the requirements of the epidemiological follow-up project, such as the anonymity of medical data and the data file linkage procedure. We introduce the concept of Data Warehousing Architecture. The particularities of data extraction and transformation are presented and discussed.

  14. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    NASA Technical Reports Server (NTRS)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  15. Optimal tuning of a confined Brownian information engine.

    PubMed

    Park, Jong-Min; Lee, Jae Sung; Noh, Jae Dong

    2016-03-01

    A Brownian information engine is a device extracting mechanical work from a single heat bath by exploiting the information on the state of a Brownian particle immersed in the bath. As for engines, it is important to find the optimal operating condition that yields the maximum extracted work or power. The optimal condition for a Brownian information engine with a finite cycle time τ has been rarely studied because of the difficulty in finding the nonequilibrium steady state. In this study, we introduce a model for the Brownian information engine and develop an analytic formalism for its steady-state distribution for any τ. We find that the extracted work per engine cycle is maximum when τ approaches infinity, while the power is maximum when τ approaches zero.

  16. Relating structural growth environment to white spruce sapling establishment at the Forest-Tundra Ecotone

    NASA Astrophysics Data System (ADS)

    Maguire, A.; Boelman, N.; Griffin, K. L.; Jensen, J.; Hiers, E.; Johnson, D. M.; Vierling, L. A.; Eitel, J.

    2017-12-01

    The effect of climate change on treeline position at the latitudinal Forest-Tundra ecotone (FTE) is poorly understood. While the FTE is expansive (stretching 13,000 km across the panarctic), understanding relationships between climate and tree function may depend on very fine scale processes. High resolution tools are therefore needed to appropriately characterize the leading (northernmost) edge of the FTE. We hypothesized that microstructural metrics obtainable from lidar remote sensing may explain variation in the physical growth environment that governs sapling establishment. To test our hypothesis, we used terrestrial laser scanning (TLS) to collect highly spatially resolved 3-D structural information of white spruce (Picea glauca) saplings and their aboveground growth environment at the leading edge of a FTE in northern Alaska and Northwest Territories, Canada. Coordinates of sapling locations were extracted from the 3-D TLS data. Within each sampling plot, 20 sets of coordinates were randomly selected from regions where no saplings were present. Ground roughness, canopy roughness, average aspect, average slope, average curvature, wind shelter index, and wetness index were extracted from point clouds within a variable radius from all coordinates. Generalized linear models (GLM) were fit to determine which microstructural metrics were most strongly associated with sapling establishment. Preliminary analyses of three plots suggest that vegetation roughness, wetness index, ground roughness, and slope were the most important terrain metrics governing sapling presence (Figure 1). Comprehensive analyses will include eight plots and GLMs optimized for the scale at which structural parameters affect sapling establishment. Spatial autocorrelation of sample locations will be accounted for in the models. Because these analyses address how the physical growth environment affects sapling establishment, model outputs will provide information for improving understanding of the ecological processes that regulate treeline dynamics. Moreover, establishing relationships between the remotely sensed structural growth environment and tree establishment provides new ways of spatially scaling across larger areas to study ecological change at the FTE.
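
    A minimal sketch of the kind of presence/absence model described here is shown below using Python and statsmodels, with fabricated stand-in predictors; the real TLS-derived terrain metrics, the variable-radius extraction, and the spatial-autocorrelation handling are not reproduced.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical plot-level predictors (e.g. ground roughness, wetness index,
    # slope) and sapling presence/absence; values are simulated for illustration.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    presence = (X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=100) > 0).astype(int)

    # Binomial GLM (logistic link) relating terrain metrics to establishment.
    model = sm.GLM(presence, sm.add_constant(X), family=sm.families.Binomial())
    print(model.fit().summary())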

  17. Tai Chi Improves Sleep Quality in Healthy Adults and Patients with Chronic Conditions: A Systematic Review and Meta-analysis

    PubMed Central

    Raman, Gowri; Zhang, Yuan; Minichiello, Vincent J; D'Ambrosio, Carolyn M.; Wang, Chenchen

    2017-01-01

    Background Physical activity and exercise appear to improve sleep quality. However, the quantitative effects of Tai Chi on sleep quality in the adult population have rarely been examined. We conducted a systematic review and meta-analysis evaluating the effects of Tai Chi on sleep quality in healthy adults and disease populations. Methods Medline, Cochrane Central databases, and review of references were searched through July 31, 2013. English-language studies of all designs evaluating Tai Chi's effect on sleep outcomes in adults were examined. Data were extracted and verified by 2 reviewers. Extracted information included study setting and design, population characteristics, type and duration of interventions, outcomes, risk of bias and main results. Random-effects model meta-analysis was used to assess the magnitude of treatment effect when at least 3 trials reported on the same sleep outcomes. Results Eleven studies (9 randomized and 2 non-randomized trials) totaling 994 subjects published between 2004 and 2012 were identified. All studies except one reported the Pittsburgh Sleep Quality Index. Nine randomized trials reported that Tai Chi, practiced 1.5 to 3 hours each week for a duration of 6 to 24 weeks, significantly improved sleep quality (Effect Size, 0.89; 95% confidence interval [CI], 0.28 to 1.50) in community-dwelling healthy participants and in patients with chronic conditions. Improvement in health outcomes including physical performance, pain reduction, and psychological well-being occurred in the Tai Chi group compared with various controls. Limitations Studies were heterogeneous and some trials were lacking in methodological rigor. Conclusions Tai Chi significantly improved sleep quality in both healthy adults and patients with chronic health conditions, which suggests that Tai Chi may be considered as an alternative behavioral therapy in the treatment of insomnia. High-quality, well-controlled randomized trials are needed to better inform clinical decisions. PMID:28845367

  18. Patient outcomes after critical illness: a systematic review of qualitative studies following hospital discharge.

    PubMed

    Hashem, Mohamed D; Nallagangula, Aparna; Nalamalapu, Swaroopa; Nunna, Krishidhar; Nausran, Utkarsh; Robinson, Karen A; Dinglas, Victor D; Needham, Dale M; Eakin, Michelle N

    2016-10-26

    There is growing interest in patient outcomes following critical illness, with an increasing number and different types of studies conducted, and a need for synthesis of existing findings to help inform the field. For this purpose we conducted a systematic review of qualitative studies evaluating patient outcomes after hospital discharge for survivors of critical illness. We searched the PubMed, EMBASE, CINAHL, PsycINFO, and CENTRAL databases from inception to June 2015. Studies were eligible for inclusion if the study population was >50 % adults discharged from the ICU, with qualitative evaluation of patient outcomes. Studies were excluded if they focused on specific ICU patient populations or specialty ICUs. Citations were screened in duplicate, and two reviewers extracted data sequentially for each eligible article. Themes related to patient outcome domains were coded and categorized based on the main domains of the Patient Reported Outcomes Measurement Information System (PROMIS) framework. A total of 2735 citations were screened, and 22 full-text articles were eligible, with year of publication ranging from 1995 to 2015. All of the qualitative themes were extracted from eligible studies and then categorized using PROMIS descriptors: satisfaction with life (16 studies), including positive outlook, acceptance, gratitude, independence, boredom, loneliness, and wishing they had not lived; mental health (15 articles), including symptoms of post-traumatic stress disorder, anxiety, depression, and irritability/anger; physical health (14 articles), including mobility, activities of daily living, fatigue, appetite, sensory changes, muscle weakness, and sleep disturbances; social health (seven articles), including changes in friends/family relationships; and ability to participate in social roles and activities (six articles), including hobbies and disability. ICU survivors may experience positive emotions and life satisfaction; however, a wide range of mental, physical, social, and functional sequelae occur after hospital discharge. These findings are important for understanding patient-centered outcomes in critical care and providing focus for future interventional studies aimed at improving outcomes of importance to ICU survivors.

  19. Measurements of the Total Reaction Cross Sections for 6,8He and 8,9Li Nuclei with Energies of (25-45)A MeV on natAl, natTa and natPb

    NASA Astrophysics Data System (ADS)

    Erdemchimeg, B.; Artukh, A. G.; Klygin, S. A.; Kononenko, G. A.; Kyslukha, D. A.; Sereda, Yu. M.; Vorontzov, A. N.; Lukyanov, S. M.; Penionzhkevich, Yu. E.; Davaa, S.; Khuukhenkhuu, G.; Borcea, C.; Rotaru, F.; Stanoiu, M.; Martina, L.; Saillant, F.; Raine, B.

    2015-06-01

    Measurements of total nuclear reaction cross sections (σR) have long been of interest since they tell us about the radii and transparency of nuclei and give clues to the understanding of their structure. For studies of unstable nuclei, in particular the physical properties of halo nuclei and the neutron skin thickness, it is valuable to know not only the root-mean-square (rms) radii but also the details of nucleus-nucleus potentials. Our goal was to study total reaction cross sections (σR) by a direct measurement technique (the so-called beam attenuation or transmission method), which allows model-independent information to be extracted. The interaction radii for 6He and 8,9Li were extracted and are in agreement with previous measurements at similar energies (about a few tens of A MeV). Our results show a tendency of increasing radii as a function of the mass of the secondary targets.

  20. Concept of operations for knowledge discovery from Big Data across enterprise data warehouses

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Olama, Mohammed M.; McNair, Allen W.; Nutaro, James J.

    2013-05-01

    The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as volume, velocity, variety and complexity of enterprise data keeps increasing, the next generation analysts are facing several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering are newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge nurturing data-system architectures.

  1. Thermal-to-visible face recognition using partial least squares.

    PubMed

    Hu, Shuowen; Choi, Jonghyun; Chan, Alex L; Schwartz, William Robson

    2015-03-01

    Although visible face recognition has been an active area of research for several decades, cross-modal face recognition has only been explored by the biometrics community relatively recently. Thermal-to-visible face recognition is one of the most difficult cross-modal face recognition challenges, because of the difference in phenomenology between the thermal and visible imaging modalities. We address the cross-modal recognition problem using a partial least squares (PLS) regression-based approach consisting of preprocessing, feature extraction, and PLS model building. The preprocessing and feature extraction stages are designed to reduce the modality gap between the thermal and visible facial signatures, and facilitate the subsequent one-vs-all PLS-based model building. We incorporate multi-modal information into the PLS model building stage to enhance cross-modal recognition. The performance of the proposed recognition algorithm is evaluated on three challenging datasets containing visible and thermal imagery acquired under different experimental scenarios: time-lapse, physical tasks, mental tasks, and subject-to-camera range. These scenarios represent difficult challenges relevant to real-world applications. We demonstrate that the proposed method performs robustly for the examined scenarios.
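
    The one-vs-all PLS model-building stage can be sketched with scikit-learn as below; the preprocessing, feature extraction, and multi-modal fusion of the actual system are omitted, and the function names, component count, and toy data are illustrative assumptions.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def fit_one_vs_all_pls(gallery_feats, gallery_ids, n_components=10):
        # One PLS regression model per gallery subject: +1 for that subject's
        # samples, -1 for everyone else's.
        models = {}
        for subj in np.unique(gallery_ids):
            y = np.where(gallery_ids == subj, 1.0, -1.0)
            models[subj] = PLSRegression(n_components=n_components).fit(gallery_feats, y)
        return models

    def identify(models, probe_feat):
        # The probe is assigned to the subject whose model responds most strongly.
        scores = {s: float(m.predict(probe_feat.reshape(1, -1)).ravel()[0])
                  for s, m in models.items()}
        return max(scores, key=scores.get)

    # Toy usage with random "features" (in practice these would come from the
    # preprocessing and feature-extraction stages).
    feats = np.random.randn(40, 64)
    ids = np.repeat(np.arange(4), 10)
    models = fit_one_vs_all_pls(feats, ids, n_components=5)
    print(identify(models, feats[0]))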

  2. Top physics at CDF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, R.E.

    1997-01-01

    We report on top physics results using a 100 pb⁻¹ data sample of pp̄ collisions at √s = 1.8 TeV collected with the Collider Detector at Fermilab (CDF). We have identified top signals in a variety of decay channels, and used these channels to extract a measurement of the top mass and production cross section. A subset of the data (67 pb⁻¹) is used to determine M_top = 176 ± 8 (stat) ± 10 (syst) and σ(tt̄) = 7.6 (+2.4, −2.0) pb. We present studies of the kinematics of tt̄ events and extract the first direct measurement of V_tb. Finally, we indicate prospects for future study of top physics at the Tevatron.

  3. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    PubMed

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of biomedical domain and lack of appropriate natural language processing (NLP) techniques. High quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe). The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata as compared to existing NLP pipelines such as MetaMap.

  4. FIR: An Effective Scheme for Extracting Useful Metadata from Social Media.

    PubMed

    Chen, Long-Sheng; Lin, Zue-Cheng; Chang, Jing-Rong

    2015-11-01

    Recently, the use of social media for health information exchange is expanding among patients, physicians, and other health care professionals. In medical areas, social media allows non-experts to access, interpret, and generate medical information for their own care and the care of others. Researchers have paid much attention to social media in medical education, patient-pharmacist communication, adverse drug reaction detection, the impact of social media on medicine and healthcare, and so on. However, relatively few papers discuss how to effectively extract useful knowledge from the huge amount of textual comments in social media. Therefore, this study aims to propose a Fuzzy adaptive resonance theory network based Information Retrieval (FIR) scheme by combining a Fuzzy adaptive resonance theory (ART) network, Latent Semantic Indexing (LSI), and association rules (AR) discovery to extract knowledge from social media. In our FIR scheme, the Fuzzy ART network is first employed to segment comments. Next, for each customer segment, we use the LSI technique to retrieve important keywords. Then, in order to make the extracted keywords understandable, association rules mining is applied to organize these extracted keywords into metadata. The extracted useful voices of customers are then transformed into design needs by using Quality Function Deployment (QFD) for further decision making. Unlike conventional information retrieval techniques, which acquire too many keywords to get key points, our FIR scheme can extract understandable metadata from social media.

  5. Mars Target Encyclopedia: Information Extraction for Planetary Science

    NASA Astrophysics Data System (ADS)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  6. Noncontact Measurements Of Torques In Shafts

    NASA Technical Reports Server (NTRS)

    Schwartzbart, Aaron

    1991-01-01

    Additional information extracted from eddy-current proximeter. Positioned over rotating shaft, measures both displacement of and torsion in shaft. Torque applied to shaft calculable from output of proximeter. Possible to extract torsion information from existing tape-recorded proximeter data.

  7. Techniques for information extraction from compressed GPS traces : final report.

    DOT National Transportation Integrated Search

    2015-12-31

    Developing techniques for extracting information requires a good understanding of methods used to compress the traces. Many techniques for compressing trace data : consisting of position (i.e., latitude/longitude) and time values have been developed....

  8. Feasibility of Extracting Key Elements from ClinicalTrials.gov to Support Clinicians' Patient Care Decisions.

    PubMed

    Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2016-01-01

    Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians.

  9. Feature extraction applied to agricultural crops as seen by LANDSAT

    NASA Technical Reports Server (NTRS)

    Kauth, R. J.; Lambeck, P. F.; Richardson, W.; Thomas, G. S.; Pentland, A. P. (Principal Investigator)

    1979-01-01

    The physical interpretation of the spectral-temporal structure of LANDSAT data can be conveniently described in terms of a graphic descriptive model called the Tasseled Cap. This model has been a source of development not only in crop-related feature extraction, but also for data screening and for haze effects correction. Following its qualitative description and an indication of its applications, the model is used to analyze several feature extraction algorithms.

  10. Visualizing Capsaicinoids: Colorimetric Analysis of Chili Peppers

    ERIC Educational Resources Information Center

    Thompson, Robert Q.; Chu, Christopher; Gent, Robin; Gould, Alexandra P.; Rios, Laura; Vertigan, Theresa M.

    2012-01-01

    A colorimetric method for total capsaicinoids in chili pepper ("Capsicum") fruit is described. The placental material of the pepper, containing 90% of the capsaicinoids, was physically separated from the colored materials in the pericarp and extracted twice with methanol, capturing 85% of the remaining capsaicinoids. The extract, evaporated and…

  11. Extraction of actionable information from crowdsourced disaster data.

    PubMed

    Kiatpanont, Rungsun; Tanlamai, Uthai; Chongstitvatana, Prabhas

    Natural disasters cause enormous damage to countries all over the world. To deal with these common problems, different activities are required for disaster management at each phase of the crisis. There are three groups of activities as follows: (1) make sense of the situation and determine how best to deal with it, (2) deploy the necessary resources, and (3) harmonize as many parties as possible, using the most effective communication channels. Current technological improvements and developments now enable people to act as real-time information sources. As a result, inundation with crowdsourced data poses a real challenge for a disaster manager. The problem is how to extract the valuable information from a gigantic data pool in the shortest possible time so that the information is still useful and actionable. This research proposed an actionable-data-extraction process to deal with the challenge. Twitter was selected as a test case because messages posted on Twitter are publicly available. Hashtag, an easy and very efficient technique, was also used to differentiate information. A quantitative approach to extract useful information from the tweets was supported and verified by interviews with disaster managers from many leading organizations in Thailand to understand their missions. The information classifications extracted from the collected tweets were first performed manually, and then the tweets were used to train a machine learning algorithm to classify future tweets. One particularly useful, significant, and primary section was the request for help category. The support vector machine algorithm was used to validate the results from the extraction process of 13,696 sample tweets, with over 74 percent accuracy. The results confirmed that the machine learning technique could significantly and practically assist with disaster management by dealing with crowdsourced data.
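
    A minimal version of the tweet-classification step, using a TF-IDF plus linear SVM baseline from scikit-learn on a few invented English example tweets, is sketched below; the study's Thai-language data, annotation scheme, and exact model configuration are not reproduced.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Toy labeled tweets (hypothetical examples, not the study's dataset).
    tweets = [
        "Need rescue boat, water rising fast #flood",
        "Trapped on roof, please send help",
        "Stay safe everyone, heavy rain expected tonight",
        "Road closed downtown due to flooding",
    ]
    labels = ["request_for_help", "request_for_help", "general_info", "general_info"]

    # TF-IDF features + linear SVM, a common baseline for actionable-tweet
    # classification; not the exact configuration reported in the paper.
    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    clf.fit(tweets, labels)
    print(clf.predict(["please help, we are stuck near the river"]))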

  12. DEXTER: Disease-Expression Relation Extraction from Text.

    PubMed

    Gupta, Samir; Dingerdissen, Hayley; Ross, Karen E; Hu, Yu; Wu, Cathy H; Mazumder, Raja; Vijay-Shanker, K

    2018-01-01

    Gene expression levels affect biological processes and play a key role in many diseases. Characterizing expression profiles is useful for clinical research, and diagnostics and prognostics of diseases. There are currently several high-quality databases that capture gene expression information, obtained mostly from large-scale studies, such as microarray and next-generation sequencing technologies, in the context of disease. The scientific literature is another rich source of information on gene expression-disease relationships that not only have been captured from large-scale studies but have also been observed in thousands of small-scale studies. Expression information obtained from literature through manual curation can extend expression databases. While many of the existing databases include information from literature, they are limited by the time-consuming nature of manual curation and have difficulty keeping up with the explosion of publications in the biomedical field. In this work, we describe an automated text-mining tool, Disease-Expression Relation Extraction from Text (DEXTER) to extract information from literature on gene and microRNA expression in the context of disease. One of the motivations in developing DEXTER was to extend the BioXpress database, a cancer-focused gene expression database that includes data derived from large-scale experiments and manual curation of publications. The literature-based portion of BioXpress lags behind significantly compared to expression information obtained from large-scale studies and can benefit from our text-mined results. We have conducted two different evaluations to measure the accuracy of our text-mining tool and achieved average F-scores of 88.51 and 81.81% for the two evaluations, respectively. Also, to demonstrate the ability to extract rich expression information in different disease-related scenarios, we used DEXTER to extract information on differential expression information for 2024 genes in lung cancer, 115 glycosyltransferases in 62 cancers and 826 microRNA in 171 cancers. All extractions using DEXTER are integrated in the literature-based portion of BioXpress.Database URL: http://biotm.cis.udel.edu/DEXTER.

  13. Measurement of dielectric constant of organic solvents by indigenously developed dielectric probe

    NASA Astrophysics Data System (ADS)

    Keshari, Ajay Kumar; Rao, J. Prabhakar; Rao, C. V. S. Brahmmananda; Ramakrishnan, R.; Ramanarayanan, R. R.

    2018-04-01

    The extraction, separation and purification of actinides (uranium and plutonium) from various matrices are important steps in the nuclear fuel cycle. One of the separation processes adopted on an industrial scale is liquid-liquid extraction, or solvent extraction, which uses a specific ligand/extractant in conjunction with a suitable diluent. Solvent extraction, or liquid-liquid extraction, involves the partitioning of the solute between two immiscible phases. In most cases, one of the phases is aqueous, and the other is an organic solvent. The solvent used in solvent extraction should be selective for the metal of interest, it should have an optimum distribution ratio, and the loaded metal should be easily stripped from the organic phase under suitable experimental conditions. Important physical properties of the solvent include density, viscosity, phase separation time, interfacial surface tension and the polarity of the extractant.

  14. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    PubMed

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for the respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves at multiple wavelengths are selectively extracted by utilizing the 2π ambiguity and subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique, which is also demonstrated experimentally.
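    The record gives no formulas, but for orientation the conventional single-wavelength four-step phase-shifting relation, from which such subtraction procedures are built, reads (standard textbook result, not the five-hologram dual-wavelength algorithm of the paper):

        \varphi(x,y) = \arctan\!\left[\frac{I(x,y;3\pi/2) - I(x,y;\pi/2)}{I(x,y;0) - I(x,y;\pi)}\right],

    where I(x,y;\delta) is the intensity recorded with phase shift \delta. The dual-wavelength scheme generalizes this by choosing wavelength-specific phase shifts so that the crosstalk terms cancel in the subtractions.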

  15. Thermal quantum coherence and correlation in the extended XY spin chain

    NASA Astrophysics Data System (ADS)

    Sha, Ya-Ting; Wang, Yue; Sun, Zheng-Hang; Hou, Xi-Wen

    2018-05-01

    Quantum coherence and correlation of thermal states in the extended XY spin chain are studied in terms of the recently proposed l1 norm, skew information, and Bures distance of geometric discord (BGD), respectively. The entanglement, measured via the concurrence, is calculated for reference. A two-dimensional susceptibility is introduced to explore the capability of these measures in highlighting the critical lines associated with quantum phase transitions in the model. It is shown that the susceptibility of the skew information and of BGD is a genuine indicator of quantum phase transitions and also characterizes the factorization, whereas the l1 norm behaves trivially at the factorization. An explicit scaling law of BGD is captured at low temperature in the XY model. In contrast to the entanglement, quantum coherence reveals a kind of long-range nonclassical correlation. Moreover, an explicit relation among the model parameters is extracted for the factorization line in the extended model. These results are instructive for the understanding of quantum coherence and correlation in quantum information theory, and of quantum phase transitions and factorization in condensed-matter physics.
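    As a numerical reference for the two coherence quantifiers named above, a small sketch assuming the standard definitions (l1 norm of coherence as the sum of absolute off-diagonal elements, and Wigner-Yanase skew information I(rho, K) = -1/2 Tr([sqrt(rho), K]^2)) might look as follows; the test state and observable are arbitrary single-qubit placeholders, not the thermal states of the extended XY chain.

        import numpy as np
        from scipy.linalg import sqrtm

        def l1_norm_coherence(rho):
            """Sum of the absolute values of the off-diagonal elements of rho."""
            return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

        def skew_information(rho, K):
            """Wigner-Yanase skew information I(rho, K) = -1/2 Tr([sqrt(rho), K]^2)."""
            s = sqrtm(rho)
            comm = s @ K - K @ s
            return float(-0.5 * np.trace(comm @ comm).real)

        # Arbitrary valid density matrix and observable (placeholders for illustration)
        rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
        sigma_z = np.diag([1.0, -1.0])
        print(l1_norm_coherence(rho))          # 0.4
        print(skew_information(rho, sigma_z))  # positive; vanishes for states diagonal in the z basis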

  16. Smart Health - Potential and Pathways: A Survey

    NASA Astrophysics Data System (ADS)

    Arulananthan, C.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Healthcare is an important field of research in which individuals or groups can engage in the self-tracking of biological, physical, behavioral or environmental information. Valuable information is hidden within massive healthcare data, and the quantity of available unstructured data has been expanding on an exponential scale. Newly emerging disruptive technologies can address many of the challenges facing data analysis and can extract valuable information via data analytics. Connected wellness in healthcare retrieves a patient's physiological, pathological and behavioral parameters through sensors in order to analyse the inner workings of the human body. Disruptive technologies can take healthcare from a reactive, illness-driven system to a proactive, wellness-driven one. There is a need to strive towards a smart health system that is wellness-driven rather than illness-driven, the latter being today's biggest problem in healthcare. Wellness-driven analytics applications help to promote the healthiest possible living environment, called "Smart Health", and deliver an empowered quality of living. This survey reveals and opens possible directions, including previously untouched areas, in the line of research on smart health and its computing technologies.

  17. Role of maternal occupational physical activity and psychosocial stressors on adverse birth outcomes

    PubMed Central

    Lee, Laura J; Symanski, Elaine; Lupo, Philip J; Tinker, Sarah C; Razzaghi, Hilda; Chan, Wenyaw; Hoyt, Adrienne T; Canfield, Mark A

    2016-01-01

    Objectives We examined the association of an array of estimated maternal occupational physical activities and psychosocial stressors during pregnancy with the odds of preterm birth (PTB) and small-for-gestational-age (SGA) birth. Methods Data for infants born without major birth defects, delivered from 1997 to 2009, whose mothers reported working at least 1 month during pregnancy were obtained from the National Birth Defects Prevention Study. We linked occupational codes to the US Department of Labor's Occupational Information Network, which provides estimates of exposure for multiple domains of physical activity and psychosocial stressors by occupational category. We conducted factor analysis using principal components extraction with 17 occupational activities and calculated factor scores. ORs for PTB and SGA across quartiles of factor scores in each trimester were computed using logistic regression. Results Factor analysis grouped the occupational domains into 4 groups based on factor loadings: 'occupational physical activity', 'interpersonal stressor', 'automated work' and 'job responsibility'. High levels of 'occupational physical activity' were significantly associated with SGA (adjusted OR (AOR) for the highest quartile compared with the lowest quartile of factor score: 1.36; 95% CI 1.02 to 1.82; p for trend=0.001) and were also positively associated with PTB (AOR: 1.24; 95% CI 0.93 to 1.64; p for trend=0.01). No clear results were observed across the domains of psychosocial stressors. Conclusions Our findings expand understanding of the associations of occupational physical activity and psychosocial stressors with PTB and SGA and suggest that additional research is needed to further examine these relationships. PMID:27919059
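    The statistical pipeline described above (principal components extraction on 17 occupational exposure estimates, factor scores, then logistic regression across factor-score quartiles) can be outlined in a short sketch; the simulated exposure matrix, outcome, number of retained factors and use of plain PCA scores as a stand-in for rotated factor scores are all placeholders, not the NBDPS data or the authors' exact model specification.

        import numpy as np
        import pandas as pd
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        from sklearn.linear_model import LogisticRegression

        # Placeholder exposure matrix: rows = mothers, columns = 17 O*NET-derived
        # occupational activity/stressor estimates (simulated for illustration).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 17))
        y = rng.integers(0, 2, size=500)       # placeholder binary outcome, e.g. SGA yes/no

        # Principal components extraction and factor scores (4 retained factors,
        # mirroring the four groups reported in the abstract).
        scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(X))

        # Quartiles of the first factor score ("occupational physical activity" analogue)
        quartile = pd.qcut(scores[:, 0], 4, labels=False)

        # Crude (unadjusted) logistic regression of the outcome on quartile indicators
        design = pd.get_dummies(quartile, prefix="q", drop_first=True).astype(float)
        model = LogisticRegression().fit(design, y)
        print(np.exp(model.coef_))             # odds ratios for Q2-Q4 vs Q1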

  18. Handling of subpixel structures in the application of satellite derived irradiance data for solar energy system analysis - a review

    NASA Astrophysics Data System (ADS)

    Beyer, Hans Georg

    2016-04-01

    With the increasing availability of satellite-derived irradiance information, this type of data set is more and more in use for the design and operation of solar energy systems, most notably PV and CSP systems. By this, the need for data measured on-site is reduced. However, due to basic limitations of the satellite-derived data, several requirements of the intended applications cannot be met with this data type directly. Raw satellite information has to be enhanced in both spatial and temporal resolution by additional information in order to be fully applicable to all aspects of the modelling of solar energy systems. To cope with this problem, several individual and collaborative projects have been performed in recent years or are ongoing. Approaches are on the one hand based on pasting synthesized high-resolution data into the low-resolution original sets. A prerequisite is an appropriate model, validated against real-world data. For the case of irradiance data, these models can be extracted either directly from ground-measured data sets, from data on the cloud situation as gained from the images of sky cameras, or from Monte Carlo-initialized physical models. The current models refer to the spatial structure of the cloud fields. Dynamics are imposed by moving the cloud structures according to a large-scale cloud motion vector, either extracted from the dynamics inferred from consecutive satellite images or taken from a mesoscale meteorological model. Dynamic irradiance information is then derived from the cloud field structure and the cloud motion vector. This contribution, which is linked to subtask A (Solar Resource Applications for High Penetration of Solar Technologies) of IEA SHC Task 46, presents the different approaches and discusses examples in view of validation, the need for auxiliary information, and general applicability.
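    As an illustration of the cloud-motion-vector idea mentioned above, a generic FFT-based cross-correlation between two consecutive cloud-index images can estimate a single large-scale displacement; the function below is a self-contained sketch with a synthetic cloud field, not the method of any specific project cited in this review.

        import numpy as np

        def cloud_motion_vector(img_prev, img_next):
            """Estimate one large-scale displacement (dy, dx) in pixels between two
            consecutive cloud-index images via circular FFT cross-correlation."""
            a = img_prev - img_prev.mean()
            b = img_next - img_next.mean()
            # Cross-correlation via the Fourier domain (assumes modest shifts)
            corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            # Map wrapped indices to signed displacements
            if dy > a.shape[0] // 2:
                dy -= a.shape[0]
            if dx > a.shape[1] // 2:
                dx -= a.shape[1]
            return int(dy), int(dx)

        # Synthetic example: a "cloud" blob shifted by (3, 5) pixels between time steps
        field = np.zeros((64, 64))
        field[20:30, 20:30] = 1.0
        print(cloud_motion_vector(field, np.roll(field, (3, 5), axis=(0, 1))))  # (3, 5)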

  19. Optimized aqueous extraction of saponins from bitter melon for production of a saponin-enriched bitter melon powder.

    PubMed

    Tan, Sing P; Vuong, Quan V; Stathopoulos, Costas E; Parks, Sophie E; Roach, Paul D

    2014-07-01

    Aqueous extracts of bitter melon, Momordica charantia L. (Cucurbitaceae), are proposed to have health-promoting properties due to their content of saponins and their antioxidant activity. However, the optimal conditions for the aqueous extraction of saponins from bitter melon and the effects of spray drying have not been established. Therefore, this study aimed to optimize the aqueous extraction of saponins from bitter melon using response surface methodology, prepare a powder using spray drying, and compare the powder's physical properties, components and antioxidant capacity with those of aqueous and ethanol freeze-dried bitter melon powders and a commercial powder. The optimal aqueous extraction conditions were determined to be 40 °C for 15 min, and the water-to-sample ratio was chosen to be 20:1 mL/g. For many of its physical properties, components and antioxidant capacity, the aqueous spray-dried powder was comparable to the aqueous and ethanol freeze-dried bitter melon powders and the commercial powder. The optimal conditions for the aqueous extraction of saponins from bitter melon followed by spray drying gave a high-quality powder in terms of saponins and antioxidant activity. This study highlights that bitter melon is a rich source of saponin compounds and their associated antioxidant activities, which may provide health benefits. The findings of the current study will help with the development of extraction and drying technologies for the preparation of a saponin-enriched powdered extract from bitter melon. The powdered extract may have potential as a nutraceutical supplement or as a value-added ingredient for incorporation into functional foods. © 2014 Institute of Food Technologists®
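    Response surface methodology, as used above for the extraction conditions, typically fits a second-order polynomial to a designed experiment and locates its optimum; the sketch below fits such a surface with ordinary least squares over temperature and time, using invented yield values purely for illustration (they are not the study's data, and the study additionally considered the water-to-sample ratio).

        import numpy as np

        # Hypothetical design points: temperature (deg C), time (min), saponin yield
        T = np.array([30, 30, 50, 50, 40, 40, 40, 26, 54], dtype=float)
        t = np.array([10, 20, 10, 20, 15, 15, 15, 15, 15], dtype=float)
        y = np.array([4.1, 4.3, 4.0, 3.8, 4.8, 4.7, 4.9, 4.2, 3.9])

        # Second-order model: y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
        X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Evaluate the fitted surface on a grid and report the predicted optimum
        Tg, tg = np.meshgrid(np.linspace(26, 54, 100), np.linspace(10, 20, 100))
        yg = (coef[0] + coef[1] * Tg + coef[2] * tg
              + coef[3] * Tg**2 + coef[4] * tg**2 + coef[5] * Tg * tg)
        i = np.unravel_index(np.argmax(yg), yg.shape)
        print(f"predicted optimum near T = {Tg[i]:.0f} C, t = {tg[i]:.0f} min")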

  20. Increasing physical activity with mobile devices: a meta-analysis.

    PubMed

    Fanning, Jason; Mullen, Sean P; McAuley, Edward

    2012-11-21

    Regular physical activity has established physical and mental health benefits; however, merely one quarter of the U.S. adult population meets national physical activity recommendations. In an effort to engage individuals who do not meet these guidelines, researchers have utilized popular emerging technologies, including mobile devices (i.e., personal digital assistants [PDAs], mobile phones). This study is the first to synthesize current research focused on the use of mobile devices for increasing physical activity. The objective was to conduct a meta-analysis of research utilizing mobile devices to influence physical activity behavior. The aims of this review were to: (1) examine the efficacy of mobile devices in the physical activity setting, (2) explore and discuss implementation of device features across studies, and (3) make recommendations for future intervention development. We searched electronic databases (PubMed, PsycINFO, SCOPUS) and identified publications through reference lists and requests to experts in the field of mobile health. Studies were included that provided original data and aimed to influence physical activity through dissemination or collection of intervention materials with a mobile device. Data were extracted to calculate effect sizes for individual studies, as were study descriptives. A random-effects meta-analysis was conducted using the Comprehensive Meta-Analysis software suite. Study quality was assessed using the quality-of-execution portion of the Guide to Community Preventive Services data extraction form. Four studies were of "good" quality and seven of "fair" quality. In total, 1351 individuals participated in 11 unique studies, from which 18 effects were extracted and synthesized, yielding an overall weighted mean effect size of g = 0.54 (95% CI 0.17 to 0.91, P = .01). Research utilizing mobile devices is gaining in popularity, and this study suggests that this platform is an effective means for influencing physical activity behavior. Our focus must be on the best possible use of these tools to measure and understand behavior. Therefore, theoretically grounded behavior change interventions that recognize and act on the potential of smartphone technology could provide investigators with an effective tool for increasing physical activity.
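    The study pooled its 18 effects with the Comprehensive Meta-Analysis software; as a generic illustration of what a random-effects pooling involves, the sketch below implements the standard DerSimonian-Laird estimator on invented Hedges' g values and variances (not the effects extracted in the review).

        import numpy as np

        def random_effects_pool(g, v):
            """DerSimonian-Laird random-effects pooling of effect sizes g with variances v.
            Returns the pooled effect, its 95% CI, and the between-study variance tau^2."""
            w = 1.0 / v                               # fixed-effect weights
            g_fixed = np.sum(w * g) / np.sum(w)
            Q = np.sum(w * (g - g_fixed) ** 2)        # heterogeneity statistic
            df = len(g) - 1
            C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (Q - df) / C)             # between-study variance estimate
            w_star = 1.0 / (v + tau2)                 # random-effects weights
            g_pooled = np.sum(w_star * g) / np.sum(w_star)
            se = np.sqrt(1.0 / np.sum(w_star))
            return g_pooled, (g_pooled - 1.96 * se, g_pooled + 1.96 * se), tau2

        # Invented Hedges' g values and variances, for illustration only
        g = np.array([0.2, 0.8, 0.5, 1.1, 0.3])
        v = np.array([0.05, 0.10, 0.08, 0.12, 0.06])
        print(random_effects_pool(g, v))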
