Studies on Experimental Ontology and Knowledge Service Development in Bio-Environmental Engineering
NASA Astrophysics Data System (ADS)
Zhang, Yunliang
2018-01-01
The existing domain-related ontologies and information service patterns are analyzed, and the main problems faced by the experimental-scheme knowledge service are clarified. An ontology framework model for the knowledge service of bio-environmental engineering is proposed from the aspects of experimental materials, experimental conditions and experimental instruments, and this ontology is combined with existing knowledge organization systems to organize scientific and technological literature, data and experimental schemes. With similarity and priority calculations, it can improve research in the related domains.
Early Family Environments of Obese and Non-Obese College Students.
ERIC Educational Resources Information Center
Hailey, B. Jo; Sison, Gustave F. P., Jr.
Although case studies and anecdotal information have suggested that differences exist between the early family environments of obese and non-obese individuals, no experimental research exists. Undergraduates completed the Family Environment Scale (FES) and a questionnaire concerning past and present weight information. Subjects were classified as…
McElreath, Richard; Bell, Adrian V; Efferson, Charles; Lubell, Mark; Richerson, Peter J; Waring, Timothy
2008-11-12
The existence of social learning has been confirmed in diverse taxa, from apes to guppies. In order to advance our understanding of the consequences of social transmission and evolution of behaviour, however, we require statistical tools that can distinguish among diverse social learning strategies. In this paper, we advance two main ideas. First, social learning is diverse, in the sense that individuals can take advantage of different kinds of information and combine them in different ways. Examining learning strategies for different information conditions illuminates the more detailed design of social learning. We construct and analyse an evolutionary model of diverse social learning heuristics, in order to generate predictions and illustrate the impact of design differences on an organism's fitness. Second, in order to eventually escape the laboratory and apply social learning models to natural behaviour, we require statistical methods that do not depend upon tight experimental control. Therefore, we examine strategic social learning in an experimental setting in which the social information itself is endogenous to the experimental group, as it is in natural settings. We develop statistical models for distinguishing among different strategic uses of social information. The experimental data strongly suggest that most participants employ a hierarchical strategy that uses both average observed pay-offs of options as well as frequency information, the same model predicted by our evolutionary analysis to dominate a wide range of conditions.
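The hierarchical strategy favoured by the experimental data, which consults pay-off information and frequency information together, can be sketched roughly as follows. This is only an illustration: the function names, the softmax weighting and the parameter values are assumptions, not the authors' fitted model.

```python
# Hypothetical sketch of a hierarchical social-learning strategy: prefer
# pay-off information when it discriminates among options, and fall back on
# frequency (conformist) information otherwise. All names and the softmax
# form are illustrative assumptions, not the paper's fitted model.
import math

def choose_option(observed_payoffs, observed_counts, payoff_weight=3.0, conformity=2.0):
    """Return per-option choice probabilities.

    observed_payoffs: dict option -> mean pay-off observed in others (None/absent if unseen)
    observed_counts:  dict option -> how many others chose that option
    """
    options = list(observed_counts)
    informative = [o for o in options if observed_payoffs.get(o) is not None]
    if len(informative) >= 2:   # pay-off information discriminates: use it
        scores = {o: math.exp(payoff_weight * observed_payoffs[o]) for o in informative}
    else:                       # otherwise fall back on frequency information
        scores = {o: observed_counts[o] ** conformity for o in options}
    total = sum(scores.values())
    return {o: s / total for o, s in scores.items()}
```

With a conformity exponent above 1, the frequency branch is disproportionately drawn to the majority option, which is the conformist component of the strategy.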
Experimental economics' inconsistent ban on deception.
Hersch, Gil
2015-08-01
According to what I call the 'argument from public bads', if a researcher deceived subjects in the past, there is a chance that subjects will discount the information that a subsequent researcher provides, thus compromising the validity of the subsequent researcher's experiment. While this argument is taken to justify an existing informal ban on explicit deception in experimental economics, it can also apply to implicit deception, yet implicit deception is not banned and is sometimes used in experimental economics. Thus, experimental economists are being inconsistent when they appeal to the argument from public bads to justify banning explicit deception but not implicit deception. Copyright © 2015 Elsevier Ltd. All rights reserved.
Experimental demonstration of a flexible time-domain quantum channel.
Xing, Xingxing; Feizpour, Amir; Hayat, Alex; Steinberg, Aephraim M
2014-10-20
We present an experimental realization of a flexible quantum channel where the Hilbert space dimensionality can be controlled electronically. Using electro-optical modulators (EOM) and narrow-band optical filters, quantum information is encoded and decoded in the temporal degrees of freedom of photons from a long-coherence-time single-photon source. Our results demonstrate the feasibility of a generic scheme for encoding and transmitting multidimensional quantum information over the existing fiber-optical telecommunications infrastructure.
76 FR 10365 - Agency Information Collection Request. 60-Day Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... research integrity? To answer the first question, a pretest-posttest control group experimental design is... existing syllabus for a research integrity or research ethics course for the treatment group. The control group will use the existing syllabus with no video simulation in class. Participants will be graduate...
Evaluation of a conventional chip seal under an overlay to mitigate reflective cracking (informal).
DOT National Transportation Integrated Search
2015-03-01
The Billings District initiated an experimental project in placing a conventional : chip seal (as an interlayer) on an existing pavement prior to an overlay : (composed of a 0.25 PMS thickness). The intent of the chip seal (CS) was to : seal exist...
2014-06-01
analytics to evaluate document relevancy and order query results. Background: information environment complexity; relevancy solutions for big data. Primary topic: Data, Information and Knowledge. Alternatives: Organizational Concepts and Approaches; Experimentation, Metrics, and Analysis...
Diffusion of Technical Agricultural Information in Chile.
ERIC Educational Resources Information Center
Brown, Marion Ray
This study examined current thought concerning the role of mass communication in economic development in developing nations; analyzed existing efforts to diffuse agricultural technology in Chile; assessed the effectiveness of various approaches; and tested the effects (primarily on knowledge levels) of an experimental technical information service…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... revise the existing nonessential experimental population designation of the Mexican wolf (Canis lupus... lupus baileyi), which also published in the Federal Register on June 13, 2013, should be submitted to... (Canis lupus baileyi) by listing it as endangered (78 FR 35664). FOR FURTHER INFORMATION CONTACT: Mexican...
A Decade of Literacy Research in the "Journal of Experimental Education."
ERIC Educational Resources Information Center
Knudson, Ruth E.; Theurer, Joan Leikam; Onofrey, Karen A.
This study examined various trends that exist among the 246 refereed articles published in the "Journal of Experimental Education" between 1990 and 1999. The study's results showed that 39 (16%) of the Journal articles published during this 10-year span focused on literacy. Information was categorized for each article with respect to authors'…
John Kilgo; Mark Vukovich
2014-01-01
Thresholds in response by cavity-nesting bird populations to variations in the snag resource are poorly understood. In addition, limited information exists on the value of artificially created snags for cavity-nesting birds. Therefore, uncertainty exists about whether artificially created snags can yield a positive population response among snag-dependent birds. We used...
Biological Information Processing in Single Microtubules
2014-03-05
single microtubules. Selected presentations: workshop on quantum biology, Google Mountain View campus, 22 October 2010; Paul Davies, Beyond Center at Arizona State University (Phoenix), workshop on quantum biology and cancer research, experimental studies on single microtubules, 25-27 October 2010, Tempe, Arizona, Arizona State University, USA; quantum aspects of microtubules: direct experimental evidence for the existence of quantum states in microtubules, towards a
Efficient experimental design of high-fidelity three-qubit quantum gates via genetic programming
NASA Astrophysics Data System (ADS)
Devra, Amit; Prabhu, Prithviraj; Singh, Harpreet; Arvind; Dorai, Kavita
2018-03-01
We have designed efficient quantum circuits for the three-qubit Toffoli (controlled-controlled-NOT) and the Fredkin (controlled-SWAP) gate, optimized via genetic programming methods. The gates thus obtained were experimentally implemented on a three-qubit NMR quantum information processor with high fidelity. Toffoli and Fredkin gates in conjunction with the single-qubit Hadamard gates form a universal gate set for quantum computing and are an essential component of several quantum algorithms. Genetic algorithms are stochastic search algorithms based on the logic of natural selection and biological genetics and have been widely used for quantum information processing applications. We devised a new selection mechanism within the genetic algorithm framework to select individuals from a population. We call this mechanism the "Luck-Choose" mechanism and were able to achieve faster convergence to a solution using this mechanism, as compared to existing selection mechanisms. The optimization was performed under the constraint that the experimentally implemented pulses are of short duration and can be implemented with high fidelity. We demonstrate the advantage of our pulse sequences by comparing our results with existing experimental schemes and other numerical optimization methods.
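To illustrate where a selection mechanism sits in a genetic-algorithm loop, the following sketch uses generic roulette-wheel (fitness-proportional) selection as a stand-in; the authors' "Luck-Choose" mechanism is not reproduced here, and all names are illustrative assumptions.

```python
# Minimal genetic-algorithm skeleton showing where a custom selection
# mechanism (such as the paper's "Luck-Choose") would plug in. The
# roulette-wheel selection below is a generic placeholder, not the
# authors' method; fitness values must be non-negative.
import random

def roulette_select(population, fitness, k):
    """Standard fitness-proportional selection (placeholder for Luck-Choose)."""
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=k)

def evolve(init_pop, fitness, mutate, generations=50, seed=0):
    """Run selection + mutation for a fixed number of generations."""
    random.seed(seed)
    pop = list(init_pop)
    for _ in range(generations):
        parents = roulette_select(pop, fitness, k=len(pop))
        pop = [mutate(p) for p in parents]
    return max(pop, key=fitness)
```

Swapping `roulette_select` for another mechanism changes only the selection pressure, which is how alternative mechanisms can alter convergence speed.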
Snag longevity in managed northern hardwoods
Mariko Yamasaki; William B. Leak
2006-01-01
Little information on standing snag and coarse woody debris longevity exists for New England forest types. Forest managers thus lack the information on changes over time of the habitat components influenced by the decay process. We examined the fate of 568 snags that occurred on a long-term hardwood growth study on the Bartlett Experimental Forest, NH. Approximately...
Coeli M. Hoover
2010-01-01
Although long-term research is a critical tool for answering forest management questions, managers must often make decisions before results from such experiments are available. One way to meet those information needs is to reanalyze existing long-term data sets to address current research questions; the Forest Service Experimental Forests and Ranges (EFRs) network...
Modal Analysis Using the Singular Value Decomposition and Rational Fraction Polynomials
2017-04-06
The programs are designed for experimental datasets with multiple drive and response points and have proven effective even for systems with numerous closely-spaced
An Extraction Method of an Informative DOM Node from a Web Page by Using Layout Information
NASA Astrophysics Data System (ADS)
Tsuruta, Masanobu; Masuyama, Shigeru
We propose a method, LM, for extracting the informative DOM node from a Web page as preprocessing for Web content mining. LM uses layout data of DOM nodes generated by a generic Web browser, together with a learning set consisting of hundreds of Web pages annotated with their informative DOM nodes. Our method does not require large-scale crawling of the whole Web site to which the target Web page belongs. We design LM so that it uses the information in the learning set more efficiently than the existing method that uses the same learning set. In our experiments, we evaluate methods obtained by combining an informative-DOM-node extraction method (the proposed method or an existing one) with existing noise-elimination methods: Heur, which removes advertisements and link lists by heuristics, and CE, which removes DOM nodes that also occur in other Web pages of the same Web site. Experimental results show that 1) LM outperforms the other methods for extracting the informative DOM node, and 2) the combination method (LM, {CE(10), Heur}) based on LM (precision: 0.755, recall: 0.826, F-measure: 0.746) outperforms the other combination methods.
High-Fidelity Simulations of Moving and Flexible Airfoils at Low Reynolds Numbers (Postprint)
2010-02-01
...phase-averaged structures for both values of Reynolds number are found to be in good agreement with the experimental data. Finally, the effect of
NASA Astrophysics Data System (ADS)
Chau, H. F.; Wang, Qinan; Wong, Cardythy
2017-02-01
Recently, Chau [Phys. Rev. A 92, 062324 (2015), 10.1103/PhysRevA.92.062324] introduced an experimentally feasible qudit-based quantum-key-distribution (QKD) scheme. In that scheme, one bit of information is phase encoded in the prepared state in a 2^n-dimensional Hilbert space in the form (|i⟩ ± |j⟩)/√2 with n ≥ 2. For each qudit prepared and measured in the same two-dimensional Hilbert subspace, one bit of raw secret key is obtained in the absence of transmission error. Here we show that by modifying the basis announcement procedure, the same experimental setup can generate n bits of raw key for each qudit prepared and measured in the same basis in the noiseless situation. The reason is that in addition to the phase information, each qudit also carries information on the Hilbert subspace used. The additional (n − 1) bits of raw key come from a clever utilization of this extra piece of information. We prove the unconditional security of this modified protocol and compare its performance with other existing provably secure qubit- and qudit-based protocols on the market in the one-way classical communication setting. Interestingly, we find that for the case of n = 2, the secret key rate of this modified protocol using a nondegenerate random quantum code to perform one-way entanglement distillation is equal to that of the six-state scheme.
Prevention of Blast-Related Injuries
2013-07-01
Contents: Introduction; Statement of Work; Task I Report: 1. Adjustment of the experimental design and methodology; 2. Preparations for Blast
Infrared Ship Target Segmentation Based on Spatial Information Improved FCM.
Bai, Xiangzhi; Chen, Zhiguo; Zhang, Yu; Liu, Zhaoying; Lu, Yi
2016-12-01
Segmentation of infrared (IR) ship images is always a challenging task, because of the intensity inhomogeneity and noise. The fuzzy C-means (FCM) clustering is a classical method widely used in image segmentation. However, it has some shortcomings, like not considering the spatial information or being sensitive to noise. In this paper, an improved FCM method based on the spatial information is proposed for IR ship target segmentation. The improvements include two parts: 1) adding the nonlocal spatial information based on the ship target and 2) using the spatial shape information of the contour of the ship target to refine the local spatial constraint by Markov random field. In addition, the results of K-means are used to initialize the improved FCM method. Experimental results show that the improved method is effective and performs better than the existing methods, including the existing FCM methods, for segmentation of the IR ship images.
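For reference, the baseline FCM updates that such spatial variants extend can be sketched as follows. This is plain FCM without the nonlocal or shape constraints described above; the fixed iteration count and the small regularizer on distances are simplifying assumptions.

```python
# Minimal standard fuzzy C-means (FCM): alternating updates of cluster
# centers and soft memberships. This is the baseline that spatial FCM
# variants extend; no spatial terms are included here.
import numpy as np

def fcm(data, c, m=2.0, iters=100, seed=0):
    """data: (n, d) array; returns (centers, memberships of shape (c, n))."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(data)))
    u /= u.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = um @ data / um.sum(axis=1, keepdims=True)
        # squared distances of every point to every center, regularized
        d2 = ((data[None, :, :] - centers[:, None, :]) ** 2).sum(-1) + 1e-12
        u = d2 ** (-1.0 / (m - 1))          # standard FCM membership update
        u /= u.sum(axis=0)
    return centers, u
```

For image segmentation, `data` would hold per-pixel feature vectors; the spatial variants in the abstract modify the membership update with neighborhood terms.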
77 FR 20792 - New England Fishery Management Council (NEFMC); Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-06
... experimental fishery permit applications that have been made available since the January 2012 Council meeting... will present information about future plans for the federal sea scallop survey, including the integration of Habcam (towed underwater camera) results with existing survey technologies. The Enforcement...
NASA Technical Reports Server (NTRS)
Khovanskiy, Y. D.; Kremneva, N. I.
1975-01-01
Problems and methods of automating information retrieval operations in a data bank used for long-term storage and retrieval of data from scientific experiments are discussed. Existing information retrieval languages are analyzed along with those being developed. The results of studies discussing the application of the descriptive 'Kristall' language used in the 'ASIOR' automated information retrieval system are presented. The development and use of a specialized language of the classification-descriptive type, using universal decimal classification indices as the main descriptors, is described.
Experimental and Computational Analysis of a Miniature Ramjet at Mach 4.0
2013-09-01
...intermittent after the Second World War, with the most well-known example being Lockheed Martin's SR-71 Blackbird using the Pratt & Whitney J58 turbojet
The laterality effect: myth or truth?
Cohen Kadosh, Roi
2008-03-01
Tzelgov and colleagues [Tzelgov, J., Meyer, J., and Henik, A. (1992). Automatic and intentional processing of numerical information. Journal of Experimental Psychology: Learning, Memory and Cognition, 18, 166-179.] offered the existence of the laterality effect as a post-hoc explanation for their results. According to this effect, numbers are classified automatically as small/large relative to a standard point under autonomous processing of numerical information. However, the genuineness of the laterality effect was never examined, or was confounded with the numerical distance effect. In the current study, I controlled the numerical distance effect and observed that the laterality effect does exist and affects automatic processing of numerical information. The current results suggest that the laterality effect should be taken into account when using paradigms that require automatic numerical processing, such as Stroop-like or priming tasks.
A BIOASSAY THAT IDENTIFIES POSTNATAL FUNCTIONAL DEFICITS IN MICE PRENATALLY EXPOSED TO XENOBIOTICS
Experimental strategies to evaluate adverse postnatal effects due to prenatal exposure exist for many organ systems. Often, however, there is insufficient information to suggest that a particular organ system(s) may be sensitive to the test agent. A single bioassay to identify ...
Role of sufficient statistics in stochastic thermodynamics and its implication to sensory adaptation
NASA Astrophysics Data System (ADS)
Matsumoto, Takumi; Sagawa, Takahiro
2018-04-01
A sufficient statistic is a central concept in statistics: a random variable that carries all the information required for an inference task. We investigate the roles of sufficient statistics and related quantities in stochastic thermodynamics. Specifically, we prove that for general continuous-time bipartite networks, the existence of a sufficient statistic implies that an informational quantity called the sensory capacity takes its maximum value. Since the maximal sensory capacity imposes a constraint that the energetic efficiency cannot exceed one-half, our result implies that the existence of a sufficient statistic is inevitably accompanied by energetic dissipation. We also show that, in a particular parameter region of linear Langevin systems, there exists an optimal noise intensity at which the sensory capacity, the information-thermodynamic efficiency, and the total entropy production are optimized at the same time. We apply our general result to a model of sensory adaptation of E. coli and find that the sensory capacity is nearly maximal with experimentally realistic parameters.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-09
... comparative visuals, and using vaguer language. This study is designed to apply the existing comparative... Effectiveness) studies designed to explore comparative effectiveness. When this large project is completed, FDA... Request; Experimental Study of Comparative Direct-to-Consumer Advertising AGENCY: Food and Drug...
Experimental Studies of Incubation: Searching for the Elusive.
ERIC Educational Resources Information Center
Olton, Robert M.
1979-01-01
The author discusses the phenomenon of incubation in creative problem solving, distinguishes it from "creative worrying" and "tip of the tongue" experiences, and reviews research to indicate a lack of evidence of incubation's existence in well-controlled studies. Note: For related information, see EC 120 232-238. (CL)
Critical evaluation and thermodynamic assessment of the Zr-Pb system
NASA Astrophysics Data System (ADS)
Arias, D.; Abriata, J.; Gribaudo, L.
1996-04-01
In the present work we have critically evaluated the existing experimental information regarding phase stabilities in the Zr-Pb system. From this, the Zr-Pb phase diagram has been assessed up to 50 at.% Pb. The proposed diagram has been further supported by a thermodynamic model calculation.
Definition of smolder experiments for Spacelab
NASA Technical Reports Server (NTRS)
Summerfield, M.; Messina, N. A.; Ingram, L. S.
1979-01-01
The feasibility of conducting experiments in space on smoldering combustion was studied to conceptually design specific smoldering experiments to be conducted in the Shuttle/Spacelab System. Design information for identified experiment critical components is provided. The analytical and experimental basis for conducting research on smoldering phenomena in space was established. Physical descriptions of the various competing processes pertaining to smoldering combustion were identified. The need for space research was defined based on limitations of existing knowledge and limitations of ground-based reduced-gravity experimental facilities.
Abduallah, Yasser; Turki, Turki; Byron, Kevin; Du, Zongxuan; Cervantes-Cervantes, Miguel; Wang, Jason T L
2017-01-01
Gene regulation is a series of processes that control gene expression and its extent. The connections among genes and their regulatory molecules, usually transcription factors, and a descriptive model of such connections are known as gene regulatory networks (GRNs). Elucidating GRNs is crucial to understand the inner workings of the cell and the complexity of gene interactions. To date, numerous algorithms have been developed to infer gene regulatory networks. However, as the number of identified genes increases and the complexity of their interactions is uncovered, networks and their regulatory mechanisms become cumbersome to test. Furthermore, sifting through experimental results requires an enormous amount of computation, resulting in slow data processing. Therefore, new approaches are needed to expeditiously analyze copious amounts of experimental data resulting from cellular GRNs. To meet this need, cloud computing is promising as reported in the literature. Here, we propose new MapReduce algorithms for inferring gene regulatory networks on a Hadoop cluster in a cloud environment. These algorithms employ an information-theoretic approach to infer GRNs using time-series microarray data. Experimental results show that our MapReduce program is much faster than an existing tool, while achieving slightly better prediction accuracy.
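The information-theoretic core of such inference, scoring regulator-target pairs by mutual information between discretized expression time series, can be sketched in a single-machine form. The MapReduce distribution and the authors' exact estimator are omitted; the function names and the equal-width binning are illustrative assumptions.

```python
# Sketch of information-theoretic GRN scoring: estimate mutual information
# (in bits) between discretized expression series for every gene pair.
# Plug-in estimator with equal-width binning; purely illustrative.
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in MI estimate (bits) between two equal-length discrete series."""
    n = len(xs)
    cx, cy, cxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * math.log2((c / n) / ((cx[x] / n) * (cy[y] / n)))
               for (x, y), c in cxy.items())

def discretize(series, bins=2):
    """Equal-width binning of a numeric series into `bins` levels."""
    lo, hi = min(series), max(series)
    w = (hi - lo) / bins or 1.0            # guard against constant series
    return [min(int((v - lo) / w), bins - 1) for v in series]

def score_pairs(expression):
    """expression: dict gene -> list of expression values; returns MI per ordered pair."""
    disc = {g: discretize(vals) for g, vals in expression.items()}
    return {(a, b): mutual_information(disc[a], disc[b])
            for a in disc for b in disc if a != b}
```

In a MapReduce setting, each mapper would compute scores for a shard of gene pairs and reducers would collect the highest-scoring candidate edges.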
NASA Astrophysics Data System (ADS)
Howard, N. T.; Holland, C.; White, A. E.; Greenwald, M.; Candy, J.; Creely, A. J.
2016-05-01
To better understand the role of cross-scale coupling in experimental conditions, a series of multi-scale gyrokinetic simulations were performed on Alcator C-Mod L-mode plasmas. These simulations, performed using all experimental inputs and a realistic ion to electron mass ratio ((m_i/m_e)^(1/2) = 60.0), simultaneously capture turbulence at the ion (k_θ ρ_s ~ O(1.0)) and electron (k_θ ρ_e ~ O(1.0)) scales. Direct comparison with experimental heat fluxes and electron profile stiffness indicates that Electron Temperature Gradient (ETG) streamers and strong cross-scale turbulence coupling likely exist in both of the experimental conditions studied. The coupling between ion and electron scales takes the form of energy cascades, modification of zonal flow dynamics, and the effective shearing of ETG turbulence by long-wavelength Ion Temperature Gradient (ITG) turbulence. The tightly coupled nature of ITG and ETG turbulence in these realistic plasma conditions is shown to have significant implications for the interpretation of experimental transport and fluctuations. Initial attempts are made to develop a "rule of thumb" based on linear physics, to help predict when cross-scale coupling plays an important role and to inform future modeling of experimental discharges. The details of the simulations, comparisons with experimental measurements, and implications for both modeling and experimental interpretation are discussed.
Experimental demonstration of a fully inseparable quantum state with nonlocalizable entanglement
Mičuda, M.; Koutný, D.; Miková, M.; Straka, I.; Ježek, M.; Mišta, L.
2017-01-01
Localizability of entanglement in fully inseparable states is a key ingredient of assisted quantum information protocols as well as measurement-based models of quantum computing. We investigate the existence of fully inseparable states with nonlocalizable entanglement, that is, with entanglement which cannot be localized between any pair of subsystems by any measurement on the remaining part of the system. It is shown that nonlocalizable entanglement occurs already in suitable mixtures of a three-qubit GHZ state and white noise. Further, we generalize this set of states to a two-parametric family of fully inseparable three-qubit states with nonlocalizable entanglement. Finally, we demonstrate experimentally the existence of nonlocalizable entanglement by preparing and characterizing one state from the family using correlated single photons and a linear optical circuit. PMID:28344336
Spacetime Replication of Quantum Information Using (2, 3) Quantum Secret Sharing and Teleportation
NASA Astrophysics Data System (ADS)
Wu, Yadong; Khalid, Abdullah; Davijani, Masoud; Sanders, Barry
The aim of this work is to construct a protocol to replicate quantum information in any valid configuration of causal diamonds and assess resources required to physically realize spacetime replication. We present a set of codes to replicate quantum information along with a scheme to realize these codes using continuous-variable quantum optics. We use our proposed experimental realizations to determine upper bounds on the quantum and classical resources required to simulate spacetime replication. For four causal diamonds, our implementation scheme is more efficient than the one proposed previously. Our codes are designed using a decomposition algorithm for complete directed graphs, (2, 3) quantum secret sharing, quantum teleportation and entanglement swapping. These results show the simulation of spacetime replication of quantum information is feasible with existing experimental methods. Supported by Alberta Innovates, NSERC, China's 1000 Talent Plan and the Institute for Quantum Information and Matter, an NSF Physics Frontiers Center (NSF Grant PHY-1125565), with support of the Gordon and Betty Moore Foundation (GBMF-2644).
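For intuition, the classical counterpart of a (2, 3) threshold scheme is Shamir secret sharing: any two of three shares reconstruct the secret, while one share alone reveals nothing. The sketch below is only an analogy, since the protocol above shares quantum states, which no classical sketch captures; the prime and function names are arbitrary choices.

```python
# Classical (2, 3) threshold secret sharing (Shamir): the secret is the
# intercept of a random degree-1 polynomial over a prime field; shares are
# points on that line; any two points recover the intercept.
import random

P = 2**61 - 1  # a Mersenne prime; the field size is an arbitrary choice here

def share(secret, n=3):
    """Produce n shares of `secret` under a (2, n) threshold scheme."""
    a = random.randrange(P)                 # random slope through (0, secret)
    return [(i, (secret + a * i) % P) for i in range(1, n + 1)]

def reconstruct(s1, s2):
    """Recover the secret from any two distinct shares (Lagrange at x = 0)."""
    (x1, y1), (x2, y2) = s1, s2
    inv = pow(x2 - x1, -1, P)               # modular inverse of the x-gap
    a = ((y2 - y1) * inv) % P               # recovered slope
    return (y1 - a * x1) % P                # intercept = secret
```

The quantum (2, 3) scheme plays the analogous role for quantum states, with teleportation and entanglement swapping moving shares between causal diamonds.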
77 FR 2297 - Agency Information Collection Request. 30-Day Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-17
... answer the first question, a pretest-posttest control group experimental design is used to assess the... from the students who participate in the first part of the study. The focus group will last one hour... integrity or research ethics course for the treatment group. The control group will use the existing...
Conservation genetics of the European beech in France
A. Ducousso; B. Musch; S. Irola; A. Quenu; A. Hampe; R.J. Petit
2017-01-01
European beech (Fagus sylvatica) is one of the most abundant tree species in Europe. Its genetic structure and diversity have been investigated using both molecular markers and adaptive traits assessed in field and laboratory experimental tests. A great deal of information also exists on the Quaternary history of the…
Space Radiation Shielding Studies for Astronaut and Electronic Component Risk Assessment
NASA Technical Reports Server (NTRS)
Fuchs, Jordan Robert
2010-01-01
The dosimetry component of the Center for Radiation Engineering and Science for Space Exploration (CRESSE) will design, develop and characterize the response of a suite of radiation detectors and supporting instrumentation and electronics, with three primary goals: (1) use established space radiation detection systems to characterize the primary and secondary radiation fields existing in the experimental test-bed zones during exposures at particle accelerator facilities; (2) characterize the responses of newly developed space radiation detection systems in the experimental test-bed zones during exposures at particle accelerator facilities; and (3) provide CRESSE collaborators with detailed dosimetry information in experimental test-bed zones.
CCTOP: a Consensus Constrained TOPology prediction web server.
Dobson, László; Reményi, István; Tusnády, Gábor E
2015-07-01
The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov model. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Cadiz, David M; O'Neill, Chris; Butell, Sue S; Epeneter, Beverly J; Basin, Basilia
2012-07-01
This article reports on a study that evaluated the effectiveness of an educational intervention, Addressing Nurse Impairment, on nursing students' knowledge acquisition, changes in self-efficacy to intervene, and changes in substance abuse stigma. A gap exists in nursing students' education regarding the risks of addiction within the profession and how to handle a colleague suspected of having a substance use disorder. The seminar was adapted from an existing evidence-based prevention program called Team Awareness, as well as information from focus groups and a pilot test. A quasi-experimental pretest-posttest design was used to evaluate the effect of the seminar. When the control and experimental groups were compared, the results indicated that the seminar significantly affected knowledge and self-efficacy to intervene but did not significantly affect stigma. This research contributes to the body of evidence related to educational interventions for nursing students regarding substance abuse in the nursing profession. Copyright 2012, SLACK Incorporated.
Wind tunnel measurements for dispersion modelling of vehicle wakes
NASA Astrophysics Data System (ADS)
Carpentieri, Matteo; Kumar, Prashant; Robins, Alan
2012-12-01
Wind tunnel measurements downwind of reduced-scale car models have been made to study the wake regions in detail, test the usefulness of existing vehicle wake models, and derive key information needed for dispersion modelling in vehicle wakes. The experiments simulated a car moving in still air. These aims were addressed by (i) the experimental characterisation of the flow, turbulence and concentration fields in both the near- and far-wake regions, (ii) a preliminary assessment of existing wake models using the experimental database, and (iii) a comparison of previous field measurements in the wake of a real diesel car with the wind tunnel measurements. The experiments highlighted the very large velocity and concentration gradients that exist, in particular, in the near wake. Of course, the measured fields are strongly dependent on the geometry of the modelled vehicle, and generalisation to other vehicles may prove difficult. The methodology applied in the present study, although improvable, could constitute a first step towards the development of mathematical parameterisations. Experimental results were also compared with the estimates from two wake models. It was found that they can adequately describe the far wake of a vehicle in terms of velocities, but a better characterisation in terms of turbulence and pollutant dispersion is needed. Parameterised models able to predict velocities and concentrations in fine enough detail at the near-wake scale do not exist.
Collett, T.S.
1985-01-01
In 1973, during the drilling of the West Sak #1 well on the North Slope of Alaska, oil was first recovered from a shallow Cretaceous sand interval that was later informally named the West Sak sands by ARCO Alaska. Stratigraphically above the West Sak sands there are two additional oil-bearing sands, informally referred to by ARCO as the Ugnu and the 2150 horizons. Gas hydrates are interpreted to exist in the West Sak #6 well in conjunction with heavy oil, and the physical properties of this oil may have been influenced by the gas hydrate. Prior to this work, only experimental evidence suggested that hydrates and oil could exist in the same reservoir.
Radiation reaction studies in an all-optical set-up: experimental limitations
NASA Astrophysics Data System (ADS)
Samarin, G. M.; Zepf, M.; Sarri, G.
2018-06-01
The recent development of ultra-high intensity laser facilities is finally opening up the possibility of studying high-field quantum electrodynamics in the laboratory. Arguably, one of the central phenomena in this area is that of quantum radiation reaction experienced by an ultra-relativistic electron beam as it propagates through the tight focus of a laser beam. In this paper, we discuss the major experimental challenges that are to be faced in order to extract meaningful and quantitative information from this class of experiments using existing and near-term laser facilities.
X-ray absorption and reflection as probes of the GaN conduction bands: Theory and experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambrecht, W.R.L.; Rashkeev, S.N.; Segall, B.
1997-04-01
X-ray absorption measurements are a well-known probe of the unoccupied states in a material. The same information can be obtained by using glancing-angle X-ray reflectivity. In spite of several existing band structure calculations of the group III nitrides and previous optical studies in the UV range, a direct probe of their conduction-band densities of states is of interest. The authors performed a joint experimental and theoretical investigation using both of these experimental techniques for wurtzite GaN.
Drag reduction by polymers in wall bounded turbulence.
L'vov, Victor S; Pomyalov, Anna; Procaccia, Itamar; Tiberkevich, Vasil
2004-06-18
We elucidate the mechanism of drag reduction by polymers in turbulent wall-bounded flows: while momentum is produced at a fixed rate by the forcing, polymer stretching results in the suppression of momentum flux to the wall. On the basis of the equations of fluid mechanics we develop the phenomenology of the "maximum drag reduction asymptote" which is the maximum drag reduction attained by polymers. Based on Newtonian information only we demonstrate the existence of drag reduction, and with one experimental parameter we reach agreement with the experimental measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reyniers, G.C.; Froment, G.F.; Kopinke, F.D.
1994-11-01
An extensive experimental program has been carried out in a pilot unit for the thermal cracking of hydrocarbons. On the basis of the experimental information and the insight in the mechanisms for coke formation in pyrolysis reactors, a mathematical model describing the coke formation has been derived. This model has been incorporated in the existing simulation tools at the Laboratorium voor Petrochemische Techniek, and the run length of an industrial naphtha cracking furnace has been accurately simulated. In this way the coking model has been validated.
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. 
This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
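The sampling framework described above can be sketched in a few lines (all names and the count-to-probability map are invented placeholders, not the authors' implementation): spectral counts are mapped to interaction probabilities, an ensemble of binary networks is drawn, an existing binary-input scoring method is applied to each draw, and its scores are averaged over the ensemble.

```python
import random

def aggregate_over_samples(spectral_counts, count_to_prob, score_network,
                           n_samples=200, seed=0):
    """Draw an ensemble of binary interaction networks from count-derived
    probabilities, score each draw with an existing binary-input method,
    and average the scores over the ensemble."""
    rng = random.Random(seed)
    totals = {pair: 0.0 for pair in spectral_counts}
    for _ in range(n_samples):
        # one alternative experimental outcome: include each pair with
        # probability derived from its spectral count
        network = {pair for pair, c in spectral_counts.items()
                   if rng.random() < count_to_prob(c)}
        scores = score_network(network)  # existing binary-input method
        for pair in totals:
            totals[pair] += scores.get(pair, 0.0)
    return {pair: s / n_samples for pair, s in totals.items()}
```

Pairs with high spectral counts appear in nearly every sampled network and keep their full score, while low-count pairs are down-weighted in proportion to their uncertainty.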
Becky K. Kerns; Margaret M. Moore; Stephen C. Hart
2008-01-01
In the last century, ponderosa pine forests in the Southwest have changed from more open park-like stands of older trees to denser stands of younger, small-diameter trees. Considerable information exists regarding ponderosa pine forest fire history and recent shifts in stand structure and composition, yet quantitative studies investigating understory reference...
Yoo, Danny; Xu, Iris; Berardini, Tanya Z; Rhee, Seung Yon; Narayanasamy, Vijay; Twigger, Simon
2006-03-01
For most systems in biology, a large body of literature exists that describes the complexity of the system based on experimental results. Manual review of this literature to extract targeted information into biological databases is difficult and time consuming. To address this problem, we developed PubSearch and PubFetch, which store literature, keyword, and gene information in a relational database, index the literature with keywords and gene names, and provide a Web user interface for annotating the genes from experimental data found in the associated literature. A set of protocols is provided in this unit for installing, populating, running, and using PubSearch and PubFetch. In addition, we provide support protocols for performing controlled vocabulary annotations. Intended users of PubSearch and PubFetch are database curators and biology researchers interested in tracking the literature and capturing information about genes of interest in a more effective way than with conventional spreadsheets and lab notebooks.
Remembrance of Things Future: Prospective Memory in Laboratory, Workplace, and Everyday Settings
NASA Technical Reports Server (NTRS)
Dismukes, R. Key
2010-01-01
In this review, oriented to the human factors community, I will summarize and provide a perspective on recent research and theory on prospective memory. This will not be an exhaustive review of literature, which is already available in two excellent recent books that provide a wealth of detail on the current state of experimental research (Kliegel, McDaniel, & Einstein, 2008; McDaniel & Einstein, 2007; also see Brandimonte, Einstein, & McDaniel, 1996, for a still relevant overview of the field as it was emerging). Rather, I will explore the limits of existing experimental paradigms and theory, which, in my opinion, fail to capture some critical aspects of performance outside the laboratory. I will also review the relatively few studies in workplace and everyday settings and will discuss several studies that attempt to bridge between the bulk of experimental studies and these few naturalistic studies. Finally, I will describe countermeasures that can reduce vulnerability to forgetting to perform intended tasks, and I will propose a research agenda that would extend existing experimental and theoretical approaches and would support human factors practitioners by generating information on a wide range of issues relevant to prospective memory performance in natural settings.
Chapter 15: Disease Gene Prioritization
Bromberg, Yana
2013-01-01
Disease-causing aberrations in the normal function of a gene define that gene as a disease gene. Proving a causal link between a gene and a disease experimentally is expensive and time-consuming. Comprehensive prioritization of candidate genes prior to experimental testing drastically reduces the associated costs. Computational gene prioritization is based on various pieces of correlative evidence that associate each gene with the given disease and suggest possible causal links. A fair amount of this evidence comes from high-throughput experimentation. Thus, well-developed methods are necessary to reliably deal with the quantity of information at hand. Existing gene prioritization techniques already significantly improve the outcomes of targeted experimental studies. Faster and more reliable techniques that account for novel data types are necessary for the development of new diagnostics, treatments, and cure for many diseases. PMID:23633938
Benzodiazepines, opioids and driving: an overview of the experimental research.
Leung, Stefanie Y
2011-05-01
Road crashes contribute significantly to the total burden of injury in Australia, with the risk of injury being associated with the presence of drugs and/or alcohol in the driver's blood. Increasingly, some of the most commonly detected drugs include prescription medicines, the most notable of these being benzodiazepines and opioids. However, there is a paucity of experimental research into the effects of prescribed psychoactive drugs on driving behaviours. This paper provides an overview of experimental studies investigating the effects of prescribed doses of benzodiazepines and opioids on driving ability, and points to future directions for research. There is growing epidemiological evidence linking the therapeutic use of benzodiazepines and opioids to an increased crash risk. However, the current experimental literature remains unclear. Limitations to study methodologies have resulted in inconsistent findings. Limited experimental evidence exists to inform policy and guidelines regarding fitness-to-drive for patients taking prescribed benzodiazepines and opioids. Further experimental research is required to elucidate the effects of these medications on driving, under varying conditions and in different medical contexts. This will ensure that doctors prescribing benzodiazepines and opioids are well informed, and can appropriately advise patients of the risks associated with driving whilst taking these medications. © 2011 Australasian Professional Society on Alcohol and other Drugs.
ERIC Educational Resources Information Center
Selden, Sally; Sherrier, Tom; Wooters, Robert
2012-01-01
The purpose of this study is to examine the effects of a new approach to performance appraisal training. Motivated by split-brain theory and existing studies of cognitive information processing and performance appraisals, this exploratory study examined the effects of a whole-brain approach to training managers for implementing performance…
A Quantitative Experimental Study of the Effectiveness of Systems to Identify Network Attackers
ERIC Educational Resources Information Center
Handorf, C. Russell
2016-01-01
This study analyzed the meta-data collected from a honeypot that was run by the Federal Bureau of Investigation for a period of 5 years. This analysis compared the use of existing industry methods and tools, such as Intrusion Detection System alerts, network traffic flow and system log traffic, within the Open Source Security Information Manager…
Improved separability criteria via some classes of measurements
NASA Astrophysics Data System (ADS)
Shen, Shu-Qian; Li, Ming; Li-Jost, Xianqing; Fei, Shao-Ming
2018-05-01
The entanglement detection via local measurements can be experimentally implemented. Based on mutually unbiased measurements and general symmetric informationally complete positive-operator-valued measures, we present separability criteria for bipartite quantum states, which, by theoretical analysis, are stronger than the related existing criteria via these measurements. Two detailed examples are supplemented to show the efficiency of the presented separability criteria.
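The criteria in the abstract are built from mutually unbiased measurements and SIC-POVMs; as a simpler, well-known criterion in the same correlation-based spirit (not the authors' construction), separable two-qubit states satisfy Σ_i |tr(ρ σ_i⊗σ_i)| ≤ 1, so a value above 1 certifies entanglement. This is easy to check numerically:

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def correlation_sum(rho):
    """Sum_i |tr(rho sigma_i (x) sigma_i)|; a value > 1 is impossible
    for separable two-qubit states, hence certifies entanglement."""
    return sum(abs(np.trace(rho @ np.kron(s, s)).real) for s in (sx, sy, sz))

# maximally entangled Bell state |Phi+>: correlation sum = 3 > 1
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())

# product state |00><00|: correlation sum = 1, bound satisfied
e00 = np.zeros(4, dtype=complex); e00[0] = 1
rho_prod = np.outer(e00, e00.conj())
```

Note the one-sided logic shared by all such criteria: exceeding the bound proves entanglement, but staying below it proves nothing, which is why stronger criteria such as those in the paper are of interest.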
Mental Models for Mechanical Comprehension. A Review of Literature.
1986-06-01
the mental models that people use to understand and solve problems involving mechanics and motion. Method The existing psychological literature on...have been used to investigate mental models. The constructionist school is concerned with how mental models are formed. The information-processing...school uses the experimental methods of modern cognitive psychology to investigate mental structures. The componential approach attempts to meld the
NASA Astrophysics Data System (ADS)
Chen, Tian-Yu; Chen, Yang; Yang, Hu-Jiang; Xiao, Jing-Hua; Hu, Gang
2018-03-01
Massive amounts of data have now been accumulated in wide-ranging fields, and analyzing existing data to extract as much useful information as possible has become a central issue in interdisciplinary research. Often the output data of systems are measurable while the dynamic structures producing those data are hidden; revealing system structures by analyzing available data, i.e., reconstructing systems, has therefore become one of the most important tasks of information extraction. In the past, most work in this area was based on theoretical analyses and numerical verifications; direct analyses of experimental data are very rare. In physical science, most analyses of experimental setups have been based on the first principles of physics laws, i.e., so-called top-down analyses. In this paper, we conducted an experiment with the “Boer resonant instrument for forced vibration” (BRIFV) and inferred the dynamic structure of the experimental set purely from analysis of the measurable experimental data, i.e., by applying a bottom-up strategy. The dynamics of the experimental set are strongly nonlinear and chaotic, and the set is subject to inevitable noise. We propose using high-order correlation computations to treat the nonlinear dynamics, and two-time correlations to treat noise effects. By applying these approaches, we successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed from the measured data reproduces the experimental results well over a wide range of parameters.
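The experiment above involves strongly nonlinear dynamics treated with high-order correlations; as a minimal linear illustration of the same bottom-up, correlation-based idea (the dynamics matrix and all parameters are invented for the demo), the propagator of a noise-driven linear system can be estimated from its equal-time and one-step-lagged correlation matrices:

```python
import numpy as np

def estimate_propagator(X):
    """Least-squares estimate of A in x_{t+1} = A x_t + noise, computed
    from the one-step-lagged and equal-time correlation matrices of the
    measured trajectory X (shape: time steps x dimensions)."""
    x_now, x_next = X[:-1], X[1:]
    C0 = x_now.T @ x_now / len(x_now)   # equal-time correlations
    C1 = x_next.T @ x_now / len(x_now)  # one-step-lagged correlations
    return C1 @ np.linalg.inv(C0)

# demo: recover an invented damped-rotation propagator from noisy output data
rng = np.random.default_rng(0)
A = np.array([[0.9, -0.2], [0.2, 0.9]])
x, traj = np.zeros(2), []
for _ in range(5000):
    x = A @ x + 0.1 * rng.standard_normal(2)
    traj.append(x)
A_hat = estimate_propagator(np.array(traj))
```

The estimator uses only measured outputs, never the hidden matrix A; the high-order correlations in the paper generalize exactly this step to nonlinear terms.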
NASA Technical Reports Server (NTRS)
Yates, E. Carson, Jr.
1987-01-01
To promote the evaluation of existing and emerging unsteady aerodynamic codes and methods for applying them to aeroelastic problems, especially for the transonic range, a limited number of aerodynamic configurations and experimental dynamic response data sets are to be designated by the AGARD Structures and Materials Panel as standards for comparison. This set is a sequel to that established several years ago for comparisons of calculated and measured aerodynamic pressures and forces. This report presents the information needed to perform flutter calculations for the first candidate standard configuration for dynamic response along with the related experimental flutter data.
Cruella: developing a scalable tissue microarray data management system.
Cowan, James D; Rimm, David L; Tuck, David P
2006-06-01
Compared with DNA microarray technology, relatively little information is available concerning the special requirements, design influences, and implementation strategies of data systems for tissue microarray technology. These issues include the requirement to accommodate new and different data elements for each new project as well as the need to interact with pre-existing models for clinical, biological, and specimen-related data. To design and implement a flexible, scalable tissue microarray data storage and management system that could accommodate information regarding different disease types and different clinical investigators, and different clinical investigation questions, all of which could potentially contribute unforeseen data types that require dynamic integration with existing data. The unpredictability of the data elements combined with the novelty of automated analysis algorithms and controlled vocabulary standards in this area require flexible designs and practical decisions. Our design includes a custom Java-based persistence layer to mediate and facilitate interaction with an object-relational database model and a novel database schema. User interaction is provided through a Java Servlet-based Web interface. Cruella has become an indispensable resource and is used by dozens of researchers every day. The system stores millions of experimental values covering more than 300 biological markers and more than 30 disease types. The experimental data are merged with clinical data that has been aggregated from multiple sources and is available to the researchers for management, analysis, and export. Cruella addresses many of the special considerations for managing tissue microarray experimental data and the associated clinical information. A metadata-driven approach provides a practical solution to many of the unique issues inherent in tissue microarray research, and allows relatively straightforward interoperability with and accommodation of new data models.
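The abstract does not detail Cruella's metadata-driven schema; one common pattern for absorbing unforeseen per-project data elements without schema changes is an entity-attribute-value layout, sketched here in miniature (the class and method names are assumptions, not Cruella's design):

```python
class EAVStore:
    """Minimal entity-attribute-value store: new data elements (markers,
    disease types, clinical fields) arrive as rows, not as schema changes."""

    def __init__(self):
        self._rows = []  # (entity_id, attribute, value) triples

    def add(self, entity, attribute, value):
        self._rows.append((entity, attribute, value))

    def get(self, entity):
        """Collect all attributes recorded for one entity as a dict."""
        return {a: v for e, a, v in self._rows if e == entity}
```

The trade-off is classic: an EAV layout accepts any new attribute at the cost of weaker type checking and more complex queries, which is why production systems like the one described typically pair it with a persistence layer that mediates access.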
Ice in space: An experimental and theoretical investigation
NASA Technical Reports Server (NTRS)
Patashnick, H.; Rupprecht, G.
1977-01-01
Basic knowledge is provided on the behavior of ice and ice particles under a wide variety of conditions including those of interplanetary space. This information and, in particular, the lifetime of ice particles as a function of solar distance is an absolute requirement for a proper interpretation of photometric profiles in comets. Because fundamental properties of ice and ice particles are developed in this report, the applicability of this information extends beyond the realm of comets into any area where volatile particles exist, be it in space or in the earth's atmosphere.
Experimental and numerical study of the British Experimental Rotor Programme blade
NASA Technical Reports Server (NTRS)
Brocklehurst, Alan; Duque, Earl P. N.
1990-01-01
Wind-tunnel tests on the British Experimental Rotor Programme (BERP) tip are described, and the results are compared with computational fluid dynamics (CFD) results. The test model was molded using the Lynx-BERP blade tooling to provide a semispan, cantilever wing comprising the outboard 30 percent of the rotor blade. The tests included both surface-pressure measurements and flow visualization to obtain detailed information of the flow over the BERP tip for a range of angles of attack. It was observed that, outboard of the notch, favorable pressure gradients exist which ensure attached flow, and that the tip vortex also remains stable to large angles of attack. On the rotor, these features yield a very gradual break in control loads when the retreating-blade limit is eventually reached. Computational and experimental results were generally found to be in good agreement.
Protein-protein interaction predictions using text mining methods.
Papanikolaou, Nikolas; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Iliopoulos, Ioannis
2015-03-01
It is beyond any doubt that proteins and their interactions play an essential role in most complex biological processes. The understanding of their function individually, but also in the form of protein complexes is of a great importance. Nowadays, despite the plethora of various high-throughput experimental approaches for detecting protein-protein interactions, many computational methods aiming to predict new interactions have appeared and gained interest. In this review, we focus on text-mining based computational methodologies, aiming to extract information for proteins and their interactions from public repositories such as literature and various biological databases. We discuss their strengths, their weaknesses and how they complement existing experimental techniques by simultaneously commenting on the biological databases which hold such information and the benchmark datasets that can be used for evaluating new tools. Copyright © 2014 Elsevier Inc. All rights reserved.
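The simplest literature-mining signal such methods start from is sentence-level co-occurrence of protein names; a minimal sketch (the protein names and text are invented examples):

```python
import itertools
import re

def cooccurrence_counts(text, protein_names):
    """Count pairs of protein names mentioned in the same sentence --
    the simplest text-mining signal for a candidate interaction."""
    counts = {}
    for sentence in re.split(r"[.!?]", text):
        # proteins mentioned in this sentence, in a canonical (sorted) order
        found = sorted({p for p in protein_names if p in sentence})
        for a, b in itertools.combinations(found, 2):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts
```

Real systems layer much more on top of this baseline (named-entity normalization, dependency parsing, trigger verbs such as "binds" or "phosphorylates"), which is precisely the progression the review surveys.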
Probabilistic modeling of discourse-aware sentence processing.
Dubey, Amit; Keller, Frank; Sturt, Patrick
2013-07-01
Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.
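The models described above augment a syntactic component with discourse co-reference; the standard probabilistic baseline underneath such models is word-by-word surprisal, the usual linking hypothesis between model probabilities and reading times. A sketch with an add-one-smoothed bigram model (the corpus and sentences are invented; the papers' models are far richer):

```python
import math
from collections import Counter

def bigram_surprisal(corpus_sentences, sentence):
    """Per-word surprisal -log2 P(w_i | w_{i-1}) under an add-one-smoothed
    bigram model trained on corpus_sentences (vocabulary from training only)."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for s in corpus_sentences:
        toks = ["<s>"] + s.split()
        vocab.update(toks)
        unigrams.update(toks[:-1])
        bigrams.update(zip(toks, toks[1:]))
    V = len(vocab)
    toks = ["<s>"] + sentence.split()
    return [-math.log2((bigrams[(prev, w)] + 1) / (unigrams[prev] + V))
            for prev, w in zip(toks, toks[1:])]
```

Higher surprisal predicts longer reading times; discourse-aware models such as those in the paper change the probabilities (and hence the surprisal profile) by conditioning on co-reference context as well as syntax.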
Carcagno, G J; Kemper, P
1988-04-01
The channeling demonstration sought to substitute community care for nursing home care to reduce long-term care costs and improve the quality of life of elderly clients and the family members and friends who care for them. Two interventions were tested, each in five sites; both had comprehensive case management at their core. One model added a small amount of additional funding for direct community services to fill the gaps in the existing system; the other substantially expanded coverage of community services regardless of categorical eligibility under existing programs. The demonstration was evaluated using a randomized experimental design to test the effects of channeling on use of community care, nursing homes, hospitals, and informal caregiving, and on measures of the quality of life of clients and their informal caregivers. Data were obtained from interviews with clients and informal caregivers; service use and cost records came from Medicare, Medicaid, channeling, and providers; and death records for an 18-month follow-up period were examined.
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of the bruise healing process. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying bruise healing processes is possible. Objective fitting enables a direct comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which is controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
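The study fits Monte Carlo-simulated PPTR signals to measurements; as a toy stand-in for that objective-fitting step (a 1-D Gaussian spreading model with synthetic data, not the paper's forward model), a diffusivity can be recovered by minimizing squared residuals over a parameter grid:

```python
import math

def fit_diffusivity(times, widths, d_grid):
    """Toy objective fit: for 1-D Gaussian spreading, width^2(t) = w0^2 + 2*D*t.
    Grid-search the D that minimizes the sum of squared residuals
    (w0 is taken from the first measurement)."""
    w0_sq = widths[0] ** 2
    def sse(d):
        return sum((w * w - (w0_sq + 2.0 * d * t)) ** 2
                   for t, w in zip(times, widths))
    return min(d_grid, key=sse)

# synthetic "measurements" generated with D = 0.5 and w0 = 1
times = [0.0, 1.0, 2.0, 4.0]
widths = [math.sqrt(1.0 + t) for t in times]
d_best = fit_diffusivity(times, widths, [0.1 * i for i in range(11)])
```

The point mirrors the paper's argument: fitting the forward-simulated signal directly to the data sidesteps an intermediate profile reconstruction, so no information is discarded before the parameter of interest is estimated.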
Coupled CFD and Particle Vortex Transport Method: Wing Performance and Wake Validations
2008-06-26
the PVTM analysis. The results obtained using the coupled RANS/PVTM analysis compare well with experimental data, in particular the pressure... The coupled method is validated against wind tunnel test data. Comparisons with measured pressure distribution, loadings, and vortex parameters, and the corresponding
Scrambling and thermalization in a diffusive quantum many-body system
Bohrdt, A.; Mendl, C. B.; Endres, M.; ...
2017-06-02
Out-of-time ordered (OTO) correlation functions describe scrambling of information in correlated quantum matter. They are of particular interest in incoherent quantum systems lacking well-defined quasi-particles. Thus far, it is largely elusive how OTO correlators spread in incoherent systems with diffusive transport governed by a few globally conserved quantities. Here, we study the dynamical response of such a system using high-performance matrix-product-operator techniques. Specifically, we consider the non-integrable, one-dimensional Bose–Hubbard model in the incoherent high-temperature regime. Our system exhibits diffusive dynamics in time-ordered correlators of globally conserved quantities, whereas OTO correlators display a ballistic, light-cone spreading of quantum information. The slowest process in the global thermalization of the system is thus diffusive, yet information spreading is not inhibited by such slow dynamics. We furthermore develop an experimentally feasible protocol to overcome some challenges faced by existing proposals and to probe time-ordered and OTO correlation functions. As a result, our study opens new avenues for both the theoretical and experimental exploration of thermalization and information scrambling dynamics.
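The paper uses matrix-product-operator techniques; for intuition only, the infinite-temperature OTO correlator of a small spin chain can be computed by exact diagonalization (the toy transverse-field Ising Hamiltonian and the operator choices below are assumptions, not the paper's Bose–Hubbard setup):

```python
import numpy as np

def oto_correlator(H, W, V, times):
    """Infinite-temperature OTO correlator F(t) = Tr(W(t) V W(t) V) / dim,
    with W(t) = e^{iHt} W e^{-iHt}, via exact diagonalization of H."""
    E, P = np.linalg.eigh(H)
    dim = H.shape[0]
    values = []
    for t in times:
        U = P @ np.diag(np.exp(-1j * E * t)) @ P.conj().T  # e^{-iHt}
        Wt = U.conj().T @ W @ U
        values.append(np.trace(Wt @ V @ Wt @ V).real / dim)
    return values

def site_op(single, site, n):
    """Embed a single-site operator at `site` in an n-site chain."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, single if i == site else np.eye(2))
    return out

# toy 4-site transverse-field Ising chain; W, V are sigma_z on opposite ends
sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
n = 4
H = sum(site_op(sz, i, n) @ site_op(sz, i + 1, n) for i in range(n - 1))
H = H + 1.05 * sum(site_op(sx, i, n) for i in range(n))
F = oto_correlator(H, site_op(sz, 0, n), site_op(sz, n - 1, n), [0.0, 2.0])
```

F(0) = 1 because the two edge operators initially commute; its decay at later times tracks how the Heisenberg-evolved operator spreads across the chain, the light-cone behavior the abstract describes. Exact diagonalization caps this at a handful of sites, which is why the paper needs matrix-product-operator methods.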
Improving plant bioaccumulation science through consistent reporting of experimental data.
Fantke, Peter; Arnot, Jon A; Doucette, William J
2016-10-01
Experimental data and models for plant bioaccumulation of organic contaminants play a crucial role for assessing the potential human and ecological risks associated with chemical use. Plants are receptor organisms and direct or indirect vectors for chemical exposures to all other organisms. As new experimental data are generated they are used to improve our understanding of plant-chemical interactions that in turn allows for the development of better scientific knowledge and conceptual and predictive models. The interrelationship between experimental data and model development is an ongoing, never-ending process needed to advance our ability to provide reliable quality information that can be used in various contexts including regulatory risk assessment. However, relatively few standard experimental protocols for generating plant bioaccumulation data are currently available and because of inconsistent data collection and reporting requirements, the information generated is often less useful than it could be for direct applications in chemical assessments and for model development and refinement. We review existing testing guidelines, common data reporting practices, and provide recommendations for revising testing guidelines and reporting requirements to improve bioaccumulation knowledge and models. This analysis provides a list of experimental parameters that will help to develop high quality datasets and support modeling tools for assessing bioaccumulation of organic chemicals in plants and ultimately addressing uncertainty in ecological and human health risk assessments. Copyright © 2016 Elsevier Ltd. All rights reserved.
Visual slant misperception and the Black-Hole landing situation
NASA Technical Reports Server (NTRS)
Perrone, J. A.
1983-01-01
A theory is presented that explains the tendency for dangerously low approaches during night landing situations. The two-dimensional information at the pilot's eye contains sufficient information for the visual system to extract the angle of slant of the runway relative to the approach path. The analysis depends upon perspective information that is available at a certain distance out from the aimpoint, to either side of the runway edgelights. Under black-hole landing conditions, however, this information is not available, and it is proposed that the visual system instead uses the only available information, the perspective gradient of the runway edgelights. An equation is developed that predicts the perceived approach angle when this incorrect parameter is used. The predictions are in close agreement with existing experimental data.
Experimental Design for Hanford Low-Activity Waste Glasses with High Waste Loading
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piepel, Gregory F.; Cooley, Scott K.; Vienna, John D.
This report discusses the development of an experimental design for the initial phase of the Hanford low-activity waste (LAW) enhanced glass study. This report is based on a manuscript written for an applied statistics journal. Appendices A, B, and E include additional information relevant to the LAW enhanced glass experimental design that is not included in the journal manuscript. The glass composition experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC involving 15 LAW glass components. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this report. One of the glass components, SO3, has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture model expressed in the relative proportions of the 14 other components. The partial quadratic mixture model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This report describes how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study. A layered design consists of points on an outer layer, an inner layer, and a center point. There were 18 outer-layer glasses chosen using optimal experimental design software to augment 147 existing glass compositions that were within the LAW glass composition experimental region. Then 13 inner-layer glasses were chosen with the software to augment the existing and outer-layer glasses.
The experimental design was completed by a center-point glass, a Vitreous State Laboratory glass, and replicates of the center-point and Vitreous State Laboratory glasses.
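The feasibility screen described above can be sketched in a few lines: a candidate glass composition must satisfy the SCCs, the linear MCCs, and the nonlinear SO3 constraint obtained from the solubility model. All bounds and model coefficients below are hypothetical placeholders, not the study's fitted values.

```python
import numpy as np

def so3_solubility_limit(x_other, coef_linear, coef_quad):
    """Predicted SO3 solubility from a partial quadratic mixture model
    expressed in the relative proportions of the other components.
    Coefficients here are illustrative, not the study's fitted values."""
    p = x_other / x_other.sum()  # renormalize the balance of the glass
    quad = sum(c * p[i] * p[j] for (i, j), c in coef_quad.items())
    return coef_linear @ p + quad

def feasible(x, lower, upper, A, b, so3_index, coef_linear, coef_quad):
    """Check SCCs, linear MCCs (A @ x <= b), and the nonlinear SO3 constraint."""
    if not (np.all(x >= lower) and np.all(x <= upper)):  # single-component constraints
        return False
    if not np.all(A @ x <= b):                           # linear multiple-component constraints
        return False
    x_other = np.delete(x, so3_index)
    return x[so3_index] <= so3_solubility_limit(x_other, coef_linear, coef_quad)
```

A design-generation loop would call such a check on every candidate point before passing it to the optimal-design software.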
Wagner, Delphine; Bolender, Yves; Rémond, Yves; George, Daniel
2017-01-01
Although orthodontics has greatly improved over the years, understanding of its associated biomechanics remains incomplete and is mainly based on two-dimensional (2D) mechanical equilibrium and long-standing clinical experience. Little experimental information exists in three dimensions (3D) about the forces and moments developed on orthodontic brackets over more than two or three adjacent teeth. We define here a simplified methodology to quantify 3D forces and moments applied on orthodontic brackets fixed on a dental arch, and validate our methodology against existing results from the literature by means of simplified hypotheses.
An Approach for Removing Redundant Data from RFID Data Streams
Mahdin, Hairulnizam; Abawajy, Jemal
2011-01-01
Radio frequency identification (RFID) systems are emerging as the primary object identification mechanism, especially in supply chain management. However, RFID naturally generates a large number of duplicate readings. Removing these duplicates from the RFID data stream is paramount, as they contribute no new information to the system and waste system resources. Existing approaches to this problem cannot fulfill the real-time demands of processing the massive RFID data stream. We propose a data filtering approach that efficiently detects and removes duplicate readings from RFID data streams. Experimental results show that the proposed approach offers a significant improvement over existing approaches. PMID:22163730
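The core of duplicate removal can be illustrated with a simple sliding-time-window baseline: a reading of a tag is treated as a duplicate if the same tag was already accepted within the last `window` seconds. This is only an illustrative baseline, not the paper's optimized filtering approach.

```python
from collections import OrderedDict

class DuplicateFilter:
    """Sliding time-window duplicate filter for an RFID reading stream.
    Readings must arrive in non-decreasing timestamp order."""

    def __init__(self, window):
        self.window = window            # window length in seconds
        self.last_seen = OrderedDict()  # tag_id -> timestamp of last accepted reading

    def accept(self, tag_id, timestamp):
        """Return True if the reading carries new information, False if duplicate."""
        # Evict tags whose last accepted reading has fallen out of the window.
        while self.last_seen:
            oldest_tag, oldest_t = next(iter(self.last_seen.items()))
            if timestamp - oldest_t > self.window:
                del self.last_seen[oldest_tag]
            else:
                break
        seen = self.last_seen.get(tag_id)
        if seen is not None and timestamp - seen <= self.window:
            return False                # duplicate within the window
        self.last_seen[tag_id] = timestamp
        self.last_seen.move_to_end(tag_id)
        return True
```

The eviction step keeps memory bounded by the number of distinct tags seen within one window, which is the property a real-time stream filter needs.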
NASA Astrophysics Data System (ADS)
Marcus, Kelvin
2014-06-01
The U.S. Army Research Laboratory (ARL) has built a "Network Science Research Lab" to support research that aims to improve its ability to analyze, predict, design, and govern complex systems that interweave the social/cognitive, information, and communication network genres. Researchers at ARL and the Network Science Collaborative Technology Alliance (NS-CTA), a collaborative research alliance funded by ARL, conducted experimentation to determine whether automated network monitoring tools and task-aware agents deployed within an emulated tactical wireless network could increase the retrieval of relevant data from heterogeneous distributed information nodes. ARL and the NS-CTA required the capability to perform this experimentation over clusters of heterogeneous nodes with emulated wireless tactical networks, where each node could have a different operating system, application set, and physical hardware attributes. Researchers utilized the Dynamically Allocated Virtual Clustering Management System (DAVC) to address each of the infrastructure support requirements necessary for conducting their experimentation. The DAVC is an experimentation infrastructure that provides the means to dynamically create, deploy, and manage virtual clusters of heterogeneous nodes within a cloud computing environment, placing nodes based upon resource utilization such as CPU load, available RAM, and hard disk space. The DAVC uses 802.1Q Virtual LANs (VLANs) to prevent experimentation crosstalk and to allow for complex private networks. Clusters created by the DAVC can be utilized for software development, experimentation, and integration with existing hardware and software. The goal of this paper is to explore how ARL and the NS-CTA leveraged the DAVC to create, deploy, and manage multiple experimentation clusters in support of their experimentation goals.
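Utilization-based placement of a virtual node, of the kind the DAVC performs when deploying clusters, can be sketched as follows. The host attributes and selection rule here are illustrative placeholders, not the DAVC's actual scheduling policy.

```python
def choose_host(hosts, need_cpu, need_ram_gb, need_disk_gb):
    """Pick the least-loaded host that can fit the requested virtual node.
    `hosts` maps a host name to a dict with keys cpu_load (0-1 fraction),
    free_ram_gb, and free_disk_gb. Returns None when nothing fits."""
    candidates = [
        (h["cpu_load"], name)
        for name, h in hosts.items()
        if h["cpu_load"] + need_cpu <= 1.0       # keep CPU under full load
        and h["free_ram_gb"] >= need_ram_gb      # enough free RAM
        and h["free_disk_gb"] >= need_disk_gb    # enough free disk
    ]
    if not candidates:
        return None
    return min(candidates)[1]  # lowest current CPU load wins
```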
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pyrak-Nolte, Laura J.; Cheng, JiangTao; Yu, Ping
2003-01-29
During this reporting period, we showed experimentally that the optical coherence imaging system can acquire information on grain interfaces and void shape to a maximum depth of half a millimeter into sandstone. Measurement of interfacial area per volume (IAV), capillary pressure, and saturation in two-dimensional micro-model structures has shown the existence of a unique relationship among these hydraulic parameters for different pore geometries. Measurement of interfacial area per volume on a three-dimensional natural sample, i.e., sandstone, has shown the homogeneity of IAV with depth in a sample when the fluids are in equilibrium.
NASA Astrophysics Data System (ADS)
Brennen, Gavin; Giacobino, Elisabeth; Simon, Christoph
2015-05-01
Quantum memories are essential for quantum information processing and long-distance quantum communication. The field has recently seen a lot of progress, and the present focus issue offers a glimpse of these developments, showing both experimental and theoretical results from many of the leading groups around the world. On the experimental side, it shows work on cold gases, warm vapors, rare-earth ion doped crystals and single atoms. On the theoretical side there are in-depth studies of existing memory protocols, proposals for new protocols including approaches based on quantum error correction, and proposals for new applications of quantum storage. Looking forward, we anticipate many more exciting results in this area.
Manifold Regularized Experimental Design for Active Learning.
Zhang, Lining; Shum, Hubert P H; Shao, Ling
2016-12-02
Various machine learning and data mining tasks in classification require abundant data samples to be labeled for training. Conventional active learning methods aim at labeling the most informative samples to reduce the labeling effort of the user. Many previous studies in active learning select one sample after another in a greedy manner. However, this is not very effective, because the classification model has to be retrained for each newly labeled sample. Moreover, many popular active learning approaches select the most uncertain samples by leveraging the classification hyperplane of the classifier, which is not appropriate, since the classification hyperplane is inaccurate when the training data are small. The problem of insufficient training data in real-world systems limits the potential applications of these approaches. This paper presents a novel active learning method called manifold regularized experimental design (MRED), which can label multiple informative samples at one time for training. In addition, MRED gives an explicit geometric explanation for the samples selected to be labeled by the user. Unlike existing active learning methods, our method avoids the intrinsic problems caused by insufficiently labeled samples in real-world applications. Experiments on synthetic datasets, the Yale face database, and the Corel image database show that MRED outperforms existing methods.
Sex education and family planning services for young adults: alternative urban strategies in Mexico.
Townsend, J W; Diaz de May, E; Sepúlveda, Y; Santos de Garza, Y; Rosenhouse, S
1987-01-01
In Mexico, youth face difficulties in obtaining reliable information on sex education and family planning through existing community programs. Two alternative strategies to provide these services are being tested in poor urban areas of Monterrey. In one experimental area, Integrated Youth Centers were established, which provide sex education and family planning services as well as counseling, academic tutoring, and recreational activities. In another area, trained young adults and community counselors work through informal networks to provide sex education and family planning information. Both utilization and the cost of these services are examined in the context of plans for expanding coverage in Mexico-U.S. border areas.
Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina
2015-03-01
Proteins are considered to be the most important individual components of biological systems, and they combine to form physical protein complexes that are responsible for certain molecular functions. Despite the large availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost, and performance constraints. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages: they require parameter tuning, some of them cannot handle weighted PPI data, and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology is called evolutionary enhanced Markov clustering (EE-MC), and it is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets, its performance was encouraging in the prediction of protein complexes consisting of proteins with high functional similarity. Specifically, 5737 protein complexes were predicted, and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies, such as their inability to handle weighted PPI networks, their constraint of assigning every protein to exactly one cluster, and the difficulties they face with parameter tuning.
This was validated experimentally, and, moreover, new potentially true human protein complexes were suggested as candidates for further validation using experimental techniques. Copyright © 2015 Elsevier B.V. All rights reserved.
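The clustering core that EE-MC enhances and tunes is Markov clustering (MCL), which alternates matrix expansion and element-wise inflation on a column-stochastic transition matrix built from the weighted PPI graph. Below is a minimal sketch of plain MCL, without the enhancements or the evolutionary parameter tuning that define EE-MC.

```python
import numpy as np

def markov_clustering(adj, expansion=2, inflation=2.0, iters=50, tol=1e-8):
    """Plain Markov clustering (MCL) on a weighted, symmetric adjacency matrix.
    Returns a list of clusters (sets of node indices)."""
    M = adj + np.eye(adj.shape[0])                 # self-loops stabilize the iteration
    M = M / M.sum(axis=0)                          # column-stochastic transition matrix
    for _ in range(iters):
        prev = M
        M = np.linalg.matrix_power(M, expansion)   # expansion: flow spreads along edges
        M = M ** inflation                         # inflation: strong flows are reinforced
        M = M / M.sum(axis=0)                      # renormalize columns
        if np.abs(M - prev).max() < tol:
            break
    # Nodes whose columns peak at the same attractor row form one cluster.
    clusters = {}
    for col in range(M.shape[1]):
        attractor = int(M[:, col].argmax())
        clusters.setdefault(attractor, set()).add(col)
    return list(clusters.values())
```

Because inflation acts element-wise, edge weights (e.g. interaction confidences) influence the result directly, which is why MCL-family methods handle weighted PPI networks naturally.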
Schnoes, Alexandra M.; Ream, David C.; Thorman, Alexander W.; Babbitt, Patricia C.; Friedberg, Iddo
2013-01-01
The ongoing functional annotation of proteins relies upon the work of curators to capture experimental findings from the scientific literature and apply them to protein sequence and structure data. However, with the increasing use of high-throughput experimental assays, a small number of experimental studies dominate the functional protein annotations collected in databases. Here, we investigate just how prevalent the “few articles - many proteins” phenomenon is. We examine the experimentally validated annotation of proteins provided by several groups in the GO Consortium, and show that the distribution of proteins per published study is exponential, with 0.14% of articles providing the source of annotations for 25% of the proteins in the UniProt-GOA compilation. Since each of the dominant articles describes the use of an assay that can find only one function or a small group of functions, this leads to substantial biases in what we know about the function of many proteins. Mass spectrometry, microscopy, and RNAi dominate high-throughput experiments. Consequently, the functional information derived from these experiments mostly concerns the subcellular location of proteins and the participation of proteins in embryonic developmental pathways. For some organisms, the information provided by different studies overlaps by a large amount. We also show that the information provided by high-throughput experiments is less specific than that provided by low-throughput experiments. Given the experimental techniques available, certain biases in protein function annotation due to high-throughput experiments are unavoidable. Knowing that these biases exist and understanding their characteristics and extent is important for database curators, developers of function annotation programs, and anyone who uses protein function annotation data to plan experiments. PMID:23737737
SNPdbe: constructing an nsSNP functional impacts database.
Schaefer, Christian; Meier, Alice; Rost, Burkhard; Bromberg, Yana
2012-02-15
Many existing databases annotate experimentally characterized single nucleotide polymorphisms (SNPs). Each non-synonymous SNP (nsSNP) changes one amino acid in the gene product (single amino acid substitution; SAAS). This change can either affect protein function or be neutral in that respect. Most polymorphisms lack experimental annotation of their functional impact. Here, we introduce SNPdbe - SNP database of effects - with computationally annotated predictions of the functional impacts of SNPs. Database entries represent nsSNPs from dbSNP and the 1000 Genomes collection, as well as variants from UniProt and PMD. SAASs come from >2600 organisms, with 'human' being the most prevalent. The impact of each SAAS on protein function is predicted using the SNAP and SIFT algorithms and augmented with experimentally derived function/structure information and disease associations from PMD, OMIM, and UniProt. SNPdbe is consistently updated and easily augmented with new sources of information. The database is available as a MySQL dump and via a web front end that allows searches with any combination of organism names, sequences, and mutation IDs. http://www.rostlab.org/services/snpdbe.
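The kind of combined query the web front end supports (organism name plus mutation ID) can be illustrated with a toy relational example. The schema, rows, and column names below are simplified placeholders, not SNPdbe's actual MySQL schema; sqlite3 stands in for MySQL so the sketch is self-contained.

```python
import sqlite3

# Toy, single-table stand-in for the database: one row per SAAS with
# predicted effects and a disease association.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE saas (
    protein TEXT, organism TEXT, mutation TEXT,
    snap_effect TEXT, sift_effect TEXT, disease TEXT)""")
con.execute("INSERT INTO saas VALUES "
            "('P53_HUMAN','human','R175H','non-neutral','deleterious','Li-Fraumeni')")
con.execute("INSERT INTO saas VALUES "
            "('HBB_HUMAN','human','E6V','non-neutral','deleterious','sickle-cell')")

# A front-end search combining organism name and mutation ID.
rows = con.execute(
    "SELECT protein, snap_effect, sift_effect FROM saas "
    "WHERE organism = ? AND mutation = ?", ("human", "E6V")).fetchall()
# rows == [('HBB_HUMAN', 'non-neutral', 'deleterious')]
```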
NASA Astrophysics Data System (ADS)
Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik
2012-04-01
The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods, and hence our understanding of the related biology. However, many barriers exist that prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data and share their data with theoreticians, and that both the experimental data and the computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components: a new Python-based integration module that allows theoreticians to provide and manage remote access to their programs, and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.
Dasgupta, Annwesa P.; Anderson, Trevor R.
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658
Magnetic particle-mediated magnetoreception
Shaw, Jeremy; Boyd, Alastair; House, Michael; Woodward, Robert; Mathes, Falko; Cowin, Gary; Saunders, Martin; Baer, Boris
2015-01-01
Behavioural studies underpin the weight of experimental evidence for the existence of a magnetic sense in animals. In contrast, studies aimed at understanding the mechanistic basis of magnetoreception by determining the anatomical location, structure and function of sensory cells have been inconclusive. In this review, studies attempting to demonstrate the existence of a magnetoreceptor based on the principles of the magnetite hypothesis are examined. Specific attention is given to the range of techniques, and main animal model systems that have been used in the search for magnetite particulates. Anatomical location/cell rarity and composition are identified as two key obstacles that must be addressed in order to make progress in locating and characterizing a magnetite-based magnetoreceptor cell. Avenues for further study are suggested, including the need for novel experimental, correlative, multimodal and multidisciplinary approaches. The aim of this review is to inspire new efforts towards understanding the cellular basis of magnetoreception in animals, which will in turn inform a new era of behavioural research based on first principles. PMID:26333810
Mathematical modeling of the aerodynamics of high-angle-of-attack maneuvers
NASA Technical Reports Server (NTRS)
Schiff, L. B.; Tobak, M.; Malcolm, G. N.
1980-01-01
This paper is a review of the current state of aerodynamic mathematical modeling for aircraft motions at high angles of attack. The mathematical model serves to define a set of characteristic motions from whose known aerodynamic responses the aerodynamic response to an arbitrary high angle-of-attack flight maneuver can be predicted. Means are explored of obtaining stability parameter information in terms of the characteristic motions, whether by wind-tunnel experiments, computational methods, or by parameter-identification methods applied to flight-test data. A rationale is presented for selecting and verifying the aerodynamic mathematical model at the lowest necessary level of complexity. Experimental results describing the wing-rock phenomenon are shown to be accommodated within the most recent mathematical model by admitting the existence of aerodynamic hysteresis in the steady-state variation of the rolling moment with roll angle. Interpretation of the experimental results in terms of bifurcation theory reveals the general conditions under which aerodynamic hysteresis must exist.
O'Callaghan, Maureen; Soboleva, Tanya K; Barratt, Barbara I P
2010-01-01
Determining the effects of genetically modified (GM) crops on non-target organisms is essential as many non-target species provide important ecological functions. However, it is simply not possible to collect field data on more than a few potential non-target species present in the receiving environment of a GM crop. While risk assessment must be rigorous, new approaches are necessary to improve the efficiency of the process. Utilisation of published information and existing data on the phenology and population dynamics of test species in the field can be combined with limited amounts of experimental biosafety data to predict possible outcomes on species persistence. This paper presents an example of an approach where data from laboratory experiments and field studies on phenology are combined using predictive modelling. Using the New Zealand native weevil species Nicaeana cervina as a case study, we could predict that oviposition rates of the weevil feeding on a GM ryegrass could be reduced by up to 30% without threat to populations of the weevil in pastoral ecosystems. In addition, an experimentally established correlation between feeding level and oviposition led to the prediction that a consistent reduction in feeding of 50% or higher indicated a significant risk to the species and could potentially lead to local extinctions. This approach to biosafety risk assessment, maximising the use of pre-existing field and laboratory data on non-target species, can make an important contribution to informed decision-making by regulatory authorities and developers of new technologies. © ISBR, EDP Sciences, 2011.
Experimental test of Landauer’s principle in single-bit operations on nanomagnetic memory bits
Hong, Jeongmin; Lambson, Brian; Dhuey, Scott; Bokor, Jeffrey
2016-01-01
Minimizing energy dissipation has emerged as the key challenge in continuing to scale the performance of digital computers. The question of whether there exists a fundamental lower limit to the energy required for digital operations is therefore of great interest. A well-known theoretical result put forward by Landauer states that any irreversible single-bit operation on a physical memory element in contact with a heat bath at a temperature T requires at least kBT ln(2) of heat be dissipated from the memory into the environment, where kB is the Boltzmann constant. We report an experimental investigation of the intrinsic energy loss of an adiabatic single-bit reset operation using nanoscale magnetic memory bits, by far the most ubiquitous digital storage technology in use today. Through sensitive, high-precision magnetometry measurements, we observed that the amount of dissipated energy in this process is consistent (within 2 SDs of experimental uncertainty) with the Landauer limit. This result reinforces the connection between “information thermodynamics” and physical systems and also provides a foundation for the development of practical information processing technologies that approach the fundamental limit of energy dissipation. The significance of the result includes insightful direction for future development of information technology. PMID:26998519
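The Landauer bound itself is straightforward to evaluate; the following short calculation of kB·T·ln(2) shows the scale of the dissipation floor the experiment probed.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_limit(T):
    """Minimum heat (in joules) dissipated by an irreversible single-bit
    operation in contact with a heat bath at temperature T (in kelvin)."""
    return k_B * T * math.log(2)

# At room temperature (300 K) the bound is about 2.87e-21 J, i.e. a few
# zeptojoules, or roughly 0.018 eV per erased bit.
room_temperature_bound = landauer_limit(300.0)
```

For comparison, conventional CMOS logic today dissipates many orders of magnitude more energy per bit operation, which is why approaching this limit is of practical interest.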
3D shape reconstruction of specular surfaces by using phase measuring deflectometry
NASA Astrophysics Data System (ADS)
Zhou, Tian; Chen, Kun; Wei, Haoyun; Li, Yan
2016-10-01
Existing estimation methods for recovering height information from surface gradients are mainly divided into Modal and Zonal techniques. Since specular surfaces used in industry often have complex shapes and large areas, consideration must be given both to improving measurement accuracy and to accelerating on-line processing speed, which is beyond the capacity of existing estimation methods. Incorporating the Modal and Zonal approaches into a unifying scheme, we introduce in this paper an improved 3D shape reconstruction method for specular surfaces based on Phase Measuring Deflectometry. Modal estimation is first implemented to derive coarse height information for the measured surface as initial iteration values. The real shape is then recovered using a modified Zonal wave-front reconstruction algorithm. By combining the advantages of Modal and Zonal estimation, the proposed method simultaneously achieves consistently high accuracy and dramatically rapid convergence. Moreover, the iterative process, based on an advanced successive over-relaxation technique, shows consistent rejection of measurement errors, guaranteeing stability and robustness in practical applications. Both simulation and experimental measurement demonstrate the validity and efficiency of the proposed method. In the experiment, the computation time decreased by approximately 74.92% relative to Zonal estimation, and the surface error was about 6.68 μm for a measured spherical mirror reconstructed over 391×529 points. In general, the method converges quickly and accurately, providing an efficient, stable, real-time approach for the shape reconstruction of specular surfaces in practical situations.
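The Zonal step can be sketched as a least-squares integration of the measured gradient field, solved by successive over-relaxation (SOR); seeding the iteration with a coarse Modal estimate is what accelerates convergence in the hybrid scheme. This is a generic textbook-style sketch under unit grid spacing, not the authors' modified algorithm.

```python
import numpy as np

def zonal_reconstruct(gx, gy, omega=1.5, iters=5000, tol=1e-11, z0=None):
    """Least-squares zonal height reconstruction from gradient fields gx, gy
    by SOR sweeps. z0 can seed the iteration (e.g. a coarse Modal estimate).
    Unit grid spacing is assumed; height is recovered up to a constant."""
    h, w = gx.shape
    fx = 0.5 * (gx[:, :-1] + gx[:, 1:])  # slope along each horizontal edge
    fy = 0.5 * (gy[:-1, :] + gy[1:, :])  # slope along each vertical edge
    z = np.zeros((h, w)) if z0 is None else z0.astype(float).copy()
    for _ in range(iters):
        max_delta = 0.0
        for i in range(h):
            for j in range(w):
                n, s = 0, 0.0
                if j > 0:     n += 1; s += z[i, j - 1] + fx[i, j - 1]
                if j < w - 1: n += 1; s += z[i, j + 1] - fx[i, j]
                if i > 0:     n += 1; s += z[i - 1, j] + fy[i - 1, j]
                if i < h - 1: n += 1; s += z[i + 1, j] - fy[i, j]
                delta = s / n - z[i, j]
                z[i, j] += omega * delta   # over-relaxed Gauss-Seidel update
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:
            break
    return z - z.mean()                    # fix the free additive constant
```

A vectorized or red-black ordered version of the same sweep is what one would use at the 391×529 scale reported in the paper; the pure-Python loops here are for clarity only.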
NASA Astrophysics Data System (ADS)
Nielsen, R. L.; Ghiorso, M. S.; Trischman, T.
2015-12-01
The database traceDs is designed to provide a transparent and accessible resource of experimental partitioning data. It now includes ~90% of all the experimental trace element partitioning data (~4000 experiments) produced over the past 45 years, and is accessible through a web-based interface (using the portal lepr.ofm-research.org). We set a minimum standard for inclusion, the threshold criteria being the inclusion of (1) experimental conditions (temperature, pressure, device, container, time, etc.); (2) major element compositions of the phases; and (3) trace element analyses of the phases. Data sources that did not report these minimum components were not included. The rationale for excluding such data is that the degree of equilibration is unknown and, more important, that no rigorous approach to modeling the behavior of trace elements is possible without knowledge of the compositions of the phases and the temperature and pressure of formation/equilibration. The data are stored using a schema derived from that of the Library of Experimental Phase Relations (LEPR), modified to account for additional metadata and restructured to permit multiple analytical entries for various element/technique/standard combinations. In the process of populating the database, we have learned a number of things about the existing published experimental partitioning data. Most important: (1) ~20% of the papers do not satisfy one or more of the threshold criteria; (2) the standard format for presenting data is the average, a convention developed when publication space was constrained, despite the fact that all the information can now be published as electronic supplements; and (3) the uncertainties published with the compositional data are often not adequately explained (e.g., 1 or 2 sigma, standard deviation of the average, etc.).
We propose a new set of publication standards for experimental data that include the minimum criteria described above, the publication of all analyses with error based on peak count rates and background, plus information on the structural state of the mineral (e.g. orthopyroxene vs. pigeonite).
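A screening function for the threshold criteria described above might look like the following; the field names are illustrative placeholders, not a schema the database actually mandates.

```python
REQUIRED_CONDITIONS = ["temperature", "pressure", "device", "container", "time"]

def meets_threshold(experiment):
    """Return (ok, missing): does a reported experiment satisfy the minimum
    criteria for inclusion? Field names here are illustrative only."""
    missing = []
    conditions = experiment.get("conditions", {})
    for key in REQUIRED_CONDITIONS:
        if key not in conditions:
            missing.append(f"conditions.{key}")
    for section in ("major_elements", "trace_elements"):
        if not experiment.get(section):  # per-phase compositions must be reported
            missing.append(section)
    return (not missing), missing
```

A curation pipeline would run such a check on each candidate data source and reject (or flag for follow-up) any record with a non-empty `missing` list.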
CNTRO: A Semantic Web Ontology for Temporal Relation Inferencing in Clinical Narratives.
Tao, Cui; Wei, Wei-Qi; Solbrig, Harold R; Savova, Guergana; Chute, Christopher G
2010-11-13
Using Semantic Web specifications to represent temporal information in clinical narratives is an important step for temporal reasoning and answering time-oriented queries. Existing temporal models are either incompatible with the powerful reasoning tools developed for the Semantic Web, or designed only for structured clinical data and therefore not ready to be applied directly to natural-language clinical narrative reports. We have developed a Semantic Web ontology called the Clinical Narrative Temporal Relation Ontology (CNTRO). Using this ontology, temporal information in clinical narratives can be represented as RDF (Resource Description Framework) triples. Additional temporal information and relations can then be inferred by Semantic Web-based reasoning tools. Experimental results show that this ontology can successfully represent temporal information in real clinical narratives.
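As a toy illustration of the representation, a qualitative "before" relation between two clinical events and an anchored date can be written as RDF triples in N-Triples syntax. The namespace and term names below are simplified placeholders, not CNTRO's actual vocabulary.

```python
# Hypothetical namespace standing in for the ontology's real one.
CNTRO = "http://example.org/cntro#"

def triple(s, p, o):
    """Format one N-Triples statement; the object may be a URI or a literal."""
    obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
    return f"<{s}> <{p}> {obj} ."

event = CNTRO + "event/chest_pain_onset"
admission = CNTRO + "event/hospital_admission"
triples = [
    triple(event, CNTRO + "type", CNTRO + "Event"),
    triple(event, CNTRO + "before", admission),          # qualitative temporal relation
    triple(admission, CNTRO + "hasTime", "2010-03-15"),  # anchored time as a literal
]
```

Once narrative-derived facts are in triple form, a reasoner can close the relation set, e.g. inferring transitively that anything before `chest_pain_onset` is also before `hospital_admission`.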
Excitation and Ionization Cross Sections for Electron-Beam Energy Deposition in High Temperature Air
1987-07-09
are given and compared to existing experimental results or other theoretical approaches. This information can readily be used as input for a deposition...of the doubly-differential, singly-differential and total ionization cross sections which subsequently served to guide theoretical calculations on...coworkers have been leaders in developing a theoretical base for studying electron production and energy deposition in atmospheric gases such as He, N2
Topological solitons as addressable phase bits in a driven laser
NASA Astrophysics Data System (ADS)
Garbin, Bruno; Javaloyes, Julien; Tissoni, Giovanna; Barland, Stéphane
2015-01-01
Optical localized states are usually defined as self-localized bistable packets of light, which exist as independently controllable optical intensity pulses either in the longitudinal or transverse dimension of nonlinear optical systems. Here we demonstrate experimentally and analytically the existence of longitudinal localized states that exist fundamentally in the phase of laser light. These robust and versatile phase bits can be individually nucleated and canceled in an injection-locked semiconductor laser operated in a neuron-like excitable regime and submitted to delayed feedback. The demonstration of their control opens the way to their use as phase information units in next-generation coherent communication systems. We analyse our observations in terms of a generic model, which confirms the topological nature of the phase bits and discloses their formal but profound analogy with Sine-Gordon solitons.
Considering RNAi experimental design in parasitic helminths.
Dalzell, Johnathan J; Warnock, Neil D; McVeigh, Paul; Marks, Nikki J; Mousley, Angela; Atkinson, Louise; Maule, Aaron G
2012-04-01
Almost a decade has passed since the first report of RNA interference (RNAi) in a parasitic helminth. Whilst much progress has been made with RNAi informing gene function studies in disparate nematode and flatworm parasites, substantial and seemingly prohibitive difficulties have been encountered in some species, hindering progress. An appraisal of current practices, trends and ideals of RNAi experimental design in parasitic helminths is both timely and necessary for a number of reasons: firstly, the increasing availability of parasitic helminth genome/transcriptome resources means there is a growing need for gene function tools such as RNAi; secondly, fundamental differences and unique challenges exist for parasite species which do not apply to model organisms; thirdly, the inherent variation in experimental design and reported difficulties with reproducibility undermine confidence. Ideally, RNAi studies of gene function should adopt a standardised experimental design to aid reproducibility, interpretation and comparative analyses. Although the huge variation in parasite biology and experimental endpoints makes standardisation of RNAi experimental design difficult or impractical, we must strive to validate RNAi experimentation in helminth parasites. To aid this process we identify multiple approaches to RNAi experimental validation and highlight those which we deem critical for gene function studies in helminth parasites.
Carcagno, G J; Kemper, P
1988-01-01
The channeling demonstration sought to substitute community care for nursing home care to reduce long-term care costs and improve the quality of life of elderly clients and the family members and friends who care for them. Two interventions were tested, each in five sites; both had comprehensive case management at their core. One model added a small amount of additional funding for direct community services to fill the gaps in the existing system; the other substantially expanded coverage of community services regardless of categorical eligibility under existing programs. The demonstration was evaluated using a randomized experimental design to test the effects of channeling on use of community care, nursing homes, hospitals, and informal caregiving, and on measures of the quality of life of clients and their informal caregivers. Data were obtained from interviews with clients and informal caregivers; service use and cost records came from Medicare, Medicaid, channeling, and providers; and death records for an 18-month follow-up period were examined. PMID:3130322
Restoring the spatial resolution of refocus images on 4D light field
NASA Astrophysics Data System (ADS)
Lim, JaeGuyn; Park, ByungKwan; Kang, JooYoung; Lee, SeongDeok
2010-01-01
This paper presents a method for generating a refocus image with restored spatial resolution on a plenoptic camera, which, unlike a traditional camera, allows the depth of field to be controlled after a single image is captured. Such a camera records the 4D light field (the angular and spatial information of light) on a limited 2D sensor, so 2D spatial resolution is sacrificed to the unavoidable 2D angular data. This is why a refocus image has low spatial resolution compared with the 2D sensor. Recently, however, it has been shown that the angular data contain sub-pixel spatial information, so the spatial resolution of the 4D light field can be increased. We exploit this fact to improve the spatial resolution of a refocus image. We verified experimentally that the available sub-pixel spatial information varies with the depth of objects from the camera. Therefore, for the selected refocused regions (the corresponding depth), we use the corresponding pre-estimated sub-pixel spatial information to reconstruct the spatial resolution of those regions, while the other regions remain out of focus. Our experimental results show the benefit of the proposed method over the existing method.
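The basic refocusing step that the paper builds on is shift-and-add over the sub-aperture views: each view is translated in proportion to its angular offset, then all views are averaged, so points at the depth matching the chosen shift slope stay sharp while others blur. The sketch below illustrates this on a synthetic 3x3 camera array with a single bright point; the array geometry and the `slope` parameterization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def refocus(subviews, offsets, slope):
    """Shift-and-add refocus: translate each sub-aperture view by
    slope * (angular offset), then average.  Points whose disparity
    matches `slope` align across views and stay sharp."""
    acc = np.zeros_like(subviews[0], dtype=float)
    for view, (du, dv) in zip(subviews, offsets):
        acc += np.roll(view, (round(slope * du), round(slope * dv)),
                       axis=(0, 1))
    return acc / len(subviews)

# Toy 3x3 camera array viewing one bright point with disparity 1 px/view.
offsets = [(u, v) for u in (-1, 0, 1) for v in (-1, 0, 1)]
subviews = []
for du, dv in offsets:
    img = np.zeros((9, 9))
    img[4 + du, 4 + dv] = 1.0        # the point shifts with the viewpoint
    subviews.append(img)

sharp = refocus(subviews, offsets, slope=-1)   # views align: point is sharp
blurred = refocus(subviews, offsets, slope=0)  # views misalign: point blurs
```

With the matching slope all nine contributions land on the same pixel (`sharp[4, 4] == 1.0`); with the wrong slope the energy spreads over nine pixels. The resolution-restoration step of the paper then exploits the sub-pixel offsets between the shifted views, which this integer-shift sketch deliberately omits.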
Computational modeling of RNA 3D structures, with the aid of experimental restraints
Magnus, Marcin; Matelska, Dorota; Łach, Grzegorz; Chojnowski, Grzegorz; Boniecki, Michal J; Purta, Elzbieta; Dawson, Wayne; Dunin-Horkawicz, Stanislaw; Bujnicki, Janusz M
2014-01-01
In addition to mRNAs whose primary function is transmission of genetic information from DNA to proteins, numerous other classes of RNA molecules exist, which are involved in a variety of functions, such as catalyzing biochemical reactions or performing regulatory roles. In analogy to proteins, the function of RNAs depends on their structure and dynamics, which are largely determined by the ribonucleotide sequence. Experimental determination of high-resolution RNA structures is both laborious and difficult, and therefore, the majority of known RNAs remain structurally uncharacterized. To address this problem, computational structure prediction methods were developed that simulate either the physical process of RNA structure formation (“Greek science” approach) or utilize information derived from known structures of other RNA molecules (“Babylonian science” approach). All computational methods suffer from various limitations that make them generally unreliable for structure prediction of long RNA sequences. However, in many cases, the limitations of computational and experimental methods can be overcome by combining these two complementary approaches with each other. In this work, we review computational approaches for RNA structure prediction, with emphasis on implementations (particular programs) that can utilize restraints derived from experimental analyses. We also list experimental approaches, whose results can be relatively easily used by computational methods. Finally, we describe case studies where computational and experimental analyses were successfully combined to determine RNA structures that would remain out of reach for each of these approaches applied separately. PMID:24785264
Hurka, Florian; Wenger, Thomas; Heininger, Sebastian; Lueth, Tim C
2011-01-01
This article describes a new interaction device for surgical navigation systems: the so-called navigation mouse system. The idea is to use a tracked instrument of a surgical navigation system, such as a pointer, to control the software. The new interaction system extends existing navigation systems with a microcontroller unit, which uses the existing communication line to extract the required 3D information about an instrument and to compute cursor positions and click events analogous to those of a PC mouse. These positions and events are used to operate the navigation system. An experimental setup demonstrates the accuracy achievable with the new mouse system.
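The mapping from a tracked 3D instrument tip to a 2D cursor can be sketched as a projection onto a virtual interaction plane followed by scaling to screen pixels. The geometry below (plane origin, axes, plane size in mm, screen resolution) is entirely illustrative; the article does not specify these values.

```python
import numpy as np

def tip_to_cursor(tip, origin, x_axis, y_axis, screen_px=(1920, 1080),
                  plane_mm=(400.0, 300.0)):
    """Project a tracked instrument tip (3D, mm) onto a virtual
    interaction plane and scale to pixel coordinates.
    All geometric parameters here are illustrative assumptions."""
    rel = np.asarray(tip, float) - origin
    u = rel @ x_axis / plane_mm[0]          # 0..1 across the plane width
    v = rel @ y_axis / plane_mm[1]          # 0..1 across the plane height
    px = int(np.clip(u, 0.0, 1.0) * (screen_px[0] - 1))
    py = int(np.clip(v, 0.0, 1.0) * (screen_px[1] - 1))
    return px, py

origin = np.array([0.0, 0.0, 0.0])
x_axis = np.array([1.0, 0.0, 0.0])   # unit vectors spanning the plane
y_axis = np.array([0.0, 1.0, 0.0])
cursor = tip_to_cursor([200.0, 150.0, 25.0], origin, x_axis, y_axis)
# the tip at the plane's centre maps to the screen centre: (959, 539)
```

Click events would additionally be derived from the tip's distance to the plane or from dwell time, which is omitted here.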
Title I preliminary engineering for: A. S. E. F. solid waste to methane gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1976-01-01
An assignment to provide preliminary engineering of an Advanced System Experimental Facility for production of methane gas from urban solid waste by anaerobic digestion is documented. The experimental facility will be constructed on an existing solid waste shredding and landfill facility in Pompano Beach, Florida. Information is included on: general description of the project; justification of basic need; process design; preliminary drawings; outline specifications; preliminary estimate of cost; and time schedules for design and construction. The preliminary cost estimate for the design and construction phases of the experimental program is $2,960,000, based on Dec. 1975 and Jan. 1976 costs. A time schedule of eight months is given to complete the detailed design, equipment procurement and the award of subcontracts.
Adaptive algorithm of magnetic heading detection
NASA Astrophysics Data System (ADS)
Liu, Gong-Xu; Shi, Ling-Feng
2017-11-01
Magnetic data obtained from a magnetic sensor usually fluctuate within a certain range, which makes it difficult to estimate the magnetic heading accurately. In fact, magnetic heading information is usually submerged in noise because of various kinds of electromagnetic interference and the diversity of the pedestrian's motion states. In order to solve this problem, a new adaptive algorithm based on the (typically) right-angled corridors of office or residential buildings is put forward to process heading information. First, a 3D indoor localization platform is set up based on the MPU9250. Then, several groups of data are measured by changing the experimental environment and the pedestrian's motion pace. The raw data from the attached inertial measurement unit are calibrated, arranged into a time-stamped array and written to a data file. The data file is then imported into MATLAB for processing and analysis using the proposed adaptive algorithm. Finally, the algorithm is verified by comparison with an existing algorithm. The experimental results show that the algorithm is robust and fault-tolerant, and can detect heading information accurately and in real time.
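A minimal form of a corridor-based heading correction is to snap a noisy heading estimate to the nearest of the four corridor directions whenever it lies within a tolerance band. The sketch below is a simple stand-in for the paper's adaptive algorithm, with an assumed tolerance parameter; the actual algorithm is considerably richer (calibration, motion-state handling).

```python
def snap_heading(raw_deg, tol_deg=15.0):
    """Snap a noisy magnetic heading to the nearest right-angled corridor
    direction (0/90/180/270 degrees) when within tol_deg; otherwise keep
    the raw value.  tol_deg is an illustrative assumption."""
    nearest = round(raw_deg / 90.0) * 90 % 360
    # smallest angular distance, handling wrap-around at 360 degrees
    diff = abs((raw_deg - nearest + 180.0) % 360.0 - 180.0)
    return float(nearest) if diff <= tol_deg else raw_deg

# e.g. 93.2 degrees snaps to 90.0, but 47.0 degrees is left untouched.
```

An adaptive variant would widen or narrow `tol_deg` based on the recent variance of the magnetic data, which is where the motion-state dependence enters.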
NASA Astrophysics Data System (ADS)
Shirai, Tomohiro; Friberg, Ari T.
2018-04-01
Dispersion-canceled optical coherence tomography (OCT) based on spectral intensity interferometry was devised as a classical counterpart of quantum OCT to enhance the basic performance of conventional OCT. In this paper, we demonstrate experimentally that an alternative method of realizing this kind of OCT by means of two optical fiber couplers and a single spectrometer is a more practical and reliable option than the existing methods proposed previously. Furthermore, we develop a recipe for reducing multiple artifacts simultaneously on the basis of simple averaging and verify experimentally that it works successfully in the sense that all the artifacts are mitigated effectively and only the true signals carrying structural information about the sample survive.
An online database of nuclear electromagnetic moments
NASA Astrophysics Data System (ADS)
Mertzimekis, T. J.; Stamou, K.; Psaltis, A.
2016-01-01
Measurements of nuclear magnetic dipole and electric quadrupole moments are considered quite important for understanding nuclear structure both near and far from the valley of stability. The recent advent of radioactive beams has resulted in a plethora of new, continuously flowing experimental data on nuclear structure - including nuclear moments - which complicates information management. A new, dedicated, public and user-friendly online database (http://magneticmoments.info) has been created, comprising experimental data on nuclear electromagnetic moments. The present database supersedes existing printed compilations, includes non-evaluated series of data and relevant metadata, and puts strong emphasis on bimonthly updates. The scope, features and extensions of the database are reported.
A new online database of nuclear electromagnetic moments
NASA Astrophysics Data System (ADS)
Mertzimekis, Theo J.
2017-09-01
Nuclear electromagnetic (EM) moments, i.e., the magnetic dipole and electric quadrupole moments, provide important information about nuclear structure. As with other types of experimental data available to the community, measurements of nuclear EM moments have been organized systematically in compilations since the dawn of nuclear science. However, the wealth of recent moment measurements with radioactive beams, as well as earlier existing measurements, lacks an online, easy-to-access, systematically organized presence to disseminate information to researchers. In addition, available printed compilations suffer from a rather long life cycle, lagging behind experimental measurements published in journals or elsewhere. A new, online database (
Smerecnik, Chris M R; Mesters, Ilse; de Vries, Nanne K; de Vries, Hein
2009-11-01
Health messages alerting the public to previously unknown genetic risk factors for multifactorial diseases are a potentially useful strategy to create public awareness, and may be an important first step in promoting public health. However, there is a lack of evidence-based insight into its impact on individuals who were unaware of the existence of genetic risk factors at the moment of information exposure. The authors conducted 3 experimental studies with health messages communicating information about genetic risk factors for salt sensitivity (Studies 1A and 1B) and heightened cholesterol (Study 2) compared with general information without reference to genetic risk factors as a between-subjects variable and risk perception and intention to engage in preventive behavior as dependent variables. All 3 studies revealed lower perceived susceptibility among participants who received information on genetic risk factors, which was associated with lowered intentions to engage in preventive behavior. In Studies 1A and 1B, these effects were observed only for previously unaware individuals, whereas in Study 2, they were observed for the entire sample. Alerting the public to the existence of genetic risk factors may not necessarily be beneficial to public health. Public health promoters should be aware of the possible adverse effects of alerting the general population to genetic risk factors, and should simultaneously educate the public about the meaning and consequences of such factors. PsycINFO Database Record (c) 2009 APA, all rights reserved.
NASA Technical Reports Server (NTRS)
2005-01-01
This paper addresses the regulatory processes and requirements already in place by which an applicant might obtain experimental airworthiness certification for a civil Unmanned Aircraft System (UAS). It extends an earlier, similar deliverable, PD007, which was an interim study of the same topic. Since few regulatory airworthiness and operating standards exist for UAS like those for traditional manned aircraft, and since most UAS have historically been developed and operated under military auspices, civil use of UAS in the NAS is a new and unfamiliar challenge requiring specific and unique considerations. Experimental certification is the most basic level of FAA approval toward routine UAS operation in the NAS. The paper reviews and explains existing FAA requirements for an applicant seeking experimental airworthiness approval and details the process for submission of necessary information. It summarizes the limited purposes for which experimental aircraft may be used and addresses pertinent aspects of UAS design, construction and operation in the NAS in harmony with traditional manned aircraft. The Policy IPT's position is that UAS, while different from manned aircraft, can use the same initial processes to gain civil operating experience under the experimental approval. Particular note is taken of those UAS-unique characteristics which require extra attention to assure equivalent safety of operation, such as the UAS control station and sense-and-avoid capability. The paper also provides "best practices" guidance for UAS manufacturers and FAA personnel in two appendices. The material in Appendix A is intended to provide guidance on assuring UAS safety, giving FAA personnel a suggested list of items to review, with a focus on UAS-unique factors, prior to issuance of an experimental airworthiness certificate.
Appendix B provides an outline for a program letter which a manufacturer could use in preparing the application for an UAS experimental airworthiness certificate.
Laser marking as a result of applying reverse engineering
NASA Astrophysics Data System (ADS)
Mihalache, Andrei; Nagîţ, Gheorghe; Rîpanu, Marius Ionuţ; Slǎtineanu, Laurenţiu; Dodun, Oana; Coteaţǎ, Margareta
2018-05-01
The elaboration of a modern manufacturing technology requires a certain amount of information about the part to be obtained. When the technology must be elaborated for an existing object, this information can be acquired using the principles of reverse engineering. Essentially, in this method, the analysis of the surfaces and other characteristics of the part must provide enough information for elaborating the part's manufacturing technology. On the other hand, laser marking is a processing method able to transfer various inscriptions or drawings onto a part. Sometimes, laser marking can be based on the analysis of an existing object, whose image can be used to generate the same object or an improved one. Many groups of factors can affect the results of the laser marking process. A theoretical analysis was proposed to show that the heights of triangles obtained by means of CNC marking equipment depend on the width of the line generated by the laser spot on the workpiece surface. Experimental research was designed and carried out to highlight the influence exerted by the line width and the angle of line intersections on the accuracy of the marking process. By mathematical processing of the experimental results, empirical mathematical models were determined. The power-type model, and the graphical representation elaborated on its basis, offered an image of the influences exerted by the considered input factors on the marking process accuracy.
Combining Multiple Forms Of Visual Information To Specify Contact Relations In Spatial Layout
NASA Astrophysics Data System (ADS)
Sedgwick, Hal A.
1990-03-01
An expert system, called Layout2, has been described, which models a subset of available visual information for spatial layout. The system is used to examine detailed interactions between multiple, partially redundant forms of information in an environment-centered geometrical model of an environment obeying certain rather general constraints. This paper discusses the extension of Layout2 to include generalized contact relations between surfaces. In an environment-centered model, the representation of viewer-centered distance is replaced by the representation of environmental location. This location information is propagated through the representation of the environment by a network of contact relations between contiguous surfaces. Perspective information interacts with other forms of information to specify these contact relations. The experimental study of human perception of contact relations in extended spatial layouts is also discussed. Differences between human results and Layout2 results reveal limitations in the human ability to register available information; they also point to the existence of certain forms of information not yet formalized in Layout2.
Radiation Detection Computational Benchmark Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.
2013-09-24
Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessment of the operational performance of radiation detection systems. This can, however, result in large and complex scenarios which are time-consuming to model. A variety of approaches to radiation transport modeling exist, with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL's ADVANTG) which combine the benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations, with a preference for scenarios which include experimental data or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty, to include gamma transport, neutron transport, or both, and to represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for compilation. This report describes the details of the selected benchmarks and the results from the various transport codes.
Development Of A Numerical Tow Tank With Wave Generation To Supplement Experimental Efforts
2017-12-01
... vehicles; CAD, computer aided design; CFD, computational fluid dynamics; FVM, finite volume method; IO, information operations; ISR, intelligence, surveillance, and ... wedge installation. In 2016, NPS student Ensign Ryan Tran adapted an existing vertical plunging wedge wave maker design used at the U.S. Naval
Debunking minimum information myths: one hat need not fit all.
Orchard, Sandra; Taylor, Chris F
2009-04-01
A recent meeting report published in this journal suggests that the work of the various bodies attempting to improve the quality of articles describing the results of biomedical experimental work has been misunderstood or, at best, misinterpreted. This response is an attempt to set the record straight and ensure that other groups are not discouraged from using these standards or from joining in their further development in either existing or novel areas of research.
Evolution of radiation resistance in a complex microenvironment
NASA Astrophysics Data System (ADS)
Kim, So Hyun; Austin, Robert; Mehta, Monal; Kahn, Atif
2013-03-01
Radiation treatment responses in brain cancers are typically associated with short progression-free intervals in highly lethal malignancies such as glioblastomas. Even as patients routinely progress through empirically selected second- and third-line salvage therapies, surprisingly little information exists on how cancer cells evolve resistance. We will present experimental results showing how resistance to radiation evolves in the presence of complex radiation gradients. Sponsored by the NCI/NIH Physical Sciences Oncology Centers.
Depth-tunable three-dimensional display with interactive light field control
NASA Astrophysics Data System (ADS)
Xie, Songlin; Wang, Peng; Sang, Xinzhu; Li, Chenyu; Dou, Wenhua; Xiao, Liquan
2016-07-01
A software-defined depth-tunable three-dimensional (3D) display with interactive 3D depth control is presented. With the proposed post-processing system, the disparity of multi-view media can be freely adjusted. Benefiting from the wealth of information inherently contained in dense multi-view images captured with a parallel camera array, the 3D light field is built, and the light field structure is manipulated to adjust the disparity without additionally acquired depth information, since the light field structure itself contains depth information. A statistical analysis based on least squares is carried out to extract the depth information inherent in the light field structure, and the resulting accurate depth information can be used to re-parameterize light fields for the autostereoscopic display while guaranteeing smooth motion parallax. Experimental results show that the system is a convenient and effective way to adjust the performance of a 3D scene on the 3D display.
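The least-squares extraction of depth from the light field structure can be illustrated in one dimension: a scene feature traces a line across the views of a parallel camera array, and the slope of that line (disparity, proportional to inverse depth) is the least-squares fit to its tracked positions. The data below are synthetic and the single-feature setup is a simplification of the paper's analysis.

```python
import numpy as np

def disparity_ls(view_coords, feature_px):
    """Least-squares estimate of disparity (pixels per unit of view
    offset) from one feature's position tracked across the camera array.
    This is the closed-form slope of a simple linear regression."""
    u = np.asarray(view_coords, float)
    x = np.asarray(feature_px, float)
    u = u - u.mean()
    return float(u @ (x - x.mean()) / (u @ u))

# A synthetic feature that shifts ~2 px per view step, with slight noise.
views = [-2, -1, 0, 1, 2]
xs = [96.1, 98.0, 99.9, 102.0, 104.0]
d = disparity_ls(views, xs)   # close to 1.98 px/view
```

Once disparities are known per region, the views can be re-parameterized (shifted) to place any chosen depth at zero disparity, which is the depth-control knob of the display.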
Parallel photonic information processing at gigabyte per second data rates using transient states
NASA Astrophysics Data System (ADS)
Brunner, Daniel; Soriano, Miguel C.; Mirasso, Claudio R.; Fischer, Ingo
2013-01-01
The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches never exceeded a marginal existence. While the application of optics in super-computing receives reawakened interest, new concepts, partly neuro-inspired, are being considered and developed. Here we experimentally demonstrate the potential of a simple photonic architecture to process information at unprecedented data rates, implementing a learning-based approach. A semiconductor laser subject to delayed self-feedback and optical data injection is employed to solve computationally hard tasks. We demonstrate simultaneous spoken digit and speaker recognition and chaotic time-series prediction at data rates beyond 1Gbyte/s. We identify all digits with very low classification errors and perform chaotic time-series prediction with 10% error. Our approach bridges the areas of photonic information processing, cognitive and information science.
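The learning-based scheme behind this photonic architecture is reservoir computing: a fixed nonlinear dynamical system (here, a laser with delayed feedback) transforms the input, and only a linear readout is trained. A numerical caricature is an echo-state network with a ridge-regression readout; the sizes and parameters below are illustrative, not the paper's experimental values, and the sine-prediction task merely stands in for the chaotic time-series benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)
N, leak = 100, 0.3                              # reservoir size, leak rate
W_in = rng.uniform(-0.5, 0.5, (N, 1))           # fixed input weights
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius < 1

def run(u):
    """Drive the fixed reservoir with input sequence u; collect states."""
    x, states = np.zeros(N), []
    for s in u:
        x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * s + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead prediction of a sine wave; only the readout is trained.
t = np.arange(400)
u = np.sin(0.2 * t)
X, y = run(u[:-1]), u[1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
pred = X @ W_out
err = np.sqrt(np.mean((pred[100:] - y[100:]) ** 2))   # skip warm-up
```

In the photonic version the matrix of virtual nodes is realized by time-multiplexing within the laser's feedback delay, and training still touches only the output weights, which is what makes gigabyte-per-second operation feasible.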
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
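The Noisy-OR combination of information sources has a compact closed form: an interaction is supported unless every source independently fails to support it. The sketch below shows this combination for a single candidate edge; the probabilities are made up, and the paper's full framework additionally includes the Latent Factor Model variant, which is not shown.

```python
import numpy as np

def noisy_or(p_sources, leak=0.0):
    """Noisy-OR combination of per-source support probabilities for one
    edge: supported if at least one source 'fires'.  `leak` models a
    baseline chance of the edge existing absent any source."""
    p = np.asarray(p_sources, float)
    return 1.0 - (1.0 - leak) * np.prod(1.0 - p)

# Three knowledge sources give weak-to-moderate support for an edge:
prior = noisy_or([0.2, 0.5, 0.1])   # 1 - 0.8*0.5*0.9 = 0.64
```

A matrix of such per-edge priors over all gene pairs can then bias the structure score of a Bayesian network during inference, which is how the consensus prior addresses the low signal-to-noise problem.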
Lu, Jiwen; Erin Liong, Venice; Zhou, Jie
2017-08-09
In this paper, we propose a simultaneous local binary feature learning and encoding (SLBFLE) approach for both homogeneous and heterogeneous face recognition. Unlike existing hand-crafted face descriptors such as local binary pattern (LBP) and Gabor features which usually require strong prior knowledge, our SLBFLE is an unsupervised feature learning approach which automatically learns face representation from raw pixels. Unlike existing binary face descriptors such as the LBP, discriminant face descriptor (DFD), and compact binary face descriptor (CBFD) which use a two-stage feature extraction procedure, our SLBFLE jointly learns binary codes and the codebook for local face patches so that discriminative information from raw pixels from face images of different identities can be obtained by using a one-stage feature learning and encoding procedure. Moreover, we propose a coupled simultaneous local binary feature learning and encoding (C-SLBFLE) method to make the proposed approach suitable for heterogeneous face matching. Unlike most existing coupled feature learning methods which learn a pair of transformation matrices for each modality, we exploit both the common and specific information from heterogeneous face samples to characterize their underlying correlations. Experimental results on six widely used face datasets are presented to demonstrate the effectiveness of the proposed method.
Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J.
2012-01-01
In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross-validation [<7.2%] errors for models for the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or over assuming model topology and inferring only parameter values.
These findings also hold clues to both our evolutionary history and the development of versatile machines. PMID:23144601
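The estimation-exploration loop can be caricatured on a toy problem: a hidden linear "network" is interrogated with inputs chosen to maximize disagreement among the current model population, and the model population is refined against the accumulated measurements. Everything here (the linear system, population sizes, mutation scale) is an illustrative assumption; the paper's models are physical tendon-network simulations.

```python
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0, 0.5])           # hidden "network" parameters

def measure(x):
    """Interrogate the specimen: noisy response to a force input x."""
    return float(true_w @ x + rng.normal(0.0, 0.01))

models = rng.normal(0.0, 1.0, (20, 3))        # candidate parameter vectors
X, y = [], []

for _ in range(6):
    # Exploration: among random candidate inputs, pick the one on which
    # the current model population disagrees the most.
    cands = rng.normal(0.0, 1.0, (50, 3))
    disagreement = (cands @ models.T).var(axis=1)
    x = cands[disagreement.argmax()]
    X.append(x); y.append(measure(x))
    # Estimation: keep the models that best explain all data so far,
    # and refill the population with mutated copies of the survivors.
    err = ((np.array(X) @ models.T - np.array(y)[:, None]) ** 2).mean(axis=0)
    survivors = models[err.argsort()[:5]]
    models = np.vstack([survivors,
                        survivors.repeat(3, axis=0)
                        + rng.normal(0.0, 0.1, (15, 3))])

final_err = ((np.array(X) @ models.T - np.array(y)[:, None]) ** 2).mean(axis=0)
best = models[final_err.argmin()]             # best current model estimate
```

The key design choice, mirrored from the paper, is that the two populations are adversarial: informative inputs are those that split the models, so each measurement prunes the model space efficiently compared with random testing.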
Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J
2012-01-01
In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular-or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16(th) century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross validation [<7.2%] errors for models for the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameters values. 
These findings also hold clues to both our evolutionary history and the development of versatile machines.
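The estimation-exploration loop described in this abstract can be sketched in miniature. The following is a hypothetical toy, not the authors' implementation: a known linear map stands in for the tendon network, twenty perturbed least-squares fits stand in for the co-evolving model population, and candidate force inputs are scored by the disagreement they cause among the current models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hidden "specimen": an unknown linear force-transmission map y = W_true @ x,
# a deliberately simple stand-in for the real (nonlinear) tendon network.
W_true = rng.normal(size=(3, 3))

def query(x):
    """One physical experiment: apply force input x, observe the output."""
    return W_true @ x

models = [rng.normal(size=(3, 3)) for _ in range(20)]   # candidate models
X, Y = [], []                                           # training set so far

for _ in range(6):
    # Exploration: pick the input on which the current models disagree most.
    candidates = rng.normal(size=(50, 3))
    preds = np.stack([candidates @ W.T for W in models])    # (20, 50, 3)
    disagreement = preds.var(axis=0).sum(axis=1)
    x = candidates[np.argmax(disagreement)]
    X.append(x)
    Y.append(query(x))
    # Estimation: refit on all data gathered so far, then re-diversify
    # the population with small perturbations.
    A, B = np.array(X), np.array(Y)
    W_fit = np.linalg.lstsq(A, B, rcond=None)[0].T
    models = [W_fit + 0.1 * rng.normal(size=(3, 3)) for _ in range(20)]

max_error = np.abs(W_fit - W_true).max()
```

Because the toy system is linear and noiseless, a handful of informative queries pins down the structure exactly; the point of the sketch is the alternation between fitting models to data and choosing the next experiment to maximize model disagreement.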
Parametric models to relate spike train and LFP dynamics with neural information processing.
Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan
2012-01-01
Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task- or stimulus-specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set in which significant correlations had previously been obtained only through trial averaging.
We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
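The two ingredients of the unified framework, a time-varying stimulus-driven input and ongoing background activity, can be illustrated with a deliberately simple rate model. This is a hedged sketch, not the authors' parametric spike-field model: the exponential transient and all parameter values are illustrative assumptions.

```python
import numpy as np

def firing_rate(t, background, gain, onset, tau):
    """Total rate (Hz): ongoing background activity plus a stimulus-driven
    exponential transient switched on at `onset` seconds."""
    drive = np.where(t >= onset, gain * np.exp(-(t - onset) / tau), 0.0)
    return background + drive

dt = 0.001
t = np.arange(0.0, 2.0, dt)
rate = firing_rate(t, background=5.0, gain=40.0, onset=1.0, tau=0.2)

# Inhomogeneous Poisson spike train via a per-bin Bernoulli approximation.
rng = np.random.default_rng(0)
spikes = rng.random(t.size) < rate * dt
```

Fitting the onset parameter of such a model on a trial-by-trial basis is the kind of latency decoding the abstract describes; here we only simulate from the model.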
Observing the operational significance of discord consumption
NASA Astrophysics Data System (ADS)
Gu, Mile; Chrzanowski, Helen M.; Assad, Syed M.; Symul, Thomas; Modi, Kavan; Ralph, Timothy C.; Vedral, Vlatko; Lam, Ping Koy
2012-09-01
Coherent interactions that generate negligible entanglement can still exhibit unique quantum behaviour. This observation has motivated a search beyond entanglement for a complete description of all quantum correlations. Quantum discord is a promising candidate. Here, we demonstrate that under certain measurement constraints, discord between bipartite systems can be consumed to encode information that can only be accessed by coherent quantum interactions. The inability to access this information by any other means allows us to use discord to directly quantify this `quantum advantage'. We experimentally encode information within the discordant correlations of two separable Gaussian states. The amount of extra information recovered by coherent interaction is quantified and directly linked with the discord consumed during encoding. No entanglement exists at any point of this experiment. Thus we introduce and demonstrate an operational method to use discord as a physical resource.
BioNetCAD: design, simulation and experimental validation of synthetic biochemical networks
Rialle, Stéphanie; Felicori, Liza; Dias-Lopes, Camila; Pérès, Sabine; El Atia, Sanaâ; Thierry, Alain R.; Amar, Patrick; Molina, Franck
2010-01-01
Motivation: Synthetic biology studies how to design and construct biological systems with functions that do not exist in nature. Biochemical networks, although easier to control, have been used less frequently than genetic networks as a base to build a synthetic system. To date, no clear engineering principles exist to design such cell-free biochemical networks. Results: We describe a methodology for the construction of synthetic biochemical networks based on three main steps: design, simulation and experimental validation. We developed BioNetCAD to help users to go through these steps. BioNetCAD allows designing abstract networks that can be implemented thanks to CompuBioTicDB, a database of parts for synthetic biology. BioNetCAD also enables simulations with the HSim software and classical ordinary differential equations (ODEs). We demonstrate with a case study that BioNetCAD can rationalize and reduce further experimental validation during the construction of a biochemical network. Availability and implementation: BioNetCAD is freely available at http://www.sysdiag.cnrs.fr/BioNetCAD. It is implemented in Java and supported on MS Windows. CompuBioTicDB is freely accessible at http://compubiotic.sysdiag.cnrs.fr/. Contact: stephanie.rialle@sysdiag.cnrs.fr; franck.molina@sysdiag.cnrs.fr. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20628073
Challenges of Obtaining Evidence-Based Information Regarding Medications and Male Fertility.
Drobnis, Erma Z; Nangia, Ajay K
2017-01-01
In the clinic, the existing literature is insufficient to counsel our infertile men on medication use. Most studies have flaws that limit their application to evidence-based practice. In this chapter, we discuss the limitations of the current literature and the challenges of designing more useful studies. Among the most important weaknesses of existing studies is lack of power; that is, too few men are included to draw conclusions about the existence and size of medication effects. Adequate power is particularly important when confirming an absence of medication effect. Bias is also a problem in most studies. Early studies were rarely randomized, placebo-controlled, or blinded; a common example is patients receiving different medication regimens based on the severity of their symptoms, making it impossible to attribute differences between treated and untreated men to the medications. Additional bias is introduced by failing to include other factors that influence the outcome in the experimental design. Experimental species provide a uniform population amenable to randomization and placebo control, and useful information has been gained from these models. However, application to humans is limited by differences from other species in route of drug administration, absorption of the drug, concentration in the male genital tract tissues, and genital tract physiology. To a lesser degree, there is variation among individual men in their response to drugs. In addition, drugs in the same class may have different effects, limiting the applicability of data across drugs of a single class. Complicating matters further, a toxic medication may seem to improve fertility endpoints by improving a disease condition that diminishes fertility. Finally, drug interactions have not been studied, and actual fertility data (pregnancy/fecundity) in humans are rare. A healthy dose of skepticism is warranted when evaluating studies of medications and male reproductive health.
FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju
To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts of the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements of, and formulate a structure for, a transient fuel database through leveraging existing resources. It was concluded in discussions of these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus to near-future validation developments as well as long-lasting benefits to NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the incapability of acquiring satisfactory validation data is often a showstopper that must first be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to experimentally generate. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources or with dummy data if nonexistent.
The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure by the three parties of subject matter experts, further searches for existing data will be effectively conducted, new experimental data generation will be realistically pursued, the knowledgebase schema will be practically designed, and code validation will be confidently planned.
Hybrid active contour model for inhomogeneous image segmentation with background estimation
NASA Astrophysics Data System (ADS)
Sun, Kaiqiong; Li, Yaqin; Zeng, Shan; Wang, Jun
2018-03-01
This paper proposes a hybrid active contour model for inhomogeneous image segmentation. The data term of the energy function in the active contour consists of a global region fitting term in a difference image and a local region fitting term in the original image. The difference image is obtained by subtracting the background from the original image. The background image is dynamically estimated from a linear filtered result of the original image on the basis of the varying curve locations during the active contour evolution process. As in existing local models, fitting the image to local region information makes the proposed model robust against an inhomogeneous background and maintains the accuracy of the segmentation result. Furthermore, fitting the difference image to the global region information makes the proposed model robust against the initial contour location, unlike existing local models. Experimental results show that the proposed model can obtain improved segmentation results compared with related methods in terms of both segmentation accuracy and initial contour sensitivity.
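A minimal sketch of the background-estimation step may help: a large-scale linear (box) filtering of the image serves as the background, and subtracting it forms the difference image that feeds the global region-fitting term. The box filter, the synthetic ramp-plus-disc image, and all sizes here are assumptions for illustration; the paper's background estimate additionally depends on the evolving curve location during contour evolution.

```python
import numpy as np

def box_blur(img, k=31):
    """Separable k x k box filter with edge-replicating padding; a simple
    linear estimate of the slowly varying background."""
    pad, kernel = k // 2, np.ones(k) / k
    out = np.pad(img, ((0, 0), (pad, pad)), mode='edge')
    out = np.apply_along_axis(np.convolve, 1, out, kernel, mode='valid')
    out = np.pad(out, ((pad, pad), (0, 0)), mode='edge')
    return np.apply_along_axis(np.convolve, 0, out, kernel, mode='valid')

# Synthetic inhomogeneous image: a bright disc on an illumination ramp.
y, x = np.mgrid[0:128, 0:128]
disc = (x - 64)**2 + (y - 64)**2 < 15**2
img = 0.01 * x + 0.5 * disc

background = box_blur(img)
diff = img - background          # the difference image for the global term

inside, outside = diff[disc].mean(), diff[~disc].mean()
```

In the difference image the object stands out against a flattened background, which is why a global fitting term on `diff` is far less sensitive to inhomogeneous illumination than one on the original image.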
Placebos in clinical practice and research.
De Deyn, P P; D'Hooge, R
1996-01-01
The main current application of placebo is in clinical research. The term placebo effect refers to diverse non-specific, desired or non-desired effects of substances or procedures and interactions between patient and therapist. Unpredictability of the placebo effect necessitates placebo-controlled designs for most trials. Therapeutic and diagnostic use of placebo is ethically acceptable in only a few well-defined cases. While "therapeutic" application of placebo almost invariably implies deception, this is not the case for its use in research. Conflicts may exist between the therapist's Hippocratic and scientific obligations. The authors provide examples in neuropsychiatry, illustrating that objective scientific data and well-considered guidelines may solve the ethical dilemma. Placebo control might even be considered an ethical obligation but some provisos should be kept in mind: (a) no adequate therapy for the disease should exist and/or (presumed) active therapy should have serious side-effects; (b) placebo treatment should not last too long; (c) placebo treatment should not inflict unacceptable risks, and (d) the experimental subject should be adequately informed and informed consent given. PMID:8798935
Pavement crack detection combining non-negative feature with fast LoG in complex scene
NASA Astrophysics Data System (ADS)
Wang, Wanli; Zhang, Xiuhua; Hong, Hanyu
2015-12-01
Pavement crack detection is affected by many kinds of interference in realistic situations, such as shadows, road signs, oil stains, and salt-and-pepper noise. Due to these unfavorable factors, existing crack detection methods have difficulty distinguishing cracks from the background correctly. How to extract crack information effectively is the key problem for a road crack detection system. To solve this problem, a novel method for pavement crack detection based on combining a non-negative feature with a fast LoG is proposed. The two key novelties and benefits of this new approach are 1) using image pixel gray-value compensation to acquire a uniform image, and 2) combining a non-negative feature with a fast LoG to extract crack information. The image preprocessing results demonstrate that the method is indeed able to homogenize the crack image more accurately than existing methods. A large number of experimental results demonstrate that the proposed approach can detect crack regions more correctly than traditional methods.
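A rough sketch of the LoG (Laplacian of Gaussian) idea on a synthetic crack image follows. It uses a 3x3 box blur as a cheap stand-in for the Gaussian and a 5-point discrete Laplacian; the paper's fast LoG and non-negative feature are not reproduced here, and all image parameters are invented for illustration.

```python
import numpy as np

def box_blur3(img):
    """3 x 3 box smoothing of the interior pixels; a cheap Gaussian stand-in."""
    out = img.copy()
    out[1:-1, 1:-1] = sum(img[1 + di:img.shape[0] - 1 + di,
                              1 + dj:img.shape[1] - 1 + dj]
                          for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
    return out

def laplacian(img):
    """5-point discrete Laplacian; gives a positive response on a dark line."""
    out = np.zeros_like(img)
    out[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] + img[1:-1, :-2]
                       + img[1:-1, 2:] - 4.0 * img[1:-1, 1:-1])
    return out

# Synthetic pavement: bright surface with an illumination ramp and a
# dark one-pixel-wide vertical crack at column 32.
img = 1.0 + 0.005 * np.arange(128)[None, :] + np.zeros((128, 128))
img[:, 32] = 0.2

# A linear illumination ramp has zero Laplacian, so the LoG response alone
# already suppresses this kind of inhomogeneity and highlights the crack.
response = laplacian(box_blur3(img))
crack_col = int(np.argmax(response[64]))
```

Thresholding `response` would yield candidate crack pixels; the gray-value compensation step in the paper would additionally flatten nonlinear illumination that a Laplacian does not cancel.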
Medical Libraries and the Assessment of User Needs *
Rees, Alan M.
1966-01-01
Users of information in science and technology have been studied in great detail with respect to material read, amount of time spent in reading and searching the literature, categories of questions asked, and so on. Probing for this information has been undertaken by means of structured and unstructured interviews, diaries, surveys, and questionnaires. Although a large amount of data has emerged on information usage and flow, the subjective response of scientists furnishes comment only on the satisfaction produced by present information services and does not yield insight into the extent to which needs remain unsatisfied. Relevance figures based upon the response of systems to questions cannot be equated with satisfaction of needs, since questions constitute, in most cases, inadequate representations of underlying information needs. Assessment of the needs of users of medical libraries and information systems must, in fact, be made in relation to the observed behavior and experience of biomedical scientists. There is room for well-designed experimentation which can explore the interaction of both psychological and environmental factors. Significant differences in information needs exist among and between individuals such as researchers and clinicians in the same environment. With respect to environment, it is hypothesized that the information needs of medical practitioners in remote areas might differ significantly from those of their colleagues working in large metropolitan centers in close proximity to medical schools, research institutions, and other rich sources of information fallout. It is anticipated that experimentation will eventually result in a methodology which will permit the determination and prediction of the information needs of any identified groups of users in a specific environment. PMID:5910386
Liao, Hung-Chang; Wang, Ya-Huei
2016-09-02
To facilitate interdisciplinary collaboration and to make connections between patients' diseases and their social/cultural contexts, the study examined whether the use of heterogeneous cluster grouping in reflective writing for medical humanities literature acquisition could have positive effects on medical university students in terms of empathy, critical thinking, and reflective writing. A 15-week quasi-experimental study was conducted to investigate the learning outcomes. After conducting cluster algorithms, heterogeneous learning clusters (experimental group; n = 43) and non-heterogeneous learning clusters (control group; n = 43) were derived for a medical humanities literature study. Before and after the intervention, an Empathy Scale in Patient Care (ES-PC), a critical thinking disposition assessment (CTDA-R), and a reflective writing test were administered to both groups. The findings showed that on the empathy scale, significant differences in the "behavioral empathy," "affective empathy," and overall sections existed between the post-test mean scores of the experimental group and those of the control group, but such differences did not exist in "intelligent empathy." Regarding critical thinking, there were significant differences in "systematicity and analyticity," "skepticism and well-informed," "maturity and skepticism," and overall sections. As for reflective writing, significant differences existed in "ideas," "voice and point of view," "critical thinking and representation," "depth of reflection on personal growth," and overall sections, but not in "focus and context structure" and "language and conventions." This study outlined an alternative for using heterogeneous cluster grouping in reflective writing about medical humanities literature to facilitate interdisciplinary cooperation to provide more humanizing medical care.
REFOLDdb: a new and sustainable gateway to experimental protocols for protein refolding.
Mizutani, Hisashi; Sugawara, Hideaki; Buckle, Ashley M; Sangawa, Takeshi; Miyazono, Ken-Ichi; Ohtsuka, Jun; Nagata, Koji; Shojima, Tomoki; Nosaki, Shohei; Xu, Yuqun; Wang, Delong; Hu, Xiao; Tanokura, Masaru; Yura, Kei
2017-04-24
More than 7000 papers related to "protein refolding" have been published to date, with approximately 300 reports each year during the last decade. Whilst some of these papers provide experimental protocols for protein refolding, a survey in the structural life science communities showed a necessity for a comprehensive database for refolding techniques. We therefore have developed a new resource - "REFOLDdb" - that collects refolding techniques into a single, searchable repository to help researchers develop refolding protocols for proteins of interest. We based our resource on the existing REFOLD database, which has not been updated since 2009. We redesigned the data format to be more concise, allowing consistent representations among data entries compared with the original REFOLD database. The remodeled data architecture enhances the search efficiency and improves the sustainability of the database. After an exhaustive literature search we added experimental refolding protocols from reports published 2009 to early 2017. In addition to this new data, we fully converted and integrated existing REFOLD data into our new resource. REFOLDdb contains 1877 entries as of March 17, 2017, and is freely available at http://p4d-info.nig.ac.jp/refolddb/. REFOLDdb is a unique database for the life sciences research community, providing annotated information for designing new refolding protocols and customizing existing methodologies. We envisage that this resource will find wide utility across broad disciplines that rely on the production of pure, active, recombinant proteins. Furthermore, the database also provides a useful overview of the recent trends and statistics in refolding technology development.
Hildebrandt, Tom; Shiovitz, Rachel; Alfano, Lauren; Greif, Rebecca
2008-09-01
The purpose of the current study was to operationalize the phenomenon of body deception, describe its theoretical importance, and validate its existence in an experimental paradigm. The definition of body deception includes the intentional misrepresentation of information about appearance to others. The present study examined body deception in a controlled experimental study of male and female same-sex peer groups using a series of hierarchical linear models. Ninety male and 90 female undergraduates were randomized to an experimental same-sex peer group or individual control condition. The results suggested that both men and women used body deception among peers, but men's body deception was muscularity driven whereas women's was thinness driven. Body dissatisfaction was significantly predictive of the degree of body deception used by both genders and it was significantly related to peer group membership. An integrated model for the role of body deception in body image disturbance is proposed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vonach, H.; Tagesen, S.
Starting with a discussion of the requirements and goals for high quality general-purpose evaluations, the paper will describe the procedures chosen in our evaluation work for JEFF for producing new general evaluations with complete covariance information for all cross sections (file 3 data). Key problems essential for the goal of making the best possible use of the existing theoretical and experimental knowledge on neutron interactions with the respective nuclide will be addressed, especially the problem of assigning covariances to calculated cross sections, necessary checking procedures for all experimental data, and various possibilities to amend the experimental database beyond the obvious use of EXFOR data for the respective cross sections. In this respect, both the use of elemental cross sections in isotopic evaluations and the use of implicit cross-section data (that is, data which can be converted into cross sections by simple methods) will be discussed in some detail.
An integrated approach to model strain localization bands in magnesium alloys
NASA Astrophysics Data System (ADS)
Baxevanakis, K. P.; Mo, C.; Cabal, M.; Kontsos, A.
2018-02-01
Strain localization bands (SLBs) that appear at early stages of deformation of magnesium alloys have been recently associated with heterogeneous activation of deformation twinning. Experimental evidence has demonstrated that such "Lüders-type" band formations dominate the overall mechanical behavior of these alloys resulting in sigmoidal type stress-strain curves with a distinct plateau followed by pronounced anisotropic hardening. To evaluate the role of SLB formation on the local and global mechanical behavior of magnesium alloys, an integrated experimental/computational approach is presented. The computational part is developed based on custom subroutines implemented in a finite element method that combine a plasticity model with a stiffness degradation approach. Specific inputs from the characterization and testing measurements to the computational approach are discussed while the numerical results are validated against such available experimental information, confirming the existence of load drops and the intensification of strain accumulation at the time of SLB initiation.
Error-based Extraction of States and Energy Landscapes from Experimental Single-Molecule Time-Series
NASA Astrophysics Data System (ADS)
Taylor, J. Nicholas; Li, Chun-Biu; Cooper, David R.; Landes, Christy F.; Komatsuzaki, Tamiki
2015-03-01
Characterization of states, the essential components of the underlying energy landscapes, is one of the most intriguing subjects in single-molecule (SM) experiments due to the existence of noise inherent to the measurements. Here we present a method to extract the underlying state sequences from experimental SM time-series. Taking into account empirical error and the finite sampling of the time-series, the method extracts a steady-state network which provides an approximation of the underlying effective free energy landscape. The core of the method is the application of rate-distortion theory from information theory, allowing the individual data points to be assigned to multiple states simultaneously. We demonstrate the method's proficiency in its application to simulated trajectories as well as to experimental SM fluorescence resonance energy transfer (FRET) trajectories obtained from isolated agonist binding domains of the AMPA receptor, an ionotropic glutamate receptor that is prevalent in the central nervous system.
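The soft, multi-state assignment at the heart of the method can be imitated with a fixed-temperature Boltzmann weighting of squared distortion, which for Gaussian noise coincides with EM for a fixed-variance mixture. This is an illustrative stand-in, not the authors' rate-distortion algorithm; the two-state FRET-like trace and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic two-state single-molecule trace: noisy dwells around FRET
# efficiencies 0.3 and 0.7 with measurement noise of s.d. 0.05.
levels = np.array([0.3, 0.7])
trace = levels[rng.integers(0, 2, size=500)] + 0.05 * rng.normal(size=500)

centers = np.array([0.2, 0.8])        # initial state guesses
weights = np.ones(2) / 2              # prior state probabilities
T = 2 * 0.05**2                       # "temperature" ~ 2 x noise variance

for _ in range(50):
    d = (trace[:, None] - centers[None, :])**2        # squared distortion
    p = weights * np.exp(-d / T)                      # Boltzmann weights
    p /= p.sum(axis=1, keepdims=True)                 # soft memberships
    weights = p.mean(axis=0)                          # update state priors
    centers = (p * trace[:, None]).sum(axis=0) / p.sum(axis=0)
```

Each data point ends up assigned to both states simultaneously through the membership matrix `p`, echoing the abstract's point that individual points need not be forced into a single state.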
Lappala, Anna; Nishima, Wataru; Miner, Jacob; Fenimore, Paul; Fischer, Will; Hraber, Peter; Zhang, Ming; McMahon, Benjamin; Tung, Chang-Shung
2018-05-10
Membrane fusion proteins are responsible for viral entry into host cells—a crucial first step in viral infection. These proteins undergo large conformational changes from pre-fusion to fusion-initiation structures, and, despite differences in viral genomes and disease etiology, many fusion proteins are arranged as trimers. Structural information for both pre-fusion and fusion-initiation states is critical for understanding virus neutralization by the host immune system. In the case of Ebola virus glycoprotein (EBOV GP) and Zika virus envelope protein (ZIKV E), pre-fusion state structures have been identified experimentally, but only partial structures of fusion-initiation states have been described. While the fusion-initiation structure is in an energetically unfavorable state that is difficult to solve experimentally, the existing structural information combined with computational approaches enabled the modeling of fusion-initiation state structures of both proteins. These structural models provide an improved understanding of four different neutralizing antibodies in the prevention of viral host entry.
Compound image segmentation of published biomedical figures.
Li, Pengyuan; Jiang, Xiangying; Kambhamettu, Chandra; Shatkay, Hagit
2018-04-01
Images convey essential information in biomedical publications. As such, there is a growing interest within the bio-curation and bio-database communities to store images within publications as evidence for biomedical processes and for experimental results. However, many of the images in biomedical publications are compound images consisting of multiple panels, where each individual panel potentially conveys a different type of information. Segmenting such images into constituent panels is an essential first step toward utilizing images. In this article, we develop a new compound image segmentation system, FigSplit, which is based on Connected Component Analysis. To overcome shortcomings typically manifested by existing methods, we develop a quality assessment step for evaluating and modifying segmentations. Two methods are proposed to re-segment the images if the initial segmentation is inaccurate. Experimental results show the effectiveness of our method compared with other methods. The system is publicly available for use at: https://www.eecis.udel.edu/~compbio/FigSplit. The code is available upon request. shatkay@udel.edu. Supplementary data are available online at Bioinformatics.
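Connected Component Analysis, the basis of FigSplit, can be illustrated with a small breadth-first labeling of a synthetic two-panel figure. This is a generic sketch of component labeling, not FigSplit's code; the quality-assessment and re-segmentation steps of the system are omitted.

```python
import numpy as np
from collections import deque

def label_components(mask):
    """4-connected component labeling by breadth-first search.
    Returns a label image and the number of components found."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        current += 1
        labels[i, j] = current
        queue = deque([(i, j)])
        while queue:
            a, b = queue.popleft()
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                r, c = a + da, b + db
                if (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]
                        and mask[r, c] and not labels[r, c]):
                    labels[r, c] = current
                    queue.append((r, c))
    return labels, current

# Synthetic compound figure: a white page holding two dark panels.
page = np.ones((60, 100))
page[5:25, 5:45] = 0.2      # panel 1
page[5:25, 55:95] = 0.2     # panel 2
labels, n = label_components(page < 0.5)
```

Taking the bounding box of each labeled component then yields one sub-image per panel, which is the essential first step toward per-panel processing described in the abstract.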
The effectiveness and cost of home care: an information synthesis.
Hedrick, S C; Inui, T S
1986-01-01
The effect of home care on patient outcomes and costs of care has been controversial. This information synthesis summarizes results from studies of home care using experimental or quasi-experimental designs, explicitly including judgments of methodologic soundness in weighing the results. In 12 studies of programs targeted at chronically ill populations, home care services appear to have no impact on mortality, patient functioning, or nursing home placements. Across studies, these services either have no effect on hospitalization or tend to increase the number of hospital days; ambulatory care utilization may be increased by 40 percent. The cost of care either is not affected or is actually increased by 15 percent. The critical need at present is for better-designed studies to test the effects of different types of home care, targeted at various types of patients, on the outcomes assessed in the existing studies, as well as on other important outcomes such as family finances, quality of life, and quality of care. PMID:3512486
Enhanced Vapor-Phase Diffusion in Porous Media - LDRD Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, C.K.; Webb, S.W.
1999-01-01
As part of the Laboratory-Directed Research and Development (LDRD) Program at Sandia National Laboratories, an investigation into the existence of enhanced vapor-phase diffusion (EVD) in porous media has been conducted. A thorough literature review was initially performed across multiple disciplines (soil science and engineering), and based on this review, the existence of EVD was found to be questionable. As a result, modeling and experiments were initiated to investigate the existence of EVD. In this LDRD, the first mechanistic model of EVD was developed which demonstrated the mechanisms responsible for EVD. The first direct measurements of EVD have also been conducted at multiple scales. Measurements have been made at the pore scale, in a two-dimensional network as represented by a fracture aperture, and in a porous medium. Significant enhancement of vapor-phase transport relative to Fickian diffusion was measured in all cases. The modeling and experimental results provide additional mechanisms for EVD beyond those presented by the generally accepted model of Philip and deVries (1957), which required a thermal gradient for EVD to exist. Modeling and experimental results show significant enhancement under isothermal conditions. Application of EVD to vapor transport in the near-surface vadose zone shows a significant variation between no enhancement, the model of Philip and deVries, and the present results. Based on this information, the model of Philip and deVries may need to be modified, and additional studies are recommended.
Experimental Treatment for Duchenne Muscular Dystrophy Gets Boost from Existing Medication
Spotlight on Research | By Colleen Labbe, M.S. | March 1, 2013. [Figure: A mouse hanging on a wire during a test of muscle strength.] Mice with a mutant dystrophin gene, which ...
Assessment of Existing Data and Reports for System Evaluation
NASA Technical Reports Server (NTRS)
Matolak, David W.; Skidmore, Trent A.
2000-01-01
This report describes work done as part of the Weather Datalink Research project grant. We describe the work done under Task 1 of this project: the assessment of the suitability of available reports and data for use in evaluation of candidate weather datalink systems, and the development of a performance parameter set for comparative system evaluation. It was found that existing data and reports are inadequate for a complete physical layer characterization, but that these reports provide a good foundation for system comparison. In addition, these reports also contain some information useful for evaluation at higher layers. The performance parameter list compiled can be viewed as near complete; additional investigations, both analytical/simulation and experimental, will likely result in additions and improvements to this list.
A review of unsteady turbulent boundary-layer experiments
NASA Technical Reports Server (NTRS)
Carr, L. W.
1981-01-01
The essential results of a comprehensive review of existing unsteady turbulent boundary-layer experiments are presented. Different types of unsteady flow facilities are described, and the related unsteady turbulent boundary-layer experiments are cataloged and discussed. The measurements that were obtained in the various experiments are described, and a complete list of experimental results is presented. All the experiments that measured instantaneous values of velocity, turbulence intensity, or turbulent shear stress are identified, and the availability of digital data is indicated. The results of the experiments are analyzed, and several significant trends are identified. An assessment of the available data is presented, delineating gaps in the existing data, and indicating where new or extended information is needed. Guidelines for future experiments are included.
FCRD Transmutation Fuels Handbook 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janney, Dawn Elizabeth; Papesch, Cynthia Ann
2015-09-01
Transmutation of minor actinides such as Np, Am, and Cm in spent nuclear fuel is of international interest because of its potential for reducing the long-term health and safety hazards caused by the radioactivity of the spent fuel. One important approach to transmutation (currently being pursued by the DOE Fuel Cycle Research & Development Advanced Fuels Campaign) involves incorporating the minor actinides into U-Pu-Zr alloys, which can be used as fuel in fast reactors. It is, therefore, important to understand the properties of U-Pu-Zr alloys, both with and without minor actinide additions. In addition to requiring extensive safety precautions, alloys containing U and Pu are difficult to study for numerous reasons, including their complex phase transformations, characteristically sluggish phase-transformation kinetics, tendency to produce experimental results that vary depending on the histories of individual samples, and sensitivity to contaminants such as oxygen in concentrations below a hundred parts per million. Many of the experimental measurements were made before 1980, and the level of documentation for experimental methods and results varies widely. It is, therefore, not surprising that little is known with certainty about U-Pu-Zr alloys, and that apparent general acceptance of a result sometimes merely indicates that only a single measurement exists for a particular property. This handbook summarizes currently available information about U, Pu, Zr, and alloys of two or three of these elements. It contains information about phase diagrams and related information (including phases and phase transformations); heat capacity, entropy, and enthalpy; thermal expansion; and thermal conductivity and diffusivity. In addition to presenting information about materials properties, it attempts to provide information about how well each property is known and how much variation exists between measurements.
Although the handbook includes some references to publications about modeling, its primary focus is experimental data. Most of the data has been published elsewhere (although scattered throughout numerous references, some quite obscure); however, some data is presented here for the first time.
Experimental Design for Parameter Estimation of Gene Regulatory Networks
Timmer, Jens
2012-01-01
Systems biology aims for building quantitative models to address unresolved issues in molecular biology. In order to describe the behavior of biological cells adequately, gene regulatory networks (GRNs) are intensively investigated. As the validity of models built for GRNs depends crucially on the kinetic rates, various methods have been developed to estimate these parameters from experimental data. For this purpose, it is favorable to choose the experimental conditions yielding maximal information. However, existing experimental design principles often rely on unfulfilled mathematical assumptions or become computationally demanding with growing model complexity. To solve this problem, we combined advanced methods for parameter and uncertainty estimation with experimental design considerations. As a showcase, we optimized three simulated GRNs in one of the challenges from the Dialogue for Reverse Engineering Assessment and Methods (DREAM). This article presents our approach, which was awarded the best performing procedure at the DREAM6 Estimation of Model Parameters challenge. For fast and reliable parameter estimation, local deterministic optimization of the likelihood was applied. We analyzed identifiability and precision of the estimates by calculating the profile likelihood. Furthermore, the profiles provided a way to uncover a selection of most informative experiments, from which the optimal one was chosen using additional criteria at every step of the design process. In conclusion, we provide a strategy for optimal experimental design and show its successful application on three highly nonlinear dynamic models. Although presented in the context of the GRNs to be inferred for the DREAM6 challenge, the approach is generic and applicable to most types of quantitative models in systems biology and other disciplines. PMID:22815723
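The profile-likelihood step described above can be illustrated with a toy sketch (a hypothetical exponential-decay model, not one of the DREAM6 networks; for Gaussian noise, minimizing the residual sum of squares at each fixed parameter value traces out the profile):

```python
import numpy as np

# Toy model y = a * exp(-b * t): profile the likelihood of the decay
# rate b by optimizing the conditionally linear amplitude a in closed
# form at each fixed b.
t = np.linspace(0.0, 4.0, 20)
rng = np.random.default_rng(2)
y = 2.0 * np.exp(-0.7 * t) + 0.05 * rng.standard_normal(t.size)

def profile_rss(b):
    """Residual sum of squares with a fitted analytically for fixed b
    (for Gaussian noise, minimizing the RSS maximizes the likelihood)."""
    basis = np.exp(-b * t)
    a_hat = (basis @ y) / (basis @ basis)
    r = y - a_hat * basis
    return r @ r

bs = np.linspace(0.1, 1.5, 141)
profile = np.array([profile_rss(b) for b in bs])
b_best = bs[np.argmin(profile)]  # maximum-likelihood estimate of b
```

The minimizer of the profile is the maximum-likelihood estimate; the flatness of the profile around it indicates how precisely `b` is identified, which is the kind of information used to rank candidate experiments.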
Proton-induced knockout reactions with polarized and unpolarized beams
NASA Astrophysics Data System (ADS)
Wakasa, T.; Ogata, K.; Noro, T.
2017-09-01
Proton-induced knockout reactions provide a direct means of studying the single particle or cluster structures of target nuclei. In addition, these knockout reactions are expected to play a unique role in investigations of the effects of the nuclear medium on nucleon-nucleon interactions as well as the properties of nucleons and mesons. However, due to the nature of hadron probes, these reactions can suffer significant disturbances from the nuclear surroundings and the quantitative theoretical treatment of such processes can also be challenging. In this article, we review the experimental and theoretical progress in this field, particularly focusing on the use of these reactions as a spectroscopic tool and as a way to examine the medium modification of nucleon-nucleon interactions. With regard to the former aspect, the review presents a semi-quantitative evaluation of these reactions based on existing experimental data. In terms of the latter point, we introduce a significant body of evidence that suggests, although does not conclusively prove, the existence of medium effects. In addition, this paper also provides information and comments on other related subjects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janney, Dawn E.; Papesch, Cynthia A.; Burkes, Douglas E.
This is not a typical External Report--It is a Handbook. No Abstract is involved. This includes both Parts 1 and 2. The Metallic Fuels Handbook summarizes currently available information about phases and phase diagrams, heat capacity, thermal expansion, and thermal conductivity of elements and alloys in the U-Pu-Zr-Np-Am-La-Ce-Pr-Nd system. Although many sections are reviews and updates of material in previous versions of the Handbook [1, 2], this revision is the first to include alloys with four or more elements. In addition to presenting information about materials properties, the handbook attempts to provide information about how well each property is known and how much variation exists between measurements. Although it includes some results from models, its primary focus is experimental data.
Privacy-preserving heterogeneous health data sharing.
Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila
2013-05-01
Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
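The generalize-then-perturb idea can be sketched with a minimal example (hypothetical generalized counts and the standard Laplace mechanism for counting queries; this is not the authors' full algorithm, which also generalizes raw records probabilistically):

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_release(counts, epsilon, sensitivity=1.0):
    """Add Laplace(sensitivity / epsilon) noise to each generalized
    count, satisfying epsilon-differential privacy for count queries."""
    scale = sensitivity / epsilon
    return counts + rng.laplace(0.0, scale, size=counts.shape)

# Hypothetical counts after generalization, e.g. patients per
# (age-range, diagnosis-group) cell.
counts = np.array([120.0, 45.0, 8.0, 0.0])
noisy = dp_release(counts, epsilon=1.0)
```

A smaller `epsilon` gives stronger privacy at the cost of larger noise; generalizing before noise addition shrinks the number of cells relative to a full contingency table, which is the design switch the abstract describes.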
Privacy-preserving heterogeneous health data sharing
Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila
2013-01-01
Objective Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. Methods The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. Results We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. Limitation The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Conclusions Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis. PMID:23242630
Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S
2018-06-11
Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and remains largely unexplored under the open-world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by simultaneously considering the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension of Deng's entropy to the open-world assumption (EDEOW) is proposed as a generalization of Deng's entropy; it degenerates to the Deng entropy in the closed world wherever necessary. To test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues remain in the current work: the necessary properties of a belief entropy under the open-world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion frame for sensor data fusion under uncertainty is.
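For reference, the closed-world Deng entropy that EDEOW generalizes can be computed directly from a mass function; the sketch below implements the standard definition (the open-world extension itself is not reproduced here):

```python
import math

def deng_entropy(masses):
    """Deng entropy of a mass function over a frame of discernment.
    `masses` maps each focal element (a frozenset) to its mass m(A):
    E_d(m) = -sum_A m(A) * log2( m(A) / (2**|A| - 1) ).
    """
    e = 0.0
    for A, m in masses.items():
        if m > 0:
            e -= m * math.log2(m / (2 ** len(A) - 1))
    return e

# Example: frame {a, b} with m({a}) = 0.4 and m({a, b}) = 0.6
m = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.6}
print(round(deng_entropy(m), 4))  # → 1.9219
```

Mass assigned to larger focal elements contributes more entropy (the `2**|A| - 1` term counts their nonempty subsets), which is what distinguishes Deng entropy from Shannon entropy on singletons.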
Integrating In Silico Resources to Map a Signaling Network
Liu, Hanqing; Beck, Tim N.; Golemis, Erica A.; Serebriiskii, Ilya G.
2013-01-01
The abundance of publicly available life science databases offers a wealth of information that can support interpretation of experimentally derived data and greatly enhance hypothesis generation. Protein interaction and functional networks are not simply new renditions of existing data: they provide the opportunity to gain insights into the specific physical and functional role a protein plays as part of the biological system. In this chapter, we describe different in silico tools that can quickly and conveniently retrieve data from existing data repositories and discuss how the available tools are best utilized for different purposes. While emphasizing protein-protein interaction databases (e.g., BioGrid and IntAct), we also introduce metasearch platforms such as STRING and GeneMANIA, pathway databases (e.g., BioCarta and Pathway Commons), text mining approaches (e.g., PubMed and Chilibot), and resources for drug-protein interactions, genetic information for model organisms and gene expression information based on microarray data mining. Furthermore, we provide a simple step-by-step protocol to building customized protein-protein interaction networks in Cytoscape, a powerful network assembly and visualization program, integrating data retrieved from these various databases. As we illustrate, generation of composite interaction networks enables investigators to extract significantly more information about a given biological system than utilization of a single database or sole reliance on primary literature. PMID:24233784
Isotropic neutrino flux from supernova explosions in the universe
NASA Astrophysics Data System (ADS)
Petkov, V. B.
2018-01-01
Neutrinos of all types are emitted from the gravitational collapse of massive star cores, and have been amassed in the Universe throughout the history of evolution of galaxies. The isotropic and stable flux of these neutrinos is a source of information on the spectra of neutrinos from individual supernovae and on their redshift distribution. The prospects for detecting the isotropic neutrino flux with the existing and upcoming experimental facilities and the current upper limits are discussed in this paper.
An Experimental Study of an Ultra-Mobile Vehicle for Off-Road Transportation.
1983-07-01
implemented. 2.2.3 Image Processing Algorithms. The ultimate goal of a vision system is to understand the content of a scene and to extract useful information from it. Four existing robot-vision systems are discussed: the General Motors CONSIGHT system, the UNIVISION system, the Westinghouse...
Design and Implementation of Telemedicine based on Java Media Framework
NASA Astrophysics Data System (ADS)
Xiong, Fengguang; Jia, Zhiyan
Based on an analysis of the importance of telemedicine and the problems it faces, this paper proposes a telemedicine system based on JMF and describes the design and implementation of capturing, compressing, storing, transmitting, receiving and playing medical audio and video. The telemedicine system can solve existing problems such as unshared medical information, high platform dependence and software incompatibility. Experimental data show that the system has low hardware cost, supports easy transmission and storage, and is portable and powerful.
Experimental climate information services in support of risk management
NASA Astrophysics Data System (ADS)
Webb, R. S.; Pulwarty, R. S.; Davidson, M. A.; Shea, E. E.; Nierenberg, C.; Dole, R. M.
2009-12-01
Climate variability and change impact national and local economies and environments. Developing and communicating climate and climate impacts information to inform decision making requires an understanding of context, societal objectives, and identification of factors important to the management of risk. Information sensitive to changing baselines or extremes is a critical emergent need. Meeting this need requires timely production and delivery of useful climate data, information and knowledge within familiar pathways. We identify key attributes for a climate service, and the network and infrastructure to develop and coordinate the resulting services, based on lessons learned in experimental implementations of climate services. "Service-type" activities already exist in many settings within federal, state, academic, and private sectors. The challenge for a climate service is to find effective implementation strategies for improving decision quality (not just meeting user needs). These strategies include upfront infrastructure investments, learning from event to event, coordinated innovation and diffusion, and highlighting common adaptation interests. Common to these strategies is the production of reliable and accessible data, analyses of emergent conditions and needs, and deliberative processes to identify appropriate entry points and uses for improved knowledge. Experimental climate services show that the development of well-structured paths among observations, projections, risk assessments and usable information requires sustained participation in “knowledge management systems” for early warning across temporal and spatial scales. Central to these systems is a collaborative framework between research and management to ensure anticipatory coordination between decision makers and information providers, allowing for emerging research findings and their attendant uncertainties to be considered.
Early warnings in this context are not simply forecasts or predictions but information on potential “futures” derived from past records, expert judgments, scenarios, and availability of mechanisms and capacity to use such information. Effective experimental climate services facilitate ongoing appraisals of knowledge needs for informing adaptation and mitigation options across sectors and across scenarios of near and longer-term future climates. Analyses show that climate service experiments drawing on data, applied research and prototyping functions of activities such as RISAs and RCCs are critical to developing the learning needed to inform and structure the flow of knowledge and understanding from problem definition and applications research to information delivery, use and evaluation. These activities effectively serve to inform services implementation when overarching cross-agency coordination, knowledge management, and innovation diffusion mechanisms such as afforded by NIDIS and the Coastal Services Center are engaged. We also demonstrate the importance of positioning climate research to engage and inform the decision-making process as society anticipates and responds to climate and its impacts.
Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy J.
2016-01-01
Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students’ competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new “experimentation assessments,” 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. PMID:27146159
Yunta, Jorge; Garcia-Pozuelo, Daniel; Diaz, Vicente; Olatunbosun, Oluremi
2018-02-06
Tires are a key sub-system of vehicles that have a big responsibility for comfort, fuel consumption and traffic safety. However, current tires are just passive rubber elements which do not contribute actively to improve the driving experience or vehicle safety. The lack of information from the tire during driving gives cause for developing an intelligent tire. Therefore, the aim of the intelligent tire is to monitor tire working conditions in real-time, providing useful information to other systems and becoming an active system. In this paper, tire tread deformation is measured to provide a strong experimental base with different experiments and test results by means of a tire fitted with sensors. Tests under different working conditions such as vertical load or slip angle have been carried out with an indoor tire test rig. The experimental data analysis shows the strong relation that exists between lateral force and the maximum tensile and compressive strain peaks when the tire is not working at the limit of grip. In the last section, an estimation system from experimental data has been developed and implemented in Simulink to show the potential of strain sensors for developing intelligent tire systems, obtaining as major results a signal to detect tire's loss of grip and estimations of the lateral friction coefficient.
Garcia-Pozuelo, Daniel; Diaz, Vicente; Olatunbosun, Oluremi
2018-01-01
Tires are a key sub-system of vehicles that have a big responsibility for comfort, fuel consumption and traffic safety. However, current tires are just passive rubber elements which do not contribute actively to improve the driving experience or vehicle safety. The lack of information from the tire during driving gives cause for developing an intelligent tire. Therefore, the aim of the intelligent tire is to monitor tire working conditions in real-time, providing useful information to other systems and becoming an active system. In this paper, tire tread deformation is measured to provide a strong experimental base with different experiments and test results by means of a tire fitted with sensors. Tests under different working conditions such as vertical load or slip angle have been carried out with an indoor tire test rig. The experimental data analysis shows the strong relation that exists between lateral force and the maximum tensile and compressive strain peaks when the tire is not working at the limit of grip. In the last section, an estimation system from experimental data has been developed and implemented in Simulink to show the potential of strain sensors for developing intelligent tire systems, obtaining as major results a signal to detect tire’s loss of grip and estimations of the lateral friction coefficient. PMID:29415513
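As an illustration of how strain peaks can feed an estimator, the sketch below fits a least-squares linear map from peak-strain data to lateral force. The numbers are invented for illustration only; the papers' estimator is implemented in Simulink from measured indoor test-rig data.

```python
import numpy as np

# Invented calibration data: peak tensile and compressive strain
# (microstrain) and the measured lateral force (N) at several slip
# angles -- illustrative values only.
peaks = np.array([
    [120.0,  -80.0],
    [240.0, -165.0],
    [355.0, -250.0],
    [470.0, -330.0],
])
Fy = np.array([400.0, 820.0, 1190.0, 1610.0])

# Least-squares linear map from the two strain peaks to lateral force.
A = np.hstack([peaks, np.ones((len(Fy), 1))])
coef, *_ = np.linalg.lstsq(A, Fy, rcond=None)

def estimate_fy(tensile_peak, compressive_peak):
    """Estimate lateral force (N) from the two strain peaks."""
    return coef[0] * tensile_peak + coef[1] * compressive_peak + coef[2]
```

A linear map like this only holds away from the limit of grip, consistent with the abstracts' observation that the strain-force relation breaks down when the tire starts to slide.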
An affine projection algorithm using grouping selection of input vectors
NASA Astrophysics Data System (ADS)
Shin, JaeWook; Kong, NamWoong; Park, PooGyeon
2011-10-01
This paper presents an affine projection algorithm (APA) that uses grouping-based selection of input vectors. To improve the performance of the conventional APA, the proposed algorithm adjusts the number of input vectors using two procedures: a grouping procedure and a selection procedure. In the grouping procedure, input vectors that carry overlapping information for the update are grouped using the normalized inner product. Then, in the selection procedure, the few input vectors that carry enough information for the coefficient update are selected using the steady-state mean square error (MSE). Finally, the filter coefficients are updated using the selected input vectors. The experimental results show that the proposed algorithm achieves smaller steady-state estimation errors than the existing algorithms.
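The coefficient update that follows the selection step is the standard regularized APA update; a minimal sketch (with the grouping and selection procedures omitted, so the matrix `X` below stands in for the already-selected input vectors):

```python
import numpy as np

def apa_update(w, X, d, mu=0.5, delta=1e-6):
    """One affine projection update.
    w: (N,) filter coefficients; X: (N, P) matrix whose columns are the
    P selected input vectors; d: (P,) desired outputs."""
    e = d - X.T @ w                           # a-priori errors
    G = X.T @ X + delta * np.eye(X.shape[1])  # regularized Gram matrix
    return w + mu * X @ np.linalg.solve(G, e)

# Identify a 4-tap filter from P = 2 (selected) input vectors.
rng = np.random.default_rng(1)
w_true = rng.standard_normal(4)
X = rng.standard_normal((4, 2))
d = X.T @ w_true
w = np.zeros(4)
for _ in range(50):
    w = apa_update(w, X, d)
```

Reducing the number of columns of `X` is what the grouping and selection procedures accomplish: fewer, more informative input vectors lower both the computational cost of the Gram-matrix solve and the steady-state error.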
Selecting Models for Measuring Change When True Experimental Conditions Do Not Exist.
ERIC Educational Resources Information Center
Fortune, Jim C.; Hutson, Barbara A.
1984-01-01
Measuring change when true experimental conditions do not exist is a difficult process. This article reviews the artifacts of change measurement in evaluations and quasi-experimental designs, delineates considerations in choosing a model to measure change under nonideal conditions, and suggests ways to organize models to facilitate selection.…
Saccharomyces genome database informs human biology
Skrzypek, Marek S; Nash, Robert S; Wong, Edith D; MacPherson, Kevin A; Karra, Kalpana; Binkley, Gail; Simison, Matt; Miyasato, Stuart R
2018-01-01
Abstract The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org) is an expertly curated database of literature-derived functional information for the model organism budding yeast, Saccharomyces cerevisiae. SGD constantly strives to synergize new types of experimental data and bioinformatics predictions with existing data, and to organize them into a comprehensive and up-to-date information resource. The primary mission of SGD is to facilitate research into the biology of yeast and to provide this wealth of information to advance, in many ways, research on other organisms, even those as evolutionarily distant as humans. To build such a bridge between biological kingdoms, SGD is curating data regarding yeast-human complementation, in which a human gene can successfully replace the function of a yeast gene, and/or vice versa. These data are manually curated from published literature, made available for download, and incorporated into a variety of analysis tools provided by SGD. PMID:29140510
A Probabilistic Palimpsest Model of Visual Short-term Memory
Matthey, Loic; Bays, Paul M.; Dayan, Peter
2015-01-01
Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204
A probabilistic palimpsest model of visual short-term memory.
Matthey, Loic; Bays, Paul M; Dayan, Peter
2015-01-01
Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ.
ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.
May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk
2009-05-04
The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.
Huang, Yu-An; You, Zhu-Hong; Chen, Xing
2018-01-01
Drug-Target Interactions (DTI) play a crucial role in discovering new drug candidates and finding new proteins to target for drug development. Although the number of detected DTI obtained by high-throughput techniques has been increasing, the number of known DTI is still limited. On the other hand, the experimental methods for detecting the interactions among drugs and proteins are costly and inefficient. Therefore, computational approaches for predicting DTI have drawn increasing attention in recent years. In this paper, we report a novel computational model for predicting DTI using an extremely randomized trees model and protein amino acid information. More specifically, the protein sequence is represented as a Pseudo Substitution Matrix Representation (Pseudo-SMR) descriptor in which the influence of biological evolutionary information is retained. For the representation of drug molecules, a novel fingerprint feature vector is utilized to describe its substructure information. Then the DTI pair is characterized by concatenating the two vector spaces of protein sequence and drug substructure. Finally, the proposed method is explored for predicting the DTI on four benchmark datasets: Enzyme, Ion Channel, GPCRs and Nuclear Receptor. The experimental results demonstrate that this method achieves promising prediction accuracies of 89.85%, 87.87%, 82.99% and 81.67%, respectively. For further evaluation, we compared the performance of the Extremely Randomized Trees model with that of the state-of-the-art Support Vector Machine classifier, compared the proposed model with existing computational models, and confirmed 15 potential drug-target interactions by searching existing databases. The experimental results show that the proposed method is feasible and promising for predicting drug-target interactions for new drug candidate screening based on sizeable features.
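A minimal sketch of the classifier stage, using scikit-learn's ExtraTreesClassifier on synthetic stand-ins for the Pseudo-SMR and fingerprint descriptors (the feature construction and labels here are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(3)

# Synthetic stand-ins: each row concatenates a hypothetical protein
# Pseudo-SMR vector (20 dims) with a drug fingerprint (16 dims).
n, d_prot, d_drug = 200, 20, 16
X = rng.standard_normal((n, d_prot + d_drug))
# Synthetic interaction labels from a simple linear rule.
y = (X[:, :5].sum(axis=1) > 0).astype(int)

clf = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y)
train_acc = clf.score(X, y)
```

Extremely randomized trees differ from random forests in that both the split feature and the split threshold are drawn at random, which reduces variance and makes training fast on the concatenated descriptor vectors.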
Abdulrehman, Dário; Monteiro, Pedro Tiago; Teixeira, Miguel Cacho; Mira, Nuno Pereira; Lourenço, Artur Bastos; dos Santos, Sandra Costa; Cabrito, Tânia Rodrigues; Francisco, Alexandre Paulo; Madeira, Sara Cordeiro; Aires, Ricardo Santos; Oliveira, Arlindo Limede; Sá-Correia, Isabel; Freitas, Ana Teresa
2011-01-01
The YEAst Search for Transcriptional Regulators And Consensus Tracking (YEASTRACT) information system (http://www.yeastract.com) was developed to support the analysis of transcription regulatory associations in Saccharomyces cerevisiae. Last updated in June 2010, this database contains over 48 200 regulatory associations between transcription factors (TFs) and target genes, including 298 specific DNA-binding sites for 110 characterized TFs. All regulatory associations stored in the database were revisited, and detailed information on the experimental evidence that sustains those associations was added and classified as direct or indirect. This new data, gathered in response to the requests of YEASTRACT users, allows users to restrict their queries to subsets of the data based on whether experimental evidence exists for the direct action of the TFs in the promoter region of their target genes. Another new feature of this release is the availability of all data through a machine-readable web service interface. Users are no longer restricted to the set of queries made available through the existing web interface and can use the web service interface to query, retrieve and exploit the YEASTRACT data with their own implementations of additional functionality. The YEASTRACT information system is further complemented with several computational tools that facilitate the use of the curated data when answering a number of important biological questions. Since its first release in 2006, YEASTRACT has been extensively used by hundreds of researchers from all over the world. We expect that by making the new data and services available, the system will continue to be instrumental for yeast biologists and systems biology researchers. PMID:20972212
A virtual experimenter to increase standardization for the investigation of placebo effects.
Horing, Bjoern; Newsome, Nathan D; Enck, Paul; Babu, Sabarish V; Muth, Eric R
2016-07-18
Placebo effects are mediated by expectancy, which is highly influenced by psychosocial factors of a treatment context. These factors are difficult to standardize. Furthermore, dedicated placebo research often necessitates single-blind deceptive designs in which biases are easily introduced. We propose a study protocol employing a virtual experimenter - a computer program designed to deliver treatment and instructions - for the purpose of standardization and reduction of biases when investigating placebo effects. To evaluate the virtual experimenter's efficacy in inducing placebo effects via expectancy manipulation, we suggest a partially blinded, deceptive design with a baseline/retest pain protocol (hand immersions in a hot water bath). Between immersions, participants will receive an (actually inert) medication. Instructions pertaining to the medication will be delivered by one of three metaphors: the virtual experimenter, a human experimenter, or an audio/text presentation (predictor "Metaphor"). The second predictor consists of falsely informing participants that the medication is an effective pain killer, or correctly informing them that it is, in fact, inert (predictor "Instruction"). Analysis will be performed with hierarchical linear modelling, with a sample size of N = 50. Results from two pilot studies are presented that indicate the viability of the pain protocol (N = 33) and of the virtual experimenter software and placebo manipulation (N = 48). It will be challenging to establish full comparability between all metaphors used for instruction delivery and to account for participant differences in acceptance of their virtual interaction partner. Once established, the presence of placebo effects would suggest that the virtual experimenter exhibits sufficient cues to be perceived as a social agent.
It could consequently provide a convenient platform to investigate effects of experimenter behavior or other experimenter characteristics, e.g., sex, age, race/ethnicity or professional status. More general applications are possible, for example in psychological research (such as bias research) or in virtual reality research. Potential applications also exist for standardizing clinical research by documenting and communicating the instructions used in clinical trials.
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
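As a hedged illustration of the Noisy-OR idea described above — an interaction receives a high prior if at least one information source supports it, with each source discounted by a reliability — a minimal sketch (the support and reliability values are illustrative assumptions, not values from the paper):

```python
# Noisy-OR consensus over information sources: the edge prior is the
# probability that at least one source truly supports the interaction.
# Each source reports a support probability p, discounted by reliability q.

def noisy_or_prior(supports, reliabilities):
    prob_no_support = 1.0
    for p, q in zip(supports, reliabilities):
        prob_no_support *= 1.0 - q * p  # this source fails to establish the edge
    return 1.0 - prob_no_support

# Example: three sources (e.g., a pathway database, GO term similarity,
# protein domain data) with assumed reliabilities.
prior = noisy_or_prior([0.9, 0.4, 0.0], [0.8, 0.6, 0.9])
```

The multiplicative form makes the prior pick up the strongest support among sources while still accumulating weaker evidence, which matches the abstract's description of the Noisy-OR model.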
CELDA – an ontology for the comprehensive representation of cells in complex systems
2013-01-01
Background The need for detailed description and modeling of cells drives the continuous generation of large and diverse datasets. Unfortunately, there exists no systematic and comprehensive way to organize these datasets and their information. CELDA (Cell: Expression, Localization, Development, Anatomy) is a novel ontology for associating primary experimental data and derived knowledge with various types of cells of organisms. Results CELDA is a structure that can help to categorize cell types based on species, anatomical localization, subcellular structures, developmental stages and origin. It targets cells in vitro as well as in vivo. Instead of developing a novel ontology from scratch, we carefully designed CELDA in such a way that existing ontologies were integrated as much as possible, and only minimal extensions were performed to cover those classes and areas not present in any existing model. Currently, ten existing ontologies and models are linked to CELDA through the top-level ontology BioTop. Together with 15,439 newly created classes, CELDA contains more than 196,000 classes and 233,670 relationship axioms. CELDA is primarily used as a representational framework for modeling, analyzing and comparing cells within and across species in CellFinder, a web-based data repository on cells (http://cellfinder.org). Conclusions CELDA can semantically link diverse types of information about cell types. It has been integrated within the research platform CellFinder, where, as an example, it relates cell types from liver and kidney during development on the one hand and anatomical locations in humans on the other, integrating information on all spatial and temporal stages. CELDA is available from the CellFinder website: http://cellfinder.org/about/ontology. PMID:23865855
Are selective serotonin reuptake inhibitors safe for drivers? What is the evidence?
Ravera, Silvia; Ramaekers, Johannes G; de Jong-van den Berg, Lolkje T W; de Gier, Johan J
2012-05-01
Selective serotonin reuptake inhibitors (SSRIs) are widely used medications to treat several psychiatric diseases and, above all, depression. They seem to be as effective as older antidepressants but have a different adverse effect profile. Despite their favorable safety profile, little is known about their influence on traffic safety. The aim of this study was to conduct a literature review summarizing the current evidence on the role of SSRIs in traffic safety, particularly concerning undesirable effects that could potentially impair fitness to drive, experimental and pharmacoepidemiologic studies on driving impairment, 2 existing categorization systems for driving-impairing medications, and the European legislative procedures for assessing fitness to drive before issuing a driver's license and for driving under the influence of medicines. The article search was performed in the following electronic databases: MEDLINE, PsycINFO, ScienceDirect, and SafetyLit. The English-language scientific literature was searched using key words such as SSRIs and psychomotor performance, car crash or traffic accident, and adverse effects. For inclusion in this review, papers had to be full-text articles, refer to possible driving-related adverse effects, and be experimental or pharmacoepidemiologic studies on SSRIs and traffic accident risks. No restrictions concerning publication year were applied. Ten articles were selected as background information on driving-related adverse effects, and 15 articles were selected regarding experimental and pharmacoepidemiologic work. Regarding SSRI adverse effects, the most frequently reported undesirable effects relevant to driving impairment were anxiety, agitation, sleep disturbances, headache, increased risk of suicidal behavior, and deliberate self-harm.
Regarding the remaining issues addressed in this article, inconsistencies were found between the outcomes of the selected experimental and epidemiologic studies and between the 2 existing categorization systems under evaluation. Some pitfalls of the current legislative scenario were identified as well. Based on the current evidence, it was concluded that more experimental and epidemiologic research is needed to elucidate the relationship between SSRI use and traffic safety. Furthermore, a revision of the existing categorization systems and harmonized European legislation in the field of medication use and driving were highly recommended. Copyright © 2012 Elsevier HS Journals, Inc. All rights reserved.
A review of unsteady turbulent boundary-layer experiments
NASA Technical Reports Server (NTRS)
Carr, L. W.
1981-01-01
The essential results of a comprehensive review of existing unsteady turbulent boundary-layer experiments are presented. Different types of unsteady flow facilities are described, and the related unsteady turbulent boundary-layer experiments are cataloged and discussed. The measurements that were obtained in the various experiments are described, and a complete list of experimental results is presented. All the experiments that measured instantaneous values of velocity, turbulence intensity, or turbulent shear stress are identified, and the availability of digital data is indicated. The results of the experiments are analyzed, and several significant trends are identified. An assessment of the available data is presented, delineating gaps in the existing data, and indicating where new or extended information is needed. Guidelines for future experiments are included. Previously announced in STAR as N81-29382
Research on multi-user encrypted search scheme in cloud environment
NASA Astrophysics Data System (ADS)
Yu, Zonghua; Lin, Sui
2017-05-01
Aiming at the problems of existing multi-user encrypted search schemes in the cloud computing environment, a basic multi-user encrypted search scheme is first proposed and then extended to support anonymous hierarchical management of authority. Compared with most existing schemes, this scheme protects not only keyword information but also user identity privacy; at the same time, data owners, rather than the cloud server, directly control user query permissions. In addition, a special query key generation rule enables hierarchical management of users' query permissions. Security analysis shows that the scheme is secure, and performance analysis and experimental data show that it is practicable.
Calculation of the room-temperature shapes of unsymmetric laminates
NASA Technical Reports Server (NTRS)
Hyer, M. W.
1981-01-01
A theory explaining the characteristics of the cured shapes of unsymmetric laminates is presented. The theory is based on an extension of classical lamination theory that accounts for geometric nonlinearities. A Rayleigh-Ritz approach to minimizing the total potential energy is used to obtain quantitative information regarding the room-temperature shapes of square T300/5208 graphite-epoxy laminates with (0_2/90_2)_T and (0_4/90_4)_T stacking sequences. It is shown that, depending on the thickness of the laminate and the side length of the square, the saddle shape configuration is actually unstable. For values of length and thickness that render the saddle shape unstable, it is shown that two stable cylindrical shapes exist. The predictions of the theory are compared with existing experimental data.
Pyviko: an automated Python tool to design gene knockouts in complex viruses with overlapping genes.
Taylor, Louis J; Strebel, Klaus
2017-01-07
Gene knockouts are a common tool used to study gene function in various organisms. However, designing gene knockouts is complicated in viruses, which frequently contain sequences that code for multiple overlapping genes. Designing mutants that can be traced by the creation of new or elimination of existing restriction sites further compounds the difficulty in experimental design of knockouts of overlapping genes. While software is available to rapidly identify restriction sites in a given nucleotide sequence, no existing software addresses experimental design of mutations involving multiple overlapping amino acid sequences in generating gene knockouts. Pyviko performed well on a test set of over 240,000 gene pairs collected from viral genomes deposited in the National Center for Biotechnology Information Nucleotide database, identifying a point mutation which added a premature stop codon within the first 20 codons of the target gene in 93.2% of all tested gene-overprinted gene pairs. This shows that Pyviko can be used successfully in a wide variety of contexts to facilitate the molecular cloning and study of viral overprinted genes. Pyviko is an extensible and intuitive Python tool for designing knockouts of overlapping genes. Freely available as both a Python package and a web-based interface ( http://louiejtaylor.github.io/pyViKO/ ), Pyviko simplifies the experimental design of gene knockouts in complex viruses with overlapping genes.
Two color holographic interferometry for microgravity application
NASA Technical Reports Server (NTRS)
Trolinger, James D.; Weber, David C.
1995-01-01
Holographic interferometry is a primary candidate for determining temperature and concentration in crystal growth experiments designed for space. The method measures refractive index changes within the fluid of an experimental test cell resulting from temperature and/or concentration changes. When the refractive index changes are caused by simultaneous temperature and concentration changes, the contributions of the two effects cannot be separated by single wavelength interferometry. By using two wavelengths, however, two independent interferograms can provide the additional independent equation required to determine the two unknowns. There is no other technique available that provides this type of information. The primary objectives of this effort were to experimentally verify the mathematical theory of two color holographic interferometry (TCHI) and to determine the practical value of this technique for space application. In the foregoing study, the theory of TCHI has been tested experimentally over a range of interest for materials processing in space where measurements of temperature and concentration in a solution are required. New techniques were developed and applied to stretch the limits beyond what could be done with existing procedures. The study resulted in the production of one of the most advanced, enhanced sensitivity holographic interferometers in existence. The interferometric measurements made at MSFC represent what is believed to be the most accurate holographic interferometric measurements made in a fluid to date. The tests have provided an understanding of the limitations of the technique in practical use.
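The two-unknown argument above reduces, at each measurement point, to a 2x2 linear system: at wavelength i, the refractive index change is dn_i = (dn/dT)_i * dT + (dn/dC)_i * dC, and two wavelengths supply the two independent equations. A minimal sketch (the coefficient values used in the example are illustrative, not from the study):

```python
# Separate temperature and concentration contributions from two-wavelength
# interferometry by solving the 2x2 system with Cramer's rule.
# dn1, dn2: measured refractive index changes at the two wavelengths.
# aT_i = dn/dT and aC_i = dn/dC at wavelength i (material properties).

def separate_effects(dn1, dn2, aT1, aC1, aT2, aC2):
    det = aT1 * aC2 - aC1 * aT2
    if abs(det) < 1e-15:
        raise ValueError("coefficients not independent: cannot separate effects")
    dT = (dn1 * aC2 - aC1 * dn2) / det  # temperature change
    dC = (aT1 * dn2 - dn1 * aT2) / det  # concentration change
    return dT, dC
```

The separation is only well conditioned when the dispersion of the two coefficients differs appreciably between the wavelengths, which is why the choice of the two colors matters in practice.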
A new pattern associative memory model for image recognition based on Hebb rules and dot product
NASA Astrophysics Data System (ADS)
Gao, Mingyue; Deng, Limiao; Wang, Yanjiang
2018-04-01
A great number of associative memory models have been proposed in recent years to realize information storage and retrieval inspired by the human brain. However, there is still much room for improvement in those models. In this paper, we extend a binary pattern associative memory model to accomplish real-world image recognition. The learning process is based on the fundamental Hebb rules, and retrieval is implemented by a normalized dot product operation. Our proposed model can not only fulfill rapid memory storage and retrieval of visual information but also supports incremental learning without destroying previously learned information. Experimental results demonstrate that our model outperforms the existing Self-Organizing Incremental Neural Network (SOINN) and Back Propagation Neural Network (BPNN) in recognition accuracy and time efficiency.
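A minimal sketch of the retrieval idea described above — patterns stored incrementally and recalled by maximal normalized dot product (cosine similarity). The class and method names are illustrative, not taken from the paper:

```python
import math

class AssociativeMemory:
    """Hebb-style pattern storage with normalized dot product retrieval."""

    def __init__(self):
        self.memory = []  # (label, pattern) pairs accumulated incrementally

    def learn(self, label, pattern):
        # Incremental learning: adding a pattern never alters stored ones.
        self.memory.append((label, list(pattern)))

    def recall(self, probe):
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
            return dot / norm if norm else 0.0
        # Retrieve the label of the stored pattern most similar to the probe.
        return max(self.memory, key=lambda item: cosine(probe, item[1]))[0]
```

Because each stored pattern is kept separately, new classes can be added at any time without retraining, which is the incremental-learning property the abstract emphasizes.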
Extracting 3d Semantic Information from Video Surveillance System Using Deep Learning
NASA Astrophysics Data System (ADS)
Zhang, J. S.; Cao, J.; Mao, B.; Shen, D. Q.
2018-04-01
At present, intelligent video analysis technology has been widely used in various fields. Object tracking is an important part of intelligent video surveillance, but traditional target tracking based on the pixel coordinate system of images still has some unavoidable problems: pixel-based tracking cannot reflect the real position information of targets, and it is difficult to track objects across scenes. Based on an analysis of Zhengyou Zhang's camera calibration method, this paper presents a target tracking method based on the target's space coordinate system, obtained by converting the target's 2-D pixel coordinates into 3-D coordinates. The experimental results show that our method can restore the real position change information of targets well and can also accurately recover the trajectory of the target in space.
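For targets moving on a plane (world Z = 0), the 2-D to 3-D conversion built on Zhang-style calibration amounts to inverting the ground-plane homography H = K[r1 r2 t]. A hedged sketch — the intrinsics and pose in the example are hypothetical values, and a real system would also undistort the pixel first:

```python
# Back-project a pixel to world ground-plane coordinates (Z = 0) by
# inverting the homography H = K [r1 r2 t] from camera calibration.

def mat3_inv(M):
    a, b, c = M[0]; d, e, f = M[1]; g, h, i = M[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def mat3_vec(M, v):
    return [sum(M[r][k] * v[k] for k in range(3)) for r in range(3)]

def pixel_to_ground(K, r1, r2, t, u, v):
    # Homography columns are K*r1, K*r2, K*t (rotation's third column drops
    # out because Z = 0 on the ground plane).
    cols = [mat3_vec(K, col) for col in (r1, r2, t)]
    H = [[cols[j][i] for j in range(3)] for i in range(3)]
    w = mat3_vec(mat3_inv(H), [u, v, 1.0])
    return w[0] / w[2], w[1] / w[2]  # world (X, Y) on the ground plane
```

With per-frame (X, Y) positions in world coordinates, trajectories from different cameras share one coordinate system, which is what enables cross-scene tracking.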
The Use of Empirical Data Sources in HRA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruce Hallbert; David Gertman; Julie Marble
This paper presents a review of available information related to human performance to support Human Reliability Analysis (HRA) performed for nuclear power plants (NPPs). A number of data sources are identified as potentially useful. These include NPP licensee event reports (LERs), augmented inspection team (AIT) reports, operator requalification data, results from the literature in experimental psychology, and the Aviation Safety Reporting System (ASRS). The paper discusses how utilizing such information improves our capability to model and quantify human performance. In particular, the paper discusses how information related to performance shaping factors (PSFs) can be extracted from empirical data to determine their effect sizes, their relative effects, and their interactions. The paper concludes that appropriate use of existing sources can help address some of the important issues we are currently facing in HRA.
A comparison of SAR ATR performance with information theoretic predictions
NASA Astrophysics Data System (ADS)
Blacknell, David
2003-09-01
Performance assessment of automatic target detection and recognition algorithms for SAR systems (or indeed any other sensors) is essential if the military utility of the system / algorithm mix is to be quantified. This is a relatively straightforward task if extensive trials data from an existing system is used. However, a crucial requirement is to assess the potential performance of novel systems as a guide to procurement decisions. This task is no longer straightforward since a hypothetical system cannot provide experimental trials data. QinetiQ has previously developed a theoretical technique for classification algorithm performance assessment based on information theory. The purpose of the study presented here has been to validate this approach. To this end, experimental SAR imagery of targets has been collected using the QinetiQ Enhanced Surveillance Radar to allow algorithm performance assessments as a number of parameters are varied. In particular, performance comparisons can be made for (i) resolutions up to 0.1m, (ii) single channel versus polarimetric (iii) targets in the open versus targets in scrubland and (iv) use versus non-use of camouflage. The change in performance as these parameters are varied has been quantified from the experimental imagery whilst the information theoretic approach has been used to predict the expected variation of performance with parameter value. A comparison of these measured and predicted assessments has revealed the strengths and weaknesses of the theoretical technique as will be discussed in the paper.
A global parallel model based design of experiments method to minimize model output uncertainty.
Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E
2012-03-01
Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
Trickey, Heather; Thomson, Gill; Grant, Aimee; Sanders, Julia; Mann, Mala; Murphy, Simon; Paranjothy, Shantini
2018-01-01
The World Health Organisation guidance recommends breastfeeding peer support (BFPS) as part of a strategy to improve breastfeeding rates. In the UK, BFPS is supported by National Institute for Health and Care Excellence guidance, and a variety of models are in use. The experimental evidence for BFPS in developed countries is mixed, and traditional methods of systematic review are ill-equipped to explore heterogeneity, complexity, and context influences on effectiveness. This review aimed to enhance learning from the experimental evidence base for one-to-one BFPS intervention. Principles of realist review were applied to intervention case studies associated with published experimental studies. The review aimed (a) to explore heterogeneity in theoretical underpinnings and intervention design for one-to-one BFPS intervention; (b) to inform design decisions by identifying transferable lessons developed from cross-case comparison of context-mechanism-outcome relationships; and (c) to inform evaluation design by identifying context-mechanism-outcome relationships associated with experimental conditions. Findings highlighted poor attention to intervention theory and considerable heterogeneity in BFPS intervention design. Transferable mid-range theories to inform design emerged, which could be grouped into seven categories: (a) congruence with local infant feeding norms, (b) integration with the existing system of health care, (c) overcoming practical and emotional barriers to access, (d) ensuring friendly, competent, and proactive peers, (e) facilitating authentic peer-mother interactions, (f) motivating peers to ensure positive within-intervention amplification, and (g) ensuring positive legacy and maintenance of gains. There is a need to integrate realist principles into evaluation design to improve our understanding of what forms of BFPS work, for whom and under what circumstances. © 2017 John Wiley & Sons Ltd.
Integrating mean and variance heterogeneities to identify differentially expressed genes.
Ouyang, Weiwei; An, Qiang; Zhao, Jinying; Qin, Huaizhen
2016-12-06
In functional genomics studies, tests on mean heterogeneity have been widely employed to identify differentially expressed genes with distinct mean expression levels under different experimental conditions. Variance heterogeneity (i.e., the difference between condition-specific variances) of gene expression levels is simply neglected or calibrated for as an impediment. The mean heterogeneity in the expression level of a gene reflects one aspect of its distribution alteration, and variance heterogeneity induced by condition change may reflect another. Change in condition may alter both the mean and some higher-order characteristics of the distributions of expression levels of susceptible genes. In this report, we put forth the concept of mean-variance differentially expressed (MVDE) genes, whose expression means and variances are sensitive to the change in experimental condition. We mathematically proved the null independence of existing mean heterogeneity tests and variance heterogeneity tests. Based on this independence, we proposed an integrative mean-variance test (IMVT) that combines gene-wise mean heterogeneity and variance heterogeneity induced by condition change. The IMVT outperformed its competitors under comprehensive simulations of normality and Laplace settings. For moderate samples, the IMVT well controlled type I error rates, as did the existing mean heterogeneity tests (the Welch t test (WT) and the moderated Welch t test (MWT)) and the procedure of separate tests on mean and variance heterogeneities (SMVT), but the likelihood ratio test (LRT) severely inflated type I error rates. In the presence of variance heterogeneity, the IMVT appeared noticeably more powerful than all the valid mean heterogeneity tests. Application to the gene profiles of peripheral circulating B cells raised solid evidence of informative variance heterogeneity.
After adjusting for background data structure, the IMVT replicated previous discoveries and identified novel experiment-wide significant MVDE genes. Our results indicate a tremendous potential gain from integrating informative variance heterogeneity after adjusting for global confounders and background data structure. The proposed integrative test better summarizes the impacts of condition change on the expression distributions of susceptible genes than do the existing competitors. Therefore, particular attention should be paid to explicitly exploiting the variance heterogeneity induced by condition change in functional genomics analysis.
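The null independence of the two component tests is what permits a Fisher-style combination. A hedged sketch under normal approximations (large samples assumed; the paper's actual IMVT statistic may differ in detail):

```python
import math

def _mean_var(x):
    n = len(x)
    m = sum(x) / n
    return n, m, sum((a - m) ** 2 for a in x) / (n - 1)

def normal_sf(z):  # standard normal survival function
    return 0.5 * math.erfc(z / math.sqrt(2))

def mean_het_p(x, y):
    # Welch-type mean heterogeneity statistic, normal approximation
    nx, mx, vx = _mean_var(x)
    ny, my, vy = _mean_var(y)
    t = (mx - my) / math.sqrt(vx / nx + vy / ny)
    return 2 * normal_sf(abs(t))

def var_het_p(x, y):
    # variance heterogeneity via log variance ratio, normal approximation
    nx, _, vx = _mean_var(x)
    ny, _, vy = _mean_var(y)
    z = math.log(vx / vy) / math.sqrt(2 / (nx - 1) + 2 / (ny - 1))
    return 2 * normal_sf(abs(z))

def integrative_p(x, y):
    # Under the null the two p-values are independent, so Fisher's statistic
    # -2(ln p1 + ln p2) is chi-square with 4 df; for 4 df the survival
    # function has the closed form exp(-s/2) * (1 + s/2).
    p1, p2 = mean_het_p(x, y), var_het_p(x, y)
    s = -2 * (math.log(p1) + math.log(p2))
    return math.exp(-s / 2) * (1 + s / 2)
```

A gene whose expression shifts in mean, in variance, or in both between conditions will drive at least one component p-value down, so the combined test picks up MVDE genes that a mean-only test can miss.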
Bistable metamaterial for switching and cascading elastic vibrations
Foehr, André; Daraio, Chiara
2017-01-01
The realization of acoustic devices analogous to electronic systems, like diodes, transistors, and logic elements, suggests the potential use of elastic vibrations (i.e., phonons) in information processing, for example, in advanced computational systems, smart actuators, and programmable materials. Previous experimental realizations of acoustic diodes and mechanical switches have used nonlinearities to break transmission symmetry. However, existing solutions require operation at different frequencies or involve signal conversion in the electronic or optical domains. Here, we show an experimental realization of a phononic transistor-like device using geometric nonlinearities to switch and amplify elastic vibrations, via magnetic coupling, operating at a single frequency. By cascading this device in a tunable mechanical circuit board, we realize the complete set of mechanical logic elements and interconnect selected ones to execute simple calculations. PMID:28416663
Experimental EPR-steering using Bell-local states
NASA Astrophysics Data System (ADS)
Saunders, D. J.; Jones, S. J.; Wiseman, H. M.; Pryde, G. J.
2010-11-01
The concept of `steering' was introduced in 1935 by Schrödinger as a generalization of the EPR (Einstein-Podolsky-Rosen) paradox. It has recently been formalized as a quantum-information task with arbitrary bipartite states and measurements, for which the existence of entanglement is necessary but not sufficient. Previous experiments in this area have been restricted to an approach that followed the original EPR argument in considering only two different measurement settings per side. Here we demonstrate experimentally that EPR-steering occurs for mixed entangled states that are Bell local (that is, that cannot possibly demonstrate Bell non-locality). Unlike the case of Bell inequalities, increasing the number of measurement settings beyond two-we use up to six-significantly increases the robustness of the EPR-steering phenomenon to noise.
BPP: a sequence-based algorithm for branch point prediction.
Zhang, Qing; Fan, Xiaodan; Wang, Yejun; Sun, Ming-An; Shao, Jianlin; Guo, Dianjing
2017-10-15
Although high-throughput sequencing methods have been proposed to identify splicing branch points in the human genome, these methods can only detect a small fraction of branch points, subject to the sequencing depth, experimental cost and the expression level of the mRNA. An accurate computational model for branch point prediction is therefore an ongoing objective in human genome research. We here propose a novel branch point prediction algorithm that utilizes information on the branch point sequence and the polypyrimidine tract. Using experimentally validated data, we demonstrate that our proposed method outperforms existing methods. Availability and implementation: https://github.com/zhqingit/BPP. Contact: djguo@cuhk.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
Optic cup segmentation from fundus images for glaucoma diagnosis.
Hu, Man; Zhu, Chenghao; Li, Xiaoxing; Xu, Yongli
2017-01-02
Glaucoma is a serious disease that can cause complete, permanent blindness, and its early diagnosis is very difficult. In recent years, computer-aided screening and diagnosis of glaucoma has made considerable progress. The optic cup segmentation from fundus images is an extremely important part of the computer-aided screening and diagnosis of glaucoma. This paper presented an automatic optic cup segmentation method that used both color difference information and vessel bends information from fundus images to determine the optic cup boundary. During the implementation of this algorithm, not only were the locations of the 2 types of information points used, but also the confidences of the information points were evaluated. In this way, the information points with higher confidence levels contributed more to the determination of the final cup boundary. The proposed method was evaluated using a public database of fundus images. The experimental results demonstrated that the cup boundaries obtained by the proposed method were more consistent with those identified by ophthalmologists than the boundaries produced by existing methods.
Secret Sharing of a Quantum State.
Lu, He; Zhang, Zhen; Chen, Luo-Kan; Li, Zheng-Da; Liu, Chang; Li, Li; Liu, Nai-Le; Ma, Xiongfeng; Chen, Yu-Ao; Pan, Jian-Wei
2016-07-15
Secret sharing of a quantum state, or quantum secret sharing, in which a dealer wants to share a certain amount of quantum information with a few players, has wide applications in quantum information. The critical criterion in a threshold secret sharing scheme is confidentiality: with fewer than the designated number of players, no information can be recovered. Furthermore, in a quantum scenario, one additional critical criterion exists: the capability of sharing entangled and unknown quantum information. Here, by employing a six-photon entangled state, we demonstrate a quantum threshold scheme, where the shared quantum secrecy can be efficiently reconstructed with a state fidelity as high as 93%. By observing that any one or two parties cannot recover the secrecy, we show that our scheme meets the confidentiality criterion. Meanwhile, we also demonstrate that entangled quantum information can be shared and recovered via our setting, which shows that our implemented scheme is fully quantum. Moreover, our experimental setup can be treated as a decoding circuit of the five-qubit quantum error-correcting code with two erasure errors.
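The threshold-confidentiality criterion has a well-known classical counterpart, Shamir secret sharing, which may help fix intuition. The sketch below is a (2,3) classical scheme over a prime field; it says nothing about sharing entangled or unknown quantum states, which is the paper's actual contribution.

```python
# Classical analogue only: Shamir (k, n) threshold sharing over GF(P).
# Any k of the n shares reconstruct the secret; fewer reveal nothing.
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is modulo P

def split(secret, k=2, n=3):
    """Split `secret` into n shares of a degree-(k-1) random polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse, since P is prime
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any single share is a point on a random line through the secret and is therefore uniformly distributed, which is the classical version of the confidentiality criterion.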
NASA Astrophysics Data System (ADS)
Fu, Z.; Qin, Q.; Wu, C.; Chang, Y.; Luo, B.
2017-09-01
Due to differences in imaging principles, matching between visible and thermal infrared images still poses new challenges and difficulties. Inspired by the complementary spatial and frequency information of geometric structural features, a robust descriptor is proposed for matching visible and thermal infrared images. We first divide the region around an interest point into two spatial regions, using a histogram of oriented magnitudes, which corresponds to 2-D structural shape information, to describe the larger region and an edge orientation histogram to describe the spatial distribution within the smaller region. The two vectors are then normalized and concatenated into a higher-dimensional feature vector. Finally, the proposed descriptor is obtained by applying principal component analysis (PCA) to reduce the dimension of the combined feature vector, making the descriptor more robust. Experimental results showed that the proposed method yielded significant improvements in the number of correct matches, with clear advantages gained by combining spatial- and frequency-domain structural information.
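The build-and-concatenate step for the two regional histograms can be sketched roughly as follows; the bin count, the gradient sampling, and the normalization are illustrative assumptions, and the PCA reduction stage described in the abstract is omitted.

```python
# Sketch of a two-region orientation-histogram descriptor (assumed details,
# not the authors' exact implementation): one histogram per region,
# L2-normalized, then concatenated.
import math

def orientation_histogram(gradients, bins=8):
    """Histogram of gradient orientations, weighted by gradient magnitude.
    `gradients` is a list of (dx, dy) pairs sampled from a region."""
    hist = [0.0] * bins
    for dx, dy in gradients:
        mag = math.hypot(dx, dy)
        ang = math.atan2(dy, dx) % (2 * math.pi)
        hist[int(ang / (2 * math.pi) * bins) % bins] += mag
    return hist

def l2_normalize(v, eps=1e-12):
    norm = math.sqrt(sum(x * x for x in v)) + eps
    return [x / norm for x in v]

def build_descriptor(outer_gradients, inner_gradients):
    """Normalize the two regional histograms and concatenate them.
    (In the paper, PCA would then reduce this combined vector.)"""
    outer = l2_normalize(orientation_histogram(outer_gradients))
    inner = l2_normalize(orientation_histogram(inner_gradients))
    return outer + inner
```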
Schlosser, Ralf W; Belfiore, Phillip J; Sigafoos, Jeff; Briesch, Amy M; Wendt, Oliver
2018-05-28
Evidence-based practice as a process requires the appraisal of research as a critical step. In the field of developmental disabilities, single-case experimental designs (SCEDs) figure prominently as a means for evaluating the effectiveness of non-reversible instructional interventions. Comparative SCEDs contrast two or more instructional interventions to document their relative effectiveness and efficiency. As such, these designs have great potential to inform evidence-based decision-making. To harness this potential, however, interventionists and authors of systematic reviews need tools to appraise the evidence generated by these designs. Our literature review revealed that existing tools do not adequately address the specific methodological considerations of comparative SCEDs that aim to compare instructional interventions of non-reversible target behaviors. The purpose of this paper is to introduce the Comparative Single-Case Experimental Design Rating System (CSCEDARS, "cedars") as a tool for appraising the internal validity of comparative SCEDs of two or more non-reversible instructional interventions. Pertinent literature will be reviewed to establish the need for this tool and to underpin the rationales for individual rating items. Initial reliability information will be provided as well. Finally, directions for instrument validation will be proposed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yanguas-Gil, Angel; Elam, Jeffrey W.
2014-05-01
In this work, the authors present analytic models for atomic layer deposition (ALD) in three common experimental configurations: cross-flow, particle coating, and spatial ALD. These models, based on the plug-flow and well-mixed approximations, allow us to determine the minimum dose times and materials utilization for all three configurations. A comparison between the three models shows that throughput and precursor utilization can each be expressed by universal equations, in which the particularity of the experimental system is contained in a single parameter related to the residence time of the precursor in the reactor. For the case of cross-flow reactors, the authors show how simple analytic expressions for the reactor saturation profiles agree well with experimental results. Consequently, the analytic model can be used to extract information about the ALD surface chemistry (e.g., the reaction probability) by comparing the analytic and experimental saturation profiles, providing a useful tool for characterizing new and existing ALD processes.
A comparative study of the constitutive models for silicon carbide
NASA Astrophysics Data System (ADS)
Ding, Jow-Lian; Dwivedi, Sunil; Gupta, Yogendra
2001-06-01
Most of the constitutive models for polycrystalline silicon carbide were developed and evaluated using data from either normal plate impact or Hopkinson bar experiments. At ISP, extensive efforts have been made to gain detailed insight into the shocked state of silicon carbide (SiC) using innovative experimental methods, viz., lateral stress measurements, in-material unloading measurements, and combined compression-shear experiments. The data obtained from these experiments provide some unique information for both developing and evaluating material models. In this study, these data for SiC were first used to evaluate some of the existing models to identify their strengths and possible deficiencies. Motivated by both the results of this comparative study and the experimental observations, an improved phenomenological model was developed. The model incorporates pressure dependence of strength, rate sensitivity, damage evolution under both tension and compression, pressure confinement effect on damage evolution, stiffness degradation due to damage, and pressure dependence of stiffness. The developed model captures most of the material features observed experimentally, but more work is needed to match the experimental data quantitatively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh; Smith, Curtis Lee; Burns, Douglas Edward
This report describes the development plan for a new multi-partner External Hazards Experimental Group (EHEG) coordinated by Idaho National Laboratory (INL) within the Risk-Informed Safety Margin Characterization (RISMC) technical pathway of the Light Water Reactor Sustainability Program. Currently, there is limited data available for development and validation of the tools and methods being developed in the RISMC Toolkit. The EHEG is being developed to obtain high-quality, small- and large-scale experimental data for validation of RISMC tools and methods in a timely and cost-effective way. The group of universities and national laboratories that will eventually form the EHEG (which is ultimately expected to include both the initial participants and other universities and national laboratories that have been identified) have the expertise and experimental capabilities needed to both obtain and compile existing data archives and perform additional seismic and flooding experiments. The data developed by EHEG will be stored in databases for use within RISMC. These databases will be used to validate the advanced external hazard tools and methods.
A protein-dependent side-chain rotamer library.
Bhuyan, Md Shariful Islam; Gao, Xin
2011-12-14
The protein side-chain packing problem has remained one of the key open problems in bioinformatics. The three main components of protein side-chain prediction methods are a rotamer library, an energy function and a search algorithm. Rotamer libraries summarize the existing knowledge of the experimentally determined structures quantitatively. Depending on how much contextual information is encoded, there are backbone-independent rotamer libraries and backbone-dependent rotamer libraries. Backbone-independent libraries only encode sequential information, whereas backbone-dependent libraries encode both sequential and locally structural information. However, side-chain conformations are determined by spatially local information, rather than sequentially local information. Since in the side-chain prediction problem, the backbone structure is given, spatially local information should ideally be encoded into the rotamer libraries. In this paper, we propose a new type of backbone-dependent rotamer library, which encodes structural information of all the spatially neighboring residues. We call it protein-dependent rotamer libraries. Given any rotamer library and a protein backbone structure, we first model the protein structure as a Markov random field. Then the marginal distributions are estimated by the inference algorithms, without doing global optimization or search. The rotamers from the given library are then re-ranked and associated with the updated probabilities. Experimental results demonstrate that the proposed protein-dependent libraries significantly outperform the widely used backbone-dependent libraries in terms of the side-chain prediction accuracy and the rotamer ranking ability. Furthermore, without global optimization/search, the side-chain prediction power of the protein-dependent library is still comparable to the global-search-based side-chain prediction methods.
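On a toy scale, the re-ranking idea (library probabilities as unary terms of a Markov random field, rotamers re-ranked by their marginals) can be illustrated by brute-force enumeration. The energies and probabilities below are invented, and real structures require the approximate inference algorithms the authors actually use; exact enumeration only works at this tiny size.

```python
# Toy two-residue MRF: unary terms from a rotamer library, a pairwise
# clash penalty, marginals by exact enumeration over all joint states.
import itertools
import math

unary = [
    {"r1": 0.6, "r2": 0.3, "r3": 0.1},   # residue A library probabilities
    {"s1": 0.7, "s2": 0.3},              # residue B library probabilities
]
clash_penalty = {("r1", "s1"): 5.0}      # pretend r1 and s1 clash sterically

def pair_energy(a, b):
    return clash_penalty.get((a, b), 0.0)

def marginals():
    """Exact single-residue marginals of the joint Boltzmann distribution."""
    states = list(itertools.product(unary[0], unary[1]))
    weights = [unary[0][a] * unary[1][b] * math.exp(-pair_energy(a, b))
               for a, b in states]
    Z = sum(weights)  # partition function
    m = [dict.fromkeys(u, 0.0) for u in unary]
    for (a, b), w in zip(states, weights):
        m[0][a] += w / Z
        m[1][b] += w / Z
    return m
```

In this toy example the clash drags r1's marginal below its library prior, so the re-ranked order of residue A's rotamers changes, which is exactly the effect the protein-dependent library encodes.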
NASA Astrophysics Data System (ADS)
Goodrich, D. C.; Kustas, W. P.; Cosh, M. H.; Moran, S. M.; Marks, D. G.; Jackson, T. J.; Bosch, D. D.; Rango, A.; Seyfried, M. S.; Scott, R. L.; Prueger, J. H.; Starks, P. J.; Walbridge, M. R.
2014-12-01
The USDA-Agricultural Research Service has led, or been integrally involved in, a myriad of interdisciplinary field campaigns in a wide range of locations both nationally and internationally. Many of the shorter campaigns were anchored over the existing national network of ARS Experimental Watersheds and Rangelands. These long-term outdoor laboratories provided a critical knowledge base for designing the campaigns as well as historical data and hydrologic and meteorological infrastructure, coupled with shop, laboratory, and visiting scientist facilities. This strong outdoor laboratory base enabled cost-efficient campaigns informed by historical context, local knowledge, and detailed existing watershed characterization. These long-term experimental facilities have also enabled much longer term, lower intensity experiments, observing and building an understanding of both seasonal and inter-annual biosphere-hydrosphere-atmosphere interactions across a wide range of conditions. A sampling of these experiments includes the MONSOON'90, SGP97, SGP99, Washita'92, Washita'94, SMEX02-05 and JORNEX series of experiments, SALSA, CLASIC, and longer-term efforts over the ARS Little Washita, Walnut Gulch, Little River, Reynolds Creek, and OPE3 Experimental Watersheds. This presentation will review some of the highlights and key findings of these campaigns and long-term efforts, including the incorporation of many of the experimental watersheds and ranges into the Long-Term Agro-ecosystems Research (LTAR) network. The LTAR network also contains several locations that are part of other observational networks, including the CZO, LTER, and NEON networks. Lessons learned will also be provided for scientists initiating their participation in large-scale, multi-site interdisciplinary science.
An Overview and Empirical Comparison of Distance Metric Learning Methods.
Moutafis, Panagiotis; Leng, Mengjun; Kakadiaris, Ioannis A
2016-02-16
In this paper, we first offer an overview of advances in the field of distance metric learning. Then, we empirically compare selected methods using a common experimental protocol. The number of distance metric learning algorithms proposed keeps growing due to their effectiveness and wide application. However, existing surveys are either outdated or they focus only on a few methods. As a result, there is an increasing need to summarize the obtained knowledge in a concise, yet informative manner. Moreover, existing surveys do not conduct comprehensive experimental comparisons. On the other hand, individual distance metric learning papers compare the performance of the proposed approach with only a few related methods and under different settings. This highlights the need for an experimental evaluation using a common and challenging protocol. To this end, we conduct face verification experiments, as this task poses significant challenges due to varying conditions during data acquisition. In addition, face verification is a natural application for distance metric learning because the encountered challenge is to define a distance function that: 1) accurately expresses the notion of similarity for verification; 2) is robust to noisy data; 3) generalizes well to unseen subjects; and 4) scales well with the dimensionality and number of training samples. In particular, we utilize well-tested features to assess the performance of selected methods following the experimental protocol of the state-of-the-art Labeled Faces in the Wild database. A summary of the results is presented along with a discussion of the insights obtained and lessons learned by employing the corresponding algorithms.
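Most methods in this line of work learn a Mahalanobis-type metric, i.e. a positive semidefinite matrix M that reshapes distances. A minimal sketch of evaluating such a metric follows; M here is hand-picked for illustration rather than learned from data, which is the part the surveyed algorithms actually differ on.

```python
# Evaluating a Mahalanobis metric d_M(x, y) = sqrt((x - y)^T M (x - y))
# for a hand-picked (not learned) PSD matrix M.

def mahalanobis(x, y, M):
    """Mahalanobis distance between vectors x and y under metric matrix M."""
    d = [a - b for a, b in zip(x, y)]
    Md = [sum(M[i][j] * d[j] for j in range(len(d))) for i in range(len(d))]
    return sum(di * mdi for di, mdi in zip(d, Md)) ** 0.5

# With the identity, this reduces to the Euclidean distance:
euclidean = mahalanobis((0.0, 0.0), (3.0, 4.0), [[1, 0], [0, 1]])   # -> 5.0
# Zeroing a diagonal entry discards the corresponding (e.g. noisy) feature:
down_weighted = mahalanobis((0.0, 0.0), (3.0, 4.0), [[1, 0], [0, 0]])  # -> 3.0
```

For verification, a threshold on d_M then decides "same subject" versus "different subject"; learning M so that this threshold separates the two cases well is the metric learning problem.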
Detection of Memory B Activity Against a Therapeutic Protein in Treatment-Naïve Subjects.
Liao, Karen; Derbyshire, Stacy; Wang, Kai-Fen; Caucci, Cherilyn; Tang, Shuo; Holland, Claire; Loercher, Amy; Gunn, George R
2018-03-16
Bridging immunoassays commonly used to detect and characterize immunogenicity during biologic development do not provide direct information on the presence or development of a memory anti-drug antibody (ADA) response. In this study, a B cell ELISPOT assay method was used to evaluate pre-existing ADA for the anti-TNFR1 domain antibody GSK1995057, an experimental biologic, in treatment-naïve subjects. This assay utilized a 7-day activation of PBMCs by a combination of GSK1995057 (antigen) and polyclonal stimulator, followed by GSK1995057-specific ELISPOT for the enumeration of memory B cells that have differentiated into antibody-secreting cells (ASC) in vitro. We demonstrated that GSK1995057-specific ASC were detectable in treatment-naïve subjects with pre-existing ADA; the frequency of drug-specific ASC was low, ranging from 1 to 10 spot-forming units (SFU) per million cells. Interestingly, the frequency of drug-specific ASC correlated with the ADA level measured using an in vitro ADA assay. We further confirmed that the ASC originated from CD27+ memory B cells, not from CD27− naïve B cells. Our data demonstrated the utility of the B cell ELISPOT method in therapeutic protein immunogenicity evaluation, providing a novel way to confirm and characterize the cell population producing pre-existing ADA. This novel application of a B cell ELISPOT assay informs and characterizes immune memory activity regarding incidence and magnitude associated with a pre-existing ADA response.
Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister
2017-01-01
Due to relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to a more challenging task of predicting target interactions for such a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications.
These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438
Turton, J
1998-04-01
A non-experimental research design using questionnaires was undertaken to determine which of the information commonly given following myocardial infarction (MI) patients and their spouses/partners rate as most and least important. These results were then compared with those obtained from nurse subjects, who were given the same instrument to complete. Eighteen subjects were recruited for each of the three subject groups. Results indicated that some congruency existed among the three groups in terms of what they perceived as the most and least important categories of information. However, the scores for some informational categories included on the instrument were significantly different between the nursing group and the two other groups (P < 0.01), whereas between the patient and spouse/partner groups only a weak difference (P < 0.10) was found, for the category 'dietary information'. These findings and others are discussed, and recommendations are made for improving the information-giving process post-MI.
Uncertain decision tree inductive inference
NASA Astrophysics Data System (ADS)
Zarban, L.; Jafari, S.; Fakhrahmad, S. M.
2011-10-01
Induction is the process of reasoning in which general rules are formulated based on limited observations of recurring phenomenal patterns. Decision tree learning is one of the most widely used and practical inductive methods, representing its results in a tree scheme. Various decision tree algorithms have been proposed, such as CLS, ID3, Assistant, C4.5, REPTree and Random Tree, but these algorithms suffer from some major shortcomings. In this article, after discussing the main limitations of the existing methods, we introduce a new decision tree induction algorithm that overcomes the problems existing in its counterparts. The new method uses bit strings and maintains important information on them; performing logical operations on bit strings makes the induction process fast. The method also has several other important features: it deals with inconsistencies in data, avoids overfitting and handles uncertainty. We also illustrate further advantages and new features of the proposed method. The experimental results show the effectiveness of the method in comparison with other methods in the literature.
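One way to read the bit-string idea (our interpretation, not the authors' exact data structures): if each attribute value and each class label is stored as an integer bitmask over the training rows, the per-class counts needed to score a candidate split reduce to popcounts of bitwise ANDs, which is where the speed comes from.

```python
# Bitmask-based class counting for decision tree split evaluation.
# Bit i of a mask indicates whether training row i has that property.

def popcount(mask):
    """Number of set bits (rows) in a mask."""
    return bin(mask).count("1")

def class_counts(value_mask, class_masks):
    """Rows matching an attribute value, broken down by class."""
    return [popcount(value_mask & cm) for cm in class_masks]

# Worked example over 6 training rows (bit i = row i):
sunny = 0b001011          # rows 0, 1, 3 have outlook == sunny
play_yes = 0b110010       # rows 1, 4, 5 have class "yes"
play_no = 0b001101        # rows 0, 2, 3 have class "no"
counts = class_counts(sunny, [play_yes, play_no])  # -> [1, 2]
```

These counts feed directly into an entropy or information-gain computation for the candidate split, with each count obtained in a single AND plus popcount rather than a scan over the rows.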
Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun
2016-01-01
An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. PMID:26833260
Mackey, Tim K; Schoenfeld, Virginia J
2016-02-02
Social media is fundamentally altering how we access health information and make decisions about medical treatment, including for terminally ill patients. This specifically includes the growing phenomenon of patients who use online petitions and social media campaigns in an attempt to gain access to experimental drugs through expanded access pathways. Importantly, controversy surrounding expanded access and "compassionate use" involves several disparate stakeholders, including patients, manufacturers, policymakers, and regulatory agencies-all with competing interests and priorities, leading to confusion, frustration, and ultimately advocacy. In order to explore this issue in detail, this correspondence article first conducts a literature review to describe how the expanded access policy and regulatory environment in the United States has evolved over time and how it currently impacts access to experimental drugs. We then conducted structured web searches to identify patient use of online petitions and social media campaigns aimed at compelling access to experimental drugs. This was carried out in order to characterize the types of communication strategies utilized, the diseases and drugs subject to expanded access petitions, and the prevalent themes associated with this form of "digital" patient advocacy. We find that patients and their families experience mixed results, but still gravitate towards the use of online campaigns out of desperation, lack of reliable information about treatment access options, and in direct response to limitations of the current fragmented structure of expanded access regulation and policy currently in place. In response, we discuss potential policy reforms to improve expanded access processes, including advocating greater transparency for expanded access programs, exploring use of targeted economic incentives for manufacturers, and developing systems to facilitate patient information about existing treatment options. 
This includes leveraging recent legislative attention to reform expanded access through the CURE Act provisions contained in the proposed U.S. 21st Century Cures Act. While expanded access may not be the best option for the majority of individuals, terminally ill patients and their families nevertheless deserve better processes, policies, and access to potentially life-changing information before they decide to pursue an online campaign in the desperate hope of gaining access to experimental drugs.
MMM: A toolbox for integrative structure modeling.
Jeschke, Gunnar
2018-01-01
Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose, with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids, and on their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM not only integrates various types of restraints, but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples.
Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W; Asmus, Jennifer M
2014-11-01
Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are necessary to achieve a criterion level of reliability. We conducted G and D studies using observational data from a randomized control trial focusing on social and academic participation of students with severe disabilities in inclusive secondary classrooms. Results highlight the importance of anchoring observational decisions to reliability estimates from existing or pilot data sets. We outline steps for conducting G and D studies and address options when reliability estimates are lower than desired.
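For the simplest single-facet design, the D-study projection mentioned above boils down to a Spearman-Brown-style formula: averaging more sessions shrinks the error variance. The variance components in the example are invented for illustration and real designs typically involve additional facets.

```python
# Single-facet G/D-study sketch (relative error model): project the
# generalizability coefficient as observation sessions are averaged.

def g_coefficient(var_person, var_error, n_sessions):
    """Projected G coefficient when averaging over n_sessions sessions."""
    return var_person / (var_person + var_error / n_sessions)

def sessions_needed(var_person, var_error, target=0.80):
    """Smallest number of observation sessions reaching target reliability."""
    n = 1
    while g_coefficient(var_person, var_error, n) < target:
        n += 1
    return n
```

For instance, with a person variance of 1.0 and an error variance of 4.0, a single session yields G = 0.2, and 16 sessions are needed to reach G = 0.80, which is the kind of answer a D study anchors observation decisions to.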
Recent advances and remaining challenges for the spectroscopic detection of explosive threats.
Fountain, Augustus W; Christesen, Steven D; Moon, Raphael P; Guicheteau, Jason A; Emmons, Erik D
2014-01-01
In 2010, the U.S. Army initiated a program through the Edgewood Chemical Biological Center to identify viable spectroscopic signatures of explosives and initiate environmental persistence, fate, and transport studies for trace residues. These studies were ultimately designed to integrate these signatures into algorithms and experimentally evaluate sensor performance for explosives and precursor materials in existing chemical point and standoff detection systems. Accurate and validated optical cross sections and signatures are critical in benchmarking spectroscopic-based sensors. This program has provided important information for the scientists and engineers currently developing trace-detection solutions to the homemade explosive problem. With this information, the sensitivity of spectroscopic methods for explosives detection can now be quantitatively evaluated before the sensor is deployed and tested.
SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects
NASA Technical Reports Server (NTRS)
Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M
1998-01-01
SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of existing soft-computing software by supporting comprehensive multidisciplinary functionalities, from management tools to engineering systems. Furthermore, its built-in features help the user process and analyze information more efficiently through a friendly yet powerful interface, and allow the user to specify user-specific processing modules, adding to the standard configuration of the software environment.
Octupole deformation in odd-odd nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheline, R.K.
1988-01-01
Comparison of the experimental and theoretical ground-state spins of odd-odd nuclei in the region 220 ≤ A ≤ 228 generally shows agreement with a folded Yukawa octupole-deformed model with ε₃ = 0.08, and some lack of agreement with the same model with ε₃ = 0. Thus, in spite of limited spectroscopic information, the ground-state spins suggest the existence of octupole deformation in odd-odd nuclei in the region 220 ≤ A ≤ 228.
Structure of turbulence in three-dimensional boundary layers
NASA Technical Reports Server (NTRS)
Subramanian, Chelakara S.
1993-01-01
This report provides an overview of the three dimensional turbulent boundary layer concepts and of the currently available experimental information for their turbulence modeling. It is found that more reliable turbulence data, especially of the Reynolds stress transport terms, is needed to improve the existing modeling capabilities. An experiment is proposed to study the three dimensional boundary layer formed by a 'sink flow' in a fully developed two dimensional turbulent boundary layer. Also, the mean and turbulence field measurement procedure using a three component laser Doppler velocimeter is described.
A centre for accommodative vergence motor control
NASA Technical Reports Server (NTRS)
Wilson, D.
1973-01-01
Latencies in accommodation, accommodative-vergence, and pupil-diameter responses to changing accommodation stimuli, as well as latencies in pupil response to light-intensity changes were measured. From the information obtained, a block diagram has been derived that uses the least number of blocks for representing the accommodation, accommodative-vergence, and pupil systems. The signal transmission delays over the various circuits of the model have been determined and compared to known experimental physiological-delay data. The results suggest the existence of a motor center that controls the accommodative vergence and is completely independent of the accommodation system.
PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.
Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter
2016-04-01
Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically intended for preclinical applications, in order to increase the efficiency of in vivo drug discovery studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goals of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
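The evaluation step described above, scoring each candidate design by a function of the Fisher Information Matrix, can be sketched in a few lines. This is an illustrative toy, not PopED lite's implementation: the one-compartment model, the parameter values, the noise level, and the exhaustive search over candidate sampling times are all assumptions made for the example.

```python
import numpy as np
from itertools import combinations

def conc(t, CL, V, dose=100.0):
    """One-compartment IV bolus model (assumed): C(t) = (dose/V) * exp(-(CL/V) t)."""
    return (dose / V) * np.exp(-(CL / V) * t)

def fim(times, theta, sigma=0.1, h=1e-6):
    """Fisher Information Matrix via finite-difference sensitivities,
    assuming additive Gaussian noise with standard deviation sigma."""
    times = np.asarray(times, dtype=float)
    J = np.zeros((len(times), len(theta)))
    for k in range(len(theta)):
        up = list(theta); up[k] += h
        dn = list(theta); dn[k] -= h
        J[:, k] = (conc(times, *up) - conc(times, *dn)) / (2 * h)
    return J.T @ J / sigma**2

def d_optimal(candidates, n_samples, theta):
    """Exhaustively pick the n-sample design maximizing log det FIM (D-optimality)."""
    best, best_val = None, -np.inf
    for design in combinations(candidates, n_samples):
        sign, logdet = np.linalg.slogdet(fim(design, theta))
        if sign > 0 and logdet > best_val:
            best, best_val = design, logdet
    return best, best_val

theta = (1.0, 10.0)  # hypothetical (CL, V)
design, val = d_optimal([0.25, 0.5, 1, 2, 4, 8, 12, 24], 3, theta)
print("D-optimal sampling times:", design)
```

D-optimality (maximizing log det of the FIM) is one common member of the family of FIM-based criteria the abstract alludes to; constrained designs would restrict the candidate set before the search.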
Learning Contrast-Invariant Cancellation of Redundant Signals in Neural Systems
Bol, Kieran; Maler, Leonard; Longtin, André
2013-01-01
Cancellation of redundant information is a highly desirable feature of sensory systems, since it would potentially lead to a more efficient detection of novel information. However, biologically plausible mechanisms responsible for such selective cancellation, and especially those robust to realistic variations in the intensity of the redundant signals, are mostly unknown. In this work, we study, via in vivo experimental recordings and computational models, the behavior of a cerebellar-like circuit in the weakly electric fish which is known to perform cancellation of redundant stimuli. We experimentally observe contrast invariance in the cancellation of spatially and temporally redundant stimuli in such a system. Our model, which incorporates heterogeneously delayed feedback, bursting dynamics and burst-induced STDP, is in agreement with our in vivo observations. In addition, the model gives insight into the activity of granule cells and parallel fibers involved in the feedback pathway, and provides a strong prediction of the parallel fiber potentiation time scale. Finally, our model predicts the existence of an optimal learning contrast around 15%, a contrast level commonly experienced by interacting fish. PMID:24068898
Analyzing thresholds and efficiency with hierarchical Bayesian logistic regression.
Houpt, Joseph W; Bittner, Jennifer L
2018-07-01
Ideal observer analysis is a fundamental tool used widely in vision science for analyzing the efficiency with which a cognitive or perceptual system uses available information. The performance of an ideal observer provides a formal measure of the amount of information in a given experiment. The ratio of human to ideal performance is then used to compute efficiency, a construct that can be directly compared across experimental conditions while controlling for differences due to the stimuli and/or task-specific demands. In previous research using ideal observer analysis, the effects of varying experimental conditions on efficiency have been tested using ANOVAs and pairwise comparisons. In this work, we present a model that combines Bayesian estimates of psychometric functions with hierarchical logistic regression for inference about both unadjusted human performance metrics and efficiencies. Our approach improves upon the existing methods by constraining the statistical analysis using a standard model connecting stimulus intensity to human observer accuracy and by accounting for variability in the estimates of human and ideal observer performance scores. This allows for both individual and group level inferences. Copyright © 2018 Elsevier Ltd. All rights reserved.
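The pipeline the abstract describes, a psychometric function linking stimulus intensity to accuracy and efficiency as a human-to-ideal comparison, can be illustrated with a toy maximum-likelihood grid fit. The 2AFC guess rate, the data, and the ideal-observer threshold below are hypothetical, and a point-estimate fit stands in for the paper's much richer hierarchical Bayesian model.

```python
import numpy as np

def psychometric(x, alpha, beta, gamma=0.5):
    """Logistic psychometric function with guess rate gamma (2AFC assumed)."""
    return gamma + (1 - gamma) / (1 + np.exp(-beta * (x - alpha)))

def fit_threshold(x, n_correct, n_trials,
                  alphas=np.linspace(0.0, 2.0, 201),
                  betas=np.linspace(0.5, 20, 40)):
    """Maximum-likelihood grid fit of threshold alpha and slope beta."""
    best, best_ll = None, -np.inf
    for a in alphas:
        for b in betas:
            p = np.clip(psychometric(x, a, b), 1e-9, 1 - 1e-9)
            ll = np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1 - p))
            if ll > best_ll:
                best, best_ll = (a, b), ll
    return best

# Hypothetical 2AFC data: stimulus intensities, correct counts out of 50 trials
x = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
n_correct = np.array([26, 29, 35, 43, 48, 50])
n_trials = np.full_like(n_correct, 50)
alpha_h, beta_h = fit_threshold(x, n_correct, n_trials)
# Efficiency as a squared threshold ratio (ideal/human), with the
# ideal-observer threshold assumed computed elsewhere for the same task:
alpha_ideal = 0.3  # hypothetical
efficiency = (alpha_ideal / alpha_h) ** 2
print(f"human threshold {alpha_h:.2f}, efficiency {efficiency:.2f}")
```

The hierarchical model in the paper replaces this per-condition point estimate with posterior distributions over thresholds, which is what allows principled group-level inference about efficiency.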
Nucleation and particle coagulation experiments in microgravity
NASA Technical Reports Server (NTRS)
Nuth, J.
1987-01-01
Measurements of the conditions under which carbon, aluminum oxide, and silicon carbide smokes condense, and of the morphology and crystal structure of the resulting grains, are essential if the nature of the materials ejected into the interstellar medium and the nature of the grains which eventually became part of the protosolar nebula are to be understood. Little information is currently available on the vapor-solid phase transitions of refractory metals and solids. What little experimental data do exist are, however, not in agreement with currently accepted models of the nucleation process for more volatile materials. The major obstacle to performing such experiments in earth-based laboratories is the susceptibility of these systems to convection. Evaporation of refractory materials into a low-pressure environment with a carefully controlled temperature gradient will produce refractory smokes when the critical supersaturation of the system is exceeded. Measurement of the point at which nucleation occurs, via light scattering or extinction, can yield not only nucleation data but also information on the chemical composition and crystal structure of the condensate. Experimental requirements are presented.
Parekh, Ruchi; Armañanzas, Rubén; Ascoli, Giorgio A
2015-04-01
Digital reconstructions of axonal and dendritic arbors provide a powerful representation of neuronal morphology in formats amenable to quantitative analysis, computational modeling, and data mining. Reconstructed files, however, require adequate metadata to identify the appropriate animal species, developmental stage, brain region, and neuron type. Moreover, experimental details about tissue processing, neurite visualization and microscopic imaging are essential to assess the information content of digital morphologies. Typical morphological reconstructions only partially capture the underlying biological reality. Tracings are often limited to certain domains (e.g., dendrites and not axons), may be incomplete due to tissue sectioning, imperfect staining, and limited imaging resolution, or can disregard aspects irrelevant to their specific scientific focus (such as branch thickness or depth). Gauging these factors is critical in subsequent data reuse and comparison. NeuroMorpho.Org is a central repository of reconstructions from many laboratories and experimental conditions. Here, we introduce substantial additions to the existing metadata annotation aimed to describe the completeness of the reconstructed neurons in NeuroMorpho.Org. These expanded metadata form a suitable basis for effective description of neuromorphological data.
Caldas, José; Gehlenborg, Nils; Kettunen, Eeva; Faisal, Ali; Rönty, Mikko; Nicholson, Andrew G; Knuutila, Sakari; Brazma, Alvis; Kaski, Samuel
2012-01-15
Genome-wide measurement of transcript levels is a ubiquitous tool in biomedical research. As experimental data continue to be deposited in public databases, it is becoming important to develop search engines that enable the retrieval of relevant studies given a query study. While retrieval systems based on meta-data already exist, data-driven approaches that retrieve studies based on similarities in the expression data itself have a greater potential of uncovering novel biological insights. We propose an information retrieval method based on differential expression. Our method deals with arbitrary experimental designs and performs competitively with alternative approaches, while making the search results interpretable in terms of differential expression patterns. We show that our model yields meaningful connections between biological conditions from different studies. Finally, we validate a previously unknown connection between malignant pleural mesothelioma and SIM2s suggested by our method, via real-time polymerase chain reaction in an independent set of mesothelioma samples. Supplementary data and source code are available from http://www.ebi.ac.uk/fg/research/rex.
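The data-driven retrieval idea, ranking stored studies by how similar their differential-expression patterns are to a query study, can be sketched as follows. The per-gene signatures and the cosine-similarity measure are illustrative assumptions, not the model of the paper.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length signature vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den

def retrieve(query, corpus):
    """Rank stored studies by similarity of their differential-expression
    signatures (here, hypothetical per-gene log-fold changes) to the query."""
    return sorted(corpus, key=lambda name: -cosine(query, corpus[name]))

# Hypothetical signatures over five genes
corpus = {
    "mesothelioma_study": [2.1, -1.8, 0.3, 1.9, -0.2],
    "unrelated_study":    [-0.1, 0.2, -2.5, 0.1, 1.8],
}
query = [1.9, -1.5, 0.1, 2.2, 0.0]  # new study with a similar expression pattern
print(retrieve(query, corpus)[0])  # mesothelioma_study
```

The appeal of signature-level retrieval, as the abstract notes, is that the top hits are directly interpretable: the genes driving the similarity are visible in the signatures themselves.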
Bayesian network prior: network analysis of biological data using external knowledge
Isci, Senol; Dogan, Haluk; Ozturk, Cengizhan; Otu, Hasan H.
2014-01-01
Motivation: Reverse engineering GI networks from experimental data is a challenging task due to the complex nature of the networks and the noise inherent in the data. One way to overcome these hurdles would be incorporating the vast amounts of external biological knowledge when building interaction networks. We propose a framework where GI networks are learned from experimental data using Bayesian networks (BNs) and the incorporation of external knowledge is also done via a BN that we call Bayesian Network Prior (BNP). BNP depicts the relation between various evidence types that contribute to the event ‘gene interaction’ and is used to calculate the probability of a candidate graph (G) in the structure learning process. Results: Our simulation results on synthetic, simulated and real biological data show that the proposed approach can identify the underlying interaction network with high accuracy even when the prior information is distorted and outperforms existing methods. Availability: Accompanying BNP software package is freely available for academic use at http://bioe.bilgi.edu.tr/BNP. Contact: hasan.otu@bilgi.edu.tr Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:24215027
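A minimal sketch of how a prior over candidate graphs can enter structure learning, assuming each possible edge carries an evidence-derived probability. This is a deliberate simplification: BNP itself models the relations between multiple evidence types with a Bayesian network rather than assigning per-edge probabilities directly.

```python
import math
from itertools import combinations

def log_graph_prior(edges, edge_prob, genes, background=0.05):
    """Log prior of a candidate undirected graph G, treating each possible
    edge as present with an evidence-derived probability (a simplified
    stand-in for the Bayesian Network Prior's P(G) term)."""
    present = {frozenset(e) for e in edges}
    lp = 0.0
    for pair in combinations(sorted(genes), 2):
        p = edge_prob.get(frozenset(pair), background)  # background if no evidence
        lp += math.log(p) if frozenset(pair) in present else math.log(1 - p)
    return lp

# Hypothetical external evidence: knowledge strongly supports an A-B interaction
edge_prob = {frozenset({"A", "B"}): 0.9, frozenset({"B", "C"}): 0.4}
genes = ["A", "B", "C"]
g1 = [("A", "B")]  # candidate graph consistent with prior knowledge
g2 = [("A", "C")]  # candidate graph with an unsupported edge
print(log_graph_prior(g1, edge_prob, genes) > log_graph_prior(g2, edge_prob, genes))
```

In a full structure search, this log prior would be added to a data likelihood score for each candidate graph, so distorted prior information is down-weighted rather than decisive, which is the robustness property the abstract reports.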
Sims, Lee B; Frieboes, Hermann B; Steinbach-Rankins, Jill M
2018-01-01
A variety of drug-delivery platforms have been employed to deliver therapeutic agents across cervicovaginal mucus (CVM) and the vaginal mucosa, offering the capability to increase the longevity and retention of active agents to treat infections of the female reproductive tract (FRT). Nanoparticles (NPs) have been shown to improve retention, diffusion, and cell-specific targeting via specific surface modifications, relative to other delivery platforms. In particular, polymeric NPs represent a promising option that has shown improved distribution through the CVM. These NPs are typically fabricated from nontoxic, non-inflammatory, US Food and Drug Administration-approved polymers that improve biocompatibility. This review summarizes recent experimental studies that have evaluated NP transport in the FRT, and highlights research areas that more thoroughly and efficiently inform polymeric NP design, including mathematical modeling. An overview of the in vitro, ex vivo, and in vivo NP studies conducted to date - whereby transport parameters are determined, extrapolated, and validated - is presented first. The impact of different NP design features on transport through the FRT is summarized, and gaps that exist due to the limitations of iterative experimentation alone are identified. The potential of mathematical modeling to complement the characterization and evaluation of diffusion and transport of delivery vehicles and active agents through the CVM and mucosa is discussed. Lastly, potential advancements combining experimental and mathematical knowledge are suggested to inform next-generation NP designs, such that infections in the FRT may be more effectively treated.
Detection of allosteric signal transmission by information-theoretic analysis of protein dynamics
Pandini, Alessandro; Fornili, Arianna; Fraternali, Franca; Kleinjung, Jens
2012-01-01
Allostery offers a highly specific way to modulate protein function. Therefore, understanding this mechanism is of increasing interest for protein science and drug discovery. However, allosteric signal transmission is difficult to detect experimentally and to model because it is often mediated by local structural changes propagating along multiple pathways. To address this, we developed a method to identify communication pathways by an information-theoretical analysis of molecular dynamics simulations. Signal propagation was described as information exchange through a network of correlated local motions, modeled as transitions between canonical states of protein fragments. The method was used to describe allostery in two-component regulatory systems. In particular, the transmission from the allosteric site to the signaling surface of the receiver domain NtrC was shown to be mediated by a layer of hub residues. The location of hubs preferentially connected to the allosteric site was found in close agreement with key residues experimentally identified as involved in the signal transmission. The comparison with the networks of the homologues CheY and FixJ highlighted similarities in their dynamics. In particular, we showed that a preorganized network of fragment connections between the allosteric and functional sites exists already in the inactive state of all three proteins.—Pandini, A., Fornili, A., Fraternali, F., Kleinjung, J. Detection of allosteric signal transmission by information-theoretic analysis of protein dynamics. PMID:22071506
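The information-theoretic core of the method, measuring information exchange between fragments whose local motions are encoded as sequences of canonical states, reduces to a mutual-information computation over aligned discrete sequences. The state trajectories below are invented for illustration; in the paper they come from molecular dynamics frames.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (bits) between two aligned sequences of discrete
    states, e.g. per-frame canonical states of two protein fragments."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        mi += p * math.log2(p * n * n / (px[x] * py[y]))
    return mi

# Hypothetical fragment-state trajectories (letters denote canonical states)
frag_a = "AABBAABBAABB"
frag_b = "CCDDCCDDCCDD"  # state changes track frag_a exactly
frag_c = "CDCDCDDCCDCD"  # uncorrelated with frag_a
print(mutual_information(frag_a, frag_b))  # 1.0 bit: a deterministic mapping
```

Thresholding such pairwise MI values yields the network of correlated local motions on which hubs and communication pathways can then be identified.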
Predicting Drug-Target Interactions With Multi-Information Fusion.
Peng, Lihong; Liao, Bo; Zhu, Wen; Li, Zejun; Li, Keqin
2017-03-01
Identifying potential associations between drugs and targets is a critical prerequisite for modern drug discovery and repurposing. However, predicting these associations is difficult because of the limitations of existing computational methods. Most models only consider chemical structures and protein sequences, and other models are oversimplified. Moreover, datasets used for analysis contain only true-positive interactions, and experimentally validated negative samples are unavailable. To overcome these limitations, we developed a semi-supervised learning framework called NormMulInf based on collaborative filtering theory, using labeled and unlabeled interaction information. The proposed method initially determines similarity measures, such as similarities among samples and local correlations among the labels of the samples, by integrating biological information. The similarity information is then integrated into a robust principal component analysis model, which is solved using augmented Lagrange multipliers. Experimental results on four classes of drug-target interaction networks suggest that the proposed approach can accurately classify and predict drug-target interactions. Some of the predicted interactions are reported in public databases. The proposed method can also predict possible targets for new drugs and can be used to determine whether atropine may interact with alpha1B- and beta1-adrenergic receptors. Furthermore, the developed technique identifies potential drugs for new targets and can be used to assess whether olanzapine and propiomazine may target 5HT2B. Finally, the proposed method can potentially address limitations on studies of multitarget drugs and multidrug targets.
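The optimization step mentioned above, a robust principal component analysis model solved with augmented Lagrange multipliers, can be sketched with the standard inexact-ALM iteration that splits a matrix into low-rank plus sparse parts. The toy interaction matrix is an assumption, and NormMulInf additionally folds in the similarity information, which is omitted here.

```python
import numpy as np

def rpca_ialm(M, lam=None, tol=1e-7, max_iter=500):
    """Robust PCA via inexact augmented Lagrange multipliers:
    decompose M into a low-rank part L and a sparse part S."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    norm2 = np.linalg.norm(M, 2)
    mu, rho = 1.25 / norm2, 1.5
    Y = M / max(norm2, np.abs(M).max() / lam)  # dual variable initialization
    L, S = np.zeros_like(M), np.zeros_like(M)
    for _ in range(max_iter):
        # singular-value thresholding updates the low-rank part
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0)) @ Vt
        # entrywise soft thresholding updates the sparse part
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0)
        Z = M - L - S
        Y = Y + mu * Z
        mu = min(mu * rho, 1e7)
        if np.linalg.norm(Z, 'fro') / np.linalg.norm(M, 'fro') < tol:
            break
    return L, S

# Hypothetical interaction matrix: rank-1 signal plus a few "outlier" entries
rng = np.random.default_rng(0)
u, v = rng.standard_normal((20, 1)), rng.standard_normal((1, 30))
low_rank = u @ v
sparse = np.zeros_like(low_rank)
sparse[3, 5], sparse[10, 2] = 10.0, -8.0
L, S = rpca_ialm(low_rank + sparse)
print(np.linalg.norm(L - low_rank) / np.linalg.norm(low_rank))  # relative recovery error
```

In the drug-target setting, the rows and columns of M would index drugs and targets, the low-rank part capturing the systematic interaction structure and the sparse part absorbing noise and outlying entries.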
Applicability of computational systems biology in toxicology.
Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie
2014-07-01
Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more advantages than providing a high-throughput literature search; it may form the basis for establishment of hypotheses on potential links between environmental chemicals and human diseases, which would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
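The hypothesis-generating step described here, feeding a chemical's experimentally determined targets into protein-protein interaction and protein-disease networks, can be sketched with toy dictionaries. The proteins and associations below are illustrative placeholders, not curated database content.

```python
def linked_diseases(targets, ppi, protein_disease):
    """Expand a chemical's measured protein targets one step through a
    protein-protein interaction network, then collect the diseases
    associated with the expanded protein set."""
    expanded = set(targets)
    for t in targets:
        expanded |= ppi.get(t, set())
    diseases = set()
    for p in expanded:
        diseases |= protein_disease.get(p, set())
    return diseases

# Hypothetical networks (real analyses use comprehensive curated databases)
ppi = {"ESR1": {"JUN", "SP1"}, "AR": {"EP300"}}
protein_disease = {"ESR1": {"breast cancer"}, "JUN": {"inflammation"}}
print(linked_diseases({"ESR1"}, ppi, protein_disease))
```

Each disease reached this way is a hypothesis, not a conclusion; as the abstract stresses, the output's role is to guide the design of targeted animal or cell experiments that test the suggested links.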
Literature-based condition-specific miRNA-mRNA target prediction.
Oh, Minsik; Rhee, Sungmin; Moon, Ji Hwan; Chae, Heejoon; Lee, Sunwon; Kang, Jaewoo; Kim, Sun
2017-01-01
miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets. However, these methods are not designed for condition-specific target predictions and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific target predictions. A typical strategy to utilize expression data is to leverage the negative control roles of miRNAs on genes. To control false positives, a stringent cutoff value is typically set, but in this case, these methods tend to reject many true target relationships, i.e., false negatives. To overcome these limitations, additional information should be utilized. The literature is probably the best resource that we can utilize. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and the systems provide a function to search for specific information. To utilize the literature information, we used a literature mining system, BEST, that automatically extracts information from the literature in PubMed and that allows the user to perform searches of the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results and the literature information extracted based on the user-specified context. In the pathway enrichment analysis using genes included in the top 200 miRNA-targets, Context-MMIA outperformed the four existing target prediction methods that we tested. In another test on whether prediction methods can reproduce experimentally validated target relationships, Context-MMIA outperformed the four existing target prediction methods.
In summary, Context-MMIA allows the user to specify a context of the experimental data to predict miRNA targets, and we believe that Context-MMIA is very useful for predicting condition-specific miRNA targets.
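A minimal sketch of combining an expression-based score with literature support into a single target score. The weighting scheme, score definitions, and gene names are assumptions made for illustration; they are not the Context-MMIA scoring function.

```python
def context_score(corr, lit_hits, w=0.5):
    """Combine an expression-based score (miRNAs repress their targets, so
    negative miRNA-mRNA correlation is evidence) with normalized literature
    support for the user-specified context. Hypothetical weighting."""
    expr_score = max(0.0, -corr)
    return w * expr_score + (1 - w) * lit_hits

# Hypothetical candidate targets for one miRNA in a given context
candidates = {
    "GENE_A": context_score(corr=-0.8, lit_hits=0.9),  # strong on both sources
    "GENE_B": context_score(corr=-0.7, lit_hits=0.0),  # expression evidence only
    "GENE_C": context_score(corr=0.2, lit_hits=0.6),   # literature evidence only
}
print(max(candidates, key=candidates.get))  # GENE_A
```

The point of such a combination, as the abstract argues, is that literature evidence can rescue true targets that a stringent expression cutoff alone would reject.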
Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming
2018-01-01
There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which is to reduce the time phase difference of image data and enhance the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and expanding non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant gain in enhancement-measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than existing super-resolution reconstruction methods in terms of visual and accuracy improvements. PMID:29414893
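The first step of the pipeline, computing each image's information entropy and taking the maximum-entropy image as the reference, can be sketched directly. The synthetic images below are placeholders for real remote-sensing frames.

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an image's grey-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def pick_reference(images):
    """Choose the image with maximum information entropy as the
    reference frame, as in the first step of AMDE-SR."""
    return max(range(len(images)), key=lambda i: image_entropy(images[i]))

rng = np.random.default_rng(1)
flat = np.full((64, 64), 128, dtype=np.uint8)               # almost no information
textured = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # rich grey-level histogram
print(pick_reference([flat, textured]))  # 1
```

A flat image has a single-bin histogram and therefore zero entropy, so the textured frame is selected; on real spatio-temporal image stacks the same criterion favors the frame carrying the most grey-level variety.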
Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations
NASA Technical Reports Server (NTRS)
Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.
2002-01-01
An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, based on unstructured grid technology, and LAURA, an established and validated structured CFD code. As part of this database development, the results for the Mars continuum were validated with experimental data and comparisons made where applicable. The validation of USM3D and LAURA with the Unitary experimental data, the use of intermediate LAURA check analyses, as well as the validation of FELISA with the Mach 6 CF₄ experimental data, provided higher confidence in the ability of CFD to provide aerodynamic data in order to determine the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be unstable or stable trim points. This information is needed to design the guidance controller used throughout the trajectory.
NASA Technical Reports Server (NTRS)
Erickson, Gary E.
2007-01-01
An overview is given of selected measurement techniques used in the NASA Langley Research Center (NASA LaRC) Unitary Plan Wind Tunnel (UPWT) to determine the aerodynamic characteristics of aerospace vehicles operating at supersonic speeds. A broad definition of a measurement technique is adopted in this paper and is any qualitative or quantitative experimental approach that provides information leading to the improved understanding of the supersonic aerodynamic characteristics. On-surface and off-surface measurement techniques used to obtain discrete (point) and global (field) measurements and planar and global flow visualizations are described, and examples of all methods are included. The discussion is limited to recent experiences in the UPWT and is, therefore, not an exhaustive review of existing experimental techniques. The diversity and high quality of the measurement techniques and the resultant data illustrate the capabilities of a ground-based experimental facility and the key role that it plays in the advancement of our understanding, prediction, and control of supersonic aerodynamics.
Broken SU(3) antidecuplet for Θ⁺ and Ξ3/2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pakvasa, Sandip; Suzuki, Mahiko
2004-05-05
If the narrow exotic baryon resonances Θ⁺(1540) and Ξ3/2 are members of the J^P = 1/2⁺ antidecuplet with N*(1710), the octet-antidecuplet mixing is required not only by the mass spectrum but also by the decay pattern of N*(1710). This casts doubt on the validity of the Θ⁺ mass prediction by the chiral soliton model. While all pieces of the existing experimental information point to a small octet-antidecuplet mixing, the magnitude of mixing required by the mass spectrum is not consistent with the value needed to account for the hadronic decay rates. The discrepancy is not resolved even after the large experimental uncertainty is taken into consideration. We fail to find an alternative SU(3) assignment even with a different spin-parity assignment. When we extend the analysis to mixing with a higher SU(3) multiplet, we find one experimentally testable scenario in the case of mixing with a 27-plet.
Enhancing collaborative filtering by user interest expansion via personalized ranking.
Liu, Qi; Chen, Enhong; Xiong, Hui; Ding, Chris H Q; Chen, Jian
2012-02-01
Recommender systems suggest a few items from many possible choices to the users by understanding their past behaviors. In these systems, the user behaviors are influenced by the hidden interests of the users. Learning to leverage the information about user interests is often critical for making better recommendations. However, existing collaborative-filtering-based recommender systems are usually focused on exploiting the information about the user's interaction with the systems; the information about latent user interests is largely underexplored. To that end, inspired by topic models, in this paper, we propose a novel collaborative-filtering-based recommender system by user interest expansion via personalized ranking, named iExpand. The goal is to build an item-oriented model-based collaborative-filtering framework. The iExpand method introduces a three-layer, user-interests-item, representation scheme, which leads to more accurate ranking recommendation results with less computation cost and helps the understanding of the interactions among users, items, and user interests. Moreover, iExpand strategically deals with many issues that exist in traditional collaborative-filtering approaches, such as the overspecialization problem and the cold-start problem. Finally, we evaluate iExpand on three benchmark data sets, and experimental results show that iExpand can lead to better ranking performance than state-of-the-art methods with a significant margin.
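The three-layer user-interests-item representation can be illustrated as two matrices chained by a matrix product, with ranking over unseen items. The matrices below are hand-made assumptions; iExpand learns such representations with a topic-model-style approach rather than taking them as given.

```python
import numpy as np

def rank_items(user_interest, interest_item, seen):
    """Score items through a user -> interests -> items chain (the
    three-layer idea behind iExpand, reduced to a matrix product)
    and rank the unseen items for each user."""
    scores = user_interest @ interest_item  # users x items
    ranked = []
    for u, row in enumerate(scores):
        order = [i for i in np.argsort(-row) if i not in seen.get(u, set())]
        ranked.append(order)
    return ranked

# Hypothetical data: 2 users, 3 latent interests, 4 items
user_interest = np.array([[0.8, 0.2, 0.0],    # user 0 mostly holds interest 0
                          [0.1, 0.1, 0.8]])   # user 1 mostly holds interest 2
interest_item = np.array([[0.9, 0.1, 0.0, 0.0],
                          [0.0, 0.8, 0.2, 0.0],
                          [0.0, 0.0, 0.1, 0.9]])
seen = {0: {0}}  # user 0 already interacted with item 0
print(rank_items(user_interest, interest_item, seen)[0][0])  # 1
```

Routing recommendations through the interest layer is what lets such a model escape overspecialization: items tied to a user's secondary interests still receive nonzero scores even if the user never touched them.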
Structural and Functional Concepts in Current Mouse Phenotyping and Archiving Facilities
Kollmus, Heike; Post, Rainer; Brielmeier, Markus; Fernández, Julia; Fuchs, Helmut; McKerlie, Colin; Montoliu, Lluis; Otaegui, Pedro J; Rebelo, Manuel; Riedesel, Hermann; Ruberte, Jesús; Sedlacek, Radislav; de Angelis, Martin Hrabě; Schughart, Klaus
2012-01-01
Collecting and analyzing available information on the building plans, concepts, and workflow from existing animal facilities is an essential prerequisite for most centers that are planning and designing the construction of a new animal experimental research unit. Here, we have collected and analyzed such information in the context of the European project Infrafrontier, which aims to develop a common European infrastructure for high-throughput systemic phenotyping, archiving, and dissemination of mouse models. A team of experts visited 9 research facilities and 3 commercial breeders in Europe, Canada, the United States, and Singapore. During the visits, detailed data of each facility were collected and subsequently represented in standardized floor plans and descriptive tables. These data showed that because the local needs of scientists and their projects, property issues, and national and regional laws require very specific solutions, a common strategy for the construction of such facilities does not exist. However, several basic concepts were apparent that can be described by standardized floor plans showing the principle functional units and their interconnection. Here, we provide detailed information of how individual facilities addressed their specific needs by using different concepts of connecting the principle units. Our analysis likely will be valuable to research centers that are planning to design new mouse phenotyping and archiving facilities. PMID:23043807
Devasenapathy, Deepa; Kannan, Kathiravan
2015-01-01
Traffic in road networks is progressively increasing. Good knowledge of network traffic can minimize congestion using information pertaining to the road network obtained with the aid of communal callers, pavement detectors, and so on. Using these methods, low-featured information is generated with respect to the user in the road network. Although the existing schemes obtain urban traffic information, they fail to calculate the energy drain rate of nodes and to strike a balance between the overhead and the quality of the routing protocol, which poses a great challenge. Thus, an energy-efficient cluster-based vehicle detection in road network using the intention numeration method (CVDRN-IN) is developed. Initially, sensor nodes that detect a vehicle are grouped into separate clusters. Further, we approximate the strength of the node drain rate for a cluster using a polynomial regression function. In addition, the total node energy is estimated by taking the integral over the area. Finally, enhanced data aggregation is performed to reduce the amount of data transmission using a digital signature tree. The experimental performance is evaluated with the Dodgers loop sensor data set from the UCI repository, and the performance evaluation outperforms existing work on energy consumption, clustering efficiency, and node drain rate. PMID:25793221
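The two energy steps described, fitting the node drain rate with a polynomial regression function and integrating it for total energy, can be sketched with NumPy. The measurements and the polynomial degree are assumptions made for illustration.

```python
import numpy as np

def total_energy_drained(t, rate, degree=2):
    """Fit the sampled node drain rate with a polynomial (the regression
    step described for CVDRN-IN; the degree is an assumption) and
    integrate the fit over the observation window for total energy."""
    coeffs = np.polyfit(t, rate, degree)
    antideriv = np.poly1d(coeffs).integ()
    return antideriv(t[-1]) - antideriv(t[0])

# Hypothetical measurements: a node's drain rate (J/s) sampled every 2 s
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
rate = np.array([0.50, 0.55, 0.62, 0.71, 0.83, 0.96])
print(round(total_energy_drained(t, rate), 2))
```

The fitted polynomial smooths measurement noise before integration, so the energy estimate is less sensitive to a single bad sample than a direct summation of the readings would be.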
DNA as information: at the crossroads between biology, mathematics, physics and chemistry
2016-01-01
On the one hand, biology, chemistry and also physics tell us how the process of translating the genetic information into life could possibly work, but we are still very far from a complete understanding of this process. On the other hand, mathematics and statistics give us methods to describe such natural systems—or parts of them—within a theoretical framework. Also, they provide us with hints and predictions that can be tested at the experimental level. Furthermore, there are peculiar aspects of the management of genetic information that are intimately related to information theory and communication theory. This theme issue is aimed at fostering the discussion on the problem of genetic coding and information through the presentation of different innovative points of view. The aim of the editors is to stimulate discussions and scientific exchange that will lead to new research on why and how life can exist from the point of view of the coding and decoding of genetic information. The present introduction represents the point of view of the editors on the main aspects that could be the subject of future scientific debate. PMID:26857674
Palter, S F
1996-05-01
The modern clinical trial is a form of human experimentation. There is a long history of disregard for individual rights of the patient in this context, and special attention must be paid to ethical guidelines for these studies. Clinical trials differ in basic ways from clinical practice. Foremost is the introduction of outside interests, beyond those of the patient's health, into the doctor-patient therapeutic alliance. Steps must be taken to protect the interests of the patient when such outside influence exists. Kantian moral theory and the Hippocratic oath dictate that the physician must respect the individual patient's rights and hold such interests paramount. These principles are the basis for informed consent. Randomization of patients is justified when a condition of equipoise exists. The changing nature of health care delivery in the United States introduces new outside interests into the doctor-patient relationship.
Light Field Imaging Based Accurate Image Specular Highlight Removal
Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo
2016-01-01
Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity using a light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods on our light field dataset together with the Stanford light field archive verifies the effectiveness of the proposed algorithm. PMID:27253083
From protein sequence to dynamics and disorder with DynaMine.
Cilia, Elisa; Pancsa, Rita; Tompa, Peter; Lenaerts, Tom; Vranken, Wim F
2013-01-01
Protein function and dynamics are closely related; however, accurate dynamics information is difficult to obtain. Here, based on a carefully assembled data set derived from experimental data for proteins in solution, we quantify backbone dynamics properties at the amino-acid level and develop DynaMine, a fast, high-quality predictor of protein backbone dynamics. DynaMine uses only protein sequence information as input and shows great potential in distinguishing regions of different structural organization, such as folded domains, disordered linkers, molten globules and pre-structured binding motifs of different sizes. It also identifies disordered regions within proteins with an accuracy comparable to the most sophisticated existing predictors, without depending on prior disorder knowledge or three-dimensional structural information. DynaMine provides molecular biologists with an important new method that grasps the dynamical characteristics of any protein of interest, as we show here for human p53 and E1A from human adenovirus 5.
Robust bidirectional links for photonic quantum networks
Xu, Jin-Shi; Yung, Man-Hong; Xu, Xiao-Ye; Tang, Jian-Shun; Li, Chuan-Feng; Guo, Guang-Can
2016-01-01
Optical fibers are widely used as one of the main tools for transmitting not only classical but also quantum information. We propose and report an experimental realization of a promising method for creating robust bidirectional quantum communication links through paired optical polarization-maintaining fibers. Many limitations of existing protocols can be avoided with the proposed method. In particular, the path and polarization degrees of freedom are combined to deterministically create a photonic decoherence-free subspace without the need for any ancillary photon. This method is input state–independent, robust against dephasing noise, postselection-free, and applicable bidirectionally. To rigorously quantify the amount of quantum information transferred, the optical fibers are analyzed with the tools developed in quantum communication theory. These results not only suggest a practical means for protecting quantum information sent through optical quantum networks but also potentially provide a new physical platform for enriching the structure of the quantum communication theory. PMID:26824069
NASA Astrophysics Data System (ADS)
Rimal, Dipak
The electromagnetic form factors are the most fundamental observables that encode information about the internal structure of the nucleon. The electric (GE) and the magnetic (GM) form factors contain information about the spatial distribution of the charge and magnetization inside the nucleon. A significant discrepancy exists between the Rosenbluth and the polarization transfer measurements of the electromagnetic form factors of the proton. One possible explanation for the discrepancy is the contribution of two-photon exchange (TPE) effects. Theoretical calculations estimating the magnitude of the TPE effect are highly model dependent, and limited experimental evidence for such effects exists. Experimentally, the TPE effect can be measured by comparing the ratio of the positron-proton elastic scattering cross section to that of the electron-proton [R = sigma(e+p)/sigma(e-p)]. The ratio R was measured over a wide range of kinematics, utilizing a 5.6 GeV primary electron beam produced by the Continuous Electron Beam Accelerator Facility (CEBAF) at Jefferson Lab. This dissertation explored the dependence of R on kinematic variables such as the squared four-momentum transfer (Q2) and the virtual photon polarization parameter (epsilon). A mixed electron-positron beam was produced from the primary electron beam in experimental Hall B. The mixed beam was scattered from a liquid hydrogen (LH2) target. Both the scattered lepton and the recoil proton were detected by the CEBAF Large Acceptance Spectrometer (CLAS). The elastic events were then identified by using elastic scattering kinematics. This work extracted the Q2 dependence of R at high epsilon (epsilon > 0.8) and the epsilon dependence of R at Q2 approx 0.85 GeV2. In these kinematics, our data confirm the validity of the hadronic calculations of the TPE effect by Blunden, Melnitchouk, and Tjon.
This hadronic TPE effect, with additional corrections contributed by higher excitations of the intermediate state nucleon, largely reconciles the Rosenbluth and the polarization transfer measurements of the electromagnetic form factors.
Fiber-MZI-based FBG sensor interrogation: comparative study with a CCD spectrometer.
Das, Bhargab; Chandra, Vikash
2016-10-10
We present an experimental comparative study of the two most commonly used fiber Bragg grating (FBG) sensor interrogation techniques: a charge-coupled device (CCD) spectrometer and a fiber Mach-Zehnder interferometer (F-MZI). Although the interferometric interrogation technique is historically known to offer the highest sensitivity measurements, very little information exists regarding how it compares with the current commercially available spectral-characteristics-based interrogation systems. It is experimentally established here that the performance of a modern-day CCD spectrometer interrogator is very close to a F-MZI interrogator with the capability of measuring Bragg wavelength shifts with sub-picometer-level accuracy. The results presented in this research study can further be used as a guideline for choosing between the two FBG sensor interrogator types for small-amplitude dynamic perturbation measurements down to nano-level strain.
Conformational flexibility and packing plausibility of repaglinide polymorphs
NASA Astrophysics Data System (ADS)
Rani, Dimpy; Goyal, Parnika; Chadha, Renu
2018-04-01
The present manuscript highlights structural insight into the repaglinide (RPG) polymorphs. Experimental screening for possible crystal forms was carried out using various solvents, which generated three forms. The crystal structures of Forms II and III were determined from PXRD patterns, whereas a structural analysis of Form I has already been reported. Forms I, II and III were found to exist in the P212121, Pna21 and P21/c space groups, respectively. A conformational analysis was performed to account for the conformational flexibility of RPG. The obtained conformers were further utilized to obtain information about the crystal packing of the RPG polymorphs using a polymorph prediction module. A lattice energy landscape, depicting the relationship between the lattice energy and density of the polymorphs, has been obtained for the various possible polymorphs. The experimentally isolated polymorphs were successfully fitted into this lattice energy landscape.
ASSESSMENT OF THE POTENTIAL FOR TRANSPORT OF ...
Dioxins are very toxic contaminants and warrant study under a variety of experimental conditions. Studies were performed to evaluate the mobility of several of the dioxins in both soil columns and batch experiments. The studies showed that the degree of chlorination did not necessarily control the partitioning of the dioxins, as expected, but also suggested that the structure, or the location where the Cl ion was attached to the benzene ring, modified the hydrophobicity of the compound. Studies were performed with a variety of cosolvents that might mediate the movement of the dioxins. The observed modification in mobility was consistent with existing theory for enhanced mobility with truly miscible solvents. Experimental data appear to show reversibility in the sorption process, but one significantly limited by kinetics, with 30 to 50 days required to release 50-90% of the contaminant.
Ireland, Kathryn B; Hansen, Andrew J; Keane, Robert E; Legg, Kristin; Gump, Robert L
2018-06-01
Natural resource managers face the need to develop strategies to adapt to projected future climates. Few existing climate adaptation frameworks prescribe where to place management actions to be most effective under anticipated future climate conditions. We developed an approach to spatially allocate climate adaptation actions and applied the method to whitebark pine (WBP; Pinus albicaulis) in the Greater Yellowstone Ecosystem (GYE). WBP is expected to be vulnerable to climate-mediated shifts in suitable habitat, pests, pathogens, and fire. We spatially prioritized management actions aimed at mitigating climate impacts to WBP under two management strategies: (1) current management and (2) climate-informed management. The current strategy reflected management actions permissible under existing policy and access constraints. Our goal was to understand how consideration of climate might alter the placement of management actions, so the climate-informed strategies did not include these constraints. The spatial distribution of actions differed among the current and climate-informed management strategies, with 33-60% more wilderness area prioritized for action under climate-informed management. High priority areas for implementing management actions include the 1-8% of the GYE where current and climate-informed management agreed, since this is where actions are most likely to be successful in the long-term and where current management permits implementation. Areas where climate-informed strategies agreed with one another but not with current management (6-22% of the GYE) are potential locations for experimental testing of management actions. Our method for spatial climate adaptation planning is applicable to any species for which information regarding climate vulnerability and climate-mediated risk factors is available.
Real-time 3D video compression for tele-immersive environments
NASA Astrophysics Data System (ADS)
Yang, Zhenyu; Cui, Yi; Anwar, Zahid; Bocchino, Robert; Kiyanclar, Nadir; Nahrstedt, Klara; Campbell, Roy H.; Yurcik, William
2006-01-01
Tele-immersive systems can improve productivity and aid communication by allowing distributed parties to exchange information via a shared immersive experience. The TEEVE research project at the University of Illinois at Urbana-Champaign and the University of California at Berkeley seeks to foster the development and use of tele-immersive environments by a holistic integration of existing components that capture, transmit, and render three-dimensional (3D) scenes in real time to convey a sense of immersive space. However, the transmission of 3D video poses significant challenges. First, it is bandwidth-intensive, as it requires the transmission of multiple large-volume 3D video streams. Second, existing schemes for 2D color video compression such as MPEG, JPEG, and H.263 cannot be applied directly because the 3D video data contains depth as well as color information. Our goal is to explore a different region of the 3D compression design space, considering factors including complexity, compression ratio, quality, and real-time performance. To investigate these trade-offs, we present and evaluate two simple 3D compression schemes. For the first scheme, we use color reduction to compress the color information, which we then compress along with the depth information using zlib. For the second scheme, we use motion JPEG to compress the color information and run-length encoding followed by Huffman coding to compress the depth information. We apply both schemes to 3D videos captured from a real tele-immersive environment. Our experimental results show that: (1) the compressed data preserves enough information to communicate the 3D images effectively (min. PSNR > 40) and (2) even without inter-frame motion estimation, very high compression ratios (avg. > 15) are achievable at speeds sufficient to allow real-time communication (avg. ~ 13 ms per 3D video frame).
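The depth stage of the second scheme exploits the large constant-depth regions typical of 3D video. A toy run-length encoder/decoder on a single quantized scanline illustrates the idea; the values are hypothetical, and the real pipeline follows the run-length pass with Huffman coding over full frames:

```python
def rle_encode(depths):
    """Collapse consecutive equal depth values into [value, run-length] pairs."""
    runs = []
    for d in depths:
        if runs and runs[-1][0] == d:
            runs[-1][1] += 1
        else:
            runs.append([d, 1])
    return runs

def rle_decode(runs):
    """Invert rle_encode: expand each run back into repeated values."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# One quantized depth scanline with the flat regions typical of 3D video frames.
scanline = [7, 7, 7, 7, 9, 9, 3, 3, 3, 3, 3]
runs = rle_encode(scanline)
print(runs)  # [[7, 4], [9, 2], [3, 5]]
assert rle_decode(runs) == scanline
```

Eleven depth samples compress to three runs here; the subsequent Huffman stage would then shorten the codes for the most frequent run values.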
Children's difficulties handling dual identity.
Apperly, I A; Robinson, E J
2001-04-01
Thirty-nine 6-year-old children participated in a longitudinal study using tasks that required handling of dual identity. Pre- and posttest sessions employed tasks involving a protagonist who was partially informed about an object or person; for example, he knew an item as a ball but not as a present. Children who judged correctly that the protagonist did not know the ball was a present (thereby demonstrating some understanding of the consequences of limited information access), often judged incorrectly (1) that he knew that there was a present in the box, and (2) that he would search as if fully informed. Intervening sessions added contextual support and tried to clarify the experimenter's communicative intentions in a range of ways. Despite signs of general improvement, the distinctive pattern of errors persisted in every case. These findings go beyond previous studies of children's handling of limited information access, and are hard to accommodate within existing accounts of developing understanding of the mind. Copyright 2001 Academic Press.
Saccharomyces genome database informs human biology.
Skrzypek, Marek S; Nash, Robert S; Wong, Edith D; MacPherson, Kevin A; Hellerstedt, Sage T; Engel, Stacia R; Karra, Kalpana; Weng, Shuai; Sheppard, Travis K; Binkley, Gail; Simison, Matt; Miyasato, Stuart R; Cherry, J Michael
2018-01-04
The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org) is an expertly curated database of literature-derived functional information for the model organism budding yeast, Saccharomyces cerevisiae. SGD constantly strives to synergize new types of experimental data and bioinformatics predictions with existing data, and to organize them into a comprehensive and up-to-date information resource. The primary mission of SGD is to facilitate research into the biology of yeast and to provide this wealth of information to advance, in many ways, research on other organisms, even those as evolutionarily distant as humans. To build such a bridge between biological kingdoms, SGD is curating data regarding yeast-human complementation, in which a human gene can successfully replace the function of a yeast gene, and/or vice versa. These data are manually curated from published literature, made available for download, and incorporated into a variety of analysis tools provided by SGD. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
A Low-Storage-Consumption XML Labeling Method for Efficient Structural Information Extraction
NASA Astrophysics Data System (ADS)
Liang, Wenxin; Takahashi, Akihiro; Yokota, Haruo
Recently, labeling methods that extract and reconstruct the structural information of XML data, which is important for many applications such as XPath queries and keyword search, have been attracting increasing attention. To achieve efficient structural information extraction, in this paper we propose the C-DO-VLEI code, a novel update-friendly bit-vector encoding scheme based on register-length bit operations combined with the properties of Dewey Order numbers, which cannot be implemented in other relevant existing schemes such as ORDPATH. Meanwhile, the proposed method also achieves lower storage consumption because it requires neither a prefix schema nor any reserved codes for node insertion. We performed experiments to evaluate and compare the performance and storage consumption of the proposed method with those of the ORDPATH method. Experimental results show that the execution times for extracting depth information and parent node labels using the C-DO-VLEI code are about 25% and 15% less, respectively, and the average label size using the C-DO-VLEI code is about 24% smaller, compared with ORDPATH.
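The depth and parent-label extraction mentioned above works because Dewey Order labels encode the full ancestor path. A minimal illustration in the human-readable dotted-decimal form follows; C-DO-VLEI itself stores a compressed bit-vector encoding rather than these strings, so this is only a sketch of the principle:

```python
# Dewey Order labels encode the ancestor path directly, so structural
# queries reduce to simple string operations on the label.
def depth(label):
    """Depth of the node: one more than the number of separators."""
    return label.count(".") + 1

def parent(label):
    """Label of the parent node, or None for the root."""
    head, _, _ = label.rpartition(".")
    return head or None

label = "1.3.2.5"
print(depth(label))   # 4
print(parent(label))  # 1.3.2
print(parent("1"))    # None
```

Because both operations read only the label itself, no document access or join is needed, which is what makes Dewey-style schemes attractive for structural extraction.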
The Cl + O3 reaction: a detailed QCT simulation of molecular beam experiments.
Menéndez, M; Castillo, J F; Martínez-Haya, B; Aoiz, F J
2015-10-14
We have studied in detail the dynamics of the Cl + O3 reaction in the 1-56 kcal mol(-1) collision energy range using quasi-classical trajectory (QCT) calculations on a recent potential energy surface (PES) [J. F. Castillo et al., Phys. Chem. Chem. Phys., 2011, 13, 8537]. The main goal of this work has been to assess the accuracy of the PES and the reliability of the QCT method by comparison with the existing crossed molecular beam results [J. Zhang and Y. T. Lee J. Phys. Chem. A, 1997, 101, 6485]. For this purpose, we have developed a methodology that allows us to determine the experimental observables in crossed molecular beam experiments (integral and differential cross sections, recoil velocity distributions, scattering angle-recoil velocity polar maps, etc.) as continuous functions of the collision energy. Using these distributions, raw experimental data in the laboratory frame (angular distributions and time-of-flight spectra) have been simulated from first principles with the sole information on the instrumental parameters and taking into account the energy spread. A general good agreement with the experimental data has been found, thereby demonstrating the adequacy of the QCT method and the quality of the PES to describe the dynamics of this reaction at the level of resolution of the existing crossed beam experiments. Some features which are apparent in the differential cross sections have also been analysed in terms of the dynamics of the reaction and its evolution with the collision energy.
NASA Astrophysics Data System (ADS)
Kalwarczyk, Tomasz; Sozanski, Krzysztof; Jakiela, Slawomir; Wisniewska, Agnieszka; Kalwarczyk, Ewelina; Kryszczuk, Katarzyna; Hou, Sen; Holyst, Robert
2014-08-01
We propose a scaling equation describing transport properties (diffusion and viscosity) in solutions of colloidal particles. We apply the equation to 23 different systems including colloids and proteins differing in size (range of diameters: 4 nm to 1 μm) and volume fraction (10-3-0.56). In the solutions under study, colloids/proteins interact via steric, hydrodynamic, van der Waals and/or electrostatic interactions. We incorporate the contributions of those interactions into the scaling law. Finally, we use our scaling law together with literature values of the barrier for nucleation to predict crystal nucleation rates of hard-sphere-like colloids. The resulting crystal nucleation rates agree with existing experimental data. Electronic supplementary information (ESI) available: Experimental and some analysis details. See DOI: 10.1039/c4nr00647j
Broadcasting a Lab Measurement over Existing Conductor Networks
ERIC Educational Resources Information Center
Knipp, Peter A.
2009-01-01
Students learn about physical laws and the scientific method when they analyze experimental data in a laboratory setting. Three common sources exist for the experimental data that they analyze: (1) "hands-on" measurements by the students themselves, (2) electronic transfer (by downloading a spreadsheet, video, or computer-aided data-acquisition…
Laboratory Spectroscopy of Ices of Astrophysical Interest
NASA Technical Reports Server (NTRS)
Hudson, Reggie; Moore, M. H.
2011-01-01
Ongoing and future NASA and ESA astronomy missions need detailed information on the spectra of a variety of molecular ices to help establish the identity and abundances of molecules observed in astronomical data. Examples of condensed-phase molecules already detected on cold surfaces include H2O, CO, CO2, N2, NH3, CH4, SO2, O2, and O3. In addition, strong evidence exists for the solid-phase nitriles HCN, HC3N, and C2N2 in Titan's atmosphere. The wavelength region over which these identifications have been made is roughly 0.5 to 100 micron. Searches for additional features of complex carbon-containing species are in progress. Existing and future observations often impose special requirements on the information that comes from the laboratory. For example, the measurement of spectra, determination of integrated band strengths, and extraction of complex refractive indices of ices (and icy mixtures) in both amorphous and crystalline phases at relevant temperatures are all important tasks. In addition, the determination of the index of refraction of amorphous and crystalline ices in the visible region is essential for the extraction of infrared optical constants. Similarly, the measurement of spectra of ions and molecules embedded in relevant ices is important. This laboratory review will examine some of the existing experimental work and capabilities in these areas along with what more may be needed to meet current and future NASA and ESA planetary needs.
Indexes of the proceedings for the nine symposia (international) on detonation, 1951--89
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crane, S.L.; Deal, W.E.; Ramsay, J.B.
1993-01-01
The Proceedings of the nine Detonation Symposia have become the major archival source of information on international research in explosive phenomenology, theory, experimental techniques, numerical modeling, and high-rate reaction chemistry. In many cases, they contain the original reference or the only reference to major progress in the field. For some papers, the information is more complete than the complementary article appearing in a formal journal, yet for others, authors elected to publish only an abstract in the Proceedings. For the large majority of papers, the Symposia Proceedings provide the only published reference to a body of work. This report indexes the nine existing Proceedings of the Detonation Symposia by paper titles, topic phrases, authors, and first appearance of acronyms and code names.
Modelling of the Thermo-Physical and Physical Properties for Solidification of Al-Alloys
NASA Astrophysics Data System (ADS)
Saunders, N.; Li, X.; Miodownik, A. P.; Schillé, J.-P.
The thermo-physical and physical properties of the liquid and solid phases are critical components in casting simulations. Such properties include the fraction solid transformed, enthalpy release, thermal conductivity, volume and density, all as a function of temperature. Because of the difficulty of experimentally determining such properties at solidification temperatures, little information exists for multi-component alloys. As part of the development of a new computer program for modelling materials properties (JMatPro), extensive work has been carried out on the development of sound, physically based models for these properties. Wide-ranging results will be presented for Al-based alloys, including more detailed information concerning the density change of the liquid that intrinsically occurs during solidification due to its change in composition.
A class-based link prediction using Distance Dependent Chinese Restaurant Process
NASA Astrophysics Data System (ADS)
Andalib, Azam; Babamir, Seyed Morteza
2016-08-01
One of the important tasks in relational data analysis is link prediction which has been successfully applied on many applications such as bioinformatics, information retrieval, etc. The link prediction is defined as predicting the existence or absence of edges between nodes of a network. In this paper, we propose a novel method for link prediction based on Distance Dependent Chinese Restaurant Process (DDCRP) model which enables us to utilize the information of the topological structure of the network such as shortest path and connectivity of the nodes. We also propose a new Gibbs sampling algorithm for computing the posterior distribution of the hidden variables based on the training data. Experimental results on three real-world datasets show the superiority of the proposed method over other probabilistic models for link prediction problem.
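The DDCRP prior underlying the model can be sketched generatively: each node draws a link to another node with probability proportional to a decaying function of their distance (or to itself with weight alpha), and clusters are the connected components of the resulting link graph. The distance matrix, decay, and alpha below are illustrative assumptions, not values from the paper, and the paper's Gibbs sampler for the posterior is not shown:

```python
import math
import random

def ddcrp_clusters(dist, alpha=1.0, decay=1.0, seed=0):
    """Sample one link per node from a DDCRP prior; return cluster labels."""
    rng = random.Random(seed)
    n = len(dist)
    # Node i links to node j with weight exp(-decay * d_ij), or to itself
    # with weight alpha; clusters are connected components of the link graph.
    links = []
    for i in range(n):
        weights = [alpha if j == i else math.exp(-decay * dist[i][j])
                   for j in range(n)]
        links.append(rng.choices(range(n), weights=weights)[0])
    # Union-find over the sampled links to extract connected components.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in enumerate(links):
        parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

# Two well-separated groups of nodes (hypothetical shortest-path distances).
dist = [
    [0, 1, 1, 9, 9],
    [1, 0, 1, 9, 9],
    [1, 1, 0, 9, 9],
    [9, 9, 9, 0, 1],
    [9, 9, 9, 1, 0],
]
labels = ddcrp_clusters(dist, alpha=1.0, decay=3.0, seed=0)
print(labels)
```

With this decay, cross-group link weights are vanishingly small, so the two groups essentially never share a cluster label; this is how topological distance information shapes the clustering.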
Live face detection based on the analysis of Fourier spectra
NASA Astrophysics Data System (ADS)
Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.
2004-08-01
Biometrics is a rapidly developing technology that identifies a person based on his or her physiological or behavioral characteristics. To ensure correct authentication, a biometric system must be able to detect and reject the use of a copy of a biometric trait instead of the live trait. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using the structure and movement information of a live face, an effective live face detection algorithm is presented. In contrast to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of the Fourier spectra of a single face image or of face image sequences. Experimental results show that the proposed method has an encouraging performance.
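A minimal descriptor in the spirit of the paper's spectral analysis: photographs and screen replays of faces tend to lose fine detail, so the fraction of spectral energy outside a low-frequency disc can serve as a crude liveness cue. The test images, disc radius, and blur below are illustrative assumptions, not the authors' actual features or parameters:

```python
import numpy as np

def high_frequency_ratio(image, radius_frac=0.1):
    """Fraction of spectral power outside a low-frequency disc around DC."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(spectrum) ** 2
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    r2 = (yy - h // 2) ** 2 + (xx - w // 2) ** 2
    low = r2 <= (radius_frac * min(h, w)) ** 2
    return float(power[~low].sum() / power.sum())

# A sharp random texture versus a locally averaged (blurred) copy of it:
# blurring suppresses high frequencies, mimicking the detail loss of a recapture.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp
           + np.roll(sharp, 1, axis=0) + np.roll(sharp, -1, axis=0)
           + np.roll(sharp, 1, axis=1) + np.roll(sharp, -1, axis=1)) / 5.0
print(high_frequency_ratio(sharp) > high_frequency_ratio(blurred))  # True
```

Thresholding such a ratio would then separate likely recaptures from live input; the actual method combines this kind of spectral evidence with temporal variation across image sequences.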
Qu, Jianfeng; Ouyang, Dantong; Hua, Wen; Ye, Yuxin; Li, Ximing
2018-04-01
Distant supervision for neural relation extraction is an efficient approach to extracting massive relations with reference to plain texts. However, the existing neural methods fail to capture the critical words in sentence encoding and meanwhile lack useful sentence information for some positive training instances. To address the above issues, we propose a novel neural relation extraction model. First, we develop a word-level attention mechanism to distinguish the importance of each individual word in a sentence, increasing the attention weights for those critical words. Second, we investigate the semantic information from word embeddings of target entities, which can be developed as a supplementary feature for the extractor. Experimental results show that our model outperforms previous state-of-the-art baselines. Copyright © 2018 Elsevier Ltd. All rights reserved.
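Word-level attention of the kind described follows the generic softmax-attention pattern: score each word, normalize the scores, and pool the word embeddings by those weights into a sentence representation. In this minimal sketch the per-word relevance scores are given directly; in the paper they would come from a learned compatibility function, which is stubbed out here.

```python
import math

def word_attention(word_scores):
    """Numerically stable softmax over per-word relevance scores."""
    m = max(word_scores)
    exps = [math.exp(s - m) for s in word_scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(word_vectors, weights):
    """Weighted sum of word embeddings -> sentence representation."""
    dim = len(word_vectors[0])
    return [sum(w * vec[d] for w, vec in zip(weights, word_vectors))
            for d in range(dim)]

scores = [0.1, 2.0, 0.1]          # middle word flagged as critical
weights = word_attention(scores)
assert weights[1] > weights[0]    # critical word receives more attention
assert abs(sum(weights) - 1.0) < 1e-9
sent = attend([[1, 0], [0, 1], [1, 0]], weights)
assert sent[1] == weights[1]      # critical word dominates that dimension
```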
Li, Yuankun; Xu, Tingfa; Deng, Honggao; Shi, Guokai; Guo, Jie
2018-02-23
Although correlation filter (CF)-based visual tracking algorithms have achieved appealing results, there are still some problems to be solved. When the target object goes through long-term occlusions or scale variation, the correlation model used in existing CF-based algorithms will inevitably learn some non-target information or partial-target information. In order to avoid model contamination and enhance the adaptability of model updating, we introduce the keypoints matching strategy and adjust the model learning rate dynamically according to the matching score. Moreover, the proposed approach extracts convolutional features from a deep convolutional neural network (DCNN) to accurately estimate the position and scale of the target. Experimental results demonstrate that the proposed tracker has achieved satisfactory performance in a wide range of challenging tracking scenarios.
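The adaptive update idea can be sketched as follows: scale the correlation-model learning rate by the keypoint matching score and freeze updates entirely when matching fails, which suggests occlusion. The threshold, base learning rate, and linear scaling are illustrative assumptions, not the paper's tuned values.

```python
def adaptive_learning_rate(match_score, base_lr=0.02, threshold=0.5):
    """Return a model learning rate scaled by keypoint match confidence."""
    if match_score < threshold:
        return 0.0                  # occlusion suspected: do not contaminate the model
    return base_lr * match_score    # confident match: update proportionally

model = 10.0
# (new observation, keypoint matching score) per frame
observations = [(10.0, 0.9), (50.0, 0.1), (11.0, 0.8)]
for obs, score in observations:
    lr = adaptive_learning_rate(score)
    model = (1 - lr) * model + lr * obs  # standard linear-interpolation CF update
# the occluded frame (score 0.1, outlier value 50.0) left the model untouched
assert 9.5 < model < 12.0
```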
Performance of device-independent quantum key distribution
NASA Astrophysics Data System (ADS)
Cao, Zhu; Zhao, Qi; Ma, Xiongfeng
2016-07-01
Quantum key distribution provides information-theoretically-secure communication. In practice, device imperfections may jeopardise the system security. Device-independent quantum key distribution solves this problem by providing secure keys even when the quantum devices are untrusted and uncharacterized. Following a recent security proof of the device-independent quantum key distribution, we improve the key rate by tightening the parameter choice in the security proof. In practice where the system is lossy, we further improve the key rate by taking into account the loss position information. From our numerical simulation, our method can outperform existing results. Meanwhile, we outline clear experimental requirements for implementing device-independent quantum key distribution. The maximal tolerable error rate is 1.6%, the minimal required transmittance is 97.3%, and the minimal required visibility is 96.8%.
Indexes of the Proceedings for the Ten International Symposia on Detonation 1951-93
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deal, William E.; Ramsay, John B.; Roach, Alita M.
1998-09-01
The Proceedings of the ten Detonation Symposia have become the major archival source of information of international research in explosive phenomenology, theory, experimental techniques, numerical modeling, and high-rate reaction chemistry. In many cases, they contain the original reference or the only reference to major progress in the field. For some papers, the information is more complete than the complementary article appearing in a formal journal; yet for others, authors elected to publish only an abstract in the Proceedings. For the large majority of papers, the Symposia Proceedings provide the only published reference to a body of work. This report indexes the ten existing Proceedings of the Detonation Symposia by paper titles, topic phrases, authors, and first appearance of acronyms and code names.
Twisted Acoustics: Metasurface-Enabled Multiplexing and Demultiplexing.
Jiang, Xue; Liang, Bin; Cheng, Jian-Chun; Qiu, Cheng-Wei
2018-05-01
Metasurfaces are used to enable acoustic orbital angular momentum (a-OAM)-based multiplexing in real-time, postprocess-free, and sensor-scanning-free fashions to improve the bandwidth of acoustic communication, with intrinsic compatibility and expandability to cooperate with other multiplexing schemes. The metasurface-based communication relying on encoding information onto twisted beams is numerically and experimentally demonstrated by realizing real-time picture transfer, which differs from existing static data transfer by encoding data onto OAM states. With the advantages of real-time transmission, passive and instantaneous data decoding, vanishingly low loss, compact size, and high transmitting accuracy, the study of a-OAM-based information transfer with metasurfaces offers a new route to boosting the capacity of acoustic communication and great potential to profoundly advance relevant fields. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automated tracking and quantification of angiogenic vessel formation in 3D microfluidic devices.
Wang, Mengmeng; Ong, Lee-Ling Sharon; Dauwels, Justin; Asada, H Harry
2017-01-01
Angiogenesis, the growth of new blood vessels from pre-existing vessels, is a critical step in cancer invasion. Better understanding of the angiogenic mechanisms is required to develop effective antiangiogenic therapies for cancer treatment. We culture angiogenic vessels in 3D microfluidic devices under different Sphingosine-1-phosphate (S1P) conditions and develop an automated vessel formation tracking system (AVFTS) to track the angiogenic vessel formation and extract quantitative vessel information from the experimental time-lapse phase contrast images. The proposed AVFTS first preprocesses the experimental images, then applies a distance transform and an augmented fast marching method in skeletonization, and finally implements the Hungarian method in branch tracking. When applying the AVFTS to our experimental data, we achieve 97.3% precision and 93.9% recall by comparing with the ground truth obtained from manual tracking by visual inspection. This system enables biologists to quantitatively compare the influence of different growth factors. Specifically, we conclude that the positive S1P gradient increases cell migration and vessel elongation, leading to a higher probability for branching to occur. The AVFTS is also applicable to distinguish tip and stalk cells by considering the relative cell locations in a branch. Moreover, we generate a novel type of cell lineage plot, which not only provides cell migration and proliferation histories but also demonstrates cell phenotypic changes and branch information.
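The branch-tracking step is an assignment problem between branch tips in consecutive frames. A toy version is sketched below using brute-force enumeration, which is equivalent to the Hungarian method for the small branch counts typical of a single device; the Euclidean tip-displacement cost is an assumption standing in for the paper's full cost model.

```python
import math
from itertools import permutations

def match_branches(prev_tips, curr_tips):
    """Associate branch tips across consecutive frames by minimising
    total tip displacement (brute force; assumes len(curr) >= len(prev))."""
    cost = [[math.dist(p, c) for c in curr_tips] for p in prev_tips]
    best, best_perm = float("inf"), None
    for perm in permutations(range(len(curr_tips)), len(prev_tips)):
        total = sum(cost[i][j] for i, j in enumerate(perm))
        if total < best:
            best, best_perm = total, perm
    return list(enumerate(best_perm)), best

prev_tips = [(0, 0), (10, 10)]
curr_tips = [(11, 9), (1, 1)]   # tips moved slightly between frames
pairs, total = match_branches(prev_tips, curr_tips)
assert pairs == [(0, 1), (1, 0)]  # tip 0 -> (1, 1), tip 1 -> (11, 9)
```

For larger branch counts one would swap the enumeration for a polynomial-time assignment solver such as SciPy's `linear_sum_assignment`, keeping the same cost matrix.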
NASA Technical Reports Server (NTRS)
Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.
1987-01-01
The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how the time and time-averages are treated, as well as on how the space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful, and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working up a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enables one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
...; Experimental Study: Disease Information in Branded Promotional Material. AGENCY: Food and Drug Administration ... collection of information entitled ``Experimental Study: Disease Information in Branded Promotional Material'' ...
The use of clinical trials in comparative effectiveness research on mental health
Blanco, Carlos; Rafful, Claudia; Olfson, Mark
2013-01-01
Objectives A large body of research on comparative effectiveness research (CER) focuses on the use of observational and quasi-experimental approaches. We sought to examine the use of clinical trials as a tool for CER, particularly in mental health. Study Design and Setting Examination of three ongoing randomized clinical trials in psychiatry that address issues that would pose difficulties for non-experimental CER methods. Results Existing statistical approaches to non-experimental data appear insufficient to compensate for biases that may arise when the pattern of missing data cannot be properly modeled, such as when there are no standards for treatment, when affected populations have limited access to treatment, or when there are high rates of treatment dropout. Conclusions Clinical trials should retain an important role in CER, particularly in cases of high disorder prevalence, large expected effect sizes, difficult-to-reach populations, or when examining sequential treatments or stepped-care algorithms. Progress in CER in mental health will require careful consideration of the appropriate selection between clinical trials and non-experimental designs and of the allocation of research resources to optimally inform key treatment decisions for each individual patient. PMID:23849150
Data base for the prediction of inlet external drag
NASA Technical Reports Server (NTRS)
Mcmillan, O. J.; Perkins, E. W.; Perkins, S. C., Jr.
1980-01-01
Results are presented from a study to define and evaluate the data base for predicting an airframe/propulsion system interference effect shown to be of considerable importance, inlet external drag. The study is focused on supersonic tactical aircraft with highly integrated jet propulsion systems, although some information is included for supersonic strategic aircraft and for transport aircraft designed for high subsonic or low supersonic cruise. The data base for inlet external drag is considered to consist of the theoretical and empirical prediction methods as well as the experimental data identified in an extensive literature search. The state of the art in the subsonic and transonic speed regimes is evaluated. The experimental data base is organized and presented in a series of tables in which the test article, the quantities measured and the ranges of test conditions covered are described for each set of data; in this way, the breadth of coverage and gaps in the existing experimental data are evident. Prediction methods are categorized by method of solution, type of inlet and speed range to which they apply, major features are given, and their accuracy is assessed by means of comparison to experimental data.
Experimentally superposing two pure states with partial prior knowledge
NASA Astrophysics Data System (ADS)
Li, Keren; Long, Guofei; Katiyar, Hemant; Xin, Tao; Feng, Guanru; Lu, Dawei; Laflamme, Raymond
2017-02-01
Superposition, arguably the most fundamental property of quantum mechanics, lies at the heart of quantum information science. However, how to create the superposition of any two unknown pure states remains a daunting challenge. Recently, it was proved that such a quantum protocol does not exist if the two input states are completely unknown, whereas a probabilistic protocol is still available with some prior knowledge about the input states [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403]. The knowledge is that both input states have nonzero overlaps with some given referential state. In this work, we experimentally realize the probabilistic protocol of superposing two pure states in a three-qubit nuclear magnetic resonance system. We demonstrate the feasibility of the protocol by preparing a family of input states, and the average fidelity between the prepared state and the expected superposition state is over 99%. Moreover, we experimentally illustrate the limitation of the protocol: it is likely to fail or to yield very low fidelity if the nonzero overlaps approach zero. Our experimental implementation can be extended to more complex situations and other quantum systems.
Hughes, Alicia M; Gordon, Rola; Chalder, Trudie; Hirsch, Colette R; Moss-Morris, Rona
2016-11-01
There is an abundance of research into cognitive processing biases in clinical psychology, including the potential for applying cognitive bias modification techniques to assess the causal role of biases in maintaining anxiety and depression. Within the health psychology field, there is burgeoning interest in applying these experimental methods to assess potential cognitive biases in relation to physical health conditions and health-related behaviours. Experimental research in these areas could inform theoretical development by enabling measurement of implicit cognitive processes that may underlie unhelpful illness beliefs and help drive health-related behaviours. However, to date, there has been no systematic approach to adapting existing experimental paradigms for use within physical health research. Many studies fail to report how materials were developed for the population of interest or have used untested materials developed ad hoc. The lack of a protocol for developing stimulus specificity has contributed to large heterogeneity in methodologies and findings. In this article, we emphasize the need for standardized methods for stimuli development and replication in experimental work, particularly as it extends beyond its original anxiety and depression scope to other physical conditions. We briefly describe the paradigms commonly used to assess cognitive biases in attention and interpretation and then describe the steps involved in comprehensive, robust stimuli development for attention and interpretation paradigms, using illustrative examples from two conditions: chronic fatigue syndrome and breast cancer. This article highlights the value of performing rigorous stimuli development and provides tools to help researchers engage in this process. We believe this work is worthwhile for establishing a body of high-quality and replicable experimental research within the health psychology literature. Statement of contribution What is already known on this subject? 
Cognitive biases (e.g., tendencies to attend to negative information and/or interpret ambiguous information in negative ways) have a causal role in maintaining anxiety and depression. There is mixed evidence of cognitive biases in physical health conditions and chronic illness; one reason for this may be the heterogeneous stimuli used to assess attention and interpretation biases in these conditions. What does this study add? Steps for comprehensive/robust stimuli development for attention and interpretation paradigms are presented. Illustrative examples are provided from two conditions: chronic fatigue syndrome and breast cancer. We provide tools to help researchers develop condition-specific materials for experimental studies. © 2016 The British Psychological Society.
Maier, Dieter; Kalus, Wenzel; Wolff, Martin; Kalko, Susana G; Roca, Josep; Marin de Mas, Igor; Turan, Nil; Cascante, Marta; Falciani, Francesco; Hernandez, Miguel; Villà-Freixa, Jordi; Losko, Sascha
2011-03-05
To enhance our understanding of complex biological systems like diseases we need to put all of the available data into context and use this to detect relations, pattern and rules which allow predictive hypotheses to be defined. Life science has become a data rich science with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist but so far none has proven entirely satisfactory. To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain specific knowledge representation models based on specific objects and their relations supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. We generate the first semantically integrated COPD specific public knowledge base and find that for the integration of clinical and experimental data with pre-existing knowledge the configuration based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. 
The knowledge base enables the retrieval of sub-networks, including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification through a browser-based interface, which is currently under development.
Spielmann, Horst; Sauer, Ursula G; Mekenyan, Ovanes
2011-10-01
On 30 June 2011, the European Chemicals Agency published two reports, one on the functioning of the REACH system, the other on the use of alternatives to animal testing in compliance with that system. The data presented are based on information gained during the first registration period under the REACH system, which included high production volume chemicals and substances of very high concern, which have the most extensive information requirements. A total of 25,460 registration dossiers were received, covering 3,400 existing, so-called 'phase-in', substances, and 900 new, so-called 'non-phase-in', substances. Data sharing and the joint submission of data are reported to have worked successfully. In the registration dossiers for these substances, results from new animal tests were included for less than 1% of all the endpoints; testing proposals (required for 'higher-tier' information requirements) were submitted for 711 in vivo tests involving vertebrate animals. The registrants mainly used old, existing experimental data, or options for the adaptation (waiving) of information requirements, before collecting new information. For predicting substance toxicity, 'read-across' was the second most-used approach, followed by 'weight-of-evidence'. In vitro toxicity tests played a minor role, and were only used when the respective test methods had gained the status of regulatory acceptance. All in all, a successful start to the REACH programme was reported, particularly since, in contrast to most predictions, it did not contribute to a significant increase in toxicity testing in animals. 2011 FRAME.
Competition between Homophily and Information Entropy Maximization in Social Networks
Zhao, Jichang; Liang, Xiao; Xu, Ke
2015-01-01
In social networks, it is conventionally thought that two individuals with more overlapping friends tend to establish a new friendship, which can be stated as homophily breeding new connections. Meanwhile, the hypothesis of maximum information entropy has recently been presented as a possible origin of effective navigation in small-world networks. Through both theoretical and experimental analysis, we find that a competition exists between information entropy maximization and homophily in local structure. This competition suggests that a newly built relationship between two individuals with more common friends would lead to less information entropy gain for them. We demonstrate that both assumptions coexist in the evolution of the social network. The rule of maximum information entropy produces weak ties in the network, while the law of homophily makes the network highly clustered locally and gives individuals strong, trusted ties. A toy model is also presented to demonstrate the competition and evaluate the roles of different rules in the evolution of real networks. Our findings could shed light on social network modeling from a new perspective. PMID:26334994
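The claimed trade-off can be reproduced in a toy network. The two-hop-reach entropy below is a simplified proxy of our own devising, not the paper's exact entropy definition: a tie to an overlapping neighbour adds little new reach, while a tie to a remote node adds a lot.

```python
import math

def entropy_of_reach(adj, node):
    """Shannon entropy of a uniform distribution over the node's
    two-hop reachable set -- a toy proxy for the information a node
    can access through its ties (an assumption for illustration)."""
    reach = set()
    for nb in adj[node]:
        reach.add(nb)
        reach.update(adj[nb])
    reach.discard(node)
    return math.log2(len(reach)) if reach else 0.0

def entropy_gain(adj, a, b):
    """Entropy gained by node a when the tie (a, b) is added."""
    before = entropy_of_reach(adj, a)
    adj2 = {k: set(v) for k, v in adj.items()}
    adj2[a].add(b)
    adj2[b].add(a)
    return entropy_of_reach(adj2, a) - before

adj = {
    0: {1, 2},        # node 0 shares friends 1 and 2 with node 3
    1: {0, 2, 3},
    2: {0, 1, 3},
    3: {1, 2},
    4: {5}, 5: {4},   # node 4 sits in a distant component
}
# the homophilous tie (0, 3) yields less entropy gain than the weak tie (0, 4)
assert entropy_gain(adj, 0, 3) < entropy_gain(adj, 0, 4)
```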
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan
Many challenges exist with regard to understanding and representing the complex physical processes involved in ductile damage and failure in polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by: capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development; and distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.
Representation of viruses in the remediated PDB archive
Lawson, Catherine L.; Dutta, Shuchismita; Westbrook, John D.; Henrick, Kim; Berman, Helen M.
2008-01-01
A new scheme has been devised to represent viruses and other biological assemblies with regular noncrystallographic symmetry in the Protein Data Bank (PDB). The scheme describes existing and anticipated PDB entries of this type using generalized descriptions of deposited and experimental coordinate frames, symmetry and frame transformations. A simplified notation has been adopted to express the symmetry generation of assemblies from deposited coordinates and matrix operations describing the required point, helical or crystallographic symmetry. Complete correct information for building full assemblies, subassemblies and crystal asymmetric units of all virus entries is now available in the remediated PDB archive. PMID:18645236
NASA Astrophysics Data System (ADS)
Gontrani, Lorenzo; Caminiti, Ruggero; Salma, Umme; Campetella, Marco
2017-09-01
We present here a structural and vibrational analysis of melted methylammonium nitrate, the simplest compound of the family of alkylammonium nitrates. The calculated static and dynamical features were validated by comparing the experimental X-ray data with the theoretical ones. A reliable description cannot be obtained with classical molecular dynamics owing to polarization effects. By contrast, the structure factor and the vibrational frequencies obtained from ab initio molecular dynamics trajectories are in very good agreement with experiment. A careful analysis has provided additional information on the complex hydrogen-bonding network that exists in this liquid.
Genetic programming for evolving due-date assignment models in job shop environments.
Nguyen, Su; Zhang, Mengjie; Johnston, Mark; Tan, Kay Chen
2014-01-01
Due-date assignment plays an important role in scheduling systems and strongly influences the delivery performance of job shops. Because of the stochastic and dynamic nature of job shops, the development of general due-date assignment models (DDAMs) is complicated. In this study, two genetic programming (GP) methods are proposed to evolve DDAMs for job shop environments. The experimental results show that the evolved DDAMs can make more accurate estimates than other existing dynamic DDAMs with promising reusability. In addition, the evolved operation-based DDAMs show better performance than the evolved DDAMs employing aggregate information of jobs and machines.
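For context, the conventional total-work-content (TWK) rule is one of the static baselines that evolved DDAMs are typically compared against: the due date is the arrival time plus a multiple of the job's total processing time. The dynamic variant below, which inflates the allowance with shop congestion, and all parameter values are illustrative assumptions, not the paper's evolved models.

```python
def twk_due_date(arrival, processing_times, k=3.0):
    """TWK rule: due date = arrival + k * total processing time.
    k is an assumed allowance factor."""
    return arrival + k * sum(processing_times)

def dynamic_twk_due_date(arrival, processing_times, jobs_in_shop,
                         k=3.0, congestion_weight=0.5):
    """Dynamic variant: stretch the allowance when the shop is
    congested (hypothetical weighting for illustration)."""
    allowance = k * (1 + congestion_weight * jobs_in_shop / 10)
    return arrival + allowance * sum(processing_times)

# job arrives at t=100 with three operations totalling 10 time units
d1 = twk_due_date(arrival=100.0, processing_times=[5.0, 3.0, 2.0])
d2 = dynamic_twk_due_date(100.0, [5.0, 3.0, 2.0], jobs_in_shop=10)
assert d1 == 130.0   # 100 + 3 * 10
assert d2 == 145.0   # congestion stretches the allowance to 4.5 * 10
```

A GP-evolved DDAM would replace these fixed formulas with an evolved expression over job and shop attributes, but would be invoked the same way at job arrival.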
Polarization holograms allow highly efficient generation of complex light beams.
Ruiz, U; Pagliusi, P; Provenzano, C; Volke-Sepúlveda, K; Cipparrone, Gabriella
2013-03-25
We report a viable method to generate complex beams, such as the non-diffracting Bessel and Weber beams, which relies on the encoding of amplitude information, in addition to phase and polarization, using polarization holography. The holograms are recorded in polarization sensitive films by the interference of a reference plane wave with a tailored complex beam, having orthogonal circular polarizations. The high efficiency, the intrinsic achromaticity and the simplicity of use of the polarization holograms make them competitive with respect to existing methods and attractive for several applications. Theoretical analysis, based on the Jones formalism, and experimental results are shown.
Superstitiousness in obsessive-compulsive disorder
Brugger, Peter; Viaud-Delmon, Isabelle
2010-01-01
It has been speculated that superstitiousness and obsessive-compulsive disorder (OCD) exist along a continuum. The distinction between superstitious behavior and superstitious belief, however, is crucial for any theoretical account of claimed associations between superstitiousness and OCD. By demonstrating that there is a dichotomy between behavior and belief, which is experimentally testable, we can differentiate superstitious behavior from superstitious belief, or magical ideation. Different brain circuits are responsible for these two forms of superstitiousness; thus, determining which type of superstition is prominent in the symptomatology of an individual patient may inform us about the primarily affected neurocognitive systems. PMID:20623929
Yin, Zheng; Zhou, Xiaobo; Bakal, Chris; Li, Fuhai; Sun, Youxian; Perrimon, Norbert; Wong, Stephen TC
2008-01-01
Background The recent emergence of high-throughput automated image acquisition technologies has forever changed how cell biologists collect and analyze data. Historically, the interpretation of cellular phenotypes in different experimental conditions has been dependent upon the expert opinions of well-trained biologists. Such qualitative analysis is particularly effective in detecting subtle, but important, deviations in phenotypes. However, while the rapid and continuing development of automated microscope-based technologies now facilitates the acquisition of trillions of cells in thousands of diverse experimental conditions, such as in the context of RNA interference (RNAi) or small-molecule screens, the massive size of these datasets precludes human analysis. Thus, the development of automated methods which aim to identify novel and biologically relevant phenotypes online is one of the major challenges in high-throughput image-based screening. Ideally, phenotype discovery methods should be designed to utilize prior/existing information and tackle three challenging tasks, i.e. restoring pre-defined biologically meaningful phenotypes, differentiating novel phenotypes from known ones, and distinguishing novel phenotypes from one another. Arbitrarily extracted information causes biased analysis, while combining the complete existing datasets with each new image is intractable in high-throughput screens. Results Here we present the design and implementation of a novel and robust online phenotype discovery method with broad applicability that can be used in diverse experimental contexts, especially high-throughput RNAi screens. This method features phenotype modelling and iterative cluster merging using improved gap statistics. A Gaussian Mixture Model (GMM) is employed to estimate the distribution of each existing phenotype, and then used as reference distribution in gap statistics. 
This method is broadly applicable to a number of different types of image-based datasets derived from a wide spectrum of experimental conditions and is suitable for adaptively processing new images which are continuously added to existing datasets. Validations were carried out on different datasets, including a published RNAi screen using Drosophila embryos [Additional files 1, 2], a dataset for cell cycle phase identification using HeLa cells [Additional files 1, 3, 4] and a synthetic dataset using polygons; our method tackled the three aforementioned tasks effectively with an accuracy range of 85%-90%. When our method is implemented in the context of a Drosophila genome-scale RNAi image-based screen of cultured cells aimed at identifying the contribution of individual genes towards the regulation of cell shape, it efficiently discovers meaningful new phenotypes and provides novel biological insight. We also propose a two-step procedure to modify the novelty detection method based on one-class SVM, so that it can be used for online phenotype discovery. In different conditions, we compared the SVM-based method with our method using various datasets, and our method consistently outperformed the SVM-based method in at least two of the three tasks by 2% to 5%. These results demonstrate that our method can be used to better identify novel phenotypes in image-based datasets from a wide range of conditions and organisms. Conclusion We demonstrate that our method can detect various novel phenotypes effectively in complex datasets. Experimental results also validate that our method performs consistently under different orders of image input, variation of starting conditions including the number and composition of existing phenotypes, and datasets from different screens. In our findings, the proposed method is suitable for online phenotype discovery in diverse high-throughput image-based genetic and chemical screens. PMID:18534020
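The novelty-flagging step can be sketched in one dimension: score a new observation under each existing phenotype's Gaussian reference distribution and flag it as a candidate novel phenotype when every likelihood is low. The 1-D models, their parameters, and the threshold below are toy assumptions standing in for the paper's multivariate GMMs and gap-statistic merging.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a 1-D Gaussian at x."""
    return (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

def classify_or_flag(feature, phenotypes, threshold=1e-3):
    """Assign a cell's feature to the best-fitting existing phenotype,
    or flag it as a candidate novel phenotype when all likelihoods
    fall below the threshold."""
    scores = {name: gaussian_pdf(feature, mu, sigma)
              for name, (mu, sigma) in phenotypes.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else "novel"

# two existing phenotype reference distributions: (mean, std) of a feature
phenotypes = {"round": (1.0, 0.2), "spiky": (5.0, 0.5)}
assert classify_or_flag(1.1, phenotypes) == "round"
assert classify_or_flag(4.6, phenotypes) == "spiky"
assert classify_or_flag(12.0, phenotypes) == "novel"
```

In the full method, observations flagged this way would seed new clusters, which gap statistics then merge or confirm as distinct phenotypes.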
Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy
2014-01-01
It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. © 2014 A. P. Dasgupta et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Robust distant-entanglement generation using coherent multiphoton scattering
NASA Astrophysics Data System (ADS)
Chan, Ching-Kit; Sham, L. J.
2013-03-01
The generation and controllability of entanglement between distant quantum states have been at the heart of quantum computation and quantum information processing. Existing schemes for solid state qubit entanglement are based on single-photon spectroscopy, which has the merit of high-fidelity entanglement creation but a very limited efficiency. This severely restricts the scalability of a qubit network system. Here, we describe a new distant entanglement protocol using coherent multiphoton scattering. The scheme makes use of the postselection of large and distinguishable photon signals, and has both a high success probability and a high entanglement fidelity. Our result shows that the entanglement generation is robust against photon fluctuations and has an average entanglement duration within the decoherence time in various qubit systems, based on existing experimental parameters. This research was supported by the U.S. Army Research Office MURI award W911NF0910406 and by NSF grant PHY-1104446.
A Novel Method for Block Size Forensics Based on Morphological Operations
NASA Astrophysics Data System (ADS)
Luo, Weiqi; Huang, Jiwu; Qiu, Guoping
Passive forensics analysis aims to find out how multimedia data are acquired and processed without relying on pre-embedded or pre-registered information. Since most existing compression schemes for digital images are based on block processing, one of the fundamental steps for subsequent forensics analysis is to detect the presence of block artifacts and estimate the block size for a given image. In this paper, we propose a novel method for blind block size estimation. A 2×2 cross-differential filter is first applied to detect all possible block artifact boundaries, morphological operations are then used to remove the boundary effects caused by the edges of the actual image contents, and finally maximum-likelihood estimation (MLE) is employed to estimate the block size. Experimental results evaluated on over 1300 natural images show the effectiveness of our proposed method. Compared with the existing gradient-based detection method, our method achieves over 39% accuracy improvement on average.
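The boundary-energy idea behind blind block size estimation can be sketched as follows. This simplified stand-in scores each candidate period by the energy of horizontal first differences on the candidate grid columns; the paper's actual pipeline instead uses a 2×2 cross-differential filter, morphological cleanup of content edges, and an MLE step, none of which are reproduced here.

```python
import numpy as np

def estimate_block_size(img, candidates=range(2, 17)):
    # Toy block-size estimator: block artifacts put extra energy on a
    # periodic grid of column differences. Scores each candidate period
    # by the ratio of on-grid to off-grid boundary energy.
    img = img.astype(float)
    col = np.abs(np.diff(img, axis=1)).sum(axis=0)  # energy per column edge
    scores = {}
    for b in candidates:
        grid = col[b - 1::b]                        # candidate grid lines
        off = np.delete(col, np.s_[b - 1::b])
        scores[b] = grid.mean() / (off.mean() + 1e-9)
    return max(scores, key=scores.get)

# Synthetic 64x64 image with 8x8 blockiness (per-block constant offsets).
rng = np.random.default_rng(1)
img = rng.normal(128, 5, (64, 64))
img += np.repeat(np.repeat(rng.normal(0, 20, (8, 8)), 8, 0), 8, 1)
print(estimate_block_size(img))
```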
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Hao; Zhang, Yu; Guo, Sibei
The aggregation of amyloid beta (Aβ) peptides plays a crucial role in the pathology and etiology of Alzheimer's disease. Experimental evidence shows that copper ion is an aggregation-prone species with the ability to coordinately bind to Aβ and further induce the formation of neurotoxic Aβ oligomers. However, the detailed structures of Cu(II)–Aβ complexes have not been illustrated, and the kinetics and dynamics of the Cu(II) binding are not well understood. Two Cu(II)–Aβ complexes have been proposed to exist under physiological conditions, and another two might exist at higher pH values. By using ab initio simulations for the spontaneous resonance Raman and time domain stimulated resonance Raman spectroscopy signals, we obtained the characteristic Raman vibronic features of each complex. Finally, these signals contain rich structural information with high temporal resolution, enabling the characterization of transient states during the fast Cu–Aβ binding and interconversion processes.
Finger vein recognition with personalized feature selection.
Xi, Xiaoming; Yang, Gongping; Yin, Yilong; Meng, Xianjing
2013-08-22
Finger veins are a promising biometric pattern for personalized identification in terms of their advantages over existing biometrics. Based on the spatial pyramid representation and the combination of more effective information such as gray, texture and shape, this paper proposes a simple but powerful feature, called Pyramid Histograms of Gray, Texture and Orientation Gradients (PHGTOG). For a finger vein image, PHGTOG can reflect the global spatial layout and local details of gray, texture and shape. To further improve the recognition performance and reduce the computational complexity, we select a personalized subset of features from PHGTOG for each subject by using a sparse weight vector, which is trained by using LASSO and called PFS-PHGTOG. We conduct extensive experiments to demonstrate the promise of PHGTOG and PFS-PHGTOG. Experimental results on our databases show that PHGTOG outperforms the other existing features. Moreover, PFS-PHGTOG can further boost the performance in comparison with PHGTOG.
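The personalized feature selection step is essentially LASSO-driven sparsity: train a sparse weight vector and keep only the features with nonzero weights as the personalized subset. A minimal sketch, with synthetic data standing in for PHGTOG descriptors (the feature dimensions, target, and regularization strength are all illustrative):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
# Toy stand-in for PHGTOG descriptors: 100 samples x 50 features,
# where only features 0-4 carry identity information for this subject.
X = rng.normal(size=(100, 50))
y = X[:, :5] @ np.array([1.0, -1.0, 0.8, -0.6, 0.5]) + rng.normal(0, 0.1, 100)

# LASSO drives most weights to exactly zero; the surviving features
# form the personalized subset for this subject.
lasso = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(lasso.coef_)
print(selected)
```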
Finger Vein Recognition with Personalized Feature Selection
Xi, Xiaoming; Yang, Gongping; Yin, Yilong; Meng, Xianjing
2013-01-01
Finger veins are a promising biometric pattern for personalized identification in terms of their advantages over existing biometrics. Based on the spatial pyramid representation and the combination of more effective information such as gray, texture and shape, this paper proposes a simple but powerful feature, called Pyramid Histograms of Gray, Texture and Orientation Gradients (PHGTOG). For a finger vein image, PHGTOG can reflect the global spatial layout and local details of gray, texture and shape. To further improve the recognition performance and reduce the computational complexity, we select a personalized subset of features from PHGTOG for each subject by using a sparse weight vector, which is trained by using LASSO and called PFS-PHGTOG. We conduct extensive experiments to demonstrate the promise of PHGTOG and PFS-PHGTOG. Experimental results on our databases show that PHGTOG outperforms the other existing features. Moreover, PFS-PHGTOG can further boost the performance in comparison with PHGTOG. PMID:23974154
NASA Astrophysics Data System (ADS)
Botha, J. D. M.; Shahroki, A.; Rice, H.
2017-12-01
This paper presents an enhanced method for predicting aerodynamically generated broadband noise produced by a Vertical Axis Wind Turbine (VAWT). The method improves on existing work for VAWT noise prediction and incorporates recently developed airfoil noise prediction models. Inflow-turbulence and airfoil self-noise mechanisms are both considered. Airfoil noise predictions are dependent on aerodynamic input data, and time dependent Computational Fluid Dynamics (CFD) calculations are carried out to solve for the aerodynamic solution. Analytical flow methods are also benchmarked against the CFD-informed noise prediction results to quantify errors in the former approach. Comparisons to experimental noise measurements for an existing turbine are encouraging. A parameter study is performed and shows the sensitivity of overall noise levels to changes in inflow velocity and inflow turbulence. Noise sources are characterised, and the location and mechanism of the primary sources are determined; inflow-turbulence noise is seen to be the dominant source. The use of CFD calculations is seen to improve the accuracy of noise predictions when compared to the analytic flow solution, as well as showing that, for inflow-turbulence noise sources, blade-generated turbulence dominates the atmospheric inflow turbulence.
Szatkiewicz, Jin P; Wang, WeiBo; Sullivan, Patrick F; Wang, Wei; Sun, Wei
2013-02-01
Structural variation is an important class of genetic variation in mammals. High-throughput sequencing (HTS) technologies promise to revolutionize copy-number variation (CNV) detection but present substantial analytic challenges. Converging evidence suggests that multiple types of CNV-informative data (e.g. read-depth, read-pair, split-read) need to be considered, and that sophisticated methods are needed for more accurate CNV detection. We observed that various sources of experimental biases in HTS confound read-depth estimation, and note that bias correction has not been adequately addressed by existing methods. We present a novel read-depth-based method, GENSENG, which uses a hidden Markov model and negative binomial regression framework to identify regions of discrete copy-number changes while simultaneously accounting for the effects of multiple confounders. Based on extensive calibration using multiple HTS data sets, we conclude that our method outperforms existing read-depth-based CNV detection algorithms. The concept of simultaneous bias correction and CNV detection can serve as a basis for combining read-depth with other types of information such as read-pair or split-read in a single analysis. A user-friendly and computationally efficient implementation of our method is freely available.
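The core of read-depth-based CNV calling can be illustrated with a stripped-down HMM: three copy-number states (deletion, normal, duplication) decoded by Viterbi over per-window count likelihoods. This sketch uses plain Poisson emissions on synthetic counts; the paper's model is considerably richer, using negative binomial regression to absorb GC-content and mappability biases simultaneously with segmentation.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
mean_depth = 100.0
counts = np.concatenate([rng.poisson(100, 40),   # copy number 2
                         rng.poisson(50, 20),    # heterozygous deletion
                         rng.poisson(100, 40)])  # copy number 2

state_means = mean_depth * np.array([0.5, 1.0, 1.5])  # CN 1, 2, 3
log_emit = poisson.logpmf(counts[:, None], state_means[None, :])
S = len(state_means)
trans = np.where(np.eye(S, dtype=bool), np.log(0.98), np.log(0.01))

# Viterbi decoding of the most likely copy-number path.
T = len(counts)
dp = log_emit[0].copy()
back = np.zeros((T, S), dtype=int)
for t in range(1, T):
    cand = dp[:, None] + trans      # cand[i, j]: come from state i to j
    back[t] = cand.argmax(axis=0)
    dp = cand.max(axis=0) + log_emit[t]
path = [int(dp.argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t][path[-1]]))
path.reverse()
print(path[:5], path[45:50])        # state 1 = normal, 0 = deletion
```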
Improved medical image fusion based on cascaded PCA and shift invariant wavelet transforms.
Reena Benjamin, J; Jayasree, T
2018-02-01
In the medical field, radiologists need more informative and high-quality medical images to diagnose diseases. Image fusion plays a vital role in the field of biomedical image analysis. It aims to integrate the complementary information from multimodal images, producing a new composite image which is expected to be more informative for visual perception than any of the individual input images. The main objective of this paper is to improve the information, to preserve the edges and to enhance the quality of the fused image using cascaded principal component analysis (PCA) and shift invariant wavelet transforms. A novel image fusion technique based on cascaded PCA and shift invariant wavelet transforms is proposed in this paper. PCA in the spatial domain extracts relevant information from the large dataset based on eigenvalue decomposition, and the wavelet transform operating in the complex domain with shift invariant properties brings out more directional and phase details of the image. The maximum fusion rule applied in the dual-tree complex wavelet transform domain enhances the average information and morphological details. The input images of the human brain of two different modalities (MRI and CT) are collected from the whole brain atlas data distributed by Harvard University. Both MRI and CT images are fused using the cascaded PCA and shift invariant wavelet transform method. The proposed method is evaluated based on three main key factors, namely structure preservation, edge preservation, and contrast preservation. The experimental results and comparison with other existing fusion methods show the superior performance of the proposed image fusion framework in terms of visual and quantitative evaluations. In this paper, a complex wavelet-based image fusion method has been discussed. The experimental results demonstrate that the proposed method enhances the directional features as well as fine edge details. It also reduces redundant details, artifacts and distortions.
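The PCA stage of such a pipeline has a compact classical form: weight each source image by the components of the leading eigenvector of their 2×2 covariance matrix, so the source carrying more variance contributes more to the fused result. A minimal sketch of that stage alone (the paper cascades it with a shift invariant, dual-tree complex wavelet fusion stage, which is not reproduced here):

```python
import numpy as np

def pca_fuse(a, b):
    # PCA fusion rule: the leading eigenvector of the 2x2 covariance of
    # the two source images gives the mixing weights.
    data = np.stack([a.ravel(), b.ravel()])
    cov = np.cov(data)
    vals, vecs = np.linalg.eigh(cov)   # eigh: ascending eigenvalues
    w = np.abs(vecs[:, -1])            # leading eigenvector
    w = w / w.sum()
    return w[0] * a + w[1] * b

rng = np.random.default_rng(3)
mri = rng.uniform(0, 1, (8, 8))        # stand-ins for registered slices
ct = 0.1 * rng.uniform(0, 1, (8, 8))   # low-variance source
fused = pca_fuse(mri, ct)
print(fused.shape)
```

With these synthetic inputs the high-variance source dominates the fused image, which is the intended behaviour of the PCA rule.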
Text Mining for Protein Docking
Badal, Varsha D.; Kundrotas, Petras J.; Vakser, Ilya A.
2015-01-01
The rapidly growing amount of publicly available information from biomedical research is readily accessible on the Internet, providing a powerful resource for predictive biomolecular modeling. The accumulated data on experimentally determined structures transformed structure prediction of proteins and protein complexes. Instead of exploring the enormous search space, predictive tools can simply proceed to the solution based on similarity to the existing, previously determined structures. A similar major paradigm shift is emerging due to the rapidly expanding amount of information, other than experimentally determined structures, which still can be used as constraints in biomolecular structure prediction. Automated text mining has been widely used in recreating protein interaction networks, as well as in detecting small ligand binding sites on protein structures. Combining and expanding these two well-developed areas of research, we applied text mining to structural modeling of protein-protein complexes (protein docking). Protein docking can be significantly improved when constraints on the docking mode are available. We developed a procedure that retrieves published abstracts on a specific protein-protein interaction and extracts information relevant to docking. The procedure was assessed on protein complexes from Dockground (http://dockground.compbio.ku.edu). The results show that correct information on binding residues can be extracted for about half of the complexes. The amount of irrelevant information was reduced by conceptual analysis of a subset of the retrieved abstracts, based on the bag-of-words (features) approach. Support Vector Machine models were trained and validated on the subset. The remaining abstracts were filtered by the best-performing models, which decreased the irrelevant information for ~25% of the complexes in the dataset.
The extracted constraints were incorporated in the docking protocol and tested on the Dockground unbound benchmark set, significantly increasing the docking success rate. PMID:26650466
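The bag-of-words SVM filtering step can be sketched in a few lines: vectorize text into word counts and train a linear SVM to separate docking-relevant from irrelevant content. All sentences and labels below are invented for illustration; the actual models were trained on annotated abstracts retrieved for Dockground complexes.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

# Toy relevance filter: docking-relevant text mentions interface and
# binding residues; irrelevant text discusses expression, pathways, etc.
relevant = ["mutation of residue Arg45 abolished binding to the partner",
            "alanine scanning identified interface residues in the complex",
            "Tyr12 contacts the binding pocket of the receptor",
            "the interaction is mediated by interface residues in helix 2"]
irrelevant = ["the gene is expressed in liver tissue",
              "the pathway regulates cell growth in vivo",
              "expression levels increased after treatment",
              "the protein was purified from bacterial culture"]
texts = relevant + irrelevant
labels = [1] * len(relevant) + [0] * len(irrelevant)

vec = CountVectorizer()
X = vec.fit_transform(texts)          # bag-of-words features
clf = LinearSVC(C=1.0).fit(X, labels)

query = "residue Lys30 at the binding interface is required"
print(clf.predict(vec.transform([query]))[0])
```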
Avoiding disentanglement of multipartite entangled optical beams with a correlated noisy channel
Deng, Xiaowei; Tian, Caixing; Su, Xiaolong; Xie, Changde
2017-01-01
A quantum communication network can be constructed by distributing a multipartite entangled state to space-separated nodes. Entangled optical beams, with the highest flying speed and measurable brightness, can be used as carriers to convey information in quantum communication networks. Losses and noises existing in real communication channels will reduce or even totally destroy entanglement. The phenomenon of disentanglement will result in the complete failure of quantum communication. Here, we present experimental demonstrations of the disentanglement and the entanglement revival of tripartite entangled optical beams used in a quantum network. We experimentally demonstrate that symmetric tripartite entangled optical beams are robust in pure lossy but noiseless channels. In a noisy channel, the excess noise will lead to disentanglement, and the destroyed entanglement can be revived by the use of a correlated noisy channel (non-Markovian environment). The presented results provide useful technical references for establishing quantum networks. PMID:28295024
Constraints on the [Formula: see text] form factor from analyticity and unitarity.
Ananthanarayan, B; Caprini, I; Kubis, B
Motivated by the discrepancies noted recently between the theoretical calculations of the electromagnetic [Formula: see text] form factor and certain experimental data, we investigate this form factor using analyticity and unitarity in a framework known as the method of unitarity bounds. We use a QCD correlator computed on the spacelike axis by operator product expansion and perturbative QCD as input, and exploit unitarity and the positivity of its spectral function, including the two-pion contribution that can be reliably calculated using high-precision data on the pion form factor. From this information, we derive upper and lower bounds on the modulus of the [Formula: see text] form factor in the elastic region. The results provide a significant check on those obtained with standard dispersion relations, confirming the existence of a disagreement with experimental data in the region around [Formula: see text].
THERMODYNAMICS OF FE-CU ALLOYS AS DESCRIBED BY CLASSICAL POTENTIALS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caro, A; Caro, M; Lopasso, E M
2005-04-14
The Fe-Cu system is of relevance to the nuclear industry because of the deleterious consequences of Cu precipitates on the mechanical properties of Fe. Several sets of classical potentials are used in molecular dynamics simulation studies of this system, in particular that proposed by Ludwig et al. (Modelling Simul. Mater. Sci. Eng. 6, 19 (1998)). In this work we extract thermodynamic information from these interatomic potentials. We obtain the equilibrium phase diagram and find a reasonable agreement with the experimental phases in the regions of relevance to radiation damage studies. We compare the results with the phase diagram predicted on the basis of another potential, as calculated in previous work. We discuss the disagreements found between the phase diagram calculated here and experimental results, focusing on the pure components, and discuss the applicability of these potentials; finally we suggest an approach to improve existing potentials for this system.
Prethermal time crystals in a one-dimensional periodically driven Floquet system
NASA Astrophysics Data System (ADS)
Zeng, Tian-Sheng; Sheng, D. N.
2017-09-01
Motivated by experimental observations of time-symmetry breaking behavior in a periodically driven (Floquet) system, we study a one-dimensional spin model to explore the stability of such Floquet discrete time crystals (DTCs) under the interplay between interaction and the microwave driving. For intermediate interactions and high drivings, from the time evolution of both stroboscopic spin polarization and mutual information between two ends, we show that Floquet DTCs can exist in a prethermal time regime without the tuning of strong disorder. For much weaker interactions the system is in a symmetry-unbroken phase, while for strong interactions it gives way to a thermal phase. Through analyzing the entanglement dynamics, we show that large driving fields protect the prethermal DTCs from many-body localization and thermalization. Our results suggest that by increasing the spin interaction, one can drive the experimental system into the optimal regime for observing a robust prethermal DTC phase.
Experimental determination of entanglement with a single measurement.
Walborn, S P; Souto Ribeiro, P H; Davidovich, L; Mintert, F; Buchleitner, A
2006-04-20
Nearly all protocols requiring shared quantum information--such as quantum teleportation or key distribution--rely on entanglement between distant parties. However, entanglement is difficult to characterize experimentally. All existing techniques for doing so, including entanglement witnesses or Bell inequalities, disclose the entanglement of some quantum states but fail for other states; therefore, they cannot provide satisfactory results in general. Such methods are fundamentally different from entanglement measures that, by definition, quantify the amount of entanglement in any state. However, these measures suffer from the severe disadvantage that they typically are not directly accessible in laboratory experiments. Here we report a linear optics experiment in which we directly observe a pure-state entanglement measure, namely concurrence. Our measurement set-up includes two copies of a quantum state: these 'twin' states are prepared in the polarization and momentum degrees of freedom of two photons, and concurrence is measured with a single, local measurement on just one of the photons.
Kinase Identification with Supervised Laplacian Regularized Least Squares
Zhang, He; Wang, Minghui
2015-01-01
Phosphorylation is catalyzed by protein kinases and is irreplaceable in regulating biological processes. Identification of phosphorylation sites with their corresponding kinases contributes to the understanding of molecular mechanisms. Mass spectrometry analysis of phosphoproteomes generates a large number of phosphorylated sites. However, experimental methods are costly and time-consuming, and most phosphorylation sites determined by experimental methods lack kinase information. Therefore, computational methods are urgently needed to address the kinase identification problem. To this end, we propose a new kernel-based machine learning method called Supervised Laplacian Regularized Least Squares (SLapRLS), which adopts a new method to construct kernels based on the similarity matrix and minimizes both structure risk and overall inconsistency between labels and similarities. The results predicted using both Phospho.ELM and an additional independent test dataset indicate that SLapRLS can more effectively identify kinases compared to other existing algorithms. PMID:26448296
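Laplacian regularized least squares has a convenient closed form that makes the idea concrete. The sketch below implements the generic LapRLS objective; it is not the paper's SLapRLS, which additionally builds its kernel from a similarity matrix and penalizes inconsistency between labels and similarities. The toy data, kernel, and regularization weights are illustrative.

```python
import numpy as np

def laprls_fit(K, L, y, lam=0.1, mu=0.1):
    # Closed-form Laplacian regularized least squares:
    # minimize ||y - K a||^2 + lam * a'K a + mu * a'K L K a,
    # whose stationarity condition gives a = (K + lam*I + mu*L K)^{-1} y.
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n) + mu * L @ K, y)

# Toy problem: two well-separated clusters on a line, labels +1 / -1.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
K = np.exp(-(x[:, None] - x[None, :]) ** 2)   # RBF kernel
W = np.exp(-(x[:, None] - x[None, :]) ** 2)   # similarity graph
L = np.diag(W.sum(axis=1)) - W                # unnormalized graph Laplacian
alpha = laprls_fit(K, L, y)
pred = K @ alpha
print(np.sign(pred))
```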
Coulomb and nuclear excitations of narrow resonances in 17Ne
Marganiec, J.; Wamers, F.; Aksouh, F.; ...
2016-05-25
New experimental data for the dissociation of relativistic 17Ne projectiles incident on lead, carbon, and polyethylene targets at GSI are presented. Special attention is paid to the excitation and decay of narrow resonant states in 17Ne. Distributions of internal energy in the 15O+p+p three-body system have been determined together with angular and partial-energy correlations between the decay products in different energy regions. The analysis was done using existing experimental data on 17Ne and its mirror nucleus 17N. The isobaric multiplet mass equation is used for the assignment of observed resonances and their spins and parities. A combination of data from the heavy and light targets yielded cross sections and transition probabilities for the Coulomb excitations of the narrow resonant states. Finally, the resulting transition probabilities provide information relevant for a better understanding of the 17Ne structure.
Cocaine, Appetitive Memory and Neural Connectivity
Ray, Suchismita
2013-01-01
This review examines existing cognitive experimental and brain imaging research related to cocaine addiction. In section 1, previous studies that have examined cognitive processes, such as implicit and explicit memory processes in cocaine users are reported. Next, in section 2, brain imaging studies are reported that have used chronic users of cocaine as study participants. In section 3, several conclusions are drawn. They are: (a) in cognitive experimental literature, no study has examined both implicit and explicit memory processes involving cocaine related visual information in the same cocaine user, (b) neural mechanisms underlying implicit and explicit memory processes for cocaine-related visual cues have not been directly investigated in cocaine users in the imaging literature, and (c) none of the previous imaging studies has examined connectivity between the memory system and craving system in the brain of chronic users of cocaine. Finally, future directions in the field of cocaine addiction are suggested. PMID:25009766
Influence of rotation on the near-wake development behind an impulsively started circular cylinder
NASA Astrophysics Data System (ADS)
Coutanceau, M.; Menard, C.
1985-09-01
A rotating body, travelling through a fluid in such a way that the rotation axis is at right angles to the translational path, experiences a transverse force, called the Magnus force. The present study is concerned with a rotating cylinder which is in a state of translational motion. In the considered case, the existence of a lift force may be explained easily on the basis of the theory of inviscid fluids. An experimental investigation provides new information regarding the mechanism of the near-wake development of the classical unsteady flow and the influence of the rotational effects. Attention is given to the experimental technique, aspects of flow topology and notation, the time development of the wake flow pattern, the time evolution of certain flow properties, the flow structure in the neighborhood of the front stagnation point, and the influence of the Reynolds number on flow establishment.
NASA Astrophysics Data System (ADS)
Dagdeviren, Canan; Shi, Yan; Joe, Pauline; Ghaffari, Roozbeh; Balooch, Guive; Usgaonkar, Karan; Gur, Onur; Tran, Phat L.; Crosby, Jessi R.; Meyer, Marcin; Su, Yewang; Chad Webb, R.; Tedesco, Andrew S.; Slepian, Marvin J.; Huang, Yonggang; Rogers, John A.
2015-07-01
Mechanical assessment of soft biological tissues and organs has broad relevance in clinical diagnosis and treatment of disease. Existing characterization methods are invasive, lack microscale spatial resolution, and are tailored only for specific regions of the body under quasi-static conditions. Here, we develop conformal and piezoelectric devices that enable in vivo measurements of soft tissue viscoelasticity in the near-surface regions of the epidermis. These systems achieve conformal contact with the underlying complex topography and texture of the targeted skin, as well as other organ surfaces, under both quasi-static and dynamic conditions. Experimental and theoretical characterization of the responses of piezoelectric actuator-sensor pairs laminated on a variety of soft biological tissues and organ systems in animal models provide information on the operation of the devices. Studies on human subjects establish the clinical significance of these devices for rapid and non-invasive characterization of skin mechanical properties.
Kinase Identification with Supervised Laplacian Regularized Least Squares.
Li, Ao; Xu, Xiaoyi; Zhang, He; Wang, Minghui
2015-01-01
Phosphorylation is catalyzed by protein kinases and is irreplaceable in regulating biological processes. Identification of phosphorylation sites with their corresponding kinases contributes to the understanding of molecular mechanisms. Mass spectrometry analysis of phosphoproteomes generates a large number of phosphorylated sites. However, experimental methods are costly and time-consuming, and most phosphorylation sites determined by experimental methods lack kinase information. Therefore, computational methods are urgently needed to address the kinase identification problem. To this end, we propose a new kernel-based machine learning method called Supervised Laplacian Regularized Least Squares (SLapRLS), which adopts a new method to construct kernels based on the similarity matrix and minimizes both structure risk and overall inconsistency between labels and similarities. The results predicted using both Phospho.ELM and an additional independent test dataset indicate that SLapRLS can more effectively identify kinases compared to other existing algorithms.
Inkjet printing-based volumetric display projecting multiple full-colour 2D patterns
NASA Astrophysics Data System (ADS)
Hirayama, Ryuji; Suzuki, Tomotaka; Shimobaba, Tomoyoshi; Shiraki, Atsushi; Naruse, Makoto; Nakayama, Hirotaka; Kakue, Takashi; Ito, Tomoyoshi
2017-04-01
In this study, a method to construct a full-colour volumetric display is presented using a commercially available inkjet printer. Photoreactive luminescence materials are minutely and automatically printed as the volume elements, and volumetric displays are constructed with high resolution using easy-to-fabricate means that exploit inkjet printing technologies. The results experimentally demonstrate the first prototype of an inkjet printing-based volumetric display composed of multiple layers of transparent films that yield a full-colour three-dimensional (3D) image. Moreover, we propose a design algorithm with 3D structures that provide multiple different 2D full-colour patterns when viewed from different directions and experimentally demonstrate prototypes. It is considered that these types of 3D volumetric structures and their fabrication methods based on widely deployed existing printing technologies can be utilised as novel information display devices and systems, including digital signage, media art, entertainment and security.
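The multi-view design idea admits a simple binary sketch: build a voxel volume whose silhouette along one axis shows pattern A and along another shows pattern B, by lighting voxel (x, y, z) only where both patterns agree at height z. This is an illustrative reconstruction of the geometric principle only; the paper's algorithm also handles full colour and the printed layer fabrication.

```python
import numpy as np

# Two binary target patterns; this construction recovers both exactly
# whenever every row z of each pattern has at least one "on" pixel.
A = np.array([[1, 0, 1],
              [1, 1, 1],
              [0, 1, 0]])            # pattern seen along y, indexed [z, x]
B = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])            # pattern seen along x, indexed [z, y]

vol = A[:, None, :] * B[:, :, None]  # vol[z, y, x] = A[z, x] AND B[z, y]
view_y = vol.max(axis=1)             # silhouette along y -> should equal A
view_x = vol.max(axis=2)             # silhouette along x -> should equal B
print(np.array_equal(view_y, A), np.array_equal(view_x, B))
```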
Research on B Cell Algorithm for Learning to Rank Method Based on Parallel Strategy.
Tian, Yuling; Zhang, Hongxian
2016-01-01
For the purposes of information retrieval, users must find highly relevant documents from within a system (often quite a large one, comprising many individual documents) based on an input query. Ranking the documents according to their relevance within the system to meet user needs is a challenging endeavor and a hot research topic; there already exist several rank-learning methods based on machine learning techniques which can generate ranking functions automatically. This paper proposes a parallel B cell algorithm, RankBCA, for rank learning which utilizes a clonal selection mechanism based on biological immunity. The novel algorithm is compared with traditional rank-learning algorithms through experimentation and shown to outperform the others with respect to accuracy, learning time, and convergence rate; taken together, the experimental results show that the proposed algorithm effectively and rapidly identifies optimal ranking functions.
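A clonal selection loop of the kind underlying B cell algorithms can be sketched in a few lines: keep a population of candidate ranking functions, clone the highest-affinity ones, and hypermutate the clones. The objective below (pairwise ranking accuracy of a linear scorer on synthetic features) and all parameters are invented for illustration; this is the general clonal-selection scheme, not the paper's RankBCA.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(30, 5))                   # toy document features
w_true = np.array([2.0, -1.0, 0.5, 0.0, 1.0])
rel = X @ w_true                               # ideal relevance scores

def affinity(w):
    # Fraction of document pairs ordered the same way as the ideal ranking.
    s = X @ w
    i, j = np.triu_indices(len(s), 1)
    return np.mean(np.sign(s[i] - s[j]) == np.sign(rel[i] - rel[j]))

pop = rng.normal(size=(20, 5))                 # initial antibody repertoire
for gen in range(50):
    fit = np.array([affinity(w) for w in pop])
    best = pop[fit.argsort()[-5:]]             # select high-affinity cells
    clones = np.repeat(best, 4, axis=0)
    clones += rng.normal(0, 0.2, clones.shape) # somatic hypermutation
    pop = np.vstack([best, clones])[:20]       # elitist replacement
fit = np.array([affinity(w) for w in pop])
print(round(fit.max(), 2))
```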
Research on B Cell Algorithm for Learning to Rank Method Based on Parallel Strategy
Tian, Yuling; Zhang, Hongxian
2016-01-01
For the purposes of information retrieval, users must find highly relevant documents from within a system (often quite a large one, comprising many individual documents) based on an input query. Ranking the documents according to their relevance within the system to meet user needs is a challenging endeavor and a hot research topic; there already exist several rank-learning methods based on machine learning techniques which can generate ranking functions automatically. This paper proposes a parallel B cell algorithm, RankBCA, for rank learning which utilizes a clonal selection mechanism based on biological immunity. The novel algorithm is compared with traditional rank-learning algorithms through experimentation and shown to outperform the others with respect to accuracy, learning time, and convergence rate; taken together, the experimental results show that the proposed algorithm effectively and rapidly identifies optimal ranking functions. PMID:27487242
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun
2016-05-05
An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
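The COSMOS comparison is reported in terms of F-measure and precision. For reference, the F-measure is the (beta-weighted) harmonic mean of precision and recall; the recall value in the sketch below is an illustrative placeholder, since the abstract quotes only precision figures:

```python
def f_measure(precision, recall, beta=1.0):
    """Beta-weighted harmonic mean of precision and recall (F1 for beta=1)."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Precision figures quoted in the abstract: 0.845 for COSMOS vs 0.704 for
# the next best tool; the recall value here is an illustrative placeholder.
print(round(f_measure(0.845, 0.80), 3))   # 0.822
```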
Electroweak Symmetry Breaking and the Higgs Boson: Confronting Theories at Colliders
NASA Astrophysics Data System (ADS)
Azatov, Aleksandr; Galloway, Jamison
2013-01-01
In this review, we discuss methods of parsing direct information from collider experiments regarding the Higgs boson and describe simple ways in which experimental likelihoods can be consistently reconstructed and interfaced with model predictions in pertinent parameter spaces. We review prevalent scenarios for extending the electroweak symmetry breaking sector and emphasize their predictions for nonstandard Higgs phenomenology that could be observed in Large Hadron Collider (LHC) data if naturalness is realized in particular ways. Specifically, we identify how measurements of Higgs couplings can be used to imply the existence of new physics at particular scales within various contexts. The dominant production and decay modes of the Higgs-like state observed in the early data sets have proven to be consistent with predictions of the Higgs boson of the Standard Model, though interesting directions in subdominant channels still exist and will require our careful attention in further experimental tests. Slightly anomalous rates in certain channels at the early LHC have spurred effort in model building and spectra analyses of particular theories, and we discuss these developments in some detail. Finally, we highlight some parameter spaces of interest in order to give examples of how the data surrounding the new state can most effectively be used to constrain specific models of weak scale physics.
Predicting the Types of Ion Channel-Targeted Conotoxins Based on AVC-SVM Model.
Xianfang, Wang; Junmei, Wang; Xiaolei, Wang; Yue, Zhang
2017-01-01
The conotoxin proteins are disulfide-rich small peptides. Predicting the types of ion channel-targeted conotoxins has great value in the treatment of chronic diseases, epilepsy, and cardiovascular diseases. To solve the problem of information redundancy that arises with current methods, a new model is presented to predict the types of ion channel-targeted conotoxins based on AVC (Analysis of Variance and Correlation) and SVM (Support Vector Machine). First, the F value is used to measure the significance of each feature for the result, and attributes with smaller F values are filtered out in a rough selection step. Secondly, the degree of redundancy is calculated with the Pearson correlation coefficient, and a threshold is set to filter out attributes with weak independence, yielding the refined feature set. Finally, SVM is used to predict the types of ion channel-targeted conotoxins. The experimental results show that the proposed AVC-SVM model reaches an overall accuracy of 91.98% and an average accuracy of 92.17% with a total of 68 parameters. The proposed model provides highly useful information for further experimental research. The prediction model will be accessible free of charge at our web server.
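The two-stage AVC filter (ANOVA F value for rough selection, Pearson correlation for redundancy refinement) can be sketched in plain NumPy. The toy data, thresholds, and greedy redundancy rule below are assumptions for illustration, and the final SVM stage is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

def anova_f(x, y):
    """One-way ANOVA F statistic of a single feature x against labels y."""
    classes = np.unique(y)
    overall, n = x.mean(), len(x)
    ss_between = sum(len(x[y == c]) * (x[y == c].mean() - overall) ** 2
                     for c in classes)
    ss_within = sum(((x[y == c] - x[y == c].mean()) ** 2).sum()
                    for c in classes)
    df_b, df_w = len(classes) - 1, n - len(classes)
    return (ss_between / df_b) / (ss_within / df_w)

def avc_select(X, y, f_min=2.0, r_max=0.9):
    """Rough selection by F value, then refinement by Pearson redundancy."""
    keep = [j for j in range(X.shape[1]) if anova_f(X[:, j], y) >= f_min]
    selected = []
    for j in keep:  # greedily drop features correlated with an already kept one
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < r_max
               for k in selected):
            selected.append(j)
    return selected

# Toy data: feature 0 is informative, feature 1 nearly duplicates it,
# feature 2 is pure noise.
y = np.repeat([0, 1], 50)
f0 = y + 0.3 * rng.normal(size=100)
X = np.column_stack([f0, f0 + 0.01 * rng.normal(size=100),
                     rng.normal(size=100)])
selected_features = avc_select(X, y)
print(selected_features)
```

The informative feature survives both filters, while its near-duplicate is removed in the correlation refinement step.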
FROGS (Friends of Granites) report
NASA Astrophysics Data System (ADS)
Miller, Calvin
This VGP News, which is devoted to petrology, is a good one for noting the existence of FROGS. FROGS is, as the name suggests, an informal organization of people whose research relates in one way or another to granitic rocks. Its purpose has been to promote communication among geoscientists with different perspectives and concerns about felsic plutonism. Initially, a major focus was experimental petrology and integration of field-oriented and lab-oriented viewpoints; now that there is the opportunity to communicate with the Eos readership, an obvious additional goal will be to bring together volcanic and plutonic views of felsic magmatism. FROGS first gathered in late 1982 under the guidance of E-an Zen and Pete Toulmin (both at U.S. Geological Survey (USGS), Reston, Va.), who saw a need for greater interaction among those interested in granites and for renewed, focused experimental investigations. They produced two newsletters (which were sent out by direct mail) and organized an informal meeting at the Geological Society of America meeting at Indianapolis, Ind., and then turned over the FROG reins to Sue Kieffer (USGS, Flagstaff, Ariz.) and John Clemens (Arizona State University, Tempe). They generated another newsletter, which was directly mailed to a readership that had grown beyond 200.
Exploiting Complexity Information for Brain Activation Detection
Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui
2016-01-01
We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. Under this hypothesis, voxel complexity can be modulated by pertinent cognitive tasks and changes across experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model based Statistical Parametric Mapping package (SPM12), and a marked difference is observed. This is because the SampEn method detects changes in brain complexity between the two experimental conditions, and as a data-driven method it evaluates only the complexity of the specific sequential fMRI data. Moreover, larger and smaller SampEn values carry different meanings, and the neutral-blank design produces higher predictability than the threat-neutral design. Complexity information can be considered a complementary method to existing fMRI analysis strategies, and it may help improve the understanding of human brain functions from a different perspective. PMID:27045838
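Sample entropy itself is well defined in the literature. A compact NumPy implementation, with the common parameter choices m = 2 and tolerance r = 0.2 times the series' standard deviation and self-matches excluded, illustrates why a noisy series scores as more complex than a regular one:

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Sample entropy: -log of the conditional probability that runs
    matching for m points (within tolerance r*std) also match for m+1.
    Self-matches are excluded, as in the standard definition."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            dist = np.abs(templates - templates[i]).max(axis=1)
            count += int((dist <= tol).sum()) - 1   # drop the self-match
        return count
    return -np.log(count_matches(m + 1) / count_matches(m))

rng = np.random.default_rng(2)
t = np.arange(300)
regular = np.sin(0.2 * t)       # highly predictable signal
noise = rng.normal(size=300)    # unpredictable signal
print(sampen(regular) < sampen(noise))   # noise is the more complex series
```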
Similarity-based gene detection: using COGs to find evolutionarily-conserved ORFs.
Powell, Bradford C; Hutchison, Clyde A
2006-01-19
Experimental verification of gene products has not kept pace with the rapid growth of microbial sequence information. However, existing annotations of gene locations contain sufficient information to screen for probable errors. Furthermore, comparisons among genomes become more informative as more genomes are examined. We studied all open reading frames (ORFs) of at least 30 codons from the genomes of 27 sequenced bacterial strains. We grouped the potential peptide sequences encoded from the ORFs by forming Clusters of Orthologous Groups (COGs). We used this grouping in order to find homologous relationships that would not be distinguishable from noise when using simple BLAST searches. Although COG analysis was initially developed to group annotated genes, we applied it to the task of grouping anonymous DNA sequences that may encode proteins. "Mixed COGs" of ORFs (clusters in which some sequences correspond to annotated genes and some do not) are attractive targets when seeking errors of gene prediction. Examination of mixed COGs reveals some situations in which genes appear to have been missed in current annotations and a smaller number of regions that appear to have been annotated as gene loci erroneously. This technique can also be used to detect potential pseudogenes or sequencing errors. Our method uses an adjustable parameter for degree of conservation among the studied genomes (stringency). We detail results for one level of stringency at which we found 83 potential genes which had not previously been identified, 60 potential pseudogenes, and 7 sequences with existing gene annotations that are probably incorrect. Systematic study of sequence conservation offers a way to improve existing annotations by identifying potentially homologous regions where the annotation of the presence or absence of a gene is inconsistent among genomes.
High-Threshold Fault-Tolerant Quantum Computation with Analog Quantum Error Correction
NASA Astrophysics Data System (ADS)
Fukui, Kosuke; Tomita, Akihisa; Okamoto, Atsushi; Fujii, Keisuke
2018-04-01
To implement fault-tolerant quantum computation with continuous variables, the Gottesman-Kitaev-Preskill (GKP) qubit has been recognized as an important technological element. However, it is still challenging to experimentally generate the GKP qubit with the squeezing level, 14.8 dB, required by existing fault-tolerant quantum computation schemes. To reduce this requirement, we propose a high-threshold fault-tolerant quantum computation scheme with GKP qubits using topologically protected measurement-based quantum computation with the surface code. By harnessing analog information contained in the GKP qubits, we apply analog quantum error correction to the surface code. Furthermore, we develop a method to prevent the squeezing level from decreasing during the construction of the large-scale cluster states for the topologically protected measurement-based quantum computation. We numerically show that the required squeezing level can be relaxed to less than 10 dB, which is within reach of current experimental technology. Hence, this work considerably alleviates this experimental requirement and takes a step closer to the realization of large-scale quantum computation.
Accurate thermoelastic tensor and acoustic velocities of NaCl
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcondes, Michel L., E-mail: michel@if.usp.br; Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455; Shukla, Gaurav, E-mail: shukla@physics.umn.edu
Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.
NASA Technical Reports Server (NTRS)
Love, Eugene S.
1957-01-01
An analysis has been made of available experimental data to show the effects of most of the variables that predominate in determining base pressure at supersonic speeds. The analysis covers base pressures for two-dimensional airfoils and for bodies of revolution with and without stabilizing fins and is restricted to turbulent boundary layers. The present status of available experimental information is summarized, as are the existing methods for predicting base pressure. A simple semiempirical method is presented for estimating base pressure. For two-dimensional bases, this method stems from an analogy established between the base-pressure phenomena and the peak pressure rise associated with separation of the boundary layer. An analysis made for axially symmetric flow indicates that the base pressure for bodies of revolution is subject to the same analogy. Based upon the methods presented, estimations are made of such effects as Mach number, angle of attack, boattailing, fineness ratio, and fins. These estimations give fair predictions of experimental results. (author)
Cross-correlation between EMG and center of gravity during quiet stance: theory and simulations.
Kohn, André Fabio
2005-11-01
Several signal processing tools have been employed in the experimental study of the postural control system in humans. Among them, the cross-correlation function has been used to analyze the time relationship between signals such as the electromyogram and the horizontal projection of the center of gravity. The common finding is that the electromyogram precedes the biomechanical signal, a result that has been interpreted in different ways, for example, the existence of feedforward control or the preponderance of a velocity feedback. It is shown here, analytically and by simulation, that the cross-correlation function is dependent in a complicated way on system parameters and on noise spectra. Results similar to those found experimentally, e.g., electromyogram preceding the biomechanical signal may be obtained in a postural control model without any feedforward control and without any velocity feedback. Therefore, correct interpretations of experimentally obtained cross-correlation functions may require additional information about the system. The results extend to other biomedical applications where two signals from a closed loop system are cross-correlated.
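The time relationship described here is what a lagged cross-correlation estimates. Below is a small sketch with surrogate signals in which the "EMG" leads the "centre of gravity" trace by a known 25 samples; the signal model is purely illustrative and is not the paper's postural control model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Surrogate closed-loop signals: the "EMG" leads the "centre of gravity"
# (CoG) trace by a known 25 samples (purely illustrative signal model).
n, true_lag = 1000, 25
drive = np.convolve(rng.normal(size=n + 50), np.ones(10) / 10, mode="same")
emg = drive[50:]                                   # EMG tracks the drive now
cog = drive[50 - true_lag:n + 50 - true_lag]       # CoG follows 25 samples later
cog = cog + 0.1 * rng.normal(size=n)

def xcorr_lag(a, b, max_lag=100):
    """Lag (in samples) maximizing the cross-correlation of a and b.
    A positive result means a precedes b."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    lags = list(range(-max_lag, max_lag + 1))
    cc = [np.mean(a[:len(a) - k] * b[k:]) if k >= 0
          else np.mean(a[-k:] * b[:len(b) + k]) for k in lags]
    return lags[int(np.argmax(cc))]

estimated_lag = xcorr_lag(emg, cog)
print(estimated_lag)
```

As the paper cautions, recovering such a positive lag does not by itself identify the mechanism (feedforward control versus velocity feedback) that produced it.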
Shanks, Ryan A; Robertson, Chuck L; Haygood, Christian S; Herdliksa, Anna M; Herdliska, Heather R; Lloyd, Steven A
2017-01-01
Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted an exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors that provide a valid means to assess this overlay model's ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and a comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and the EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.
The FTS atomic spectrum tool (FAST) for rapid analysis of line spectra
NASA Astrophysics Data System (ADS)
Ruffoni, M. P.
2013-07-01
The FTS Atomic Spectrum Tool (FAST) is an interactive graphical program designed to simplify the analysis of atomic emission line spectra obtained from Fourier transform spectrometers. Calculated, predicted and/or known experimental line parameters are loaded alongside experimentally observed spectral line profiles for easy comparison between new experimental data and existing results. Many such line profiles, which could span numerous spectra, may be viewed simultaneously to help the user detect problems from line blending or self-absorption. Once the user has determined that their experimental line profile fits are good, a key feature of FAST is the ability to calculate atomic branching fractions, transition probabilities, and oscillator strengths (and their uncertainties), which is not provided by existing analysis packages.
Program Summary
Program title: FAST: The FTS Atomic Spectrum Tool
Catalogue identifier: AEOW_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOW_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License version 3
No. of lines in distributed program, including test data, etc.: 293058
No. of bytes in distributed program, including test data, etc.: 13809509
Distribution format: tar.gz
Programming language: C++
Computer: Intel x86-based systems
Operating system: Linux/Unix/Windows
RAM: 8 MB minimum; about 50-200 MB for a typical analysis
Classification: 2.2, 2.3, 21.2
Nature of problem: Visualisation of atomic line spectra, including the comparison of theoretical line parameters with experimental atomic line profiles; accurate intensity calibration of experimental spectra; and the determination of observed relative line intensities that are needed for calculating atomic branching fractions and oscillator strengths.
Solution method: FAST is centred around a graphical interface, where a user may view sets of experimental line profiles and compare them to calculated data (such as from the Kurucz database [1]), predicted line parameters, and/or previously known experimental results. With additional information on the spectral response of the spectrometer, obtained from a calibrated standard light source, FT spectra may be intensity calibrated. In turn, this permits the user to calculate atomic branching fractions and oscillator strengths, and their respective uncertainties.
Running time: Open ended; defined by the user.
References: [1] R.L. Kurucz (2007). URL http://kurucz.harvard.edu/atoms/.
Labelling effects and adolescent responses to peers with depression: an experimental investigation.
Dolphin, Louise; Hennessy, Eilis
2017-06-24
The impact of illness labels on the stigma experiences of individuals with mental health problems is a matter of ongoing debate. Some argue that labels have a negative influence on judgments and should be avoided in favour of information emphasising the existence of a continuum of mental health/illness. Others believe that behavioural symptoms are more powerful influencers of stigma than labels. The phenomenon has received little attention in adolescent research, despite the critical importance of the peer group at this developmental stage. This study employs a novel experimental design to examine the impact of the depression label and continuum information on adolescents' responses to peers with depression. Participants were 156 adolescents, 76 male and 80 female (M = 16.25 years; SD = 0.361), assigned to one of three conditions (Control, Label, Continuum). Participants responded to four audio-visual vignette characters (two clinically depressed) on three occasions. Outcome measures included judgments of the mental health of the vignette characters and emotional responses to them. Neither the provision of a depression label nor continuum information influenced perceptions of the mental health of the characters in the audio-visual vignettes or participants' emotional responses to them. The findings have implications for the design of interventions to combat depression stigma among adolescents. Interventions should not necessarily target perceptions of psychiatric labels, but rather perceptions of symptomatic behaviour.
Global Design Optimization for Aerodynamics and Rocket Propulsion Components
NASA Technical Reports Server (NTRS)
Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)
2000-01-01
Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. 
Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
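A polynomial response surface of the kind reviewed here can be fit by ordinary least squares. Below is a minimal two-variable quadratic example; the response function and sample sizes are hypothetical stand-ins for expensive CFD or experimental evaluations:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical two-variable design problem: noisy samples of an unknown
# response, standing in for expensive CFD or experimental evaluations.
x1 = rng.uniform(-1, 1, 60)
x2 = rng.uniform(-1, 1, 60)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2 + 0.05 * rng.normal(size=60)

# Quadratic response surface:
# y ~ c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x1*x2 + c5*x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x1 * x2, x2 ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(p1, p2):
    """Cheap surrogate that an optimizer can query instead of the solver."""
    return float(coef @ np.array([1.0, p1, p2, p1 ** 2, p1 * p2, p2 ** 2]))

print(round(surrogate(0.5, -0.5), 2))
```

In a global optimization loop, a search algorithm evaluates this surrogate instead of the underlying solver; the article's comparison of polynomials with neural networks concerns exactly this surrogate-building step.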
Wang, Yongcui; Chen, Shilong; Deng, Naiyang; Wang, Yong
2013-01-01
Computational inference of novel therapeutic values for existing drugs, i.e., drug repositioning, offers great promise for faster, lower-risk drug development. Previous research has indicated that chemical structures, target proteins, and side-effects can provide rich information for assessing drug similarity and, in turn, disease similarity. However, each single data source is important in its own way, and data integration holds great promise for repositioning drugs more accurately. Here, we propose a new method for drug repositioning, PreDR (Predict Drug Repositioning), that integrates molecular structure, molecular activity, and phenotype data. Specifically, we characterize each drug by its profile in chemical structure, target protein, and side-effect space, and define a kernel function to correlate drugs with diseases. We then train a support vector machine (SVM) to computationally predict novel drug-disease interactions. PreDR is validated on a well-established drug-disease network with 1,933 interactions among 593 drugs and 313 diseases. By cross-validation, we find that chemical structure, drug target, and side-effect information are all predictive of drug-disease relationships, and that more experimentally observed drug-disease interactions can be revealed by integrating these three data sources. Comparison with existing methods demonstrates that PreDR is competitive in both accuracy and coverage. Follow-up database searches and pathway analysis indicate that our new predictions are worthy of further experimental validation. In particular, several novel predictions are supported by clinical trials databases, which shows the significant prospects of PreDR in future drug treatment. In conclusion, our new method, PreDR, can serve as a useful tool in drug discovery to efficiently identify novel drug-disease interactions. In addition, our heterogeneous data integration framework can be applied to other problems. PMID:24244318
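The data integration step in PreDR rests on the fact that a convex combination of valid kernels is itself a valid kernel. Here is a sketch with random stand-in profiles for the chemical-structure, target-protein, and side-effect spaces; the RBF kernel, weights, and profile dimensions are illustrative assumptions, not PreDR's actual descriptors:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6  # toy set of drugs

def rbf_from_profiles(P, gamma=1.0):
    """Gaussian (RBF) kernel from feature profiles, one row per drug."""
    d2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Random stand-ins for profiles in chemical-structure, target-protein and
# side-effect space (PreDR uses curated descriptors for each).
K_chem = rbf_from_profiles(rng.normal(size=(n, 4)))
K_target = rbf_from_profiles(rng.normal(size=(n, 4)))
K_side = rbf_from_profiles(rng.normal(size=(n, 4)))

# A convex combination of valid kernels is itself a valid kernel, so the
# fused matrix can be handed directly to any kernel machine such as an SVM.
w = np.array([0.4, 0.3, 0.3])
K = w[0] * K_chem + w[1] * K_target + w[2] * K_side

eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-9)   # fused kernel stays positive semidefinite
```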
NASA Astrophysics Data System (ADS)
Flubacher, Moritz; Sedlmeier, Katrin; Lechthaler, Filippo; Rohrer, Mario; Cristobal, Lizet; Vinogradova, Alexandra
2017-04-01
In the semi-arid Altiplano in Peru, smallholder farmers are extremely exposed to climatic hazards like drought, frost and hail. These unfavorable weather and climate events can lead to significant crop losses and thereby provoke periods of food insecurity for subsistence farmers. The use of specific climate information can serve as an adaptation strategy to reduce the impact of these natural hazards. In this context, the Climandes project (a project of the Global Framework for Climate Services led by WMO) aims at developing user-tailored seasonal forecast products for the agricultural sector in the Peruvian Andes such as indices on increased frost risk, the occurrence of long dry periods, or the start of the rainy season. In order to develop such user-tailored climate information and link it efficiently to the existing implementation context, it is important to understand the complex interrelation between climate variability and change, socio-economic vulnerability and adaptation limits. Moreover, as it has been widely shown, the process of making climate information useful for end-users, in particular for smallholder farmers in developing countries, remains a considerable challenge due to existing cognitive, cultural and institutional constraints. In this sense, it is necessary to identify these constraints and formulate strategies to overcome them. While there exist different studies about climate change and anomalies in Puno, there is no consolidated evidence on the corresponding socio-economic vulnerabilities in the specific agricultural context of Puno. In order to fill this gap, we conducted a field survey collecting primary data in the Andean highlands based on a representative sample of 726 smallholder farmers in the region of Puno (Peru). 
The assessment primarily focused on exploring smallholders' agro-climatic risk exposure, socio-economic profiles, existing coping strategies, and prevailing barriers to the utilization of science-based climate information. The study was complemented with an artefactual experimental game performed with 176 smallholders to identify and describe their risk preferences. The existing economic literature shows that farmers' risk preferences generally play a decisive role in agricultural decision-making, indicating the importance of understanding farmers' risk profiles when evaluating the potential use of climate information at the individual level. First results indicate that smallholders in the region are regularly exposed to extreme weather events such as frost, hailstorms and droughts. Under these conditions, farmers often do not have the capacity or sufficient resources to prevent periods of food insecurity at the end of the growing period. Climate information can support agricultural production decisions and improve food security, but only if it is developed in close collaboration with the end-users.
DNA as information: at the crossroads between biology, mathematics, physics and chemistry.
Cartwright, Julyan H E; Giannerini, Simone; González, Diego L
2016-03-13
On the one hand, biology, chemistry and also physics tell us how the process of translating the genetic information into life could possibly work, but we are still very far from a complete understanding of this process. On the other hand, mathematics and statistics give us methods to describe such natural systems, or parts of them, within a theoretical framework. Also, they provide us with hints and predictions that can be tested at the experimental level. Furthermore, there are peculiar aspects of the management of genetic information that are intimately related to information theory and communication theory. This theme issue is aimed at fostering the discussion on the problem of genetic coding and information through the presentation of different innovative points of view. The aim of the editors is to stimulate discussions and scientific exchange that will lead to new research on why and how life can exist from the point of view of the coding and decoding of genetic information. The present introduction represents the point of view of the editors on the main aspects that could be the subject of future scientific debate. © 2016 The Author(s).
Short text sentiment classification based on feature extension and ensemble classifier
NASA Astrophysics Data System (ADS)
Liu, Yang; Zhu, Xie
2018-05-01
With the rapid development of Internet social media, mining the emotional tendencies of short texts from the Internet to acquire useful information has attracted the attention of researchers. The commonly used methods can be divided into rule-based classification and statistical machine-learning classification. Although micro-blog sentiment analysis has made good progress, shortcomings remain, such as limited accuracy and strong dependence of the sentiment classification effect. Aiming at the characteristics of Chinese short texts, such as little information, sparse features and diverse expressions, this paper expands the original text by mining related semantic information from comments, forwarding and other related information. First, Word2vec is used to compute word similarity and extend the feature words. Then an ensemble classifier composed of SVM, KNN and HMM analyzes the sentiment of micro-blog short texts. The experimental results show that the proposed method makes good use of comment and forwarding information to extend the original features; compared with the traditional method, the accuracy, recall and F1 value are all improved.
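The feature-extension step can be sketched as follows: compute cosine similarity between word vectors and append each feature word's nearest neighbours. The toy vectors, threshold and helper names below are illustrative stand-ins, not the paper's actual Word2vec model or parameters.

```python
import numpy as np

# Toy word vectors standing in for a trained Word2vec model; in the paper's
# pipeline these would come from a model trained on the micro-blog corpus.
vectors = {
    "good":  np.array([0.9, 0.1, 0.0]),
    "great": np.array([0.85, 0.15, 0.05]),
    "bad":   np.array([-0.8, 0.2, 0.1]),
    "food":  np.array([0.1, 0.9, 0.3]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def expand_features(words, vectors, k=1, threshold=0.8):
    """Extend each feature word with its k most similar vocabulary words."""
    expanded = list(words)
    for w in words:
        if w not in vectors:
            continue
        sims = sorted(
            ((cosine(vectors[w], vectors[c]), c)
             for c in vectors if c != w and c not in expanded),
            reverse=True,
        )
        expanded += [c for s, c in sims[:k] if s >= threshold]
    return expanded

print(expand_features(["good"], vectors))  # appends "great", the nearest neighbour
```

In the full pipeline the expanded feature set would then be fed to the SVM/KNN/HMM ensemble rather than printed.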
Kohlmayer, Florian; Prasser, Fabian; Kuhn, Klaus A
2015-12-01
With the ARX data anonymization tool, structured biomedical data can be de-identified using syntactic privacy models, such as k-anonymity. Data is transformed with two methods: (a) generalization of attribute values, followed by (b) suppression of data records. The former method results in data that is well suited for analyses by epidemiologists, while the latter significantly reduces loss of information. Our tool uses an optimal anonymization algorithm that maximizes output utility according to a given measure. To achieve scalability, existing optimal anonymization algorithms exclude parts of the search space by predicting the outcome of data transformations regarding privacy and utility without explicitly applying them to the input dataset. These optimizations cannot be used if data is transformed with generalization and suppression. As optimal data utility and scalability are important for anonymizing biomedical data, we had to develop a novel method. In this article, we first confirm experimentally that combining generalization with suppression significantly increases data utility. Next, we prove that, within this coding model, the outcome of data transformations regarding privacy and utility cannot be predicted. As a consequence, existing algorithms fail to deliver optimal data utility, a finding we confirm experimentally. The limitation of previous work can be overcome at the cost of increased computational complexity; however, scalability is important for anonymizing data with user feedback. Consequently, we identify properties of datasets that can still be predicted in our context and propose a novel, efficient algorithm. Finally, we evaluate our solution with multiple datasets and privacy models. This work presents the first thorough investigation of which properties of datasets can be predicted when data is anonymized with generalization and suppression.
Our novel approach adapts existing optimization strategies to our context and combines different search methods. The experiments show that our method efficiently solves a broad spectrum of anonymization problems. Our work shows that implementing syntactic privacy models is challenging and that existing algorithms are not well suited for anonymizing data with transformation models more complex than generalization alone. As such models have been recommended for use in the biomedical domain, our results are of general relevance for de-identifying structured biomedical data. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
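A minimal sketch of the coding model discussed here: generalize a quasi-identifier, then suppress records that still fall in groups smaller than k. The column layout and generalization rule are invented for illustration; this is not ARX's actual API.

```python
from collections import Counter

def generalize_age(age):
    """Generalize an exact age to a decade interval, e.g. 34 -> '30-39'."""
    lo = (age // 10) * 10
    return f"{lo}-{lo + 9}"

def anonymize(records, k=2):
    """Generalize quasi-identifiers, then suppress records violating k-anonymity."""
    coded = [(generalize_age(age), zip3) for age, zip3 in records]
    counts = Counter(coded)
    # Suppression: drop records whose quasi-identifier group has < k members.
    return [row for row in coded if counts[row] >= k]

# Hypothetical records: (age, 3-digit ZIP prefix) as quasi-identifiers.
data = [(31, "021"), (34, "021"), (47, "021"), (45, "021"), (52, "990")]
out = anonymize(data, k=2)
print(out)  # the lone ('50-59', '990') record is suppressed
```

The point made in the abstract is that, unlike generalization alone, the utility of the output after this suppression step cannot be predicted without actually applying the transformation.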
Annotating novel genes by integrating synthetic lethals and genomic information
Schöner, Daniel; Kalisch, Markus; Leisner, Christian; Meier, Lukas; Sohrmann, Marc; Faty, Mahamadou; Barral, Yves; Peter, Matthias; Gruissem, Wilhelm; Bühlmann, Peter
2008-01-01
Background: Large-scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large-scale screen far exceed the capacities for experimental characterization of every identified target. Thus, there is a need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results: We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members of this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Unlike existing methods that require large amounts of synthetic lethal data, our method relies only on synthetic lethality information from two single screens. Using a multivariate Gaussian mixture model, we determine the best subset of features that assigns the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates, and we present she1 (YBL031W) as a novel gene involved in spindle migration. We also applied the statistical methodology to TOR2 signaling as another example. Conclusion: We demonstrate the general use of multivariate Gaussian mixture modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process. PMID:18194531
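The group-assignment step can be sketched with an off-the-shelf Gaussian mixture model; the synthetic feature matrix below merely stands in for the integrated gene-level features (expression, phenotypic profile, etc.) described above.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical feature matrix: rows are target genes from the synthetic-lethal
# screens, columns are integrated features (e.g. expression, phenotype profile).
rng = np.random.default_rng(0)
candidates = rng.normal(loc=2.0, scale=0.3, size=(10, 3))   # putative pathway members
background = rng.normal(loc=0.0, scale=0.3, size=(40, 3))
X = np.vstack([candidates, background])

# A two-component multivariate Gaussian mixture assigns each gene to a group;
# the smaller, tighter component is the candidate set for follow-up experiments.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)

print(np.bincount(labels))  # group sizes: 10 candidates vs. 40 background genes
```

In the paper, this clustering is combined with a search over feature subsets; here a single fixed feature set keeps the sketch short.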
A Robust Adaptive Autonomous Approach to Optimal Experimental Design
NASA Astrophysics Data System (ADS)
Gu, Hairong
Experimentation is the fundamental tool of scientific inquiry for understanding the laws governing nature and human behavior. Many complex real-world experimental scenarios, particularly those in quest of prediction accuracy, are difficult to conduct with existing experimental procedures, for two reasons. First, existing procedures require a parametric model to serve as a proxy for the latent data structure or data-generating mechanism at the start of an experiment; for the scenarios of concern, however, a sound model is often unavailable beforehand. Second, such scenarios usually contain a large number of design variables, which potentially leads to a lengthy and costly data collection cycle, and existing procedures are unable to optimize large-scale experiments so as to minimize experimental length and cost. Facing these two challenges, the aim of the present study is to develop a new experimental procedure that allows an experiment to be conducted without assuming a parametric model while still achieving satisfactory prediction, and that optimizes experimental designs to improve the efficiency of an experiment. The new procedure is named the robust adaptive autonomous system (RAAS). RAAS is a procedure for sequential experiments composed of multiple experimental trials, performing function estimation, variable selection, reverse prediction and design optimization on each trial.
Directly addressing these challenges, function estimation and variable selection are performed by data-driven modeling methods that generate a predictive model from data collected during the course of an experiment, removing the requirement for a parametric model at the outset; design optimization selects experimental designs on the fly, based on their usefulness, so that the fewest designs are needed to reach useful inferential conclusions. Technically, function estimation is realized by Bayesian P-splines, variable selection by a Bayesian spike-and-slab prior, reverse prediction by grid search, and design optimization by concepts from active learning. The present study demonstrated that RAAS achieves statistical robustness by making accurate predictions without assuming a parametric model as a proxy for the latent data structure, whereas existing procedures can draw poor statistical inferences if a misspecified model is assumed; RAAS also achieves inferential efficiency by needing fewer designs to acquire useful statistical inferences than non-optimal procedures. Thus, RAAS is expected to be a principled solution for real-world experimental scenarios pursuing robust prediction and efficient experimentation.
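The design-optimization idea, choosing the next design where model predictions are most uncertain, can be sketched as follows; the sine-function ensemble is a stand-in for posterior draws of the fitted P-spline model, not RAAS itself.

```python
import numpy as np

# Active-learning sketch: among candidate designs, run next the one where an
# ensemble of fitted models disagrees most (largest predictive spread).
rng = np.random.default_rng(1)
designs = np.linspace(0.0, 1.0, 11)                      # candidate design points
ensemble = [lambda x, a=a: np.sin(a * x)                 # stand-in posterior draws
            for a in rng.uniform(2.5, 3.5, 20)]

def next_design(designs, ensemble):
    preds = np.array([[f(x) for x in designs] for f in ensemble])
    uncertainty = preds.std(axis=0)                      # disagreement per design
    return designs[int(np.argmax(uncertainty))]

x_next = next_design(designs, ensemble)
print(x_next)  # the design with the largest predictive spread is run next
```

After observing the outcome at `x_next`, the model would be refit and the selection repeated, which is the sequential trial loop the abstract describes.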
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-27
... of information technology. Experimental Study of Graphic Cigarette Warning Labels--(OMB Control... graphic warnings required by the Tobacco Control Act. The experimental study data will be collected from...] Agency Information Collection Activities; Proposed Collection; Comment Request; Experimental Study of...
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.
2017-10-01
One of the integration goals of the STAR experiment's modular Messaging Interface and Reliable Architecture framework (MIRA) is to provide seamless and automatic connections with existing control systems. After an initial proof of concept and operation of the MIRA system as a parallel data collection system for online use and real-time monitoring, the STAR Software and Computing group is now working on the integration of the Experimental Physics and Industrial Control System (EPICS) with MIRA's interfaces. The goals of this integration are to allow functional interoperability and, later on, to replace the existing legacy Detector Control System components at the service level. In this report, we describe the evolutionary integration process and, as an example, discuss the EPICS Alarm Handler conversion. We review the complete upgrade procedure, starting with the propagation of EPICS-originated alarm signals into MIRA, followed by the replacement of the existing operator interface based on the Motif Editor and Display Manager (MEDM) with a modern, portable web-based Alarm Handler interface. To achieve this aim, we built an EPICS-to-MQTT bridging service [8] and recreated the functionality of the original Alarm Handler using low-latency web messaging technologies. The integration of EPICS alarm handling into our messaging framework allowed STAR to improve the DCS alarm awareness of existing STAR DAQ and RTS services, which use MIRA as a primary source of experiment control information.
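The bridging idea can be sketched as a pure translation step from an EPICS-style alarm record to an MQTT topic and JSON payload; the topic layout, severity table and field names below are assumptions for illustration, not STAR's actual schema.

```python
import json

# EPICS alarm severities are small integers; map them to readable labels.
SEVERITIES = {0: "NO_ALARM", 1: "MINOR", 2: "MAJOR", 3: "INVALID"}

def alarm_to_mqtt(pv_name, severity, value):
    """Translate an alarm on a process variable into an MQTT (topic, payload) pair."""
    topic = "dcs/alarms/" + pv_name.replace(":", "/").lower()
    payload = json.dumps({
        "pv": pv_name,
        "severity": SEVERITIES.get(severity, "UNKNOWN"),
        "value": value,
    })
    return topic, payload

topic, payload = alarm_to_mqtt("TPC:Anode:Voltage", 2, 1385.0)
print(topic)    # dcs/alarms/tpc/anode/voltage
print(payload)
```

In a real bridge this pair would be handed to an MQTT client for publishing; keeping the translation pure makes it easy to test without a broker.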
Calibration of streamflow gauging stations at the Tenderfoot Creek Experimental Forest
Scott W. Woods
2007-01-01
We used tracer based methods to calibrate eleven streamflow gauging stations at the Tenderfoot Creek Experimental Forest in western Montana. At six of the stations the measured flows were consistent with the existing rating curves. At Lower and Upper Stringer Creek, Upper Sun Creek and Upper Tenderfoot Creek the published flows, based on the existing rating curves,...
Experimental and CFD evidence of multiple solutions in a naturally ventilated building.
Heiselberg, P; Li, Y; Andersen, A; Bjerre, M; Chen, Z
2004-02-01
This paper considers the existence of multiple solutions to natural ventilation of a simple one-zone building, driven by combined thermal and opposing wind forces. The present analysis is an extension of an earlier analytical study of natural ventilation in a fully mixed building, and includes the effect of thermal stratification. Both computational and experimental investigations were carried out in parallel with an analytical investigation. When flow is dominated by thermal buoyancy, it was found experimentally that there is thermal stratification. When the flow is wind-dominated, the room is fully mixed. Results from all three methods have shown that the hysteresis phenomena exist. Under certain conditions, two different stable steady-state solutions are found to exist by all three methods for the same set of parameters. As shown by both the computational fluid dynamics (CFD) and experimental results, one of the solutions can shift to another when there is a sufficient perturbation. These results have probably provided the strongest evidence so far for the conclusion that multiple states exist in natural ventilation of simple buildings. Different initial conditions in the CFD simulations led to different solutions, suggesting that caution must be taken when adopting the commonly used 'zero initialization'.
THE BLAZAR EMISSION ENVIRONMENT: INSIGHT FROM SOFT X-RAY ABSORPTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furniss, A.; Williams, D. A.; Fumagalli, M.
Collecting experimental insight into the relativistic particle populations and emission mechanisms at work within TeV-emitting blazar jets, which are spatially unresolvable in most bands and have strong beaming factors, is a daunting task. New observational information has the potential to lead to major strides in understanding the acceleration site parameters. Detection of molecular carbon monoxide (CO) in TeV-emitting blazars, however, implies the existence of intrinsic gas, a connection often found in photo-dissociated region models and numerical simulations. The existence of intrinsic gas within a blazar could provide a target photon field for Compton up-scattering of photons to TeV energies by relativistic particles. We investigate the possible existence of intrinsic gas within the three TeV-emitting blazars RGB J0710+591, W Comae, and 1ES 1959+650, which have measurements or upper limits on molecular CO line luminosity, using an independent technique based on the spectral analysis of soft X-rays. Evidence for X-ray absorption by additional gas beyond that measured within the Milky Way is searched for in Swift X-ray Telescope (XRT) data between 0.3 and 10 keV. Without complementary information from another measurement, additional absorption could be misinterpreted as an intrinsically curved X-ray spectrum, since both models can frequently fit the soft X-ray data. After breaking this degeneracy, we do not find evidence for intrinsically curved spectra for any of the three blazars. Moreover, no evidence for intrinsic gas is evident for RGB J0710+591 and W Comae, while the 1ES 1959+650 XRT data support the existence of intrinsic gas with a column density of ≈1 × 10^21 cm^-2.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-16
... notice solicits comments on research entitled ``Experimental Study: Disease Information in Branded Promotional Material.'' The proposed research will explore the nature of including information about a disease...] Agency Information Collection Activities; Proposed Collection; Comment Request; Experimental Study...
González-Beltrán, Alejandra N; Yong, May Y; Dancey, Gairin; Begent, Richard
2012-01-06
Biology, biomedicine and healthcare have become data-driven enterprises, where scientists and clinicians need to generate, access, validate, interpret and integrate different kinds of experimental and patient-related data. Thus, recording and reporting of data in a systematic and unambiguous fashion is crucial to allow aggregation and re-use of data. This paper reviews the benefits of existing biomedical data standards and focuses on key elements to record experiments for therapy development. Specifically, we describe the experiments performed in molecular, cellular, animal and clinical models. We also provide an example set of elements for a therapy tested in a phase I clinical trial. We introduce the Guidelines for Information About Therapy Experiments (GIATE), a minimum information checklist creating a consistent framework to transparently report the purpose, methods and results of the therapeutic experiments. A discussion on the scope, design and structure of the guidelines is presented, together with a description of the intended audience. We also present complementary resources such as a classification scheme, and two alternative ways of creating GIATE information: an electronic lab notebook and a simple spreadsheet-based format. Finally, we use GIATE to record the details of the phase I clinical trial of CHT-25 for patients with refractory lymphomas. The benefits of using GIATE for this experiment are discussed. While data standards are being developed to facilitate data sharing and integration in various aspects of experimental medicine, such as genomics and clinical data, no previous work focused on therapy development. We propose a checklist for therapy experiments and demonstrate its use in the 131Iodine labeled CHT-25 chimeric antibody cancer therapy. 
As future work, we will expand the set of GIATE tools to continue to encourage its use by cancer researchers, and we will engineer an ontology to annotate GIATE elements and facilitate unambiguous interpretation and data integration.
Leonidou, Chrysanthi; Panayiotou, Georgia
2018-08-01
According to the cognitive-behavioral model, illness anxiety is developed and maintained through biased processing of health-threatening information and maladaptive responses to such information. This study is a systematic review of research that attempted to validate central tenets of the cognitive-behavioral model regarding etiological and maintenance mechanisms in illness anxiety. Sixty-two studies, including correlational and experimental designs, were identified through a systematic search of databases and were evaluated for their quality. Outcomes were synthesized following a qualitative thematic approach under categories of theoretically driven mechanisms derived from the cognitive-behavioral model: attention, memory and interpretation biases, perceived awareness and inaccuracy in perception of somatic sensations, negativity bias, emotion dysregulation, and behavioral avoidance. Findings partly support the cognitive-behavioral model, but several of its hypothetical mechanisms only receive weak support due to the scarcity of relevant studies. Directions for future research are suggested based on identified gaps in the existing literature. Copyright © 2018 Elsevier Inc. All rights reserved.
Bowie, Christopher R.; Reichenberg, Abraham; McClure, Margaret M.; Leung, Winnie L.; Harvey, Philip D.
2008-01-01
Cognitive dysfunction is a common feature of schizophrenia; deficits are present before the onset of psychosis and are moderate to severe by the time of the first episode. Controversy exists over the course of cognitive dysfunction after the first episode. This study examined age-associated differences in performance on clinical neuropsychological (NP) and information processing tasks in a sample of geriatric community-living schizophrenia patients (n=172). Compared to healthy control subjects (n=70), people with schizophrenia did not differ on NP tests across age groups but showed evidence of age-associated cognitive worsening on the more complex components of an information-processing test. Age-related changes in cognitive function in schizophrenia may be a function of both the course of illness and the processing demands of the cognitive measure of interest. Tests with fixed difficulty, such as clinical NP tests, may differ in their sensitivity from tests for which parametric difficulty manipulations can be performed. PMID:18053687
NASA Astrophysics Data System (ADS)
Starshynov, I.; Paniagua-Diaz, A. M.; Fayard, N.; Goetschy, A.; Pierrat, R.; Carminati, R.; Bertolotti, J.
2018-04-01
The propagation of monochromatic light through a scattering medium produces speckle patterns in reflection and transmission, and the apparent randomness of these patterns prevents direct imaging through thick turbid media. Yet, since elastic multiple scattering is fundamentally a linear and deterministic process, information is not lost but distributed among many degrees of freedom that can be resolved and manipulated. Here, we demonstrate experimentally that the reflected and transmitted speckle patterns are robustly correlated, and we unravel all the complex and unexpected features of this fundamentally non-Gaussian and long-range correlation. In particular, we show that it is preserved even for opaque media with thickness much larger than the scattering mean free path, proving that information survives the multiple scattering process and can be recovered. The existence of correlations between the two sides of a scattering medium opens up new possibilities for the control of transmitted light without any feedback from the target side, but using only information gathered from the reflected speckle.
A Firefly Algorithm-based Approach for Pseudo-Relevance Feedback: Application to Medical Database.
Khennak, Ilyes; Drias, Habiba
2016-11-01
The difficulty of disambiguating the sense of the incomplete and imprecise keywords that are extensively used in search queries has caused search systems to fail to retrieve the desired information. One of the most powerful and promising methods to overcome this shortcoming and improve the performance of search engines is query expansion, whereby the user's original query is augmented with new keywords that best characterize the user's information needs and produce a more useful query. In this paper, a new Firefly Algorithm-based approach is proposed to enhance the retrieval effectiveness of query expansion while maintaining low computational complexity. In contrast to the existing literature, the proposed approach uses a Firefly Algorithm to find the best expanded query among a set of expanded query candidates. Moreover, the approach allows the length of the expanded query to be determined empirically. Experimental results on MEDLINE, the online medical information database, show that the proposed approach is more effective and efficient than the state of the art.
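A firefly search over candidate expanded queries might be sketched as follows, with each firefly encoding which candidate terms are appended to the original query. The fitness function, attractiveness constant and noise level are all invented stand-ins for retrieval effectiveness measured on real data; this is not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n_terms, n_fireflies, n_iters = 8, 6, 30
target = rng.integers(0, 2, n_terms)      # pretend-optimal subset of expansion terms

def fitness(query):
    """Stand-in for the retrieval effectiveness of an expanded query."""
    return -float(np.abs(query - target).sum())

# Each firefly is a binary vector: which candidate terms join the original query.
pop = rng.integers(0, 2, (n_fireflies, n_terms)).astype(float)
for _ in range(n_iters):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if fitness(pop[j]) > fitness(pop[i]):
                # Move the dimmer firefly toward the brighter one, plus noise.
                pop[i] += 0.9 * (pop[j] - pop[i]) + 0.3 * rng.normal(size=n_terms)
                pop[i] = (pop[i] > 0.5).astype(float)   # back to a binary query

best = max(pop, key=fitness)
print("best expanded-query fitness:", fitness(best))
```

In a real system the fitness would come from evaluating each candidate expanded query against the document collection, which is the expensive step the firefly search tries to economize.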
Evaluating ritual efficacy: evidence from the supernatural.
Legare, Cristine H; Souza, André L
2012-07-01
Rituals pose a cognitive paradox: although widely used to treat problems, rituals are causally opaque (i.e., they lack a causal explanation for their effects). How is the efficacy of ritual action evaluated in the absence of causal information? To examine this question using ecologically valid content, three studies (N=162) were conducted in Brazil, a cultural context in which rituals called simpatias are used to treat a great variety of problems ranging from asthma to infidelity. Using content from existing simpatias, experimental simpatias were designed to manipulate the kinds of information that influences perceptions of efficacy. A fourth study (N=68) with identical stimuli was conducted with a US sample to assess the generalizability of the findings across two different cultural contexts. The results provide evidence that information reflecting intuitive causal principles (i.e., repetition of procedures, number of procedural steps) and transcendental influence (i.e., presence of religious icons) affects how people evaluate ritual efficacy. Copyright © 2012 Elsevier B.V. All rights reserved.
Protein complex prediction in large ontology attributed protein-protein interaction networks.
Zhang, Yijia; Lin, Hongfei; Yang, Zhihao; Wang, Jian; Li, Yanpeng; Xu, Bo
2013-01-01
Protein complexes are important for unraveling the secrets of cellular organization and function. Many computational approaches have been developed to predict protein complexes in protein-protein interaction (PPI) networks. However, most existing approaches focus mainly on the topological structure of PPI networks and largely ignore gene ontology (GO) annotation information. In this paper, we constructed ontology attributed PPI networks from PPI data and the GO resource. After constructing these networks, we proposed a novel approach called CSO (clustering based on network structure and ontology attribute similarity). Structural information and GO attribute information are complementary in ontology attributed networks, and CSO can effectively exploit the correlation between frequent GO annotation sets and dense subgraphs for protein complex prediction. Our CSO approach was applied to four different yeast PPI data sets and predicted many well-known protein complexes. The experimental results show that CSO is valuable for predicting protein complexes and achieves state-of-the-art performance.
Anonymizing and Sharing Medical Text Records
Li, Xiao-Bai; Qin, Jialun
2017-01-01
Health information technology has increased accessibility of health and medical data and benefited medical research and healthcare management. However, there are rising concerns about patient privacy in sharing medical and healthcare data. A large amount of these data are in free text form. Existing techniques for privacy-preserving data sharing deal largely with structured data. Current privacy approaches for medical text data focus on detection and removal of patient identifiers from the data, which may be inadequate for protecting privacy or preserving data quality. We propose a new systematic approach to extract, cluster, and anonymize medical text records. Our approach integrates methods developed in both data privacy and health informatics fields. The key novel elements of our approach include a recursive partitioning method to cluster medical text records based on the similarity of the health and medical information and a value-enumeration method to anonymize potentially identifying information in the text data. An experimental study is conducted using real-world medical documents. The results of the experiments demonstrate the effectiveness of the proposed approach. PMID:29569650
On the implementation of IP protection using biometrics based information hiding and firewall
NASA Astrophysics Data System (ADS)
Basu, Abhishek; Nandy, Kingshuk; Banerjee, Avishek; Giri, Supratick; Sarkar, Souvik; Sarkar, Subir Kumar
2016-02-01
The system-on-chip design style has revolutionized the very large scale integration industry, improving design efficiency, operating speed and development time. To support this process, the reuse and exchange of components in electronic form, called intellectual property (IP), are essential. This, however, increases the risk of IP infringement, so copyright protection of IP against piracy is the most important concern for IP vendors. Existing solutions for IP protection remain unsatisfactory in terms of security, flexibility and cost. This paper proposes an information-hiding-based solution for IP protection that embeds biometric copyright information and a firewall inside an IP in the form of a finite state machine (FSM) with a unique configuration. The scheme first introduces a biometric signature-based copyright as ownership proof. Second, the firewall interrupts the normal functionality of the IP at the end of the licensed user time period. Experimental results from a field-programmable gate array implementation illustrate the efficiency of the proposed method.
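The watermarking idea can be sketched in software terms: extra FSM transitions reproduce the embedded signature only under a secret input sequence, so the vendor can reveal the copyright while normal operation is unaffected. The key, signature and state logic below are invented for illustration; the paper's scheme operates on hardware FSMs.

```python
SECRET_KEY = [1, 0, 1, 1]      # secret input sequence known only to the IP vendor
SIGNATURE  = [0, 1, 1, 0]      # biometric-derived copyright bits to embed

def watermark_output(key_inputs):
    """Walk the hidden watermark states; emit signature bits only on the key path."""
    state, out = 0, []
    for bit, sig in zip(key_inputs, SIGNATURE):
        if bit != SECRET_KEY[state]:
            return []          # wrong key: watermark stays hidden
        out.append(sig)
        state += 1
    return out

print("vendor proof:", watermark_output(SECRET_KEY))   # recovers the signature
print("normal user :", watermark_output([0, 0, 0, 0])) # sees nothing
```

The same pattern extends to the firewall component: a counter state that, once the license period expires, diverts the FSM away from its functional states.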
Delivery of laboratory data with World Wide Web technology.
Hahn, A W; Leon, M A; Klein-Leon, S; Allen, G K; Boon, G D; Patrick, T B; Klimczak, J C
1997-01-01
We have developed an experimental World Wide Web (WWW) based system to deliver laboratory results to clinicians in our Veterinary Medical Teaching Hospital. Laboratory results are generated by the clinical pathology section of our Veterinary Medical Diagnostic Laboratory and stored in a legacy information system. This system does not interface directly to the hospital information system, and it cannot be accessed directly by clinicians. Our "meta" system first parses routine print reports and then instantiates the data into a modern, open-architecture relational database using a data model constructed with currently accepted international standards for data representation and communication. The system does not affect either of the existing legacy systems. Location-independent delivery of patient data is via a secure WWW based system which maximizes usability and allows "value-added" graphic representations. The data can be viewed with any web browser. Future extensibility and intra- and inter-institutional compatibility served as key design criteria. The system is in the process of being evaluated using accepted methods of assessment of information technologies.
Significance of perceptually relevant image decolorization for scene classification
NASA Astrophysics Data System (ADS)
Viswanathan, Sowmya; Divakaran, Govind; Soman, Kutti Padanyl
2017-11-01
Color images contain luminance and chrominance components representing intensity and color information, respectively. The objective of this paper is to show the significance of incorporating chrominance information into the task of scene classification. An improved color-to-grayscale image conversion algorithm that effectively incorporates chrominance information is proposed, using the color-to-gray structure similarity index and singular value decomposition to improve the perceptual quality of the converted grayscale images. Experimental results based on an image quality assessment for image decolorization and its success rate (using the Cadik and COLOR250 datasets) show that the proposed decolorization technique outperforms eight existing benchmark algorithms. In the second part of the paper, the effectiveness of incorporating the chrominance component for scene classification tasks is demonstrated using a deep belief network-based image classification system built on dense scale-invariant feature transforms. The contribution of the chrominance information incorporated by the proposed decolorization technique is confirmed by the improvement in overall scene classification accuracy. Moreover, overall scene classification performance improves further when the models obtained using the proposed method and conventional decolorization methods are combined.
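One simple way to see how SVD can supply data-driven channel weights for color-to-gray conversion is sketched below; it illustrates the idea of going beyond fixed luminance weights and is not the paper's exact C2G-SSIM/SVD algorithm.

```python
import numpy as np

# Stand-in RGB image with values in [0, 1]; a real pipeline would load an image.
rng = np.random.default_rng(3)
image = rng.random((32, 32, 3))

# Flatten to an (n_pixels, 3) matrix and take the leading right singular vector
# as data-driven RGB weights, instead of fixed luminance coefficients.
pixels = image.reshape(-1, 3)
_, _, vt = np.linalg.svd(pixels, full_matrices=False)
weights = np.abs(vt[0])
weights /= weights.sum()          # normalize so the gray image stays in [0, 1]

gray = (pixels @ weights).reshape(32, 32)
print(weights)                    # per-channel contribution to the gray image
```

Because the weights adapt to the image content, chrominance variation that fixed luminance weights would discard can still contribute contrast to the grayscale result.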
Classical Wave Model of Quantum-Like Processing in Brain
NASA Astrophysics Data System (ADS)
Khrennikov, A.
2011-01-01
We discuss the conjecture on quantum-like (QL) processing of information in the brain. It is not based on the physical quantum brain (e.g., Penrose), i.e., on quantum physical carriers of information. In our approach the brain creates the QL representation (QLR) of information in Hilbert space and uses quantum information rules in decision making. The existence of such a QLR has been (at least preliminarily) confirmed by experimental data from cognitive psychology. The violation of the law of total probability in these experiments is an important sign of the nonclassicality of the data. In the so-called "constructive wave function approach" such data can be represented by complex amplitudes. We presented [1, 2] the QL model of decision making. In this paper we speculate on a possible physical realization of the QLR in the brain: a classical wave model producing the QLR. It is based on the variety of time scales in the brain. Each pair of scales (fine, the background fluctuations of the electromagnetic field, and rough, the cognitive image scale) induces a QL representation. The background field plays the crucial role in the creation of "superstrong QL correlations" in the brain.
Information technology model for evaluating emergency medicine teaching
NASA Astrophysics Data System (ADS)
Vorbach, James; Ryan, James
1996-02-01
This paper describes work in progress to develop an Information Technology (IT) model and supporting information system for the evaluation of clinical teaching in the Emergency Medicine (EM) Department of North Shore University Hospital. In the academic hospital setting student physicians, i.e. residents, and faculty function daily in their dual roles as teachers and students respectively, and as health care providers. Databases exist that are used to evaluate both groups in either academic or clinical performance, but rarely has this information been integrated to analyze the relationship between academic performance and the ability to care for patients. The goal of the IT model is to improve the quality of teaching of EM physicians by enabling the development of integrable metrics for faculty and resident evaluation. The IT model will include (1) methods for tracking residents in order to develop experimental databases; (2) methods to integrate lecture evaluation, clinical performance, resident evaluation, and quality assurance databases; and (3) a patient flow system to monitor patient rooms and the waiting area in the Emergency Medicine Department, to record and display status of medical orders, and to collect data for analyses.
Link prediction with node clustering coefficient
NASA Astrophysics Data System (ADS)
Wu, Zhihao; Lin, Youfang; Wang, Jing; Gregory, Steve
2016-06-01
Predicting missing links in incomplete complex networks efficiently and accurately is still a challenging problem. The recently proposed Cannistraci-Alanis-Ravasi (CAR) index shows the power of local link/triangle information in improving link-prediction accuracy. Inspired by the idea of employing local link/triangle information, we propose a new similarity index that uses more local structure information. In our method, local link/triangle structure information is conveyed directly by the clustering coefficient of common neighbors. The clustering coefficient is effective in estimating the contribution of a common neighbor because it counts the links existing between the neighbors of that common neighbor, and these links occupy the same structural position, relative to the common neighbor, as the candidate link. In our experiments, three estimators (precision, AUP and AUC) are used to evaluate the accuracy of link prediction algorithms. Experimental results on ten test networks drawn from various fields show that our new index is more effective in predicting missing links than the CAR index, especially for networks with low correlation between the number of common neighbors and the number of links between common neighbors.
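The idea of scoring a candidate link by the clustering coefficients of its common neighbors can be sketched as follows. This is a minimal illustration of the principle, not the authors' exact index; the toy graph and function names are invented:

```python
from itertools import combinations

def clustering_coefficient(adj, node):
    """Fraction of neighbor pairs of `node` that are themselves linked."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
    return 2.0 * links / (k * (k - 1))

def cn_clustering_score(adj, x, y):
    """Similarity of the candidate link (x, y): sum of clustering
    coefficients over the common neighbors of x and y."""
    return sum(clustering_coefficient(adj, z) for z in adj[x] & adj[y])

# Toy undirected graph as an adjacency dict of sets.
adj = {
    "a": {"c", "d"},
    "b": {"c", "d"},
    "c": {"a", "b", "d"},
    "d": {"a", "b", "c"},
}
# Candidate link (a, b): common neighbors c and d are linked to each
# other, so each contributes a high clustering coefficient (2/3).
print(round(cn_clustering_score(adj, "a", "b"), 3))  # 1.333
```

A common neighbor embedded in many triangles contributes more to the score, which is the local-community intuition behind the CAR family of indices.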
Evidence Combination From an Evolutionary Game Theory Perspective.
Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu
2016-09-01
Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. The theory provides Dempster's rule of combination to synthesize multiple pieces of evidence from various information sources. However, in some cases, counter-intuitive results may be obtained from that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results from perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper takes a biological and evolutionary perspective to study the combination of evidence. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidence, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
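Dempster's rule, and the kind of counter-intuitive result it can produce, can be reproduced with Zadeh's classic example: two sources each assign mass 0.99 to conflicting propositions and only 0.01 to a shared one, yet the combination assigns all mass to the barely supported proposition. A minimal sketch (the ECR itself is not reproduced here):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets. Conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
m1 = {A: 0.99, B: 0.01}   # source 1: almost certainly A
m2 = {C: 0.99, B: 0.01}   # source 2: almost certainly C
print(dempster_combine(m1, m2))  # B receives all the mass
```

Because A and C never intersect, all of their mass is discarded as conflict and the normalization inflates B to certainty 1.0, despite both sources considering B nearly impossible. This is the paradox that alternative rules such as the ECR aim to suppress.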
Bailey, Susan F; Bataillon, Thomas
2016-01-01
There have been a variety of approaches taken to try to characterize and identify the genetic basis of adaptation in nature, spanning theoretical models, experimental evolution studies and direct tests of natural populations. Theoretical models can provide formalized and detailed hypotheses regarding evolutionary processes and patterns, from which experimental evolution studies can then provide important proofs of concept and characterize what is biologically reasonable. Genetic and genomic data from natural populations then allow for the identification of the particular factors that have played, and continue to play, an important role in shaping adaptive evolution in the natural world. Further to this, experimental evolution studies allow for tests of theories that may be difficult or impossible to test in natural populations for logistical and methodological reasons, and can even generate new insights, suggesting further refinement of existing theories. However, as experimental evolution studies often take place in a very particular set of controlled conditions--that is, simple environments, a small range of usually asexual species, and relatively short timescales--the question remains as to how applicable these experimental results are to natural populations. In this review, we discuss important insights coming from experimental evolution, focusing on four key topics tied to the evolutionary genetics of adaptation, and within those topics we discuss the extent to which the experimental work complements and informs natural population studies. We finish by making suggestions for future work, in particular a need for natural population genomic time-series data, as well as the necessity for studies that combine both experimental evolution and natural population approaches. © 2015 The Authors. Molecular Ecology Published by John Wiley & Sons Ltd.
Kβ Mainline X-ray Emission Spectroscopy as an Experimental Probe of Metal–Ligand Covalency
2015-01-01
The mainline feature in metal Kβ X-ray emission spectroscopy (XES) has long been recognized as an experimental marker for the spin state of the metal center. However, even within a series of metal compounds with the same nominal oxidation and spin state, significant changes are observed that cannot be explained on the basis of overall spin. In this work, the origin of these effects is explored, both experimentally and theoretically, in order to develop the chemical information content of Kβ mainline XES. Ligand field expressions are derived that describe the behavior of Kβ mainlines for first row transition metals with any dn count, allowing for a detailed analysis of the factors governing mainline shape. Further, due to limitations associated with existing computational approaches, we have developed a new methodology for calculating Kβ mainlines using restricted active space configuration interaction (RAS–CI) calculations. This approach eliminates the need for empirical parameters and provides a powerful tool for investigating the effects that chemical environment exerts on the mainline spectra. On the basis of a detailed analysis of the intermediate and final states involved in these transitions, we confirm the known sensitivity of Kβ mainlines to metal spin state via the 3p–3d exchange coupling. Further, a quantitative relationship between the splitting of the Kβ mainline features and the metal–ligand covalency is established. Thus, this study furthers the quantitative electronic structural information that can be extracted from Kβ mainline spectroscopy. PMID:24914450
Semantic modeling of plastic deformation of polycrystalline rock
NASA Astrophysics Data System (ADS)
Babaie, Hassan A.; Davarpanah, Armita
2018-02-01
We have developed the first iteration of the Plastic Rock Deformation (PRD) ontology by modeling the semantics of a selected set of deformational processes and mechanisms that produce, reconfigure, displace, and/or consume the material components of inhomogeneous polycrystalline rocks. The PRD knowledge model also classifies and formalizes the properties (relations) that hold between instances of the dynamic physical and chemical processes and the rock components, the complex physico-chemical, mathematical, and informational concepts of the plastic rock deformation system, the measured or calculated laboratory testing conditions, experimental procedures and protocols, the state and system variables, and the empirical flow laws that define the inter-relationships among the variables. The ontology reuses classes and properties from several existing ontologies that are built for physics, chemistry, biology, and mathematics. With its flexible design, the PRD ontology is well positioned to incrementally develop into a model that more fully represents the knowledge of plastic deformation of polycrystalline rocks in the future. The domain ontology will be used to consistently annotate varied data and information related to the microstructures and the physical and chemical processes that produce them at different spatial and temporal scales in the laboratory and in the solid Earth. The PRDKB knowledge base, when built based on the ontology, will help the community of experimental structural geologists and metamorphic petrologists to coherently and uniformly distribute, discover, access, share, and use their data through automated reasoning and integration and query of heterogeneous experimental deformation data that originate from autonomous rock testing laboratories.
Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua
2018-01-04
Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focus (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset of experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database of experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Mayhew, T M; Desoye, G
2004-07-01
Colloidal gold-labelling, combined with transmission electron microscopy, is a valuable technique for high-resolution immunolocalization of identified antigens in different subcellular compartments. Whilst the technique has been applied to placental tissues, few quantitative studies have been made. Subcellular compartments exist in three main categories (viz. organelles, membranes, filaments/tubules) and this affects the possibilities for quantification. Generally, gold particles are counted in order to compare either (a) compartments within an experimental group or (b) compartmental labelling distributions between groups. For the former, recent developments make it possible to test whether or not there is differential (nonrandom) labelling of compartments. The methods (relative labelling index and labelling density) are ideally suited to analysing label in one category of compartment (organelle or membrane or filament) but may be adapted to deal with a mixture of categories. They also require information about compartment size (e.g. profile area or trace length). Here, a simple and efficient method for drawing between-group comparisons of labelling distributions is presented. The method does not require information about compartment size or specimen magnification. It relies on multistage random sampling of specimens and unbiased counting of gold particles associated with different compartments. Distributions of observed gold counts in different experimental groups are compared by contingency table analysis, with degrees of freedom for chi-squared (chi(2)) values being determined by the numbers of compartments and experimental groups. Compartmental values of chi(2) which contribute substantially to total chi(2) identify the principal subcellular sites of between-group differences. The method is illustrated using datasets from immunolabelling studies on the localization of GLUT1 glucose transporters in cultured human trophoblast cells exposed to different treatments.
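The contingency-table comparison described above can be sketched in a few lines: the chi-squared statistic is computed from observed and expected counts, and the per-cell contributions flag the compartments driving any between-group difference. The gold-particle counts below are hypothetical, not taken from the study:

```python
def chi2_contingency(table):
    """Chi-squared statistic, degrees of freedom and per-cell contributions
    for an r x c count table (rows = groups, columns = compartments)."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    contrib = [[(obs - row_tot[i] * col_tot[j] / grand) ** 2
                / (row_tot[i] * col_tot[j] / grand)
                for j, obs in enumerate(row)]
               for i, row in enumerate(table)]
    chi2 = sum(map(sum, contrib))
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, dof, contrib

# Hypothetical gold-particle counts: 2 groups x 3 compartments.
counts = [[120, 60, 20],   # control
          [ 90, 95, 15]]   # treated
chi2, dof, contrib = chi2_contingency(counts)
print(round(chi2, 2), dof)  # 12.9 2
# The largest per-cell contributions identify the compartment driving
# the between-group difference (here the middle column).
```

Note that, as the abstract emphasizes, only raw counts enter the calculation; no compartment sizes or magnifications are needed for the between-group comparison.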
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existing instruments and creating new ones are offered. This study may help other investigators develop valid, reliable, and practical instruments for measuring the outcomes of educational activities.
3D Microstructures for Materials and Damage Models
Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan
2017-02-01
Many challenges exist with regard to understanding and representing complex physical processes involved with ductile damage and failure in polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by: capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development; and, distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.
Evaluation of SAPHIRE: an automated approach to indexing and retrieving medical literature.
Hersh, W.; Hickam, D. H.; Haynes, R. B.; McKibbon, K. A.
1991-01-01
An analysis of SAPHIRE, an experimental information retrieval system featuring automated indexing and natural language retrieval, was performed on MEDLINE references using data previously generated for a MEDLINE evaluation. Compared with searches performed by novice and expert physicians using MEDLINE, SAPHIRE achieved comparable recall and precision. While its combined recall and precision performance did not equal the level of librarians, SAPHIRE did achieve a significantly higher level of absolute recall. SAPHIRE has other potential advantages over existing MEDLINE systems. Its natural language interface does not require knowledge of MeSH, and it provides relevance ranking of retrieved references. PMID:1807718
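Recall and precision, the metrics used in this evaluation, can be computed per query as follows; the reference identifiers are invented:

```python
def precision_recall(retrieved, relevant):
    """Standard IR metrics for one query: precision is the fraction of
    retrieved references that are relevant; recall is the fraction of
    relevant references that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical query: 10 references retrieved, 8 truly relevant overall,
# 6 of which appear in the retrieved set.
retrieved = ["r1", "r2", "r3", "r4", "r5", "r6", "r7", "r8", "r9", "r10"]
relevant  = ["r1", "r2", "r3", "r4", "r5", "r6", "r11", "r12"]
print(precision_recall(retrieved, relevant))  # (0.6, 0.75)
```

The trade-off the abstract describes follows directly from these definitions: a system can raise absolute recall by retrieving more references, at the possible cost of precision.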
Research and Implementation of Tibetan Word Segmentation Based on Syllable Methods
NASA Astrophysics Data System (ADS)
Jiang, Jing; Li, Yachao; Jiang, Tao; Yu, Hongzhi
2018-03-01
Tibetan word segmentation (TWS) is an important problem in Tibetan information processing, and abbreviated word recognition is one of the key and most difficult problems in TWS. Most existing methods of Tibetan abbreviated word recognition are rule-based approaches, which need vocabulary support. In this paper, we propose a method based on a sequence tagging model for abbreviated word recognition, and then implement it in TWS systems with sequence labeling models. The experimental results show that our abbreviated word recognition method is fast and effective and can be combined easily with the segmentation model. This significantly improves the performance of Tibetan word segmentation.
Regenerative memory in time-delayed neuromorphic photonic resonators
NASA Astrophysics Data System (ADS)
Romeira, B.; Avó, R.; Figueiredo, José M. L.; Barland, S.; Javaloyes, J.
2016-01-01
We investigate a photonic regenerative memory based upon a neuromorphic oscillator with a delayed self-feedback (autaptic) connection. We disclose the existence of a unique temporal response characteristic of localized structures enabling an ideal support for bits in an optical buffer memory for storage and reshaping of data information. We link our experimental implementation, based upon a nanoscale nonlinear resonant tunneling diode driving a laser, to the paradigm of neuronal activity, the FitzHugh-Nagumo model with delayed feedback. This proof-of-concept photonic regenerative memory might constitute a building block for a new class of neuron-inspired photonic memories that can handle high bit-rate optical signals.
Eckert, K; Lange, M
2016-06-01
Exercise programmes do not yet belong to standard treatment within disease management programmes (DMPs) for diabetes mellitus type 2. For this reason, the effects of a 10-week behaviour-oriented exercise programme were evaluated, focusing on changes in activity behaviour and health-related quality of life. 202 patients took part in the investigation. There were significant between-group differences in some aspects of the outcome parameters. The study presents useful information on how to modify existing DMPs successfully to improve patient treatment. © Georg Thieme Verlag KG Stuttgart · New York.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murokh, A.; Pellegrini, C.; Rosenzweig, J.
The VISA (Visible to Infrared SASE Amplifier) project is designed to be a SASE-FEL driven to saturation in the sub-micron wavelength region. Its goal is to test various aspects of the existing theory of Self-Amplified Spontaneous Emission, as well as numerical codes. Measurements include: angular and spectral distribution of the FEL light at the exit and inside of the undulator; electron beam micro-bunching using CTR; single-shot time resolved measurements of the pulse profile, using auto-correlation technique and FROG algorithm. The diagnostics are designed to provide maximum information on the physics of the SASE-FEL process, to ensure a close comparison of the experimental results with theory and simulations.
NASA Astrophysics Data System (ADS)
Chen, Zi-Yu; Chen, Shi; Dan, Jia-Kun; Li, Jian-Feng; Peng, Qi-Xian
2011-10-01
A simple one-dimensional analytical model for electromagnetic emission from an unmagnetized wakefield excited by an intense short-pulse laser in the nonlinear regime has been developed in this paper. The expressions for the spectral and angular distributions of the radiation have been derived. The model suggests that the origin of the radiation can be attributed to the violent sudden acceleration of plasma electrons experiencing the accelerating potential of the laser wakefield. The radiation process could help to provide a qualitative interpretation of existing experimental results, and offers useful information for future laser wakefield experiments.
Low-speed single-element airfoil synthesis
NASA Technical Reports Server (NTRS)
Mcmasters, J. H.; Henderson, M. L.
1979-01-01
The use of recently developed airfoil analysis/design computational tools to clarify, enrich and extend the existing experimental data base on low-speed, single element airfoils is demonstrated. A discussion of the problem of tailoring an airfoil for a specific application at its appropriate Reynolds number is presented. This problem is approached by use of inverse (or synthesis) techniques, wherein a desirable set of boundary layer characteristics, performance objectives, and constraints are specified, which then leads to derivation of a corresponding viscous flow pressure distribution. Examples are presented which demonstrate the synthesis approach, following presentation of some historical information and background data which motivate the basic synthesis process.
Enhanced low-rank representation via sparse manifold adaption for semi-supervised learning.
Peng, Yong; Lu, Bao-Liang; Wang, Suhang
2015-05-01
Constructing an informative and discriminative graph plays an important role in various pattern recognition tasks such as clustering and classification. Among the existing graph-based learning models, low-rank representation (LRR) is a very competitive one, which has been extensively employed in spectral clustering and semi-supervised learning (SSL). In SSL, the graph is composed of both labeled and unlabeled samples, where the edge weights are calculated based on the LRR coefficients. However, most of existing LRR related approaches fail to consider the geometrical structure of data, which has been shown beneficial for discriminative tasks. In this paper, we propose an enhanced LRR via sparse manifold adaption, termed manifold low-rank representation (MLRR), to learn low-rank data representation. MLRR can explicitly take the data local manifold structure into consideration, which can be identified by the geometric sparsity idea; specifically, the local tangent space of each data point was sought by solving a sparse representation objective. Therefore, the graph to depict the relationship of data points can be built once the manifold information is obtained. We incorporate a regularizer into LRR to make the learned coefficients preserve the geometric constraints revealed in the data space. As a result, MLRR combines both the global information emphasized by low-rank property and the local information emphasized by the identified manifold structure. Extensive experimental results on semi-supervised classification tasks demonstrate that MLRR is an excellent method in comparison with several state-of-the-art graph construction approaches. Copyright © 2015 Elsevier Ltd. All rights reserved.
Visser, Leonie N C; Tollenaar, Marieke S; Bosch, Jos A; van Doornen, Lorenz J P; de Haes, Hanneke C J M; Smets, Ellen M A
2017-01-01
Patients forget 20-80% of information provided during medical consultations. The emotional stress often experienced by patients during consultations could be one of the mechanisms that lead to limited recall. The current experimental study therefore investigated the associations between (analog) patients' psychophysiological arousal, self-reported emotional stress and their (long-term) memory of information provided by the physician. One hundred and eighty-one cancer-naïve individuals acted as so-called analog patients (APs), i.e. they were instructed to watch a scripted video-recording of an oncological bad news consultation while imagining themselves being in the patient's situation. Electrodermal and cardiovascular activity (e.g. skin conductance level and heart rate) were recorded during watching. Self-reported emotional stress was assessed before and after watching, using the STAI-State and seven Visual Analog Scales. Memory, both free recall and recognition, was assessed after 24-28 h. Watching the consultation evoked significant psychophysiological and self-reported stress responses. However, investigating the associations between 24 psychophysiological arousal measures, eight self-reported stress measures and free recall and recognition of information resulted in only one significant, small (partial) correlation (r = 0.19). Considering multiple testing, this significant result was probably due to chance. Alternative analytical methods yielded identical results, strengthening our conclusion that no evidence was found for relationships between the variables of interest. These null findings are highly relevant, as they may be considered to refute the long-standing, but as yet untested, assumption that a relationship between stress and memory exists within this context. Moreover, these findings suggest that lowering patients' stress levels during the consultation would probably not be sufficient to raise memory of information to an optimal level. Alternative explanations for these findings are discussed.
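One standard way to judge whether a lone significant correlation among many tests is likely due to chance is a Bonferroni-style calculation. A minimal sketch; m = 64 is a hypothetical number of tests, not a figure taken from the study:

```python
def bonferroni_alpha(alpha, m):
    """Per-test significance threshold after Bonferroni correction,
    controlling the family-wise error rate across m tests."""
    return alpha / m

def expected_false_positives(alpha, m):
    """Expected number of 'significant' results under the global null
    when m independent tests are each run at level alpha."""
    return alpha * m

m = 64           # hypothetical number of correlation tests
alpha = 0.05
print(bonferroni_alpha(alpha, m))          # 0.00078125
print(expected_false_positives(alpha, m))  # 3.2
```

With dozens of correlations each tested at the conventional 0.05 level, a handful of nominally significant results is expected even when no true relationship exists, which is the reasoning behind attributing the single r = 0.19 finding to chance.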
Prahl, Andrew; Dexter, Franklin; Braun, Michael T; Van Swol, Lyn
2013-11-01
Because operating room (OR) management decisions are made under ubiquitous biases even when optimal choices exist, such decisions are improved with decision-support systems. We reviewed experimental social-psychology studies to explore what an OR leader can do when working with stakeholders who lack interest in learning the OR management science but express opinions about decisions nonetheless. We considered shared information to include the rules of thumb (heuristics) that make intuitive sense and often seem "close enough" (e.g., staffing is planned based on the average workload). We considered unshared information to include the relevant mathematics (e.g., staffing calculations). Multiple studies have shown that group discussions focus more on shared than unshared information. Quality decisions are more likely when all group participants share knowledge (e.g., have taken a course in OR management science). Several biases in OR management are caused by humans' limited ability to estimate the tails of probability distributions in their heads. Groups are more susceptible to analogous biases than are educated individuals. Since optimal solutions are not demonstrable without groups sharing a common language, a knowledgeable individual can influence the group only when most group members have been educated. The appropriate model of decision-making is autocratic, with information obtained from stakeholders. Although such decisions are of good quality, the leaders often are disliked and the decisions considered unjust. In conclusion, leaders will find the most success if they do not bring OR management operational decisions to groups, but instead act autocratically while obtaining necessary information in 1:1 conversations. The only known route for the leader making such decisions to be considered likable and for the decisions to be considered fair is through colleagues and subordinates learning the management science.
Shanks, Ryan A.; Robertson, Chuck L.; Haygood, Christian S.; Herdliksa, Anna M.; Herdliska, Heather R.; Lloyd, Steven A.
2017-01-01
Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model’s ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads. PMID:28904647
Does Informal Employment Exist in the United States and Other Developed Countries?
Siqueira, Carlos E
2016-08-01
This editorial argues that informal employment does exist in developed countries and needs to be studied as such to complement the existing literature mostly published on informal work in developing countries. © The Author(s) 2016.
Model building with wind and water: Friedrich Ahlborn's photo-optical flow analysis.
Hinterwaldner, Inge
2015-02-01
Around 1900, several experimenters investigated turbulence in wind tunnels or water basins by creating visualizations. One of them, the German zoologist Friedrich Ahlborn (1858-1937), was familiar with the work of his contemporaries but struck a new path. He combined three different kinds of photographs that were taken at the same time and showed the same situation in his water trough, but each in a different way. With this first basic operation, Ahlborn heuristically opened up a previously non-existent space for experimentation, analysis, and recombination. He generated an astonishing diversity of information by adopting a tactic of 'inversions', in which he interpreted one part of the experimental setup, or its results, in different ways. Between the variants of the 'autographs' he developed, he defined areas of intersection so as to be able to translate results from individual records into each other. To this end, Ahlborn created other sets of visual artifacts such as drawn diagrams, three-dimensional wire-frame constructions, and clay reliefs. His working method can be described as a cascading array of successive modeling steps, as elaborated by Eric Winsberg (1999), or of inscriptions, in Bruno Latour's words (Latour, 1986). By examining Ahlborn's procedures closely, we propose conceptualizations for the experimenter's various operations. Copyright © 2014 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
...; Proposed Revision To the Nonessential Experimental Population of the Mexican Wolf AGENCY: Fish and Wildlife...), propose to revise the existing nonessential experimental population designation of the Mexican wolf (Canis... nonessential experimental population designation of Mexican wolves in order to correctly associate this...
An object-oriented software approach for a distributed human tracking motion system
NASA Astrophysics Data System (ADS)
Micucci, Daniela L.
2003-06-01
Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and intra-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... Extension of Existing Information Collection; Rock Burst Control Plan, Metal and Nonmetal Mines AGENCY: Mine... extension of the information collection for 30 CFR 57.3461 Rock Bursts. DATES: All comments must be received... contains the request for an extension of the existing collection of information in 30 CFR 57.3461 Rock...
Exploring relations between task conflict and informational conflict in the Stroop task.
Entel, Olga; Tzelgov, Joseph; Bereby-Meyer, Yoella; Shahar, Nitzan
2015-11-01
In this study, we tested the proposal that the Stroop task involves two conflicts--task conflict and informational conflict. Task conflict was defined as the latency difference between color words and non-letter neutrals, and manipulated by varying the proportion of color words versus non-letter neutrals. Informational conflict was defined as the latency difference between incongruent and congruent trials and manipulated by varying the congruent-to-incongruent trial ratio. We replicated previous findings showing that increasing the ratio of incongruent-to-congruent trials reduces the latency difference between the incongruent and congruent condition (i.e., informational conflict), as does increasing the proportion of color words (i.e., task conflict). A significant under-additive interaction between the two proportion manipulations (congruent vs. incongruent and color words vs. neutrals) indicated that the effects of task conflict and informational conflict were not additive. By assessing task conflict as the contrast between color words and neutrals, we found that task conflict existed in all of our experimental conditions. Under specific conditions, when task conflict dominated behavior by explaining most of the variability between congruency conditions, we also found negative facilitation, thus demonstrating that this effect is a special case of task conflict.
Log-Gabor Weber descriptor for face recognition
NASA Astrophysics Data System (ADS)
Li, Jing; Sang, Nong; Gao, Changxin
2015-09-01
The Log-Gabor transform, which is suitable for analyzing gradually changing data such as in iris and face images, has been widely used in image processing, pattern recognition, and computer vision. In most cases, only the magnitude or phase information of the Log-Gabor transform is considered. However, the complementary effect taken by combining magnitude and phase information simultaneously for an image-feature extraction problem has not been systematically explored in the existing works. We propose a local image descriptor for face recognition, called Log-Gabor Weber descriptor (LGWD). The novelty of our LGWD is twofold: (1) to fully utilize the information from the magnitude or phase feature of multiscale and orientation Log-Gabor transform, we apply the Weber local binary pattern operator to each transform response. (2) The encoded Log-Gabor magnitude and phase information are fused at the feature level by utilizing kernel canonical correlation analysis strategy, considering that feature level information fusion is effective when the modalities are correlated. Experimental results on the AR, Extended Yale B, and UMIST face databases, compared with those available from recent experiments reported in the literature, show that our descriptor yields a better performance than state-of-the-art methods.
Kasper, Jürgen; Heesen, Christoph; Köpke, Sascha; Mühlhauser, Ingrid; Lenz, Matthias
2011-01-01
Statistical health risk information has been proven confusing and difficult to understand. While existing research indicates that presenting risk information in frequency formats is superior to relative risk and probability formats, the optimal design of frequency formats is still unclear. The aim of this study was to compare presentation of multi-figure pictographs in consecutive and random arrangements regarding accuracy of perception and vulnerability to cognitive bias. A total of 111 patients with multiple sclerosis were randomly assigned to two experimental conditions: patient information using 100-figure pictographs in 1) unsorted (UP group) or 2) consecutive arrangement (CP group). The experiment was framed as patient information on how risks and benefits can be explained. The information comprised two scenarios of a treatment decision with varying levels of emotional relevance. The primary outcome measure was accuracy of information recall (errors made when recalling previously presented frequencies of benefits and side effects). Cognitive bias was measured as additional error appearing with higher emotional involvement. The uncertainty tolerance scale and a set of items to assess risk attribution were surveyed. The study groups did not differ in their accuracy of recalling benefits, but recall of side effects was more accurate in the CP group. Cognitive bias when recalling benefits was higher in the UP group than in the CP group and equal for side effects in both groups. Results were similar in subgroup analyses of patients 1) with highly irrational risk attribution, 2) with experience regarding the hypothetical contents, or 3) with experience regarding pictograph presentation of frequencies. Overall, benefit was overestimated by more than 100%, and variance of recall was extremely high. Consecutive arrangement as commonly used seems not clearly superior to unsorted arrangement, which is closer to reality.
Generally poor performance and the correspondingly high variance of recall might have clouded existing effects of the arrangement types. More research is needed with varying proportions and other samples.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... information technology, e.g., permitting electronic submission of responses. Overview of This Information... Information Collection Activities: Extension, Without Change, of an Existing Information Collection; Comment Request. ACTION: 60-Day Notice of Information Collection; No form; Emergency Federal Law Enforcement...
Cofta-Woerpel, Ludmila; Randhawa, Veenu; McFadden, H Gene; Fought, Angela; Bullard, Emily; Spring, Bonnie
2009-12-02
High-quality cancer information resources are available but underutilized by the public. Despite greater awareness of the National Cancer Institute's Cancer Information Service among low-income African Americans and Hispanics compared with Caucasians, actual Cancer Information Service usage is lower than expected, paralleling excess cancer-related morbidity and mortality for these subgroups. The proposed research examines how to connect the Cancer Information Service to low-income African-American and Hispanic women and their health care providers. The study will examine whether targeted physician mailing to women scheduled for colposcopy to follow up an abnormal Pap test can increase calls to the Cancer Information Service, enhance appropriate medical follow-up, and improve satisfaction with provider-patient communication. The study will be conducted in two clinics in ethnically diverse low-income communities in Chicago. During the formative phase, patients and providers will provide input regarding materials planned for use in the experimental phase of the study. The experimental phase will use a two-group prospective randomized controlled trial design. African American and Hispanic women with an abnormal Pap test will be randomized to Usual Care (routine colposcopy reminder letter) or Intervention (reminder plus provider recommendation to call the Cancer Information Service and sample questions to ask). Primary outcomes will be: 1) calls to the Cancer Information Service; 2) timely medical follow-up, operationalized by whether the patient keeps her colposcopy appointment within six months of the abnormal Pap; and 3) patient satisfaction with provider-patient communication at follow-up. The study examines the effectiveness of a feasible, sustainable, and culturally sensitive strategy to increase awareness and use of the Cancer Information Service among an underserved population. 
The goal of linking a public service (the Cancer Information Service) with real-life settings of practice (the clinics), and considering input from patients, providers, and Cancer Information Service staff, is to ensure that the intervention, if proven effective, can be incorporated into existing care systems and sustained. The approach to study design and planning is aimed at bridging the gap between research and practice/service. NCT00873288.
Extension of the sasCIF format and its applications for data processing and deposition
Kachala, Michael; Westbrook, John; Svergun, Dmitri
2016-02-01
Recent advances in small-angle scattering (SAS) experimental facilities and data analysis methods have prompted a dramatic increase in the number of users and of projects conducted, causing an upsurge in the number of objects studied, experimental data available and structural models generated. To organize the data and models and make them accessible to the community, the Task Forces on SAS and hybrid methods for the International Union of Crystallography and the Worldwide Protein Data Bank envisage developing a federated approach to SAS data and model archiving. Within the framework of this approach, the existing databases may exchange information and provide independent but synchronized entries to users. At present, ways of exchanging information between the various SAS databases are not established, leading to possible duplication and incompatibility of entries, and limiting the opportunities for data-driven research for SAS users. In this work, a solution is developed to resolve these issues and provide a universal exchange format for the community, based on the use of the widely adopted crystallographic information framework (CIF). The previous version of the sasCIF format, implemented as an extension of the core CIF dictionary, has been available since 2000 to facilitate SAS data exchange between laboratories. The sasCIF format has now been extended to describe comprehensively the necessary experimental information, results and models, including relevant metadata for SAS data analysis and for deposition into a database. Processing tools for these files (sasCIFtools) have been developed, and these are available both as standalone open-source programs and integrated into the SAS Biological Data Bank, allowing the export and import of data entries as sasCIF files. Software modules to save the relevant information directly from beamline data-processing pipelines in sasCIF format are also developed.
Lastly, this update of sasCIF and the relevant tools are an important step in the standardization of the way SAS data are presented and exchanged, to make the results easily accessible to users and to promote further the application of SAS in the structural biology community.
Advanced Capabilities for Wind Tunnel Testing in the 21st Century
NASA Technical Reports Server (NTRS)
Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.
2010-01-01
Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate, higher-quality test results by eliminating uncertainties in testing and to facilitate verification of computational tools for design. This paper discusses near-term developments underway in ground testing capabilities, which will enhance the quality of information on both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; using efficient experiment methodologies, including quality assurance strategies within the test; and increasing test result information density by using extensive optical visualization together with computed flow field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.
Working memory in healthy aging and in Parkinson's disease: evidence of interference effects.
Di Rosa, Elisa; Pischedda, Doris; Cherubini, Paolo; Mapelli, Daniela; Tamburin, Stefano; Burigo, Michele
2017-05-01
Focusing on relevant information while suppressing irrelevant information are critical abilities for different cognitive processes. However, their functioning has been scarcely investigated in the working memory (WM) domain, in both healthy and pathological conditions. The present research aimed to study these abilities in aging and Parkinson's disease (PD), testing three groups of healthy participants (young, older and elderly) and one group of PD patients, employing a new experimental paradigm. Results showed that the transient storing of irrelevant information in WM causes substantial interference effects, which were remarkable in elderly individuals on both response latency and accuracy. Interestingly, PD patients responded faster and were equally accurate compared to a matched control group. Taken together, findings confirm the existence of similar mechanisms for orienting attention inwards to WM contents or outwards to perceptual stimuli, and suggest the suitability of our task to assess WM functioning in both healthy aging and PD.
NASA Technical Reports Server (NTRS)
Sweet, D. C.; Pincura, P. G.; Wukelic, G. E. (Principal Investigator)
1974-01-01
The author has identified the following significant results. During the first year of project effort, the ability to use ERTS-1 imagery for mapping and inventorying strip-mined areas in southeastern Ohio, the potential of using ERTS-1 imagery in water quality and coastal zone management in the Lake Erie region, and the extent to which ERTS-1 imagery could contribute to localized (metropolitan/urban), multicounty, and overall state land use needs were experimentally demonstrated and reported as significant project results. Significant research accomplishments were achieved in the technological development of manual and computerized methods to extract multi-feature as well as singular-feature information from ERTS-1 data, as is exemplified by the forestry transparency overlay. Fabrication of an image transfer device to superimpose ERTS-1 data onto existing maps and other data sources was also a significant analytical accomplishment.
Finger Vein Recognition Based on Local Directional Code
Meng, Xianjing; Yang, Gongping; Yin, Yilong; Xiao, Rongyang
2012-01-01
Finger vein patterns are considered one of the most promising biometric authentication methods due to their security and convenience. Most currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade recognition accuracy, binary pattern based methods have been proposed, such as Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP. PMID:23202194
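As a rough illustration of the kind of coding the abstract describes, the sketch below quantizes each pixel's local gradient orientation into one of eight levels (an octonary code). This is only a schematic in NumPy: the paper's actual LDC operator defines its neighborhood sampling and coding scheme in its own way, and the function name and parameters here are hypothetical.

```python
import numpy as np

def directional_code_sketch(img):
    """Illustrative sketch (not the paper's exact LDC operator): quantize
    each pixel's local gradient orientation into 8 levels, i.e. one octal
    digit (0-7) per pixel."""
    img = img.astype(float)
    gy, gx = np.gradient(img)                  # gradients along rows, columns
    theta = np.arctan2(gy, gx)                 # orientation in (-pi, pi]
    # map (-pi, pi] onto the 8 code values; the modulo handles theta == pi
    code = np.floor(((theta + np.pi) / (2 * np.pi)) * 8).astype(int) % 8
    return code
```

A descriptor would then typically histogram these codes over local blocks and concatenate the histograms into a feature vector.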
An experimental investigation of recruitment bias in eating pathology research.
Moss, Erin L; von Ranson, Kristin M
2006-04-01
Previous, uncontrolled research has suggested a bias may exist in recruiting participants for eating disorder research. Recruitment biases may affect sample representativeness and generalizability of findings. This experiment investigated whether revealing that a study's topic was related to eating disorders created a self-selection bias. Young women at a university responded to advertisements containing contrasting information about the nature of a single study. We recruited one group by advertising the study under the title "Disordered Eating in Young Women" (n = 251) and another group using the title "Consumer Preferences" (n = 259). Results indicated similar levels of eating pathology in both groups, so the different recruitment techniques did not engender self-selection. However, the consumer preferences group scored higher in self-reported social desirability. The level of information conveyed in study advertising does not impact reporting of eating disturbances among nonclinical samples, although there is evidence social desirability might. 2006 by Wiley Periodicals, Inc.
Visual search, visual streams, and visual architectures.
Green, M
1991-10-01
Most psychological, physiological, and computational models of early vision suggest that retinal information is divided into a parallel set of feature modules. The dominant theories of visual search assume that these modules form a "blackboard" architecture: a set of independent representations that communicate only through a central processor. A review of research shows that blackboard-based theories, such as feature-integration theory, cannot easily explain the existing data. The experimental evidence is more consistent with a "network" architecture, which stresses that: (1) feature modules are directly connected to one another, (2) features and their locations are represented together, (3) feature detection and integration are not distinct processing stages, and (4) no executive control process, such as focal attention, is needed to integrate features. Attention is not a spotlight that synthesizes objects from raw features. Instead, it is better to conceptualize attention as an aperture which masks irrelevant visual information.
Kernel-aligned multi-view canonical correlation analysis for image recognition
NASA Astrophysics Data System (ADS)
Su, Shuzhi; Ge, Hongwei; Yuan, Yun-Hao
2016-09-01
Existing kernel-based correlation analysis methods mainly adopt a single kernel in each view. However, only a single kernel is usually insufficient to characterize nonlinear distribution information of a view. To solve the problem, we transform each original feature vector into a 2-dimensional feature matrix by means of kernel alignment, and then propose a novel kernel-aligned multi-view canonical correlation analysis (KAMCCA) method on the basis of the feature matrices. Our proposed method can simultaneously employ multiple kernels to better capture the nonlinear distribution information of each view, so that correlation features learned by KAMCCA can have good discriminating power in real-world image recognition. Extensive experiments are designed on five real-world image datasets, including NIR face images, thermal face images, visible face images, handwritten digit images, and object images. Promising experimental results on the datasets have manifested the effectiveness of our proposed method.
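For orientation, the linear core that KAMCCA generalizes is classical two-view canonical correlation analysis, which can be sketched in a few lines of NumPy. This is not the paper's method (which aligns multiple kernels per view and works on feature matrices); the regularization constant and function name are illustrative assumptions.

```python
import numpy as np

def cca_sketch(X, Y, reg=1e-6):
    """Classical linear CCA for two views X (n x p) and Y (n x q).
    Returns projection matrices Wx, Wy and the canonical correlations s."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])   # regularized covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # whitening factors: Lx @ Lx.T = inv(Cxx), so Lx.T @ Cxx @ Lx = I
    Lx = np.linalg.cholesky(np.linalg.inv(Cxx))
    Ly = np.linalg.cholesky(np.linalg.inv(Cyy))
    # SVD of the whitened cross-covariance gives canonical directions
    U, s, Vt = np.linalg.svd(Lx.T @ Cxy @ Ly)
    return Lx @ U, Ly @ Vt.T, s
```

Kernel variants replace the raw features with (possibly several) kernel-induced representations before this correlation step.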
Autonomous Congestion Control in Delay-Tolerant Networks
NASA Technical Reports Server (NTRS)
Burleigh, Scott; Jennings, Esther; Schoolcraft, Joshua
2006-01-01
Congestion control is an important feature that directly affects network performance. Network congestion may cause loss of data or long delays. Although this problem has been studied extensively in the Internet, the solutions for Internet congestion control do not apply readily to challenged network environments such as Delay Tolerant Networks (DTN) where end-to-end connectivity may not exist continuously and latency can be high. In DTN, end-to-end rate control is not feasible. This calls for congestion control mechanisms where the decisions can be made autonomously with local information only. We use an economic pricing model and propose a rule-based congestion control mechanism where each router can autonomously decide on whether to accept a bundle (data) based on local information such as available storage and the value and risk of accepting the bundle (derived from historical statistics). Preliminary experimental results show that this congestion control mechanism can protect routers from resource depletion without loss of data.
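A toy version of the rule-based admission decision described above might look like the following Python sketch, in which a router accepts or refuses a bundle using only local information (free storage plus an estimated value and risk). The class names, reserve fraction, and acceptance rule are hypothetical illustrations, not the actual mechanism evaluated in the paper.

```python
from dataclasses import dataclass

@dataclass
class Bundle:
    size: int     # bytes of storage the bundle would consume
    value: float  # locally estimated value of custody (e.g., priority)
    risk: float   # locally estimated risk, e.g., from historical statistics

class Router:
    """Hypothetical rule-based DTN admission control using local state only."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.used = 0

    def accept(self, b: Bundle, reserve_frac: float = 0.1) -> bool:
        free = self.capacity - self.used
        if b.size > free:
            return False              # never exceed local storage
        # keep a storage reserve unless value clearly outweighs risk
        if free - b.size < reserve_frac * self.capacity and b.value <= b.risk:
            return False
        self.used += b.size
        return True
```

The point of the sketch is that every input to the decision is local, so no end-to-end signaling is required.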
Trends in Exposure to Chemicals in Personal Care and Consumer Products.
Calafat, Antonia M; Valentin-Blasini, Liza; Ye, Xiaoyun
2015-12-01
Synthetic organic chemicals can be used in personal care and consumer products. Data on potential human health effects of these chemicals are limited-sometimes even contradictory-but because several of these chemicals are toxic in experimental animals, alternative compounds are entering consumer markets. Nevertheless, limited information exists on consequent exposure trends to both the original chemicals and their replacements. Biomonitoring (measuring concentrations of chemicals or their metabolites in people) provides invaluable information for exposure assessment. We use phthalates and bisphenol A-known industrial chemicals-and organophosphate insecticides as case studies to show exposure trends to these chemicals and their replacements (e.g., other phthalates, non-phthalate plasticizers, various bisphenols, pyrethroid insecticides) among the US general population. We compare US trends to national trends from Canada and Germany. Exposure to the original compounds is still prevalent among these general populations, but exposures to alternative chemicals may be increasing.
Clinical Assistant Diagnosis for Electronic Medical Record Based on Convolutional Neural Network.
Yang, Zhongliang; Huang, Yongfeng; Jiang, Yiran; Sun, Yuxi; Zhang, Yu-Jin; Luo, Pengcheng
2018-04-20
Automatically extracting useful information from electronic medical records and conducting disease diagnosis is a promising task for both clinical decision support (CDS) and natural language processing (NLP). Most existing systems are based on artificially constructed knowledge bases, with auxiliary diagnosis then done by rule matching. In this study, we present a clinical intelligent decision approach based on Convolutional Neural Networks (CNN), which can automatically extract high-level semantic information from electronic medical records and then perform automatic diagnosis without artificial construction of rules or knowledge bases. We used 18,590 real-world clinical electronic medical records to train and test the proposed model. Experimental results show that the proposed model can achieve 98.67% accuracy and 96.02% recall, which strongly supports that using a convolutional neural network to automatically learn high-level semantic features of electronic medical records and then assist diagnosis is feasible and effective.
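The core building block of such a text CNN, a convolution filter followed by a ReLU nonlinearity and max-over-time pooling, can be sketched in plain NumPy. This toy function is illustrative only; the paper's actual architecture, filter sizes, and training procedure are not specified here.

```python
import numpy as np

def conv_maxpool_feature(emb, kernel):
    """One text-CNN feature: slide a width-k filter over a sequence of
    word embeddings (n x d), apply ReLU, then max-over-time pooling.
    Returns a single scalar feature for this filter."""
    n, d = emb.shape
    k = kernel.shape[0]                       # filter width, shape (k, d)
    responses = np.array([np.sum(emb[i:i + k] * kernel)
                          for i in range(n - k + 1)])
    return np.maximum(responses, 0.0).max()   # ReLU, then max pooling
```

A full model would learn many such filters and feed the pooled features to a classifier over diagnosis labels.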
Research on data collection key technology of smart electric energy meters
NASA Astrophysics Data System (ADS)
Chen, Xiangqun; Huang, Rui; Shen, Liman; Chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Mou, Hailiu; Xu, Renheng
2018-02-01
In recent years, demand for smart electric energy meters in China has reached 70 to 90 million units per year, driven by strong smart grid construction. However, data collection from smart electric energy meters faces several issues, such as environmental interference, low collection efficiency, and inability to work when the power is off. To address these issues, RFID communication technology is used to collect the meter numbers and electric energy information of smart electric energy meters on the basis of the existing meters, and related data collection communication experiments were conducted. The experimental results show that batch collection of electric energy information and other data from RFID smart electric energy meters is achieved both when powered on and when powered off. Collection efficiency is improved, and the overall success rate is 99.2% within a range of 2 meters. This provides a new method for smart electric energy meter data collection.
Estimation of road profile variability from measured vehicle responses
NASA Astrophysics Data System (ADS)
Fauriat, W.; Mattrand, C.; Gayton, N.; Beakou, A.; Cembrzynski, T.
2016-05-01
When assessing the statistical variability of fatigue loads acting throughout the life of a vehicle, the question of the variability of road roughness naturally arises, as both quantities are strongly related. For car manufacturers, gathering information on the environment in which vehicles evolve is a long and costly but necessary process to adapt their products to durability requirements. In the present paper, a data processing algorithm is proposed in order to estimate the road profiles covered by a given vehicle, from the dynamic responses measured on this vehicle. The algorithm based on Kalman filtering theory aims at solving a so-called inverse problem, in a stochastic framework. It is validated using experimental data obtained from simulations and real measurements. The proposed method is subsequently applied to extract valuable statistical information on road roughness from an existing load characterisation campaign carried out by Renault within one of its markets.
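The filtering recursion underlying such an estimator is the standard linear Kalman filter, sketched below in NumPy. The actual road-profile algorithm inverts a vehicle dynamics model inside this framework; the matrices and function signature here are generic assumptions, not the authors' implementation.

```python
import numpy as np

def kalman_filter(z, F, H, Q, R, x0, P0):
    """Standard linear Kalman filter: state transition F, observation H,
    process noise Q, measurement noise R. Returns the filtered states."""
    x, P = x0, P0
    states = []
    for zk in z:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (zk - H @ x)
        P = (np.eye(len(x0)) - K @ H) @ P
        states.append(x.copy())
    return np.array(states)
```

In a road-profile setting, the state would include the unknown road input and the measurements would be accelerations or suspension deflections measured on the vehicle.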
Wan, Shibiao; Mak, Man-Wai; Kung, Sun-Yuan
2016-12-02
In the postgenomic era, the number of unreviewed protein sequences is remarkably larger and grows tremendously faster than that of reviewed ones. However, existing methods for protein subchloroplast localization often ignore the information from these unlabeled proteins. This paper proposes a multi-label predictor based on ensemble linear neighborhood propagation (LNP), namely, LNP-Chlo, which leverages hybrid sequence-based feature information from both labeled and unlabeled proteins for predicting localization of both single- and multi-label chloroplast proteins. Experimental results on a stringent benchmark dataset and a novel independent dataset suggest that LNP-Chlo performs at least 6% (absolute) better than state-of-the-art predictors. This paper also demonstrates that ensemble LNP significantly outperforms LNP based on individual features. For readers' convenience, the online Web server LNP-Chlo is freely available at http://bioinfo.eie.polyu.edu.hk/LNPChloServer/ .
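For intuition, plain graph label propagation, the family of methods LNP belongs to, can be sketched as an iteration that spreads labels from labeled to unlabeled nodes over an affinity matrix. The paper's ensemble LNP constructs neighborhood weights from hybrid sequence features and handles multi-label outputs; the weight matrix, damping factor, and function name below are illustrative assumptions.

```python
import numpy as np

def propagate_labels(W, Y, alpha=0.9, iters=100):
    """Generic graph label propagation: W is a nonnegative affinity matrix
    (n x n), Y one-hot labels (n x c, zero rows for unlabeled nodes).
    Iterates F <- alpha * S @ F + (1 - alpha) * Y with row-normalized S."""
    S = W / W.sum(axis=1, keepdims=True)   # row-normalize affinities
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F                                # soft label scores per node
```

Unlabeled proteins thus receive scores from their neighbors, which is how unreviewed sequences contribute information in this family of predictors.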
NASA Astrophysics Data System (ADS)
Khoirunnisa, Humaira; Aziz Majidi, Muhammad
2018-04-01
The emergence of an excitonic signal in the optical response of a wide-band-gap semiconductor has been common knowledge in physics. There have been numerous experimental studies exploring the important role of excitons in influencing both the transport and optical properties of materials. Despite the existence of much information on excitonic effects, there has not been much literature that explores a detailed theoretical explanation of how the excitonic signal appears and how it evolves with temperature. Here, we propose a theoretical study on the optical conductivity of ZnO, a well-known wide-band-gap semiconductor that we choose as a case study. ZnO has been known to exhibit excitonic states in its optical spectra in the energy range of ∼3.13-3.41 eV, with a high exciton binding energy of ∼60 meV. An experimental study on ZnO in 2014 revealed such a signal in its optical conductivity spectrum. We present a theoretical investigation of the appearance of the excitonic signal in the optical conductivity of ZnO. We model wurtzite ZnO within an 8-band k.p approximation. We calculate the optical conductivity by incorporating the first-order vertex correction derived from the Feynman diagrams. Our calculation up to the first-order correction qualitatively confirms the existence of excitons in wurtzite ZnO.
Towards Smart Homes Using Low Level Sensory Data
Khattak, Asad Masood; Truc, Phan Tran Ho; Hung, Le Xuan; Vinh, La The; Dang, Viet-Hung; Guan, Donghai; Pervez, Zeeshan; Han, Manhyung; Lee, Sungyoung; Lee, Young-Koo
2011-01-01
Ubiquitous Life Care (u-Life care) is receiving attention because it provides high quality and low cost care services. To provide spontaneous and robust healthcare services, knowledge of a patient’s real-time daily life activities is required. Context information combined with real-time daily life activities can help to provide better services and to improve healthcare delivery. The performance and accuracy of existing life care systems are not reliable, even with a limited number of services. This paper presents a Human Activity Recognition Engine (HARE) that monitors human health as well as activities using heterogeneous sensor technology and processes these activities intelligently on a Cloud platform to provide improved care at low cost. We focus on activity recognition using video-based, wearable sensor-based, and location-based activity recognition engines and then use intelligent processing to analyze the context of the activities performed. The experimental results of all the components showed good accuracy against existing techniques. The system is deployed on the Cloud for Alzheimer’s disease patients (as a case study) with four activity recognition engines to identify low-level activity from the raw data captured by sensors. These activities are then manipulated using ontology to infer higher-level activities, and decisions about a patient’s activity are made using patient profile information and customized rules. PMID:22247682
Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program
NASA Technical Reports Server (NTRS)
Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.
2010-01-01
The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and their effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none has been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that such aeroelastic data sets often focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include the omission of relevant data, such as flutter frequency, and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions. Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.
Dynamic Photorefractive Memory and its Application for Opto-Electronic Neural Networks.
NASA Astrophysics Data System (ADS)
Sasaki, Hironori
This dissertation describes the analysis of photorefractive crystal dynamics and its application to opto-electronic neural network systems. The realization of the dynamic photorefractive memory is investigated in terms of the following aspects: fast memory update, uniform grating multiplexing schedules, and the prevention of partial erasure of existing gratings. The fast memory update is realized by the selective erasure process, which superimposes a new grating on the original one with an appropriate phase shift. The dynamics of the selective erasure process is analyzed using the first-order photorefractive material equations and confirmed experimentally. The effects of beam coupling and fringe bending on the selective erasure dynamics are also analyzed by numerically solving a combination of coupled wave equations and the photorefractive material equation. An incremental recording technique is proposed as a uniform grating multiplexing schedule and compared with the conventional scheduled recording technique in terms of phase distribution in the presence of an external dc electric field, as well as the image gray-scale dependence. The theoretical analysis and experimental results proved the superiority of incremental recording over scheduled recording. A novel recirculating information memory architecture is proposed and experimentally demonstrated to prevent partial degradation of the existing gratings when the memory is accessed. Gratings are circulated through a memory feedback loop based on the incremental recording dynamics, demonstrating robust read/write/erase capabilities. The dynamic photorefractive memory is applied to opto-electronic neural network systems. A module architecture based on the page-oriented dynamic photorefractive memory is proposed. This module architecture can implement two complementary interconnection organizations, fan-in and fan-out. The module system scalability and the learning capabilities are theoretically investigated using the photorefractive dynamics described in previous chapters of the dissertation. The implementation of a feed-forward image compression network with 900 input and 9 output neurons with 6-bit interconnection accuracy is experimentally demonstrated. Learning of a Perceptron network that determines sex based on input face images of 900 pixels is also successfully demonstrated.
Wang, Huilin; Wang, Mingjun; Tan, Hao; Li, Yuan; Zhang, Ziding; Song, Jiangning
2014-01-01
X-ray crystallography is the primary approach to solve the three-dimensional structure of a protein. However, a major bottleneck of this method is the failure of multi-step experimental procedures to yield diffraction-quality crystals, including sequence cloning, protein material production, purification, crystallization and ultimately, structural determination. Accordingly, prediction of the propensity of a protein to successfully undergo these experimental procedures based on the protein sequence may help narrow down laborious experimental efforts and facilitate target selection. A number of bioinformatics methods based on protein sequence information have been developed for this purpose. However, our knowledge on the important determinants of propensity for a protein sequence to produce high diffraction-quality crystals remains largely incomplete. In practice, most of the existing methods display poorer performance when evaluated on larger and updated datasets. To address this problem, we constructed an up-to-date dataset as the benchmark, and subsequently developed a new approach termed 'PredPPCrys' using the support vector machine (SVM). Using a comprehensive set of multifaceted sequence-derived features in combination with a novel multi-step feature selection strategy, we identified and characterized the relative importance and contribution of each feature type to the prediction performance of five individual experimental steps required for successful crystallization. The resulting optimal candidate features were used as inputs to build the first-level SVM predictor (PredPPCrys I). Next, prediction outputs of PredPPCrys I were used as the input to build second-level SVM classifiers (PredPPCrys II), which led to significantly enhanced prediction performance. Benchmarking experiments indicated that our PredPPCrys method outperforms most existing procedures on both up-to-date and previous datasets. In addition, the predicted crystallization targets of currently non-crystallizable proteins were provided as compendium data, which are anticipated to facilitate target selection and design for the worldwide structural genomics consortium. PredPPCrys is freely available at http://www.structbioinfor.org/PredPPCrys.
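The two-level architecture described above (first-level predictors whose outputs feed a second-level SVM) is a form of stacking; a minimal sketch with synthetic data, assuming scikit-learn and hypothetical feature groups standing in for the real sequence-derived features, might look like:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=(n, 10))   # stand-in for one sequence-derived feature group
X2 = rng.normal(size=(n, 8))    # stand-in for a second feature group
y = ((X1[:, 0] + X2[:, 0]) > 0).astype(int)  # toy "crystallizable" label

# Level 1: one probabilistic SVM per feature group; out-of-fold predictions
# keep the level-2 inputs from leaking training labels
p1 = cross_val_predict(SVC(probability=True, random_state=0), X1, y,
                       cv=5, method="predict_proba")[:, 1]
p2 = cross_val_predict(SVC(probability=True, random_state=0), X2, y,
                       cv=5, method="predict_proba")[:, 1]

# Level 2: an SVM stacked on the level-1 outputs
meta_X = np.column_stack([p1, p2])
meta = SVC().fit(meta_X, y)
print("stacked training accuracy:", meta.score(meta_X, y))
```

The design point is that each level-1 model sees only part of the signal, while the level-2 classifier learns how to combine their (calibrated) outputs.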
USAF Flight Test Investigation of Focused Sonic Booms: Project Have Bears
NASA Technical Reports Server (NTRS)
Downing, Micah; Zamot, Noel; Moss, Chris; Morin, Daniel; Wolski, Ed; Chung, Sukhwan; Plotkin, Kenneth; Maglieri, Domenic
1996-01-01
Supersonic operations of military aircraft generate sonic booms that can affect people, animals and structures. A substantial experimental database exists on sonic booms for aircraft in steady flight, and confidence in the predictive techniques has been established. All the focus sonic boom data in existence today were collected during the 1960s and 1970s as part of the information base for the US Supersonic Transport program and the French Jericho studies for the Concorde. These experiments formed the database used to develop sonic boom propagation and prediction theories for focusing. There is renewed interest in high-speed transports for civilian application. Moreover, today's fighter aircraft have better performance capabilities, and supersonic flights are more common during air combat maneuvers. Most of the existing data on focus booms relate to high-speed civil operations such as transitional linear accelerations and mild turns. However, military aircraft operating in training areas perform more drastic maneuvers such as dives and high-g turns. An update and confirmation of USAF prediction capabilities is required to demonstrate the ability to predict and control sonic boom impacts, especially those produced by air combat maneuvers.
Retrieval activates related words more than presentation.
Hausman, Hannah; Rhodes, Matthew G
2018-03-23
Retrieving information enhances learning more than restudying. One explanation of this effect is based on the role of mediators (e.g., sand-castle can be mediated by beach). Retrieval is hypothesised to activate mediators more than restudying, but existing tests of this hypothesis have had mixed results [Carpenter, S. K. (2011). Semantic information activated during retrieval contributes to later retention: Support for the mediator effectiveness hypothesis of the testing effect. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37(6), 1547-1552. doi: 10.1037/a0024140; Lehman, M., & Karpicke, J. D. (2016). Elaborative retrieval: Do semantic mediators improve memory? Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(10), 1573-1591. doi: 10.1037/xlm0000267]. The present experiments explored an explanation of the conflicting results, testing whether mediator activation during a retrieval attempt depends on the accessibility of the target information. A target was considered less versus more accessible when fewer versus more cues were given during retrieval practice (Experiments 1 and 2), when the target had been studied once versus three times initially (Experiment 3), or when the target could not be recalled versus could be recalled during retrieval practice (Experiments 1-3). A mini meta-analysis of all three experiments revealed a small effect such that retrieval activated mediators more than presentation, but mediator activation was not reliably related to target accessibility. Thus, retrieval may enhance learning by activating mediators, in part, but these results suggest the role of other processes, too.
Bayard, David S.; Neely, Michael
2016-01-01
An experimental design approach is presented for individualized therapy in the special case where the prior information is specified by a nonparametric (NP) population model. Here, a nonparametric model refers to a discrete probability model characterized by a finite set of support points and their associated weights. An important question arises as to how to best design experiments for this type of model. Many experimental design methods are based on Fisher Information or other approaches originally developed for parametric models. While such approaches have been used with some success across various applications, it is interesting to note that they largely fail to address the fundamentally discrete nature of the nonparametric model. Specifically, the problem of identifying an individual from a nonparametric prior is more naturally treated as a problem of classification, i.e., to find a support point that best matches the patient’s behavior. This paper studies the discrete nature of the NP experiment design problem from a classification point of view. Several new insights are provided including the use of Bayes Risk as an information measure, and new alternative methods for experiment design. One particular method, denoted as MMopt (Multiple-Model Optimal), will be examined in detail and shown to require minimal computation while having distinct advantages compared to existing approaches. Several simulated examples, including a case study involving oral voriconazole in children, are given to demonstrate the usefulness of MMopt in pharmacokinetics applications. PMID:27909942
Bayard, David S; Neely, Michael
2017-04-01
An experimental design approach is presented for individualized therapy in the special case where the prior information is specified by a nonparametric (NP) population model. Here, a NP model refers to a discrete probability model characterized by a finite set of support points and their associated weights. An important question arises as to how to best design experiments for this type of model. Many experimental design methods are based on Fisher information or other approaches originally developed for parametric models. While such approaches have been used with some success across various applications, it is interesting to note that they largely fail to address the fundamentally discrete nature of the NP model. Specifically, the problem of identifying an individual from a NP prior is more naturally treated as a problem of classification, i.e., to find a support point that best matches the patient's behavior. This paper studies the discrete nature of the NP experiment design problem from a classification point of view. Several new insights are provided including the use of Bayes Risk as an information measure, and new alternative methods for experiment design. One particular method, denoted as MMopt (multiple-model optimal), will be examined in detail and shown to require minimal computation while having distinct advantages compared to existing approaches. Several simulated examples, including a case study involving oral voriconazole in children, are given to demonstrate the usefulness of MMopt in pharmacokinetics applications.
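The classification view described above can be sketched numerically: with a discrete NP prior over, say, elimination rates in a one-compartment model, each candidate sample time can be scored by a pairwise Bhattacharyya-style overbound on Bayes classification risk (the flavor of criterion MMopt uses; the exact weighting and the model below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

# Discrete NP prior: support points (elimination rates) with weights
support = np.array([0.5, 1.0, 2.0])
weights = np.array([0.3, 0.4, 0.3])
sigma = 0.05      # assumed measurement noise s.d.
dose = 1.0

def risk_bound(t):
    # Pairwise overbound on the Bayes risk of misclassifying the patient's
    # support point from a single sample y = dose * exp(-k t) + noise
    y = dose * np.exp(-support * t)
    r = 0.0
    for i in range(len(support)):
        for j in range(i + 1, len(support)):
            r += np.sqrt(weights[i] * weights[j]) * \
                 np.exp(-(y[i] - y[j]) ** 2 / (8 * sigma ** 2))
    return r

times = np.linspace(0.1, 5.0, 50)
best = times[np.argmin([risk_bound(t) for t in times])]
print("best single sample time:", round(best, 2))  # -> 0.9
```

Very early samples are uninformative (all support points predict concentrations near the dose) and very late ones equally so (all predict near zero); the criterion picks an intermediate time that best separates the candidate models.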
Drug-target interaction prediction from PSSM based evolutionary information.
Mousavian, Zaynab; Khakabimamaghani, Sahand; Kavousi, Kaveh; Masoudi-Nejad, Ali
2016-01-01
The labor-intensive and expensive experimental process of determining drug-target interactions has motivated many researchers to focus on in silico prediction, which yields helpful information to support experimental interaction data. They have therefore proposed several computational approaches for discovering new drug-target interactions. Learning-based methods in particular have been increasingly developed; these can be categorized into two main groups: similarity-based and feature-based. In this paper, we first use the bi-gram features extracted from the Position Specific Scoring Matrix (PSSM) of proteins to predict drug-target interactions. Our results demonstrate the high-confidence prediction ability of the Bigram-PSSM model in terms of several performance indicators, specifically for enzymes and ion channels. Moreover, we investigate the impact of the negative-selection strategy on prediction performance, which is not widely taken into account in other relevant studies. This is important, as the number of non-interacting drug-target pairs is usually extremely large in comparison with the number of interacting ones in existing drug-target interaction data. An interesting observation is that different levels of performance reduction were attained for four datasets when we changed the sampling method from random sampling to balanced sampling. Copyright © 2015 Elsevier Inc. All rights reserved.
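The bi-gram PSSM representation referred to above is commonly computed as a 20x20 matrix of products of PSSM scores at consecutive sequence positions, flattened to a 400-dimensional feature vector; a minimal sketch (a random matrix stands in for a real PSSM):

```python
import numpy as np

def bigram_pssm(pssm):
    # pssm: (L, 20) position-specific scoring matrix for a protein of length L.
    # Returns the 400-dim bi-gram feature: B[i, j] = sum_k pssm[k, i] * pssm[k+1, j]
    return (pssm[:-1].T @ pssm[1:]).ravel()

L = 5
pssm = np.random.default_rng(1).normal(size=(L, 20))  # stand-in for a real PSSM
features = bigram_pssm(pssm)
print(features.shape)  # -> (400,): fixed length regardless of protein length L
```

The matrix product form avoids an explicit double loop over the 20x20 residue pairs and gives a fixed-length vector suitable as input to a standard classifier.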
Corten, Rense; Rosenkranz, Stephanie; Buskens, Vincent; Cook, Karen S
2016-01-01
Despite the popularity of the notion that social cohesion in the form of dense social networks promotes cooperation in Prisoner's Dilemmas through reputation, very little experimental evidence for this claim exists. We address this issue by testing hypotheses from one of the few rigorous game-theoretic models on this topic, the Raub & Weesie model, in two incentivized lab experiments. In the experiments, 156 subjects played repeated two-person PDs in groups of six. In the "atomized interactions" condition, subjects were only informed about the outcomes of their own interactions, while in the "embedded" condition, subjects were informed about the outcomes of all interactions in their group, allowing for reputation effects. The design of the experiments followed the specification of the RW model as closely as possible. For those aspects of the model that had to be modified to allow practical implementation in an experiment, we present additional analyses that show that these modifications do not affect the predictions. Contrary to expectations, we do not find that cooperation is higher in the embedded condition than in the atomized interaction. Instead, our results are consistent with an interpretation of the RW model that includes random noise, or with learning models of cooperation in networks.
Liu, Yang; Chiaromonte, Francesca; Li, Bing
2017-06-01
In many scientific and engineering fields, advanced experimental and computing technologies are producing data that are not just high dimensional, but also internally structured. For instance, statistical units may have heterogeneous origins from distinct studies or subpopulations, and features may be naturally partitioned based on experimental platforms generating them, or on information available about their roles in a given phenomenon. In a regression analysis, exploiting this known structure in the predictor dimension reduction stage that precedes modeling can be an effective way to integrate diverse data. To pursue this, we propose a novel Sufficient Dimension Reduction (SDR) approach that we call structured Ordinary Least Squares (sOLS). This combines ideas from existing SDR literature to merge reductions performed within groups of samples and/or predictors. In particular, it leads to a version of OLS for grouped predictors that requires far less computation than recently proposed groupwise SDR procedures, and provides an informal yet effective variable selection tool in these settings. We demonstrate the performance of sOLS by simulation and present a first application to genomic data. The R package "sSDR," publicly available on CRAN, includes all procedures necessary to implement the sOLS approach. © 2016, The International Biometric Society.
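The groupwise reduction idea can be sketched as follows: compute an OLS direction within each predictor group and stack the per-group scores as the merged low-dimensional predictor (a simplification of the sOLS procedure; the merging rule and the synthetic data are illustrative assumptions, not the package's implementation):

```python
import numpy as np

def ols_direction(X, y):
    # OLS coefficient vector: a one-dimensional sufficient reduction
    # under linearity conditions (Sigma_xx^{-1} Cov(X, y))
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    return np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

def grouped_ols(X, y, groups):
    # Reduce each predictor group to its own OLS score, then stack the scores
    return np.column_stack([X[:, g] @ ols_direction(X[:, g], y) for g in groups])

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = X[:, 0] - 2 * X[:, 4] + 0.1 * rng.normal(size=500)
Z = grouped_ols(X, y, groups=[[0, 1, 2], [3, 4, 5]])
print(Z.shape)  # one reduced predictor per group: (500, 2)
```

Because each group is reduced separately, the cost is one small least-squares solve per group rather than a single large eigen-decomposition over all predictors.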
Schwitzer, Thomas; Schwan, Raymund; Angioi-Duprez, Karine; Ingster-Moati, Isabelle; Lalanne, Laurence; Giersch, Anne; Laprevote, Vincent
2015-01-01
Cannabis is one of the most prevalent drugs used worldwide. Regular cannabis use is associated with impairments in highly integrative cognitive functions such as memory, attention and executive functions. To date, the cerebral mechanisms of these deficits are still poorly understood. Studying the processing of visual information may offer an innovative and relevant approach to evaluate the cerebral impact of exogenous cannabinoids on the human brain. Furthermore, this knowledge is required to understand the impact of cannabis intake in everyday life, and especially in car drivers. Here we review the role of the endocannabinoids in the functioning of the visual system and the potential involvement of cannabis use in visual dysfunctions. This review describes the presence of the endocannabinoids in the critical stages of visual information processing, and their role in the modulation of visual neurotransmission and visual synaptic plasticity, thereby enabling them to alter the transmission of the visual signal. We also review several induced visual changes, together with experimental dysfunctions reported in cannabis users. In the discussion, we consider these results in relation to the existing literature. We argue for more involvement of public health research in the study of visual function in cannabis users, especially because cannabis use is implicated in driving impairments. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.
The Use of GOCE/GRACE Information in the Latest NGS xGeoid15 Model for the USA
NASA Astrophysics Data System (ADS)
Holmes, S. A.; Li, X.; Youngman, M.
2015-12-01
The U.S. National Geodetic Survey [NGS], through its Gravity for the Redefinition of the American Vertical Datum [GRAV-D] program, is flying airborne gravity surveys over the USA and its territories. By 2022, NGS intends that all orthometric heights in the USA will be determined in the field using a reliable national gravimetric geoid model to transform from geodetic heights obtained from GPS. Towards this end, all available airborne data have been incorporated into a new NGS experimental geoid model - xGEOID15. The xGEOID15 model is the second in a series of annual experimental geoid models that incorporate NGS GRAV-D airborne data. This series provides a useful benchmark for assessing and improving current techniques, to ultimately compute a geoid model that can support a national physical height system by 2022. Here, we focus on the combination of the latest GOCE/GRACE models with the terrestrial gravimetry (land/airborne) that was applied for xGeoid15. Comparisons against existing combination gravitational solutions, such as EGM2008 and EIGEN6C4, as well as recent geoid models, such as xGeoid14 and CGG2013, are interesting for what they reveal about the respective use of the GOCE/GRACE satellite-gravity (satgrav) information.
Dupuytren's: a systems biology disease
2011-01-01
Dupuytren's disease (DD) is an ill-defined fibroproliferative disorder of the palm of the hands leading to digital contracture. DD commonly occurs in individuals of northern European extraction. Cellular components and processes associated with DD pathogenesis include altered gene and protein expression of cytokines, growth factors, adhesion molecules, and extracellular matrix components. Histology has shown increased but varying levels of particular types of collagen, myofibroblasts and myoglobin proteins in DD tissue. Free radicals and localised ischaemia have been suggested to trigger the proliferation of DD tissue. Although the available biological information on DD may contain potentially valuable (though largely uninterpreted) findings, the precise aetiology of DD remains unknown. Systems biology combines mechanistic modelling with quantitative experimentation in studies of networks, enabling a better understanding of the interaction of multiple components in disease processes. Adopting systems biology may be the ideal approach for future research in order to improve understanding of complex diseases of multifactorial origin. In this review, we propose that DD is a disease of several networks rather than of a single gene, and show that this accounts for the experimental observations obtained to date from a variety of sources. We outline how DD may be investigated more effectively by employing a systems biology approach that considers the disease network as a whole rather than focusing on any specific single molecule. PMID:21943049
Corten, Rense; Rosenkranz, Stephanie; Buskens, Vincent; Cook, Karen S.
2016-01-01
Despite the popularity of the notion that social cohesion in the form of dense social networks promotes cooperation in Prisoner’s Dilemmas through reputation, very little experimental evidence for this claim exists. We address this issue by testing hypotheses from one of the few rigorous game-theoretic models on this topic, the Raub & Weesie model, in two incentivized lab experiments. In the experiments, 156 subjects played repeated two-person PDs in groups of six. In the “atomized interactions” condition, subjects were only informed about the outcomes of their own interactions, while in the “embedded” condition, subjects were informed about the outcomes of all interactions in their group, allowing for reputation effects. The design of the experiments followed the specification of the RW model as closely as possible. For those aspects of the model that had to be modified to allow practical implementation in an experiment, we present additional analyses that show that these modifications do not affect the predictions. Contrary to expectations, we do not find that cooperation is higher in the embedded condition than in the atomized interaction. Instead, our results are consistent with an interpretation of the RW model that includes random noise, or with learning models of cooperation in networks. PMID:27366907
Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S
2017-10-16
In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental prospective repeated measures study conducted in the United States where 360 (62% response rate) university students (>18 years old) participated. Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text only; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as anxiety experienced by the individual when encountering information. The outcome variable of intention to read PILs was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data and the objectives were analyzed by performing a repeated measures MANOVA using SAS version 9.3. When compared to current practice and one-page text only leaflets, one-page PILs had significantly lower scores on information anxiety (p < 0.001) and information load (p < 0.001). The intention to read was highest and significantly different (p < 0.001) for PILs as compared to current practice or text only leaflets. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patient's intention to read and can help improve the counseling services provided by pharmacists.
Bapat, Shweta S.; Patel, Harshali K.; Sansgiry, Sujit S.
2017-01-01
In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental prospective repeated measures study conducted in the United States where 360 (62% response rate) university students (>18 years old) participated. Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text only; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as anxiety experienced by the individual when encountering information. The outcome variable of intention to read PILs was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data and the objectives were analyzed by performing a repeated measures MANOVA using SAS version 9.3. When compared to current practice and one-page text only leaflets, one-page PILs had significantly lower scores on information anxiety (p < 0.001) and information load (p < 0.001). The intention to read was highest and significantly different (p < 0.001) for PILs as compared to current practice or text only leaflets. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patient’s intention to read and can help improve the counseling services provided by pharmacists. PMID:29035337
Barbieri, Marcello
2016-03-13
Molecular biology is based on two great discoveries: the first is that genes carry hereditary information in the form of linear sequences of nucleotides; the second is that in protein synthesis a sequence of nucleotides is translated into a sequence of amino acids, a process that amounts to a transfer of information from genes to proteins. These discoveries have shown that the information of genes and proteins is the specific linear order of their sequences. This is a clear definition of information and there is no doubt that it reflects an experimental reality. What is not clear, however, is the ontological status of information, and the result is that today we have two conflicting paradigms in biology. One is the 'chemical paradigm', the idea that 'life is chemistry', or, more precisely, that 'life is an extremely complex form of chemistry'. The other is the 'information paradigm', the view that chemistry is not enough, that 'life is chemistry plus information'. This implies that there is an ontological difference between information and chemistry, a difference which is often expressed by saying that information-based processes like heredity and natural selection simply do not exist in the world of chemistry. Against this conclusion, the supporters of the chemical paradigm have argued that the concept of information is only a linguistic metaphor, a word that summarizes the result of countless underlying chemical reactions. The supporters of the information paradigm insist that information is a real and fundamental component of the living world, but have not been able to prove this point. As a result, the chemical view has not been abandoned and the two paradigms both coexist today. Here, it is shown that a solution to the ontological problem of information does exist. It comes from the idea that life is artefact-making, that genes and proteins are molecular artefacts manufactured by molecular machines and that artefacts necessarily require sequences and coding rules in addition to the quantities of physics and chemistry. More precisely, it is shown that the production of artefacts requires new observables that are referred to as nominable entities because they can be described only by naming their components in their natural order. From an ontological point of view, in conclusion, information is a nominable entity, a fundamental but non-computable observable. © 2016 The Author(s).
Sketch Matching on Topology Product Graph.
Liang, Shuang; Luo, Jun; Liu, Wenyin; Wei, Yichen
2015-08-01
Sketch matching is the fundamental problem in sketch-based interfaces. After years of study, it remains challenging when there are large irregularities and variations in hand-drawn sketch shapes. While most existing works exploit topology relations and graph representations for this problem, they are usually limited by coarse topology exploration and heuristic (thus suboptimal) similarity metrics between graphs. We present a new sketch matching method with two novel contributions. We introduce a comprehensive definition of topology relations, which results in a rich and informative graph representation of sketches. For graph matching, we propose the topology product graph, which retains the full correspondence for matching two graphs. Based on it, we derive an intuitive sketch similarity metric whose exact solution is easy to compute. In addition, the graph representation and new metric naturally support partial matching, an important practical problem that has received less attention in the literature. Extensive experimental results on a real, challenging dataset show that our method outperforms the state of the art.
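The product-graph idea above can be illustrated with a toy tensor-style product of two small graphs. This is only a minimal sketch: the paper's topology product graph also encodes its richer topology relations, and all names and example graphs here are invented for illustration.

```python
from itertools import product

def product_graph(adj_a, adj_b):
    """Build a tensor-style product graph of two undirected graphs.

    Nodes are pairs (u, v); two pairs are connected when both component
    edges exist in their respective graphs. Adjacency is given as
    dicts mapping each node to a set of its neighbors.
    """
    nodes = list(product(adj_a, adj_b))
    edges = set()
    for (u1, v1) in nodes:
        for (u2, v2) in nodes:
            # An edge exists in the product iff both component edges exist.
            if u2 in adj_a[u1] and v2 in adj_b[v1]:
                edges.add(frozenset(((u1, v1), (u2, v2))))
    return nodes, edges

# Two tiny "sketch" graphs: a triangle and a 3-node path.
A = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
B = {"x": {"y"}, "y": {"x", "z"}, "z": {"y"}}
nodes, edges = product_graph(A, B)
```

Each product node pairs one node from each sketch graph, so walks in the product graph correspond to simultaneous walks in both input graphs, which is what makes correspondence-preserving matching possible.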
Nonexposure Accurate Location K-Anonymity Algorithm in LBS
2014-01-01
This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinates and replaces them with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations all the time, and can generate smaller ASRs. PMID:24605060
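A minimal sketch of grid-ID-based cloaking: users report only the ID of the grid cell they occupy, never exact coordinates, and the cloaked region is grown around the target's cell until it covers at least k users. The growth rule, cell IDs, and user names are illustrative assumptions, not the paper's exact algorithms.

```python
def cloak_by_grid(user_cells, target_user, k):
    """Grow a square cloaked region of grid cells around the target's
    cell until at least k users fall inside it.

    user_cells maps user -> (row, col) grid-cell ID; no party ever
    sees a coordinate finer than a cell ID.
    """
    r0, c0 = user_cells[target_user]
    radius = 0
    while True:
        region = {(r, c)
                  for r in range(r0 - radius, r0 + radius + 1)
                  for c in range(c0 - radius, c0 + radius + 1)}
        inside = [u for u, cell in user_cells.items() if cell in region]
        if len(inside) >= k:
            return region, inside
        radius += 1

cells = {"alice": (5, 5), "bob": (5, 6), "carol": (7, 5), "dave": (2, 2)}
region, covered = cloak_by_grid(cells, "alice", k=3)
```

Because only cell IDs are reported, the anonymizer can satisfy K-anonymity without ever learning accurate coordinates, which is the nonexposure property the abstract emphasizes.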
NASA Astrophysics Data System (ADS)
Li, Miao; Lin, Zaiping; Long, Yunli; An, Wei; Zhou, Yiyu
2016-05-01
The high variability of target size makes small target detection in Infrared Search and Track (IRST) a challenging task. A joint detection and tracking method based on block-wise sparse decomposition is proposed to address this problem. For detection, the infrared image is divided into overlapped blocks, and each block is weighted by the local image complexity and target existence probabilities. Target-background decomposition is solved by block-wise inexact augmented Lagrange multipliers. For tracking, a labeled multi-Bernoulli (LMB) tracker tracks multiple targets, taking the result of single-frame detection as input, and provides the corresponding target existence probabilities for detection. Unlike fixed-size methods, the proposed method can accommodate size-varying targets, because it makes no special assumptions about the size and shape of small targets. Owing to the exact decomposition, classical target measurements are extended and additional direction information is provided to improve tracking performance. The experimental results show that the proposed method can effectively suppress background clutter, and detect and track size-varying targets in infrared images.
Image steganalysis using Artificial Bee Colony algorithm
NASA Astrophysics Data System (ADS)
Sajedi, Hedieh
2017-09-01
Steganography is the science of secure communication where the presence of the communication cannot be detected, while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information usually takes extensive execution time and computational resources, so a preprocessing phase is needed to moderate both. In this paper, we propose a new feature-based blind steganalysis method for detecting stego images among cover (clean) images in JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC). The ABC algorithm is inspired by honeybees' social behaviour in their search for good food sources. In the proposed method, classifier performance and the dimension of the selected feature vector are evaluated with wrapper-based methods. The experiments are performed using two large datasets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to other existing techniques.
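A toy wrapper-based bee-colony loop over binary feature masks can illustrate the approach. The evaluator, the feature-count penalty, and all parameters below are invented for illustration; the paper's improved ABC (with its employed/onlooker/scout phases tuned for steganalysis features) differs.

```python
import random

def fitness(mask, evaluate):
    """Wrapper fitness: classifier score minus a small per-feature penalty,
    so smaller feature subsets are preferred at equal accuracy."""
    if not any(mask):
        return 0.0
    return evaluate(mask) - 0.01 * sum(mask)

def abc_select(n_features, evaluate, n_bees=6, iters=30, seed=0):
    """Minimal ABC-style feature selection over binary masks.

    Each "food source" is a feature subset; a bee perturbs one bit of
    its source, and a better neighbor replaces the source (greedy move).
    """
    rng = random.Random(seed)
    sources = [[rng.random() < 0.5 for _ in range(n_features)]
               for _ in range(n_bees)]
    for _ in range(iters):
        for i, src in enumerate(sources):
            neighbor = src[:]
            j = rng.randrange(n_features)
            neighbor[j] = not neighbor[j]
            if fitness(neighbor, evaluate) > fitness(src, evaluate):
                sources[i] = neighbor
    return max(sources, key=lambda m: fitness(m, evaluate))

# Toy evaluator standing in for classifier accuracy: features 0 and 2
# are informative, features 1 and 3 are noise.
def toy_eval(mask):
    return 0.5 + 0.2 * mask[0] + 0.2 * mask[2] - 0.05 * (mask[1] + mask[3])

best = abc_select(4, toy_eval, seed=1)
```

In a real steganalysis pipeline the evaluator would train and score a classifier on the masked JPEG feature vector, which is exactly where the wrapper approach spends, and the preprocessing saves, computation.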
Ren, Hao; Zhang, Yu; Guo, Sibei; ...
2017-10-31
The aggregation of amyloid beta (Aβ) peptides plays a crucial role in the pathology and etiology of Alzheimer's disease. Experimental evidence shows that copper ion is an aggregation-prone species with the ability to coordinately bind to Aβ and further induce the formation of neurotoxic Aβ oligomers. However, the detailed structures of Cu(II)–Aβ complexes have not been illustrated, and the kinetics and dynamics of the Cu(II) binding are not well understood. Two Cu(II)–Aβ complexes have been proposed to exist under physiological conditions, and another two might exist at higher pH values. By using ab initio simulations of the spontaneous resonance Raman and time-domain stimulated resonance Raman spectroscopy signals, we obtained the characteristic Raman vibronic features of each complex. These signals contain rich structural information with high temporal resolution, enabling the characterization of transient states during the fast Cu–Aβ binding and interconversion processes.
Integrating functional genomics to accelerate mechanistic personalized medicine.
Tyner, Jeffrey W
2017-03-01
The advent of deep sequencing technologies has resulted in the deciphering of tremendous amounts of genetic information. These data have led to major discoveries, and many anecdotes now exist of individual patients whose clinical outcomes have benefited from novel, genetically guided therapeutic strategies. However, the majority of genetic events in cancer are currently undrugged, leading to a biological gap between understanding of tumor genetic etiology and translation to improved clinical approaches. Functional screening has made tremendous strides in recent years with the development of new experimental approaches to studying ex vivo and in vivo drug sensitivity. Numerous discoveries and anecdotes also exist for translation of functional screening into novel clinical strategies; however, the current clinical application of functional screening remains largely confined to small clinical trials at specific academic centers. The intersection between genomic and functional approaches represents an ideal modality to accelerate our understanding of drug sensitivities as they relate to specific genetic events and further understand the full mechanisms underlying drug sensitivity patterns.
Disease and infection in the Tetraonidae
Herman, C.M.
1963-01-01
Disease is one of many factors advanced to explain the fluctuations of grouse populations, but no profound study of natural disease losses in Tetraonidae exists. The literature contains frequent references to THE grouse disease, although many potential pathogens are listed in numerous surveys and limited investigations, and the relevant data indicate that no single etiologic agent is universally responsible for disease in grouse. Few experimental infections or related studies on parasite biology have been attempted. Well-trained personnel and specialized facilities are required for research and analysis (1) to develop new methods of interpretation to be used with existing census techniques, (2) to conduct intensive studies of ecological factors of host and habitat, and (3) to establish base lines for recognition of deviations from the norm. Disease in wildlife can be controlled only through management procedures based on information concerning the biology of pathogens, hosts, and environments. It cannot be studied as a separate entity if its impact on survival or population fluctuations of grouse is to be correctly assessed.
Meeting new challenges: The 2014 HUPO-PSI/COSMOS Workshop: 13-15 April 2014, Frankfurt, Germany.
Orchard, Sandra; Albar, Juan Pablo; Binz, Pierre-Alain; Kettner, Carsten; Jones, Andrew R; Salek, Reza M; Vizcaino, Juan Antonio; Deutsch, Eric W; Hermjakob, Henning
2014-11-01
The Annual 2014 Spring Workshop of the Proteomics Standards Initiative (PSI) of the Human Proteome Organization (HUPO) was held this year jointly with the metabolomics COordination of Standards in MetabOlomicS (COSMOS) group. The range of existing MS standards (mzML, mzIdentML, mzQuantML, mzTab, TraML) was reviewed and updated in the light of new methodologies and advances in technologies. Adaptations to meet the needs of the metabolomics community were incorporated and a new data format for NMR, nmrML, was presented. The molecular interactions workgroup began work on a new version of the existing XML data interchange format. PSI-MI XML3.0 will enable the capture of more abstract data types such as protein complex topology derived from experimental data, allosteric binding, and dynamic interactions. Further information about the work of the HUPO-PSI can be found at http://www.psidev.info. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Liu, Miaofeng
2017-07-01
In recent years, deep convolutional neural networks have come into use for image inpainting and super-resolution in many fields. Unlike most former methods, which require knowing beforehand the local information for corrupted pixels, we propose a 20-layer fully convolutional network that learns an end-to-end mapping from a dataset of damaged/ground-truth subimage pairs, realizing non-local blind inpainting and super-resolution. Because existing approaches are unable to perform well on images with huge corruptions, or on inpainting a low-resolution image, we also share parameters in local areas of layers to achieve spatial recursion and enlarge the receptive field. To avoid the difficulty of training this deep neural network, skip connections between symmetric convolutional layers are designed. Experimental results show that the proposed method outperforms state-of-the-art methods under diverse corruption and low-resolution conditions, and it works excellently when realizing super-resolution and image inpainting simultaneously.
A general strategy to solve the phase problem in RNA crystallography
Keel, Amanda Y.; Rambo, Robert P.; Batey, Robert T.; Kieft, Jeffrey S.
2007-01-01
X-ray crystallography of biologically important RNA molecules has been hampered by technical challenges, including finding a heavy-atom derivative to obtain high-quality experimental phase information. Existing techniques have drawbacks, severely limiting the rate at which important new structures are solved. To address this need, we have developed a reliable means to localize heavy atoms specifically to virtually any RNA. By solving the crystal structures of thirteen variants of the G·U wobble pair cation binding motif, we have identified an optimal version that, when inserted into an RNA helix, introduces a high-occupancy cation binding site suitable for phasing. This “directed soaking” strategy can be integrated fully into existing RNA and crystallography methods, potentially increasing the rate at which important structures are solved and facilitating routine solving of structures using Cu-Kα radiation. The success of this method has already been proven: it has been used to solve several novel crystal structures. PMID:17637337
Gradient-based reliability maps for ACM-based segmentation of hippocampus.
Zarpalas, Dimitrios; Gkontra, Polyxeni; Daras, Petros; Maglaveras, Nicos
2014-04-01
Automatic segmentation of deep brain structures, such as the hippocampus (HC), in MR images has attracted considerable scientific attention due to the widespread use of MRI and to the principal role of some structures in various mental disorders. In the literature, there exists a substantial amount of work relying on deformable models that incorporate prior knowledge about structures' anatomy and shape information. However, shape priors capture global shape characteristics and thus fail to model boundaries of varying properties; HC boundaries present rich, poor, and missing gradient regions. On top of that, shape prior knowledge is blended with image information in the evolution process through global weighting of the two terms, again neglecting the spatially varying boundary properties and causing segmentation faults. An innovative method is hereby presented that aims to achieve highly accurate HC segmentation in MR images, based on the modeling of boundary properties at each anatomical location and the inclusion of appropriate image information for each of those, within an active contour model framework. Hence, blending of image information and prior knowledge is based on a local weighting map, which mixes gradient information, regional and whole-brain statistical information with a multi-atlas-based spatial distribution map of the structure's labels. Experimental results on three different datasets demonstrate the efficacy and accuracy of the proposed method.
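The spatially varying blend of image and prior terms can be sketched as follows. The weighting rule and all names are illustrative assumptions: the paper's local weighting map also mixes regional and whole-brain statistics with the multi-atlas label map, not just gradient magnitude.

```python
def blend_energy(image_term, prior_term, grad_mag, g_max):
    """Spatially varying blend of per-voxel energy terms for an ACM.

    At voxels with strong gradients the image term dominates; in
    poor- or missing-gradient regions the shape prior takes over,
    instead of one global weight for the whole contour.
    """
    energies = []
    for img, pri, g in zip(image_term, prior_term, grad_mag):
        w = min(g / g_max, 1.0)          # gradient reliability in [0, 1]
        energies.append(w * img + (1.0 - w) * pri)
    return energies

# Voxel 0 sits on a strong edge, voxel 1 in a poor-gradient region.
E = blend_energy(image_term=[1.0, 0.2], prior_term=[0.5, 0.9],
                 grad_mag=[8.0, 1.0], g_max=10.0)
```

The design point is that the mixing weight is a map over anatomical locations rather than a single scalar, which is what lets the contour follow edges where they exist and fall back on the prior where they do not.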
Modeling Systems-Level Regulation of Host Immune Responses
Thakar, Juilee; Pilione, Mylisa; Kirimanjeswara, Girish; Harvill, Eric T; Albert, Réka
2007-01-01
Many pathogens are able to manipulate the signaling pathways responsible for the generation of host immune responses. Here we examine and model a respiratory infection system in which disruption of host immune functions or of bacterial factors changes the dynamics of the infection. We synthesize the network of interactions between host immune components and two closely related bacteria in the genus Bordetellae. We incorporate existing experimental information on the timing of immune regulatory events into a discrete dynamic model, and verify the model by comparing the effects of simulated disruptions to the experimental outcome of knockout mutations. Our model indicates that the infection time course of both Bordetellae can be separated into three distinct phases based on the most active immune processes. We compare and discuss the effect of the species-specific virulence factors on disrupting the immune response during their infection of naive, antibody-treated, diseased, or convalescent hosts. Our model offers predictions regarding cytokine regulation, key immune components, and clearance of secondary infections; we experimentally validate two of these predictions. This type of modeling provides new insights into the virulence, pathogenesis, and host adaptation of disease-causing microorganisms and allows systems-level analysis that is not always possible using traditional methods. PMID:17559300
Quantifying variability in delta experiments
NASA Astrophysics Data System (ADS)
Miller, K. L.; Berg, S. R.; McElroy, B. J.
2017-12-01
Large populations of people and wildlife make their homes on river deltas, so it is important to be able to make useful and accurate predictions of how these landforms will change over time. However, making predictions can be a challenge due to the inherent variability of the natural system. Furthermore, when we extrapolate results from the laboratory to the field setting, we bring with them the random and systematic errors of the experiment. We seek to understand both the intrinsic and experimental variability of river delta systems to better inform predictions of how these landforms will evolve. We run exact replicates of experiments with steady sediment and water discharge and record delta evolution with overhead time-lapse imaging. We measure aspects of topset progradation and channel dynamics and compare these metrics of delta morphology across the 6 replicated experimental runs. We also use data from all experimental runs collectively to build a large dataset from which to extract statistics of the system properties. We find that although natural variability exists, the processes in the experiments have outcomes that no longer depend on their initial conditions after some time. Applying these results at the field scale will aid our ability to forecast how these landforms will progress.
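One simple way to summarize intrinsic variability across exact replicates is the coefficient of variation of a morphology metric. The metric name and values below are hypothetical, purely to show the calculation.

```python
from statistics import mean, stdev

def replicate_cv(runs):
    """Coefficient of variation (stdev / mean) of one delta metric
    measured across exact replicate runs; a dimensionless measure of
    the spread attributable to intrinsic variability."""
    return stdev(runs) / mean(runs)

# Hypothetical topset progradation rates (mm/hr) from 6 replicates:
cv = replicate_cv([2.1, 2.3, 1.9, 2.2, 2.0, 2.1])
```

Comparing such spreads against measurement error gives a first cut at separating intrinsic variability from experimental error before extrapolating to the field.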
Quantum fingerprinting with coherent states and a constant mean number of photons
NASA Astrophysics Data System (ADS)
Arrazola, Juan Miguel; Lütkenhaus, Norbert
2014-06-01
We present a protocol for quantum fingerprinting that is ready to be implemented with current technology and is robust to experimental errors. The basis of our scheme is an implementation of the signal states in terms of a coherent state in a superposition of time-bin modes. Experimentally, this requires only the ability to prepare coherent states of low amplitude and to interfere them in a balanced beam splitter. The states used in the protocol are arbitrarily close in trace distance to states of O (log2n) qubits, thus exhibiting an exponential separation in abstract communication complexity compared to the classical case. The protocol uses a number of optical modes that is proportional to the size n of the input bit strings but a total mean photon number that is constant and independent of n. Given the expended resources, our protocol achieves a task that is provably impossible using classical communication only. In fact, even in the presence of realistic experimental errors and loss, we show that there exists a large range of input sizes for which our quantum protocol transmits an amount of information that can be more than two orders of magnitude smaller than a classical fingerprinting protocol.
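The distinguishability argument for coherent-state fingerprints can be made concrete. If bit j of a string is encoded as a coherent amplitude ±sqrt(μ/n) in time-bin j, the overlap of two fingerprints factorizes over modes and depends only on the total mean photon number μ and the fractional Hamming distance d/n. This sketch omits the error-correcting code the real protocol applies first (which bounds d/n away from zero for unequal inputs); function and variable names are illustrative.

```python
import math

def fingerprint_overlap(x_bits, y_bits, mean_photons):
    """Overlap <alpha_x|alpha_y> of two coherent-state fingerprints.

    Each bit b_j maps to a coherent state of amplitude
    (-1)**b_j * sqrt(mu/n) in time-bin mode j. Per differing mode the
    overlap factor is exp(-2*mu/n), so the total is exp(-2*mu*d/n)
    with d the Hamming distance.
    """
    n = len(x_bits)
    d = sum(a != b for a, b in zip(x_bits, y_bits))
    return math.exp(-2.0 * mean_photons * d / n)

# Equal strings overlap perfectly; unequal strings are distinguishable
# with error controlled by mu alone, independent of n.
same = fingerprint_overlap([0, 1, 1, 0], [0, 1, 1, 0], mean_photons=5.0)
diff = fingerprint_overlap([0, 1, 1, 0], [1, 1, 1, 0], mean_photons=5.0)
```

Because the overlap depends on d/n rather than on n, a constant total mean photon number suffices even as the input strings grow, which is the resource separation the abstract claims.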
Experimental and Theoretical Study of Propeller Spinner/Shank Interference. M.S. Thesis
NASA Technical Reports Server (NTRS)
Cornell, C. C.
1986-01-01
A fundamental experimental and theoretical investigation into the aerodynamic interference associated with propeller spinner and shank regions was conducted. The research program involved a theoretical assessment of solutions previously proposed, followed by a systematic experimental study to supplement the existing database. As a result, a refined computational procedure was established for prediction of interference effects in terms of interference drag, resolved into propeller thrust and torque components. These quantities were examined with attention to engineering parameters such as two spinner fineness ratios, three blade shank forms, and two/three/four/six/eight blades. Consideration of the physics of the phenomena aided in the logical deduction of two individual interference quantities (cascade effects and spinner/shank juncture interference). These interference effects were semi-empirically modeled using existing theories and placed into a form compatible with an existing propeller performance scheme, which provided the basis for examples of application.
Nanowires and Nanostructures That Grow Like Polymer Molecules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Santosh; Cademartiri, Ludovico
Unique properties (e.g., rubber elasticity, viscoelasticity, folding, reptation) determine the utility of polymer molecules and derive from their morphology (i.e., one-dimensional connectivity and large aspect ratios) and flexibility. Crystals do not display similar properties because they have smaller aspect ratios, they are rigid, and they are often too large and heavy to be colloidally stable. We argue, with the support of recent experimental studies, that these limitations are not fundamental and that they might be overcome by growth processes that mimic polymerization. Furthermore, we (i) discuss the similarities between crystallization and polymerization, (ii) critically review the existing experimental evidence of polymer-like growth kinetics and behavior in crystals and nanostructures, and (iii) propose heuristic guidelines for the synthesis of “polymer-like” crystals and assemblies. Understanding these anisotropic materials at the boundary between molecules and solids will determine whether we can confer the unique properties of polymer molecules to crystals, expanding them with topology, dynamics, and information and not just tuning them with size.
Prosocial preferences do not explain human cooperation in public-goods games
Burton-Chellew, Maxwell N.; West, Stuart A.
2013-01-01
It has become an accepted paradigm that humans have “prosocial preferences” that lead to higher levels of cooperation than those that would maximize their personal financial gain. However, the existence of prosocial preferences has been inferred post hoc from the results of economic games, rather than with direct experimental tests. Here, we test how behavior in a public-goods game is influenced by knowledge of the consequences of actions for other players. We found that (i) individuals cooperate at similar levels, even when they are not informed that their behavior benefits others; (ii) an increased awareness of how cooperation benefits others leads to a reduction, rather than an increase, in the level of cooperation; and (iii) cooperation can be either lower or higher than expected, depending on experimental design. Overall, these results contradict the suggested role of the prosocial preferences hypothesis and show how the complexity of human behavior can lead to misleading conclusions from controlled laboratory experiments. PMID:23248298
Charge transfer and adsorption-desorption kinetics in carbon nanotube and graphene gas sensing
NASA Astrophysics Data System (ADS)
Liang, Sang-Zi; Chen, Gugang; Harutyunyan, Avetik; Cole, Milton; Sofo, Jorge
2014-03-01
Detection of molecules in the gas phase by carbon nanotubes and graphene has great application potential due to their high sensitivity and surface-to-volume ratio. In a chemiresistor, the conductance of the material has been proposed to change as a result of charge transfer from the adsorbed molecules. Due to self-interaction errors, calculations using LDA or GGA density functionals have an innate disadvantage in dealing with charge transfer situations. A model that takes into consideration the dielectric interaction between the graphene surface and the molecule is employed to estimate the distance at which charge transfer becomes favorable. Adsorption-desorption kinetics is studied with a modified Langmuir model, including sites from which the molecules do not desorb within the experimental time. Assuming a constant mobility, the model reproduces existing experimental conductance data. Its parameters provide information about the microscopic processes during detection, and varying them allows optimization of aspects of sensor performance, including sensitivity, detection limit, and response time. This work is supported by Honda Research Institute USA, Inc.
Condie, Brian G; Urbanski, William M
2014-01-01
Effective tools for searching the biomedical literature are essential for identifying reagents or mouse strains as well as for effective experimental design and informed interpretation of experimental results. We have built the Textpresso Site Specific Recombinases (Textpresso SSR) Web server to enable researchers who use mice to perform in-depth searches of a rapidly growing and complex part of the mouse literature. Our Textpresso Web server provides an interface for searching the full text of most of the peer-reviewed publications that report the characterization or use of mouse strains that express Cre or Flp recombinase. The database also contains most of the publications that describe the characterization or analysis of strains carrying conditional alleles or transgenes that can be inactivated or activated by site-specific recombinases such as Cre or Flp. Textpresso SSR complements the existing online databases that catalog Cre and Flp expression patterns by providing a unique online interface for the in-depth text mining of the site specific recombinase literature.
A method for feature selection of APT samples based on entropy
NASA Astrophysics Data System (ADS)
Du, Zhenyu; Li, Yihong; Hu, Jinsong
2018-05-01
By studying known APT attack events in depth, this paper proposes a feature selection method for APT samples and a logic expression generation algorithm, IOCG (Indicator of Compromise Generate). The algorithm automatically generates machine-readable IOCs (Indicators of Compromise), addressing the limitations of existing IOCs, whose logical relationships are fixed, whose number of logical items cannot change, and which are large in scale and cannot be generated directly from samples. At the same time, it reduces redundant and useless processing time for APT samples, improves the sharing rate of analyzed information, and helps respond actively to a complex and volatile APT attack situation. The samples were divided into an experimental set and a training set, and the algorithm was then used to generate the logical expressions of the training set with the IOC_Aware plug-in; the generated expressions were compared against the detection results. The experimental results show that the algorithm is effective and can improve detection.
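The entropy-based scoring at the heart of such feature selection can be sketched with information gain over toy sample attributes. Attribute names and values below are invented; the paper's IOCG additionally assembles the selected features into logical IOC expressions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(samples, labels, feature):
    """Entropy reduction from splitting the samples on one feature.

    High-gain features discriminate APT samples from clean ones and
    are therefore worth keeping when building IOC expressions.
    """
    base = entropy(labels)
    groups = {}
    for s, y in zip(samples, labels):
        groups.setdefault(s[feature], []).append(y)
    cond = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return base - cond

# Toy samples: the contacted domain separates the classes, the mutex
# name does not (all values hypothetical).
samples = [{"domain": "evil.example", "mutex": "m1"},
           {"domain": "evil.example", "mutex": "m2"},
           {"domain": "benign.example", "mutex": "m1"},
           {"domain": "benign.example", "mutex": "m2"}]
labels = ["apt", "apt", "clean", "clean"]
gain_domain = information_gain(samples, labels, "domain")
gain_mutex = information_gain(samples, labels, "mutex")
```

Ranking features by gain and dropping the low-gain ones is one standard way to cut the redundant processing time the abstract mentions.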
Observation of the spin-polarized surface state in a noncentrosymmetric superconductor BiPd
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neupane, Madhab; Alidoust, Nasser; Hosen, M. Mofazzel
Recently, noncentrosymmetric superconductor BiPd has attracted considerable research interest due to the possibility of hosting topological superconductivity. Here in this paper we report a systematic high-resolution angle-resolved photoemission spectroscopy (ARPES) and spin-resolved ARPES study of the normal state electronic and spin properties of BiPd. Our experimental results show the presence of a surface state at higher-binding energy with the location of Dirac point at around 700 meV below the Fermi level. The detailed photon energy, temperature-dependent and spin-resolved ARPES measurements complemented by our first-principles calculations demonstrate the existence of the spin-polarized surface states at high-binding energy. The absence of such spin-polarized surface states near the Fermi level negates the possibility of a topological superconducting behaviour on the surface. Our direct experimental observation of spin-polarized surface states in BiPd provides critical information that will guide the future search for topological superconductivity in noncentrosymmetric materials.
Ramírez, Alvaro; García-Torrent, Javier; Tascón, Alberto
2010-03-15
Agricultural products stored in silos, and their dusts, can undergo oxidation and self-heating, increasing the risk of self-ignition and therefore of fires and explosions. The aim of the present work was to determine the thermal susceptibility (as reflected by the Maciejasz index, the temperature of the emission of flammable volatile substances and the combined information provided by the apparent activation energy and the oxidation temperature) of icing sugar, bread-making flour, maize, wheat, barley, alfalfa, and soybean dusts, using experimental methods for the characterisation of different types of coal (no standardised procedure exists for characterising the thermal susceptibility of either coal or agricultural products). In addition, the thermal stability of wheat, i.e., the risk of self-ignition determined as a function of sample volume, ignition temperature and storage time, was determined using the methods outlined in standard EN 15188:2007. The advantages and drawbacks of the different methods used are discussed. (c) 2009 Elsevier B.V. All rights reserved.
Kharroubi, Samer A
2018-06-05
Experimental studies to develop valuations of health state descriptive systems like EQ-5D or SF-6D need to be conducted in different countries, because social and cultural differences are likely to lead to systematically different valuations. There is scope to utilize the evidence from one country to help with the design and analysis of a study in another, enabling the generation of utility estimates for the second country much more precisely than would be possible when collecting and analyzing that country's data alone. We analyze SF-6D valuation data elicited from representative samples of the Hong Kong (HK) and United Kingdom (UK) general adult populations through the use of the standard gamble technique to value 197 and 249 health states, respectively. We apply a nonparametric Bayesian model to estimate a HK value set, using the UK dataset as an informative prior to improve its estimation. Estimates are compared to a HK value set estimated using HK values alone, using mean predictions and root mean square error. The novel method of modelling utility functions permitted the UK valuations to contribute significant prior information to the Hong Kong analysis. The results suggest that using HK data alongside the existing UK data produces HK utility estimates better than those obtained using the HK study data by itself. These promising results suggest that existing preference data could be combined with a valuation study in a new country to generate preference weights, making own-country value sets more achievable for low- and middle-income countries. Further research is encouraged.
Experimental demonstration of the vertical spin existence in evanescent waves
NASA Astrophysics Data System (ADS)
Maksimyak, P. P.; Maksimyak, A. P.; Ivanskyi, D. I.
2018-01-01
Physical existence of the recently discovered vertical spin arising in an evanescent light wave due to the total internal reflection of a linearly polarized probing beam with azimuthal angle 45° is experimentally verified. Mechanical action, caused by the optical force associated with the extraordinary transverse component of the spin in the evanescent wave, is demonstrated. The motion of a birefringent plate in a direction controlled by the simultaneous action of the canonical momentum and the transverse spin momentum is observed. The contributions of the canonical and spin momenta to the trajectory of the resulting motion become commensurable only under exceptionally carefully tuned experimental conditions.
Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy J
2016-01-01
Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students' competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new "experimentation assessments," 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. © 2016 A. P. Dasgupta et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
14 CFR § 1203.604 - Mandatory review for declassification.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Presidential papers. Information originated by the President or Vice President; the President's White House... confirm or deny the existence or non-existence of requested information whenever the fact of its existence...
Protein-protein interaction networks: unraveling the wiring of molecular machines within the cell.
De Las Rivas, Javier; Fontanillo, Celia
2012-11-01
Mapping and understanding of the protein interaction networks with their key modules and hubs can provide deeper insights into the molecular machinery underlying complex phenotypes. In this article, we present the basic characteristics and definitions of protein networks, starting with a distinction of the different types of associations between proteins. We focus the review on protein-protein interactions (PPIs), a subset of associations defined as physical contacts between proteins that occur by selective molecular docking in a particular biological context. We present such definition as opposed to other types of protein associations derived from regulatory, genetic, structural or functional relations. To determine PPIs, a variety of binary and co-complex methods exist; however, not all the technologies provide the same information and data quality. A way of increasing confidence in a given protein interaction is to integrate orthogonal experimental evidences. The use of several complementary methods testing each single interaction assesses the accuracy of PPI data and tries to minimize the occurrence of false interactions. Following this approach there have been important efforts to unify primary databases of experimentally proven PPIs into integrated databases. These meta-databases provide a measure of the confidence of interactions based on the number of experimental proofs that report them. As a conclusion, we can state that integrated information allows the building of more reliable interaction networks. Identification of communities, cliques, modules and hubs by analysing the topological parameters and graph properties of the protein networks allows the discovery of central/critical nodes, which are candidates to regulate cellular flux and dynamics.
Constraints on the ωπ Form Factor from Analyticity and Unitarity
NASA Astrophysics Data System (ADS)
Ananthanarayan, B.; Caprini, Irinel; Kubis, Bastian
Form factors are important low-energy quantities and an accurate knowledge of these sheds light on the strong interactions. A variety of methods based on general principles have been developed to use information known in different energy regimes to constrain them in regions where experimental information needs to be tested precisely. Here we review our recent work on the electromagnetic ωπ form factor in a model-independent framework known as the method of unitarity bounds, partly motivated by the discrepancies noted recently between the theoretical calculations of the form factor based on dispersion relations and certain experimental data measured from the decay ω → π0γ*. We have applied a modified dispersive formalism, which uses as input the discontinuity of the ωπ form factor calculated by unitarity below the ωπ threshold and an integral constraint on the square of its modulus above this threshold. The latter constraint was obtained by exploiting unitarity and the positivity of the spectral function of a QCD correlator, computed on the spacelike axis by operator product expansion and perturbative QCD. An alternative constraint is obtained by using data available at higher energies for evaluating an integral of the modulus squared with a suitable weight function. From these conditions we derived upper and lower bounds on the modulus of the ωπ form factor in the region below the ωπ threshold. The results confirm the existence of a disagreement between dispersion theory and experimental data on the ωπ form factor around 0.6 GeV, including those from NA60 published in 2016.
Constraints on the ωπ form factor from analyticity and unitarity
NASA Astrophysics Data System (ADS)
Ananthanarayan, B.; Caprini, Irinel; Kubis, Bastian
2016-05-01
Form factors are important low-energy quantities and an accurate knowledge of these sheds light on the strong interactions. A variety of methods based on general principles have been developed to use information known in different energy regimes to constrain them in regions where experimental information needs to be tested precisely. Here we review our recent work on the electromagnetic ωπ form factor in a model-independent framework known as the method of unitarity bounds, partly motivated by the discrepancies noted recently between the theoretical calculations of the form factor based on dispersion relations and certain experimental data measured from the decay ω → π0γ∗. We have applied a modified dispersive formalism, which uses as input the discontinuity of the ωπ form factor calculated by unitarity below the ωπ threshold and an integral constraint on the square of its modulus above this threshold. The latter constraint was obtained by exploiting unitarity and the positivity of the spectral function of a QCD correlator, computed on the spacelike axis by operator product expansion and perturbative QCD. An alternative constraint is obtained by using data available at higher energies for evaluating an integral of the modulus squared with a suitable weight function. From these conditions we derived upper and lower bounds on the modulus of the ωπ form factor in the region below the ωπ threshold. The results confirm the existence of a disagreement between dispersion theory and experimental data on the ωπ form factor around 0.6 GeV, including those from NA60 published in 2016.
Browne, Richard W; Whitcomb, Brian W
2010-07-01
Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs), diluted to low levels; from LOB to 2-3 times LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates compared with the concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
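The three thresholds described above can be sketched numerically. The following is a minimal illustration, not the authors' exact protocol: LOB estimated parametrically (mean + 1.645 SD of blanks) or nonparametrically (rank-based 95th percentile), LOD from the LOB plus low-level replicate variability, and LOQ by interpolating the precision profile at 20% RSD. Function names and the interpolation scheme are assumptions.

```python
import numpy as np

def lob_parametric(blanks):
    # LOB as the 95th percentile of a normal blank distribution:
    # mean(blank) + 1.645 * SD(blank)
    return float(np.mean(blanks) + 1.645 * np.std(blanks, ddof=1))

def lob_nonparametric(blanks):
    # rank-based 95th percentile of the observed blank distribution
    return float(np.percentile(blanks, 95))

def lod_from_lob(lob, low_level_replicates):
    # LOD = LOB + 1.645 * SD of replicates of a low-concentration sample
    return float(lob + 1.645 * np.std(low_level_replicates, ddof=1))

def loq_at_rsd(concentrations, rsds, target_rsd=20.0):
    # interpolate the precision profile (RSD vs. concentration) at the
    # target RSD; assumes RSD decreases monotonically with concentration
    order = np.argsort(rsds)
    return float(np.interp(target_rsd,
                           np.asarray(rsds, dtype=float)[order],
                           np.asarray(concentrations, dtype=float)[order]))
```

In practice the low-level replicates would come from diluted standard reference materials, as in the study.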
Charmonium excited state spectrum in lattice QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jozef Dudek; Robert Edwards; Nilmani Mathur
2008-02-01
Working with a large basis of covariant derivative-based meson interpolating fields we demonstrate the feasibility of reliably extracting multiple excited states using a variational method. The study is performed on quenched anisotropic lattices with clover quarks at the charm mass. We demonstrate how a knowledge of the continuum limit of a lattice interpolating field can give additional spin-assignment information, even at a single lattice spacing, via the overlap factors of interpolating field and state. Excited state masses are systematically high with respect to quark potential model predictions and, where they exist, experimental states. We conclude that this is most likely a result of the quenched approximation.
DNA-Based Single-Molecule Electronics: From Concept to Function.
Wang, Kun
2018-01-17
Beyond being the repository of genetic information, DNA is playing an increasingly important role as a building block for molecular electronics. Its inherent structural and molecular recognition properties render it a leading candidate for molecular electronics applications. The structural stability, diversity and programmability of DNA provide overwhelming freedom for the design and fabrication of molecular-scale devices. In the past two decades DNA has therefore attracted inordinate amounts of attention in molecular electronics. This review gives a brief survey of recent experimental progress in DNA-based single-molecule electronics with special focus on single-molecule conductance and I-V characteristics of individual DNA molecules. Existing challenges and exciting future opportunities are also discussed.
Neutron Data Compilation Centre, European Nuclear Energy Agency, Newsletter No. 13
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1972-02-15
This edition of the newsletter is intended to inform all users of neutron data about the content of the CCDN Experimental Neutron Data Library as of February 1972. It supersedes the last index issue, no. 11, published in October 1969. Since then, the database has been greatly enlarged thanks to the collaboration of neutron data users in the ENEA area (Western Europe plus Japan) and to the truly worldwide cooperation between the four existing data centers: NNCSC at Brookhaven Lab. in Upton, NY, United States, CCDN in Gif-sur-Yvette, France, Centr po Jadernym Dannym in Obninsk, USSR, and the Nuclear Data Section, IAEA, Vienna, Austria.
DNA-Based Single-Molecule Electronics: From Concept to Function
2018-01-01
Beyond being the repository of genetic information, DNA is playing an increasingly important role as a building block for molecular electronics. Its inherent structural and molecular recognition properties render it a leading candidate for molecular electronics applications. The structural stability, diversity and programmability of DNA provide overwhelming freedom for the design and fabrication of molecular-scale devices. In the past two decades DNA has therefore attracted inordinate amounts of attention in molecular electronics. This review gives a brief survey of recent experimental progress in DNA-based single-molecule electronics with special focus on single-molecule conductance and I–V characteristics of individual DNA molecules. Existing challenges and exciting future opportunities are also discussed. PMID:29342091
A new evaluation method research for fusion quality of infrared and visible images
NASA Astrophysics Data System (ADS)
Ge, Xingguo; Ji, Yiguo; Tao, Zhongxiang; Tian, Chunyan; Ning, Chengda
2017-03-01
In order to objectively evaluate the fusion effect of infrared and visible images, a fusion evaluation method based on energy-weighted average structural similarity and an edge information retention value is proposed to address the drawbacks of existing evaluation methods. The evaluation index of this method is given, and evaluation experiments are conducted on infrared and visible image fusion results obtained under different algorithms and environments on the basis of this index. The experimental results show that the objective evaluation index is consistent with the subjective evaluation results, indicating that the method is a practical and effective way to evaluate fused image quality.
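An energy-weighted average of structural similarities between each source image and the fused result can be sketched as below. This is a simplified single-window SSIM for illustration, not the authors' exact formulation; the energy-based weighting and function names are assumptions.

```python
import numpy as np

def ssim_global(x, y, c1=1e-4, c2=9e-4):
    # single-window SSIM over the whole image (a simplification of the
    # usual local SSIM map; c1, c2 are small stabilizing constants)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

def energy_weighted_ssim(ir, vis, fused):
    # weight each source's SSIM with the fused image by its relative energy
    e_ir, e_vis = np.sum(ir**2), np.sum(vis**2)
    w = e_ir / (e_ir + e_vis)
    return w * ssim_global(ir, fused) + (1 - w) * ssim_global(vis, fused)
```

A fused image identical to both sources scores 1; lower values indicate structural information lost from the weighted sources.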
NASA Astrophysics Data System (ADS)
Kwon, D.-H.; Lee, W.; Preval, S.; Ballance, C. P.; Behar, E.; Colgan, J.; Fontes, C. J.; Nakano, T.; Li, B.; Ding, X.; Dong, C. Z.; Fu, Y. B.; Badnell, N. R.; O'Mullane, M.; Chung, H.-K.; Braams, B. J.
2018-01-01
Under the auspices of the IAEA Atomic and Molecular Data Center and the Korean Atomic Energy Research Institute, our assembled group of authors has reviewed the current state of dielectronic recombination (DR) rate coefficients for various ion stages of tungsten (W). Subsequent recommendations were based upon available experimental data, first-principle calculations carried out in support of this paper and from available recombination data within existing atomic databases. If a recommendation was possible, data were compiled, evaluated and fitted to a functional form with associated uncertainty information retained, where available. This paper also considers the variation of the W fractional abundance due to the underlying atomic data when employing different data sets.
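Recommended DR rate coefficients of this kind are commonly fitted to the conventional functional form α(T) = T^(-3/2) Σᵢ cᵢ exp(-Eᵢ/T). The abstract does not reproduce the fit form used, so the sketch below adopts that standard parameterization purely as an assumption.

```python
import numpy as np

def dr_rate_coefficient(T, c, E):
    # conventional DR fitting form:
    #   alpha(T) = T**(-3/2) * sum_i c_i * exp(-E_i / T)
    # c_i are fitted coefficients and E_i resonance energies, with E_i
    # expressed in the same units as the temperature T
    T = np.asarray(T, dtype=float)
    return T**-1.5 * sum(ci * np.exp(-Ei / T) for ci, Ei in zip(c, E))
```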
A new stationary gridline artifact suppression method based on the 2D discrete wavelet transform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Hui, E-mail: corinna@seu.edu.cn; Key Laboratory of Computer Network and Information Integration; Centre de Recherche en Information Biomédicale sino-français, Laboratoire International Associé, Inserm, Université de Rennes 1, Rennes 35000
2015-04-15
Purpose: In digital x-ray radiography, an antiscatter grid is inserted between the patient and the image receptor to reduce scattered radiation. If the antiscatter grid is used in a stationary way, gridline artifacts will appear in the final image. In most of the gridline removal image processing methods, the useful information with spatial frequencies close to that of the gridline is usually lost or degraded. In this study, a new stationary gridline suppression method is designed to preserve more of the useful information. Methods: The method is as follows. The input image is first recursively decomposed into several smaller subimages using a multiscale 2D discrete wavelet transform. The decomposition process stops when the gridline signal is found to be greater than a threshold in one or several of these subimages using a gridline detection module. An automatic Gaussian band-stop filter is then applied to the detected subimages to remove the gridline signal. Finally, the restored image is achieved using the corresponding 2D inverse discrete wavelet transform. Results: The processed images show that the proposed method can remove the gridline signal efficiently while maintaining the image details. The spectra of a 1D Fourier transform of the processed images demonstrate that, compared with some existing gridline removal methods, the proposed method has better information preservation after the removal of the gridline artifacts. Additionally, the performance speed is relatively high. Conclusions: The experimental results demonstrate the efficiency of the proposed method. Compared with some existing gridline removal methods, the proposed method can preserve more information within an acceptable execution time.
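The pipeline of wavelet decomposition, band-stop filtering of the affected subband, and inverse transform can be sketched in miniature. This uses a single-level Haar transform in place of the paper's multiscale recursion, omits the gridline detection module, and assumes the gridline energy lies in the horizontal-detail subband; the frequency f0 and bandwidth bw are illustrative parameters.

```python
import numpy as np

def haar_dwt2(x):
    # one-level 2D Haar transform (perfect reconstruction with haar_idwt2)
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return ((a + b + c + d) / 2,   # approximation
            (a + b - c - d) / 2,   # horizontal detail
            (a - b + c - d) / 2,   # vertical detail
            (a - b - c + d) / 2)   # diagonal detail

def haar_idwt2(ca, ch, cv, cd):
    out = np.empty((2 * ca.shape[0], 2 * ca.shape[1]))
    out[0::2, 0::2] = (ca + ch + cv + cd) / 2
    out[0::2, 1::2] = (ca + ch - cv - cd) / 2
    out[1::2, 0::2] = (ca - ch + cv - cd) / 2
    out[1::2, 1::2] = (ca - ch - cv + cd) / 2
    return out

def gaussian_bandstop(sub, f0, bw):
    # frequency-domain Gaussian band-stop centred on radial frequency f0
    fy = np.fft.fftfreq(sub.shape[0])[:, None]
    fx = np.fft.fftfreq(sub.shape[1])[None, :]
    r = np.hypot(fx, fy)
    h = 1.0 - np.exp(-((r - f0) ** 2) / (2 * bw ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(sub) * h))

def suppress_gridlines(img, f0=0.25, bw=0.05):
    # filter the horizontal-detail subband, where a horizontal gridline
    # concentrates, then reconstruct the image
    ca, ch, cv, cd = haar_dwt2(img)
    return haar_idwt2(ca, gaussian_bandstop(ch, f0, bw), cv, cd)
```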
Predicting novel substrates for enzymes with minimal experimental effort with active learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pertusi, Dante A.; Moura, Matthew E.; Jeffryes, James G.
Enzymatic substrate promiscuity is more ubiquitous than previously thought, with significant consequences for understanding metabolism and its application to biocatalysis. This realization has given rise to the need for efficient characterization of enzyme promiscuity. Enzyme promiscuity is currently characterized with a limited number of human-selected compounds that may not be representative of the enzyme's versatility. While testing large numbers of compounds may be impractical, computational approaches can exploit existing data to determine the most informative substrates to test next, thereby more thoroughly exploring an enzyme's versatility. To demonstrate this, we used existing studies and tested compounds for four different enzymes, developed support vector machine (SVM) models using these datasets, and selected additional compounds for experiments using an active learning approach. SVMs trained on a chemically diverse set of compounds were discovered to achieve maximum accuracies of ~80% using ~33% fewer compounds than datasets based on all compounds tested in existing studies. Active learning-selected compounds for testing resolved apparent conflicts in the existing training data, while adding diversity to the dataset. The application of these algorithms to wide arrays of metabolic enzymes would result in a library of SVMs that can predict high-probability promiscuous enzymatic reactions and could prove a valuable resource for the design of novel metabolic pathways.
Predicting novel substrates for enzymes with minimal experimental effort with active learning.
Pertusi, Dante A; Moura, Matthew E; Jeffryes, James G; Prabhu, Siddhant; Walters Biggs, Bradley; Tyo, Keith E J
2017-11-01
Enzymatic substrate promiscuity is more ubiquitous than previously thought, with significant consequences for understanding metabolism and its application to biocatalysis. This realization has given rise to the need for efficient characterization of enzyme promiscuity. Enzyme promiscuity is currently characterized with a limited number of human-selected compounds that may not be representative of the enzyme's versatility. While testing large numbers of compounds may be impractical, computational approaches can exploit existing data to determine the most informative substrates to test next, thereby more thoroughly exploring an enzyme's versatility. To demonstrate this, we used existing studies and tested compounds for four different enzymes, developed support vector machine (SVM) models using these datasets, and selected additional compounds for experiments using an active learning approach. SVMs trained on a chemically diverse set of compounds were discovered to achieve maximum accuracies of ~80% using ~33% fewer compounds than datasets based on all compounds tested in existing studies. Active learning-selected compounds for testing resolved apparent conflicts in the existing training data, while adding diversity to the dataset. The application of these algorithms to wide arrays of metabolic enzymes would result in a library of SVMs that can predict high-probability promiscuous enzymatic reactions and could prove a valuable resource for the design of novel metabolic pathways. Copyright © 2017 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
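The loop of fitting an SVM to already-tested compounds and choosing the next most informative one can be sketched with uncertainty sampling, i.e. querying the pool compound closest to the decision boundary. The feature representation, kernel, and query criterion here are illustrative assumptions, not necessarily the authors' exact choices.

```python
import numpy as np
from sklearn.svm import SVC

def most_informative(model, pool):
    # uncertainty sampling: the untested compound nearest the SVM margin
    margins = np.abs(model.decision_function(pool))
    return int(np.argmin(margins))

def active_learning_round(X_labeled, y_labeled, X_pool):
    # fit on compounds with known substrate/non-substrate labels, then
    # propose the index of the next pool compound to test experimentally
    model = SVC(kernel="linear").fit(X_labeled, y_labeled)
    return model, most_informative(model, X_pool)
```

Each round, the queried compound is assayed, added to the labeled set, and the model refit; conflicting labels near the boundary are exactly the cases such a strategy prioritizes.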
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
...: 60-Day Notice of Information Collection Under Review: Form I- 602, Application by Refugee for Waiver...: Extension of an existing information collection. (2) Title of the Form/Collection: Application by Refugee...
DOT National Transportation Integrated Search
1988-06-01
This report documents the construction and initial evaluation of several experimental features which were incorporated as part of an overlay of an existing PCC pavement in order to determine the feasibility of extending overlay service life. The expe...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-09
... Activities: Form G-884, Extension of an Existing Information Collection; Comment Request ACTION: 30-Day Notice of Information Collection Under Review: Form G- 884, Request for the Return of Original Document(s... information technology, e.g., permitting electronic submission of responses. Overview of This Information...
An improved swarm optimization for parameter estimation and biological model selection.
Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail
2013-01-01
One of the key aspects of computational systems biology is the investigation of the dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of their nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs with the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from the noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary searching strategy employed by the Chemical Reaction Optimization, into the neighbouring searching strategy of the Firefly Algorithm method. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators, and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than the existing Differential Evolution, Firefly Algorithm and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, Akaike Information Criterion was employed to evaluate the model selection, which highlighted the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. It is hoped that this study provides new insight into developing more accurate and reliable biological models based on limited and low-quality experimental data.
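The Akaike Information Criterion used above for model selection trades goodness of fit against parameter count; for least-squares fits with Gaussian errors it reduces (up to constants that do not affect model ranking) to n·ln(RSS/n) + 2k. A minimal sketch:

```python
import numpy as np

def aic_least_squares(y_obs, y_pred, n_params):
    # AIC for a Gaussian error model fitted by least squares:
    #   AIC = n * ln(RSS / n) + 2k
    # constant terms are dropped, so only differences in AIC are meaningful
    resid = np.asarray(y_obs, dtype=float) - np.asarray(y_pred, dtype=float)
    n = resid.size
    rss = float(np.sum(resid**2))
    return n * np.log(rss / n) + 2 * n_params

# lower AIC -> better trade-off between fit quality and model complexity
```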
47 CFR 74.131 - Licensing requirements, necessary showing.
Code of Federal Regulations, 2010 CFR
2010-10-01
... experimental broadcast station, change in facilities of any existing station, or modification of license is... research and experimentation in the technical phases of broadcasting which indicates reasonable promise of...
47 CFR 74.131 - Licensing requirements, necessary showing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... experimental broadcast station, change in facilities of any existing station, or modification of license is... research and experimentation in the technical phases of broadcasting which indicates reasonable promise of...
Protection of Location Privacy Based on Distributed Collaborative Recommendations
Wang, Peng; Yang, Jing; Zhang, Jian-Pei
2016-01-01
In the existing centralized location services system structure, the server is easily attacked and becomes the communication bottleneck, which can cause the disclosure of users’ locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendations of the neighboring users’ location information profiles. If no suitable recommended location service results are obtained, the user can send a service request to the server based on the construction of a k-anonymous data set using the centroid position of the neighbors. In this strategy, we design a new model of distributed collaborative recommendation location service based on the users’ location information profiles and use generalization and encryption to ensure the safety of the user’s location information privacy. Finally, we use a real location data set for theoretical and experimental analysis. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services, and better protecting the user’s location privacy. PMID:27649308
Protection of Location Privacy Based on Distributed Collaborative Recommendations.
Wang, Peng; Yang, Jing; Zhang, Jian-Pei
2016-01-01
In the existing centralized location services system structure, the server is easily attacked and becomes the communication bottleneck, which can cause the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendations of the neighboring users' location information profiles. If no suitable recommended location service results are obtained, the user can send a service request to the server based on the construction of a k-anonymous data set using the centroid position of the neighbors. In this strategy, we design a new model of distributed collaborative recommendation location service based on the users' location information profiles and use generalization and encryption to ensure the safety of the user's location information privacy. Finally, we use a real location data set for theoretical and experimental analysis. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services, and better protecting the user's location privacy.
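The server-fallback step described above, building a k-anonymous set from neighbor profiles and querying with its centroid rather than the true position, can be sketched as follows; the nearest-neighbor selection rule and function name are illustrative assumptions.

```python
import numpy as np

def k_anonymous_query_point(user_loc, neighbor_locs, k):
    # build a k-anonymous cloaking set from the user plus the k-1 nearest
    # neighbor profile locations, and return its centroid: the server sees
    # only the centroid, never the user's true position
    neighbors = np.asarray(neighbor_locs, dtype=float)
    d = np.linalg.norm(neighbors - np.asarray(user_loc, dtype=float), axis=1)
    nearest = neighbors[np.argsort(d)[: k - 1]]
    cloak = np.vstack([user_loc, nearest])
    return cloak.mean(axis=0)
```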
A Hybrid Ant Colony Optimization Algorithm for the Extended Capacitated Arc Routing Problem.
Li-Ning Xing; Rohlfshagen, P; Ying-Wu Chen; Xin Yao
2011-08-01
The capacitated arc routing problem (CARP) is representative of numerous practical applications, and in order to widen its scope, we consider an extended version of this problem that entails both total service time and fixed investment costs. We subsequently propose a hybrid ant colony optimization (ACO) algorithm (HACOA) to solve instances of the extended CARP. This approach is characterized by the exploitation of heuristic information, adaptive parameters, and local optimization techniques: Two kinds of heuristic information, arc cluster information and arc priority information, are obtained continuously from the solutions sampled to guide the subsequent optimization process. The adaptive parameters ease the burden of choosing initial values and facilitate improved and more robust results. Finally, local optimization, based on the two-opt heuristic, is employed to improve the overall performance of the proposed algorithm. The resulting HACOA is tested on four sets of benchmark problems containing a total of 87 instances with up to 140 nodes and 380 arcs. In order to evaluate the effectiveness of the proposed method, some existing capacitated arc routing heuristics are extended to cope with the extended version of this problem; the experimental results indicate that the proposed ACO method outperforms these heuristics.
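The two-opt heuristic used for local optimization repeatedly reverses a segment of a route whenever doing so shortens it. A minimal node-routing sketch (the CARP operates on arcs, so this illustrates the two-opt move itself rather than HACOA):

```python
def route_length(route, dist):
    # total length of a route given a pairwise distance matrix
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def two_opt(route, dist):
    # reverse the segment between positions i and j whenever that shortens
    # the route; repeat until no improving move exists (a local optimum)
    best = list(route)
    improved = True
    while improved:
        improved = False
        for i in range(1, len(best) - 2):
            for j in range(i + 1, len(best) - 1):
                cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
                if route_length(cand, dist) < route_length(best, dist):
                    best, improved = cand, True
    return best
```

In HACOA such a move would polish the routes sampled by the ant colony before pheromone update.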
Gorini, Alessandra; Mazzocco, Ketti; Pravettoni, Gabriella
2015-01-01
Due to the lack of other treatment options, patient candidates for participation in phase I clinical trials are considered the most vulnerable, and many ethical concerns have emerged regarding the informed consent process used in the experimental design of such trials. Starting with these considerations, this nonsystematic review is aimed at analyzing the decision-making processes underlying patients' decision about whether to participate (or not) in phase I trials in order to clarify the cognitive and emotional aspects most strongly implicated in this decision. Considering that there is no uniform decision calculus and that many different variables other than the patient-physician relationship (including demographic, clinical, and personal characteristics) may influence patients' preferences for and processing of information, we conclude that patients' informed decision-making can be facilitated by creating a rigorously developed, calibrated, and validated computer tool modeled on each single patient's knowledge, values, and emotional and cognitive decisional skills. Such a tool will also help oncologists to provide tailored medical information that is useful to improve the shared decision-making process, thereby possibly increasing patient participation in clinical trials. © 2015 S. Karger AG, Basel.
Evidence Combination From an Evolutionary Game Theory Perspective
Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu
2017-01-01
Dempster-Shafer evidence theory is a primary methodology for multi-source information fusion because it is good at dealing with uncertain information. This theory provides a Dempster’s rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives, such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multi-evidence system. Within the proposed ECR, we develop a Jaccard matrix game (JMG) to formalize the interaction between propositions in evidences, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as solution’s stability and convergence, have been mathematically proved as well. PMID:26285231
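Dempster's rule combines two basic mass assignments by intersecting their focal elements and renormalizing by the conflict mass. The sketch below (focal elements as frozensets, an implementation choice) also reproduces Zadeh's classic example of the counter-intuitive behavior the abstract refers to: two nearly opposed evidences force all mass onto a proposition both consider nearly impossible.

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule of combination for two mass functions whose focal
    # elements are frozensets; mass on empty intersections is the conflict,
    # and the remainder is renormalized by (1 - conflict)
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}
```

Zadeh's paradox: m1 = {A: 0.99, B: 0.01} and m2 = {C: 0.99, B: 0.01} combine to assign B a mass of 1, despite neither source finding B plausible.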
Understanding user intents in online health forums.
Zhang, Thomas; Cho, Jason H D; Zhai, Chengxiang
2015-07-01
Online health forums provide a convenient way for patients to obtain medical information and connect with physicians and peers outside of clinical settings. However, the large quantities of unstructured and diversified content generated on these forums make it difficult for users to digest and extract useful information. Understanding user intents would enable forums to find and recommend relevant information to users by filtering out threads that do not match particular intents. In this paper, we derive a taxonomy of intents to capture user information needs in online health forums and propose novel pattern-based features for use with a multiclass support vector machine (SVM) classifier to classify original thread posts according to their underlying intents. Since no dataset existed for this task, we employed three annotators to manually label a dataset of 1192 HealthBoards posts spanning four forum topics. Experimental results show that an SVM using pattern-based features is highly capable of identifying user intents in forum posts, reaching a maximum precision of 75%, and that an SVM-based hierarchical classifier using both pattern and word features outperforms its SVM counterpart that uses only word features. Furthermore, comparable classification performance can be achieved by training and testing on posts from different forum topics.
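A rough sketch of the classification setup might look like the following, using a linear SVM over n-gram text features. The paper's pattern-based features, intent taxonomy, and HealthBoards data are not reproduced; the tiny corpus and intent labels below are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Invented toy posts and intent labels (not the paper's taxonomy or data)
posts = [
    "what are the side effects of this medication",
    "has anyone else experienced dizziness on this drug",
    "looking for a good specialist in my area",
    "can someone recommend a doctor for chronic pain",
]
intents = ["seek_info", "share_experience", "find_provider", "find_provider"]

# Multiclass linear SVM over unigram+bigram TF-IDF features
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(posts, intents)
print(clf.predict(["any recommendation for a doctor nearby"])[0])
```

In the paper's hierarchical variant, a first classifier would route a post to a coarse intent group before a second classifier picks the fine-grained intent.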
Establishing and storing of deterministic quantum entanglement among three distant atomic ensembles.
Yan, Zhihui; Wu, Liang; Jia, Xiaojun; Liu, Yanhong; Deng, Ruijie; Li, Shujing; Wang, Hai; Xie, Changde; Peng, Kunchi
2017-09-28
It is crucial for the physical realization of quantum information networks to first establish entanglement among multiple space-separated quantum memories and then, at a user-controlled moment, to transfer the stored entanglement to quantum channels for distribution and conveyance of information. Here we present an experimental demonstration on generation, storage, and transfer of deterministic quantum entanglement among three spatially separated atomic ensembles. The off-line prepared multipartite entanglement of optical modes is mapped into three distant atomic ensembles to establish entanglement of atomic spin waves via electromagnetically induced transparency light-matter interaction. Then the stored atomic entanglement is transferred into a tripartite quadrature entangled state of light, which is space-separated and can be dynamically allocated to three quantum channels for conveying quantum information. The existence of entanglement among three released optical modes verifies that the system has the capacity to preserve multipartite entanglement. The presented protocol can be directly extended to larger quantum networks with more nodes.

Continuous-variable encoding is a promising approach for quantum information and communication networks. Here, the authors show how to map entanglement from three spatial optical modes to three separated atomic samples via electromagnetically induced transparency, releasing it later on demand.
Adaptive structured dictionary learning for image fusion based on group-sparse-representation
NASA Astrophysics Data System (ADS)
Yang, Jiajie; Sun, Bin; Luo, Chengwei; Wu, Yuzhong; Xu, Limei
2018-04-01
Dictionary learning is the key process of sparse representation, which is one of the most widely used image representation theories in image fusion. Existing dictionary learning methods do not make good use of the group structure information and the sparse coefficients. In this paper, we propose a new adaptive structured dictionary learning algorithm and an l1-norm maximum fusion rule that innovatively utilizes grouped sparse coefficients to merge the images. In the dictionary learning algorithm, we do not need prior knowledge about any group structure of the dictionary. By using the characteristics of the dictionary in expressing the signal, our algorithm can automatically find the desired potential structure information hidden in the dictionary. The fusion rule exploits the physical meaning of the group structure dictionary and makes an activity-level judgement on the structure information when the images are being merged. Therefore, the fused image can retain more significant information. Comparisons have been made with several state-of-the-art dictionary learning methods and fusion rules. The experimental results demonstrate that the dictionary learning algorithm and the fusion rule both outperform others in terms of several objective evaluation metrics.
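The activity-level idea behind an l1-norm "choose-max" fusion rule can be sketched as follows: for each structural group, keep the sparse coefficients from whichever source image has the larger l1 norm over that group. This is a hypothetical illustration (dictionary learning itself is not shown, and the grouping is invented), not the paper's exact algorithm.

```python
import numpy as np

def l1_max_fusion(coeffs_a, coeffs_b, groups):
    """Fuse two sparse-coefficient vectors over the same dictionary.
    groups: list of index arrays, one per structural group."""
    fused = np.zeros_like(coeffs_a)
    for idx in groups:
        # Activity level of each source over this group = l1 norm
        if np.sum(np.abs(coeffs_a[idx])) >= np.sum(np.abs(coeffs_b[idx])):
            fused[idx] = coeffs_a[idx]
        else:
            fused[idx] = coeffs_b[idx]
    return fused

# Toy example: source A is active on the first group, source B on the second
a = np.array([0.9, 0.1, 0.0, 0.0])
b = np.array([0.0, 0.0, 0.5, 0.2])
groups = [np.array([0, 1]), np.array([2, 3])]
print(l1_max_fusion(a, b, groups))  # [0.9 0.1 0.5 0.2]
```

The fused coefficients would then be multiplied by the learned dictionary to reconstruct the fused image.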
Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Haller, Harold S.
2009-01-01
It is the purpose of this study to examine the impact of "highly efficient" Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, where a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four/five (4/5) cases, compound data sets (CFD/EXP) were generated that allow the prediction of the complete set of experimental results. No statistical differences were found to exist between the combined (CFD/EXP) generated data sets and the complete experimental data set composed of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) generated data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.
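The core idea of anchoring a large CFD data set with a few experimental points and fitting one response surface to the compound set can be sketched in one dimension. Everything below is illustrative: the data are made up, the response is a simple quadratic, and the offset correction is a plausible stand-in for the study's actual DOE procedure.

```python
import numpy as np

# Synthetic stand-ins: 15 CFD cases and 4 experimental cases of a response
# y(x); the experiment sits slightly above the CFD prediction.
x_cfd = np.linspace(0.0, 1.0, 15)
y_cfd = 1.0 + 0.5 * x_cfd - 0.8 * x_cfd**2
x_exp = np.array([0.1, 0.4, 0.7, 0.95])
y_exp = 1.05 + 0.5 * x_exp - 0.8 * x_exp**2

# Shift the CFD data by the mean CFD-vs-experiment offset at matching
# points, then fit a single quadratic response surface to the compound
# (CFD/EXP) data set.
offset = np.mean(y_exp - np.interp(x_exp, x_cfd, y_cfd))
x_all = np.concatenate([x_cfd, x_exp])
y_all = np.concatenate([y_cfd + offset, y_exp])
coeffs = np.polyfit(x_all, y_all, deg=2)  # [quadratic, linear, constant]
print(np.round(coeffs, 3))
```

The fitted surface can then be evaluated at conditions where no experimental data were taken, which is the prediction step the abstract describes.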
Data management routines for reproducible research using the G-Node Python Client library
Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas
2014-01-01
Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that a centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654
Dornburg, Alex; Su, Zhuo; Townsend, Jeffrey P
2018-06-25
With the rise of genome-scale datasets there has been a call for increased data scrutiny and careful selection of loci appropriate for attempting the resolution of a phylogenetic problem. Such loci are desired to maximize phylogenetic information content while minimizing the risk of homoplasy. Theory posits the existence of characters that evolve at such an optimal rate, and efforts to determine optimal rates of inference have been a cornerstone of phylogenetic experimental design for over two decades. However, both theoretical and empirical investigations of optimal rates have varied dramatically in their conclusions, spanning from no relationship to a tight relationship between the rate of change and phylogenetic utility. Here we synthesize these apparently contradictory views, demonstrating both empirical and theoretical conditions under which each is correct. We find that optimal rates of characters, not genes, are generally robust to most experimental design decisions. Moreover, consideration of site rate heterogeneity within a given locus is critical to accurate predictions of utility. Factors such as taxon sampling or the targeted number of characters providing support for a topology are additionally critical to predictions of phylogenetic utility based on the rate of character change. Further, optimality of rates and predictions of phylogenetic utility are not equivalent, demonstrating the need for further development of a comprehensive theory of phylogenetic experimental design.
Memory bias for threatening information in anxiety and anxiety disorders: a meta-analytic review.
Mitte, Kristin
2008-11-01
Although some theories suggest that anxious individuals selectively remember threatening stimuli, findings remain contradictory despite a considerable amount of research. A quantitative integration of 165 studies with 9,046 participants (clinical and nonclinical samples) examined whether a memory bias exists and which moderator variables influence its magnitude. Implicit memory bias was investigated in lexical decision/stimulus identification and word-stem completion paradigms; explicit memory bias was investigated in recognition and recall paradigms. Overall, effect sizes showed no significant impact of anxiety on implicit memory and recognition. Analyses indicated a memory bias for recall, whose magnitude depended on experimental study procedures like the encoding procedure or retention interval. Anxiety influenced recollection of previous experiences; anxious individuals favored threat-related information. Across all paradigms, clinical status was not significantly linked to effect sizes, indicating no qualitative difference in information processing between anxiety patients and high-anxious persons. The large discrepancy between study effects in recall and recognition indicates that future research is needed to identify moderator variables for avoidant and preferred remembering.
Feature-fused SSD: fast detection for small objects
NASA Astrophysics Data System (ADS)
Cao, Guimei; Xie, Xuemei; Yang, Wenzhe; Liao, Quan; Shi, Guangming; Wu, Jinjian
2018-04-01
Small object detection is a challenging task in computer vision due to the limited resolution and information of small targets. To solve this problem, the majority of existing methods sacrifice speed for improvements in accuracy. In this paper, we aim to detect small objects at a fast speed, using the Single Shot MultiBox Detector (SSD), the best object detector with respect to the accuracy-vs-speed trade-off, as the base architecture. We propose a multi-level feature fusion method that introduces contextual information into SSD in order to improve the accuracy for small objects. For the fusion operation, we design two feature fusion modules, a concatenation module and an element-sum module, which differ in the way contextual information is added. Experimental results show that these two fusion modules obtain higher mAP on PASCAL VOC2007 than the baseline SSD by 1.6 and 1.7 points respectively, with 2-3 point improvements on some small-object categories in particular. Their testing speeds are 43 and 40 FPS respectively, exceeding the state-of-the-art Deconvolutional Single Shot Detector (DSSD) by 29.4 and 26.4 FPS.
Indoor detection of passive targets recast as an inverse scattering problem
NASA Astrophysics Data System (ADS)
Gottardi, G.; Moriyama, T.
2017-10-01
Wireless local area networks represent an alternative to custom sensors and dedicated surveillance systems for indoor target detection. The availability of the channel state information has opened up the exploitation of the spatial and frequency diversity given by orthogonal frequency division multiplexing. Such fine-grained information can be used to treat the detection problem as an inverse scattering problem. The goal of the detection is to reconstruct the properties of the investigation domain, namely to estimate whether the domain is empty or occupied by targets, starting from the measurement of the electromagnetic perturbation of the wireless channel. An innovative inversion strategy exploiting both the frequency and the spatial diversity of the channel state information is proposed. The target-dependent features are identified by combining the Kruskal-Wallis test and principal component analysis. The experimental validation points out the detection performance of the proposed method when applied to an existing wireless link of a WiFi architecture deployed in a real indoor scenario. False detection rates lower than 2% were obtained.
DOT National Transportation Integrated Search
1991-06-01
This report documents the construction, initial evaluation and final evaluation of several experimental features which were incorporated as part of an overlay of an existing PCC pavement in order to determine the feasibility of extending the asphalti...
NREL Opens Large Database of Inorganic Thin-Film Materials | News | NREL
April 3, 2018. An extensive experimental database of inorganic thin-film materials from the National Renewable Energy Laboratory (NREL) is now publicly available: the High Throughput Experimental Materials (HTEM) database.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
... appropriate habitat is found, the area will be considered for experimental introduction. The conservation committee will increase research efforts in experimental translocations in Conservation Area B and evaluate... conduct experimental vegetation treatments within existing conservation areas to determine if this...
Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J
2017-01-01
A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity data in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but that approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and the WLSFEM is then used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
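The weighting idea, where more accurate data pull the solution harder, can be illustrated with a toy precision-weighted least-squares problem. This is a schematic sketch of the principle only, not the WLSFEM itself: the closed-form average below stands in for the finite element minimization, and all numbers are invented.

```python
import numpy as np

def assimilate(u_model, d, sigma_data, sigma_model):
    """Minimize w_d*||u - d||^2 + w_m*||u - u_model||^2 pointwise,
    with weights set to the inverse variances of data and model."""
    w_d = 1.0 / sigma_data**2
    w_m = 1.0 / sigma_model**2
    # Closed-form minimizer: precision-weighted average
    return (w_d * d + w_m * u_model) / (w_d + w_m)

u_model = np.array([1.0, 2.0, 3.0])   # model-predicted velocities
d = np.array([1.2, 2.2, 3.2])         # noisy measurements

print(assimilate(u_model, d, sigma_data=0.1, sigma_model=0.3))  # pulled toward data
print(assimilate(u_model, d, sigma_data=0.5, sigma_model=0.1))  # stays near model
```

In the actual method the same trade-off is expressed through weights in the least-squares functional over the whole flow field rather than pointwise.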
Broadcasting GPS integrity information using Loran-C
NASA Astrophysics Data System (ADS)
Lo, Sherman Chih
The United States Federal Aviation Administration (FAA) will adopt the Global Positioning System (GPS) as its primary navigation system for aviation, as stated by the Federal Radionavigation Plans (FRP) of 1996 and 1999. The FRP also proposes the reduction or termination of some existing radionavigation systems in favor of GPS and satellite navigation. It may be beneficial to retain some of these existing terrestrial navigation systems if they can provide increased safety and redundancy to the GPS-based architecture. One manner in which this can be done is by using or creating a data link on these existing radionavigation systems. These systems can thus provide both navigation and an additional broadcast of GPS integrity information. This thesis examines the use of terrestrial data links to provide Wide Area Augmentation System (WAAS) based GPS integrity information for aviation. The thesis focuses on using Loran-C to broadcast WAAS data. Analysis and experimental results demonstrating the capabilities of these designs are also discussed. Using Loran for this purpose requires increasing its data capacity. Many Loran modulation schemes are developed and analyzed. The data rates developed significantly increased the Loran data capacity. However, retaining compatibility with Loran legacy users resulted in data rates below the WAAS data rate of 250 bps. As a result, this thesis also examines means of reducing the data requirements for WAAS information. While higher data rates offer improved performance and compatibility with WAAS, this thesis demonstrates that higher rates incur greater interference. Therefore, this work develops and considers a 108 bps and a 167 bps Loran GPS integrity channel (LOGIC) design. The performance of the two designs illustrates some of the advantages and disadvantages of using a higher data rate. Analysis demonstrated means of maintaining integrity with these low data rate systems and determined the theoretical capabilities of the systems.
The system was tested empirically by developing software that generated the LOGIC message and applied these messages to a GPS user. The resulting 108 bps and 167 bps systems demonstrated capability to provide lateral navigation/vertical navigation (LNAV/VNAV) and approach with vertical guidance (APV) respectively.
Curriculum system for experimental teaching in optoelectronic information
NASA Astrophysics Data System (ADS)
Di, Hongwei; Chen, Zhenqiang; Zhang, Jun; Luo, Yunhan
2017-08-01
The experimental curriculum system is directly related to the quality of talent training. Based on a careful investigation of the development requirements for optoelectronic information talents in the new century, the experimental teaching goals and content were defined: to cultivate students' innovative consciousness, innovative thinking, creativity, and problem-solving ability. By straightening out the correlations among the experimental teaching components of the main courses, the overall structure design was worked out in phases, as well as the hierarchical curriculum content. Following the ideas of "basic, comprehensive, applied and innovative", an experimental teaching system called "triple-three" was constructed for optoelectronic information experimental teaching practice.
Chen, Yue; Gao, Qin; Song, Fei; Li, Zhizhong; Wang, Yufan
2017-08-01
In the main control rooms of nuclear power plants, operators frequently have to switch between procedure displays and system information displays. In this study, we proposed an operation-unit-based integrated design that combines the two displays to facilitate the synthesis of information. We grouped actions that accomplish a single goal into operation units and showed these operation units on the displays of system states. In addition, we used different levels of visual salience to highlight the current unit and provided a list of execution history records. A laboratory experiment, with 42 students performing a simulated procedure to deal with an unexpected high pressuriser level, was conducted to compare this design against an action-based integrated design and the existing separated-displays design. The results indicate that our operation-unit-based integrated design yielded the best performance in terms of time and completion rate and helped more participants detect unexpected system failures. Practitioner Summary: In current nuclear control rooms, operators frequently have to switch between procedure and system information displays. We developed an integrated design that incorporates procedure information into system displays. A laboratory study showed that the proposed design significantly improved participants' performance and increased the probability of detecting unexpected system failures.
Predicting protein contact map using evolutionary and physical constraints by integer programming.
Wang, Zhiyong; Xu, Jinbo
2013-07-01
A protein contact map describes the pairwise spatial and functional relationships of residues in a protein and contains key information for protein 3D structure prediction. Although studied extensively, it remains challenging to predict the contact map using only sequence information. Most existing methods predict the contact map matrix element by element, ignoring correlation among contacts and the physical feasibility of the whole contact map. A couple of recent methods predict the contact map using mutual information, taking contact correlation into consideration and enforcing a sparsity restraint, but these methods demand a very large number of sequence homologs for the protein under consideration, and the resultant contact map may still be physically infeasible. This article presents a novel method, PhyCMAP, for contact map prediction, integrating both evolutionary and physical restraints by machine learning and integer linear programming. The evolutionary restraints are much more informative than mutual information, and the physical restraints specify more concrete relationships among contacts than the sparsity restraint. As such, our method greatly reduces the solution space of the contact map matrix and, thus, significantly improves prediction accuracy. Experimental results confirm that PhyCMAP outperforms currently popular methods no matter how many sequence homologs are available for the protein under consideration. PhyCMAP is available at http://raptorx.uchicago.edu.
Idaho National Engineering Laboratory code assessment of the Rocky Flats transuranic waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-07-01
This report is an assessment of the content codes associated with transuranic waste shipped from the Rocky Flats Plant in Golden, Colorado, to INEL. The primary objective of this document is to characterize and describe the transuranic wastes shipped to INEL from Rocky Flats by item description code (IDC). This information will aid INEL in determining if the waste meets the waste acceptance criteria (WAC) of the Waste Isolation Pilot Plant (WIPP). The waste covered by this content code assessment was shipped from Rocky Flats between 1985 and 1989. These years coincide with the dates for information available in the Rocky Flats Solid Waste Information Management System (SWIMS). The majority of waste shipped during this time was certified to the existing WIPP WAC. This waste is referred to as precertified waste. Reassessment of these precertified waste containers is necessary because of changes in the WIPP WAC. To accomplish this assessment, the analytical and process knowledge available on the various IDCs used at Rocky Flats were evaluated. Rocky Flats sources for this information include employee interviews, SWIMS, Transuranic Waste Certification Program, Transuranic Waste Inspection Procedure, Backlog Waste Baseline Books, WIPP Experimental Waste Characterization Program (headspace analysis), and other related documents, procedures, and programs. Summaries are provided of: (a) certification information, (b) waste description, (c) generation source, (d) recovery method, (e) waste packaging and handling information, (f) container preparation information, (g) assay information, (h) inspection information, (i) analytical data, and (j) RCRA characterization.
Cycle frequency in standard Rock-Paper-Scissors games: Evidence from experimental economics
NASA Astrophysics Data System (ADS)
Xu, Bin; Zhou, Hai-Jun; Wang, Zhijian
2013-10-01
The Rock-Paper-Scissors (RPS) game is a widely used model system in game theory. Evolutionary game theory predicts the existence of persistent cycles in the evolutionary trajectories of the RPS game, but experimental evidence has remained rather weak. In this work, we performed laboratory experiments on the RPS game and analyzed the social-state evolutionary trajectories of twelve populations of N=6 players. We found strong evidence supporting the existence of persistent cycles. The mean cycling frequency was measured to be 0.029±0.009 periods per experimental round. Our experimental observations can be quantitatively explained by a simple non-equilibrium model, namely the discrete-time logit dynamical process with a noise parameter. Our work therefore favors the evolutionary game theory over the classical game theory for describing the dynamical behavior of the RPS game.
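One plausible way to measure a cycling frequency from a social-state trajectory is to accumulate the signed angle the state sweeps around its mean and divide by 2π per round. The sketch below uses that generic winding-number idea on a synthetic trajectory; it is an illustration of the concept, not the paper's exact estimator or data.

```python
import numpy as np

def cycle_frequency(states):
    """Net cycles per round of a social-state trajectory (n_R, n_P, n_S).
    Since n_R + n_P + n_S = N, the first two counts fix the state, so we
    track the signed angle swept around the mean state in the (n_R, n_P)
    plane."""
    xy = np.asarray(states, dtype=float)[:, :2]
    c = xy.mean(axis=0)
    angles = np.arctan2(xy[:, 1] - c[1], xy[:, 0] - c[0])
    d = np.diff(angles)
    d = (d + np.pi) % (2 * np.pi) - np.pi  # wrap steps into (-pi, pi]
    return d.sum() / (2 * np.pi * (len(xy) - 1))

# Synthetic trajectory: one full counter-clockwise cycle around the
# centroid state (2, 2, 2) of an N=6 population, over 12 rounds.
t = np.linspace(0.0, 2.0 * np.pi, 13)
traj = np.column_stack([2 + np.cos(t), 2 + np.sin(t), 2 - np.cos(t) - np.sin(t)])
print(round(cycle_frequency(traj), 4))  # 1/12 of a cycle per round
```

A persistent nonzero value of this statistic over many rounds is the kind of signature the experiment reports (0.029 periods per round).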
Experimental Demonstration of In-Place Calibration for Time Domain Microwave Imaging System
NASA Astrophysics Data System (ADS)
Kwon, S.; Son, S.; Lee, K.
2018-04-01
In this study, in-place calibration was demonstrated experimentally using the developed time domain measurement system. Experiments were conducted using three calibration methods: the proposed in-place calibration and two existing calibrations, array rotation and differential calibration. The in-place calibration uses dual receivers located at an equal distance from the transmitter. The received signals at the dual receivers contain similar unwanted components, namely the directly received signal and antenna coupling. In contrast to simulations, the antennas are not perfectly matched and there may be unexpected environmental errors; we therefore used the developed experimental system to demonstrate the proposed method. Possible problems with low signal-to-noise ratio and clock jitter, which may exist in time domain systems, were mitigated by averaging repeatedly measured signals. The tumor was successfully detected using all three calibration methods. For a quantitative comparison between the existing rotation calibration and the proposed in-place calibration, the cross correlation was calculated against the reconstructed image of the ideal differential calibration: the mean cross correlation of the in-place calibration was 0.80, while that of the rotation calibration was 0.55. Furthermore, the simulation results were compared with the experimental results to verify the in-place calibration method; this quantitative analysis shows that the experimental results follow a trend similar to the simulation.
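The image comparison can be illustrated with a simple normalized (zero-mean, unit-variance) cross correlation between two reconstructed images. This is a generic sketch of such a metric, not necessarily the authors' exact formulation, and the arrays are stand-ins for real reconstructions.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two same-size images."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

reference = np.arange(16.0).reshape(4, 4)      # stand-in "ideal" reconstruction
print(round(ncc(reference, reference), 3))     # 1.0 for identical images
print(ncc(reference, reference[::-1]) < 1.0)   # lower for a mismatched image
```

Values near 1 indicate that a calibration reproduces the ideal differential-calibration image closely, which is how the 0.80 vs 0.55 comparison should be read.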
A cooperative strategy for parameter estimation in large scale systems biology models.
Villaverde, Alejandro F; Egea, Jose A; Banga, Julio R
2012-06-22
Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows experimentally verifiable predictions to be made. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. A new approach for parameter estimation of large scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is the cooperation between different programs ("threads") that run in parallel on different processors. Each thread implements a state-of-the-art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and speeds up performance. Two parameter estimation problems involving models related to the central carbon metabolism of E. coli, which include different regulatory levels (metabolic and transcriptional), are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. The cooperative CeSS strategy is a general purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases.
The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems.
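The cooperation mechanism, with threads periodically adopting the best solution found by any thread, can be sketched schematically. The code below is a deliberately simplified serial stand-in (random hill-climbing searchers on an invented two-parameter cost function), not the eSS algorithm or a parallel implementation.

```python
import random

def cost(x):
    """Toy two-parameter calibration problem with optimum at (3, -1)."""
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def cooperative_search(n_threads=4, iters=200, share_every=20, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10), rng.uniform(-10, 10)] for _ in range(n_threads)]
    for it in range(iters):
        for x in pop:
            # Each "thread" makes a local move: random perturbation,
            # kept only if it improves the cost.
            cand = [xi + rng.gauss(0.0, 0.5) for xi in x]
            if cost(cand) < cost(x):
                x[:] = cand
        if it % share_every == 0:
            # Cooperation step: every thread adopts the best solution so far
            best = min(pop, key=cost)
            pop = [list(best) for _ in pop]
    return min(pop, key=cost)

best = cooperative_search()
print(cost(best))  # small: all threads converge near (3, -1)
```

In CeSS the shared information is richer (reference sets, not just the single best point) and the exchange happens between genuinely parallel processes.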
A cooperative strategy for parameter estimation in large scale systems biology models
2012-01-01
Background Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows to make experimentally verifiable predictions. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. Results A new approach for parameter estimation of large scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is the cooperation between different programs (“threads”) that run in parallel in different processors. Each thread implements a state of the art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and allows to speed up performance. Two parameter estimation problems involving models related with the central carbon metabolism of E. coli which include different regulatory levels (metabolic and transcriptional) are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. Conclusions The cooperative CeSS strategy is a general purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases. 
The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems. PMID:22727112
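The cooperation scheme the abstract describes can be sketched in miniature. The toy example below is not the eSS metaheuristic itself: several "threads" (simulated sequentially here) each improve their own candidate parameter vector against a toy calibration objective, and a periodic cooperation step broadcasts the global best solution to replace the worst incumbent. The objective, step sizes, and iteration counts are all hypothetical placeholders.

```python
import random

def cost(params):
    # Toy calibration objective: squared residuals against hypothetical
    # "experimental data" (here simply a vector of target parameter values).
    target = [1.0, -2.0, 0.5]
    return sum((p - t) ** 2 for p, t in zip(params, target))

def local_step(params, step=0.1):
    # Perturb one randomly chosen parameter (stand-in for a real local search).
    i = random.randrange(len(params))
    out = list(params)
    out[i] += random.uniform(-step, step)
    return out

def cooperative_search(n_threads=4, iters=2000, share_every=100):
    random.seed(0)
    # Each "thread" keeps its own incumbent solution.
    pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(n_threads)]
    best = min(pop, key=cost)
    for it in range(iters):
        for k in range(n_threads):
            cand = local_step(pop[k])
            if cost(cand) < cost(pop[k]):
                pop[k] = cand
        if it % share_every == 0:
            # Cooperation step: share the global best across threads by
            # replacing the worst-performing incumbent with it.
            best = min(pop + [best], key=cost)
            worst = max(range(n_threads), key=lambda j: cost(pop[j]))
            pop[worst] = list(best)
    return min(pop + [best], key=cost)

best = cooperative_search()
```

The cooperation step is what distinguishes this from independent parallel restarts: information discovered by one thread changes the trajectories of the others, which is the systemic effect the abstract credits for the speed-up.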
A review of reaction rates in high temperature air
NASA Technical Reports Server (NTRS)
Park, Chul
1989-01-01
The existing experimental data on the rate coefficients for the chemical reactions in nonequilibrium high temperature air are reviewed and collated, and a selected set of such values is recommended for use in hypersonic flow calculations. For the reactions of neutral species, the recommended values are chosen from the experimental data that existed mostly prior to 1970, and are slightly different from those used previously. For the reactions involving ions, the recommended rate coefficients are newly chosen from the experimental data obtained more recently. The reacting environment is assumed to lack thermal equilibrium, and the rate coefficients are expressed as a function of the controlling temperature, incorporating the recent multitemperature reaction concept.
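The multitemperature reaction concept mentioned above can be illustrated with a short sketch: for dissociation reactions, Park's two-temperature model takes the controlling temperature as the geometric mean of the translational and vibrational temperatures and evaluates a modified Arrhenius expression at that temperature. The numerical constants below are illustrative placeholders, not the recommended set from this review.

```python
import math

def controlling_temperature(T_trans, T_vib):
    # Park two-temperature model: geometric mean of translational and
    # vibrational temperatures governs dissociation in nonequilibrium air.
    return math.sqrt(T_trans * T_vib)

def rate_coefficient(A, n, theta, T_control):
    # Modified Arrhenius form: k = A * T^n * exp(-theta / T).
    return A * T_control ** n * math.exp(-theta / T_control)

# Illustrative constants loosely of the form used for N2 dissociation
# (units cm^3 mol^-1 s^-1 assumed; values are placeholders).
A, n, theta = 7.0e21, -1.6, 113200.0
Ta = controlling_temperature(10000.0, 6000.0)
k = rate_coefficient(A, n, theta, Ta)
```

Because the vibrational temperature lags the translational temperature behind a strong shock, evaluating the rate at the geometric mean rather than at the translational temperature alone substantially lowers the predicted dissociation rate.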
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-22
... implement the mandatory graphic warnings required by the Tobacco Control Act. The experimental study data...] Agency Information Collection Activities; Proposed Collection; Comment Request; Experimental Study of... on the Experimental Study of Graphic Cigarette Warning Labels that is being conducted in support of...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
... Request ACTION: 60-day notice of information collection; I-515A; Notice to Student or Exchange Visitor... other forms of information technology, e.g., permitting electronic submission of responses. Overview of... existing information collection. (2) Title of the form/collection: Notice to Student or Exchange Visitor...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-18
...-0057] Agency Information Collection Activities: Form N-600; Extension of an Existing Information Collection; Comment Request ACTION: 60-Day Notice of Information Collection under Review; Form N-600... be evaluating whether to revise the Form N-600. Should USCIS decide to revise Form N-600 we will...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-18
...-0050] Agency Information Collection Activities: Form N-336; Extension of an Existing Information Collection; Comment Request ACTION: 60-Day Notice of Information Collection under Review; Form N-336..., 2010. During this 60 day period, USCIS will be evaluating whether to revise the Form N-336. Should...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
...-0050] Agency Information Collection Activities: Form N-336, Revision to an Existing Information Collection; Comment Request ACTION: 30-Day notice of information collection under review: Form N-336... announcing the extension of the Form N-336. The 60-day notice announced that during the 60-day comment period...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
... DEPARTMENT OF HOMELAND SECURITY United States Immigration and Customs Enforcement Agency Information Collection Activities: Extension, Without Change, of an Existing Information Collection; Comment Request ACTION: 60-Day Notice of Information Collection; File No. I-352, Immigration Bond; OMB Control No. 1653-0022. The Department of Homeland...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
... DEPARTMENT OF HOMELAND SECURITY United States Immigration and Customs Enforcement Agency Information Collection Activities: Extension, Without Change, of an Existing Information Collection; Comment Request ACTION: 30-Day Notice of Information Collection for Review; File No. I-352, Immigration Bond; OMB Control No. 1653-0022. The Department of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... Collection Activities: Form G-639; Extension of an Existing Information Collection; Comment Request ACTION: 60-Day Notice of Information Collection Under Review; Form G-639, Freedom of Information/Privacy Act... Form G-639. Should USCIS decide to revise Form G-639 we will advise the...
Ancient Glass: A Literature Search and its Role in Waste Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strachan, Denis M.; Pierce, Eric M.
2010-07-01
When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent a potential independent experiment dating back approximately 3600 years, or to 1600 before the current era (bce), in the case of ancient glass, and 10^6 years or older in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; without those data, comparison with computer simulations of concentration flux is not possible. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. 
Although useful, the available literature sources do not contain the information needed to simulate the long-term performance of nuclear waste glasses in a near-surface or deep geologic repository. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.
Innate immunity in Alzheimer's disease: the relevance of animal models?
Franco Bocanegra, Diana K; Nicoll, James A R; Boche, Delphine
2018-05-01
The mouse is one of the organisms most widely used as an animal model in biomedical research, due to the particular ease with which it can be handled and reproduced in the laboratory. As a member of the mammalian class, mice share with humans many features regarding metabolic pathways, cell morphology and anatomy. However, important biological differences between mice and humans exist and must be taken into consideration when interpreting research results, to properly translate evidence from experimental studies into information that can be useful for human disease prevention and/or treatment. With respect to Alzheimer's disease (AD), much of the experimental information currently known about this disease has been gathered from studies using mainly mice as models. It is therefore particularly important to fully characterise the differences between mice and humans regarding important aspects of the disease. It is now widely known that inflammation plays an important role in the development of AD, a role that is not only a response to the surrounding pathological environment but rather seems to be strongly implicated in the aetiology of the disease, as indicated by genetic studies. This review highlights relevant differences in inflammation and in microglia, the innate immune cell of the brain, between mice and humans regarding genetics and morphology in normal ageing, and the relationship of microglia with AD-like pathology, the inflammatory profile, and cognition. We conclude that some noteworthy differences exist between mice and humans regarding microglial characteristics, in distribution, gene expression, and states of activation. This may have repercussions in the way that transgenic mice respond to, and influence, the AD-like pathology. 
However, despite these differences, human and mouse microglia also show similarities in morphology and behaviour, such that the mouse is a suitable model for studying the role of microglia, as long as these differences are taken into consideration when delineating new strategies to approach the study of neurodegenerative diseases.
Littel, Marianne; van Schie, Kevin; van den Hout, Marcel A.
2017-01-01
ABSTRACT Background: Eye movement desensitization and reprocessing (EMDR) is an effective psychological treatment for posttraumatic stress disorder. Recalling a memory while simultaneously making eye movements (EM) decreases the memory’s vividness and/or emotionality. It has been argued that non-specific factors, such as treatment expectancy and experimental demand, may contribute to EMDR’s effectiveness. Objective: The present study was designed to test whether expectations about the working mechanism of EMDR would alter the memory-attenuating effects of EM. Two experiments were conducted. In Experiment 1, we examined the effects of pre-existing (non-manipulated) knowledge of EMDR in participants with and without prior knowledge. In Experiment 2, we experimentally manipulated prior knowledge by providing participants without prior knowledge with correct or incorrect information about EMDR’s working mechanism. Method: Participants in both experiments recalled two aversive, autobiographical memories during brief sets of EM (Recall+EM) or while keeping their eyes stationary (Recall Only). Before and after the intervention, participants scored their memories on vividness and emotionality. A Bayesian approach was used to compare two competing hypotheses on the effects of (existing/given) prior knowledge: (1) prior (correct) knowledge increases the effects of Recall+EM vs. Recall Only, vs. (2) prior knowledge does not affect the effects of Recall+EM. Results: Recall+EM caused greater reductions in memory vividness and emotionality than Recall Only in all groups, including the incorrect information group. In Experiment 1, both hypotheses were supported by the data: prior knowledge boosted the effects of EM, but only modestly. In Experiment 2, the second hypothesis was clearly supported over the first: providing knowledge of the underlying mechanism of EMDR did not alter the effects of EM. Conclusions: Recall+EM appears to be quite robust against the effects of prior expectations. 
As Recall+EM is the core component of EMDR, expectancy effects probably contribute little to the effectiveness of EMDR treatment. PMID:29038685
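The abstract does not specify which Bayesian machinery was used to weigh the two hypotheses, so the sketch below shows one generic approach: approximating a Bayes factor from the BIC values of two competing models. All log-likelihoods, parameter counts, and sample sizes are invented for illustration only.

```python
import math

def bic(log_likelihood, k_params, n_obs):
    # Bayesian information criterion: lower values indicate a better
    # trade-off between fit and model complexity.
    return k_params * math.log(n_obs) - 2.0 * log_likelihood

def approx_bayes_factor(bic_h1, bic_h2):
    # BF_12 ~= exp((BIC_2 - BIC_1) / 2); values > 1 favour H1.
    return math.exp((bic_h2 - bic_h1) / 2.0)

# Toy illustration: H2 ("prior knowledge has no effect") fits the data
# almost as well as H1 ("knowledge boosts EM effects") but needs one
# fewer parameter, so the evidence tips modestly toward H2.
bic_h1 = bic(log_likelihood=-120.0, k_params=4, n_obs=50)
bic_h2 = bic(log_likelihood=-121.0, k_params=3, n_obs=50)
bf_21 = 1.0 / approx_bayes_factor(bic_h1, bic_h2)
```

A Bayes factor above 1 in `bf_21` expresses graded support for the no-effect hypothesis rather than a binary accept/reject decision, which is the advantage of the Bayesian comparison the authors exploit.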
NASA Astrophysics Data System (ADS)
Lebedeva, L.; Semenova, O.
2013-12-01
Lack of detailed process-oriented observational data is often cited as one of the major obstacles to further advances in hydrological process understanding and to the development of deterministic models that do not rely on calibration. New sources of hydrological information (satellites, radars, etc.) hold promise for the future but cannot completely replace conventional and experimental observations at the moment. Long-term, data-rich research catchments remain a valuable, if not the only, source of information for the development, verification, regionalization and comparison of different hydrological and environmental models. In the former Soviet Union, a set of more than 20 such basins was operated according to a single observational program from the 1930s–1950s to the 1990s. These research basins, so-called water-balance stations, covered all the main climatic and landscape zones, such as taiga, forest-steppe, steppe, desert, mountains and permafrost regions. Each station conducted a broad range of standard, special and experimental hydrometeorological field studies, including spatially distributed meteorological observations, soil and snow state variables, measurements of groundwater levels, hydrochemistry, evapotranspiration, and discharges in several, often nested, slope- and small-scale watersheds. The data were accompanied by descriptions of observational techniques and landscapes, allowing natural conditions to be linked with dominant hydrological processes. Each station is representative of a larger area, and the results of local studies could be transferred to other basins in similar conditions. Until recently the data existed only as hard copies in Russian, and they are therefore not yet sufficiently explored. We are currently digitizing the main part of the observational and supporting materials and making them available for any scientific purpose via the website http://hydrograph-model.ru/. 
We propose that the hydrological community use these data for comprehensive intercomparison studies of models and their modules, in order to reject inadequate algorithms and advance process understanding and modeling efforts in different environments.
Pre-existing periodontitis exacerbates experimental arthritis in a mouse model.
Cantley, Melissa D; Haynes, David R; Marino, Victor; Bartold, P Mark
2011-06-01
Previous studies have shown a higher incidence of alveolar bone loss in patients with rheumatoid arthritis (RA) and that patients with periodontitis are at a greater risk of developing RA. The aim of this study was to develop an animal model to assess the relationship between pre-existing periodontitis and experimental arthritis (EA). Periodontitis was first induced in mice by oral gavage with Porphyromonas gingivalis followed by EA using the collagen antibody-induced arthritis model. These animals were compared with animals with periodontitis alone, EA alone and no disease (controls). Visual changes in paw swelling were assessed to determine clinical development of EA. Alveolar bone and joint changes were assessed using micro-CT, histological analyses and immunohistochemistry. Serum levels of C-reactive protein were used to monitor systemic inflammation. Mice with pre-existing periodontitis developed more severe arthritis, which developed at a faster rate. Mice with periodontitis only also showed evidence of loss of bone within the radiocarpal joint. There was also evidence of alveolar bone loss in mice with EA alone. The results of this study indicate that pre-existing periodontitis exacerbated experimental arthritis in a mouse model. © 2011 John Wiley & Sons A/S.
An Experimental Study on Strengthening of Reinforced Concrete Flexural Members using Steel Wire Mesh
NASA Astrophysics Data System (ADS)
Al Saadi, Hamza Salim Mohammed; Mohandas, Hoby P.; Namasivayam, Aravind
2017-01-01
One of the major challenges and areas of contemporary research in structural engineering is the strengthening of existing structural elements using materials readily available in the market. Several investigations have been conducted on strengthening various structural components using traditional and advanced materials. Many researchers have tried to enhance the strength of reinforced concrete (RC) beams using steel plates and Glass and Carbon Fibre Reinforced Polymers (GFRP and CFRP). Because of the high weight-to-strength ratio of steel plates and the strength incompatibility between FRP composites and steel bars, these materials are not widely used in practical strengthening works. Hence, in the present work the suitability of using wire mesh for strengthening RC flexural members is studied through experimental work. A new strengthening technique using wire mesh, intended to improve the sectional properties and consequently the flexural strength of RC beams, is adopted in this work. The experimental and theoretical results were compared, and good correlation was found between them. The experimental results indicate that strengthening with steel wire mesh is a simple technique for upgrading existing RC flexural members.
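The gain in flexural strength from added steel area can be illustrated with a standard singly-reinforced-section calculation. The sketch below uses the familiar rectangular-stress-block expression (ACI-style nominal capacity, safety factors omitted) to show how extra steel area, such as that contributed by a bonded wire mesh, raises the moment capacity; all dimensions and material strengths are hypothetical, and the paper's own theoretical analysis may differ.

```python
def moment_capacity(As, fy, fc, b, d):
    """Nominal moment capacity of a singly reinforced rectangular section.

    As: tension steel area (mm^2), fy: steel yield strength (MPa),
    fc: concrete compressive strength (MPa), b: width (mm),
    d: effective depth (mm). Returns capacity in N*mm.
    """
    # Depth of the equivalent rectangular stress block.
    a = As * fy / (0.85 * fc * b)
    # Lever-arm form of the nominal capacity: M = As*fy*(d - a/2).
    return As * fy * (d - a / 2.0)

# Hypothetical beam: unstrengthened vs. 50 mm^2 of extra steel from a mesh.
M0 = moment_capacity(As=400.0, fy=415.0, fc=25.0, b=230.0, d=400.0)
M1 = moment_capacity(As=450.0, fy=415.0, fc=25.0, b=230.0, d=400.0)
gain = (M1 - M0) / M0
```

The relative gain is slightly less than the relative increase in steel area because the deeper stress block shortens the internal lever arm, a trade-off any strengthening design must account for.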