Sample records for "original proposal called"

  1. Social calls provide novel insights into the evolution of vocal learning

    PubMed Central

    Sewall, Kendra B.; Young, Anna M.; Wright, Timothy F.

    2016-01-01

    Learned song is among the best-studied models of animal communication. In oscine songbirds, where learned song is most prevalent, it is used primarily for intrasexual selection and mate attraction. Learning of a different class of vocal signals, known as contact calls, is found in a diverse array of species, where they are used to mediate social interactions among individuals. We argue that call learning provides a taxonomically rich system for studying testable hypotheses for the evolutionary origins of vocal learning. We describe and critically evaluate four nonmutually exclusive hypotheses for the origin and current function of vocal learning of calls, which propose that call learning (1) improves auditory detection and recognition, (2) signals local knowledge, (3) signals group membership, or (4) allows for the encoding of more complex social information. We propose approaches to testing these four hypotheses but emphasize that all of them share the idea that social living, not sexual selection, is a central driver of vocal learning. Finally, we identify future areas for research on call learning that could provide new perspectives on the origins and mechanisms of vocal learning in both animals and humans. PMID:28163325

  2. An Improved Call Admission Control Mechanism with Prioritized Handoff Queuing Scheme for BWA Networks

    NASA Astrophysics Data System (ADS)

    Chowdhury, Prasun; Saha Misra, Iti

    2014-10-01

    Growing demand for Broadband Wireless Access (BWA) networks requires a guaranteed Quality of Service (QoS) to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available, even after the allowable bandwidth degradation of ongoing calls, and are admitted into the network, ahead of newly originated calls, when an ongoing call terminates. An analytical Markov model of the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that the proposed CAC with handoff queuing effectively prioritizes handoff calls and reduces the dropping probability of the system by 78.57% for real-time traffic without increasing the number of failed new-call attempts. This results in increased bandwidth utilization of the network.
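
    The queue-then-promote idea described in this record can be sketched as follows (a minimal illustration under our own simplifying assumptions: one channel per call and no bandwidth-degradation step; the class and method names are ours, not the paper's):

```python
import collections

class PrioritizedCAC:
    """Minimal sketch of a call admission controller that queues
    handoff calls when no bandwidth is free and admits them, ahead
    of new calls, as soon as an ongoing call terminates."""

    def __init__(self, total_channels):
        self.total = total_channels
        self.in_use = 0
        self.handoff_queue = collections.deque()

    def new_call(self):
        # New calls are only admitted if a channel is immediately free.
        if self.in_use < self.total:
            self.in_use += 1
            return "admitted"
        return "blocked"

    def handoff_call(self, call_id):
        # Handoff calls are queued rather than dropped when the cell is full.
        if self.in_use < self.total:
            self.in_use += 1
            return "admitted"
        self.handoff_queue.append(call_id)
        return "queued"

    def call_terminated(self):
        # Freed bandwidth goes to a queued handoff call before any new call.
        self.in_use -= 1
        if self.handoff_queue:
            self.in_use += 1
            return self.handoff_queue.popleft()
        return None
```

    With two channels occupied, a handoff call is queued rather than dropped, a new call is blocked, and the next termination admits the queued handoff first.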

  3. SEE: improving nurse-patient communications and preventing software piracy in nurse call applications.

    PubMed

    Unluturk, Mehmet S

    2012-06-01

    A nurse call system is an electrically operated system by which patients can call for assistance from a bedside station or a duty station. An intermittent tone sounds and a corridor lamp outside the room blinks at a slower or faster rate depending on the call's origin. It is essential to alert nurses promptly so that they can offer care and comfort without delay. Many devices are currently available to improve communication between nurses and patients in a nurse call system, such as pagers, RFID (radio frequency identification) badges and wireless phones. To integrate all these devices into an existing nurse call system and make them communicate with each other, we propose software client applications called bridges in this paper. We also propose a Windows server application called SEE (Supervised Event Executive) that delivers messages among these devices. A single hardware dongle is utilized for authentication and copy protection of SEE. Protecting SEE with the security provided by the dongle alone is a weak defense against hackers. In this paper, we develop several defenses against hackers, such as calculating checksums at runtime, making calls to the dongle from multiple places in the code, and handling errors properly by logging them to a database.
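
    The runtime-checksum defense mentioned in this record can be illustrated with a small sketch (the function names and the SHA-256 choice are our assumptions; the paper's actual implementation is not described in the abstract):

```python
import hashlib

# Baseline digest recorded once, standing in for a build-time constant.
EXPECTED_DIGEST = None

def compute_digest(data: bytes) -> str:
    # SHA-256 of the protected data (e.g. a code segment or file image).
    return hashlib.sha256(data).hexdigest()

def record_baseline(data: bytes) -> None:
    global EXPECTED_DIGEST
    EXPECTED_DIGEST = compute_digest(data)

def verify(data: bytes) -> bool:
    # Calling verify() from several different places in the program makes
    # it harder for an attacker to patch out every check. A real deployment
    # would log failures to a database rather than just returning False.
    return EXPECTED_DIGEST is not None and compute_digest(data) == EXPECTED_DIGEST
```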

  4. Bioelementology as an interdisciplinary integrative approach in life sciences: terminology, classification, perspectives.

    PubMed

    Skalny, Anatoly V

    2011-01-01

    The article presents the proposed concept of bioelements and the basic postulates of bioelementology for assessment and discussion by the scientific community. It is known that chemical elements exist in the organism not by themselves, but in certain species in close interaction with other components. Such units are proposed to be called bioelements: the elementary functioning units of living matter, which are biologically active complexes of chemical elements (as atoms, ions or nanoparticles) with organic compounds of exogenous or biogenous origin. The scientific discipline that studies bioelements is proposed to be called bioelementology. This discipline could lay the foundation for the integration of bioorganic chemistry, bioinorganic chemistry, biophysics, molecular biology and other parts of the life sciences. Copyright © 2010 Elsevier GmbH. All rights reserved.

  5. Call progress time measurement in IP telephony

    NASA Astrophysics Data System (ADS)

    Khasnabish, Bhumip

    1999-11-01

    In IP telephony a voice call is usually established in multiple stages. In the first stage, a phone number is dialed to reach a near-end, call-originating IP-telephony gateway. The next stages involve user identification, by delivering an m-digit user-id to the authentication and/or billing server, and then user authentication using an n-digit PIN. After that, provided authentication is successful, the caller is given a last-stage dial tone and allowed to dial a destination phone number. In this paper, we present a very flexible method for measuring call progress time in IP telephony. The proposed technique can be used to measure the system response time at every stage. It is flexible in that it can easily be modified to include a new `tone' or set of tones, and `voice begin' detection can be used at every stage to detect the system's response. The proposed method has been implemented using scripts written in the Hammer visual basic language for testing with a few commercially available IP telephony gateways.
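
    The per-stage measurement idea can be sketched generically (the stage names and sleep-based stand-ins below are hypothetical; a real harness would drive a gateway and detect tones or `voice begin'):

```python
import time

def measure_stages(stages):
    """Run each (name, action) pair in order and record how long the
    system takes to respond at every stage, mimicking per-stage call
    progress time measurement."""
    timings = {}
    for name, action in stages:
        start = time.perf_counter()
        action()  # e.g. dial gateway, send user-id, send PIN, dial destination
        timings[name] = time.perf_counter() - start
    return timings

# Hypothetical stage actions standing in for real tone/voice detection.
stages = [
    ("dial_gateway", lambda: time.sleep(0.01)),
    ("send_user_id", lambda: time.sleep(0.01)),
    ("send_pin", lambda: time.sleep(0.01)),
    ("dial_destination", lambda: time.sleep(0.01)),
]
timings = measure_stages(stages)
```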

  6. Experiment and the Nature of Quantum Reality.

    ERIC Educational Resources Information Center

    Corwin, T. Mike; Wachowiak, Dale

    1984-01-01

    Although the Einstein-Podolsky-Rosen experiment was originally a hypothetical situation, John Bell was able to apply a version of their argument to an experiment that could actually be done. This experiment (called "Bell's Inequality") and a hypothetical experiment analogous to the one Bell proposed at the atomic level are described. (JN)
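
    As a worked illustration of the kind of test Bell made possible, the CHSH form of Bell's inequality can be checked numerically: any local hidden-variable model requires |S| <= 2, while the quantum singlet-state correlation E(a,b) = -cos(a-b) violates that bound. The angle choices below are the standard ones, offered as our illustration rather than taken from the article:

```python
import math

def E(a, b):
    # Singlet-state correlation between analyzers at angles a and b (radians).
    return -math.cos(a - b)

# Standard CHSH analyzer settings.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
# Local hidden variables force |S| <= 2; quantum mechanics gives
# |S| = 2*sqrt(2) ~= 2.83 for these settings.
```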

  7. Navy-Marine Corps Amphibious and Maritime Prepositioning Ship Programs: Background and Oversight Issues for Congress

    DTIC Science & Technology

    2005-08-02

    called LHD-8 and is also procuring new LPD-17 class amphibious ships. A total of 12 LPD-17s were originally planned , but the FY2006-FY2011 Future Years...Defense Plan (FYDP) proposes reducing that figure to nine, with the final two to be procured in FY2006 and FY2007. The FY2006-FY2011 FYDP also calls...developments have caused the Navy to reconsider its plans for procuring amphibious ships and maritime prepositioning ships. One is a new concept of operations

  8. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities.

    PubMed

    Chen, Yuh-Shyan; Tsai, Yi-Ting

    2018-02-06

    Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods and contributing to innovation in both public and private transportation infrastructures. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The FMCL approach is an integration of follow-me cloud (FMC) and follow-me edge (FME, also called cloudlet). Before a handover, a user equipment (UE) receives data transmitted from the original cloud through the original edge cloud. After the handover, the UE attaches to a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and delivered through the new edge cloud. Existing FMC schemes do not provide VM migration between cloudlets to reduce transmission latency, and existing FME schemes do not provide service migration between data centers for the same purpose. The proposed FMCL approach supports both simultaneously, significantly reducing transmission latency. The proposed mobility management scheme aims to reduce the total transmission time by pre-scheduling and pre-storing some data packets in the cloudlet cache when a UE switches from the previous Fog-RAN to the serving Fog-RAN.
    To illustrate the performance, mathematical analysis and simulation results are examined in terms of total transmission time, throughput, packet loss probability, and the number of control messages.

  9. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities

    PubMed Central

    Tsai, Yi-Ting

    2018-01-01

    Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods and contributing to innovation in both public and private transportation infrastructures. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The FMCL approach is an integration of follow-me cloud (FMC) and follow-me edge (FME, also called cloudlet). Before a handover, a user equipment (UE) receives data transmitted from the original cloud through the original edge cloud. After the handover, the UE attaches to a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and delivered through the new edge cloud. Existing FMC schemes do not provide VM migration between cloudlets to reduce transmission latency, and existing FME schemes do not provide service migration between data centers for the same purpose. The proposed FMCL approach supports both simultaneously, significantly reducing transmission latency. The proposed mobility management scheme aims to reduce the total transmission time by pre-scheduling and pre-storing some data packets in the cloudlet cache when a UE switches from the previous Fog-RAN to the serving Fog-RAN.
    To illustrate the performance, mathematical analysis and simulation results are examined in terms of total transmission time, throughput, packet loss probability, and the number of control messages. PMID:29415510

  10. Mapping replication origins in yeast chromosomes.

    PubMed

    Brewer, B J; Fangman, W L

    1991-07-01

    The replicon hypothesis, first proposed in 1963 by Jacob and Brenner, states that DNA replication is controlled at sites called origins. Replication origins have been well studied in prokaryotes. However, the study of eukaryotic chromosomal origins has lagged behind, because until recently there has been no method for reliably determining the identity and location of origins from eukaryotic chromosomes. Here, we review a technique we developed with the yeast Saccharomyces cerevisiae that allows both the mapping of replication origins and an assessment of their activity. Two-dimensional agarose gel electrophoresis and Southern hybridization with total genomic DNA are used to determine whether a particular restriction fragment acquires the branched structure diagnostic of replication initiation. The technique has been used to localize origins in yeast chromosomes and assess their initiation efficiency. In some cases, origin activation is dependent upon the surrounding context. The technique is also being applied to a variety of eukaryotic organisms.

  11. A Theory of Information Genetics: How Four Subforces Generate Information and the Implications for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2002-01-01

    Proposes a model called information genetics to elaborate on the origins of information generation. Explains conceptual and data models, and describes a software program developed for citation data mining, infomapping, and information repackaging for total quality knowledge management in Web representation. (Contains 112 references.)…

  12. Potential Originality and Effectiveness: The Dynamic Definition of Creativity

    ERIC Educational Resources Information Center

    Corazza, Giovanni Emanuele

    2016-01-01

    Given the central role of creativity in the future post-information society, a pragmatist approach to the study of creativity is advocated, which brings with it recognition of the dynamic nature of this phenomenon. At the foundation of the proposed new theoretical framework lies the definition of creativity itself, which is…

  13. Research Focus: Reflections on Science Education Research Presentations at ASE 2012

    ERIC Educational Resources Information Center

    McGregor, Deb; Oversby, John; Woodhouse, Fiona

    2012-01-01

    The original call for research papers went out in the summer of 2011 and, by September, there were over thirty abstracts returned for review, from many countries including Hong Kong, Nigeria, Poland, Jamaica, Malta, the United States, Japan, Ireland and, of course, Britain. Of the proposals reviewed and accepted, 21 were finally presented at…

  14. Chroma intra prediction based on inter-channel correlation for HEVC.

    PubMed

    Zhang, Xingyu; Gisquet, Christophe; François, Edouard; Zou, Feng; Au, Oscar C

    2014-01-01

    In this paper, we investigate a new inter-channel coding mode, called the LM mode, proposed for the next-generation video coding standard, High Efficiency Video Coding (HEVC). This mode exploits inter-channel correlation by using reconstructed luma to predict chroma linearly, with parameters derived from neighboring reconstructed luma and chroma pixels at both encoder and decoder to avoid overhead signaling. We analyze the LM mode and prove that the LM parameters for predicting original chroma and reconstructed chroma are statistically the same. We also analyze the error sensitivity of the LM parameters. We identify some situations in which the LM mode is problematic and propose three novel LM-like modes, called LMA, LML, and LMO, to address them. To limit the increase in complexity due to the LM-like modes, we propose fast algorithms based on new cost functions. We further identify some potentially problematic conditions in the parameter estimation (including the regression dilution problem) and introduce a novel model correction technique to detect and correct them. Simulation results suggest that considerable BD-rate reduction can be achieved by the proposed LM-like modes and model correction technique. In addition, the performance gains of the two techniques appear to be essentially additive when combined.
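
    The linear chroma-from-luma prediction at the heart of the LM mode can be sketched as an ordinary least-squares fit over neighboring reconstructed pixels (a floating-point sketch; the codec's integer and fixed-point details are omitted, and the function names are ours):

```python
def lm_parameters(luma, chroma):
    """Least-squares fit of chroma ~ alpha * luma + beta from neighbouring
    reconstructed pixels, as in LM-style chroma-from-luma prediction."""
    n = len(luma)
    sum_l = sum(luma)
    sum_c = sum(chroma)
    sum_ll = sum(l * l for l in luma)
    sum_lc = sum(l * c for l, c in zip(luma, chroma))
    denom = n * sum_ll - sum_l * sum_l
    if denom == 0:  # flat luma neighbourhood: fall back to mean chroma
        return 0.0, sum_c / n
    alpha = (n * sum_lc - sum_l * sum_c) / denom
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta

def predict_chroma(rec_luma, alpha, beta):
    # Predict each chroma sample linearly from the co-located reconstructed luma.
    return [alpha * l + beta for l in rec_luma]
```

    Because the parameters are derived from already-reconstructed neighbours, both encoder and decoder can compute them identically, which is what lets the mode avoid signaling overhead.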

  15. Understanding the Physical Nature of Coronal "EIT Waves".

    PubMed

    Long, D M; Bloomfield, D S; Chen, P F; Downs, C; Gallagher, P T; Kwon, R-Y; Vanninathan, K; Veronig, A M; Vourlidas, A; Vršnak, B; Warmuth, A; Žic, T

    2017-01-01

    For almost 20 years the physical nature of globally propagating waves in the solar corona (commonly called "EIT waves") has been controversial and subject to debate. Additional theories have been proposed over the years to explain observations that did not agree with the originally proposed fast-mode wave interpretation. However, the incompatibility of observations made using the Extreme-ultraviolet Imaging Telescope (EIT) onboard the Solar and Heliospheric Observatory with the fast-mode wave interpretation was challenged by differing viewpoints from the twin Solar Terrestrial Relations Observatory spacecraft and data with higher spatial and temporal resolution from the Solar Dynamics Observatory . In this article, we reexamine the theories proposed to explain EIT waves to identify measurable properties and behaviours that can be compared to current and future observations. Most of us conclude that the so-called EIT waves are best described as fast-mode large-amplitude waves or shocks that are initially driven by the impulsive expansion of an erupting coronal mass ejection in the low corona.

  16. The pancreas from Aristotle to Galen.

    PubMed

    Tsuchiya, Ryoichi; Kuroki, Tamotsu; Eguchi, Susumu

    2015-01-01

    The first description of the pancreas in the literature is found in Aristotle's Historia Animalium, but it is qualified by "so-called"; the origin of the term is therefore pursued more extensively here. The Greek-English Lexicon suggests three treatises as possible original sources; these three and Galen's other papers are investigated. In 2005, Sachs et al. suggested that knowledge of the pancreas might have derived from intestinal divination using the avian pancreas, and this report is evaluated. The avian pancreas, an intraperitoneal organ, might have been well known from intestinal divination, and people may have called the organ pankreas or kallikreas. Anatomical dissection of the human body was not accepted before Aristotle's time. "So-called pancreas" in Historia must have been interpolated by Theophrastus, the most faithful and reliable disciple of Aristotle, who succeeded him as head of his school. He and the Macedonian ruler of Egypt, Ptolemy I, knew each other, and there was a strong link between them. The contemporary Herophilus performed many public dissections of both human and animal bodies in Alexandria. He named various parts of the human body and designated the beginning of the intestine the duodenum; yet in his extant works, the pancreas is not found. It is surmised that Herophilus may have been the first to recognize the human pancreas, which is fixed in retroperitoneal tissue, and that he named it "so-called pancreas"; Theophrastus might then have interpolated Herophilus' designation into Historia Animalium. Galen also uses "so-called pancreas" to designate the human pancreas. Galen's description, that Nature created the "so-called pancreas" and spread it beneath all the vessels, is not generally applicable but suggests very rare portal vein anomalies. Since the early years of the 20th century, cases of a preduodenal or prepancreatic portal vein have been reported; although the incidence is very rare, their surgical importance is emphasized.
    Copyright © 2014 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  17. Robust sensorimotor representation to physical interaction changes in humanoid motion learning.

    PubMed

    Shimizu, Toshihiko; Saegusa, Ryo; Ikemoto, Shuhei; Ishiguro, Hiroshi; Metta, Giorgio

    2015-05-01

    This paper proposes a learning-from-demonstration system based on a motion feature called the phase transfer sequence. The system aims to synthesize knowledge of humanoid whole-body motions learned during teacher-supported interactions and to apply this knowledge to different physical interactions between a robot and its surroundings. The phase transfer sequence represents the temporal order of the changing points in multiple time sequences. It encodes the dynamical aspects of the sequences so as to absorb gaps in timing and amplitude caused by interaction changes. The phase transfer sequence was evaluated in reinforcement learning of sitting-up and walking motions conducted by a real humanoid robot and a compatible simulator. In both tasks, the robotic motions were less dependent on physical interactions when learned with the proposed feature than with conventional similarity measurements. The phase transfer sequence also enhanced the convergence speed of motion learning. The proposed feature is original primarily because it absorbs gaps caused by changes in the originally acquired physical interactions, thereby enhancing learning speed in subsequent interactions.
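
    The abstract does not give the feature's exact definition, but its core idea, keeping only the temporal order of change points across sequences, can be sketched as follows (our reconstruction from the abstract, with a hypothetical change threshold, not the authors' exact formulation):

```python
def phase_transfer_sequence(signals, threshold=0.5):
    """Toy phase-transfer-sequence-like feature: record, for each signal,
    the time index of its first large change, then return the signal
    indices ordered by when that change occurs. Discarding exact times
    and amplitudes makes the feature insensitive to gaps in timing and
    amplitude between different physical interactions."""
    change_times = []
    for idx, s in enumerate(signals):
        t = next((i for i in range(1, len(s))
                  if abs(s[i] - s[i - 1]) > threshold), len(s))
        change_times.append((t, idx))
    return [idx for _, idx in sorted(change_times)]
```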

  18. Optimal Golomb Ruler Sequences Generation for Optical WDM Systems: A Novel Parallel Hybrid Multi-objective Bat Algorithm

    NASA Astrophysics Data System (ADS)

    Bansal, Shonak; Singh, Arun Kumar; Gupta, Neena

    2017-02-01

    In real life, multi-objective engineering design problems are tough and time-consuming optimization problems due to their high degree of nonlinearity, complexity and inhomogeneity. Nature-inspired multi-objective optimization algorithms are becoming popular for solving such problems. This paper proposes an original multi-objective Bat algorithm (MOBA) and its extended form, a novel parallel hybrid multi-objective Bat algorithm (PHMOBA), to generate shortest-length Golomb rulers, called optimal Golomb ruler (OGR) sequences, at a reasonable computation time. OGRs find application in optical wavelength division multiplexing (WDM) systems as a channel-allocation algorithm to reduce four-wave mixing (FWM) crosstalk. The performance of both proposed algorithms in generating OGRs for optical WDM channel allocation is compared with existing classical and nature-inspired algorithms, including extended quadratic congruence (EQC), search algorithm (SA), genetic algorithms (GAs), biogeography based optimization (BBO) and big bang-big crunch (BB-BC) optimization. Simulations show that the proposed PHMOBA works more efficiently than the original MOBA and the other existing algorithms in generating OGRs for optical WDM systems, with higher convergence and success rates. The efficiency improvement of PHMOBA in generating OGRs up to 20 marks, in terms of ruler length and total optical channel bandwidth (TBW), is 100%, versus 85% for the original MOBA. Finally, implications for further research are discussed.
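
    The defining property of a Golomb ruler, that all pairwise differences between marks are distinct, is easy to check (a minimal sketch; the known optimal 4-mark ruler [0, 1, 4, 6] has length 6):

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """A ruler is Golomb if every pairwise difference between marks is
    distinct; its length is the largest mark (the first mark is
    conventionally 0)."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

assert is_golomb_ruler([0, 1, 4, 6])       # optimal 4-mark ruler
assert not is_golomb_ruler([0, 1, 2, 4])   # differences 1 and 2 repeat
```

    In the WDM channel-allocation setting, the distinct-differences property is what keeps four-wave mixing products from landing on allocated channels.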

  19. How many rumbles are there? Acoustic variation and individual identity in the rumble vocalizations of African elephants (Loxodonta africana)

    NASA Astrophysics Data System (ADS)

    Soltis, Joseph M.; Savage, Anne; Leong, Kirsten M.

    2004-05-01

    The most commonly occurring elephant vocalization is the rumble, a frequency-modulated call with infrasonic components. Upwards of ten distinct rumble subtypes have been proposed, but little quantitative work on the acoustic properties of rumbles has been conducted. Rumble vocalizations (N=269) from six females housed at Disney's Animal Kingdom were analyzed. Vocalizations were recorded from microphones in collars around the subjects' necks, and rumbles were digitized and measured using SIGNAL software. Sixteen acoustic variables were measured for each call, extracting both source and filter features. Multidimensional scaling analysis indicates that there are no acoustically distinct rumble subtypes, but that there is quantitative variation across rumbles. Discriminant function analysis showed that the acoustic characteristics of rumbles differ across females. A classification success rate of 65% was achieved when assigning unselected rumbles to one of the six females (test set = 64 calls) according to the functions derived from the originally selected calls (training set = 205 calls). The rumble is best viewed as a single call type with graded variation, but information regarding individual identity is encoded in female rumbles.

  20. The activities of eukaryotic replication origins in chromatin.

    PubMed

    Weinreich, Michael; Palacios DeBeer, Madeleine A; Fox, Catherine A

    2004-03-15

    DNA replication initiates at chromosomal positions called replication origins. This review will focus on the activity, regulation and roles of replication origins in Saccharomyces cerevisiae. All eukaryotic cells, including S. cerevisiae, depend on the initiation (activity) of hundreds of replication origins during a single cell cycle for the duplication of their genomes. However, not all origins are identical. For example, there is a temporal order to origin activation with some origins firing early during the S-phase and some origins firing later. Recent studies provide evidence that posttranslational chromatin modifications, heterochromatin-binding proteins and nucleosome positioning can control the efficiency and/or timing of chromosomal origin activity in yeast. Many more origins exist than are necessary for efficient replication. The availability of excess replication origins leaves individual origins free to evolve distinct forms of regulation and/or roles in chromosomes beyond their fundamental role in DNA synthesis. We propose that some origins have acquired roles in controlling chromatin structure and/or gene expression. These roles are not linked obligatorily to replication origin activity per se, but instead exploit multi-subunit replication proteins with the potential to form context-dependent protein-protein interactions.

  1. A Chaotic Ordered Hierarchies Consistency Analysis Performance Evaluation Model

    NASA Astrophysics Data System (ADS)

    Yeh, Wei-Chang

    2013-02-01

    Hierarchies Consistency Analysis (HCA) was proposed by Guh, together with a case study on a resort, to reinforce a weakness of the Analytical Hierarchy Process (AHP). Although the results help the Decision Maker reach more reasonable and rational verdicts, HCA itself is flawed. In this paper, our objective is to point out the problems of HCA and then propose a revised method, called chaotic ordered HCA (COH for short), which avoids those problems. Since COH is based on Guh's method, the Decision Maker establishes decisions in a way similar to that of the original method.
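
    The record does not detail HCA itself, but the AHP consistency check it builds on can be sketched using Saaty's standard consistency ratio (our illustration of the background method, not of HCA or COH):

```python
from math import prod

def consistency_ratio(matrix):
    """Saaty's consistency check for an n x n pairwise comparison matrix:
    derive the priority vector from row geometric means, approximate the
    principal eigenvalue lambda_max from A*w, then compute
    CI = (lambda_max - n) / (n - 1) and CR = CI / RI(n)."""
    n = len(matrix)
    geo_means = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    w = [g / total for g in geo_means]              # priority vector
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n   # approx. lambda_max
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index
    return ci / ri

# A perfectly consistent 3x3 matrix (preference ratios 4:2:1) has CR = 0;
# Saaty's rule of thumb accepts matrices with CR below about 0.1.
cr = consistency_ratio([[1.0, 2.0, 4.0],
                        [0.5, 1.0, 2.0],
                        [0.25, 0.5, 1.0]])
```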

  2. Developmental Origins of Chronic Kidney Disease: Should We Focus on Early Life?

    PubMed Central

    Tain, You-Lin; Hsu, Chien-Ning

    2017-01-01

    Chronic kidney disease (CKD) is becoming a global burden, despite recent advances in management. CKD can begin in early life through so-called “developmental programming” or “developmental origins of health and disease” (DOHaD). Early-life insults cause structural and functional changes in the developing kidney, which is called renal programming. Epidemiological and experimental evidence supports the proposition that early-life adverse events lead to renal programming and make subjects vulnerable to developing CKD and its comorbidities in later life. In addition to low nephron endowment, several mechanisms have been proposed for renal programming. The DOHaD concept opens a new window to offset the programming process in early life to prevent the development of adult kidney disease, namely reprogramming. Here, we review the key themes in the developmental origins of CKD, with particular focus on the following areas: evidence from human studies supporting fetal programming of kidney disease; insights from animal models of renal programming; hypothetical mechanisms of renal programming; alterations of the renal transcriptome in response to early-life insults; and the application of reprogramming interventions to prevent the programming of kidney disease. PMID:28208659

  3. A Group Decision Framework with Intuitionistic Preference Relations and Its Application to Low Carbon Supplier Selection.

    PubMed

    Tong, Xiayu; Wang, Zhou-Jing

    2016-09-19

    This article develops a group decision framework with intuitionistic preference relations. An approach is first devised to rectify an inconsistent intuitionistic preference relation to derive an additive consistent one. A new aggregation operator, the so-called induced intuitionistic ordered weighted averaging (IIOWA) operator, is proposed to aggregate individual intuitionistic fuzzy judgments. By using the mean absolute deviation between the original and rectified intuitionistic preference relations as an order inducing variable, the rectified consistent intuitionistic preference relations are aggregated into a collective preference relation. This treatment is presumably able to assign different weights to different decision-makers' judgments based on the quality of their inputs (in terms of consistency of their original judgments). A solution procedure is then developed for tackling group decision problems with intuitionistic preference relations. A low carbon supplier selection case study is developed to illustrate how to apply the proposed decision model in practice.
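
    The induced-ordering idea behind the IIOWA operator can be sketched with a generic induced OWA (the numbers below are hypothetical; using mean absolute deviation as the order-inducing variable, as the article describes, means the most consistent judgments receive the leading weights):

```python
def induced_owa(pairs, weights):
    """Induced ordered weighted averaging: each argument carries an
    order-inducing value; arguments are sorted by that value (here
    ascending, so lower inconsistency comes first) and then combined
    with the OWA weight vector."""
    if len(pairs) != len(weights):
        raise ValueError("one weight per argument is required")
    ordered = sorted(pairs, key=lambda p: p[0])  # sort by inducing variable
    return sum(w * value for w, (_, value) in zip(weights, ordered))

# Hypothetical example: three experts' scores, induced by the mean
# absolute deviation of their judgments (lower = more consistent),
# aggregated with decreasing weights so consistent input counts more.
pairs = [(0.30, 0.6), (0.10, 0.9), (0.20, 0.7)]  # (inconsistency, score)
weights = [0.5, 0.3, 0.2]
aggregate = induced_owa(pairs, weights)
```

    Here the expert with inconsistency 0.10 receives weight 0.5, the 0.20 expert weight 0.3, and the 0.30 expert weight 0.2, which is the "weight by input quality" behaviour the article describes.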

  4. A Group Decision Framework with Intuitionistic Preference Relations and Its Application to Low Carbon Supplier Selection

    PubMed Central

    Tong, Xiayu; Wang, Zhou-Jing

    2016-01-01

    This article develops a group decision framework with intuitionistic preference relations. An approach is first devised to rectify an inconsistent intuitionistic preference relation to derive an additive consistent one. A new aggregation operator, the so-called induced intuitionistic ordered weighted averaging (IIOWA) operator, is proposed to aggregate individual intuitionistic fuzzy judgments. By using the mean absolute deviation between the original and rectified intuitionistic preference relations as an order inducing variable, the rectified consistent intuitionistic preference relations are aggregated into a collective preference relation. This treatment is presumably able to assign different weights to different decision-makers’ judgments based on the quality of their inputs (in terms of consistency of their original judgments). A solution procedure is then developed for tackling group decision problems with intuitionistic preference relations. A low carbon supplier selection case study is developed to illustrate how to apply the proposed decision model in practice. PMID:27657097

  5. A self-organizing learning account of number-form synaesthesia.

    PubMed

    Makioka, Shogo

    2009-09-01

    Some people automatically and involuntarily "see" mental images of numbers in spatial arrays when they think of numbers. This phenomenon, called number forms, shares three key characteristics with other types of synaesthesia: within-individual consistency, between-individual variety, and a mixture of regularity and randomness. A theoretical framework called SOLA (self-organizing learning account of number forms) is proposed, which explains the generation process of number forms and the origin of these three characteristics. The simulations replicated the qualitative properties of the shapes of number forms: that numbers are aligned in order of size, that discontinuity usually occurs at the point of carry, and that continuous lines tend to have many bends.

  6. Progress Report on Landing Site Evaluation for the Next Japanese Lunar Exploration Project: SELENE-2

    NASA Astrophysics Data System (ADS)

    Saiki, K.; Arai, T.; Araki, H.; Ishihara, Y.; Ohtake, M.; Karouji, Y.; Kobayashi, N.; Sugihara, T.; Haruyama, J.; Honda, C.

    2010-12-01

    SELENE-2 is the next Japanese lunar exploration project, planned for launch by the end of fiscal year 2015. In order to select landing site candidates that maximize the scientific return from the project, the "SELENE-2 Landing Site Research Board" was organized in March 2010. The board called for scientific proposals with landing site candidates from domestic researchers interested in lunar science and from members of the Japanese Society for Planetary Sciences, the Japan Association of Mineralogical Sciences, the Geochemical Society of Japan, the Seismological Society of Japan, and the Geodetic Society of Japan. At present, we have 35 scientific proposals with over 70 landing site candidates, submitted by 21 groups. The proposals were categorized into nine research subjects as follows: 1) Identification of mantle materials, 2) Temporal variation of igneous activity and thermal history of the Moon, 3) Lava morphology, 4) Origin of swirls, 5) Crater formation mechanism, 6) Core size, 7) Internal structure (crust and mantle), 8) Origin of the region enriched in heat-source elements, and 9) Origin of the highland crust. We are evaluating the proposals with their landing sites and discussing the scientific targets of SELENE-2. Within six months, we will propose several model missions that would execute the scientific exploration of highest priority today. In our presentation, the present landing site candidates, the policy of the selection, and a plan for the further landing site selection process will be shown.

  7. Modeling decoherence with qubits

    NASA Astrophysics Data System (ADS)

    Heusler, Stefan; Dür, Wolfgang

    2018-03-01

    Quantum effects like the superposition principle contradict our experience of daily life. Decoherence can be viewed as a possible explanation why we do not observe quantum superposition states in the macroscopic world. In this article, we use the qubit ansatz to discuss decoherence in the simplest possible model system and propose a visualization for the microscopic origin of decoherence, and the emergence of a so-called pointer basis. Finally, we discuss the possibility of ‘macroscopic’ quantum effects.

  8. Automatic luminous reflections detector using global threshold with increased luminosity contrast in images

    NASA Astrophysics Data System (ADS)

    Silva, Ricardo Petri; Naozuka, Gustavo Taiji; Mastelini, Saulo Martiello; Felinto, Alan Salvany

    2018-01-01

    The incidence of luminous reflections (LR) in captured images can interfere with the color of the affected regions. These regions tend to oversaturate, becoming whitish and, consequently, losing the original color information of the scene. Decision processes that employ images acquired from digital cameras can be impaired by LR incidence; such applications include real-time video surgeries and facial and ocular recognition. This work proposes an algorithm called contrast enhancement of potential LR regions, a preprocessing step that increases the contrast of potential LR regions in order to improve the performance of automatic LR detectors. In addition, three automatic detectors were compared with and without our preprocessing method. The first is a technique already consolidated in the literature, the Chang-Tseng threshold; we propose the other two, called adapted histogram peak and global threshold. We employed four performance metrics to evaluate the detectors: accuracy, precision, exactitude, and root mean square error. The exactitude metric was developed in this work, using a manually defined reference model. The global threshold detector combined with our preprocessing method presented the best results, with an average exactitude rate of 82.47%.

  9. ConformRank: A conformity-based rank for finding top-k influential users

    NASA Astrophysics Data System (ADS)

    Wang, Qiyao; Jin, Yuehui; Cheng, Shiduan; Yang, Tan

    2017-05-01

    Finding influential users is a hot topic in social networks; advertisers, for example, identify influential users to run successful campaigns. Retweeters forward messages published by original users, an action referred to as retweeting. Retweeting behaviors generate influence: original users have influence on retweeters. Whether retweeters keep the same sentiment as the original users is taken into consideration in this study, and influence is calculated based on conformity, from an emotional perspective, after retweeting. A conformity-based algorithm, called ConformRank, is proposed to find the top-k influential users, i.e., those who lead the most users to keep the same sentiment after retweeting their messages. Emotional conformity is introduced to denote how users conform to original users from the emotional perspective, and conforming weights are introduced to denote how two users keep the same sentiment after retweeting. Emotional conformity is applied to users and conforming weights to relations. Experiments were conducted on Sina Weibo. Experimental results show that users have larger influence when they publish positive messages.
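    As a rough illustration of the idea only (not the paper's actual formulation, which weights users by emotional conformity and relations by conforming weights), a top-k ranking by the number of retweeters who kept the original sentiment might look like:

    ```python
    from collections import defaultdict

    def conform_rank(retweets, k):
        # retweets: iterable of (original_user, kept_same_sentiment) pairs.
        # Score each original user by how many retweeters preserved the
        # sentiment of the original message, then return the top k users.
        scores = defaultdict(int)
        for user, kept in retweets:
            scores[user] += 1 if kept else 0
        return sorted(scores, key=lambda u: scores[u], reverse=True)[:k]

    log = [("alice", True), ("alice", True), ("bob", False), ("bob", True)]
    top = conform_rank(log, k=1)  # alice keeps more retweeters on-sentiment
    ```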

  10. Dispatching function calls across accelerator devices

    DOEpatents

    Jacob, Arpith C.; Sallenave, Olivier H.

    2017-01-10

    In one embodiment, a computer-implemented method for dispatching a function call includes receiving, at a supervisor processing element (PE) and from an origin PE, an identifier of a target device, a stack frame of the origin PE, and an address of a function called from the origin PE. The supervisor PE allocates a target PE of the target device. The supervisor PE copies the stack frame of the origin PE to a new stack frame on a call stack of the target PE. The supervisor PE instructs the target PE to execute the function. The supervisor PE receives a notification that execution of the function is complete. The supervisor PE copies the stack frame of the target PE to the stack frame of the origin PE. The supervisor PE releases the target PE of the target device. The supervisor PE instructs the origin PE to resume execution of the program.
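    The dispatch sequence in the claim can be mimicked with a schematic host-side sketch; the PE pool, the frame representation, and the function signature below are all illustrative assumptions, not the patented implementation.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class PE:
        stack: list = field(default_factory=list)   # call stack of frames (dicts)

    def dispatch(pool, origin, func, target_device):
        target = pool[target_device].pop()           # allocate a target PE
        target.stack.append(dict(origin.stack[-1]))  # copy origin's frame over
        func(target.stack[-1])                       # execute on the target PE
        origin.stack[-1] = dict(target.stack.pop())  # copy the frame back
        pool[target_device].append(target)           # release the target PE
        # the origin PE resumes execution here

    origin = PE(stack=[{"x": 2}])
    pool = {"accelerator": [PE()]}
    dispatch(pool, origin, lambda frame: frame.update(x=frame["x"] * 2),
             "accelerator")
    ```

    After the call, the origin PE's frame holds the result computed on the target device, and the target PE is back in the free pool.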

  11. Dispatching function calls across accelerator devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacob, Arpith C.; Sallenave, Olivier H.

    In one embodiment, a computer-implemented method for dispatching a function call includes receiving, at a supervisor processing element (PE) and from an origin PE, an identifier of a target device, a stack frame of the origin PE, and an address of a function called from the origin PE. The supervisor PE allocates a target PE of the target device. The supervisor PE copies the stack frame of the origin PE to a new stack frame on a call stack of the target PE. The supervisor PE instructs the target PE to execute the function. The supervisor PE receives a notification that execution of the function is complete. The supervisor PE copies the stack frame of the target PE to the stack frame of the origin PE. The supervisor PE releases the target PE of the target device. The supervisor PE instructs the origin PE to resume execution of the program.

  12. The fluid mechanics of the inner-ear disorder BPPV

    NASA Astrophysics Data System (ADS)

    Weidman, Michael; Squires, Todd; Stone, Howard

    2001-11-01

    The inner ear of mammals contains fluid-filled semi-circular canals with a flexible sensory membrane (called a cupula) which detects rotational acceleration. Benign Paroxysmal Positional Vertigo (BPPV) is one of the most common disorders of this system diagnosed today, and is characterized by symptoms of dizziness and nausea brought on by sudden changes in head orientation. BPPV is believed to have a mechanical (rather than nervous) origin, in which dense particles called otoconia settle into the canals and trigger false sensations of rotational acceleration. Several qualitative mechanisms have been proposed by the medical community, which we examine from a fluid mechanical standpoint. Traditionally, the semicircular canal and the cupula are modeled as an over-damped torsional pendulum with a driving force provided by rotational acceleration. We extend this model to include the time-dependent mechanical response owing to sedimentation of the otoconia. We make qualitative and quantitative predictions associated with the proposed mechanisms, with an eye towards differentiating between them and perhaps towards more effective diagnostic and therapeutic methods.
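    The over-damped torsional-pendulum model mentioned above can be integrated numerically. The sketch below is a minimal illustration: the parameter values and the forward-Euler scheme are assumptions, and the otoconia sedimentation enters only as a generic driving torque rather than the paper's time-dependent mechanical response.

    ```python
    def simulate_cupula(torque, k=1.0, c=10.0, dt=1e-3, steps=5000):
        # Over-damped torsional pendulum: c * dtheta/dt = -k * theta + torque(t).
        # The inertial term is neglected (over-damped limit); theta is the
        # cupula deflection and torque(t) models the rotational forcing.
        theta, history = 0.0, []
        for n in range(steps):
            theta += dt * (torque(n * dt) - k * theta) / c
            history.append(theta)
        return history

    # A constant torque (e.g. sustained rotation) relaxes the deflection
    # toward its steady state theta = torque / k with time constant c / k.
    trace = simulate_cupula(lambda t: 1.0)
    ```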

  13. Department of Defense Report on Search for Human Radiation Experiment Records, 1944 - 1994, Volume 2

    DTIC Science & Technology

    1997-06-01

    1-66 Serial investigation of a variety of congenital deformities of the brain case and facial skeleton and the response to treatment ...subsequent treatment of cleft lip and palate defects. The original proposal called for 100 patients and a comparison group of 100 "normals." To date, no...brain case and facial skeleton and the response to treatment . Document Type: Protocol. Document Date: 12 January 1966 Authors: Guy C. Nicholson; Thomas

  14. Speech-Like Rhythm in a Voiced and Voiceless Orangutan Call

    PubMed Central

    Lameira, Adriano R.; Hardus, Madeleine E.; Bartlett, Adrian M.; Shumaker, Robert W.; Wich, Serge A.; Menken, Steph B. J.

    2015-01-01

    The evolutionary origins of speech remain obscure. Recently, it was proposed that speech derived from monkey facial signals which exhibit a speech-like rhythm of ∼5 open-close lip cycles per second. In monkeys, these signals may also be vocalized, offering a plausible evolutionary stepping stone towards speech. Three essential predictions remain, however, to be tested to assess this hypothesis's validity: (i) great apes, our closest relatives, should likewise produce 5 Hz-rhythm signals; (ii) speech-like rhythm should involve calls articulatorily similar to consonants and vowels, given that speech rhythm is the direct product of stringing together these two basic elements; and (iii) speech-like rhythm should be experience-based. Via cinematic analyses we demonstrate that an ex-entertainment orangutan produces two calls at a speech-like rhythm, coined "clicks" and "faux-speech." Like voiceless consonants, clicks required no vocal fold action, but did involve independent manoeuvring over lips and tongue. In parallel to vowels, faux-speech showed harmonic and formant modulations, implying vocal fold and supralaryngeal action. This rhythm was several times faster than orangutan chewing rates, as observed in monkeys and humans. Critically, this rhythm was seven-fold faster than, and contextually distinct from, any other known rhythmic calls described to date in the largest database of the orangutan repertoire ever assembled. The first two predictions advanced by this study are validated and, based on parsimony and exclusion of potential alternative explanations, initial support is given to the third prediction. Irrespective of the putative origins of these calls and underlying mechanisms, our findings demonstrate irrevocably that great apes are not respiratorily, articulatorily, or neurologically constrained for the production of consonant- and vowel-like calls at speech rhythm. Orangutan clicks and faux-speech confirm the importance of rhythmic speech antecedents within the primate lineage, and highlight potential articulatory homologies between great ape calls and human consonants and vowels. PMID:25569211

  15. Emergence of multilateral proto-institutions in global health and new approaches to governance: analysis using path dependency and institutional theory.

    PubMed

    Gómez, Eduardo J; Atun, Rifat

    2013-05-10

    The role of multilateral donor agencies in global health is a new area of research, with limited research on how these agencies differ in terms of their governance arrangements, especially in relation to transparency, inclusiveness, accountability, and responsiveness to civil society. We argue that historical analysis of the origins of these agencies and their coalition formation processes can help to explain these differences. We propose an analytical approach that links the theoretical literature on institutional origins to path dependency and institutional theory relating to proto-institutions in order to illustrate the differences in coalition formation processes that shape governance within four multilateral agencies involved in global health. We find that two new multilateral donor agencies created by a diverse coalition of state and non-state actors, the Global Fund to Fight AIDS, Tuberculosis and Malaria and GAVI, which we call proto-institutions, were more adaptive in strengthening their governance processes. This contrasts with two well-established multilateral donor agencies, the World Bank and the Asian Development Bank, which we call Bretton Woods (BW) institutions, which were created by nation states alone and hence have different origins and consequently different path-dependent processes.

  16. Emergence of multilateral proto-institutions in global health and new approaches to governance: analysis using path dependency and institutional theory

    PubMed Central

    2013-01-01

    The role of multilateral donor agencies in global health is a new area of research, with limited research on how these agencies differ in terms of their governance arrangements, especially in relation to transparency, inclusiveness, accountability, and responsiveness to civil society. We argue that historical analysis of the origins of these agencies and their coalition formation processes can help to explain these differences. We propose an analytical approach that links the theoretical literature on institutional origins to path dependency and institutional theory relating to proto-institutions in order to illustrate the differences in coalition formation processes that shape governance within four multilateral agencies involved in global health. We find that two new multilateral donor agencies created by a diverse coalition of state and non-state actors, the Global Fund to Fight AIDS, Tuberculosis and Malaria and GAVI, which we call proto-institutions, were more adaptive in strengthening their governance processes. This contrasts with two well-established multilateral donor agencies, the World Bank and the Asian Development Bank, which we call Bretton Woods (BW) institutions, which were created by nation states alone and hence have different origins and consequently different path-dependent processes. PMID:23663485

  17. A philosophical taxonomy of ethically significant moral distress.

    PubMed

    Thomas, Tessy A; McCullough, Laurence B

    2015-02-01

    Moral distress is one of the core topics of clinical ethics. Although there is a large and growing empirical literature on the psychological aspects of moral distress, scholars and empirical investigators of moral distress have recently called for greater conceptual clarity. To meet this recognized need, we provide a philosophical taxonomy of the categories of what we call ethically significant moral distress: the judgment that one is not able, to differing degrees, to act on one's moral knowledge about what one ought to do. We begin by unpacking the philosophical components of Andrew Jameton's original formulation from his landmark 1984 work and identify two key respects in which that formulation remains unclear: the origins of moral knowledge and the impediments to acting on that moral knowledge. We then selectively review subsequent literature showing that there is more than one concept of moral distress, and exploring the origin of the values implicated in moral distress and the impediments to acting on those values. This review sets the stage for identifying the elements of a philosophical taxonomy of ethically significant moral distress. The taxonomy uses these elements to create six categories of ethically significant moral distress: challenges to, threats to, and violations of professional integrity; and challenges to, threats to, and violations of individual integrity. We close with suggestions about how the proposed philosophical taxonomy sheds light on the concepts of moral residue and the crescendo effect of moral distress, and how it might usefully guide prevention of, and future qualitative and quantitative empirical research on, ethically significant moral distress. © The Author 2014. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. A cryptographic hash function based on chaotic network automata

    NASA Astrophysics Data System (ADS)

    Machicao, Jeaneth; Bruno, Odemir M.

    2017-12-01

    Chaos theory has been used to develop several cryptographic methods relying on the pseudo-random properties extracted from simple nonlinear systems such as cellular automata (CA). Cryptographic hash functions (CHF) are commonly used to check data integrity. A CHF "compresses" an arbitrarily long message (input) into a much smaller representation called a hash value or message digest (output), designed to prevent recovery of the original message from the hash value. This paper proposes a chaos-based CHF inspired by an encryption method based on the chaotic CA rule B1357-S2468. We propose a hybrid model that combines CA and networks, called chaotic network automata (CNA), whose chaotic spatio-temporal outputs are used to compute a hash value. Following the Merkle-Damgård model of construction, a portion of the message is entered as the initial condition of the network automata, and the remaining parts of the message are entered iteratively to perturb the system. The chaotic network automata shuffle the message using flexible control parameters, so that the generated hash value is highly sensitive to the message. As demonstrated in our experiments, the proposed model has excellent pseudo-randomness and sensitivity properties, with acceptable performance compared to conventional hash functions.
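    A toy version of this style of construction can be sketched with an elementary cellular automaton standing in for the paper's rule B1357-S2468 and network topology. The rule choice, state size, and round count below are assumptions for illustration only, and the result is in no way cryptographically secure.

    ```python
    def ca_step(state, rule=30):
        # One synchronous update of an elementary CA on a ring; the rule
        # number's bits give the output for each 3-cell neighborhood.
        n = len(state)
        return [(rule >> ((state[i - 1] << 2) | (state[i] << 1)
                          | state[(i + 1) % n])) & 1 for i in range(n)]

    def toy_ca_hash(message: bytes, size=64, rounds=16):
        # Merkle-Damgard-style absorption: XOR each message byte into the
        # state, then iterate the chaotic CA to diffuse the perturbation.
        state = [(37 * i + 11) % 2 for i in range(size)]  # fixed IV
        for byte in message:
            for b in range(8):
                state[b] ^= (byte >> b) & 1
            for _ in range(rounds):
                state = ca_step(state)
        return int("".join(map(str, state)), 2)
    ```

    Flipping one input bit perturbs the CA state, and the repeated chaotic updates spread that perturbation across the whole digest.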

  19. A Proposed Neurological Interpretation of Language Evolution.

    PubMed

    Ardila, Alfredo

    2015-01-01

    Since the very beginning of the history of aphasia, it has been well established that there are two major aphasic syndromes (Wernicke's-type and Broca's-type aphasia); each is related to a disturbance at a specific linguistic level (lexical/semantic and grammatical) and associated with a particular localization of brain damage (temporal and frontal-subcortical). It is proposed that three stages in language evolution can be distinguished: (a) primitive communication systems similar to those observed in other animals, including nonhuman primates; (b) initial communication systems using sound combinations (lexicon) but without relationships among the elements (grammar); and (c) advanced communication systems including word combinations (grammar). It is proposed that grammar probably originated from the internal representation of actions, resulting in the creation of verbs; this ability depends on the so-called Broca's area and related brain networks. It is suggested that grammar is the basic ability underlying the development of so-called metacognitive executive functions. It is concluded that while the lexical/semantic language system (vocabulary) probably appeared during human evolution long before contemporary humans (Homo sapiens sapiens), grammatical language represents a historically recent acquisition and is correlated with the development of complex cognition (metacognitive executive functions).

  20. A Proposed Neurological Interpretation of Language Evolution

    PubMed Central

    2015-01-01

    Since the very beginning of the history of aphasia, it has been well established that there are two major aphasic syndromes (Wernicke's-type and Broca's-type aphasia); each is related to a disturbance at a specific linguistic level (lexical/semantic and grammatical) and associated with a particular localization of brain damage (temporal and frontal-subcortical). It is proposed that three stages in language evolution can be distinguished: (a) primitive communication systems similar to those observed in other animals, including nonhuman primates; (b) initial communication systems using sound combinations (lexicon) but without relationships among the elements (grammar); and (c) advanced communication systems including word combinations (grammar). It is proposed that grammar probably originated from the internal representation of actions, resulting in the creation of verbs; this ability depends on the so-called Broca's area and related brain networks. It is suggested that grammar is the basic ability underlying the development of so-called metacognitive executive functions. It is concluded that while the lexical/semantic language system (vocabulary) probably appeared during human evolution long before contemporary humans (Homo sapiens sapiens), grammatical language represents a historically recent acquisition and is correlated with the development of complex cognition (metacognitive executive functions). PMID:26124540

  1. Navy-Marine Corps Amphibious and Maritime Prepositioning Ship Programs: Background and Oversight Issues for Congress

    DTIC Science & Technology

    2005-05-31

    building a new amphibious assault ship called LHD-8 and is also procuring new LPD-17 class amphibious ships. A total of 12 LPD-17s were originally planned ...but the FY2006-FY2011 Future Years Defense Plan (FYDP) proposes reducing that figure to nine, with the final two to be procured in FY2006 and FY2007...Three developments have caused the Navy to reconsider its plans for procuring amphibious ships, maritime prepositioning ships, and connector ships

  2. Induction of relaxor state in ordinary ferroelectrics by isovalent ion substitution: A pretransitional martensitic texture case

    NASA Astrophysics Data System (ADS)

    Lente, M. H.; Moreira, E. N.; Garcia, D.; Eiras, J. A.; Neves, P. P.; Doriguetto, A. C.; Mastelaro, V. R.; Mascarenhas, Y. P.

    2006-02-01

    The understanding of the structural origin of relaxor ferroelectrics has doubtlessly been a long-standing puzzle in the field of ferroelectricity. Motivated by the interest in improving the comprehension of this important issue, a framework is proposed for explaining the origin of the relaxor state in ordinary ferroelectrics induced via isovalent-ion substitution. Based on martensitic transformation concepts, it is proposed that the continuous addition of isovalent ions to a so-called normal ferroelectric considerably decreases the elastic strain energy. This results in a gradual transformation of ferroelectric domain patterns from a micrometer polydomain structure (twins), through single domains, to nanometer polar-"tweed" structures with glasslike behavior, which are, in turn, strongly driven by point defects and surface effects. The electrical interaction between these weakly coupled polar-tweed structures leads to a wide spectrum of relaxation times, thus resulting in a dielectric relaxation process, the signature of relaxor ferroelectrics.

  3. An approach to efficient mobility management in intelligent networks

    NASA Technical Reports Server (NTRS)

    Murthy, K. M. S.

    1995-01-01

    Providing personal communication systems that support full mobility requires intelligent networks for tracking mobile users and facilitating outgoing and incoming calls over different physical and network environments. Databases play a major role in realizing the intelligent network functionalities. Currently proposed network architectures envision using the SS7-based signaling network for linking these DBs and for interconnecting DBs with switches. If the network is to support ubiquitous, seamless mobile services, it must additionally support the mobile application parts, viz., mobile-origination calls, mobile-destination calls, mobile location updates, and inter-switch handovers. These functions will generate a significant amount of data and require it to be transferred between databases (HLR, VLR) and switches (MSCs) very efficiently. In the future, users (fixed or mobile) may use and communicate with sophisticated CPEs (e.g., multimedia, multipoint, and multisession calls), which may require complex signaling functions. This will generate voluminous service-handling data and require efficient transfer of these messages between databases and switches. Consequently, network providers would be able to add new services and capabilities to their networks incrementally, quickly, and cost-effectively.

  4. A discrete trinomial model for the birth and death of stock financial bubbles

    NASA Astrophysics Data System (ADS)

    Di Persio, Luca; Guida, Francesco

    2017-11-01

    The present work proposes a novel way to model the dynamics of financial bubbles. In particular, we exploit the so-called trinomial tree technique, which is mainly inspired by the typical market order book (MOB) structure. Following the typical MOB rules, we use a bottom-up approach to derive the relevant generator process for the financial quantities characterizing the market under consideration. Our proposal takes into account real-world changes in the probability levels characterizing bid-ask preferences, focusing on market movements. In particular, we show that financial bubbles originate from these movements, which also act to amplify their growth.
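    A single step of a trinomial tree can be sketched as follows; the move factor and branch probabilities are placeholder values, and the order-book-driven probability updates described in the abstract are not modeled.

    ```python
    def trinomial_step(dist, u=1.02, p_up=0.3, p_mid=0.4, p_down=0.3):
        # dist maps a price level to its probability. Each level branches
        # into up (factor u), middle (unchanged), and down (factor 1/u)
        # nodes with the given branch probabilities, which must sum to 1.
        out = {}
        for price, prob in dist.items():
            for factor, p in ((u, p_up), (1.0, p_mid), (1.0 / u, p_down)):
                key = round(price * factor, 8)  # merge recombining nodes
                out[key] = out.get(key, 0.0) + prob * p
        return out

    step1 = trinomial_step({100.0: 1.0})  # three nodes after one step
    ```

    Iterating the step evolves the whole price distribution; biasing p_up versus p_down is the kind of probability-level change through which bubble-like drifts can be generated.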

  5. Shen-Jing as a Chinese medicine concept might be a counterpart of stem cells in regenerative medicine.

    PubMed

    Ren, Yan-Bo; Huang, Jian-Hua; Cai, Wai-Jiao; Shen, Zi-Yin

    2015-07-04

    As the epitome of modern regenerative medicine, stem cells were proposed in the basic sense no more than 200 years ago. However, the concept of "stem cells" existed long before the modern medical description. The hypothesis that all things, including our sentient body, were generated from a small origin was shared between Western and Chinese people. The ancient Chinese philosophers considered Jing (also known as essence) to be the origin of life. In Chinese medicine (CM), Jing is mainly stored in the Kidney (Shen) as the so-called Shen-Jing (Kidney essence). Here, we propose that Shen-Jing is the CM term used to express the meaning of "origin and regeneration". This theoretical discovery has at least two applications. First, the actions causing Shen-Jing deficiency, such as excess sexual intercourse, chronic diseases, and aging, might damage the function of stem cells. Second, a large number of Chinese herbs with Shen-Jing-nourishing efficacy have been proven to affect stem cell proliferation and differentiation. Therefore, if Shen-Jing in CM is equivalent to stem cells in regenerative medicine, highly effective modulators of stem-cell behavior would be expected among Kidney-tonifying herbs.

  6. On the Origin of Hyperfast Neutron Stars

    NASA Astrophysics Data System (ADS)

    Gvaramadze, V. V.; Gualandris, A.; Portegies Zwart, S.

    2008-05-01

    We propose an explanation for the origin of hyperfast neutron stars (e.g., PSR B1508+55, PSR B2224+65, RX J0822-4300) based on the hypothesis that they could be the remnants of a symmetric supernova explosion of a high-velocity massive star (or its helium core) which attained its peculiar velocity (similar to that of the neutron star) in the course of a strong three- or four-body dynamical encounter in the core of a young massive star cluster. This hypothesis implies that the dense cores of star clusters (located either in the Galactic disk or near the Galactic centre) could also produce the so-called hypervelocity stars: ordinary stars moving with a speed of ~1000 km s-1.

  7. Autocatalytic sets and chemical organizations: modeling self-sustaining reaction networks at the origin of life

    NASA Astrophysics Data System (ADS)

    Hordijk, Wim; Steel, Mike; Dittrich, Peter

    2018-01-01

    Two related but somewhat different approaches have been proposed to formalize the notion of a self-sustaining chemical reaction network. One is the notion of collectively autocatalytic sets, formalized as RAF theory, and the other is chemical organization theory. Both formalisms have been argued to be relevant to the origin of life. RAF sets and chemical organizations are defined differently, but previously some relationships between the two have been shown. Here, we refine and explore these connections in more detail. In particular, we show that so-called closed RAFs are chemical organizations, but that the converse is not necessarily true. We then introduce and apply a procedure to show how chemical organizations can be used to find all closed RAFs within any chemical reaction system. We end with a discussion of why and how closed RAFs could be important in the context of the origin and early evolution of life.

  8. Dynamic frame resizing with convolutional neural network for efficient video compression

    NASA Astrophysics Data System (ADS)

    Kim, Jaehwan; Park, Youngo; Choi, Kwang Pyo; Lee, JongSeok; Jeon, Sunyoung; Park, JeongHoon

    2017-09-01

    In the past, video codecs such as VC-1 and H.263 encoded reduced-resolution video and restored the original resolution at the decoder to improve coding efficiency. The techniques of VC-1 and H.263 Annex Q are called dynamic frame resizing and reduced-resolution update mode, respectively. However, these techniques have not been widely used due to limited performance improvements, operating well only under specific conditions. In this paper, a video frame resizing (reduction/restoration) technique based on machine learning is proposed to improve coding efficiency. In the proposed method, a convolutional neural network (CNN) produces low-resolution video in the encoder, and another CNN reconstructs the original resolution in the decoder. The proposed method shows improved subjective performance over all of the high-resolution videos that are dominantly consumed today. To assess the subjective quality of the proposed method, Video Multi-method Assessment Fusion (VMAF), which has shown high reliability among subjective measurement tools, was used as the subjective metric; moreover, diverse bitrates were tested to assess general performance. Experimental results showed that the BD-rate based on VMAF improved by about 51% compared to conventional HEVC, with VMAF values improving significantly at low bitrates. In subjective testing, the method also produced better visual quality at similar bitrates.

  9. Principle of the electrically induced Transient Current Technique

    NASA Astrophysics Data System (ADS)

    Bronuzzi, J.; Moll, M.; Bouvet, D.; Mapelli, A.; Sallese, J. M.

    2018-05-01

    In the field of detector development for High Energy Physics, the so-called Transient Current Technique (TCT) is used to characterize the electric field profile and charge trapping inside silicon radiation detectors, in which particles or photons create electron-hole pairs in the bulk of a semiconductor device such as a PiN diode. In the standard approach, the TCT signal originates from free carriers generated close to the surface of a silicon detector by short pulses of light or by alpha particles. This work proposes a new principle of charge injection by means of lateral PN junctions implemented in one of the detector electrodes, called the electrical TCT (el-TCT). This technique is fully compatible with CMOS technology and therefore opens new perspectives for assessing the performance of radiation detectors.

  10. A Detailed Study of Patent System for Protection of Inventions

    PubMed Central

    Tulasi, G. Krishna; Rao, B. Subba

    2008-01-01

    Creations of the mind are called intellect, and since these creations have commercial value, they are treated as property. Inventions are intellectual property and can be protected by patents, provided the invention is novel, non-obvious, useful and enabled. To ensure fair trade among member countries, the World Trade Organisation proposed the TRIPS agreement; India took the necessary initiative by signing the World Trade Organisation agreement and adapting to global needs. The aim of this article is to enlighten pharmaceutical professionals, especially those in research and development, about planning inventions through a thorough review of the prior art, which saves time and money. A thorough understanding is made possible by providing details of the origin of the patent system, the present governing bodies and their roles, and the Act that safeguards it. PMID:21394248

  11. Light in condensed matter in the upper atmosphere as the origin of homochirality: circularly polarized light from Rydberg matter.

    PubMed

    Holmlid, Leif

    2009-01-01

    Clouds of the condensed excited Rydberg matter (RM) exist in the atmospheres of comets and planetary bodies (most easily observed at Mercury and the Moon), where they surround the entire bodies. Vast such clouds have recently been proposed to exist in the upper atmosphere of Earth (giving rise to the enormous features called noctilucent clouds, polar mesospheric clouds, and polar mesospheric summer radar echoes). It has been shown in experiments with RM that linearly polarized visible light scattered from an RM layer is transformed to circularly polarized light with a probability of approximately 50%. The circular Rydberg electrons in the magnetic field in the RM may be chiral scatterers. The magnetic and anisotropic RM medium acts as a circular polarizer, probably by delaying one of the perpendicular components of the light wave. The delay process involved is called Rabi-flopping and gives delays of the order of femtoseconds. This strong effect thus gives intense circularly polarized visible and UV light within RM clouds. Amino acids and other chiral molecules will experience a strong interaction with this light field in the upper atmospheres of planets. The interaction will vary with the stereogenic conformation of the molecules and in all probability promote the survival of one enantiomer. Here, this strong effect is proposed to be the origin of homochirality. The formation of amino acids in the RM clouds is probably facilitated by the catalytic effect of RM.

  12. Light in Condensed Matter in the Upper Atmosphere as the Origin of Homochirality: Circularly Polarized Light from Rydberg Matter

    NASA Astrophysics Data System (ADS)

    Holmlid, Leif

    2009-08-01

    Clouds of the condensed excited Rydberg matter (RM) exist in the atmospheres of comets and planetary bodies (most easily observed at Mercury and the Moon), where they surround the entire bodies. Vast such clouds have recently been proposed to exist in the upper atmosphere of Earth (giving rise to the enormous features called noctilucent clouds, polar mesospheric clouds, and polar mesospheric summer radar echoes). It has been shown in experiments with RM that linearly polarized visible light scattered from an RM layer is transformed to circularly polarized light with a probability of approximately 50%. The circular Rydberg electrons in the magnetic field in the RM may be chiral scatterers. The magnetic and anisotropic RM medium acts as a circular polarizer, probably by delaying one of the perpendicular components of the light wave. The delay process involved is called Rabi-flopping and gives delays of the order of femtoseconds. This strong effect thus gives intense circularly polarized visible and UV light within RM clouds. Amino acids and other chiral molecules will experience a strong interaction with this light field in the upper atmospheres of planets. The interaction will vary with the stereogenic conformation of the molecules and in all probability promote the survival of one enantiomer. Here, this strong effect is proposed to be the origin of homochirality. The formation of amino acids in the RM clouds is probably facilitated by the catalytic effect of RM.

  13. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants; however, none is based on the design and manufacture of the plant system for which the interface is intended. This paper presents an interface design framework that originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of the function-behavior-state model, originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility and the seamless integration of the design (manufacture, maintenance) of plants with the design of human-machine interfaces; the missing linkage between the design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; in particular, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  14. LINEs Contribute to the Origins of Middle Bodies of SINEs besides 3′ Tails

    PubMed Central

    2018-01-01

    Abstract Short interspersed elements (SINEs), which are nonautonomous transposable elements, require the transposition machinery of long interspersed elements (LINEs) to mobilize. SINEs are composed of two or more independently originating parts. The 5′ region is called the “head” and is derived mainly from small RNAs, and the 3′ region (“tail”) originates from the 3′ region of LINEs and is responsible for recognition by the counterpart LINE proteins. The origin of the middle “body” of SINEs is enigmatic, although significant sequence similarities among SINEs from very diverse species have been observed. Here, a systematic analysis of the similarities among SINEs and LINEs deposited in Repbase, a comprehensive database of eukaryotic repeat sequences, was performed. Three primary findings are described: 1) the 5′ regions of only two clades of LINEs, RTE and Vingi, were revealed to have contributed to the middle parts of SINEs; 2) the linkage of the 5′ and 3′ parts of LINEs can be lost due to occasional tail exchange of SINEs; and 3) the previously proposed Ceph-domain was revealed to be a fusion of a CORE-domain and the 5′ part of an RTE-clade LINE. Based on these findings, a hypothesis is proposed that the 5′ parts of bipartite nonautonomous LINEs, which possess only the 5′ and 3′ regions of the original LINEs, can contribute to the undefined middle part of SINEs. PMID:29325122

  15. Defining Baconian Probability for Use in Assurance Argumentation

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.

    2016-01-01

    The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings, which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.
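
    The ordinal character of the defeater counts can be illustrated with a small sketch. This is our own hypothetical encoding, not the paper's notation: a rank records how many of the identified defeaters were eliminated, comparisons are only defined against the same defeater set, and no arithmetic is offered.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BaconianRank:
    """Ordinal confidence rank: 'eliminated of identified' defeaters.

    Hypothetical illustration: because the ranking is ordinal, only
    comparisons within the same defeater set are meaningful, and
    arithmetic on the counts (the practice the paper critiques) is
    deliberately not defined here.
    """
    eliminated: int
    identified: int

    def __le__(self, other):
        if self.identified != other.identified:
            raise ValueError("ranks over different defeater sets are incomparable")
        return self.eliminated <= other.eliminated

claim_a = BaconianRank(eliminated=3, identified=5)
claim_b = BaconianRank(eliminated=4, identified=5)
more_confident = claim_a <= claim_b  # claim_b has eliminated more defeaters
```

    Making the class frozen and omitting `+`/`*` mirrors the paper's point: the counts support ranking, not calculation.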

  16. Possible Quantum Absorber Effects in Cortical Synchronization

    NASA Astrophysics Data System (ADS)

    Kämpf, Uwe

    The Wheeler-Feynman transactional "absorber" approach was proposed originally to account for anomalous resonance coupling between spatio-temporally distant measurement partners in entangled quantum states of so-called Einstein-Podolsky-Rosen paradoxes, e.g. of spatio-temporal non-locality, quantum teleportation, etc. Applied to quantum brain dynamics, however, this view provides an anticipative resonance coupling model for aspects of cortical synchronization and recurrent visual action control. It is proposed to consider the registered activation patterns of neuronal loops in so-called synfire chains not as a result of retarded brain communication processes, but rather as surface effects of a system of standing waves generated in the depth of visual processing. According to this view, they arise from a counterbalance between the actual input's delayed bottom-up data streams and top-down recurrent information-processing of advanced anticipative signals in a Wheeler-Feynman-type absorber mode. In the framework of a "time-loop" model, findings about mirror neurons in the brain cortex are suggested to be at least partially associated with temporal rather than spatial mirror functions of visual processing, similar to phase conjugate adaptive resonance-coupling in nonlinear optics.

  17. The origin of ethics and social order in a society without state power.

    PubMed

    Yamamoto, K

    1999-06-01

    How ethics and social order originated and developed in a society without state power is one of the enigmas that human beings have long tried to solve. Several theories on the origin of social order have been proposed since the "Social Contract" theory of Thomas Hobbes. According to Hobbes, since a society without state power is in a condition called war, a social contract among men is the origin of social order in a society where every man is against every man. Rousseau says that when human beings reach the stage at which they live in a permanent neighborhood, a property system is introduced; the excessive ambition and avarice of the man who has possessions then compel him to propose the formation of a political institution, providing the social order that enables him to keep his possessions. According to Nietzsche, the principle of equilibrium, that is, an eye for an eye and a tooth for a tooth, is an important concept in the oldest theory of law and morality as well as the basis of justice; the sense of superiority and nobility felt by a strong man brave enough to take revenge is the origin of the antithesis of "good" and "bad". Girard says that the sacred violence wielded by the community to sacrifice a surrogate victim brings about social order in a society without state power. All the aforementioned theories seem to have failed to recognize that a society without state power has its own ethics, which developed spontaneously within the pagan culture. Previously, I indicated that a society without state power, or one where state power cannot function well, such as the tribal society in northern Albania, has an ethics based on the ancient concepts of "Guest-god", "food (commensality)" and "blood". In the present paper, I propose a new theory on the origin of ethics and social order, using the model of the ethics of the Kanun.

  18. Quantum secret sharing using orthogonal multiqudit entangled states

    NASA Astrophysics Data System (ADS)

    Bai, Chen-Ming; Li, Zhi-Hui; Liu, Cheng-Ji; Li, Yong-Ming

    2017-12-01

    In this work, we investigate the distinguishability of orthogonal multiqudit entangled states under restricted local operations and classical communication. Based on these properties, we propose a quantum secret sharing scheme to realize three types of access structures, i.e., the (n, n)-threshold, the restricted (3, n)-threshold and the restricted (4, n)-threshold schemes (called LOCC-QSS schemes). All cooperating players in the restricted threshold schemes are drawn from two disjoint groups. In the proposed protocol, the participants use computational basis measurements and classical communication to distinguish between the orthogonal states and reconstruct the original secret. Furthermore, we analyze the security of our scheme against four primary quantum attacks and give a simple encoding method to better prevent the participant conspiracy attack.

  19. Single-Photon-Triggered Quantum Phase Transition

    NASA Astrophysics Data System (ADS)

    Lü, Xin-You; Zheng, Li-Li; Zhu, Gui-Lei; Wu, Ying

    2018-06-01

    We propose a hybrid quantum model combining cavity QED and optomechanics, which allows the occurrence of an equilibrium superradiant quantum phase transition (QPT) triggered by a single photon. This single-photon-triggered QPT exists both when the so-called A2 term is ignored and when it is included; i.e., it is immune to the no-go theorem. It originates from the photon-dependent quantum criticality featured by the proposed hybrid quantum model. Moreover, a reversed superradiant QPT is induced by the competition between the introduced A2 term and the optomechanical interaction. This work offers an approach to manipulating QPT with a single photon, which should inspire the exploration of single-photon quantum-criticality physics and the engineering of new single-photon quantum devices.

  20. Cultural Resources Survey and Monitoring of Joint Task Force Six (JTF-6) Actions in Webb, Zapata, Dimmit, La Salle, Duvall, and Jim Hogg Counties, Texas

    DTIC Science & Technology

    1994-08-01

    vegetation, game, and riverine resources. A recent survey conducted for the proposed Camino Colombia Toll Road resulted in the recording of numerous...Trevino, who was reportedly from old Guerrero (also called Revilla, one of the 12 original Spanish colonies founded by Jose De Escandon in 1749 [(Hume 1972...1985). (Scale 1:1) Figure 10. Diagnostic projectile points of the Late Archaic period of South Texas: (a) Ensor (Bell 1960); (b) Frio (Turner

  1. An overview of the Douglas Aircraft Company Aeroelastic Design Optimization Program (ADOP)

    NASA Technical Reports Server (NTRS)

    Dodd, Alan J.

    1989-01-01

    From a program manager's viewpoint, the history, scope and architecture of a major structural design program at Douglas Aircraft Company called Aeroelastic Design Optimization Program (ADOP) are described. ADOP was originally intended for the rapid, accurate, cost-effective evaluation of relatively small structural models at the advanced design level, resulting in improved proposal competitiveness and avoiding many costly changes later in the design cycle. Before release of the initial version in November 1987, however, the program was expanded to handle very large production-type analyses.

  2. [Squirting and female ejaculation in 2015?].

    PubMed

    Salama, S; Boitrelle, F; Gauquelin, A; Lesaffre, C; Thiounn, N; Desvaux, P

    2015-06-01

    Since Antiquity, women who expel a large quantity of liquid during sexual stimulation have remained a mystery. This phenomenon is usually called "squirting". Many physicians have proposed different explanations; however, there are very few scientific publications and their conclusions are discordant. Today, squirting is fashionable in the media, and some recent studies have brought new information. Through the medical literature, we present conclusions concerning the origin and nature of squirting, the psychological experience of squirting women, and the feelings of their partners. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  3. A convenient basis for the Izergin-Korepin model

    NASA Astrophysics Data System (ADS)

    Qiao, Yi; Zhang, Xin; Hao, Kun; Cao, Junpeng; Li, Guang-Liang; Yang, Wen-Li; Shi, Kangjie

    2018-05-01

    We propose a convenient orthogonal basis of the Hilbert space for the quantum spin chain associated with the A2(2) algebra (or the Izergin-Korepin model). It is shown that, compared with the original basis, the monodromy-matrix elements acting on this basis take relatively simple forms, quite similar to those for the quantum spin chain associated with the An algebra in the so-called F-basis. As an application of our general results, we present explicit recursive expressions for the Bethe states in this basis for the Izergin-Korepin model.

  4. Switchable electric polarization and ferroelectric domains in a metal-organic-framework

    DOE PAGES

    Jain, Prashant; Stroppa, Alessandro; Nabok, Dmitrii; ...

    2016-09-30

    Multiferroics and magnetoelectrics, with coexisting and coupled multiple ferroic orders, are materials promising new technological advances. While most studies have focused on single-phase or heterostructured inorganic materials, a new class of materials called metal–organic frameworks (MOFs) has recently been proposed as candidates demonstrating interesting new routes to multiferroism and magnetoelectric coupling. Herein, we report on the origin of multiferroicity of (CH3)2NH2Mn(HCOO)3 via direct observation of ferroelectric domains using second-harmonic generation techniques. For the first time, we observe how these domains are organized (sized in the micrometer range) and how they are affected by applied electric and magnetic fields. Lastly, calculations provide an estimate of the electric polarization and give insights into its microscopic origin.

  5. Origin of Superconductivity and Latent Charge Density Wave in NbS2

    NASA Astrophysics Data System (ADS)

    Heil, Christoph; Poncé, Samuel; Lambert, Henry; Schlipf, Martin; Margine, Elena R.; Giustino, Feliciano

    2017-08-01

    We elucidate the origin of the phonon-mediated superconductivity in 2 H -NbS2 using the ab initio anisotropic Migdal-Eliashberg theory including Coulomb interactions. We demonstrate that superconductivity is associated with Fermi surface hot spots exhibiting an unusually strong electron-phonon interaction. The electron-lattice coupling is dominated by low-energy anharmonic phonons, which place the system on the verge of a charge density wave instability. We also provide definitive evidence for two-gap superconductivity in 2 H -NbS2 , and show that the low- and high-energy peaks observed in tunneling spectra correspond to the Γ - and K -centered Fermi surface pockets, respectively. The present findings call for further efforts to determine whether our proposed mechanism underpins superconductivity in the whole family of metallic transition metal dichalcogenides.

  6. Quality Tetrahedral Mesh Smoothing via Boundary-Optimized Delaunay Triangulation

    PubMed Central

    Gao, Zhanheng; Yu, Zeyun; Holst, Michael

    2012-01-01

    Despite its great success in improving the quality of a tetrahedral mesh, the original optimal Delaunay triangulation (ODT) is designed to move only inner vertices and thus cannot handle input meshes containing “bad” triangles on boundaries. In the current work, we present an integrated approach called boundary-optimized Delaunay triangulation (B-ODT) to smooth (improve) a tetrahedral mesh. In our method, both inner and boundary vertices are repositioned by analytically minimizing the error between a paraboloid function and its piecewise linear interpolation over the neighborhood of each vertex. In addition to the guaranteed volume-preserving property, the proposed algorithm can be readily adapted to preserve sharp features in the original mesh. A number of experiments are included to demonstrate the performance of our method. PMID:23144522

  7. Name-changes in post-war France: the traumatic experiences of the Shoah and its consequences on the second and third generation with reference to the example of name-changes.

    PubMed

    Masson, Céline

    2013-02-01

    Starting from our collective initiative to work on the theme of 'The strength of the name', which has given rise both to a conference and to a documentary called And their name, they have changed it, I have sought to draw attention in this article to the difference between proper names, patronymic names, and the so-called Name-of-the-Father. Pronouncing names involves designating the languages of names, which also refer to the accents of names, since I have proposed the idea that each name is evocative of a language and that changing the name also modifies its language. I have approached the question of the name by considering cases of name-changes, essential with regard to the Ashkenazi Jewish families who changed their names after the Shoah, along with the trauma that numerous Jewish families suffered after the war. French jurisprudence does not permit reversion to the original name once it has been changed to a more French-sounding one, owing to the immutability of the name and the foreign sound of the names of origin. Copyright © 2013 Institute of Psychoanalysis.

  8. Centre-based restricted nearest feature plane with angle classifier for face recognition

    NASA Astrophysics Data System (ADS)

    Tang, Linlin; Lu, Huifen; Zhao, Liang; Li, Zuohua

    2017-10-01

    An improved classifier based on the nearest feature plane (NFP), called the centre-based restricted nearest feature plane with angle (RNFPA) classifier, is proposed here for face recognition problems. The well-known NFP uses the geometrical information of samples to increase the effective number of training samples, but this increases computational complexity, and the extended feature plane also introduces an inaccuracy problem. To solve these problems, RNFPA exploits a centre-based feature plane and uses an angle threshold to restrict the extended feature space. By choosing an appropriate angle threshold, RNFPA improves performance while decreasing computational complexity. Experiments on the AT&T face database, the AR face database and the FERET face database are used to evaluate the proposed classifier. Compared with the original NFP classifier, the nearest feature line (NFL) classifier, the nearest neighbour (NN) classifier and some other improved NFP classifiers, the proposed one achieves competitive performance.
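
    The geometric step underlying any nearest-feature-plane classifier is the distance from a query feature to the plane spanned by prototype features. A minimal sketch under our own naming follows (the paper's centre-based construction and angle-threshold restriction are not reproduced here):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def point_to_plane_distance(q, p0, p1, p2):
    """Distance from query q to the feature plane through prototypes p0, p1, p2.

    Projects q onto the affine span of the three prototypes by solving the
    2x2 normal equations; a hypothetical sketch of the NFP idea.
    """
    d1, d2, r = sub(p1, p0), sub(p2, p0), sub(q, p0)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    e, f = dot(d1, r), dot(d2, r)
    det = a * c - b * b
    s, t = (c * e - b * f) / det, (a * f - b * e) / det
    proj = [p0[i] + s * d1[i] + t * d2[i] for i in range(len(q))]
    resid = sub(q, proj)
    return dot(resid, resid) ** 0.5

# Plane z = 0 through three prototypes; the query sits 2.0 above it.
d = point_to_plane_distance([0.0, 0.0, 2.0], [0, 0, 0], [1, 0, 0], [0, 1, 0])
```

    A classifier assigns the query to the class whose prototype plane is nearest; RNFPA's angle threshold then rejects projections that fall too far outside the prototypes, which is the inaccuracy the abstract mentions.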

  9. Locality-preserving sparse representation-based classification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Gao, Lianru; Yu, Haoyang; Zhang, Bing; Li, Qingting

    2016-10-01

    This paper proposes to combine locality-preserving projections (LPP) and sparse representation (SR) for hyperspectral image classification. The LPP is first used to reduce the dimensionality of all the training and testing data by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold, where the high-dimensional data lies. Then, SR codes the projected testing pixels as sparse linear combinations of all the training samples to classify the testing pixels by evaluating which class leads to the minimum approximation error. The integration of LPP and SR represents an innovative contribution to the literature. The proposed approach, called locality-preserving SR-based classification, addresses the imbalance between high dimensionality of hyperspectral data and the limited number of training samples. Experimental results on three real hyperspectral data sets demonstrate that the proposed approach outperforms the original counterpart, i.e., SR-based classification.

  10. Information dissemination model for social media with constant updates

    NASA Astrophysics Data System (ADS)

    Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui

    2018-07-01

    With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To more effectively evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.

  11. Temperature based Restricted Boltzmann Machines

    NASA Astrophysics Data System (ADS)

    Li, Guoqi; Deng, Lei; Xu, Yi; Wen, Changyun; Wang, Wei; Pei, Jing; Shi, Luping

    2016-01-01

    Restricted Boltzmann machines (RBMs), which apply graphical models to learning a probability distribution over a set of inputs, have attracted much attention recently since being proposed as building blocks of multi-layer learning systems called deep belief networks (DBNs). Temperature is a key factor of the Boltzmann distribution from which RBMs originate, yet none of the existing schemes has considered its impact in the graphical model of DBNs. In this work, we propose temperature based restricted Boltzmann machines (TRBMs), which reveal that temperature is an essential parameter controlling the selectivity of the firing neurons in the hidden layers. We theoretically prove that the effect of temperature can be adjusted by setting the sharpness parameter of the logistic function in the proposed TRBMs, and that the performance of RBMs can be improved by tuning this temperature parameter. This work provides comprehensive insight into deep belief networks and deep learning architectures from a physical point of view.
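
    The sharpness/temperature relationship the abstract describes can be shown with a one-line logistic activation. This is a generic sketch of a temperature-scaled sigmoid, not the TRBM model itself; the function name and parameter are ours.

```python
import math

def t_sigmoid(x, temperature=1.0):
    """Logistic activation with a temperature parameter: dividing the input
    by the temperature changes the sharpness of the firing probability.
    Low temperature -> near-step (selective) response; high temperature ->
    response flattened toward 0.5 (unselective)."""
    return 1.0 / (1.0 + math.exp(-x / temperature))

x = 2.0
cold = t_sigmoid(x, 0.5)   # sharp: fires with high probability
warm = t_sigmoid(x, 1.0)   # the standard logistic
hot = t_sigmoid(x, 4.0)    # flat: probability pulled toward 0.5
assert cold > warm > hot > 0.5
```

    In a TRBM-style hidden layer, this single scalar would modulate how selectively hidden units respond to the same input drive.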

  12. Approximation-based common principal component for feature extraction in multi-class brain-computer interfaces.

    PubMed

    Hoang, Tuan; Tran, Dat; Huang, Xu

    2013-01-01

    Common Spatial Pattern (CSP) is a state-of-the-art method for feature extraction in Brain-Computer Interface (BCI) systems; however, it is designed for 2-class BCI classification problems. Current extensions of this method to multiple classes, based on subspace union and covariance matrix similarity, do not provide high performance. This paper presents a new approach to solving multi-class BCI classification problems by forming a subspace resembled from the original subspaces; the proposed method for this approach is called Approximation-based Common Principal Component (ACPC). We perform experiments on Dataset 2a of BCI Competition IV, which was designed for motor imagery classification with 4 classes, to evaluate the proposed method. Preliminary experiments show that the proposed ACPC feature extraction method, when combined with Support Vector Machines, outperforms CSP-based feature extraction methods on the experimental dataset.

  13. Evolution of Martian polar landscapes - Interplay of long-term variations in perennial ice cover and dust storm intensity

    NASA Technical Reports Server (NTRS)

    Cutts, J. A.; Blasius, K. R.; Roberts, W. J.

    1979-01-01

    The discovery of a new type of Martian polar terrain, called undulating plain, is reported and the evolution of the plains and other areas of the Martian polar region is discussed in terms of the trapping of dust by the perennial ice cover. High-resolution Viking Orbiter 2 observations of the north polar terrain reveal perennially ice-covered surfaces with low relief, wavelike, regularly spaced, parallel ridges and troughs (undulating plains) occupying areas of the polar terrain previously thought to be flat, and associated with troughs of considerable local relief which exhibit at least partial annual melting. It is proposed that the wavelike topography of the undulating plains originates from long-term periodic variations in cyclical dust precipitation at the margin of a growing or receding perennial polar cap in response to changes in insolation. The troughs are proposed to originate from areas of steep slope in the undulating terrain which have lost their perennial ice cover and have become incapable of trapping dust. The polar landscape thus appears to record the migrations, expansions and contractions of the Martian polar cap.

  14. Color filter array pattern identification using variance of color difference image

    NASA Astrophysics Data System (ADS)

    Shin, Hyun Jun; Jeon, Jong Ju; Eom, Il Kyu

    2017-07-01

    A color filter array is placed on the image sensor of a digital camera to acquire color images. Each pixel records only one color, since the image sensor can measure only one color per pixel; the empty pixels are then filled using an interpolation process called demosaicing. The original and the interpolated pixels have different statistical characteristics, and if the image is modified by manipulation or forgery, the color filter array pattern is altered. This pattern change can be a clue for image forgery detection; however, most forgery detection algorithms have the disadvantage of assuming a known color filter array pattern. We present a method for identifying the color filter array pattern. Initially, the local mean is eliminated to remove the background effect. Subsequently, a color difference block is constructed to emphasize the difference between original and interpolated pixels. The variance of the color difference image is proposed as a means of estimating the color filter array configuration. The experimental results show that the proposed method is effective in identifying the color filter array pattern and, compared with conventional methods, provides superior performance.
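
    The statistical asymmetry between original and interpolated pixels can be illustrated in one dimension. This is a simplified stand-in for the paper's 2-D color-difference analysis, with our own function names: interpolated samples are locally linear, so a second-difference (smoothness) measure at the assumed interpolated positions is much smaller than at the original sensor positions.

```python
import random

def interpolate_missing(samples):
    """Fill odd positions by averaging their neighbours -- a 1-D stand-in
    for demosaicing (the paper works on 2-D colour-difference blocks)."""
    out = list(samples)
    for i in range(1, len(out) - 1, 2):
        out[i] = (out[i - 1] + out[i + 1]) / 2.0
    return out

def second_difference_energy(signal, positions):
    """Mean squared second difference at the given positions; linearly
    interpolated samples have (near-)zero energy, original samples do not."""
    return sum((signal[i - 1] - 2 * signal[i] + signal[i + 1]) ** 2
               for i in positions) / len(positions)

random.seed(0)
raw = [random.random() for _ in range(64)]
mosaic = interpolate_missing(raw)
odd = range(1, 63, 2)    # interpolated positions under the assumed pattern
even = range(2, 62, 2)   # original sensor positions
# The asymmetry a CFA identifier exploits: interpolated positions are smoother.
assert second_difference_energy(mosaic, odd) < second_difference_energy(mosaic, even)
```

    Testing each candidate offset and keeping the one with the most lopsided statistics is, in spirit, how a variance-based identifier recovers the CFA configuration.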

  15. Overlapping community detection based on link graph using distance dynamics

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Zhang, Jing; Cai, Li-Jun

    2018-01-01

    The distance dynamics model was recently proposed to detect the disjoint community of a complex network. To identify the overlapping structure of a network using the distance dynamics model, an overlapping community detection algorithm, called L-Attractor, is proposed in this paper. The process of L-Attractor mainly consists of three phases. In the first phase, L-Attractor transforms the original graph to a link graph (a new edge graph) to assure that one node has multiple distances. In the second phase, using the improved distance dynamics model, a dynamic interaction process is introduced to simulate the distance dynamics (shrink or stretch). Through the dynamic interaction process, all distances converge, and the disjoint community structure of the link graph naturally manifests itself. In the third phase, a recovery method is designed to convert the disjoint community structure of the link graph to the overlapping community structure of the original graph. Extensive experiments are conducted on the LFR benchmark networks as well as real-world networks. Based on the results, our algorithm demonstrates higher accuracy and quality than other state-of-the-art algorithms.
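
    Phase one of the described approach, transforming the original graph into a link graph so that each original node is represented by several link-nodes, can be sketched as follows. This is a minimal illustration with our own naming; the paper's distance-dynamics interaction and recovery phases are not shown.

```python
from itertools import combinations

def line_graph(edges):
    """Build the link graph: its nodes are the original edges, and two
    link-nodes are adjacent when the original edges share an endpoint.
    Because a node of degree k appears in k link-nodes, the disjoint
    communities found on the link graph can overlap on the original graph."""
    links = [tuple(sorted(e)) for e in edges]
    adjacency = {link: set() for link in links}
    for a, b in combinations(links, 2):
        if set(a) & set(b):  # the two links share an original node
            adjacency[a].add(b)
            adjacency[b].add(a)
    return adjacency

# A triangle {1,2,3} plus a pendant edge (3,4): node 3 appears in three
# link-nodes, so it can end up in more than one community after recovery.
g = line_graph([(1, 2), (2, 3), (3, 1), (3, 4)])
```

    Running a disjoint community detector (such as the distance-dynamics Attractor model) on `g` and mapping each link-community back to its endpoints yields the overlapping structure.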

  16. 78 FR 76218 - Rural Call Completion

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-17

    ... calls to rural areas, and enforce restrictions against blocking, choking, reducing, or restricting calls... to alert the Commission of systemic problems receiving calls from a particular originating long... associated with completing calls to rural areas. These rules will also enhance our ability to enforce...

  17. Shared developmental and evolutionary origins for neural basis of vocal–acoustic and pectoral–gestural signaling

    PubMed Central

    Bass, Andrew H.; Chagnaud, Boris P.

    2012-01-01

    Acoustic signaling behaviors are widespread among bony vertebrates, which include the majority of living fishes and tetrapods. Developmental studies in sound-producing fishes and tetrapods indicate that central pattern generating networks dedicated to vocalization originate from the same caudal hindbrain rhombomere (rh) 8-spinal compartment. Together, the evidence suggests that vocalization and its morphophysiological basis, including mechanisms of vocal–respiratory coupling that are widespread among tetrapods, are ancestral characters for bony vertebrates. Premotor-motor circuitry for pectoral appendages that function in locomotion and acoustic signaling develops in the same rh8-spinal compartment. Hence, vocal and pectoral phenotypes in fishes share both developmental origins and roles in acoustic communication. These findings lead to the proposal that the coupling of more highly derived vocal and pectoral mechanisms among tetrapods, including those adapted for nonvocal acoustic and gestural signaling, originated in fishes. Comparative studies further show that rh8 premotor populations have distinct neurophysiological properties coding for equally distinct behavioral attributes such as call duration. We conclude that neural network innovations in the spatiotemporal patterning of vocal and pectoral mechanisms of social communication, including forelimb gestural signaling, have their evolutionary origins in the caudal hindbrain of fishes. PMID:22723366

  18. Durbin-Watson partial least-squares regression applied to MIR data on adulteration with edible oils of different origins.

    PubMed

    Jović, Ozren

    2016-12-15

    A novel method for quantitative prediction and variable selection on spectroscopic data, called Durbin-Watson partial least-squares regression (dwPLS), is proposed in this paper. The idea is to inspect serial correlation in infrared data, which is known to consist of highly correlated neighbouring variables. The method selects only those variables whose intervals have a lower Durbin-Watson statistic (dw) than a certain optimal cutoff. For each interval, dw is calculated on a vector of regression coefficients. Adulteration of cold-pressed linseed oil (L), a well-known nutrient beneficial to health, is studied in this work by mixing it with cheaper oils: rapeseed oil (R), sesame oil (Se) and sunflower oil (Su). The samples for each botanical origin of oil vary with respect to producer, content and geographic origin. The results obtained indicate that MIR-ATR combined with dwPLS can be applied to the quantitative determination of edible-oil adulteration. Copyright © 2016 Elsevier Ltd. All rights reserved.
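
    The Durbin-Watson statistic at the heart of the method has a simple closed form: for a sequence b, dw = Σ(b_t − b_{t−1})² / Σ b_t², which is near 0 for smooth, positively autocorrelated sequences and larger (up to 4) for noisy or alternating ones. A minimal sketch of interval selection on a coefficient vector follows; the fixed interval width and cutoff handling are illustrative, not the paper's exact recipe.

```python
import numpy as np

def durbin_watson(b):
    """Durbin-Watson statistic of a sequence: near 0 for smoothly varying
    (positively autocorrelated) values, larger for noisy ones."""
    b = np.asarray(b, dtype=float)
    return np.sum(np.diff(b) ** 2) / np.sum(b ** 2)

def select_intervals(coefs, width, cutoff):
    """Indices of variables whose coefficient interval has dw below cutoff
    (a sketch of the dwPLS selection idea, not the published procedure)."""
    keep = []
    for start in range(0, len(coefs) - width + 1, width):
        if durbin_watson(coefs[start:start + width]) < cutoff:
            keep.extend(range(start, start + width))
    return keep
```

    A smooth (informative) coefficient interval then survives the cutoff while a noisy alternating one is discarded.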

  19. Localized lossless authentication watermark (LAW)

    NASA Astrophysics Data System (ADS)

    Celik, Mehmet U.; Sharma, Gaurav; Tekalp, A. Murat; Saber, Eli S.

    2003-06-01

    A novel framework is proposed for lossless authentication watermarking of images which allows authentication and recovery of original images without any distortions. This overcomes a significant limitation of traditional authentication watermarks that irreversibly alter image data in the process of watermarking and authenticate the watermarked image rather than the original. In particular, authenticity is verified before full reconstruction of the original image, whose integrity is inferred from the reversibility of the watermarking procedure. This reduces computational requirements in situations when either the verification step fails or the zero-distortion reconstruction is not required. A particular instantiation of the framework is implemented using a hierarchical authentication scheme and the lossless generalized-LSB data embedding mechanism. The resulting algorithm, called localized lossless authentication watermark (LAW), can localize tampered regions of the image; has a low embedding distortion, which can be removed entirely if necessary; and supports public/private key authentication and recovery options. The effectiveness of the framework and the instantiation is demonstrated through examples.

  20. G4RNA: an RNA G-quadruplex database

    PubMed Central

    Garant, Jean-Michel; Luce, Mikael J.; Scott, Michelle S.

    2015-01-01

    Abstract G-quadruplexes (G4) are tetrahelical structures formed from planar arrangement of guanines in nucleic acids. A simple, regular motif was originally proposed to describe G4-forming sequences. More recently, however, formation of G4 was discovered to depend, at least in part, on the contextual backdrop of neighboring sequences. Prediction of G4 folding is thus becoming more challenging as G4 outlier structures, not described by the originally proposed motif, are increasingly reported. Recent observations thus call for a comprehensive tool, capable of consolidating the expanding information on tested G4s, in order to conduct systematic comparative analyses of G4-promoting sequences. The G4RNA Database we propose was designed to help meet the need for easily-retrievable data on known RNA G4s. A user-friendly, flexible query system allows for data retrieval on experimentally tested sequences, from many separate genes, to assess G4-folding potential. Query output sorts data according to sequence position, G4 likelihood, experimental outcomes and associated bibliographical references. G4RNA also provides an ideal foundation to collect and store additional sequence and experimental data, considering the growing interest G4s currently generate. Database URL: scottgroup.med.usherbrooke.ca/G4RNA PMID:26200754

  1. Determining the Biogenicity of Microfossils in the Apex Chert, Western Australia, Using Transmission Electron Microscopy

    NASA Technical Reports Server (NTRS)

    DeGregorio, B. T.; Sharp, T. G.

    2003-01-01

    For over a decade, the oldest evidence for life on this planet has been microfossils in the 3.5 Ga Apex Chert in Western Australia. Recently, the biogenicity of these carbon-rich structures has been called into question through reanalysis of the local geology and reinterpretation of the original thin sections. Although initially described as a stratiform, bedded chert of siliceous clasts, the unit is now thought to be a brecciated hydrothermal vein chert. The high temperatures of a hydrothermal environment would probably have had detrimental effects on early non-hyperthermophilic life, in contrast to a shallow sea. Conversely, a hydrothermal origin would suggest that if the microfossils were valid, they might have been hyperthermophilic. The Apex Chert controversy: the microfossils were originally described as septate filaments composed of kerogen, similar in morphology to Proterozoic and modern cyanobacteria. However, new thin-section analysis shows that these carbonaceous structures are not simple filaments. Many of the original microfossils are branched and have variable thickness when the plane of focus is changed. Hydrothermal alteration of organic remains has also been suggested as the source of these strange morphologies. Another point of contention lies with the nature of the carbon material in these proposed microfossils. Kerogen is structurally amorphous, but transforms into well-ordered graphite under high pressures and temperatures. Raman spectrometry of the carbonaceous material in the proposed microfossils has been interpreted both as partially graphitized kerogen and as amorphous graphite. However, these results are inconclusive, since Raman spectrometry cannot adequately discriminate between kerogen and disordered graphite. There are also opposing views on the origin of the carbon in the Apex Chert. The carbon would be biogenic if the proposed microfossils are indeed the remains of former living organisms. However, an inorganic Fischer-Tropsch-type synthesis is also a possible explanation for the formation of large-aggregate carbonaceous particles and could also account for the observed depletion of ¹³C.

  2. Using data tagging to improve the performance of Kanerva's sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1988-01-01

    The standard formulation of Kanerva's sparse distributed memory (SDM) involves the selection of a large number of data storage locations, followed by averaging the data contained in those locations to reconstruct the stored data. A variant of this model is discussed, in which the predominant pattern is the focus of reconstruction. First, one architecture is proposed which returns the predominant pattern rather than the average pattern. However, this model will require too much storage for most uses. Next, a hybrid model is proposed, called tagged SDM, which approximates the results of the predominant pattern machine, but is nearly as efficient as Kanerva's original formulation. Finally, some experimental results are shown which confirm that significant improvements in the recall capability of SDM can be achieved using the tagged architecture.
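
    The contrast between average and predominant recall is visible even in a toy SDM. Below is a minimal sketch (class and parameter names are ours, not Kanerva's or the paper's): writes add ±1 to counters at activated hard locations, and reads sum the counters and threshold at zero. Because thresholding a sum of ±1 votes is a majority vote, a pattern stored more often than the others dominates recall, which is the behaviour the tagged architecture approximates more efficiently.

```python
import numpy as np

class SparseDistributedMemory:
    """Toy SDM: random binary hard-location addresses; writing adds +/-1 to
    per-bit counters of activated locations; reading sums counters over
    activated locations and thresholds at zero (a majority vote)."""
    def __init__(self, n_locations, dim, radius, seed=0):
        rng = np.random.default_rng(seed)
        self.addresses = rng.integers(0, 2, size=(n_locations, dim))
        self.counters = np.zeros((n_locations, dim), dtype=int)
        self.radius = radius

    def _active(self, addr):
        # locations within Hamming radius of the query address
        return np.count_nonzero(self.addresses != addr, axis=1) <= self.radius

    def write(self, addr, data):
        self.counters[self._active(addr)] += 2 * np.asarray(data) - 1

    def read(self, addr):
        s = self.counters[self._active(addr)].sum(axis=0)
        return (s > 0).astype(int)
```

    Writing one pattern twice and a different pattern once at the same address, a read returns the twice-written (predominant) pattern bit-for-bit.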

  3. A Theoretical Framework for Calibration in Computer Models: Parametrization, Estimation and Convergence Properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuo, Rui; Jeff Wu, C. F.

    Calibration parameters in deterministic computer experiments are those attributes that cannot be measured or are not available in physical experiments. Here, an approach is proposed to estimate them by using data from physical experiments and computer simulations. A theoretical framework is given which allows us to study the issues of parameter identifiability and estimation. We define L2-consistency for calibration as a justification for calibration methods. It is shown that a simplified version of the original KO method leads to asymptotically L2-inconsistent calibration. This L2-inconsistency can be remedied by modifying the original estimation procedure. A novel calibration method, called the L2 calibration, is proposed and proven to be L2-consistent and to enjoy an optimal convergence rate. Furthermore, a numerical example and some mathematical analysis are used to illustrate the source of the L2-inconsistency problem.

  4. New bioacoustic and distributional data on Bokermannohyla sapiranga Brandão et al., 2012 (Anura: Hylidae): revisiting its diagnosis in comparison with B. pseudopseudis (Miranda-Ribeiro, 1937).

    PubMed

    De Carvalho, Thiago Ribeiro; Giaretta, Ariovaldo Antonio; Teixeira, Bernardo Franco Da Veiga; Martins, Lucas Borges

    2013-12-11

    In this paper, we provide new bioacoustic and distributional data on Bokermannohyla sapiranga, as well as additional comparative bioacoustic data on topotypes of B. pseudopseudis, and re-evaluate the differential diagnosis of the former species with respect to the latter. Head shapes (dorsal and lateral views) presented such variation that should not be used to differentially diagnose them as originally proposed. On the other hand, the presence of a dermal ridge along outer tarsi, and color patterns of the eyes and dorsal surface of hand/toe disks still represent diagnostic characters between both species. We also found differences in temporal (call duration; notes per call), spectral (dominant frequency; harmonics), and structural (pulsed/non-pulsed note structure) traits of their calls. Distribution of B. sapiranga is extended eastward (Paracatu), which corresponds to the first record for the State of Minas Gerais, whereas B. pseudopseudis distribution seems to be restricted to rocky montane field environments of northern Goiás State.

  5. Descartes' dogma and damage to Western psychiatry.

    PubMed

    Ventriglio, A; Bhugra, D

    2015-10-01

    René Descartes described the concept of mind-body dualism in the 17th century. This concept has been called his error, but we prefer to call it his dogma because the error was recognised much later. We studied the original writings translated by various scholars. We believe that his dogma has caused a tremendous amount of damage to Western psychiatry. This dualism has created boundaries between mind and body, but as we know they are inextricably interlinked and influence each other. This has affected clinical practice and has increased the dichotomy between psychiatric services and physical health care services, in the West at least. This dualism has also contributed to stigma against mental illness, the mentally ill and psychiatric services. We propose that it is time to abandon this mind-body dualism and to look at the whole patient and their illness experiences, as is done in some other health care systems such as Ayurveda.

  6. Darwin's missing link - a novel paradigm for evolution education

    NASA Astrophysics Data System (ADS)

    Catley, Kefyn M.

    2006-09-01

    Microevolutionary mechanisms are taught almost exclusively in our schools, to the detriment of those mechanisms that allow us to understand the larger picture - macroevolution. The results are demonstrable; as a result of the strong emphasis on micro processes in evolution education, students and teachers still have poor understanding of the processes which operate at the macro level, and virtually no understanding at all of the history of life on our planet. Natural selection has become synonymous with the suite of processes we call evolution. This paper makes the case for a paradigm shift in evolution education, so that both perspectives - micro and macro - are given equal weight. Increasingly, issues of bioethics, human origins, cloning, etc., are being cast in a light that requires an understanding of macroevolution. To deny our students access to this debate is to deny the call for universal science literacy. A methodology from professional practice is proposed that could achieve this goal, and discussed in light of its utility, theoretical underpinnings, and historical legacy. A mandate for research is proposed that focuses on learners' understanding of several challenging macroevolutionary concepts, including species, the formation of higher groups, deep time, and hierarchical thinking.

  7. New scene change control scheme based on pseudoskipped picture

    NASA Astrophysics Data System (ADS)

    Lee, Youngsun; Lee, Jinwhan; Chang, Hyunsik; Nam, Jae Y.

    1997-01-01

    A new scene change control scheme that improves video coding performance for sequences containing many scene-changed pictures is proposed in this paper. Scene-changed pictures, except intra-coded pictures, usually need more bits than normal pictures to maintain constant picture quality. The main idea of this paper is how to obtain the extra bits needed to encode scene-changed pictures. We encode a B picture located before a scene-changed picture like a skipped picture, and call such a B picture a pseudo-skipped picture. By generating the pseudo-skipped picture, we can save some bits, which are added to the originally allocated target bits to encode the scene-changed picture. The simulation results show that the proposed algorithm improves encoding performance by about 0.5 to 2.0 dB of PSNR compared with the MPEG-2 TM5 rate control scheme. In addition, the suggested algorithm is compatible with MPEG-2 video syntax, and the picture repetition is not noticeable.

  8. An IMS-Based Middleware Solution for Energy-Efficient and Cost-Effective Mobile Multimedia Services

    NASA Astrophysics Data System (ADS)

    Bellavista, Paolo; Corradi, Antonio; Foschini, Luca

    Mobile multimedia services have recently become of extreme industrial relevance due to the advances in both wireless client devices and multimedia communications. That has motivated important standardization efforts, such as the IP Multimedia Subsystem (IMS) to support session control, mobility, and interoperability in all-IP next generation networks. Notwithstanding the central role of IMS in novel mobile multimedia, the potential of IMS-based service composition for the development of new classes of ready-to-use, energy-efficient, and cost-effective services is still widely unexplored. The paper proposes an original solution for the dynamic and standard-compliant redirection of incoming voice calls towards WiFi-equipped smart phones. The primary design guideline is to reduce energy consumption and service costs for the final user by automatically switching from the 3G to the WiFi infrastructure whenever possible. The proposal is fully compliant with the IMS standard and exploits the recently released IMS presence service to update device location and current communication opportunities. The reported experimental results point out that our solution, in a simple way and with full compliance with state-of-the-art industrially-accepted standards, can significantly increase battery lifetime without negative effects on call initiation delay.

  9. SLMRACE: a noise-free RACE implementation with reduced computational time

    NASA Astrophysics Data System (ADS)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  10. Oxidoreductase mimic activity of natural pyrrhotite

    NASA Astrophysics Data System (ADS)

    Ibáñez de Aldecoa, A. L.; Velasco, F.; Menor-Salván, C.

    2012-09-01

    The theory of the chemo-autotrophic origin of life, also called the "iron-sulfur world hypothesis", proposes that the FeS/FeS2 system present in the primitive Earth's crust provided the reducing power necessary to drive the first protometabolic redox reactions. Some experimental studies have demonstrated the redox activity of the FeS/SH2 system, but none of them used natural FeS. Here, we show that the iron sulfide mineral pyrrhotite is able to mimic, under prebiotic conditions and with pyrite formation, the redox activity of the enzyme lactate dehydrogenase, which reversibly reduces pyruvate to lactate.

  11. An Image Processing Algorithm Based On FMAT

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Pal, Sankar K.

    1995-01-01

    Information deleted in ways minimizing adverse effects on reconstructed images. New grey-scale generalization of medial axis transformation (MAT), called FMAT (short for Fuzzy MAT), proposed. Formulated by making natural extension to fuzzy-set theory of all definitions and conditions (e.g., characteristic function of disk, subset condition of disk, and redundancy checking) used in defining MAT of crisp set. Does not need image to have any kind of a priori segmentation, and allows medial axis (and skeleton) to be fuzzy subset of input image. Resulting FMAT (consisting of maximal fuzzy disks) capable of reconstructing exactly original image.

  12. Why the water bridge does not collapse

    NASA Astrophysics Data System (ADS)

    Aerov, Artem A.

    2011-09-01

    In 2007 an interesting phenomenon was discovered [J. Phys. D 40, 6112 (2007)]: a horizontal thread of water, the so-called water bridge, hangs in a horizontal electrostatic field. A different explanation of the water bridge's stability is proposed herein: the force supporting it is the surface tension of water, while the role of the electric field is to prevent the water bridge from reducing its surface energy by breaking into separate drops. It is proven that the electrostatic field is not the origin of the tension holding the bridge.

  13. Sparse Contextual Activation for Efficient Visual Re-Ranking.

    PubMed

    Bai, Song; Bai, Xiang

    2016-03-01

    In this paper, we propose an extremely efficient algorithm for visual re-ranking. By considering the original pairwise distances in the contextual space, we develop a feature vector called sparse contextual activation (SCA) that encodes the local distribution of an image. Hence, the re-ranking task can be accomplished simply by vector comparison under the generalized Jaccard metric, which has a theoretical grounding in fuzzy set theory. To improve the time efficiency of the re-ranking procedure, an inverted index is introduced to speed up the computation of the generalized Jaccard metric. As a result, the average re-ranking time for a given query can be kept within 1 ms. Furthermore, inspired by query expansion, we develop an additional method called local consistency enhancement on top of SCA to improve retrieval performance in an unsupervised manner. On the other hand, retrieval performance using a single feature may not be satisfactory, which motivates us to fuse multiple complementary features for accurate retrieval. Based on SCA, a robust feature fusion algorithm is proposed that preserves the same high time efficiency. We assess the proposed method in various visual re-ranking tasks. Experimental results on the Princeton shape benchmark (3D objects), WM-SRHEC07 (3D competition), YAEL data set B (faces), the MPEG-7 data set (shapes), and the Ukbench data set (images) demonstrate the effectiveness and efficiency of SCA.
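
    The generalized Jaccard metric used to compare SCA vectors has a closed form for non-negative vectors: similarity = Σ min(x_i, y_i) / Σ max(x_i, y_i), with distance 1 − similarity. A small sketch of re-ranking by this measure follows; the SCA encoding itself is omitted, and the vectors here stand in for already-computed activations.

```python
import numpy as np

def generalized_jaccard(x, y):
    """Generalized Jaccard similarity of two non-negative vectors:
    sum of element-wise minima over sum of element-wise maxima."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.minimum(x, y).sum() / np.maximum(x, y).sum()

def rerank(query_vec, gallery_vecs):
    """Return gallery indices sorted by decreasing similarity to the query."""
    sims = [generalized_jaccard(query_vec, g) for g in gallery_vecs]
    return sorted(range(len(gallery_vecs)), key=lambda i: -sims[i])
```

    Identical vectors score 1, disjoint supports score 0, and the ranking is a plain sort on the scores; the inverted index mentioned in the abstract only accelerates this computation for sparse vectors.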

  14. PIMS: Memristor-Based Processing-in-Memory-and-Storage.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeanine

    Continued progress in computing has augmented the quest for higher performance with a new quest for higher energy efficiency. This has led to the re-emergence of Processing-In-Memory (PIM) architectures that offer higher density and performance with some boost in energy efficiency. Past PIM work either integrated a standard CPU with a conventional DRAM to improve the CPU-memory link, or used a bit-level processor with Single Instruction Multiple Data (SIMD) control, but neither matched the energy consumption of the memory to the computation. We originally proposed to develop a new architecture derived from PIM that more effectively addressed energy efficiency for high performance scientific, data analytics, and neuromorphic applications. We also originally planned to implement a von Neumann architecture with arithmetic/logic units (ALUs) that matched the power consumption of an advanced storage array to maximize energy efficiency. Implementing this architecture in storage was our original idea, since by augmenting storage (instead of memory), the system could address both in-memory computation and applications that accessed larger data sets directly from storage, hence Processing-in-Memory-and-Storage (PIMS). However, as our research matured, we discovered several things that changed our original direction, the most important being that a PIM that implements a standard von Neumann-type architecture results in significant energy efficiency improvement, but only about a O(10) performance improvement. In addition to this, the emergence of new memory technologies moved us to proposing a non-von Neumann architecture, called Superstrider, implemented not in storage, but in a new DRAM technology called High Bandwidth Memory (HBM). HBM is a stacked DRAM technology that includes a logic layer where an architecture such as Superstrider could potentially be implemented.

  15. A proposal to describe a phenomenon of expanding language

    NASA Astrophysics Data System (ADS)

    Swietorzecka, Kordula

    Changes of knowledge, convictions or beliefs are the subject of so-called epistemic logic. Various descriptions have been proposed of a process (or its results) in which a so-called agent introduces changes into a set of sentences he has already accepted as his knowledge, convictions or beliefs (the case of many agents is also considered). In the present paper we are interested in the changeability of an agent's language, which is in itself independent of the changes already mentioned. Modern epistemic formalizations assume that the agent uses a fixed (we could say static) language in which he expresses his various opinions, which may change. Our interest is to simulate a situation in which a language is extended by adding new expressions that were not previously known to the agent, so that he could not even consider them as subjects of his opinions. Such a phenomenon actually occurs in both natural and scientific languages; consider the expansion of a language in the process of learning, or as a result of acquiring new data about some described domain. We propose a simple idealization of the extension of a sentential language used by one agent. The language is treated as a family of so-called n-languages, which receive an epistemic interpretation. The proposed semantics enables us to distinguish between two types of change: those which occur because the agent's convictions about the logical values of some n-sentences change, described using the one-place operator C, read "it changes that", and changes that consist in increasing the level of the n-language by adding new expressions to it. The second type of change, symbolized by the variable G, may also be considered independently of the first. The logical frame of our considerations was originally used to describe the Aristotelian theory of substantial changes; this time we apply it in epistemology.

  16. Affinity learning with diffusion on tensor product graph.

    PubMed

    Yang, Xingwei; Prasad, Lakshman; Latecki, Longin Jan

    2013-01-01

    In many applications, we are given a finite set of data points sampled from a data manifold and represented as a graph with edge weights determined by pairwise similarities of the samples. Often the pairwise similarities (which are also called affinities) are unreliable due to noise or due to intrinsic difficulties in estimating similarity values of the samples. As observed in several recent approaches, more reliable similarities can be obtained if the original similarities are diffused in the context of other data points, where the context of each point is a set of points most similar to it. Compared to the existing methods, our approach differs in two main aspects. First, instead of diffusing the similarity information on the original graph, we propose to utilize the tensor product graph (TPG) obtained by the tensor product of the original graph with itself. Since TPG takes into account higher order information, it is not a surprise that we obtain more reliable similarities. However, it comes at the price of higher order computational complexity and storage requirement. The key contribution of the proposed approach is that the information propagation on TPG can be computed with the same computational complexity and the same amount of storage as the propagation on the original graph. We prove that a graph diffusion process on TPG is equivalent to a novel iterative algorithm on the original graph, which is guaranteed to converge. After its convergence we obtain new edge weights that can be interpreted as new, learned affinities. We stress that the affinities are learned in an unsupervised setting. We illustrate the benefits of the proposed approach for data manifolds composed of shapes, images, and image patches on two very different tasks of image retrieval and image segmentation. With learned affinities, we achieve the bull's eye retrieval score of 99.99 percent on the MPEG-7 shape dataset, which is much higher than the state-of-the-art algorithms. 
When the data points are image patches, the NCut with the learned affinities not only significantly outperforms the NCut with the original affinities, but it also outperforms state-of-the-art image segmentation methods.
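
    The key equivalence — that diffusion on the tensor product graph can be run as an iteration on the original n × n graph, never forming the n² × n² product matrix — can be sketched as follows. We use a symmetrically normalized affinity S and add a damping factor α < 1 so that the fixed point is guaranteed to exist; treat the exact normalization and update as our assumptions rather than the paper's precise algorithm.

```python
import numpy as np

def tpg_diffusion(W, alpha=0.9, iters=200):
    """Iterate Q <- alpha * S Q S^T + (1 - alpha) * I on the original graph.
    W: symmetric non-negative affinity matrix; returns learned affinities Q."""
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))          # symmetric normalization
    Q = np.eye(len(W))
    for _ in range(iters):
        Q = alpha * (S @ Q @ S.T) + (1 - alpha) * np.eye(len(W))
    return Q
```

    Since the spectral norm of S is at most 1, each iteration is a contraction with factor α, so Q converges to the unique symmetric fixed point, which can then replace the original affinities in retrieval or NCut.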

  17. The ring of life hypothesis for eukaryote origins is supported by multiple kinds of data

    PubMed Central

    McInerney, James; Pisani, Davide; O'Connell, Mary J.

    2015-01-01

    The literature is replete with manuscripts describing the origin of eukaryotic cells. Most of the models for eukaryogenesis are either autogenous (sometimes called slow-drip), or symbiogenic (sometimes called big-bang). In this article, we use large and diverse suites of ‘Omics' and other data to make the inference that autogenous hypotheses are a very poor fit to the data and that the origin of eukaryotic cells occurred in a single symbiosis. PMID:26323755

  18. Political decision-making in health care: the Dutch case.

    PubMed

    Elsinga, E

    1989-01-01

    In many western countries health care is a subject of increasing importance on the political agenda. Issues such as aging, development of medical technologies, equity and efficiency of care, increasing costs, market elements, etc. are leading to a review of existing health care systems. In The Netherlands the government has proposed fundamental changes in the structure and financing of care, based on a report by the so-called Dekker Committee. The final result of a step-wise process of change should be the introduction of a new insurance scheme and the strengthening of market elements. After a short description of the government proposals, this article gives an analysis of the process of decision-making for a restructuring of health care in the Netherlands. The analysis is based on a bureaupolitical model, as originally described by Allison.

  19. The Perlman syndrome: familial renal dysplasia with Wilms tumor, fetal gigantism and multiple congenital anomalies. 1984.

    PubMed

    Neri, Giovanni; Martini-Neri, Maria Enrica; Katz, Ben E; Opitz, John M

    2013-11-01

    The ensuing paper by Professor Giovanni Neri and colleagues was originally published in 1984, American Journal of Medical Genetics 19:195–207. The original article described a new family with a condition that the authors designated as the Perlman syndrome. This disorder, while uncommon, is an important multiple congenital anomaly and dysplasia syndrome; the causative gene was recently identified. This paper is a seminal work and is graciously republished by Wiley-Blackwell in the Special Festschrift issue honoring Professor Neri. We describe a familial syndrome of renal dysplasia, Wilms tumor, hyperplasia of the endocrine pancreas, fetal gigantism, multiple congenital anomalies and mental retardation. This condition was previously described by Perlman et al. [1973, 1975] and we propose to call it the "Perlman syndrome." It appears to be transmitted as an autosomal recessive trait. The possible relationships between dysplasia, neoplasia and malformation are discussed. © 2013 Wiley Periodicals, Inc.

  20. Negro, Black, Black African, African Caribbean, African American or what? Labelling African origin populations in the health arena in the 21st century

    PubMed Central

    Agyemang, C.; Bhopal, R.; Bruijnzeels, M.

    2005-01-01

    Broad terms such as Black, African, or Black African are entrenched in scientific writing, although there is considerable diversity within African descent populations and such terms may be both offensive and inaccurate. This paper outlines the heterogeneity within African populations, and discusses the strengths and limitations of the term Black and related labels from epidemiological and public health perspectives in Europe and the USA. This paper calls for debate on appropriate terminologies for African descent populations and concludes with the proposals that (1) describing the population under consideration is of paramount importance; (2) the term African origin, or simply African, is an appropriate and necessary prefix for an ethnic label, for example, African Caribbean or African Kenyan or African Surinamese; (3) documents should define the ethnic labels used; (4) the label Black should be phased out except when used in political contexts. PMID:16286485

  1. Spontaneous Scalarization: Dead or Alive?

    NASA Astrophysics Data System (ADS)

    Berti, Emanuele; Crispino, Luis; Gerosa, Davide; Gualtieri, Leonardo; Horbatsch, Michael; Macedo, Caio; Okada da Silva, Hector; Pani, Paolo; Sotani, Hajime; Sperhake, Ulrich

    2015-04-01

    In 1993, Damour and Esposito-Farese showed that a wide class of scalar-tensor theories can pass weak-field gravitational tests and exhibit nonperturbative strong-field deviations away from General Relativity in systems involving neutron stars. These deviations are possible in the presence of ``spontaneous scalarization,'' a phase transition similar in nature to spontaneous magnetization in ferromagnets. More than twenty years after the original proposal, binary pulsar experiments have severely constrained the possibility of spontaneous scalarization occurring in nature. I will show that these experimental constraints have important implications for the torsional oscillation frequencies of neutron stars and for the so-called ``I-Love-Q'' relations in scalar-tensor theories. I will also argue that there is still hope to observe strong scalarization effects, despite the strong experimental bounds on the original mechanism. In particular, I will discuss two mechanisms that could produce strong scalarization in neutron stars: anisotropy and multiscalarization. This work was supported by NSF CAREER Award PHY-1055103.

  2. A Theoretical Framework for Calibration in Computer Models: Parametrization, Estimation and Convergence Properties

    DOE PAGES

    Tuo, Rui; Jeff Wu, C. F.

    2016-07-19

Calibration parameters in deterministic computer experiments are those attributes that cannot be measured or are unavailable in physical experiments. Here, we present an approach to estimate them by using data from physical experiments and computer simulations. A theoretical framework is given which allows us to study the issues of parameter identifiability and estimation. We define the L2-consistency for calibration as a justification for calibration methods. It is shown that a simplified version of the original KO method leads to asymptotically L2-inconsistent calibration. This L2-inconsistency can be remedied by modifying the original estimation procedure. A novel calibration method, called the L2 calibration, is proposed and proven to be L2-consistent and to enjoy an optimal convergence rate. Furthermore, a numerical example and some mathematical analysis are used to illustrate the source of the L2-inconsistency problem.
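    The L2-consistency criterion named in this abstract can be written compactly. The following is a sketch with assumed notation (ours, not quoted from the record): let ζ denote the physical response surface, f(·, θ) the computer model over input domain Ω, and θ̂ₙ the calibration estimate obtained from n physical runs. Calibration is L2-consistent when the estimate converges to the parameter value that minimizes the L2 distance between model and reality:

```latex
\theta^{*} \;=\; \operatorname*{arg\,min}_{\theta \in \Theta}
  \bigl\lVert \zeta(\cdot) - f(\cdot,\theta) \bigr\rVert_{L_{2}(\Omega)},
\qquad
\hat{\theta}_{n} \;\xrightarrow{\ P\ }\; \theta^{*}
\quad \text{as } n \to \infty .
```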

  3. Reply to comment by Fred L. Ogden et al. on "Beyond the SCS-CN method: A theoretical framework for spatially lumped rainfall-runoff response"

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2017-07-01

    Though Ogden et al. list several shortcomings of the original SCS-CN method, fit for purpose is a key consideration in hydrological modelling, as shown by the adoption of SCS-CN method in many design standards. The theoretical framework of Bartlett et al. [2016a] reveals a family of semidistributed models, of which the SCS-CN method is just one member. Other members include event-based versions of the Variable Infiltration Capacity (VIC) model and TOPMODEL. This general model allows us to move beyond the limitations of the original SCS-CN method under different rainfall-runoff mechanisms and distributions for soil and rainfall variability. Future research should link this general model approach to different hydrogeographic settings, in line with the call for action proposed by Ogden et al.

  4. Topological Anisotropy of Stone-Wales Waves in Graphenic Fragments

    PubMed Central

    Ori, Ottorino; Cataldo, Franco; Putz, Mihai V.

    2011-01-01

Stone-Wales operators interchange four adjacent hexagons with two pentagon-heptagon 5|7 pairs that, graphically, may be iteratively propagated in the graphene layer, giving rise to a new and interesting structural defect called here a Stone-Wales wave. Minimization of the Wiener index topological invariant evidences a marked anisotropy of the Stone-Wales defects, which, topologically, are in fact preferably generated and propagated along the diagonal of graphenic fragments, including carbon nanotubes and graphene nanoribbons. This peculiar edge-effect is shown in this paper to have a predominantly topological origin, leaving to future experimental investigations the task of verifying the occurrence in nature of wave-like defects similar to the ones proposed here. The graph-theoretical tools used in this paper for the generation and propagation of Stone-Wales defect waves are applicable to investigating isomeric modifications of chemical structures of various dimensionalities, such as fullerenes, nanotubes, graphenic layers, schwarzites, and zeolites. PMID:22174641

  5. An extinct vertebrate preserved by its living hybridogenetic descendant.

    PubMed

    Dubey, Sylvain; Dufresnes, Christophe

    2017-10-06

Hybridogenesis is a special mode of hybrid reproduction in which one parental genome is eliminated and the other is transmitted clonally. We propose that this mechanism can perpetuate the genome of extinct species, based on new genetic data from Pelophylax water frogs. We characterized the genetic makeup of Italian hybridogenetic hybrids (P. kl. hispanicus and esculentus) and identified a new endemic lineage of Eastern-Mediterranean origin as one parental ancestor of P. kl. hispanicus. This taxon is nowadays extinct in the wild, but its germline subsists through its hybridogenetic descendant, which can thus be considered a "semi-living fossil". Such a rare situation calls for realistic efforts of de-extinction through selective breeding without genetic engineering, and fuels the topical controversy of reviving long-extinct species. "Ghost" species hidden by taxa of hybrid origin may be more frequent than suspected in vertebrate groups that experienced a strong history of hybridization and semi-sexual reproduction.

  6. Digital Synchronizer without Metastability

    NASA Technical Reports Server (NTRS)

    Simle, Robert M.; Cavazos, Jose A.

    2009-01-01

    A proposed design for a digital synchronizing circuit would eliminate metastability that plagues flip-flop circuits in digital input/output interfaces. This metastability is associated with sampling, by use of flip-flops, of an external signal that is asynchronous with a clock signal that drives the flip-flops: it is a temporary flip-flop failure that can occur when a rising or falling edge of an asynchronous signal occurs during the setup and/or hold time of a flip-flop. The proposed design calls for (1) use of a clock frequency greater than the frequency of the asynchronous signal, (2) use of flip-flop asynchronous preset or clear signals for the asynchronous input, (3) use of a clock asynchronous recovery delay with pulse width discriminator, and (4) tying the data inputs to constant logic levels to obtain (5) two half-rate synchronous partial signals - one for the falling and one for the rising edge. Inasmuch as the flip-flop data inputs would be permanently tied to constant logic levels, setup and hold times would not be violated. The half-rate partial signals would be recombined to construct a signal that would replicate the original asynchronous signal at its original rate but would be synchronous with the clock signal.

  7. Absorbers in the Transactional Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Boisvert, Jean-Sébastien; Marchildon, Louis

    2013-03-01

    The transactional interpretation of quantum mechanics, following the time-symmetric formulation of electrodynamics, uses retarded and advanced solutions of the Schrödinger equation and its complex conjugate to understand quantum phenomena by means of transactions. A transaction occurs between an emitter and a specific absorber when the emitter has received advanced waves from all possible absorbers. Advanced causation always raises the specter of paradoxes, and it must be addressed carefully. In particular, different devices involving contingent absorbers or various types of interaction-free measurements have been proposed as threatening the original version of the transactional interpretation. These proposals will be analyzed by examining in each case the configuration of absorbers and, in the special case of the so-called quantum liar experiment, by carefully following the development of retarded and advanced waves through the Mach-Zehnder interferometer. We will show that there is no need to resort to the hierarchy of transactions that some have proposed, and will argue that the transactional interpretation is consistent with the block-universe picture of time.

  8. Quantum exhaustive key search with simplified-DES as a case study.

    PubMed

    Almazrooie, Mishal; Samsudin, Azman; Abdullah, Rosni; Mutter, Kussay N

    2016-01-01

To evaluate the security of a symmetric cryptosystem against any quantum attack, the symmetric algorithm must first be implemented on a quantum platform. In this study, a quantum implementation of a classical block cipher is presented. A quantum circuit for a classical block cipher, requiring a polynomial number of quantum gates, is proposed. The entire work has been tested on a quantum mechanics simulator called libquantum. First, the functionality of the proposed quantum cipher is verified and the experimental results are compared with those of the original classical version. Then, quantum attacks are conducted by using Grover's algorithm to recover the secret key. The proposed quantum cipher is used as a black box for the quantum search. The quantum oracle is then queried over the produced ciphertext to mark the quantum state, which consists of plaintext and key qubits. The experimental results show that for a key of n-bit size and key space of N such that [Formula: see text], the key can be recovered in [Formula: see text] computational steps.
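    The square-root speedup behind this attack can be made concrete without a quantum simulator. The sketch below is not the authors' libquantum circuit; it only evaluates the standard Grover iteration count, floor((π/4)·√N), for an n-bit key space of N = 2ⁿ candidates:

```python
import math

def grover_iterations(key_bits):
    """Optimal number of Grover iterations to find one marked key
    among N = 2**key_bits candidates: floor((pi / 4) * sqrt(N))."""
    return math.floor((math.pi / 4) * math.sqrt(2 ** key_bits))

# Simplified-DES uses a 10-bit key: about 25 oracle queries suffice,
# versus ~512 trials (N/2) expected for a classical brute-force search.
print(grover_iterations(10))  # → 25
```

    For a full 56-bit DES key the same formula gives roughly 2.1 × 10⁸ iterations, illustrating why the quadratic speedup matters at realistic key sizes.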

  9. Understanding the Physical Nature of Coronal "EIT Waves"

    NASA Astrophysics Data System (ADS)

    Long, D. M.; Bloomfield, D. S.; Chen, P.-F.; Downs, C.; Gallagher, P. T.; Kwon, R.-Y.; Vanninathan, K.; Veronig, A.; Vourlidas, A.; Vrsnak, B.; Warmuth, A.; Zic, T.

    2016-10-01

For almost 20 years the physical nature of globally-propagating waves in the solar corona (commonly called "EIT waves") has been controversial and subject to debate. Additional theories have been proposed throughout the years to explain observations that did not fit with the originally proposed fast-mode wave interpretation. However, the incompatibility of observations made using the Extreme-ultraviolet Imaging Telescope (EIT) on the Solar and Heliospheric Observatory with the fast-mode wave interpretation has been challenged by differing viewpoints from the Solar Terrestrial Relations Observatory spacecraft and higher spatial/temporal resolution data from the Solar Dynamics Observatory. In this paper, we reexamine the theories proposed to explain "EIT waves" to identify measurable properties and behaviours that can be compared to current and future observations. Most of us conclude that "EIT waves" are best described as fast-mode large-amplitude waves/shocks, which are initially driven by the impulsive expansion of an erupting coronal mass ejection in the low corona.

  10. A Novel Hybrid Firefly Algorithm for Global Optimization.

    PubMed

    Zhang, Lina; Liu, Liqiang; Yang, Xin-She; Dai, Yuntao

    Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate.
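    The FA-plus-DE hybridization described above can be sketched in a few lines. This is a schematic toy with our own parameter choices and a simple sphere objective, not the paper's HFA implementation: it combines the FA attraction move with a DE rand/1 mutation under greedy selection, applied to one shared population.

```python
import math, random

def sphere(x):
    """Simple unimodal test function: f(x) = sum(x_i^2), minimum at 0."""
    return sum(v * v for v in x)

def firefly_move(xi, xj, rng, beta0=1.0, gamma=1.0, alpha=0.2):
    """FA attraction step: move firefly xi toward brighter firefly xj."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(xi, xj)]

def de_mutant(x1, x2, x3, f=0.5):
    """DE/rand/1 mutation: x1 + F * (x2 - x3)."""
    return [a + f * (b - c) for a, b, c in zip(x1, x2, x3)]

rng = random.Random(0)
pop = [[rng.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
init_cost = min(sphere(x) for x in pop)
for _ in range(100):
    pop.sort(key=sphere)  # brightest (lowest cost) first
    # FA step: keep the best firefly, attract the rest toward it
    pop = [pop[0]] + [firefly_move(x, pop[0], rng) for x in pop[1:]]
    # DE step: greedy selection keeps a mutant only if it improves
    for i in range(len(pop)):
        m = de_mutant(*rng.sample(pop, 3))
        if sphere(m) < sphere(pop[i]):
            pop[i] = m
best = min(pop, key=sphere)
```

    Because the best individual is carried over unchanged and DE only accepts improvements, the best cost is monotonically non-increasing, which is the information-sharing benefit the abstract attributes to running FA and DE on one population.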

  11. A Novel Hybrid Firefly Algorithm for Global Optimization

    PubMed Central

    Zhang, Lina; Liu, Liqiang; Yang, Xin-She; Dai, Yuntao

    2016-01-01

    Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate. PMID:27685869

  12. Incremental Query Rewriting with Resolution

    NASA Astrophysics Data System (ADS)

    Riazanov, Alexandre; Aragão, Marcelo A. T.

    We address the problem of semantic querying of relational databases (RDB) modulo knowledge bases using very expressive knowledge representation formalisms, such as full first-order logic or its various fragments. We propose to use a resolution-based first-order logic (FOL) reasoner for computing schematic answers to deductive queries, with the subsequent translation of these schematic answers to SQL queries which are evaluated using a conventional relational DBMS. We call our method incremental query rewriting, because an original semantic query is rewritten into a (potentially infinite) series of SQL queries. In this chapter, we outline the main idea of our technique - using abstractions of databases and constrained clauses for deriving schematic answers, and provide completeness and soundness proofs to justify the applicability of this technique to the case of resolution for FOL without equality. The proposed method can be directly used with regular RDBs, including legacy databases. Moreover, we propose it as a potential basis for an efficient Web-scale semantic search technology.

  13. Color image segmentation with support vector machines: applications to road signs detection.

    PubMed

    Cyganek, Bogusław

    2008-08-01

In this paper we propose an efficient color segmentation method based on the Support Vector Machine classifier operating in a one-class mode. The method has been developed especially for the road signs recognition system, although it can be used in other applications. The main advantage of the proposed method comes from the fact that the segmentation of characteristic colors is performed not in the original but in a higher-dimensional feature space. In this way, better data encapsulation with a linear hypersphere can usually be achieved. Moreover, the classifier does not try to capture the whole distribution of the input data, which is often difficult to achieve. Instead, characteristic data samples, called support vectors, are selected which allow construction of the tightest hypersphere that encloses the majority of the input data. Classification of test data then simply consists of measuring its distance to the centre of the found hypersphere. The experimental results show the high accuracy and speed of the proposed method.
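    The distance-to-hypersphere-centre decision rule is easy to illustrate. The toy below works directly in RGB space with a centroid and a quantile radius; it is a deliberate simplification of the paper's method, which builds the hypersphere in a kernel-induced feature space from support vectors:

```python
import math

class HypersphereClassifier:
    """Toy one-class classifier: accept a pixel if its distance to the
    training centroid is within a radius covering `keep` of the training
    data. (Plain-space stand-in for the paper's kernel one-class SVM.)"""

    def fit(self, points, keep=0.95):
        n, dim = len(points), len(points[0])
        self.center = [sum(p[i] for p in points) / n for i in range(dim)]
        dists = sorted(math.dist(p, self.center) for p in points)
        self.radius = dists[min(n - 1, int(keep * n))]
        return self

    def predict(self, p):
        return math.dist(p, self.center) <= self.radius

# Toy "red road sign" pixels in (R, G, B); a green outlier is rejected.
reds = [(200 + i % 20, 30 + i % 10, 30 + i % 15) for i in range(50)]
clf = HypersphereClassifier().fit(reds)
print(clf.predict((210, 35, 35)), clf.predict((40, 180, 40)))  # → True False
```

    The quantile radius mirrors the paper's point that the tightest enclosing hypersphere should cover the majority, not the entirety, of the training data, so stray outliers in the training set do not inflate the boundary.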

  14. Reinterpreting features of the advertisement call of Dermatonotus muelleri (Boettger, 1885; Anura, Microhylidae).

    PubMed

    Giaretta, Ariovaldo Antonio; Vo, Pacific; Herche, Jesse; Tang, Justine Nicole; Gridi-Papp, Marcos

    2015-06-13

The advertisement call of Dermatonotus muelleri was originally described by Nelson (1973) in a brief section of a review on the mating calls of the Microhylinae. He used two calls from São Leopoldo, state of Minas Gerais, in Brazil to determine that they have i) dominant frequency between 1.500-2.200 kHz (mean 1.854 ± 0.216 kHz), and ii) harmonic intervals between 0.140 and 0.150 kHz (0.146 ± 0.005 kHz). Nelson (1973) based his description on an audiospectrogram produced with high frequency resolution and did not quantify the pulse structure of the calls. More recently, Giaretta and colleagues (2013) expanded on the original description using a larger set of calls recorded from Gurinhat, state of Minas Gerais, in Brazil. They quantified the temporal structure of the call and confirmed that the dominant frequency is around 1.8 kHz. In addition, they identified a secondary low frequency band at 667 Hz.

  15. NASA Institute for Advanced Concepts

    NASA Technical Reports Server (NTRS)

    Cassanova, Robert A.

    1999-01-01

The purpose of NASA Institute for Advanced Concepts (NIAC) is to provide an independent, open forum for the external analysis and definition of space and aeronautics advanced concepts to complement the advanced concepts activities conducted within the NASA Enterprises. The NIAC will issue Calls for Proposals during each year of operation and will select revolutionary advanced concepts for grant or contract awards through a peer review process. Final selection of awards will be with the concurrence of NASA's Chief Technologist. The operation of the NIAC is reviewed biannually by the NIAC Science, Exploration and Technology Council (NSETC) whose members are drawn from the senior levels of industry and universities. The process of defining the technical scope of the initial Call for Proposals was begun with the NIAC "Grand Challenges" workshop conducted on May 21-22, 1998 in Columbia, Maryland. These "Grand Challenges" resulting from this workshop became the essence of the technical scope for the first Phase I Call for Proposals which was released on June 19, 1998 with a due date of July 31, 1998. The first Phase I Call for Proposals attracted 119 proposals. After a thorough peer review, prioritization by NIAC and technical concurrence by NASA, sixteen subgrants were awarded. The second Phase I Call for Proposals was released on November 23, 1998 with a due date of January 31, 1999. Sixty-three (63) proposals were received in response to this Call. On December 2-3, 1998, the NSETC met to review the progress and future plans of the NIAC. The next NSETC meeting is scheduled for August 5-6, 1999. The first Phase II Call for Proposals was released to the current Phase I grantees on February 3, 1999 with a due date of May 31, 1999. Plans for the second year of the contract include a continuation of the sequence of Phase I and Phase II Calls for Proposals and hosting the first NIAC Annual Meeting and USRA/NIAC Technical Symposium in NASA HQ.

  16. Divergence, convergence, and the ancestry of feral populations in the domestic rock pigeon.

    PubMed

    Stringham, Sydney A; Mulroy, Elisabeth E; Xing, Jinchuan; Record, David; Guernsey, Michael W; Aldenhoven, Jaclyn T; Osborne, Edward J; Shapiro, Michael D

    2012-02-21

    Domestic pigeons are spectacularly diverse and exhibit variation in more traits than any other bird species [1]. In The Origin of Species, Charles Darwin repeatedly calls attention to the striking variation among domestic pigeon breeds-generated by thousands of years of artificial selection on a single species by human breeders-as a model for the process of natural divergence among wild populations and species [2]. Darwin proposed a morphology-based classification of domestic pigeon breeds [3], but the relationships among major groups of breeds and their geographic origins remain poorly understood [4, 5]. We used a large, geographically diverse sample of 361 individuals from 70 domestic pigeon breeds and two free-living populations to determine genetic relationships within this species. We found unexpected relationships among phenotypically divergent breeds as well as convergent evolution of derived traits among several breed groups. Our findings also illuminate the geographic origins of breed groups in India and the Middle East and suggest that racing breeds have made substantial contributions to feral pigeon populations. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. An ECG signals compression method and its validation using NNs.

    PubMed

    Fira, Catalina Monica; Goras, Liviu

    2008-04-01

This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. Aspects regarding the possibility of using principal component analysis (PCA) for cardiac pattern classification have been investigated as well. A new compression measure called "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
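    The abstract says the quality score combines reconstruction error and compression ratio but does not give the formula. One plausible illustrative form (our assumption, not quoted from the paper) divides the compression ratio by the percentage RMS difference (PRD), a standard ECG distortion measure, so that a higher score means more compression and/or less error:

```python
import math

def prd(original, reconstructed):
    """Percentage root-mean-square difference between two signals,
    a standard ECG reconstruction-error measure."""
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)

def quality_score(original, reconstructed, compression_ratio):
    """Hypothetical combined measure: compression ratio divided by
    reconstruction error (higher is better)."""
    return compression_ratio / prd(original, reconstructed)

sig = [math.sin(0.1 * i) for i in range(200)]  # toy stand-in for an ECG trace
rec = [s + 0.01 for s in sig]                  # reconstruction with a small offset
print(quality_score(sig, rec, compression_ratio=10.0))
```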

  18. Interaction sorting method for molecular dynamics on multi-core SIMD CPU architecture.

    PubMed

    Matvienko, Sergey; Alemasov, Nikolay; Fomin, Eduard

    2015-02-01

    Molecular dynamics (MD) is widely used in computational biology for studying binding mechanisms of molecules, molecular transport, conformational transitions, protein folding, etc. The method is computationally expensive; thus, the demand for the development of novel, much more efficient algorithms is still high. Therefore, the new algorithm designed in 2007 and called interaction sorting (IS) clearly attracted interest, as it outperformed the most efficient MD algorithms. In this work, a new IS modification is proposed which allows the algorithm to utilize SIMD processor instructions. This paper shows that the improvement provides an additional gain in performance, 9% to 45% in comparison to the original IS method.

  19. Between Peirce (1878) and James (1898): G. Stanley Hall, the origins of pragmatism, and the history of psychology.

    PubMed

    Leary, David E

    2009-01-01

    This article focuses on the 20-year gap between Charles S. Peirce's classic proposal of pragmatism in 1877-1878 and William James's equally classic call for pragmatism in 1898. It fills the gap by reviewing relevant developments in the work of Peirce and James and by introducing G. Stanley Hall, for the first time, as a figure in the history of pragmatism. In treating Hall and pragmatism, the article reveals a previously unnoted relation between the early history of pragmatism and the early history of the "new psychology" that Hall helped to pioneer. (c) 2009 Wiley Periodicals, Inc.

  20. Posterior double PCL sign: a case report of unusual MRI finding of bucket-handle tear of medial meniscus.

    PubMed

    Yoo, Jae Ho; Hahn, Sung Ho; Yi, Seung Rim; Kim, Seong Wan

    2007-11-01

    Among the MRI signs of bucket-handle tears of medial meniscus, double posterior cruciate ligament (PCL) sign denotes a low signal band anterior and parallel to the PCL, which looks like another PCL in MR images. If the bucket-handle fragment subsequently tears at the anterior horn, the torn meniscal substance can be displaced to the posterosuperior region of the PCL, and looks like another PCL behind the original PCL. We propose the lesion be called the "posterior double PCL sign" in contrast to the ordinary double PCL sign. We present a case showing the posterior double PCL sign.

  1. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. 
This work presents an automated genotyping tool for DNA gel electrophoresis images, called GELect, which was written in Java and made available through the ImageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix and intelligently extract distorted and even doublet bands that are difficult to identify with existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically, allowing users to efficiently conduct large-scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.

  2. A novel hybrid decomposition-and-ensemble model based on CEEMD and GWO for short-term PM2.5 concentration forecasting

    NASA Astrophysics Data System (ADS)

    Niu, Mingfei; Wang, Yufang; Sun, Shaolong; Li, Yongwu

    2016-06-01

    To enhance prediction reliability and accuracy, a hybrid model based on the promising principle of "decomposition and ensemble" and a recently proposed meta-heuristic called grey wolf optimizer (GWO) is introduced for daily PM2.5 concentration forecasting. Compared with existing PM2.5 forecasting methods, this proposed model has improved the prediction accuracy and hit rates of directional prediction. The proposed model involves three main steps, i.e., decomposing the original PM2.5 series into several intrinsic mode functions (IMFs) via complementary ensemble empirical mode decomposition (CEEMD) for simplifying the complex data; individually predicting each IMF with support vector regression (SVR) optimized by GWO; integrating all predicted IMFs for the ensemble result as the final prediction by another SVR optimized by GWO. Seven benchmark models, including single artificial intelligence (AI) models, other decomposition-ensemble models with different decomposition methods and models with the same decomposition-ensemble method but optimized by different algorithms, are considered to verify the superiority of the proposed hybrid model. The empirical study indicates that the proposed hybrid decomposition-ensemble model is remarkably superior to all considered benchmark models for its higher prediction accuracy and hit rates of directional prediction.
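    The "decomposition and ensemble" principle itself, decompose the series, forecast each component separately, then sum the component forecasts, can be shown with deliberately simple stand-ins. The sketch below substitutes a moving-average trend/residual split for CEEMD and naive forecasters for the GWO-tuned SVRs; it illustrates the workflow, not the paper's model:

```python
def moving_average(series, window=5):
    """Crude trend extraction (stand-in for CEEMD's slow IMFs)."""
    half = window // 2
    out = []
    for i in range(len(series)):
        seg = series[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def extrapolate(component):
    """Linear extrapolation of the last two points (stand-in for a
    tuned regressor forecasting the smooth trend component)."""
    return 2 * component[-1] - component[-2]

def persist(component):
    """Persistence forecast (stand-in for a regressor on the residual)."""
    return component[-1]

def decompose_and_ensemble(series):
    trend = moving_average(series)                    # "decomposition"
    residual = [s - t for s, t in zip(series, trend)]
    return extrapolate(trend) + persist(residual)     # "ensemble": sum forecasts

pm25 = [35, 38, 36, 40, 42, 41, 45, 44, 47, 46]      # toy daily PM2.5 values
print(decompose_and_ensemble(pm25))                  # next-day forecast
```

    The design point carried over from the paper is that each component gets its own forecaster suited to its character, and the final prediction is the sum of the component predictions.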

  3. Poisson-Box Sampling algorithms for three-dimensional Markov binary mixtures

    NASA Astrophysics Data System (ADS)

    Larmier, Coline; Zoia, Andrea; Malvagi, Fausto; Dumonteil, Eric; Mazzolo, Alain

    2018-02-01

    Particle transport in Markov mixtures can be addressed by the so-called Chord Length Sampling (CLS) methods, a family of Monte Carlo algorithms taking into account the effects of stochastic media on particle propagation by generating on-the-fly the material interfaces crossed by the random walkers during their trajectories. Such methods enable a significant reduction of computational resources as opposed to reference solutions obtained by solving the Boltzmann equation for a large number of realizations of random media. CLS solutions, which neglect correlations induced by the spatial disorder, are faster albeit approximate, and might thus show discrepancies with respect to reference solutions. In this work we propose a new family of algorithms (called 'Poisson Box Sampling', PBS) aimed at improving the accuracy of the CLS approach for transport in d-dimensional binary Markov mixtures. In order to probe the features of PBS methods, we will focus on three-dimensional Markov media and revisit the benchmark problem originally proposed by Adams, Larsen and Pomraning [1] and extended by Brantley [2]: for these configurations we will compare reference solutions, standard CLS solutions and the new PBS solutions for scalar particle flux, transmission and reflection coefficients. PBS will be shown to perform better than CLS at the expense of a reasonable increase in computational time.
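    The core CLS idea, generating material interfaces on the fly rather than sampling whole realizations of the random medium, can be shown in one dimension. This is an illustrative sketch only (the paper's CLS and PBS methods are d-dimensional and coupled to a transport solver): chord lengths are drawn from exponential distributions with material-specific means, and the two materials of the binary mixture alternate along the ray:

```python
import random

def sample_chords(mean_chords, track_length, rng):
    """Generate material segments along a 1-D ray on the fly: each chord
    is drawn from an exponential law with the current material's mean
    chord length; the binary mixture's materials alternate."""
    segments, pos, mat = [], 0.0, 0
    while pos < track_length:
        chord = rng.expovariate(1.0 / mean_chords[mat])
        end = min(pos + chord, track_length)
        segments.append((mat, end - pos))
        pos, mat = end, 1 - mat
    return segments

rng = random.Random(42)
segs = sample_chords(mean_chords=[1.0, 0.5], track_length=100.0, rng=rng)
frac0 = sum(length for mat, length in segs if mat == 0) / 100.0
# On average the volume fraction of material 0 tends to 1.0 / (1.0 + 0.5).
print(round(frac0, 2))
```

    A random walker advanced through `segs` sees a statistically faithful sequence of interfaces without any stored geometry, which is the memory and time saving the CLS family trades against the neglected spatial correlations.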

  4. Liveness-enforcing supervisors synthesis for a class of generalised Petri nets based on two-stage deadlock control and mathematical programming

    NASA Astrophysics Data System (ADS)

    Zhao, Mi; Hou, Yifan; Liu, Ding

    2010-10-01

In this article we deal with deadlock prevention problems for S4PR, a class of generalised Petri nets that can model a large class of flexible manufacturing systems in which deadlocks are caused by insufficiently marked siphons. We present a deadlock prevention methodology that is an iterative approach consisting of two stages. The first one is called siphon control, which adds, for each insufficiently marked minimal siphon, a control place to the original net. Its objective is to prevent a minimal siphon from being insufficiently marked. The second one, called control-induced siphon control, adds a control place to the augmented net with its output arcs connecting to the source transitions, which ensures that no new insufficiently marked siphons are generated. At each iteration, a mixed integer programming approach is adopted for generalised Petri nets to obtain an insufficiently marked minimal siphon from the maximal deadly siphon. In this way, complete siphon enumeration, which is much more time-consuming than the proposed method for a sizeable plant model, is avoided. The relation between the proposed method and the liveness and reversibility of the controlled net is obtained. Examples are presented to demonstrate the presented method.

  5. 76 FR 16391 - Call for Innovative National Environmental Policy Act (NEPA) Pilot Project Proposals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... COUNCIL ON ENVIRONMENTAL QUALITY Call for Innovative National Environmental Policy Act (NEPA) Pilot Project Proposals AGENCY: Council On Environmental Quality. ACTION: Notice of Availability, Call... the Council on Environmental Quality (CEQ) invites the public and federal agencies to nominate...

  6. 77 FR 56710 - Proposed Information Collection (Call Center Satisfaction Survey): Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0744] Proposed Information Collection (Call Center Satisfaction Survey): Comment Request AGENCY: Veterans Benefits Administration, Department of... techniques or the use of other forms of information technology. Title: VBA Call Center Satisfaction Survey...

  7. Splitting of the weak hypercharge quantum

    NASA Astrophysics Data System (ADS)

    Nielsen, H. B.; Brene, N.

    1991-08-01

    The ratio between the weak hypercharge quantum for particles having no coupling to the gauge bosons corresponding to the semi-simple component of the gauge group and the smallest hypercharge quantum for particles that do have such couplings is exceptionally large for the standard model, considering its rank. To compare groups with respect to this property we propose a quantity χ which depends on the rank of the group and the splitting ratio of the hypercharge(s) to be found in the group. The quantity χ has maximal value for the gauge group of the standard model. This suggests that the hypercharge splitting may play an important rôle either in the origin of the gauge symmetry at a fundamental scale or in some kind of selection mechanism at a scale perhaps nearer to the experimental scale. Such a selection mechanism might be what we have called confusion which removes groups with many (so-called generalized) automorphisms. The quantity χ tends to be large for groups with few generalized automorphisms.

  8. A Model for Membrane Fusion

    NASA Astrophysics Data System (ADS)

    Ngatchou, Annita

    2010-01-01

Pheochromocytoma is a tumor of the adrenal gland that originates from chromaffin cells and is characterized by the secretion of excessive amounts of neurotransmitter, which leads to high blood pressure and palpitations. Pheochromocytomas contain membrane-bound granules that store neurotransmitter. The release of these stored molecules into the extracellular space occurs by fusion of the granule membrane with the cell plasma membrane, a process called exocytosis. The molecular mechanism of this membrane fusion is not well understood. It is proposed that the so-called SNARE proteins [1] are the pillar of vesicle fusion, as their cleavage by clostridial toxins, notably botulinum neurotoxin and tetanus toxin, abrogates the secretion of neurotransmitter [2]. Here, I describe how physical principles are applied to a biological cell to explore the role of the vesicle SNARE protein synaptobrevin-2 in facilitating granule fusion. The data presented here suggest a paradigm in which the movement of the C-terminus of synaptobrevin-2 disrupts the lipid bilayer to form a fusion pore through which molecules can exit.

  9. OTHER: A multidisciplinary approach to the search for other inhabited worlds

    NASA Astrophysics Data System (ADS)

    Funes, J.; Lares, M.; De los Rios, M.; Martiarena, M.; Ahumada, A. V.

    2017-10-01

We present project OTHER (Otros mundos, tierra, humanidad y espacio remoto: other worlds, Earth, humanity, and remote space), a multidisciplinary laboratory of ideas that addresses questions related to the scientific search for extraterrestrial intelligent life, such as: What is life? How did it originate? What criteria might we adopt to identify what we would call an extraterrestrial civilization? As a starting point, we consider the Drake equation, which offers a platform from which to address these questions in a multidisciplinary approach. As part of project OTHER, we propose to develop and explain the last two parameters of the Drake equation, which we call the cultural factors: the fraction of intelligent civilizations that want or seek to communicate, f_c, and their average lifetime, L. The innovation of project OTHER is its multidisciplinary approach in the context of the Argentine community. Our goal is to provide new ideas that could offer new perspectives on the old question: Are we alone?
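The Drake equation the project builds on is N = R* · f_p · n_e · f_l · f_i · f_c · L. A one-line sketch follows; the parameter values in the test are purely illustrative, not estimates endorsed by the project.

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: expected number N of communicating civilizations.
    r_star: star formation rate; f_p: fraction of stars with planets;
    n_e: habitable planets per such system; f_l, f_i: fractions of those
    developing life and intelligence; f_c: fraction that seek to
    communicate (a 'cultural factor'); lifetime: average communicating
    lifetime L in years (the other 'cultural factor')."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime
```

The two trailing factors, f_c and L, are the ones the project singles out as cultural rather than astrophysical or biological.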

  10. Presolar Grains as Tracers of Nebular Processes

    NASA Technical Reports Server (NTRS)

    Huss, Gary R.

    2001-01-01

    This grant provided two years of funding to investigate the abundances of presolar diamond, SiC, and graphite in primitive chondritic meteorites. The original proposal was for a three-year study, but two years of funding were awarded. The proposed work plan for the first year included preparation of acid residues for two meteorites and noble-gas measurements on those residues and residues of two other meteorites that had been previously prepared. The meteorites to be measured were Acfer 003, Adrar 214, RC075, and Axtell. In the second year, the plan called for measuring Renazzo and Murchison, and beginning chemical processing on another set of meteorites, including Murray, which were to be measured in the third year. All of the meteorites listed above have been measured and the results were presented in three abstracts. The project is continuing under follow-on grants and one of two planned major papers is almost ready for submission.

  11. A Survey and Proposed Framework on the Soft Biometrics Technique for Human Identification in Intelligent Video Surveillance System

    PubMed Central

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometrics techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to collect traditional biometric data from a distance and without the subject's cooperation. Soft biometrics, although less accurate, can be collected much more freely. Recently, many studies have been conducted on human identification using soft biometrics data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing. PMID:22919273

  12. Complexity multiscale asynchrony measure and behavior for interacting financial dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Ge; Wang, Jun; Niu, Hongli

    2016-08-01

A stochastic financial price process is proposed and investigated by the finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The virus spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and empirical research on the descriptive statistics and autocorrelation behaviors of the return time series is performed for different values of the propagation rates. Then the multiscale entropy analysis is adopted to study several differently shuffled return series: the original return series, the corresponding reversal series, the randomly shuffled series, the volatility shuffled series and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modified algorithm, called composite multiscale cross-sample entropy. We apply them to study the asynchrony of pairs of time series at different time scales.
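The coarse-graining and sample-entropy steps underlying multiscale entropy can be sketched as follows. This is a simplified template-counting version for illustration; the cross-sample and composite variants, and the paper's shuffling procedures, are not reproduced here.

```python
import math
import statistics

def _match_count(x, m, r):
    """Number of template pairs of length m whose pointwise distance stays within r."""
    n = len(x) - m
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                count += 1
    return count

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -ln of the conditional probability that sequences
    matching for m points also match for m + 1 points."""
    if r is None:
        r = 0.2 * statistics.pstdev(x)
    b = _match_count(x, m, r)
    a = _match_count(x, m + 1, r)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

def coarse_grain(x, scale):
    """Non-overlapping window averages used by the multiscale procedure."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def multiscale_entropy(x, scales=(1, 2, 3), m=2):
    r = 0.2 * statistics.pstdev(x)   # tolerance fixed from the original series
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

A strictly periodic series yields near-zero sample entropy, while irregular return series yield larger values; the multiscale curve is what distinguishes the differently shuffled series in the abstract.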

  13. Culture or anonymity? Differences in proposer behaviour in Korea and Germany.

    PubMed

    Horak, Sven

    2015-10-01

This study explores proposer behaviour in an ultimatum game (UG) frame under anonymous and non-anonymous conditions among a Korean and a German subject pool (n = 590) in comparison. Whereas the anonymous condition is represented by the standard UG, the non-anonymous condition integrates an aggregate of the Korean cultural context variables university affiliation, regional origin and seniority. The latter, a classic Confucian context variable, is measured by age differentials. The former two are impactful components of so-called Yongo networks, a unique Korean informal institution comparable to Chinese Guanxi ties. Yongo networks, as yet underrepresented in research, are said to be a central context variable for explaining Korean social ties and decision-making behaviour. We observe significant differences between the offer behaviour of Korean and German subjects when the selected cultural variables are exposed. We argue that the behavioural differences observed are in fact due to culture, not anonymity. © 2015 International Union of Psychological Science.

  14. Artificial immune system via Euclidean Distance Minimization for anomaly detection in bearings

    NASA Astrophysics Data System (ADS)

    Montechiesi, L.; Cocconcelli, M.; Rubini, R.

    2016-08-01

In recent years new diagnostic methodologies have emerged, with particular interest in machinery operating in non-stationary conditions. Continuous speed changes and variable loads make spectrum analysis non-trivial: a variable speed means a variable characteristic fault frequency, so the damage is no longer recognizable in the spectrum. To overcome this problem the scientific community has proposed different approaches, grouped into two main categories: model-based approaches and expert systems. In this context the paper presents a simple expert system derived from the mechanisms of the immune system, called Euclidean Distance Minimization, and its application to a real case of bearing fault recognition. The proposed method is a simplification of the original process, adapted from the class of Artificial Immune Systems, which has proved useful and promising in different application fields. Comparative results are provided, with a complete explanation of the algorithm and its functioning.
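The core of a distance-based immune-inspired classifier of this kind can be sketched in a few lines. The feature vectors, threshold, and function names below are illustrative assumptions, not the authors' implementation:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_anomaly(sample, detectors, threshold):
    """Flag `sample` as anomalous when its distance to the nearest
    detector (a feature vector recorded in healthy conditions) exceeds
    the threshold -- the 'self/non-self' test of immune-inspired methods."""
    return min(euclidean(sample, d) for d in detectors) > threshold
```

Because the detector set is just a collection of healthy-condition feature vectors, the method needs no characteristic fault frequency, which is what makes it attractive under variable speed.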

  15. Analytical estimation of the correlation dimension of integer lattices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacasa, Lucas, E-mail: l.lacasa@qmul.ac.uk; Gómez-Gardeñes, Jesús, E-mail: gardenes@gmail.com; Departamento de Fisica de la Materia Condensada, Universidad de Zaragoza, Zaragoza

    2014-12-01

Recently [L. Lacasa and J. Gómez-Gardeñes, Phys. Rev. Lett. 110, 168703 (2013)], a fractal dimension has been proposed to characterize the geometric structure of networks. This measure is an extension to graphs of the so-called correlation dimension, originally proposed by Grassberger and Procaccia to describe the geometry of strange attractors in dissipative chaotic systems. The calculation of the correlation dimension of a graph is based on the local information retrieved from a random walker navigating the network. In this contribution, we study this quantity for some limiting synthetic spatial networks and obtain analytical results in agreement with the previously reported numerics. In particular, we show that, to first order, the correlation dimension β of integer lattices ℤ^d coincides with the Hausdorff dimension of their coarsely equivalent Euclidean spaces, β = d.
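The original Grassberger–Procaccia estimate fits the scaling of the correlation integral, C(r) ~ r^β. A point-cloud sketch of that estimate is given below; the graph/random-walker version studied in the paper is not reproduced, and the radii and sample set are illustrative.

```python
import math

def correlation_integral(points, r):
    """Fraction of point pairs closer than r (Grassberger-Procaccia C(r))."""
    n = len(points)
    close = 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) <= r:
                close += 1
    return close / (n * (n - 1) / 2)

def correlation_dimension(points, r1, r2):
    """Two-point slope of log C(r) versus log r, so that C(r) ~ r**beta."""
    c1 = correlation_integral(points, r1)
    c2 = correlation_integral(points, r2)
    return math.log(c2 / c1) / math.log(r2 / r1)
```

Applied to points sampled uniformly along a line segment embedded in the plane, the estimated dimension comes out close to 1, mirroring the paper's result that β recovers the dimension of the underlying space.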

  16. Penalty dynamic programming algorithm for dim targets detection in sensor systems.

    PubMed

    Huang, Dayu; Xue, Anke; Guo, Yunfei

    2012-01-01

In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performance of the tracking techniques is used as feedback to the detection part. The feedback is constructed as a penalty term in the merit function; the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and can simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint that a sensor measurement can originate from only one target or from clutter is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with an unknown number of targets. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations.
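The idea of folding a tracking penalty into the DP merit function can be sketched on a 1-D cell grid. The recursion below, with a simple |x - x'| cell-change penalty standing in for the state-prediction term, is an illustrative simplification of PDP-TBD, not the authors' exact formulation:

```python
def dp_tbd(frames, v_max=1, penalty=0.1):
    """Track-before-detect by dynamic programming over amplitude frames.
    merit[t][x] = frames[t][x] + max over reachable x' of
                  (merit[t-1][x'] - penalty * |x - x'|),
    so energy is integrated along feasible trajectories while cell
    changes are penalized.  Returns (best final merit, backtracked track)."""
    n = len(frames[0])
    merit = list(frames[0])
    back = []                                   # argmax predecessors per frame
    for frame in frames[1:]:
        prev = [0] * n
        new = [0.0] * n
        for x in range(n):
            lo, hi = max(0, x - v_max), min(n - 1, x + v_max)
            best = max(range(lo, hi + 1),
                       key=lambda xp: merit[xp] - penalty * abs(x - xp))
            prev[x] = best
            new[x] = frame[x] + merit[best] - penalty * abs(x - best)
        back.append(prev)
        merit = new
    # backtrack the highest-merit trajectory
    x = max(range(n), key=lambda i: merit[i])
    track = [x]
    for prev_row in reversed(back):
        x = prev_row[x]
        track.append(x)
    return max(merit), track[::-1]
```

On noise-free frames with a unit-amplitude target drifting one cell per frame, the recursion integrates the target energy minus the accumulated penalty and the backtrack recovers the true trajectory.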

  17. A survey and proposed framework on the soft biometrics technique for human identification in intelligent video surveillance system.

    PubMed

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

    Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometrics techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate. But it is very difficult to get traditional biometric data from a distance and without personal cooperation. Soft biometrics, although featuring less accuracy, can be used much more freely though. Recently, many researchers have been made on human identification using soft biometrics data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing.

  18. SPIRITS: SPitzer InfraRed Intensive Transients Survey

    NASA Astrophysics Data System (ADS)

    Kasliwal, Mansi; Lau, Ryan; Cao, Yi; Masci, Frank; Helou, George; Williams, Robert; Bally, John; Bond, Howard; Whitelock, Patricia; Cody, Ann Marie; Gehrz, Robert; Jencson, Jacob; Tinyanont, Samaporn; Smith, Nathan; Surace, Jason; Armus, Lee; Cantiello, Matteo; Langer, Norbert; Levesque, Emily; Mohamed, Shazrene; Ofek, Eran; Parthasarathy, Mudumba; van Dyk, Schuyler; Boyer, Martha; Phillips, Mark; Hsiao, Eric; Morrell, Nidia; Perley, Dan; Gonzalez, Consuelo; Contreras, Carlos; Jones, Olivia; Ressler, Michael; Adams, Scott; Moore, Anna; Cook, David; Fox, Ori; Johansson, Joel; Khan, Rubab; Monson, Andy

    2016-08-01

Spitzer is pioneering a systematic exploration of the dynamic infrared sky. Our SPitzer InfraRed Intensive Transients Survey (SPIRITS) has already discovered 147 explosive transients and 1948 eruptive variables. Of these 147 infrared transients, 35 are so red that they are devoid of optical counterparts and we call them SPRITEs (eSPecially Red Intermediate-luminosity Transient Events). The nature of SPRITEs is unknown and progress on deciphering the explosion physics depends on mid-IR spectroscopy. Multiple physical origins have been proposed including stellar merger, birth of a massive binary, electron capture supernova and stellar black-hole formation. Hence, we propose a modest continuation of SPIRITS, focusing on discovering and monitoring SPRITEs, in preparation for follow-up with the James Webb Space Telescope (JWST). As the SPRITEs evolve and cool, the bulk of the emission shifts to longer wavelengths. MIRI aboard JWST will be the only available platform in the near future capable of characterizing SPRITEs out to 28 um. Specifically, the low resolution spectrometer would determine dust mass, grain chemistry, ice abundance and energetics to disentangle the proposed origins. The re-focused SPIRITS program consists of continued Spitzer monitoring of only those 104 luminous galaxies that are known SPRITE hosts or are most likely to host new SPRITEs. Scaling from the SPIRITS discovery rate, we estimate finding 22 new SPRITEs and 6 new supernovae over the next two years. The SPIRITS team remains committed to extensive ground-based follow-up. The Spitzer observations proposed here are essential for determining the final fates of active SPRITEs as well as bridging the time lag between the current SPIRITS survey and JWST launch.

  19. Design and evaluation of a parametric model for cardiac sounds.

    PubMed

    Ibarra-Hernández, Roilhi F; Alonso-Arévalo, Miguel A; Cruz-Gutiérrez, Alejandro; Licona-Chávez, Ana L; Villarreal-Reyes, Salvador

    2017-10-01

Heart sound analysis plays an important role in the auscultative diagnosis process to detect the presence of cardiovascular diseases. In this paper we propose a novel parametric heart sound model that accurately represents normal and pathological cardiac audio signals, also known as phonocardiograms (PCG). The proposed model considers that the PCG signal is formed by the sum of two parts: one deterministic and one stochastic. The first part contains most of the acoustic energy. This part is modeled by the Matching Pursuit (MP) algorithm, which performs an analysis-synthesis procedure to represent the PCG signal as a linear combination of elementary waveforms. The second part, also called the residual, is obtained by subtracting the deterministic signal from the original heart sound recording and can be accurately represented as an autoregressive process using the Linear Predictive Coding (LPC) technique. We evaluate the proposed heart sound model by performing subjective and objective tests using signals corresponding to different pathological cardiac sounds. The results of the objective evaluation show an average Percentage of Root-Mean-Square Difference of approximately 5% between the original heart sound and the reconstructed signal. For the subjective test we conducted a formal methodology for perceptual evaluation of audio quality with the assistance of medical experts. Statistical results of the subjective evaluation show that our model provides a highly accurate approximation of real heart sound signals. We are not aware of any previous heart sound model as rigorously evaluated as our proposal. Copyright © 2017 Elsevier Ltd. All rights reserved.
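The deterministic part of such a model can be illustrated with a bare-bones Matching Pursuit loop over a unit-norm dictionary. For testability the dictionary below is the standard basis; the paper's actual atoms and the LPC residual model are not reproduced here.

```python
def matching_pursuit(signal, atoms, n_iter):
    """Greedy MP decomposition: at each step pick the unit-norm atom with
    the largest inner product with the residual, record its coefficient,
    and subtract its contribution.  Returns a list of
    (atom_index, coefficient) picks and the final residual."""
    residual = list(signal)
    picks = []
    for _ in range(n_iter):
        # inner products of the current residual with every atom
        inner = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        best = max(range(len(atoms)), key=lambda i: abs(inner[i]))
        picks.append((best, inner[best]))
        residual = [r - inner[best] * a for r, a in zip(residual, atoms[best])]
    return picks, residual
```

In the model described by the abstract, the residual left after such an analysis-synthesis pass is what gets fitted as an autoregressive process via LPC.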

  20. SPIRITS: SPitzer InfraRed Intensive Transients Survey

    NASA Astrophysics Data System (ADS)

    Kasliwal, Mansi; Jencson, Jacob; Lau, Ryan; Masci, Frank; Helou, George; Williams, Robert; Bally, John; Bond, Howard; Whitelock, Patricia; Cody, Ann Marie; Gehrz, Robert; Tinyanont, Samaporn; Smith, Nathan; Surace, Jason; Armus, Lee; Cantiello, Matteo; Langer, Norbert; Levesque, Emily; Mohamed, Shazrene; Ofek, Eran; Parthasarathy, Mudumba; van Dyk, Schuyler; Boyer, Martha; Phillips, Mark; Hsiao, Eric; Morrell, Nidia; Perley, Dan; Gonzalez, Consuelo; Contreras, Carlos; Jones, Olivia; Ressler, Michael; Adams, Scott; Moore, Anna; Cook, David; Fox, Ori; Johansson, Joel; Khan, Rubab; Monson, Andrew; Hankins, Matthew; Goldman, Steven; Jacob, Jencson

    2018-05-01

    Spitzer is pioneering a systematic exploration of the dynamic infrared sky. Our SPitzer InfraRed Intensive Transients Survey (SPIRITS) has already discovered 78 explosive transients and 2457 eruptive variables. Of these 78 infrared transients, 60 are so red that they are devoid of optical counterparts and we call them SPRITEs (eSPecially Red Intermediate-luminosity Transient Events). The nature of SPRITEs is unknown and progress on deciphering the explosion physics depends on mid-IR spectroscopy. Multiple physical origins have been proposed including stellar merger, birth of a massive binary, electron capture supernova and stellar black hole formation. Hence, we propose a modest continuation of SPIRITS, focusing on discovering and monitoring SPRITEs, in preparation for follow-up with the James Webb Space Telescope (JWST). As the SPRITEs evolve and cool, the bulk of the emission shifts to longer wavelengths. MIRI aboard JWST will be the only available platform in the near future capable of characterizing SPRITEs out to 28 um. Specifically, the low resolution spectrometer would determine dust mass, grain chemistry, ice abundance and energetics to disentangle the proposed origins. The re-focused SPIRITS program consists of continued Spitzer monitoring of those 106 luminous galaxies that are known SPRITE hosts or are most likely to host new SPRITEs. Scaling from the SPIRITS discovery rate, we estimate finding 10 new SPRITEs and 2-3 new supernovae in Cycle 14. The SPIRITS team remains committed to extensive ground-based follow-up. The Spitzer observations proposed here are essential for determining the final fates of active SPRITEs as well as bridging the time lag between the current SPIRITS survey and JWST launch.

  1. Linear stability analysis of collective neutrino oscillations without spurious modes

    NASA Astrophysics Data System (ADS)

    Morinaga, Taiki; Yamada, Shoichi

    2018-01-01

    Collective neutrino oscillations are induced by the presence of neutrinos themselves. As such, they are intrinsically nonlinear phenomena and are much more complex than linear counterparts such as the vacuum or Mikheyev-Smirnov-Wolfenstein oscillations. They obey integro-differential equations, for which it is also very challenging to obtain numerical solutions. If one focuses on the onset of collective oscillations, on the other hand, the equations can be linearized and the technique of linear analysis can be employed. Unfortunately, however, it is well known that such an analysis, when applied with discretizations of continuous angular distributions, suffers from the appearance of so-called spurious modes: unphysical eigenmodes of the discretized linear equations. In this paper, we analyze in detail the origin of these unphysical modes and present a simple solution to this annoying problem. We find that the spurious modes originate from the artificial production of pole singularities instead of a branch cut on the Riemann surface by the discretizations. The branching point singularities on the Riemann surface for the original nondiscretized equations can be recovered by approximating the angular distributions with polynomials and then performing the integrals analytically. We demonstrate for some examples that this simple prescription does remove the spurious modes. We also propose an even simpler method: a piecewise linear approximation to the angular distribution. It is shown that the same methodology is applicable to the multienergy case as well as to the dispersion relation approach that was proposed very recently.

  2. Unique voices in harmony: Call-and-response to address race and physics teaching

    NASA Astrophysics Data System (ADS)

    Cochran, Geraldine L.; White, Gary D.

    2017-09-01

    In the February 2016 issue of The Physics Teacher, we announced a call for papers on race and physics teaching. The response was muted at first, but has now grown to a respectable chorale-sized volume. As the manuscripts began to come in and the review process progressed, Geraldine Cochran graciously agreed to come on board as co-editor for this remarkable collection of papers, to be published throughout the fall of 2017 in TPT. Upon reviewing the original call and the responses from the physics community, the parallels between generating this collection and the grand call-and-response tradition became compelling. What follows is a conversation constructed by the co-editors that is intended to introduce the reader to the swell of voices that responded to the original call. The authors would like to thank Pam Aycock for providing many useful contributions to this editorial.

  3. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU.

    PubMed

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation at the group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis.

  4. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    PubMed Central

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation at the group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  5. Genetic and Epigenetic Events Generate Multiple Pathways in Colorectal Cancer Progression

    PubMed Central

    Pancione, Massimo; Remo, Andrea; Colantuoni, Vittorio

    2012-01-01

Colorectal cancer (CRC) is one of the most common causes of death, despite decades of research. Initially considered a disease due to genetic mutations, it is now viewed as a complex malignancy because of the involvement of epigenetic abnormalities. A functional equivalence between genetic and epigenetic mechanisms has been suggested in CRC initiation and progression. A hallmark of CRC is its pathogenetic heterogeneity, attained through at least three distinct pathways: a traditional (adenoma-carcinoma sequence), an alternative, and, more recently, the so-called serrated pathway. While the alternative pathway is more heterogeneous and less characterized, the traditional and serrated pathways appear to be more homogeneous and clearly distinct. One unsolved question in colon cancer biology concerns the cells of origin and from which crypt compartment the different pathways originate. Based on molecular and pathological evidence, we propose that the traditional and serrated pathways originate from different crypt compartments, explaining their genetic/epigenetic and clinicopathological differences. In this paper, we will discuss the current knowledge of CRC pathogenesis and, specifically, summarize the role of genetic/epigenetic changes in the origin and progression of the multiple CRC pathways. Elucidation of the link between the molecular and clinicopathological aspects of CRC would improve our understanding of its etiology and impact both prevention and treatment. PMID:22888469

  6. Simultaneous Genotype Calling and Haplotype Phasing Improves Genotype Accuracy and Reduces False-Positive Associations for Genome-wide Association Studies

    PubMed Central

    Browning, Brian L.; Yu, Zhaoxia

    2009-01-01

We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10^-7 significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040

  7. Student Evaluation of CALL Tools during the Design Process

    ERIC Educational Resources Information Center

    Nesbitt, Dallas

    2013-01-01

    This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…

  8. Alchemy, Chinese versus Greek, an etymological approach: a rejoinder.

    PubMed

    Mahdihassan, S

    1988-01-01

The generally accepted theory maintains that Alchemy arose at Alexandria as a child of Greek culture. It has two names, Chemeia as the earlier and Chumeia as the later. There is another theory that Alchemy arose in China. Its founder was the aged ascetic who longed after drugs of longevity. He first tried jade, next gold and cinnabar, but the ideal was a drug which was red like cinnabar and fire-proof like gold. But what was actually prepared was red colloidal gold or "calcined gold," by grinding gold granules in a decoction of an herb of longevity. It was called Chin-I; Chin = gold and I = plant juice. In the Fukin dialect Chin-I = Kim-Iya. This was Arabicized, by pre-Islamic Arabs trading in silk with China, as Kimiya, whence arose Al-Kimiya and finally Al-chemy. It was first accepted by Bucharic-speaking Copts in Egypt who transliterated Kimiya = Chemeia, pronouncing it as the Arabs did. With the increase of trade in silk the Chinese also went to Alexandria and helped the Greeks to translate Chin-I as Chrusozomion, meaning gold(-making) ferment, instead of gold-making plant juice. Consistent with this origin of the word Chemeia is the fact that the earlier Alchemists were not Greeks but probably Bucharic-speaking Copts or Egyptians. The consumer of Chin-I or Chemeia became "a drug-made immortal" called Chin-Jen, Golden-Man. This was translated into Greek as Chrusanthropos. Thus the etymology of the two Greek words Chrusozomion and Chrusanthropos supports the origin of the loan word Chemeia as Chinese. To save space it is not proposed to discuss the origin of Chumeia.

  9. A new concept of real-time security camera monitoring with privacy protection by masking moving objects

    NASA Astrophysics Data System (ADS)

    Yabuta, Kenichi; Kitazawa, Hitoshi; Tanaka, Toshihisa

    2006-02-01

    Recently, the number of security monitoring cameras has been increasing rapidly. However, it is normally difficult to know when and where we are monitored by these cameras and how the recorded images are stored and/or used. Therefore, how to protect privacy in the recorded images is a crucial issue. In this paper, we address this problem and introduce a framework for security monitoring systems considering the privacy protection. We state requirements for monitoring systems in this framework. We propose a possible implementation that satisfies the requirements. To protect the privacy of recorded objects, they are made invisible by appropriate image processing techniques. Moreover, the original objects are encrypted and watermarked into the image with the "invisible" objects, which is coded by the JPEG standard. Therefore, the image decoded by a normal JPEG viewer includes the objects that are unrecognized or invisible. We also introduce in this paper a so-called "special viewer" in order to decrypt and display the original objects. This special viewer can be used by limited users when necessary for crime investigation, etc. The special viewer allows us to choose objects to be decoded and displayed. Moreover, in this proposed system, real-time processing can be performed, since no future frame is needed to generate a bitstream.

  10. A simple mechanistic explanation for original antigenic sin and its alleviation by adjuvants.

    PubMed

    Ndifon, Wilfred

    2015-11-06

    A large number of published studies have shown that adaptive immunity to a particular antigen, including pathogen-derived, can be boosted by another, cross-reacting antigen while inducing suboptimal immunity to the latter. Although this phenomenon, called original antigenic sin (OAS), was first reported approximately 70 years ago (Francis et al. 1947 Am. J. Public Health 37, 1013-1016 (doi:10.2105/AJPH.37.8.1013)), its underlying biological mechanisms are still inadequately understood (Kim et al. Proc. Natl Acad. Sci. USA 109, 13 751-13 756 (doi:10.1073/pnas.0912458109)). Here, focusing on the humoral aspects of adaptive immunity, I propose a simple and testable mechanism: that OAS occurs when T regulatory cells induced by the first antigen decrease the dose of the second antigen that is loaded by dendritic cells and available to activate naive lymphocytes. I use both a parsimonious mathematical model and experimental data to confirm the deductive validity of this proposal. This model also explains the puzzling experimental observation that administering certain dendritic cell-activating adjuvants during antigen exposure alleviates OAS. Specifically, the model predicts that such adjuvants will attenuate T regulatory suppression of naive lymphocyte activation. Together, these results suggest additional strategies for redeeming adaptive immunity from the destructive consequences of antigenic 'sin'. © 2015 The Author(s).

  11. The family medicine curriculum resource project structural framework.

    PubMed

    Stearns, Jeffrey A; Stearns, Marjorie A; Davis, Ardis K; Chessman, Alexander W

    2007-01-01

    In the original contract for the Family Medicine Curricular Resource Project (FMCRP), the Health Resources and Services Administration (HRSA), Division of Medicine and Dentistry, charged the FMCRP executive committee with reviewing recent medical education reform proposals and relevant recent curricula to develop an analytical framework for the project. The FMCRP executive and advisory committees engaged in a review and analysis of a variety of curricular reform proposals generated during the last decade of the 20th century. At the same time, in a separate and parallel process, representative individuals from all the family medicine organizations, all levels of learners, internal medicine and pediatric faculty, and the national associations of medical and osteopathic colleges (Association of American Medical Colleges and the American Association of Colleges of Osteopathic Medicine) were involved in group discussions to identify educational needs for physicians practicing in the 21st century. After deliberation, a theoretical framework was chosen for this undergraduate medical education resource that mirrors the Accreditation Council for Graduate Medical Education (ACGME) competencies, a conceptual design originated for graduate medical education. In addition to reflecting the current environment calling for change and greater accountability in medical education, use of the ACGME competencies as the theoretical framework for the FMCR provides a continuum of focus between the two major segments of physician education: medical school and residency.

  12. Evaluation of parameters for particles acceleration by the zero-point field of quantum electrodynamics

    NASA Technical Reports Server (NTRS)

    Rueda, A.

    1985-01-01

    That particles may be accelerated by vacuum effects in quantum field theory has been repeatedly proposed in the last few years. A natural upshot of this is a mechanism for the acceleration of cosmic ray (CR) primaries. A mechanism for acceleration by the zero-point field (ZPE), when the ZPE is taken in a realistic sense (as opposed to a virtual field), was considered. Originally the idea was developed within a semiclassical context. The classical Einstein-Hopf model (EHM) was used to show that free, isolated, electromagnetically interacting particles performed a random walk in phase space, and more importantly in momentum space, when submitted to the perennial action of the so-called classical electromagnetic ZPE.

  13. Lesbian Studies after The Lesbian Postmodern: toward a new genealogy.

    PubMed

    Doan, Laura

    2007-01-01

    While Lesbian Studies is established as a commodity in the academic marketplace, its disciplinary contours are rather more obscure, and even more problematically, its disciplinary genealogy remains somewhat crude. The dominant genealogy of Lesbian Studies might best be characterized as a 'collision model,' a battle between politics and theory, even though much existing scholarship draws on both Lesbian-Feminist Theory and Queer Theory. This article proposes that the tools and methods of a sub-field called 'Lesbian Cultural History' might be useful in generating other historical accounts of the origins and evolution of Lesbian Studies. Such a project is vital because the writing of our disciplinary history clarifies how we envision a disciplinary future.

  14. Subcentimeter noninvasive follicular thyroid neoplasm with papillary-like nuclear features (NIFTP).

    PubMed

    Rosario, Pedro Weslley

    2018-05-23

    Recently, it was proposed that the noninvasive encapsulated follicular variant of papillary thyroid carcinoma (noninvasive E-FVPTC) be called "noninvasive follicular thyroid neoplasm with papillary-like nuclear features" (NIFTP) [1]. As the original cohort of 109 patients with NIFTP studied by the consensus conference included only tumors that were 1 cm or larger, the consensus diagnostic criteria for NIFTP did not explicitly address subcentimeter lesions [1,2]. In fact, in a recent review published in this journal, Hung & Barletta recognize that there are a few published subcentimeter NIFTPs in the literature [3]. This article is protected by copyright. All rights reserved.

  15. On creative machines and the physical origins of freedom

    PubMed Central

    Briegel, Hans J.

    2012-01-01

    We discuss the possibility of free behavior in embodied systems that are, with no exception and at all scales of their body, subject to physical law. We relate the discussion to a model of an artificial agent that exhibits a primitive notion of creativity and freedom in dealing with its environment, which is part of a recently introduced scheme of information processing called projective simulation. This provides an explicit proposal on how we can reconcile our understanding of universal physical law with the idea that higher biological entities can acquire a notion of freedom that allows them to increasingly detach themselves from a strict dependence on the surrounding world. PMID:22822427

  16. QMR: A Quasi-Minimal Residual method for non-Hermitian linear systems

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.; Nachtigal, Noel M.

    1990-01-01

    The biconjugate gradient (BCG) method is the natural generalization of the classical conjugate gradient algorithm for Hermitian positive definite matrices to general non-Hermitian linear systems. Unfortunately, the original BCG algorithm is susceptible to possible breakdowns and numerical instabilities. A novel BCG-like approach, called the quasi-minimal residual (QMR) method, is presented, which overcomes the problems of BCG. An implementation of QMR based on a look-ahead version of the nonsymmetric Lanczos algorithm is proposed. It is shown how BCG iterates can be recovered stably from the QMR process. Some further properties of the QMR approach are given and an error bound is presented. Finally, numerical experiments are reported.
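
    The QMR solver described above is available in standard numerical libraries; as a minimal usage sketch (the 4x4 matrix and right-hand side below are arbitrary illustrative values, not from the paper), SciPy's `scipy.sparse.linalg.qmr` can be applied to a small non-Hermitian system:

```python
# Solve a small non-Hermitian system with SciPy's QMR implementation.
# The matrix and right-hand side are arbitrary illustrative values.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import qmr

A = csr_matrix(np.array([[ 4.0,  1.0,  0.0,  0.0],
                         [-1.0,  4.0,  1.0,  0.0],
                         [ 0.0, -1.0,  4.0,  1.0],
                         [ 0.0,  0.0, -1.0,  4.0]]))  # non-symmetric
b = np.array([1.0, 2.0, 3.0, 4.0])

x, info = qmr(A, b)                  # info == 0 indicates convergence
residual = np.linalg.norm(A @ x - b)
```

    Here `info == 0` signals that the iteration converged to the default tolerance; for harder systems, `qmr` also accepts preconditioners.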

  17. A test of multiple hypotheses for the function of call sharing in female budgerigars, Melopsittacus undulatus

    PubMed Central

    Young, Anna M.; Cordier, Breanne; Mundry, Roger; Wright, Timothy F.

    2014-01-01

    In many social species, group members share acoustically similar calls. Functional hypotheses have been proposed for call sharing, but previous studies have been limited by an inability to distinguish among these hypotheses. We examined the function of vocal sharing in female budgerigars with a two-part experimental design that allowed us to distinguish between two functional hypotheses. The social association hypothesis proposes that shared calls help animals mediate affiliative and aggressive interactions, while the password hypothesis proposes that shared calls allow animals to distinguish group identity and exclude nonmembers. We also tested the labeling hypothesis, a mechanistic explanation which proposes that shared calls are used to address specific individuals within the sender–receiver relationship. We tested the social association hypothesis by creating four-member flocks of unfamiliar female budgerigars (Melopsittacus undulatus) and then monitoring the birds’ calls, social behaviors, and stress levels via fecal glucocorticoid metabolites. We tested the password hypothesis by moving immigrants into established social groups. To test the labeling hypothesis, we conducted additional recording sessions in which individuals were paired with different group members. The social association hypothesis was supported by the development of multiple shared call types in each cage and a correlation between the number of shared call types and the number of aggressive interactions between pairs of birds. We also found support for calls serving as a labeling mechanism using discriminant function analysis with a permutation procedure. Our results did not support the password hypothesis, as there was no difference in stress or directed behaviors between immigrant and control birds. PMID:24860236

  18. A Multiple-Label Guided Clustering Algorithm for Historical Document Dating and Localization.

    PubMed

    He, Sheng; Samara, Petros; Burgers, Jan; Schomaker, Lambert

    2016-11-01

    It is of essential importance for historians to know the date and place of origin of the documents they study. It would be a huge advancement for historical scholars if it would be possible to automatically estimate the geographical and temporal provenance of a handwritten document by inferring them from the handwriting style of such a document. We propose a multiple-label guided clustering algorithm to discover the correlations between the concrete low-level visual elements in historical documents and abstract labels, such as date and location. First, a novel descriptor, called histogram of orientations of handwritten strokes, is proposed to extract and describe the visual elements, which is built on a scale-invariant polar-feature space. In addition, the multi-label self-organizing map (MLSOM) is proposed to discover the correlations between the low-level visual elements and their labels in a single framework. Our proposed MLSOM can be used to predict the labels directly. Moreover, the MLSOM can also be considered as a pre-structured clustering method to build a codebook, which contains more discriminative information on date and geography. The experimental results on the medieval paleographic scale data set demonstrate that our method achieves state-of-the-art results.

  19. Simplified Computation for Nonparametric Windows Method of Probability Density Function Estimation.

    PubMed

    Joshi, Niranjan; Kadir, Timor; Brady, Michael

    2011-08-01

    Recently, Kadir and Brady proposed a method for estimating probability density functions (PDFs) for digital signals which they call the Nonparametric (NP) Windows method. The method involves constructing a continuous space representation of the discrete space and sampled signal by using a suitable interpolation method. NP Windows requires only a small number of observed signal samples to estimate the PDF and is completely data driven. In this short paper, we first develop analytical formulae to obtain the NP Windows PDF estimates for 1D, 2D, and 3D signals, for different interpolation methods. We then show that the original procedure to calculate the PDF estimate can be significantly simplified and made computationally more efficient by a judicious choice of the frame of reference. We have also outlined specific algorithmic details of the procedures enabling quick implementation. Our reformulation of the original concept has directly demonstrated a close link between the NP Windows method and the Kernel Density Estimator.

  20. A hybrid approach of using symmetry technique for brain tumor segmentation.

    PubMed

    Saddique, Mubbashar; Kazmi, Jawad Haider; Qureshi, Kalim

    2014-01-01

    Tumor and related abnormalities are a major cause of disability and death worldwide. Magnetic resonance imaging (MRI) is a superior modality due to its noninvasiveness and high quality images of both the soft tissues and bones. In this paper we present two hybrid segmentation techniques and their results are compared with well-recognized techniques in this area. The first technique is based on symmetry and we call it a hybrid algorithm using symmetry and active contour (HASA). In HASA, we take the reflection image, calculate the difference image, and then apply the active contour on the difference image to segment the tumor. To avoid unimportant segmented regions, we improve the results by proposing an enhancement in the form of the second technique, EHASA. In EHASA, we also take the reflection of the original image, calculate the difference image, and then change this image into a binary image. This binary image is mapped onto the original image, followed by the application of active contouring to segment the tumor region.
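
    A rough sketch of the symmetry/difference-image step described above, on a hypothetical toy slice (assuming a left-right symmetry axis at the image centre; the active-contour refinement stage is omitted):

```python
# Sketch of the symmetry step: mirror the slice about its vertical
# axis, take the absolute difference, and threshold it into a binary
# mask of asymmetric (candidate tumour) regions.  Toy data only.
import numpy as np

def symmetry_difference_mask(img, rel_thresh=0.25):
    reflected = img[:, ::-1]                  # mirror about vertical axis
    diff = np.abs(img - reflected)            # asymmetry map
    return (diff > rel_thresh * diff.max()).astype(np.uint8)

# Toy slice: symmetric background plus one off-centre bright blob.
img = np.zeros((64, 64))
img[20:30, 40:50] = 1.0                       # "tumour" breaks symmetry
mask = symmetry_difference_mask(img)
```

    A known limitation of such difference-image methods is that the mirrored copy of an asymmetric region also lights up, so this mask marks both the blob and its mirror image; the subsequent active-contour step is what isolates the true region.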

  1. On the origin and evolutionary diversification of beetle horns

    PubMed Central

    Emlen, Douglas J.; Corley Lavine, Laura; Ewen-Campen, Ben

    2007-01-01

    Many scarab beetles produce rigid projections from the body called horns. The exaggerated sizes of these structures and the staggering diversity of their forms have impressed biologists for centuries. Recent comparative studies using DNA sequence-based phylogenies have begun to reconstruct the historical patterns of beetle horn evolution. At the same time, developmental genetic experiments have begun to elucidate how beetle horns grow and how horn growth is modulated in response to environmental variables, such as nutrition. We bring together these two perspectives to show that they converge on very similar conclusions regarding beetle evolution. Horns do not appear to be difficult structures to gain or lose, and they can diverge both dramatically and rapidly in form. Although much of this work is still preliminary, we use available information to propose a conceptual developmental model for the major trajectories of beetle horn evolution. We illustrate putative mechanisms underlying the evolutionary origin of horns and the evolution of horn location, shape, allometry, and dimorphism. PMID:17494751

  2. A data-hiding technique with authentication, integration, and confidentiality for electronic patient records.

    PubMed

    Chao, Hui-Mei; Hsu, Chin-Ming; Miaou, Shaou-Gang

    2002-03-01

    A data-hiding technique called the "bipolar multiple-number base" was developed to provide capabilities of authentication, integration, and confidentiality for an electronic patient record (EPR) transmitted among hospitals through the Internet. The proposed technique is capable of hiding those EPR related data such as diagnostic reports, electrocardiogram, and digital signatures from doctors or a hospital into a mark image. The mark image could be the mark of a hospital used to identify the origin of an EPR. Those digital signatures from doctors and a hospital could be applied for the EPR authentication. Thus, different types of medical data can be integrated into the same mark image. The confidentiality is ultimately achieved by decrypting the EPR related data and digital signatures with an exact copy of the original mark image. The experimental results validate the integrity and the invisibility of the hidden EPR related data. This newly developed technique allows all of the hidden data to be separated and restored perfectly by authorized users.
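
    The "bipolar multiple-number base" coding itself is not detailed in this abstract; purely as a generic illustration of hiding EPR bytes in a mark image and recovering them, here is a plain least-significant-bit (LSB) round trip (a different, simpler technique; all names and values are illustrative):

```python
# Generic LSB data hiding (illustrative stand-in, NOT the paper's
# "bipolar multiple-number base" scheme): embed payload bits into the
# least significant bits of an 8-bit mark image, then extract them.
import numpy as np

def embed(mark, payload):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = mark.flatten()                     # copy; mark is untouched
    assert bits.size <= flat.size, "mark image too small for payload"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(mark.shape)

def extract(stego, n_bytes):
    bits = (stego.flatten()[:n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

mark = np.full((16, 16), 200, dtype=np.uint8) # toy "hospital mark" image
secret = b"EPR#42"
stego = embed(mark, secret)
recovered = extract(stego, len(secret))       # == secret
```

    Each pixel changes by at most one grey level, which keeps the hidden data visually imperceptible in the mark image.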

  3. Characterisation of the n-colour printing process using the spot colour overprint model.

    PubMed

    Deshpande, Kiran; Green, Phil; Pointer, Michael R

    2014-12-29

    This paper is aimed at reproducing solid spot colours using n-colour separation. A simplified numerical method, called the spot colour overprint (SCOP) model, was used for characterising the n-colour printing process. This model was originally developed for estimating spot colour overprints. It was extended to be used as a generic forward characterisation model for the n-colour printing process. The inverse printer model, based on a look-up table, was implemented to obtain the colour separation for the n-colour printing process. Finally, real-world spot colours were reproduced using 7-colour separation on a lithographic offset printing process. The colours printed with 7 inks were compared against the original spot colours to evaluate the accuracy. The results show good accuracy, with a mean CIEDE2000 value between the target colours and the printed colours of 2.06. The proposed method can be used successfully to reproduce spot colours, which can potentially save significant time and cost in the printing and packaging industry.

  4. Conodont biostratigraphy of a more complete Reef Trail Member section near the type section, latest Guadalupian Series type region

    USGS Publications Warehouse

    Wardlaw, Bruce R.; Lambert, L.L.; Bell, G.L.; Fronimos, J.A.; Yisa, M.O.

    2010-01-01

    The original type section of the Reef Trail Member (uppermost part of the Bell Canyon Formation) is called the Park Boundary Section, and is less than satisfactory in several aspects. We propose a new reference section designated Reef Trail Reference section 1 (RTR1) on the same hill as the original type section. Section RTR1 compensates for some of the Park Boundary Section’s shortcomings, including better exposure of a single measured section with only minor offset. The conodont biostratigraphy of section RTR1 is presented that, when combined with a better set of described correlation intervals, allows for improved correlation to recently discovered, complete, basinal sections in the Patterson Hills. In comparison with the South Boundary basin section, both the Park Boundary and RTR1 sections are missing approximately the upper third of the Reef Trail Member. Transitional conodonts from the basin demonstrate that Jinogondolella crofti evolved directly from J. altudaensis. We formally elevate Clarkina postbitteri hongshuiensis to C. hongshuiensis.

  5. Electrosensory ampullary organs are derived from lateral line placodes in cartilaginous fishes

    PubMed Central

    Gillis, J. Andrew; Modrell, Melinda S.; Northcutt, R. Glenn; Catania, Kenneth C.; Luer, Carl A.; Baker, Clare V. H.

    2016-01-01

    Summary Ampullary organ electroreceptors excited by weak cathodal electric fields are used for hunting by both cartilaginous and non-teleost bony fishes. Despite similarities of neurophysiology and innervation, their embryonic origins remain controversial: bony fish ampullary organs are derived from lateral line placodes, while a neural crest origin has been proposed for cartilaginous fish electroreceptors. This calls into question the homology of electroreceptors and ampullary organs in the two lineages of jawed vertebrates. Here, we test the hypothesis that lateral line placodes form electroreceptors in cartilaginous fishes by undertaking the first long-term in vivo fate-mapping study in any cartilaginous fish. Using DiI-tracing for up to 70 days in the little skate, Leucoraja erinacea, we show that lateral line placodes form both ampullary electroreceptors and mechanosensory neuromasts. These data confirm the homology of electroreceptors and ampullary organs in cartilaginous and non-teleost bony fishes and indicate that jawed vertebrates primitively possessed a lateral line placode-derived system of electrosensory ampullary organs and mechanosensory neuromasts. PMID:22833123

  6. Anthropogenic areas as incidental substitutes for original habitat.

    PubMed

    Martínez-Abraín, Alejandro; Jiménez, Juan

    2016-06-01

    One speaks of ecological substitutes when an introduced species performs, to some extent, the ecosystem function of an extirpated native species. We suggest that a similar case exists for habitats. Species evolve within ecosystems, but habitats can be destroyed or modified by natural and human-made causes. Sometimes habitat alteration forces animals to move to or remain in a suboptimal habitat type. In that case, the habitat is considered a refuge, and the species is called a refugee. Typically refugee species have lower population growth rates than in their original habitats. Human action may lead to the unintended generation of artificial or semiartificial habitat types that functionally resemble the essential features of the original habitat and thus allow a population growth rate of the same magnitude or higher than in the original habitat. We call such areas substitution habitats and define them as human-made habitats within the focal species range that by chance are partial substitutes for the species' original habitat. We call species occupying a substitution habitat adopted species. These are 2 new terms in conservation biology. Examples of substitution habitats are dams for European otters, wheat and rice fields for many steppeland and aquatic birds, and urban areas for storks, falcons, and swifts. Although substitution habitats can bring about increased resilience against the agents of global change, the conservation of original habitat types remains a conservation priority. © 2016 Society for Conservation Biology.

  7. Domain Regeneration for Cross-Database Micro-Expression Recognition

    NASA Astrophysics Data System (ADS)

    Zong, Yuan; Zheng, Wenming; Huang, Xiaohua; Shi, Jingang; Cui, Zhen; Zhao, Guoying

    2018-05-01

    In this paper, we investigate the cross-database micro-expression recognition problem, where the training and testing samples are from two different micro-expression databases. Under this setting, the training and testing samples would have different feature distributions and hence the performance of most existing micro-expression recognition methods may decrease greatly. To solve this problem, we propose a simple yet effective method called Target Sample Re-Generator (TSRG) in this paper. By using TSRG, we are able to re-generate the samples from target micro-expression database and the re-generated target samples would share same or similar feature distributions with the original source samples. For this reason, we can then use the classifier learned based on the labeled source samples to accurately predict the micro-expression categories of the unlabeled target samples. To evaluate the performance of the proposed TSRG method, extensive cross-database micro-expression recognition experiments designed based on SMIC and CASME II databases are conducted. Compared with recent state-of-the-art cross-database emotion recognition methods, the proposed TSRG achieves more promising results.

  8. On the topological sensitivity of cellular automata

    NASA Astrophysics Data System (ADS)

    Baetens, Jan M.; De Baets, Bernard

    2011-06-01

    Ever since the conceptualization of cellular automata (CA), much attention has been paid to the dynamical properties of these discrete dynamical systems and, in particular, to their sensitivity to the initial condition from which they are evolved. Yet, the sensitivity of CA to the topology upon which they are based has received only minor attention, such that a clear insight into this dependence is still lacking and, furthermore, a quantification of this so-called topological sensitivity has not yet been proposed. The lack of attention for this issue is rather surprising, since CA are spatially explicit, which means that their dynamics is directly affected by their topology. To overcome these shortcomings, we propose topological Lyapunov exponents that measure the divergence of two close trajectories in phase space originating from a topological perturbation, and we relate them to a measure capturing the sensitivity of CA to their topology that relies on the concept of topological derivatives, which is introduced in this paper. The validity of the proposed methodology is illustrated for the 256 elementary CA and for a family of two-state irregular totalistic CA.
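
    As a simplified illustration of the kind of quantity involved (defect spreading under a topological perturbation, not the paper's formal topological Lyapunov exponent), one can evolve an elementary CA twice from the same initial condition, once on a regular ring and once with a single rewired neighbour link, and track the Hamming distance between the trajectories; the rule and sizes below are arbitrary:

```python
# Evolve elementary CA rule 110 on a ring and on a ring with one
# rewired left-neighbour link, and record the Hamming distance
# between the two trajectories (illustrative parameters only).
import numpy as np

def step(state, left, right, rule=110):
    """One synchronous update; neighbourhoods given by index arrays."""
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    idx = 4 * state[left] + 2 * state + state[right]
    return table[idx]

n, steps = 64, 30
rng = np.random.default_rng(0)
init = rng.integers(0, 2, n).astype(np.uint8)

left = np.roll(np.arange(n), 1)               # left[i] = i - 1 (mod n)
right = np.roll(np.arange(n), -1)             # right[i] = i + 1 (mod n)
left_perturbed = left.copy()
left_perturbed[n // 2] = 0                    # one rewired link

a, b, dist = init.copy(), init.copy(), []
for _ in range(steps):
    a = step(a, left, right)
    b = step(b, left_perturbed, right)
    dist.append(int(np.sum(a != b)))          # Hamming distance per step
```

    The growth rate of `dist` over time is one crude proxy for how strongly the dynamics depends on the underlying topology.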

  9. The development of a peak-time criterion for designing controlled-release devices.

    PubMed

    Simon, Laurent; Ospina, Juan

    2016-08-25

    This work consists of estimating dynamic characteristics for topically-applied drugs when the magnitude of the flux increases to a maximum value, called peak flux, before declining to zero. This situation is typical of controlled-released systems with a finite donor or vehicle volume. Laplace transforms were applied to the governing equations and resulted in an expression for the flux in terms of the physical characteristics of the system. After approximating this function by a second-order model, three parameters of this reduced structure captured the essential features of the original process. Closed-form relationships were then developed for the peak flux and time-to-peak based on the empirical representation. Three case studies that involve mechanisms, such as diffusion, partitioning, dissolution and elimination, were selected to illustrate the procedure. The technique performed successfully as shown by the ability of the second-order flux to match the prediction of the original transport equations. A main advantage of the proposed method is that it does not require a solution of the original partial differential equations. Less accurate results were noted for longer lag times. Copyright © 2016 Elsevier B.V. All rights reserved.
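
    For the biexponential form that a second-order approximation of such a flux typically takes, the time-to-peak has a standard closed form: with J(t) = A(e^(-k1 t) - e^(-k2 t)) and k2 > k1 > 0, setting J'(t) = 0 gives t_peak = ln(k2/k1)/(k2 - k1). A sketch with illustrative rate constants (not taken from the paper), cross-checked numerically:

```python
# Closed-form time-to-peak of a biexponential (second-order) flux,
# checked against a dense numerical grid.  Constants are illustrative.
import numpy as np

k1, k2, A = 0.2, 1.5, 1.0                     # illustrative rate constants

def flux(t):
    return A * (np.exp(-k1 * t) - np.exp(-k2 * t))

t_peak = np.log(k2 / k1) / (k2 - k1)          # analytical time-to-peak

t = np.linspace(0.0, 20.0, 200001)            # numerical cross-check
t_num = t[np.argmax(flux(t))]
```

    The peak flux itself then follows by substituting t_peak back into J(t).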

  10. 47 CFR 64.1504 - Restrictions on the use of toll-free numbers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CARRIER SERVICES (CONTINUED) MISCELLANEOUS RULES RELATING TO COMMON CARRIERS Interstate Pay-Per-Call and... advertised or widely understood to be toll-free, in a manner that would result in: (a) The calling party or the subscriber to the originating line being assessed, by virtue of completing the call, a charge for...

  11. Called to Do Meaningful Work: A Blessing or a Curse?

    ERIC Educational Resources Information Center

    van Vuuren, Mark

    2017-01-01

    Two groups of people are particularly inclined to mention a calling when talking about their work motivation: those who are spiritual (because the concept of calling originated in the religious realm) and those in serving occupations (such as hospitals, schools, and nongovernmental organizations). Because Christian professors are in both groups,…

  12. Poisoning following exposure to chemicals stored in mislabelled or unlabelled containers: a recipe for potential disaster.

    PubMed

    Millard, Yvette C; Slaughter, Robin J; Shieffelbien, Lucy M; Schep, Leo J

    2014-09-26

    To investigate poisoning exposures to chemicals that were unlabelled, mislabelled or not in their original containers in New Zealand over the last 10 years, based on calls to the New Zealand National Poisons Centre (NZNPC). Call data from the NZNPC between 2003 and 2012 were analysed retrospectively. Parameters reviewed included patient age, route and site of exposure, product classification and recommended intervention. Of the 324,411 calls received between 2003 and 2012, 100,465 calls were associated with acute human exposure to chemicals. There were 757 inquiries related to human exposure to mislabelled or unlabelled chemicals, comprising 0.75% of chemical exposures. Adults were involved in 51% of incidents, children <5 years in 32%, children 5-10 years in 10%, and adolescents in 5%. Child exploratory behaviour was responsible for 38% of calls and adult unintentional exposures for 61%. Medical attention was advised in 26% of calls. Inadvertent exposure to toxic products stored in unlabelled or mislabelled containers is a problem for all age groups. Although it represents a small proportion of total calls to the NZNPC, it remains a potential risk for serious poisoning. It is important that chemicals are stored securely, in their original containers, and never stored in drinking vessels.

  13. Bringing the Excitement of Exploring Mars and the Giant Planets to Educators and the Public

    NASA Astrophysics Data System (ADS)

    Morrow, C. A.; Dusenbery, P. B.; Harold, J.

    2003-05-01

    We are living in a wonderful era of planetary exploration. In 2004 alone, two rovers will land on Mars and the Cassini-Huygens mission will arrive in the Saturn system for an extended 4-year tour. These events will bring much public attention and provide excellent reasons for substantive educational outreach to educators and the public. The Space Science Institute (SSI) of Boulder, CO and collaborators are responding with a comprehensive array of funded and proposed projects. These include the refurbishment and redeployment of the 5000 sq. ft MarsQuest national traveling exhibition, the launch of a 600 sq. ft. "mini-MarsQuest" called Destination Mars, the launch of an interactive website called "MarsQuest Online" (in partnership with TERC and JPL), a variety of workshops for teachers, museum educators, and planetarians (in partnership with "To Mars with MER", and JPL), and the development of a "Family Guide to Mars" for use by adults and children in informal learning settings. SSI is also proposing to develop another national traveling exhibition called "Giant Planets: Exploring the Outer Solar System". This exhibit (envisioned to be 3500 sq.ft.) and its educational program will take advantage of the excitement generated by the Cassini mission and origins-related research. Its education program will also benefit from SSI having led the development of the "Saturn Educator Guide" - a JPL-sponsored resource for teachers in grades 5 and up. This paper will provide an overview of our resources in planetary science education and communicate the valuable lessons we've learned about their design, development and dissemination. SSI's educational endeavors related to planetary science have been funded by several NASA and NSF grants and contracts.

  14. Metastatic brain tumor

    MedlinePlus

    ... the brain, the type of tissue involved, the original location of the tumor, and other factors. In rare cases, doctors do not know the original location. This is called cancer of unknown primary ( ...

  15. DNA Cryptography and Deep Learning using Genetic Algorithm with NW algorithm for Key Generation.

    PubMed

    Kalsi, Shruti; Kaur, Harleen; Chang, Victor

    2017-12-05

Cryptography is the science not only of applying complex mathematics and logic to design strong methods of hiding data, called encryption, but also of retrieving the original data, called decryption. The purpose of cryptography is to transmit a message between a sender and receiver such that an eavesdropper is unable to comprehend it. To accomplish this, we need not only a strong algorithm, but also a strong key and a strong concept for the encryption and decryption process. We have introduced the concept of DNA Deep Learning Cryptography, defined as a technique of concealing data in terms of DNA sequences and deep learning. In this cryptographic technique, each letter of the alphabet is converted into a different combination of the four bases, namely Adenine (A), Cytosine (C), Guanine (G) and Thymine (T), which make up human deoxyribonucleic acid (DNA). Actual implementations with DNA do not exceed the laboratory level and are expensive. To bring DNA computing to a digital level, easy and effective algorithms are proposed in this paper. In the proposed work we introduce, first, a method and its implementation for key generation based on the theory of natural selection, using a Genetic Algorithm with the Needleman-Wunsch (NW) algorithm, and second, a method for implementing encryption and decryption based on DNA computing using the biological operations transcription, translation, DNA sequencing, and deep learning.
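The abstract names the Needleman-Wunsch (NW) algorithm as the alignment component of the key-generation scheme but gives no implementation details. As a hedged illustration only (the scoring parameters, and how the score would feed a genetic algorithm's fitness function, are assumptions, not taken from the paper), a minimal NW global-alignment scorer might look like:

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Global alignment score of DNA strings a and b (Needleman-Wunsch).
    Scoring parameters are illustrative assumptions, not the paper's values."""
    rows, cols = len(a) + 1, len(b) + 1
    # F[i][j] = best score aligning a[:i] with b[:j]
    F = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        F[i][0] = i * gap          # align prefix of a against gaps
    for j in range(cols):
        F[0][j] = j * gap          # align prefix of b against gaps
    for i in range(1, rows):
        for j in range(1, cols):
            diag = F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            F[i][j] = max(diag, F[i - 1][j] + gap, F[i][j - 1] + gap)
    return F[-1][-1]

# e.g. scoring two candidate DNA key fragments:
score = needleman_wunsch("ACGTAC", "ACTAC")
```

In a GA-based key-generation scheme, such a score could rank candidate DNA-encoded keys, though the paper's actual fitness definition is not given in the abstract.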

  16. E-GRASP/Eratosthenes: a mission proposal for millimetric TRF realization

    NASA Astrophysics Data System (ADS)

    Biancale, Richard; Pollet, Arnaud; Coulot, David; Mandea, Mioara

    2017-04-01

The ITRF is currently worked out by independent concatenation of space-technique information: GNSS, DORIS, SLR and VLBI data are processed independently by analysis centers, after which combination centers form mono-technique solution sets that are then combined to produce the official ITRF solutions. This approach currently performs quite well, although systematic differences between techniques remain visible, for instance in the origin or scale parameters of the underlying terrestrial frames. Improvement and homogenization of the TRF are expected in the future, provided that dedicated multi-technique platforms are exploited to the fullest. The goal set by GGOS of realizing the terrestrial reference system with an accuracy of 1 mm and a long-term stability of 0.1 mm/yr could next be achieved with the E-GRASP/Eratosthenes scenario. This mission, proposed to ESA in response to the 2017 Earth Explorer-9 call, had already been scientifically well assessed in the 2016 EE9 call. It co-locates all of the fundamental space-based geodetic instruments (GNSS and DORIS receivers, laser retro-reflectors, and a VLBI transmitter) on the same satellite platform on a highly eccentric orbit, with particular attention paid to the time and space metrology on board. Different kinds of simulations were performed, both to discriminate the best orbital scenario according to a range of geometric, technical and physical criteria and to assess the expected performance on the TRF against the GGOS goals. The presentation will focus on the mission scenario and the simulation results.

  17. Birds, primates, and spoken language origins: behavioral phenotypes and neurobiological substrates

    PubMed Central

    Petkov, Christopher I.; Jarvis, Erich D.

    2012-01-01

    Vocal learners such as humans and songbirds can learn to produce elaborate patterns of structurally organized vocalizations, whereas many other vertebrates such as non-human primates and most other bird groups either cannot or do so to a very limited degree. To explain the similarities among humans and vocal-learning birds and the differences with other species, various theories have been proposed. One set of theories are motor theories, which underscore the role of the motor system as an evolutionary substrate for vocal production learning. For instance, the motor theory of speech and song perception proposes enhanced auditory perceptual learning of speech in humans and song in birds, which suggests a considerable level of neurobiological specialization. Another, a motor theory of vocal learning origin, proposes that the brain pathways that control the learning and production of song and speech were derived from adjacent motor brain pathways. Another set of theories are cognitive theories, which address the interface between cognition and the auditory-vocal domains to support language learning in humans. Here we critically review the behavioral and neurobiological evidence for parallels and differences between the so-called vocal learners and vocal non-learners in the context of motor and cognitive theories. In doing so, we note that behaviorally vocal-production learning abilities are more distributed than categorical, as are the auditory-learning abilities of animals. We propose testable hypotheses on the extent of the specializations and cross-species correspondences suggested by motor and cognitive theories. We believe that determining how spoken language evolved is likely to become clearer with concerted efforts in testing comparative data from many non-human animal species. PMID:22912615

  18. Initialization of metabolism in prebiotic petroleum

    NASA Astrophysics Data System (ADS)

    Mekki-Berrada, Ali

The theoretical and bibliographical work on the geochemical origin of life that I present here works on the assumption that: "The class of the most complex molecules of life that can have a geochemical and abiotic origin is the class of fatty acids with long aliphatic chains". This idea comes from the controversy over the abiotic oil industry and from the first measurements of abiotic oil at mid-ocean ridges (Charlou J.L. et al. 2002, Proskurowski G. et al. 2008). To go further and propose a comprehensive experiment on the origin of life, I propose in this article the idea that the prebiotic soup, or prebiotic petroleum, would stem from the diagenesis of a mixture of gas clathrates and sediments. Gases (H2S, H2, N2, CH4, CO2) are produced at mid-ocean ridges, and on a large scale at the seafloor, by serpentinization. Sediments contain hydrogenophosphates as a source of phosphate and minerals for surface catalysis. The extreme conditions experienced by some prokaryotes are close to the pressures and temperatures of submarine fossil-petroleum oilfields: the hydrostatic pressure is around 1.5 kbar and the temperature is below 150 °C. The experiment I propose is quite feasible today, since these conditions are already in use: in the research and exploration of fossil petroleum; in the field of organic chemistry called "green chemistry", where temperatures remain low and the pressure can reach 10 kbar; and in studies of the biology of prokaryotes living in fossil petroleum of industrial interest, which are quite comparable to experiments with prebiotic oil. Finally, this experiment can draw on research on abiotic CH4 on Mars and abiotic hydrocarbons on Titan. The next step in theoretical research on the origin of life is the abiotic synthesis of liposomes, which requires only the synthesis of glycerol and of ethanolamine (or serine) esterifying the phosphate and fatty acid.
The state of research on the abiotic synthesis of these molecules shows that syntheses of glycerol in the laboratory and in industry are so drastic and complex that I propose initializing metabolism in fatty-acid vesicles, with hydrogenation by H2 of glyceraldehyde-P or dihydroxyacetone-P to glycerol-3P after esterification to the fatty acid. Hydrogenation is assumed to be facilitated by the catalytic power of the multi-anionic surface of these vesicles. https://en.wikiversity.org/wiki/Prebiotic_Petroleum

  19. Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices

    PubMed Central

    Abdoul, Hendy; Perrey, Christophe; Amiel, Philippe; Tubach, Florence; Gottot, Serge; Durand-Zaleski, Isabelle; Alberti, Corinne

    2012-01-01

Background Peer review of grant applications has been criticized as lacking reliability. Studies showing poor agreement among reviewers supported this possibility but usually focused on reviewers’ scores and failed to investigate reasons for disagreement. Here, our goal was to determine how reviewers rate applications, by investigating reviewer practices and grant assessment criteria. Methods and Findings We first collected and analyzed a convenience sample of French and international calls for proposals and assessment guidelines, from which we created an overall typology of assessment criteria comprising nine domains: relevance to the call for proposals, usefulness, originality, innovativeness, methodology, feasibility, funding, ethical aspects, and writing of the grant application. We then performed a qualitative study of reviewer practices, particularly regarding the use of assessment criteria, among reviewers of the French Academic Hospital Research Grant Agencies (Programmes Hospitaliers de Recherche Clinique, PHRCs). Semi-structured interviews and observation sessions were conducted. Both the time spent assessing each grant application and the assessment methods varied across reviewers. The assessment criteria recommended by the PHRCs were listed by all reviewers as frequently evaluated and useful. However, use of the PHRC criteria was subjective and varied across reviewers. Some reviewers gave the same weight to each assessment criterion, whereas others considered originality to be the most important criterion (12/34), followed by methodology (10/34) and feasibility (4/34). Conceivably, this variability might adversely affect the reliability of the review process, and studies evaluating this hypothesis would be of interest. Conclusions Variability across reviewers may result in mistrust among grant applicants about the review process. Consequently, ensuring transparency is of the utmost importance. 
Consistency in the review process could also be improved by providing common definitions for each assessment criterion and uniform requirements for grant application submissions. Further research is needed to assess the feasibility and acceptability of these measures. PMID:23029386

  20. Disappearing Scales in Carps: Re-Visiting Kirpichnikov's Model on the Genetics of Scale Pattern Formation

    PubMed Central

    Goh, Chin Heng; Kathiresan, Purushothaman; Németh, Sándor; Jeney, Zsigmond; Bercsényi, Miklós; Orbán, László

    2013-01-01

The body of most fishes is fully covered by scales that typically form tight, partially overlapping rows. While some of the genes controlling the formation and growth of fish scales have been studied, very little is known about the genetic mechanisms regulating scale pattern formation. Although the existence of two genes with two pairs of alleles (S&s and N&n) regulating scale coverage in cyprinids was predicted by Kirpichnikov and colleagues nearly eighty years ago, their identity remained unknown until recently. In 2009, the ‘S’ gene was found to be a paralog of fibroblast growth factor receptor 1, fgfr1a1, while the second gene, called ‘N’, has not yet been identified. We re-visited the original model of Kirpichnikov, which proposed four major scale pattern types, and observed a high degree of variation within the so-called scattered phenotype, leading us to divide this group into two sub-types: classical mirror and irregular. We also analyzed the survival rates of offspring groups and found a distinct difference between Asian and European crosses. Whereas nude × nude crosses involving at least one parent of Asian origin or hybrid with Asian parent(s) showed the 25% early lethality predicted by Kirpichnikov (due to the lethality of the NN genotype), those with two Hungarian nude parents did not. We further extended Kirpichnikov's work by correlating changes in phenotype (scale pattern) with deformations of the fins and losses of pharyngeal teeth. We observed phenotypic changes that were not restricted to nudes, as described by Kirpichnikov, but were also present in mirrors (and presumably in linears as well; not analyzed in detail here). We propose that the gradation of phenotypes observed within the scattered group is caused by a gradually decreasing level of signaling (a dose-dependent effect), probably due to a concerted action of multiple pathways involved in scale formation. PMID:24386179

  1. Origins of Life: Open Questions and Debates

    NASA Astrophysics Data System (ADS)

    Brack, André

    2017-10-01

Stanley Miller demonstrated in 1953 that it was possible to form amino acids from methane, ammonia, and hydrogen in water, thus launching the ambitious hope that chemists would be able to shed light on the origins of life by recreating a simple life form in a test tube. However, it must be acknowledged that the dream has not yet been accomplished, despite the great volume of effort and innovation put forward by the scientific community. At a minimum, primitive life can be defined as an open chemical system, fed with matter and energy, capable of self-reproduction (i.e., making more of itself by itself), and also capable of evolving. The concept of evolution implies that chemical systems would transfer their information fairly faithfully but make some random errors. If we compared the components of primitive life to parts of a chemical automaton, we could conceive that, by chance, some parts self-assembled to generate an automaton capable of assembling other parts to produce a true copy. Sometimes, minor errors in the building generated a more efficient automaton, which then became the dominant species. Quite different scenarios and routes have been followed and tested in the laboratory to explain the origin of life. There are two schools of thought in proposing the prebiotic supply of organics. The proponents of a metabolism-first scenario call for the spontaneous formation of simple molecules from carbon dioxide and water to rapidly generate life. In a second hypothesis, the primeval-soup scenario, it is proposed that rather complex organic molecules accumulated in a warm little pond prior to the emergence of life. The proponents of the primeval-soup, or replication-first, approach are by far the more active. They have succeeded in reconstructing small-scale versions of proteins, membranes, and RNA. 
Quite different scenarios have been proposed for the inception of life: the RNA world, an origin within droplets, self-organization counteracting entropy, or a stochastic approach merging chemistry and geology. Understanding the emergence of a critical feature of life, its one-handedness, is a shared preoccupation in all these approaches.

  2. 78 FR 38356 - 60-Day Notice of Proposed Information Collection: Notice of Proposed Information Collection for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ... Information Collection: Notice of Proposed Information Collection for Disaster Recovery Grant Reporting System... impairments may access this number through TTY by calling the toll-free Federal Relay Service at (800) 877... this number through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. SUPPLEMENTARY...

  3. Origin recognition is the predominant role for DnaA-ATP in initiation of chromosome replication.

    PubMed

    Grimwade, Julia E; Rozgaja, Tania A; Gupta, Rajat; Dyson, Kyle; Rao, Prassanna; Leonard, Alan C

    2018-05-25

    In all cells, initiation of chromosome replication depends on the activity of AAA+ initiator proteins that form complexes with replication origin DNA. In bacteria, the conserved, adenosine triphosphate (ATP)-regulated initiator protein, DnaA, forms a complex with the origin, oriC, that mediates DNA strand separation and recruitment of replication machinery. Complex assembly and origin activation requires DnaA-ATP, which differs from DnaA-ADP in its ability to cooperatively bind specific low affinity sites and also to oligomerize into helical filaments. The degree to which each of these activities contributes to the DnaA-ATP requirement for initiation is not known. In this study, we compared the DnaA-ATP dependence of initiation from wild-type Escherichia coli oriC and a synthetic origin (oriCallADP), whose multiple low affinity DnaA sites bind DnaA-ATP and DnaA-ADP similarly. OriCallADP was fully occupied and unwound by DnaA-ADP in vitro, and, in vivo, oriCallADP suppressed lethality of DnaA mutants defective in ATP binding and ATP-specific oligomerization. However, loss of preferential DnaA-ATP binding caused over-initiation and increased sensitivity to replicative stress. The findings indicate both DnaA-ATP and DnaA-ADP can perform most of the mechanical functions needed for origin activation, and suggest that a key reason for ATP-regulation of DnaA is to control replication initiation frequency.

  4. Microbiome/microbiota and allergies.

    PubMed

    Inoue, Yuzaburo; Shimojo, Naoki

    2015-01-01

    Allergies are characterized by a hypersensitive immune reaction to originally harmless antigens. In recent decades, the incidence of allergic diseases has markedly increased, especially in developed countries. The increase in the frequency of allergic diseases is thought to be primarily due to environmental changes related to a westernized lifestyle, which affects the commensal microbes in the human body. The human gut is the largest organ colonized by bacteria and contains more than 1000 bacterial species, called the "gut microbiota." The recent development of sequencing technology has enabled researchers to genetically investigate and clarify the diversity of all species of commensal microbes. The collective genomes of commensal microbes are together called the "microbiome." Although the detailed mechanisms remain unclear, it has been proposed that the microbiota/microbiome, especially that in the gut, impacts the systemic immunity and metabolism, thus affecting the development of various immunological diseases, including allergies. In this review, we summarize the recent findings regarding the importance of the microbiome/microbiota in the development of allergic diseases and also the results of interventional studies using probiotics or prebiotics to prevent allergies.

  5. Purchase and Installation of NanoSIMS 50

    NASA Technical Reports Server (NTRS)

    Walker, Robert M.

    2001-01-01

Although this is a final report on NASA grant number NAG5-8729, we wish to state at the outset that it was mistakenly written as a two-year grant instead of the three-year grant it should have been. The grant was made for the purpose of purchasing and installing a novel ion microprobe, initially called the NanoSIMS 50 and now called the NanoSIMS. The total cost to NASA for purchasing the instrument and refurbishing a laboratory to house it was $1.1M, split into three installments of $400K (FY1999), $350K (FY2000), and $350K (FY2001). We received the first installment in full and $335K in FY2000 for the second installment. The final $350K necessary to complete the purchase and installation was expected by us in the spring of 2001. However, we were recently informed that no more money can be transferred on this grant, since it was originally written as a two-year grant. Therefore, we are closing out the current grant and simultaneously writing a new proposal to obtain the final $350K needed to complete the purchase.

  6. Loss of lager specific genes and subtelomeric regions define two different Saccharomyces cerevisiae lineages for Saccharomyces pastorianus Group I and II strains.

    PubMed

    Monerawela, Chandre; James, Tharappel C; Wolfe, Kenneth H; Bond, Ursula

    2015-03-01

    Lager yeasts, Saccharomyces pastorianus, are interspecies hybrids between S. cerevisiae and S. eubayanus and are classified into Group I and Group II clades. The genome of the Group II strain, Weihenstephan 34/70, contains eight so-called 'lager-specific' genes that are located in subtelomeric regions. We evaluated the origins of these genes through bioinformatic and PCR analyses of Saccharomyces genomes. We determined that four are of cerevisiae origin while four originate from S. eubayanus. The Group I yeasts contain all four S. eubayanus genes but individual strains contain only a subset of the cerevisiae genes. We identified S. cerevisiae strains that contain all four cerevisiae 'lager-specific' genes, and distinct patterns of loss of these genes in other strains. Analysis of the subtelomeric regions uncovered patterns of loss in different S. cerevisiae strains. We identify two classes of S. cerevisiae strains: ale yeasts (Foster O) and stout yeasts with patterns of 'lager-specific' genes and subtelomeric regions identical to Group I and II S. pastorianus yeasts, respectively. These findings lead us to propose that Group I and II S. pastorianus strains originate from separate hybridization events involving different S. cerevisiae lineages. Using the combined bioinformatic and PCR data, we describe a potential classification map for industrial yeasts. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permission@oup.com.

  7. Cranial Pair 0: The Nervus Terminalis.

    PubMed

    PeñA-Melian, Angel; Cabello-de la Rosa, Juan Pablo; Gallardo-Alcañiz, Maria Jose; Vaamonde-Gamo, Julia; Relea-Calatayud, Fernanda; Gonzalez-Lopez, Lucia; Villanueva-Anguita, Patricia; Flores-Cuadrado, Alicia; Saiz-Sanchez, Daniel; Martinez-Marcos, Alino

    2018-04-16

Originally discovered in elasmobranchs by Fritsch in 1878, the nervus terminalis has been found in virtually all species, including humans. After more than a century of debate on its nomenclature, it is nowadays recognized as cranial pair zero. The nerve mostly originates in the olfactory placode, although a neural crest contribution has also been proposed. Developmentally, the nervus terminalis is clearly observed in human embryos; subsequently, during the fetal period, it loses some of its ganglion cells and is less recognizable in adults. Fibers originating in the nasal cavity pass into the cranium through the middle area of the cribriform plate of the ethmoid bone. Intracranially, the fibers join the telencephalon at several sites, including the olfactory trigone and the primordium of the hippocampus, to reach preoptic and precommissural regions. The nervus terminalis shows ganglion cells that sometimes form clusters, normally one or two located at the base of the crista galli, the so-called ganglion of the nervus terminalis. Its function is uncertain. It has been described that its fibers facilitate the migration of luteinizing hormone-releasing hormone cells to the hypothalamus, thus participating in the development of the hypothalamic-gonadal axis, alteration of which may provoke Kallmann's syndrome in humans. This review summarizes current knowledge on this structure, incorporating original illustrations of the nerve at different developmental stages, and focuses on its anatomical and clinical relevance. Anat Rec, 2018. © 2018 Wiley Periodicals, Inc.

  8. The evolutionary and genetic origins of consciousness in the Cambrian Period over 500 million years ago

    PubMed Central

    Feinberg, Todd E.; Mallatt, Jon

    2013-01-01

    Vertebrates evolved in the Cambrian Period before 520 million years ago, but we do not know when or how consciousness arose in the history of the vertebrate brain. Here we propose multiple levels of isomorphic or somatotopic neural representations as an objective marker for sensory consciousness. All extant vertebrates have these, so we deduce that consciousness extends back to the group's origin. The first conscious sense may have been vision. Then vision, coupled with additional sensory systems derived from ectodermal placodes and neural crest, transformed primitive reflexive systems into image forming brains that map and perceive the external world and the body's interior. We posit that the minimum requirement for sensory consciousness and qualia is a brain including a forebrain (but not necessarily a developed cerebral cortex/pallium), midbrain, and hindbrain. This brain must also have (1) hierarchical systems of intercommunicating, isomorphically organized, processing nuclei that extensively integrate the different senses into representations that emerge in upper levels of the neural hierarchy; and (2) a widespread reticular formation that integrates the sensory inputs and contributes to attention, awareness, and neural synchronization. We propose a two-step evolutionary history, in which the optic tectum was the original center of multi-sensory conscious perception (as in fish and amphibians: step 1), followed by a gradual shift of this center to the dorsal pallium or its cerebral cortex (in mammals, reptiles, birds: step 2). We address objections to the hypothesis and call for more studies of fish and amphibians. In our view, the lamprey has all the neural requisites and is likely the simplest extant vertebrate with sensory consciousness and qualia. Genes that pattern the proposed elements of consciousness (isomorphism, neural crest, placodes) have been identified in all vertebrates. Thus, consciousness is in the genes, some of which are already known. 
PMID:24109460

  9. The evolutionary and genetic origins of consciousness in the Cambrian Period over 500 million years ago.

    PubMed

    Feinberg, Todd E; Mallatt, Jon

    2013-01-01

    Vertebrates evolved in the Cambrian Period before 520 million years ago, but we do not know when or how consciousness arose in the history of the vertebrate brain. Here we propose multiple levels of isomorphic or somatotopic neural representations as an objective marker for sensory consciousness. All extant vertebrates have these, so we deduce that consciousness extends back to the group's origin. The first conscious sense may have been vision. Then vision, coupled with additional sensory systems derived from ectodermal placodes and neural crest, transformed primitive reflexive systems into image forming brains that map and perceive the external world and the body's interior. We posit that the minimum requirement for sensory consciousness and qualia is a brain including a forebrain (but not necessarily a developed cerebral cortex/pallium), midbrain, and hindbrain. This brain must also have (1) hierarchical systems of intercommunicating, isomorphically organized, processing nuclei that extensively integrate the different senses into representations that emerge in upper levels of the neural hierarchy; and (2) a widespread reticular formation that integrates the sensory inputs and contributes to attention, awareness, and neural synchronization. We propose a two-step evolutionary history, in which the optic tectum was the original center of multi-sensory conscious perception (as in fish and amphibians: step 1), followed by a gradual shift of this center to the dorsal pallium or its cerebral cortex (in mammals, reptiles, birds: step 2). We address objections to the hypothesis and call for more studies of fish and amphibians. In our view, the lamprey has all the neural requisites and is likely the simplest extant vertebrate with sensory consciousness and qualia. Genes that pattern the proposed elements of consciousness (isomorphism, neural crest, placodes) have been identified in all vertebrates. Thus, consciousness is in the genes, some of which are already known.

  10. Beyond E11

    NASA Astrophysics Data System (ADS)

    Bossard, Guillaume; Kleinschmidt, Axel; Palmkvist, Jakob; Pope, Christopher N.; Sezgin, Ergin

    2017-05-01

We study the non-linear realisation of E11 originally proposed by West, with particular emphasis on the issue of linearised gauge invariance. Our analysis shows, even at low levels, that the conjectured equations can only be invariant under local gauge transformations if a certain section condition that has appeared in a different context in the E11 literature is satisfied. This section condition also generalises the one known from exceptional field theory. Even with the section condition, the E11 duality equation for gravity is known to miss the trace component of the spin connection. We propose an extended scheme based on an infinite-dimensional Lie superalgebra, called the tensor hierarchy algebra, that incorporates the section condition and resolves the above issue. The tensor hierarchy algebra defines a generalised differential complex, which provides a systematic description of gauge invariance and Bianchi identities. It furthermore provides an E11 representation for the field strengths, for which we define a twisted first-order self-duality equation underlying the dynamics.

  11. Detection and measurement of the intracellular calcium variation in follicular cells.

    PubMed

    Herrera-Navarro, Ana M; Terol-Villalobos, Iván R; Jiménez-Hernández, Hugo; Peregrina-Barreto, Hayde; Gonzalez-Barboza, José-Joel

    2014-01-01

This work presents a new method for measuring the variation of intracellular calcium in follicular cells. The proposal consists of two stages: (i) detection of the cells' nuclei and (ii) analysis of the fluorescence variations. The first stage is performed via a modified watershed transformation in which the labeling process is controlled. The detection process uses the contours of the cells as descriptors, enhanced with a morphological filter that homogenizes the luminance variation of the image. In the second stage, the fluorescence variations are modeled as an exponentially decreasing function, as they are highly correlated with changes in intracellular free Ca(2+). Additionally, a new morphological filter called the medium reconstruction process is introduced, which helps to enhance the data for the modeling process. This filter exploits the undermodeling and overmodeling properties of reconstruction operators so as to preserve the structure of the original signal. Finally, an experimental evaluation shows evidence of the capabilities of the proposal.

  12. Detection and Measurement of the Intracellular Calcium Variation in Follicular Cells

    PubMed Central

    Herrera-Navarro, Ana M.; Terol-Villalobos, Iván R.; Jiménez-Hernández, Hugo; Peregrina-Barreto, Hayde; Gonzalez-Barboza, José-Joel

    2014-01-01

    This work presents a new method for measuring the variation of intracellular calcium in follicular cells. The proposal consists of two stages: (i) the detection of the cells' nuclei and (ii) the analysis of the fluorescence variations. The first stage is performed via a modified watershed transformation, in which the labeling process is controlled. The detection process uses the contours of the cells as descriptors, enhanced with a morphological filter that homogenizes the luminance variation of the image. In the second stage, the fluorescence variations are modeled as an exponentially decreasing function, where the fluorescence variations are highly correlated with the changes of intracellular free Ca2+. Additionally, a new morphological filter, called the medium reconstruction process, is introduced to enhance the data for the modeling process. This filter exploits the undermodeling and overmodeling properties of reconstruction operators, so that it preserves the structure of the original signal. Finally, an experimental process shows evidence of the capabilities of the proposal. PMID:25342958
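    The exponentially decreasing fluorescence model mentioned above can be fitted, for instance, by log-linear least squares. A minimal sketch under that assumption (illustrative only, not the authors' implementation; function name is made up):

```python
import numpy as np

def fit_exponential_decay(t, f):
    """Fit f(t) = A * exp(-t / tau) by least squares on log(f).

    Returns (A, tau). Assumes strictly positive fluorescence values.
    """
    slope, intercept = np.polyfit(t, np.log(f), 1)
    return float(np.exp(intercept)), float(-1.0 / slope)

# Synthetic check: recover a known amplitude and time constant.
t = np.linspace(0.0, 10.0, 50)
f = 2.0 * np.exp(-t / 3.0)
A, tau = fit_exponential_decay(t, f)
```

    On noiseless data the log-linear fit recovers the parameters exactly; with noisy fluorescence traces a weighted or non-linear fit would be preferable.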

  13. Penalty Dynamic Programming Algorithm for Dim Targets Detection in Sensor Systems

    PubMed Central

    Huang, Dayu; Xue, Anke; Guo, Yunfei

    2012-01-01

    In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD), called penalty DP-TBD (PDP-TBD), is proposed. The performance of the tracking techniques is used as feedback to the detection part. The feedback is constructed as a penalty term in the merit function; the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint, that a sensor measurement can originate from at most one target or from clutter, is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations. PMID:22666074
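    The merit-function recursion behind DP-TBD can be sketched in one dimension. This is a generic track-before-detect with a simple cell-jump penalty, not the paper's state-estimate-dependent penalty; all names and parameter values are illustrative:

```python
import numpy as np

def dp_tbd(frames, jump_penalty=1.0, window=1):
    """Generic dynamic-programming track-before-detect over 1-D cells.

    frames: (T, N) array of measurement amplitudes. The merit of each
    cell accumulates amplitude along feasible transitions; jumps
    between cells are charged a penalty, a much simplified stand-in
    for PDP-TBD's state-dependent penalty term. Returns the best
    final merit and the backtracked cell track.
    """
    T, N = frames.shape
    merit = frames[0].astype(float).copy()
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        new = np.full(N, -np.inf)
        for i in range(N):
            for j in range(max(0, i - window), min(N, i + window + 1)):
                cand = merit[j] - jump_penalty * abs(i - j)
                if cand > new[i]:
                    new[i], back[t, i] = cand, j
            new[i] += frames[t, i]
        merit = new
    i = int(np.argmax(merit))
    best = float(merit[i])
    track = [i]
    for t in range(T - 1, 0, -1):
        i = int(back[t, i])
        track.append(i)
    return best, track[::-1]

# Synthetic check: a dim target drifting one cell per frame.
frames = np.zeros((3, 4))
frames[0, 0] = frames[1, 1] = frames[2, 2] = 5.0
best, track = dp_tbd(frames, jump_penalty=1.0)
```

    The penalty discourages physically implausible jumps before detection, which is the feedback idea the abstract describes in simplified form.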

  14. Identifying Two-Dimensional Z 2 Antiferromagnetic Topological Insulators

    NASA Astrophysics Data System (ADS)

    Bègue, F.; Pujol, P.; Ramazashvili, R.

    2018-01-01

    We revisit the question of whether a two-dimensional topological insulator may arise in a commensurate Néel antiferromagnet, where staggered magnetization breaks the symmetry with respect to both elementary translation and time reversal, but retains their product as a symmetry. In contrast to the so-called Z 2 topological insulators, an exhaustive characterization of antiferromagnetic topological phases with the help of topological invariants has been missing. We analyze a simple model of an antiferromagnetic topological insulator and chart its phase diagram, using a recently proposed criterion for centrosymmetric systems [13]. We then adapt two methods, originally designed for paramagnetic systems, and make antiferromagnetic topological phases manifest. The proposed methods apply far beyond the particular examples treated in this work, and admit straightforward generalization. We illustrate this by two examples of non-centrosymmetric systems, where no simple criteria have been known to identify topological phases. We also present, for some cases, an explicit construction of edge states in an antiferromagnetic topological insulator.

  15. An Alternative Approach to the Extended Drude Model

    NASA Astrophysics Data System (ADS)

    Gantzler, N. J.; Dordevic, S. V.

    2018-05-01

    The original Drude model, proposed over a hundred years ago, is still used today for the analysis of optical properties of solids. Within this model, both the plasma frequency and quasiparticle scattering rate are constant, which makes the model rather inflexible. In order to circumvent this problem, the so-called extended Drude model was proposed, which allowed for the frequency dependence of both the quasiparticle scattering rate and the effective mass. In this work we will explore an alternative approach to the extended Drude model. Here, one also assumes that the quasiparticle scattering rate is frequency dependent; however, instead of the effective mass, the plasma frequency becomes frequency-dependent. This alternative model is applied to the high Tc superconductor Bi2Sr2CaCu2O8+δ (Bi2212) with Tc = 92 K, and the results are compared and contrasted with the ones obtained from the conventional extended Drude model. The results point to several advantages of this alternative approach to the extended Drude model.
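    The conventional extended-Drude inversion referred to above extracts a frequency-dependent scattering rate and mass ratio from the complex optical conductivity. A sketch of those standard formulas in Gaussian units (this is the conventional analysis, not the authors' alternative parametrization with a frequency-dependent plasma frequency):

```python
import numpy as np

def extended_drude(omega, sigma, omega_p):
    """Conventional extended-Drude analysis (Gaussian units):
    1/tau(w) = (w_p^2 / 4 pi) * Re[1/sigma(w)]
    m*(w)/m  = -(w_p^2 / (4 pi w)) * Im[1/sigma(w)]
    """
    inv_sigma = 1.0 / sigma
    scattering_rate = (omega_p**2 / (4.0 * np.pi)) * inv_sigma.real
    mass_ratio = -(omega_p**2 / (4.0 * np.pi * omega)) * inv_sigma.imag
    return scattering_rate, mass_ratio

# Consistency check: a simple Drude conductivity with constant tau
# must give 1/tau(w) = 1/tau and m*(w)/m = 1 at every frequency.
omega = np.linspace(0.1, 10.0, 100)
tau, omega_p = 2.0, 5.0
sigma = (omega_p**2 / (4.0 * np.pi)) / (1.0 / tau - 1j * omega)
rate, mass = extended_drude(omega, sigma, omega_p)
```

    The alternative approach of the paper would instead hold the mass fixed and let the plasma frequency carry the frequency dependence.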

  16. Repeated extragenic sequences in prokaryotic genomes: a proposal for the origin and dynamics of the RUP element in Streptococcus pneumoniae.

    PubMed

    Oggioni, M R; Claverys, J P

    1999-10-01

    A survey of all Streptococcus pneumoniae GenBank/EMBL DNA sequence entries and of the public domain sequence (representing more than 90% of the genome) of an S. pneumoniae type 4 strain allowed identification of 108 copies of a 107-bp-long highly repeated intergenic element called RUP (for repeat unit of pneumococcus). Several features of the element, revealed in this study, led to the proposal that RUP is an insertion sequence (IS)-derivative that could still be mobile. Among these features are: (1) a highly significant homology between the terminal inverted repeats (IRs) of RUPs and of IS630-Spn1, a new putative IS of S. pneumoniae; and (2) insertion at a TA dinucleotide, a characteristic target of several members of the IS630 family. Trans-mobilization of RUP is therefore proposed to be mediated by the transposase of IS630-Spn1. To account for the observation that RUPs are distributed among four subtypes which exhibit different degrees of sequence homogeneity, a scenario is invoked based on successive stages of RUP mobility and non-mobility, depending on whether an active transposase is present or absent. In the latter situation, an active transposase could be reintroduced into the species through natural transformation. Examination of sequences flanking RUP revealed a preferential association with ISs. It also provided evidence that RUPs promote sequence rearrangements, thereby contributing to genome flexibility. The possibility that RUP preferentially targets transforming DNA of foreign origin and subsequently favours disruption/rearrangement of exogenous sequences is discussed.

  17. Quick Vegas: Improving Performance of TCP Vegas for High Bandwidth-Delay Product Networks

    NASA Astrophysics Data System (ADS)

    Chan, Yi-Cheng; Lin, Chia-Liang; Ho, Cheng-Yuan

    An important issue in designing a TCP congestion control algorithm is that it should allow the protocol to quickly adjust the end-to-end communication rate to the bandwidth of the bottleneck link. However, TCP congestion control may function poorly in high bandwidth-delay product networks because of its slow response with large congestion windows. In this paper, we propose an enhanced version of TCP Vegas called Quick Vegas, in which we present an efficient congestion window control algorithm for a TCP source. Our algorithm improves the slow-start and congestion-avoidance techniques of the original Vegas. Simulation results show that Quick Vegas significantly improves the performance of connections while remaining fair as the bandwidth-delay product increases.
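    The classic Vegas congestion-avoidance rule that Quick Vegas builds on compares expected and actual throughput and nudges the window accordingly. A minimal sketch of that baseline rule (the paper's Quick Vegas modifications are not reproduced; alpha/beta values are the commonly cited defaults, used here for illustration):

```python
def vegas_update(cwnd, base_rtt, rtt, alpha=1.0, beta=3.0):
    """One congestion-avoidance step of a Vegas-style controller."""
    expected = cwnd / base_rtt              # throughput with no queuing
    actual = cwnd / rtt                     # currently observed throughput
    diff = (expected - actual) * base_rtt   # estimated backlog, in packets
    if diff < alpha:
        return cwnd + 1                     # path under-used: grow
    if diff > beta:
        return cwnd - 1                     # queue building up: shrink
    return cwnd                             # within the target band

# RTT barely above the propagation delay: the window grows.
w1 = vegas_update(10, base_rtt=1.0, rtt=1.05)
# Congested path (RTT doubled): the window shrinks.
w2 = vegas_update(10, base_rtt=1.0, rtt=2.0)
```

    The slow response criticized in the abstract comes from these one-packet-per-RTT adjustments, which Quick Vegas accelerates for large bandwidth-delay products.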

  18. Heat shields for aircraft - A new concept to save lives in crash fires.

    NASA Technical Reports Server (NTRS)

    Neel, C. B.; Parker, J. A.; Fish, R. H.; Henshaw, J.; Newland, J. H.; Tempesta, F. L.

    1971-01-01

    A passenger compartment surrounded by a fire-retardant shell, to protect the occupants long enough for the fire to burn out or for fire-fighting equipment to reach the aircraft and extinguish it, is proposed as a new concept for saving lives in crash fires. This concept is made possible by the recent development of two new fire-retardant materials: a very lightweight foam plastic, called polyisocyanurate foam, and an intumescent paint. Exposed to heat, the intumescent paint expands to many times its original thickness and insulates the surface underneath it. Demonstration tests are illustrated, described and discussed. However, some problems, such as preventing fuselage rupture and protecting windows, must be solved before such a system can be used.

  19. Modeling the Kinetics of Root Gravireaction

    NASA Astrophysics Data System (ADS)

    Kondrachuk, Alexander V.; Starkov, Vyacheslav N.

    2011-02-01

    The known "sun-flower equation" (SFE), originally proposed to model root circumnutation, was used to describe the simplest tip root graviresponse. Two forms of the SFE (integro-differential and differential-delayed) were solved, analyzed, and compared with each other. The numerical solutions of these equations were found to match with arbitrary accuracy. The analysis of the solutions focused on time-lag effects on the kinetics of root tip bending. The results of the modeling correlate well with experiment at the initial stages of the root tip graviresponse. Further development of the model calls for its systematic comparison with specially designed experiments, which would include measuring the kinetics of root tip bending before gravistimulation over a period of time longer than the time lag.
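    The differential-delayed form mentioned above depends on the state at a lagged time, which a fixed-step Euler scheme can integrate by buffering the history. A sketch for a generic linear delay equation x'(t) = -k x(t - tau), not the actual sun-flower equation; all parameters are illustrative:

```python
import numpy as np

def euler_dde(k=1.0, tau=0.1, x0=1.0, dt=0.001, t_end=5.0):
    """Integrate x'(t) = -k * x(t - tau) by forward Euler,
    keeping past values in a buffer; history x(t <= 0) = x0."""
    n_delay = int(round(tau / dt))
    n_steps = int(round(t_end / dt))
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for n in range(n_steps):
        # state tau seconds ago (constant history before t = 0)
        x_lag = xs[n - n_delay] if n >= n_delay else x0
        xs[n + 1] = xs[n] + dt * (-k * x_lag)
    return xs

xs = euler_dde()
```

    For k * tau < pi/2 the zero solution is asymptotically stable, so with the small lag chosen here the trajectory decays almost like a plain exponential; larger lags introduce the oscillatory bending kinetics the abstract analyzes.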

  20. An accurate boundary element method for the exterior elastic scattering problem in two dimensions

    NASA Astrophysics Data System (ADS)

    Bao, Gang; Xu, Liwei; Yin, Tao

    2017-11-01

    This paper is concerned with a Galerkin boundary element method solving the two dimensional exterior elastic wave scattering problem. The original problem is first reduced to the so-called Burton-Miller [1] boundary integral formulation, and essential mathematical features of its variational form are discussed. In numerical implementations, a newly-derived and analytically accurate regularization formula [2] is employed for the numerical evaluation of hyper-singular boundary integral operator. A new computational approach is employed based on the series expansions of Hankel functions for the computation of weakly-singular boundary integral operators during the reduction of corresponding Galerkin equations into a discrete linear system. The effectiveness of proposed numerical methods is demonstrated using several numerical examples.

  1. Three-Dimensional General Relativistic Monte Carlo Neutrino Transport in Neutron Star Mergers

    NASA Astrophysics Data System (ADS)

    Richers, Sherwood; Radice, David

    2018-06-01

    How neutrinos interact with the debris ejected from merging neutron stars determines how much matter escapes, how hot the matter is, and the relative amounts of neutrons and protons. This makes understanding neutrino irradiation of ejected matter a necessary part of interpreting recent and future observations of so-called "kilonovae" to determine whether neutron star mergers can be the origin of heavy elements in the universe. I will discuss a new Monte Carlo method for simulating neutrino transport in these highly relativistic, multi-dimensional environments. I will use this tool to estimate how well approximate transport methods capture the neutrino irradiation and propose improvements to approximate methods that will aid in accurate modeling and interpretation of kilonovae.

  2. The dark side of flipped trinification

    NASA Astrophysics Data System (ADS)

    Dong, P. V.; Huong, D. T.; Queiroz, Farinaldo S.; Valle, José W. F.; Vaquera-Araujo, C. A.

    2018-04-01

    We propose a model, called flipped trinification, which unifies the Left-Right symmetry with the SU(3) L gauge group and is based on the SU(3) C ⊗ SU(3) L ⊗ SU(3) R ⊗ U(1) X gauge group. The model inherits the interesting features of both symmetries while elegantly explaining the origin of the matter parity, W P = (-1)3( B- L)+2 s , and dark matter stability. We develop the details of the spontaneous symmetry breaking mechanism in the model, determining the relevant mass eigenstates and showing how neutrino masses are easily generated via the seesaw mechanism. Moreover, we introduce viable dark matter candidates, encompassing fermion, scalar, and possibly vector fields, leading to a potentially novel dark matter phenomenology.

  3. New hematological key for bovine leukemia virus-infected Japanese Black cattle.

    PubMed

    Mekata, Hirohisa; Yamamoto, Mari; Kirino, Yumi; Sekiguchi, Satoshi; Konnai, Satoru; Horii, Yoichiro; Norimine, Junzo

    2018-02-20

    The European Community's (EC) Key, also called Bendixen's Key, is a well-established bovine leukemia virus (BLV) diagnostic method that classifies cattle according to the absolute lymphocyte count and age. The EC Key was originally designed for dairy cattle and is not necessarily suitable for Japanese Black (JB) beef cattle. This study revealed that the lymphocyte counts in BLV-free and -infected JB cattle were significantly lower than those in Holstein cattle. Therefore, applying the EC Key to JB cattle could leave a large number of BLV-infected cattle undetected. Our proposed hematological key, designed for JB cattle, improves the detection of BLV-infected cattle by approximately 20%. We believe that this study could help promote BLV control.

  4. Annual Report on Our Call to Action: Strategic Plan for the Montgomery County Public Schools

    ERIC Educational Resources Information Center

    Montgomery County Public Schools, 2004

    2004-01-01

    In June 2003 the Board of Education adopted "Our Call to Action, Pursuit of Excellence," the second edition of the school system's strategic plan. This update of the original November 1999 Our Call to Action, while remaining focused on the core mission of providing every student with a high-quality, world-class education, strengthened the…

  5. Developmental and evolutionary significance of the mandibular arch and prechordal/premandibular cranium in vertebrates: revising the heterotopy scenario of gnathostome jaw evolution

    PubMed Central

    Kuratani, Shigeru; Adachi, Noritaka; Wada, Naoyuki; Oisi, Yasuhiro; Sugahara, Fumiaki

    2013-01-01

    The cephalic neural crest produces streams of migrating cells that populate pharyngeal arches and a more rostral, premandibular domain, to give rise to an extensive ectomesenchyme in the embryonic vertebrate head. The crest cells forming the trigeminal stream are the major source of the craniofacial skeleton; however, there is no clear distinction between the mandibular arch and the premandibular domain in this ectomesenchyme. The question regarding the evolution of the gnathostome jaw is, in part, a question about the differentiation of the mandibular arch, the rostralmost component of the pharynx, and in part a question about the developmental fate of the premandibular domain. We address the developmental definition of the mandibular arch in connection with the developmental origin of the trabeculae, paired cartilaginous elements generally believed to develop in the premandibular domain, and also of enigmatic cartilaginous elements called polar cartilages. Based on comparative embryology, we propose that the mandibular arch ectomesenchyme in gnathostomes can be defined as a Dlx1-positive domain, and that the polar cartilages, which develop from the Dlx1-negative premandibular ectomesenchyme, would represent merely posterior parts of the trabeculae. We also show, in the lamprey embryo, early migration of mandibular arch mesenchyme into the premandibular domain, and propose an updated version of the heterotopy theory on the origin of the jaw. PMID:22500853

  6. A model for the repeating FRB 121102 in the AGN scenario

    NASA Astrophysics Data System (ADS)

    Vieyro, F. L.; Romero, G. E.; Bosch-Ramon, V.; Marcote, B.; del Valle, M. V.

    2017-06-01

    Context. Fast radio bursts (FRBs) are transient sources of unknown origin. Recent radio and optical observations have provided strong evidence for an extragalactic origin of the phenomenon and the precise localization of the repeating FRB 121102. Observations using the Karl G. Jansky Very Large Array (VLA) and very-long-baseline interferometry (VLBI) have revealed the existence of a continuum non-thermal radio source consistent with the location of the bursts in a dwarf galaxy. All these new data rule out several models that were previously proposed, and impose stringent constraints to new models. Aims: We aim to model FRB 121102 in light of the new observational results in the active galactic nucleus (AGN) scenario. Methods: We propose a model for repeating FRBs in which a non-steady relativistic e±-beam, accelerated by an impulsive magnetohydrodynamic driven mechanism, interacts with a cloud at the centre of a star-forming dwarf galaxy. The interaction generates regions of high electrostatic field called cavitons in the plasma cloud. Turbulence is also produced in the beam. These processes, plus particle isotropization, the interaction scale, and light retardation effects, provide the necessary ingredients for short-lived, bright coherent radiation bursts. Results: The mechanism studied in this work explains the general properties of FRB 121102, and may also be applied to other repetitive FRBs. Conclusions: Coherent emission from electrons and positrons accelerated in cavitons provides a plausible explanation of FRBs.

  7. 78 FR 76257 - Rural Call Completion

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-17

    ... FCC seeks comments on additional measures that may help the Commission ensure a reasonable and... calls. The Further Notice of Proposed Rulemaking seeks public comment on additional measures intended to... signaling. In the Further Notice of Proposed Rulemaking (FNPRM), we seek comment on additional measures that...

  8. High-order statistics of Weber local descriptors for image representation.

    PubMed

    Han, Xian-Hua; Chen, Yen-Wei; Xu, Gang

    2015-06-01

    Highly discriminant visual features play a key role in different image classification applications. This study aims to realize a method for extracting highly discriminant features from images by exploring a robust local descriptor inspired by Weber's law. The investigated local descriptor is based on the fact that human perception of a pattern depends not only on the absolute intensity of the stimulus but also on the relative variance of the stimulus. Therefore, we first transform the original stimulus (the images in our study) into a differential excitation domain according to Weber's law, and then explore a local patch, called a micro-Texton, in the transformed domain as the Weber local descriptor (WLD). Furthermore, we propose to model the Weber local descriptors with a parametric probability process and to extract the higher-order statistics of the model parameters for image representation. The proposed strategy can adaptively characterize the WLD space using a generative probability model and learn the parameters that better fit the training space, leading to a more discriminant representation of images. To validate the efficiency of the proposed strategy, we apply it to three image classification applications (texture, food image, and HEp-2 cell pattern recognition), which show that our strategy has advantages over the state-of-the-art approaches.
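    The Weber-law transform described above maps each pixel to a "differential excitation", commonly defined as the arctangent of the summed neighbor differences relative to the center intensity. A sketch of that standard WLD component (the micro-Texton extraction and probabilistic modeling of the paper are not reproduced):

```python
import numpy as np

def differential_excitation(patch, eps=1e-8):
    """Differential excitation of a 3x3 patch:
    xi = arctan( sum over the 8-neighborhood of (x_i - x_c) / x_c )."""
    center = patch[1, 1]
    # sum of (x_i - x_c) over the 8 neighbors
    diff_sum = patch.sum() - 9.0 * center
    return float(np.arctan(diff_sum / (center + eps)))

# A flat patch carries no excitation...
flat = np.full((3, 3), 5.0)
xi_flat = differential_excitation(flat)
# ...while a single brighter neighbor gives arctan(1/1) = pi/4.
bump = np.ones((3, 3))
bump[0, 0] = 2.0
xi_bump = differential_excitation(bump)
```

    The arctangent compresses large relative variations, which is what makes the descriptor robust to illumination changes.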

  9. Developing the fuzzy c-means clustering algorithm based on maximum entropy for multitarget tracking in a cluttered environment

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Li, Yaan; Yu, Jing; Li, Yuxing

    2018-01-01

    For fast and effective tracking of multiple targets in a cluttered environment, we propose a multiple target tracking (MTT) algorithm, called maximum entropy fuzzy c-means clustering joint probabilistic data association, that combines fuzzy c-means clustering with the joint probabilistic data association (JPDA) algorithm. The algorithm uses the membership value to express the probability that a measurement originates from a given target. The membership value is obtained from the fuzzy c-means clustering objective function optimized by the maximum entropy principle. To account for the effect of measurements shared among targets, we use a correction factor to adjust the association probability matrix when estimating the state of each target. As this algorithm avoids splitting the confirmation matrix, it overcomes the high computational load of the JPDA algorithm. Simulations and analysis of tracking neighboring parallel targets and crossing targets in cluttered environments of different densities show that the proposed algorithm can realize MTT quickly and efficiently in a cluttered environment. Further, the performance of the proposed algorithm remains constant with increasing process noise variance. The proposed algorithm has the advantages of efficiency and low computational load, which ensure good performance when tracking multiple targets in a dense cluttered environment.
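    Maximum-entropy optimization of a fuzzy clustering objective yields Gibbs-like membership values, i.e., a softmax over negative squared distances. A sketch of that membership update only (the JPDA coupling and correction factor of the paper are not reproduced; `lam` is an illustrative temperature parameter):

```python
import numpy as np

def max_entropy_memberships(points, centers, lam=1.0):
    """Membership u[j, i] of point j in cluster i, maximizing entropy
    subject to an expected squared-distance cost:
    u[j, i] proportional to exp(-||x_j - c_i||^2 / lam),
    with each row normalized to sum to 1."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / lam)
    return w / w.sum(axis=1, keepdims=True)

points = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 0.0]])
centers = np.array([[0.0, 0.0], [10.0, 0.0]])
u = max_entropy_memberships(points, centers, lam=4.0)
```

    A point sitting on a cluster center gets membership near 1 for that cluster, while an equidistant point splits evenly; in the MTT setting these memberships play the role of association probabilities.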

  10. A Novel Coarsening Method for Scalable and Efficient Mesh Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, A; Hysom, D; Gunney, B

    2010-12-02

    In this paper, we propose a novel mesh coarsening method called the brick coarsening method. The proposed method can be used in conjunction with any graph partitioner and scales to very large meshes. This method reduces the problem space by decomposing the original mesh into fixed-size blocks of nodes called bricks, layered in a way similar to conventional brick laying, and then assigning each node of the original mesh to the appropriate brick. Our experiments indicate that the proposed method scales to very large meshes while allowing a simple RCB partitioner to produce higher-quality partitions with significantly fewer edge cuts. Our results further indicate that the proposed brick-coarsening method allows more complicated partitioners like PT-Scotch to scale to very large problem sizes while still maintaining good partitioning performance with a relatively good edge-cut metric. Graph partitioning is an important problem that has many scientific and engineering applications in such areas as VLSI design, scientific computing, and resource management. Given a graph G = (V,E), where V is the set of vertices and E is the set of edges, the (k-way) graph partitioning problem is to partition the vertices of the graph (V) into k disjoint groups such that each group contains a roughly equal number of vertices and the number of edges connecting vertices in different groups is minimized. Graph partitioning plays a key role in large scientific computing, especially in mesh-based computations, as it is used as a tool to minimize the volume of communication and to ensure a well-balanced load across computing nodes. The impact of graph partitioning on the reduction of communication can be easily seen, for example, in different iterative methods for solving a sparse system of linear equations.
Here, a graph partitioning technique is applied to the matrix, which is basically a graph in which each edge is a non-zero entry, to allocate groups of vertices to processors in such a way that much of the matrix-vector multiplication can be performed locally on each processor, minimizing communication. Furthermore, a good graph partitioning scheme ensures that an equal amount of computation is performed on each processor. Graph partitioning is a well-known NP-complete problem, and thus the most commonly used graph partitioning algorithms employ some form of heuristics. These algorithms vary in terms of their complexity, partition generation time, and the quality of partitions, and they tend to trade off these factors. A significant challenge we are currently facing at the Lawrence Livermore National Laboratory is how to partition very large meshes on massive-size distributed memory machines like IBM BlueGene/P, where scalability becomes a big issue. For example, we have found that ParMetis, a very popular graph partitioning tool, can only scale to 16K processors. An ideal graph partitioning method in such an environment should be fast and scale to very large meshes, while producing high-quality partitions. This is an extremely challenging task: to scale to that level, the partitioning algorithm should be simple and be able to produce partitions that minimize inter-processor communication and balance the load imposed on the processors. Our goals in this work are two-fold: (1) to develop a new scalable graph partitioning method with good load balancing and communication reduction capability; and (2) to study the performance of the proposed partitioning method on very large parallel machines using actual data sets and compare the performance to that of existing methods. The proposed method achieves the desired scalability by reducing the mesh size.
For this, it coarsens an input mesh into a smaller mesh by coalescing the vertices and edges of the original mesh into a set of mega-vertices and mega-edges. A new coarsening method called the brick algorithm is developed in this research. In the brick algorithm, the zones in a given mesh are first grouped into fixed-size blocks called bricks. These bricks are then laid in a way similar to the conventional brick-laying technique, which reduces the number of neighboring blocks each block needs to communicate with. The contributions of this research are as follows: (1) we have developed a novel method that scales to very large problem sizes while producing high-quality mesh partitions; (2) we measured the performance and scalability of the proposed method on a machine of massive size using a set of actual large complex data sets, scaling to a mesh with 110 million zones; to the best of our knowledge, this is the largest complex mesh to which a partitioning method has been successfully applied; and (3) we have shown that the proposed method can reduce the number of edge cuts by as much as 65%.
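    The coarsening step described above can be sketched as assigning each mesh node to a fixed-size block by integer division of its grid coordinates, then collapsing edges that cross block boundaries into mega-edges. A toy illustration (the staggered brick-laying offsets and the coupling to a partitioner are omitted; names are illustrative):

```python
def brick_coarsen(nodes, edges, brick_shape):
    """nodes: {node_id: (x, y)} integer grid coordinates.
    edges: iterable of (u, v) node-id pairs.
    Returns the node-to-brick assignment and the set of mega-edges
    between distinct bricks of the coarsened graph."""
    assign = {
        n: tuple(c // b for c, b in zip(pos, brick_shape))
        for n, pos in nodes.items()
    }
    mega_edges = {
        frozenset((assign[u], assign[v]))
        for u, v in edges
        if assign[u] != assign[v]
    }
    return assign, mega_edges

# Four nodes on a line, coarsened with 2x2 bricks: two bricks,
# joined by a single mega-edge.
nodes = {0: (0, 0), 1: (1, 0), 2: (2, 0), 3: (3, 0)}
edges = [(0, 1), (1, 2), (2, 3)]
assign, mega = brick_coarsen(nodes, edges, (2, 2))
```

    The partitioner then runs on the much smaller mega-graph, which is what makes the approach scale.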

  11. Development of an RF-EMF Exposure Surrogate for Epidemiologic Research.

    PubMed

    Roser, Katharina; Schoeni, Anna; Bürgi, Alfred; Röösli, Martin

    2015-05-22

    Exposure assessment is a crucial part of studying potential effects of RF-EMF. Using data from the HERMES study on adolescents, we developed an integrative exposure surrogate combining near-field and far-field RF-EMF exposure in a single brain and whole-body exposure measure. Contributions from far-field sources were modelled by propagation modelling and multivariable regression modelling using personal measurements. Contributions from near-field sources were assessed from both questionnaires and mobile phone operator records. Mean cumulative brain and whole-body doses were 1559.7 mJ/kg and 339.9 mJ/kg per day, respectively. 98.4% of the brain dose originated from near-field sources, mainly from GSM mobile phone calls (93.1%) and from DECT phone calls (4.8%). Main contributors to the whole-body dose were GSM mobile phone calls (69.0%), use of computer, laptop and tablet connected to WLAN (12.2%) and data traffic on the mobile phone via WLAN (6.5%). The exposure from mobile phone base stations contributed 1.8% to the whole-body dose, while uplink exposure from other people's mobile phones contributed 3.6%. In conclusion, the proposed approach is considered useful for combining near-field and far-field exposure into an integrative exposure surrogate for exposure assessment in epidemiologic studies. However, substantial uncertainties remain about exposure contributions from various near-field and far-field sources.

  12. Development of an RF-EMF Exposure Surrogate for Epidemiologic Research

    PubMed Central

    Roser, Katharina; Schoeni, Anna; Bürgi, Alfred; Röösli, Martin

    2015-01-01

    Exposure assessment is a crucial part of studying potential effects of RF-EMF. Using data from the HERMES study on adolescents, we developed an integrative exposure surrogate combining near-field and far-field RF-EMF exposure in a single brain and whole-body exposure measure. Contributions from far-field sources were modelled by propagation modelling and multivariable regression modelling using personal measurements. Contributions from near-field sources were assessed from both questionnaires and mobile phone operator records. Mean cumulative brain and whole-body doses were 1559.7 mJ/kg and 339.9 mJ/kg per day, respectively. 98.4% of the brain dose originated from near-field sources, mainly from GSM mobile phone calls (93.1%) and from DECT phone calls (4.8%). Main contributors to the whole-body dose were GSM mobile phone calls (69.0%), use of computer, laptop and tablet connected to WLAN (12.2%) and data traffic on the mobile phone via WLAN (6.5%). The exposure from mobile phone base stations contributed 1.8% to the whole-body dose, while uplink exposure from other people’s mobile phones contributed 3.6%. In conclusion, the proposed approach is considered useful for combining near-field and far-field exposure into an integrative exposure surrogate for exposure assessment in epidemiologic studies. However, substantial uncertainties remain about exposure contributions from various near-field and far-field sources. PMID:26006132

  13. Corruption of phage display libraries by target-unrelated clones: diagnosis and countermeasures.

    PubMed

    Thomas, William D; Golomb, Miriam; Smith, George P

    2010-12-15

    Phage display is used to discover peptides or proteins with a desired target property, most often affinity for a target selector molecule. Libraries of phage clones displaying diverse surface peptides are subject to a selection process designed to enrich for the target behavior and subsequently propagated to restore phage numbers. A recurrent problem is enrichment of clones, called target-unrelated phages or peptides (TUPs), that lack the target behavior. Many TUPs are propagation-related; they have mutations conferring a growth advantage and are enriched during the propagations accompanying selection. Unlike other filamentous phage libraries, fd-tet-based libraries are relatively resistant to propagation-related TUP corruption. Their minus-strand origin is disrupted by a large cassette that simultaneously confers resistance to tetracycline and imposes a rate-limiting growth defect that cannot be bypassed with simple mutations. Nonetheless, a new type of propagation-related TUP emerged in the output of in vivo selections from an fd-tet library. The founding clone had a complex rearrangement that restored the minus-strand origin while retaining tetracycline resistance. The rearrangement involved two recombination events, one with a contaminant having a wild-type minus-strand origin. The founder's infectivity advantage spread by simple recombination to clones displaying different peptides. We propose measures for minimizing TUP corruption. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Corruption of phage-display libraries by target-unrelated clones: Diagnosis and countermeasures

    PubMed Central

    Thomas, William D.; Golomb, Miriam; Smith, George P.

    2010-01-01

    Phage display is used to discover peptides or proteins with a desired target property—most often, affinity for a target selector molecule. Libraries of phage clones displaying diverse surface peptides are subject to a selection process designed to enrich for the target behavior, and subsequently propagated to restore phage numbers. A recurrent problem is enrichment of clones, called target-unrelated phage (TUPs), that lack the target behavior. Many TUPs are propagation-related; they have mutations conferring a growth advantage, and are enriched during the propagations accompanying selection. Unlike other filamentous phage libraries, fd-tet-based libraries are relatively resistant to propagation-related TUP corruption. Their minus strand origin is disrupted by a large cassette that simultaneously confers resistance to tetracycline and imposes a rate-limiting growth defect that cannot be bypassed with simple mutations. Nonetheless, a new type of propagation-related TUP emerged in the output of in vivo selections from an fd-tet library. The founding clone had a complex rearrangement that restored the minus strand origin while retaining tetracycline resistance. The rearrangement involved two recombination events, one with a contaminant having a wild-type minus strand origin. The founder’s infectivity advantage spread by simple recombination to clones displaying different peptides. We propose measures for minimizing TUP corruption. PMID:20692225

  15. A secure online image trading system for untrusted cloud environments.

    PubMed

    Munadi, Khairul; Arnia, Fitri; Syaryadhi, Mohd; Fujiyoshi, Masaaki; Kiya, Hitoshi

    2015-01-01

    In conventional image trading systems, images are usually stored unprotected on a server, rendering them vulnerable to untrusted server providers and malicious intruders. This paper proposes a conceptual image trading framework that enables secure storage and retrieval over Internet services. The process involves three parties: an image publisher, a server provider, and an image buyer. The aim is to facilitate secure storage and retrieval of original images for commercial transactions, while preventing untrusted server providers and unauthorized users from gaining access to the true contents. The framework exploits the Discrete Cosine Transform (DCT) coefficients and the moment invariants of images. Original images are visually protected in the DCT domain and stored on a repository server. Small representations of the original images, called thumbnails, are generated and made publicly accessible for browsing. When a buyer is interested in a thumbnail, he/she sends a query to retrieve the visually protected image. The thumbnails and protected images are matched using the DC component of the DCT coefficients and the moment invariant feature. After the matching process, the server returns the corresponding protected image to the buyer. However, the image remains visually protected unless a key is granted. Our target application is the online market, where publishers sell their stock images over the Internet using public cloud servers.
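
    The matching step described above keys on the DC component of the DCT coefficients. As a rough illustration (not the authors' implementation), a naive orthonormal 2-D DCT-II shows why the DC term is a cheap matching feature: it is proportional to the block's mean intensity, so it survives visual protection of the higher-frequency coefficients.

```python
import math

def dct2(block):
    """Naive orthonormal 2-D DCT-II of a square pixel block."""
    n = len(block)
    def c(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            out[u][v] = c(u) * c(v) * s
    return out

# Hypothetical 4x4 pixel block; the DC coefficient [0][0] equals
# sum(pixels) / n for an n x n block, i.e. a scaled mean intensity.
block = [[10, 20, 30, 40],
         [40, 30, 20, 10],
         [10, 20, 30, 40],
         [40, 30, 20, 10]]
dc = dct2(block)[0][0]
print(dc)  # 100.0 == sum(pixels) / 4
```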

  16. Cervical Vertigo: Historical Reviews and Advances.

    PubMed

    Peng, Baogan

    2018-01-01

    Vertigo is one of the most common presentations in adult patients. Among the various causes of vertigo, so-called cervical vertigo is still a controversial entity. Cervical vertigo was first thought to be due to abnormal input from cervical sympathetic nerves based on the work of Barré and Liéou in 1928. Later studies found that cerebral blood flow is not influenced by sympathetic stimulation. Ryan and Cope in 1955 proposed that abnormal sensory information from the damaged joint receptors of upper cervical regions may be related to pathologies of vertigo of cervical origin. Further studies found that cervical vertigo seems to originate from diseased cervical intervertebral discs. Recent research found that the ingrowth of a large number of Ruffini corpuscles into diseased cervical discs may be related to vertigo of cervical origin. Abnormal neck proprioceptive input integrated from the signals of Ruffini corpuscles in diseased cervical discs and muscle spindles in tense neck muscles secondary to neck pain is transmitted to the central nervous system and leads to a sensory mismatch with vestibular and other sensory information, resulting in a subjective feeling of vertigo and unsteadiness. Further studies are needed to illustrate the complex pathophysiologic mechanisms of cervical vertigo and to better understand and manage this perplexing entity. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Acoustic imaging of a duct spinning mode by the use of an in-duct circular microphone array.

    PubMed

    Wei, Qingkai; Huang, Xun; Peers, Edward

    2013-06-01

    An imaging method for acoustic spinning modes propagating within a circular duct, using only surface pressure information, is introduced in this paper. The proposed method is developed theoretically and demonstrated with a numerical simulation case. At present, in-duct measurements must be conducted with an in-duct microphone array, which cannot provide the complete acoustic solution across the test section. The proposed method estimates the unmeasurable information by forming a so-called observer. The fundamental idea behind the method was originally developed in control theory for ordinary differential equations. Spinning mode propagation, however, is formulated in partial differential equations. A finite difference technique is used to reduce the associated partial differential equations to a classical form in control theory. The observer method can thereafter be applied straightforwardly. The algorithm is recursive and can thus operate in real time. A numerical simulation for a straight circular duct is conducted. The acoustic solutions on the test section are reconstructed in good agreement with analytical solutions. The results suggest the potential and applications of the proposed method.
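
    The observer idea borrowed from control theory can be illustrated on the simplest possible case. The sketch below is a scalar discrete-time Luenberger observer with made-up plant numbers, not the paper's finite-difference PDE formulation: the state estimate is corrected by the measured output error and converges at a rate set by the observer gain.

```python
# Scalar Luenberger observer: xhat' = a*xhat + L*(y - c*xhat).
# Plant parameters and gain are illustrative; L is chosen so |a - L*c| < 1,
# which makes the estimation error shrink as (a - L*c)**k.
a, c = 0.9, 1.0     # plant dynamics and output map
L = 0.5             # observer gain

x, xhat = 5.0, 0.0  # true state (unknown to the observer) and its estimate
for _ in range(50):
    y = c * x                              # measured output
    xhat = a * xhat + L * (y - c * xhat)   # predict, then correct with output error
    x = a * x                              # plant evolves

print(abs(x - xhat))  # tiny: error has decayed by a factor 0.4**50
```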

  18. Hierarchical Interactions Model for Predicting Mild Cognitive Impairment (MCI) to Alzheimer's Disease (AD) Conversion

    PubMed Central

    Li, Han; Liu, Yashu; Gong, Pinghua; Zhang, Changshui; Ye, Jieping

    2014-01-01

    Identifying patients with Mild Cognitive Impairment (MCI) who are likely to convert to dementia has recently attracted increasing attention in Alzheimer's disease (AD) research. An accurate prediction of conversion from MCI to AD can help clinicians initiate treatments at an early stage and monitor their effectiveness. However, existing prediction systems based on the original biosignatures are not satisfactory. In this paper, we propose to fit the prediction models using pairwise biosignature interactions, thus capturing higher-order relationships among biosignatures. Specifically, we employ hierarchical constraints and sparsity regularization to prune the high-dimensional input features. Based on the significant biosignatures and underlying interactions identified, we build classifiers to predict the conversion probability from the selected features. We further analyze the underlying interaction effects of different biosignatures based on the so-called stable expectation scores. We used 293 MCI subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database with MRI measurements at baseline to evaluate the effectiveness of the proposed method. Our proposed method achieves better classification performance than state-of-the-art methods. Moreover, we discover several significant interactions predictive of MCI-to-AD conversion. These results shed light on improving prediction performance using interaction features. PMID:24416143
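
    The pairwise-interaction features described above are simple to sketch. The helper below is an illustrative reconstruction, not the authors' code: it augments a raw feature vector with all second-order products, which is the feature expansion the hierarchical constraints then prune.

```python
from itertools import combinations

def pairwise_interactions(x):
    """Augment a feature vector with all pairwise products x_i * x_j,
    i.e. the second-order interaction terms added to the raw biosignatures."""
    return list(x) + [x[i] * x[j] for i, j in combinations(range(len(x)), 2)]

# Three raw features yield three raw terms plus C(3,2) = 3 interaction terms.
print(pairwise_interactions([1.0, 2.0, 3.0]))  # [1.0, 2.0, 3.0, 2.0, 3.0, 6.0]
```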

  19. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation.

    PubMed

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods make strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions that have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions may no longer be valid. To overcome this weakness, we propose a new clustering algorithm, the localized ambient solidity separation (LASS) algorithm, which uses a new isolation criterion called centroid distance. Compared with other density-based isolation criteria, the proposed centroid distance isolation criterion addresses the problems caused by high dimensionality and varying density. An experiment on a designed two-dimensional benchmark dataset shows that the proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method in separating naturally isolated clusters but can also identify clusters that are adjacent, overlapping, or under background noise. Finally, we compared the LASS algorithm with the dissimilarity increments clustering method on a massive computer user dataset of over two million records containing demographic and behavioral information. The results show that the LASS algorithm works extremely well on this computer user dataset and can extract more knowledge from it.

  20. Metabolic network visualization eliminating node redundance and preserving metabolic pathways

    PubMed Central

    Bourqui, Romain; Cottret, Ludovic; Lacroix, Vincent; Auber, David; Mary, Patrick; Sagot, Marie-France; Jourdan, Fabien

    2007-01-01

    Background The tools available to draw and manipulate representations of metabolism are usually restricted to metabolic pathways. This limitation becomes problematic when studying processes that span several pathways. The various attempts that have been made to draw genome-scale metabolic networks face two shortcomings: (1) they do not use contextual information, which leads to dense, hard-to-interpret drawings; (2) they impose very constrained standards, which implies, in particular, duplicating nodes, making topological analysis considerably more difficult. Results We propose a method, called MetaViz, which enables the drawing of a genome-scale metabolic network while taking into account its organization into pathways. The method consists of two steps: a clustering step, which addresses the pathway overlapping problem, and a drawing step, which consists of drawing the clustered graph and each cluster. Conclusion The method we propose is original and addresses new drawing issues arising from the no-duplication constraint. We do not propose a single drawing but rather several alternative ways of presenting metabolism, depending on the pathway on which one wishes to focus. We believe this provides a valuable tool to explore the pathway structure of metabolism. PMID:17608928

  1. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    PubMed Central

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods make strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions that have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions may no longer be valid. To overcome this weakness, we propose a new clustering algorithm, the localized ambient solidity separation (LASS) algorithm, which uses a new isolation criterion called centroid distance. Compared with other density-based isolation criteria, the proposed centroid distance isolation criterion addresses the problems caused by high dimensionality and varying density. An experiment on a designed two-dimensional benchmark dataset shows that the proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method in separating naturally isolated clusters but can also identify clusters that are adjacent, overlapping, or under background noise. Finally, we compared the LASS algorithm with the dissimilarity increments clustering method on a massive computer user dataset of over two million records containing demographic and behavioral information. The results show that the LASS algorithm works extremely well on this computer user dataset and can extract more knowledge from it. PMID:26221133

  2. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... national origin group (see section 4 above) constituting more than two percent (2%) of the labor force in... origin (see § 4 above) if one race or national origin group in the relevant labor area constitutes more... selection process for that job has an adverse impact on any of the groups for which records are called for...

  3. Sensory biology: echolocation from click to call, mouth to wing.

    PubMed

    Fenton, M Brock; Ratcliffe, John M

    2014-12-15

    Echolocators use echoes of sounds they produce, clicks or calls, to detect objects. Usually, these signals originate from the head. New work reveals that three species of bats use their wings to generate echolocation signals. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  5. Fast algorithm of adaptive Fourier series

    NASA Astrophysics Data System (ADS)

    Gao, You; Ku, Min; Qian, Tao

    2018-05-01

    Adaptive Fourier decomposition (AFD, precisely 1-D AFD or Core-AFD) originated with the goal of positive-frequency representations of signals. It achieved that goal and at the same time offered fast decompositions of signals. Several types of AFD then arose. AFD merged with the greedy algorithm idea and, in particular, motivated the so-called pre-orthogonal greedy algorithm (Pre-OGA), which was proven to be the most efficient greedy algorithm. The cost of the advantages of the AFD-type decompositions is, however, high computational complexity, due to the maximal selections of the dictionary parameters. The present paper offers a formulation of the 1-D AFD algorithm that builds the FFT algorithm into it. Accordingly, the algorithm complexity is reduced from the original $\mathcal{O}(M N^2)$ to $\mathcal{O}(M N\log_2 N)$, where $N$ denotes the number of discretization points on the unit circle and $M$ the number of points in $[0,1)$. This greatly enhances the applicability of AFD. Experiments are carried out to show the high efficiency of the proposed algorithm.
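
    The complexity reduction claimed above rests on the standard FFT trick: a batch of $N$ circular inner products (equivalently, a circular convolution) costs $\mathcal{O}(N^2)$ naively but $\mathcal{O}(N\log_2 N)$ via the convolution theorem. The sketch below demonstrates the generic trick with a pure-Python radix-2 FFT; it is not the paper's AFD implementation.

```python
import cmath

def fft(a):
    """Radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return [complex(a[0])]
    even, odd = fft(a[0::2]), fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k], out[k + n // 2] = even[k] + t, even[k] - t
    return out

def ifft(a):
    """Inverse FFT via the conjugation trick."""
    n = len(a)
    return [x.conjugate() / n for x in fft([x.conjugate() for x in a])]

def circ_conv_naive(s, g):   # O(N^2): one sum per output sample
    n = len(s)
    return [sum(s[m] * g[(k - m) % n] for m in range(n)) for k in range(n)]

def circ_conv_fft(s, g):     # O(N log N): pointwise product in frequency domain
    S, G = fft(s), fft(g)
    return ifft([S[k] * G[k] for k in range(len(s))])

s = [1.0, 2.0, 0.0, -1.0, 3.0, 0.5, -2.0, 1.0]
g = [0.5, -1.0, 2.0, 0.0, 1.0, 0.0, -0.5, 1.5]
naive = circ_conv_naive(s, g)
fast = circ_conv_fft(s, g)
print(max(abs(naive[k] - fast[k]) for k in range(8)))  # ~0 (round-off only)
```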

  6. Efficient Hardware Implementation of the Lightweight Block Encryption Algorithm LEA

    PubMed Central

    Lee, Donggeon; Kim, Dong-Chan; Kwon, Daesung; Kim, Howon

    2014-01-01

    Recently, with the advent of resource-constrained platforms such as smartphones and smart devices, the computing environment is changing. Because our daily life is deeply intertwined with ubiquitous networks, the importance of security is growing. A lightweight encryption algorithm is essential for secure communication between these kinds of resource-constrained devices, and many researchers have been investigating this field. Recently, a lightweight block cipher called LEA was proposed. LEA was originally targeted for efficient implementation on microprocessors, as it is fast when implemented in software and, furthermore, has a small memory footprint. To reflect recent technology, all required calculations use 32-bit wide operations. In addition, the algorithm is composed not of complex S-Box-like structures but of simple Addition, Rotation, and XOR operations. To the best of our knowledge, this paper is the first report on a comprehensive hardware implementation of LEA. We present various hardware structures and their implementation results according to key sizes. Even though LEA was originally targeted at software efficiency, it also shows high efficiency when implemented as hardware. PMID:24406859
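
    The ARX design mentioned above can be sketched generically. The round below uses 32-bit modular addition, rotation, and XOR and is invertible given the round key; the rotation amounts and key mixing are illustrative placeholders, not LEA's actual round function or key schedule.

```python
MASK = 0xFFFFFFFF  # all words are 32 bits, matching LEA's word size

def rol(x, r):
    """Rotate a 32-bit word left by r bits (0 < r < 32)."""
    return ((x << r) | (x >> (32 - r))) & MASK

def ror(x, r):
    """Rotate a 32-bit word right by r bits (0 < r < 32)."""
    return ((x >> r) | (x << (32 - r))) & MASK

def arx_round(a, b, k):
    """One illustrative ARX step: modular Addition, Rotation, XOR."""
    a2 = rol((a + (b ^ k)) & MASK, 9)
    b2 = rol((b + (a2 ^ k)) & MASK, 5)
    return a2, b2

def arx_round_inv(a2, b2, k):
    """Undo arx_round: subtract and rotate back in reverse order."""
    b = (ror(b2, 5) - (a2 ^ k)) & MASK
    a = (ror(a2, 9) - (b ^ k)) & MASK
    return a, b

a0, b0, key = 0x12345678, 0x9ABCDEF0, 0x0F0F0F0F
a1, b1 = arx_round(a0, b0, key)
print(arx_round_inv(a1, b1, key) == (a0, b0))  # True: the round is invertible
```

    Invertibility of each round is what lets a cipher built from such steps be decrypted by running the rounds backwards with the same round keys.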

  7. A Search Strategy of Level-Based Flooding for the Internet of Things

    PubMed Central

    Qiu, Tie; Ding, Yanhong; Xia, Feng; Ma, Honglian

    2012-01-01

    This paper deals with the query problem in the Internet of Things (IoT). Flooding is an important query strategy. However, original flooding is prone to cause heavy network loads. To address this problem, we propose a variant of flooding, called Level-Based Flooding (LBF). With LBF, the whole network is divided into several levels according to the distances (i.e., hops) between the sensor nodes and the sink node. The sink node knows the level information of each node. Query packets are broadcast in the network according to the levels of nodes. Upon receiving a query packet, sensor nodes decide how to process it according to the percentage of neighbors that have processed it. When the target node receives the query packet, it sends its data back to the sink node via random walk. We show by extensive simulations that the performance of LBF in terms of cost and latency is much better than that of original flooding, and LBF can be used in IoT of different scales. PMID:23112594
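
    The level-assignment step of LBF, as described, amounts to a breadth-first search from the sink: each node's level is its hop distance to the sink. A minimal sketch on a hypothetical topology (node 0 as sink; not from the paper):

```python
from collections import deque

def assign_levels(adjacency, sink):
    """BFS from the sink: a node's level is its hop count to the sink,
    matching LBF's division of the network into levels."""
    level = {sink: 0}
    queue = deque([sink])
    while queue:
        node = queue.popleft()
        for nbr in adjacency[node]:
            if nbr not in level:
                level[nbr] = level[node] + 1
                queue.append(nbr)
    return level

# Hypothetical 6-node sensor network given as an adjacency list.
adjacency = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2, 5], 4: [2], 5: [3]}
levels = assign_levels(adjacency, 0)
print(levels)  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 2, 5: 3}
```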

  8. 77 FR 24766 - Call for Proposals for a Micro Support Program on International Conflict Resolution and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... UNITED STATES INSTITUTE OF PEACE Call for Proposals for a Micro Support Program on International Conflict Resolution and Peacebuilding For Immediate Release AGENCY: United States Institute of Peace. ACTION: Notice. SUMMARY: Micro Support Program on International Conflict Resolution and Peacebuilding...

  9. 76 FR 38202 - Proposed Information Collection; Mourning Dove Call Count Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

    ...] Proposed Information Collection; Mourning Dove Call Count Survey AGENCY: Fish and Wildlife Service... we gather accurate data on various characteristics of migratory bird populations. The Mourning Dove... determine the population status of the mourning dove. If this survey were not conducted, we would not be...

  10. Calling on a million minds for community annotation in WikiProteins

    PubMed Central

    Mons, Barend; Ashburner, Michael; Chichester, Christine; van Mulligen, Erik; Weeber, Marc; den Dunnen, Johan; van Ommen, Gert-Jan; Musen, Mark; Cockerill, Matthew; Hermjakob, Henning; Mons, Albert; Packer, Abel; Pacheco, Roberto; Lewis, Suzanna; Berkeley, Alfred; Melton, William; Barris, Nickolas; Wales, Jimmy; Meijssen, Gerard; Moeller, Erik; Roes, Peter Jan; Borner, Katy; Bairoch, Amos

    2008-01-01

    WikiProteins enables community annotation in a Wiki-based system. Extracts of major data sources have been fused into an editable environment that links out to the original sources. Data from community edits create automatic copies of the original data. Semantic technology captures concepts co-occurring in one sentence and thus potential factual statements. In addition, indirect associations between concepts have been calculated. We call on a 'million minds' to annotate a 'million concepts' and to collect facts from the literature with the reward of collaborative knowledge discovery. The system is available for beta testing at . PMID:18507872

  11. SIRTF Studies of Galaxy Formation and Evolution

    NASA Technical Reports Server (NTRS)

    Eisenhardt, Peter

    1999-01-01

    The Space Infrared Telescope Facility (SIRTF) is a cornerstone of NASA's Origins program, and will complete NASA's family of Great Observatories when it is launched in December 2001. SIRTF will provide imaging with point source sensitivities ranging from a few microjanskys at 3.6 microns to several millijanskys at 160 microns, and spectroscopy of sources brighter than a millijansky over the 5 to 40 micron range. Over 75% of observing time during SIRTF's expected 5-year lifetime will be available to general investigators from the international community, with the first call for proposals in July 2000. I review SIRTF's capabilities and plans for the study of galaxy formation and evolution. This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  12. Improvement to the scanning electron microscope image adaptive Canny optimization colorization by pseudo-mapping.

    PubMed

    Lo, T Y; Sim, K S; Tso, C P; Nia, M E

    2014-01-01

    An improvement to the previously proposed adaptive Canny optimization technique for scanning electron microscope image colorization is reported. The additional feature, called the pseudo-mapping technique, is that grayscale markings are temporarily mapped to a set of pre-defined pseudo-colors as a means to instill color information for grayscale colors in the chrominance channels. This allows the presence of grayscale markings to be identified; hence, optimization colorization of grayscale colors is made possible. This additional feature enhances the flexibility of scanning electron microscope image colorization by providing a wider range of possible color enhancement. Furthermore, the nature of this technique also allows users to adjust the luminance intensities of a selected region from the original image to a certain extent. © 2014 Wiley Periodicals, Inc.

  13. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.

  14. For the Boys in the Family: An Investigation Into the Relationship Between "Honor"-Based Violence and Endogamy.

    PubMed

    Payton, Joanne L

    2015-06-05

    Germaine Tillion's classic work of ethnology My Cousin, My Husband related so-called "honor"-based violence (HBV) to the institution of cousin marriage as a response to women's entitlement to inheritance within the Greater Mediterranean Region. This article will scrutinize Tillion's position using original survey data gathered in the Kurdistan region of Iraq, finding that although there is a correlation between HBV and cousin marriage, Tillion's association of this with inheritance laws is inadequate. An alternative position is proposed, in which the relationship between HBV and cousin marriage is situated in coercion around marriage, intergenerational tensions, and in-group exclusivity, exacerbated by the contemporary politics of nationalist neopatrimonialism and an economy based in oil rentierism. © The Author(s) 2015.

  15. A community genetics perspective: opportunities for the coming decade.

    PubMed

    Crutsinger, Gregory M

    2016-04-01

    Community genetics was originally proposed as a novel approach to identifying links between genes and ecosystems, merging ecological and evolutionary perspectives. The dozen years since the birth of community genetics have seen many empirical studies and common garden experiments, as well as the rise of eco-evolutionary dynamics research and a general shift in ecology to incorporate intraspecific variation. So what have we learned from community genetics? Can individual genes affect entire ecosystems? Are there interesting questions left to be answered, or has community genetics run its course? This perspective makes a series of key points about the general patterns that have emerged and calls attention to gaps in our understanding to be addressed in the coming years. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  16. Heuristic Diagrams as a Tool to Teach History of Science

    NASA Astrophysics Data System (ADS)

    Chamizo, José A.

    2012-05-01

    The graphic organizer called here the heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams are intended to help students (or teachers, or researchers) understand their own research, considering that questions and problem-solving are central to scientific activity. The left side, originally related in Gowin's Vee to philosophies, theories, models, laws, or regularities, now corresponds to Toulmin's concepts (language, models as representation techniques, and application procedures). Mexican science teachers without experience in science education research used the heuristic diagram to learn about the history of chemistry, considering on the left side two different historical times: past and present. Teachers' attitudes toward the heuristic diagram were evaluated through a semantic differential scale, and its usefulness was demonstrated.

  17. Individual Distinctiveness in Call Types of Wild Western Female Gorillas

    PubMed Central

    Salmi, Roberta; Hammerschmidt, Kurt; Doran-Sheehy, Diane M.

    2014-01-01

    Individually distinct vocalizations play an important role in animal communication, allowing call recipients to respond differentially based on caller identity. However, which of the many calls in a species' repertoire should have more acoustic variability and be more recognizable is less apparent. One proposed hypothesis is that calls used over long distances should be more distinct because visual cues are not available to identify the caller. An alternative hypothesis proposes that close calls should be more recognizable because of their importance in social interactions. To examine which hypothesis garners more support, the acoustic variation and individual distinctiveness of eight call types of six wild western gorilla (Gorilla gorilla) females were investigated. Acoustic recordings of gorilla calls were collected at the Mondika Research Center (Republic of Congo). Acoustic variability was high in all gorilla calls. Similar high inter-individual variation and potential for identity coding (PIC) was found for all call types. Discriminant function analyses confirmed that all call types were individually distinct (although for call types with lowest sample size - hum, grumble and scream - this result cannot be generalized), suggesting that neither the distance at which communication occurs nor the call social function alone can explain the evolution of identity signaling in western gorilla communication. PMID:25029238

  18. Noise estimation for hyperspectral imagery using spectral unmixing and synthesis

    NASA Astrophysics Data System (ADS)

    Demirkesen, C.; Leloglu, Ugur M.

    2014-10-01

    Most hyperspectral image (HSI) processing algorithms assume a signal-to-noise model in their formulation, which makes them dependent on accurate noise estimation. Many techniques have been proposed to estimate the noise. A comprehensive comparative study of the subject was done by Gao et al. [1]. In a nutshell, most techniques are based on the idea of calculating the standard deviation over assumed-to-be homogeneous regions in the image. Some of these algorithms work on a regular grid parameterized by a window size w, while others make use of image segmentation to obtain homogeneous regions. This study focuses not only on the statistics of the noise but on the estimation of the noise itself. A noise estimation technique motivated by a recent HSI de-noising approach [2] is proposed in this study. The de-noising algorithm is based on estimation of the end-members and their fractional abundances using the non-negative least squares method. The end-members are extracted using the well-known simplex volume optimization technique called NFINDR after manual selection of the number of end-members, and the image is reconstructed using the estimated end-members and abundances. Image de-noising and noise estimation are two sides of the same coin: once an image is de-noised, the noise can be estimated by calculating the difference between the de-noised image and the original noisy image. In this study, the noise is estimated as described above. To assess the accuracy of this method, the methodology in [1] is followed, i.e., synthetic images are created by mixing end-member spectra and noise. Since the best-performing method for noise estimation was the spectral and spatial de-correlation (SSDC) originally proposed in [3], the proposed method is compared to SSDC.
The results of the experiments conducted with synthetic HSIs suggest that the proposed noise estimation strategy outperforms the existing techniques in terms of the mean and standard deviation of the absolute error of the estimated noise. Finally, it is shown that the proposed technique is robust to changes in its single parameter, the number of end-members.
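
    The "two sides of the same coin" identity above is straightforward to demonstrate. The sketch below reconstructs one synthetic pixel from two made-up endmember spectra and takes the residual as the noise estimate; for brevity it uses an unconstrained 2x2 normal-equation solve where the paper uses non-negative least squares, and all spectra and abundances are invented for illustration.

```python
# Two hypothetical endmember spectra over 4 bands, plus a synthetic noisy pixel.
e1 = [0.2, 0.4, 0.6, 0.8]
e2 = [0.9, 0.7, 0.5, 0.3]
true_a = (0.3, 0.7)                      # true fractional abundances
noise = [0.01, -0.02, 0.015, -0.005]     # additive noise to be recovered
pixel = [true_a[0] * e1[i] + true_a[1] * e2[i] + noise[i] for i in range(4)]

# Least-squares abundances via the 2x2 normal equations G a = b.
dot = lambda u, v: sum(x * y for x, y in zip(u, v))
g11, g12, g22 = dot(e1, e1), dot(e1, e2), dot(e2, e2)
b1, b2 = dot(e1, pixel), dot(e2, pixel)
det = g11 * g22 - g12 * g12
a1 = (g22 * b1 - g12 * b2) / det
a2 = (g11 * b2 - g12 * b1) / det

# De-noised pixel = reconstruction; noise estimate = original - reconstruction.
recon = [a1 * e1[i] + a2 * e2[i] for i in range(4)]
residual = [pixel[i] - recon[i] for i in range(4)]
```

    By construction the least-squares residual is orthogonal to both endmember spectra, so it captures only the part of the noise that the mixing model cannot absorb.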

  19. Origines de la nomenclature astrale

    NASA Astrophysics Data System (ADS)

    Duchesne-Guillemin, J.

    Within a survey of the Indo-European, Sumero-Babylonian, Greek, Arabic, and modern origins of the names of the constellations, stars, planets, satellites, asteroids, etc., an explanation is offered of the Omega sign used in Greek horoscopes for the lunar nodes but already appearing on Babylonian reliefs. Its origin is traced back to the Sumerian constellations of the Yoke, later called the Dragon.

  20. Singularity-sensitive gauge-based radar rainfall adjustment methods for urban hydrological applications

    NASA Astrophysics Data System (ADS)

    Wang, L.-P.; Ochoa-Rodríguez, S.; Onof, C.; Willems, P.

    2015-09-01

    Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited, as they were mostly developed upon Gaussian approximations and therefore tend to smooth off so-called "singularities" (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique, and the performance of the resulting singularity-sensitive method is compared with that of the original (non-singularity-sensitive) Bayesian technique and the commonly used mean field bias adjustment. This test is conducted using as case study four storm events observed in the Portobello catchment (53 km2) (Edinburgh, UK) during 2011, for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates.
Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban drainage system's dynamics, particularly of peak runoff flows.
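As a point of reference for the techniques compared above, the mean field bias (MFB) adjustment can be sketched in a few lines: the radar field is rescaled by a single gauge-to-radar ratio. This is a generic illustration of one common MFB formulation, not the authors' implementation; all arrays below are hypothetical.

```python
import numpy as np

def mean_field_bias_adjust(radar, gauge_obs, radar_at_gauges):
    """Rescale a radar rainfall field by a single multiplicative bias,
    the ratio of mean gauge rainfall to mean radar rainfall at the
    gauge locations (one common MFB formulation; variants exist)."""
    bias = np.sum(gauge_obs) / np.sum(radar_at_gauges)
    return bias * radar

# Toy case: the radar underestimates a uniform 2 mm/h field by half.
radar = np.full((4, 4), 1.0)           # radar estimates (mm/h)
gauges = np.array([2.0, 2.0, 2.0])     # gauge observations (mm/h)
radar_pts = np.array([1.0, 1.0, 1.0])  # radar values at the gauge pixels
adjusted = mean_field_bias_adjust(radar, gauges, radar_pts)
```

Because the whole field is scaled by one factor, MFB corrects the overall bias but, unlike the singularity-sensitive method, does nothing to local non-Gaussian structure.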

  1. Education, Cyberspace, and Change [Serial Article Online].

    ERIC Educational Resources Information Center

    Lemke, J. L.

    1993-01-01

    This article was originally written on the internet in Australia to provide a starting point for discussions of new perspectives on education made possible by advanced technologies. Ecosocial changes in the practices and institutions called education are discussed in the context of changes in the practices and institutions called information…

  2. Eyetracking Methodology in SCMC: A Tool for Empowering Learning and Teaching

    ERIC Educational Resources Information Center

    Stickler, Ursula; Shi, Lijing

    2017-01-01

    Computer-assisted language learning, or CALL, is an interdisciplinary area of research, positioned between science and social science, computing and education, linguistics and applied linguistics. This paper argues that by appropriating methods originating in some areas of CALL-related research, for example human-computer interaction (HCI) or…

  3. Indigenous Labor and Indigenous History

    ERIC Educational Resources Information Center

    McCallum, Mary Jane Logan

    2009-01-01

    This article was originally a response to a call from the Western History Association for papers by Indigenous academics. The call aimed to showcase Indigenous scholarship on certain terms: that it delves into some of the opportunities, challenges, and obstacles involved with "working from home" or doing research that bridges a space…

  4. 47 CFR 12.4 - Reliability of covered 911 service providers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... electronic records, as long as they reflect whether critical 911 circuits are physically diverse. (7... calls and associated number or location information to the appropriate PSAP. (5) Critical 911 circuits... calls to the PSAP(s). Critical 911 circuits also include ALI and ANI facilities that originate at the...

  5. Original non-stationary eddy current imaging process for the evaluation of defects in metallic structures

    NASA Astrophysics Data System (ADS)

    Placko, Dominique; Bore, Thierry; Rivollet, Alain; Joubert, Pierre-Yves

    2015-10-01

This paper deals with the problem of imaging defects in metallic structures through eddy current (EC) inspections, and proposes an original process for a possible tomographic crack evaluation. This process is based on a semi-analytical modeling approach, called the "distributed point source method" (DPSM), which is used to describe and equate the interactions between the implemented EC probes and the structure under test. Several steps are successively described, illustrating the feasibility of this new imaging process dedicated to the quantitative evaluation of defects. The imaging process first creates a 3D grid by meshing the volume potentially inspected by the sensor, yielding a given number of elemental volumes (called voxels). Secondly, the DPSM modeling is used to compute an image for every configuration in which exactly one of the voxels has a conductivity different from all the others. The assumption is that a real defect can be represented by a superposition of elemental voxels; the resulting accuracy naturally depends on the density of the spatial sampling. Moreover, the excitation device of the EC imager can be oriented in several directions and driven by an excitation current of variable frequency, so the simulation is performed for several frequencies and directions of the eddy currents induced in the structure, which increases the signal entropy. All these results are merged into a so-called "observation matrix" containing all the probe/structure interaction configurations. This matrix is then used in an inversion scheme to evaluate the defect location and geometry. The modeled EC data provided by the DPSM are compared to the experimental images provided by an eddy current imager (ECI), implemented on aluminum plates containing buried defects. To validate the proposed inversion process, we feed it with computed images of various acquisition configurations. Noise was added to the images so that they are more representative of actual EC data. In the case of simple notch-type defects, for which the relative conductivity may only take two extreme values (1 or 0), a threshold was introduced on the inverted images in a post-processing step, taking advantage of a priori knowledge of the statistical properties of the restored images. This threshold enhanced the image contrast and helped eliminate both the residual noise and the pixels showing non-realistic values.

  6. 1. Photocopy of a photograph of the Barn. Original is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Photocopy of a photograph of the Barn. Original is on file with the Payette National Forest, Supervisor's Office, McCall, Idaho. BARN, CA. 1935, FACING NORTH. - Hornet Ranger Station, Four Horse Barn, Forest Service Road No. 50002, Council, Adams County, ID

  7. The Origins of the Field Concept in Physics

    NASA Astrophysics Data System (ADS)

    McMullin, Ernan

The term "field" made its first appearance in physics as a technical term in the mid-nineteenth century. But the notion of what later came to be called a field had been a long time in gestation. Early discussions of magnetism and of the cause of the ocean tides had long ago suggested the idea of a "zone of influence" surrounding certain bodies. Johannes Kepler's mathematical rendering of the orbital motion of Mars encouraged him to formulate what he called "a true theory of gravity" involving the notion of attraction. Isaac Newton went on to construct an eminently effective dynamics, with attraction as its primary example of force. Was his a field theory? Historians of science disagree. Much depends on whether a theory consistent with the notion of action at a distance ought to qualify as a "field" theory. Roger Boscovich and Immanuel Kant later took the Newtonian concept of attraction in new directions. It was left to Michael Faraday to propose the "physical existence" of lines of force and to James Clerk Maxwell to add as criterion the presence of energy as the ontological basis for a full-blown "field theory" of electromagnetic phenomena.

  8. The FLARE mission: deep and wide-field 1-5um imaging and spectroscopy for the early universe: a proposal for M5 cosmic vision call

    NASA Astrophysics Data System (ADS)

    Burgarella, D.; Levacher, P.; Vives, S.; Dohlen, K.; Pascal, S.

    2016-07-01

FLARE (First Light And Reionization Explorer) is a space mission that will be submitted to ESA (M5 call). Its primary goal (~80% of lifetime) is to identify and study the universe before the end of reionization at z > 6. A secondary objective (~20% of lifetime) is to survey star formation in the Milky Way. FLARE's strategy optimizes the science return: imaging and spectroscopic integral-field observations will be carried out simultaneously on two parallel focal planes and over very wide instantaneous fields of view. FLARE will help address two of ESA's Cosmic Vision themes: a) "How did the universe originate and what is it made of?" and b) "What are the conditions for planet formation and the emergence of life?", and more specifically, "From gas and dust to stars and planets". FLARE will give the ESA community a leading position in statistical studies of the early universe, complementing JWST's deep but pinhole-like surveys. Moreover, the instrumental development of wide-field imaging and wide-field integral-field spectroscopy in space will be a major breakthrough, following their availability on ground-based telescopes.

  9. Regularized spherical polar fourier diffusion MRI with optimal dictionary learning.

    PubMed

    Cheng, Jian; Jiang, Tianzi; Deriche, Rachid; Shen, Dinggang; Yap, Pew-Thian

    2013-01-01

Compressed Sensing (CS) takes advantage of signal sparsity or compressibility and allows superb signal reconstruction from relatively few measurements. CS theory requires a suitable dictionary for sparse representation of the signal. In diffusion MRI (dMRI), CS methods proposed for reconstruction of the diffusion-weighted signal and the Ensemble Average Propagator (EAP) utilize two kinds of Dictionary Learning (DL) methods: 1) Discrete Representation DL (DR-DL), and 2) Continuous Representation DL (CR-DL). DR-DL is susceptible to numerical inaccuracy owing to interpolation and regridding errors in a discretized q-space. In this paper, we propose a novel CR-DL approach, called Dictionary Learning - Spherical Polar Fourier Imaging (DL-SPFI), for effective compressed-sensing reconstruction of the q-space diffusion-weighted signal and the EAP. In DL-SPFI, a dictionary that sparsifies the signal is learned from the space of continuous Gaussian diffusion signals. The learned dictionary is then adaptively applied to different voxels using a weighted LASSO framework for robust signal reconstruction. Compared with the state-of-the-art CR-DL and DR-DL methods proposed by Merlet et al. and Bilgic et al., respectively, our work offers the following advantages. First, the learned dictionary is proved to be optimal for Gaussian diffusion signals. Second, to our knowledge, this is the first work to learn a voxel-adaptive dictionary. The importance of the adaptive dictionary in EAP reconstruction is demonstrated theoretically and empirically. Third, optimization in DL-SPFI is performed only in the small subspace spanned by the SPF coefficients, as opposed to the q-space approach utilized by Merlet et al. We experimentally evaluated DL-SPFI with respect to L1-norm regularized SPFI (L1-SPFI), which uses the original SPF basis, and the DR-DL method proposed by Bilgic et al. The experimental results on synthetic and real data indicate that the learned dictionary produces sparser coefficients than the original SPF basis and results in significantly lower reconstruction error than Bilgic et al.'s method.
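The LASSO reconstruction step described above amounts to L1-regularized sparse coding against a learned dictionary. As a generic illustration (not the DL-SPFI code, and using plain ISTA rather than the authors' weighted, voxel-adaptive formulation), sparse coefficients can be recovered from a synthetic dictionary and signal as follows:

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_lasso(D, s, lam=0.01, n_iter=500):
    """Minimize 0.5*||s - D c||^2 + lam*||c||_1 by iterative
    soft-thresholding (ISTA). D: dictionary (m x k); s: signal (m,)."""
    L = np.linalg.norm(D, 2) ** 2       # Lipschitz constant of the gradient
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        c = soft_threshold(c + D.T @ (s - D @ c) / L, lam / L)
    return c

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary atoms
c_true = np.zeros(50)
c_true[[3, 17]] = [1.5, -2.0]           # 2-sparse ground-truth code
s = D @ c_true
c_hat = ista_lasso(D, s)
```

With an incoherent random dictionary and a sufficiently sparse signal, the recovered code concentrates on the true support, which is the property the DL step is designed to strengthen.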

  10. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, an alternative implementation to the algorithmic definition of the so-called "sifting process" used in Huang's original EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method compared to the original EMD algorithmic version was illustrated in a recent paper. Several 2-D extensions of the EMD method have recently been proposed; despite some effort, they perform poorly and are very time-consuming. In this paper, an extension of the PDE-based approach to the 2-D case is therefore described in detail. The approach has been applied to both signal and image decomposition, and the obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, with results provided for image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
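For reference, the algorithmic 1-D sifting step that the PDE-based approach replaces can be sketched as follows. This is a deliberately simplified illustration: Huang's original method interpolates the envelopes with cubic splines, whereas linear interpolation is used here to keep the sketch self-contained.

```python
import numpy as np

def local_extrema(x):
    """Indices of local maxima and minima of a 1-D signal (strict sign
    changes of the discrete slope; boundary points may be included)."""
    d = np.diff(x)
    maxima = np.where((np.hstack([d, -1.0]) < 0) & (np.hstack([1.0, d]) > 0))[0]
    minima = np.where((np.hstack([d, 1.0]) > 0) & (np.hstack([-1.0, d]) < 0))[0]
    return maxima, minima

def sift_once(x):
    """One sifting iteration: subtract the mean of the upper and lower
    envelopes (linear interpolation stands in for cubic splines)."""
    t = np.arange(len(x))
    maxima, minima = local_extrema(x)
    upper = np.interp(t, maxima, x[maxima])   # upper envelope
    lower = np.interp(t, minima, x[minima])   # lower envelope
    return x - (upper + lower) / 2.0

t = np.linspace(0, 1, 500)
x = np.sin(40 * np.pi * t) + t     # fast oscillation riding a slow trend
h = sift_once(x)
```

Even one iteration largely strips the slow trend from the fast mode; the PDE formulation reaches an analogous mean envelope through nonlinear diffusion instead of explicit envelope interpolation.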

  11. 1. Photocopy of photograph of Blacksmith Shop. Original on file ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Photocopy of photograph of Blacksmith Shop. Original on file with the Payette National Forest, Supervisor's Office, McCall, Idaho. BLACKSMITH SHOP CA. 1935, FACING NORTH. BARN IS IN BACKGROUND. - Hornet Ranger Station, Blacksmith Shop, Forest Service Road No. 50002, Council, Adams County, ID

  12. Anticlockwise or Clockwise? A Dynamic Perception-Action-Laterality Model for Directionality Bias in Visuospatial Functioning

    PubMed Central

    Karim, A.K.M. Rezaul; Proulx, Michael J.; Likova, Lora T.

    2016-01-01

Reviewing the relevant literature in visual psychophysics and visual neuroscience we propose a three-stage model of directionality bias in visuospatial functioning. We call this model the ‘Perception-Action-Laterality’ (PAL) hypothesis. We analyzed the research findings for a wide range of visuospatial tasks, showing that there are two major directionality trends: clockwise versus anticlockwise. It appears these preferences are combinatorial, such that a majority of people fall in the first category, demonstrating a preference for stimuli/objects arranged from left-to-right rather than from right-to-left, while people in the second category show the opposite trend. These perceptual biases can guide sensorimotor integration and action, creating two corresponding turner groups in the population. In support of PAL, we propose another model explaining the origins of the biases: how neurogenetic factors and cultural factors interact in a biased-competition framework to determine the direction and extent of the biases. This dynamic model can explain not only the two major categories of biases, but also the unbiased, unreliably biased, or mildly biased cases in visuospatial functioning. PMID:27350096

  13. Use of conserved key amino acid positions to morph protein folds.

    PubMed

    Reddy, Boojala V B; Li, Wilfred W; Bourne, Philip E

    2002-07-15

By using three-dimensional (3D) structure alignments and a previously published method to determine Conserved Key Amino Acid Positions (CKAAPs), we propose a theoretical method to design mutations that can be used to morph protein folds. The original Paracelsus challenge, met by several groups, called for the engineering of a stable but different structure by modifying less than 50% of the amino acid residues. We have used the sequences from the Protein Data Bank (PDB) identifiers 1ROP and 2CRO, which were previously used in the Paracelsus challenge by those groups, and suggest mutations at CKAAPs to morph the protein fold. The total number of suggested mutations is less than 40% of the starting sequence, theoretically improving on the challenge results. From secondary structure prediction experiments on the proposed mutant sequences, we observe that each of the suggested mutant protein sequences likely folds to a different, non-native, potentially stable target structure. These results are an early indicator that analyses using structure alignments leading to the CKAAPs of a given structure are of value in protein engineering experiments. Copyright 2002 Wiley Periodicals, Inc.

  14. A New Look at the Eclipse Timing Variation Diagram Analysis of Selected 3-body W UMa Systems

    NASA Astrophysics Data System (ADS)

    Christopoulou, P.-E.; Papageorgiou, A.

    2015-07-01

    The light travel effect produced by the presence of tertiary components can reveal much about the origin and evolution of over-contact binaries. Monitoring of W UMa systems over the last decade and/or the use of publicly available photometric surveys (NSVS, ASAS, etc.) has uncovered or suggested the presence of many unseen companions, which calls for an in-depth investigation of the parameters derived from cyclic period variations in order to confirm or reject the assumption of hidden companion(s). Progress in the analysis of eclipse timing variations is summarized here both from the empirical and the theoretical points of view, and a more extensive investigation of the proposed orbital parameters of third bodies is proposed. The code we have developed for this, implemented in Python, is set up to handle heuristic scanning with parameter perturbation in parameter space, and to establish realistic uncertainties from the least squares fitting. A computational example is given for TZ Boo, a W UMa system with a spectroscopically detected third component. Future options to be implemented include MCMC and bootstrapping.
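In the simplest circular-orbit approximation, the light travel time effect adds a sinusoidal term to the eclipse timing (O-C) residuals, which can be fitted by linear least squares once a trial third-body period is fixed. The following sketch uses synthetic data and is not the authors' Python code; it only illustrates this baseline fit:

```python
import numpy as np

def fit_ltte_circular(epochs, oc, period_3):
    """Fit O-C = a*sin(w E) + b*cos(w E) + c for a circular third-body
    orbit of known period (in cycles) via linear least squares."""
    w = 2 * np.pi / period_3
    A = np.column_stack([np.sin(w * epochs), np.cos(w * epochs),
                         np.ones_like(epochs)])
    coef, *_ = np.linalg.lstsq(A, oc, rcond=None)
    amp = np.hypot(coef[0], coef[1])    # semi-amplitude of the LTTE (days)
    return coef, amp

rng = np.random.default_rng(1)
E = np.arange(0.0, 2000.0, 10.0)        # eclipse cycle numbers
# Hypothetical O-C curve: 0.004 d semi-amplitude, 800-cycle third-body period.
oc = 0.004 * np.sin(2 * np.pi * E / 800.0) + rng.normal(0, 2e-4, E.size)
coef, amp = fit_ltte_circular(E, oc, 800.0)
```

An eccentric orbit makes the model nonlinear in its parameters, which is where heuristic scanning with parameter perturbation, as described above, becomes necessary.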

  15. Metropolitan Boston air quality control region: transportation control plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1975-02-28

The EPA is considering a number of amendments to the transportation control plan which it promulgated Nov. 8, 1973 for the Metropolitan Boston Intrastate Air Quality Control Region. Included in the proposed amendments is a revised regulation for reduction of commuter travel which would include students and employees. This program would be implemented in conjunction with the carpool matching program being developed by the Commonwealth of Massachusetts and the employee pass program offered by the Massachusetts Bay Transportation Authority. A new provision for limiting overall hydrocarbon emissions from major users of organic compounds is included. Also published are a proposal for encouraging bicycle use, new proposals for controlling carbon monoxide levels outside the Boston core area, and a new procedure for periodic monitoring and updating of the plan. Other features of the original plan are retained with modifications in areas including the ceiling on the level of commercial parking spaces in the so-called "freeze" area, limitations of on-street commuter parking, a semiannual inspection and maintenance program, a retrofit program, and incentives for carpool and transit use.

  16. Performance evaluation of distributed wavelength assignment in WDM optical networks

    NASA Astrophysics Data System (ADS)

    Hashiguchi, Tomohiro; Wang, Xi; Morikawa, Hiroyuki; Aoyama, Tomonori

    2004-04-01

In WDM wavelength-routed networks, a call setup procedure is required prior to a data transfer to reserve a wavelength path between the source-destination node pairs. A distributed approach to connection setup can achieve very high speed, while improving the reliability and reducing the implementation cost of the networks. Along with these advantages, however, the distributed scheme poses several major challenges in how wavelength management and allocation can be carried out efficiently. In this paper, we apply a distributed wavelength assignment algorithm named priority-based wavelength assignment (PWA), originally proposed for use in burst-switched optical networks, to the problem of reserving wavelengths in the path reservation protocols of distributed-control optical networks. Instead of assigning wavelengths randomly, this approach lets each node select the "safest" wavelengths based on the wavelength utilization history, so that unnecessary future contention is prevented. The simulation results presented in this paper show that the proposed protocol can enhance the performance of the system without introducing any apparent drawbacks.
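The core idea of PWA, that each node prefers the historically "safest" free wavelength, can be caricatured in a few lines. This toy version simply counts past contentions per wavelength; the actual PWA priority computation in the cited work is more elaborate.

```python
class PWANode:
    """Toy priority-based wavelength assignment: a node remembers how
    often each wavelength led to reservation contention and prefers
    the historically safest free wavelength (a simplification of PWA)."""

    def __init__(self, n_wavelengths):
        self.contentions = [0] * n_wavelengths

    def pick(self, free):
        # Among currently free wavelengths, choose the one with the
        # fewest recorded contentions (ties broken by lowest index).
        return min(free, key=lambda w: (self.contentions[w], w))

    def record_contention(self, w):
        self.contentions[w] += 1

node = PWANode(4)
node.record_contention(0)   # wavelength 0 clashed twice in the past
node.record_contention(0)
node.record_contention(1)   # wavelength 1 clashed once
choice = node.pick(free=[0, 1, 3])   # -> 3, the only contention-free option
```

Because each node steers around wavelengths that caused contention before, concurrent reservations from different nodes tend to spread across the wavelength set instead of colliding.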

  17. Characterizing behavioural ‘characters’: an evolutionary framework

    PubMed Central

    Araya-Ajoy, Yimen G.; Dingemanse, Niels J.

    2014-01-01

    Biologists often study phenotypic evolution assuming that phenotypes consist of a set of quasi-independent units that have been shaped by selection to accomplish a particular function. In the evolutionary literature, such quasi-independent functional units are called ‘evolutionary characters’, and a framework based on evolutionary principles has been developed to characterize them. This framework mainly focuses on ‘fixed’ characters, i.e. those that vary exclusively between individuals. In this paper, we introduce multi-level variation and thereby expand the framework to labile characters, focusing on behaviour as a worked example. We first propose a concept of ‘behavioural characters’ based on the original evolutionary character concept. We then detail how integration of variation between individuals (cf. ‘personality’) and within individuals (cf. ‘individual plasticity’) into the framework gives rise to a whole suite of novel testable predictions about the evolutionary character concept. We further propose a corresponding statistical methodology to test whether observed behaviours should be considered expressions of a hypothesized evolutionary character. We illustrate the application of our framework by characterizing the behavioural character ‘aggressiveness’ in wild great tits, Parus major. PMID:24335984

  18. Rapporteur's report

    NASA Astrophysics Data System (ADS)

    Szollosy, Michael

    2017-07-01

This report summarises the papers, presentations and discussion of the Artificial Intelligence and Simulation of Behaviour special workshop re-evaluating the Engineering and Physical Sciences and Arts and Humanities Research Councils' (EPSRC and AHRC) 2010 Principles of Robotics. We describe the workshop's call for papers, summarise the papers and discussions that took place, and report the voting that led to our workshop adopting a series of proposals for amending the original Principles. The workshop discussed and voted on 14 specific "amendments, additions, or reflections" on the Principles. Of these, 9 out of 14 were adopted by majority vote: 6 received strong support (67% or more in favour), 1 received majority support (53%), and several of the remainder received mixed support of between 33% and 47%. An important and unanimous conclusion of the workshop was that "the Principles should be amended through a thorough and inclusive process". Adopted proposals also highlighted the need to "focus on the protection of humanity" from possible future risks created by AI and robotics, and to take into account how society is changing and adapting to technological advances.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonachea, D.; Dickens, P.; Thakur, R.

There is a growing interest in using Java as the language for developing high-performance computing applications. To be successful in the high-performance computing domain, however, Java must not only be able to provide high computational performance, but also high-performance I/O. In this paper, we first examine several approaches that attempt to provide high-performance I/O in Java - many of which are not obvious at first glance - and evaluate their performance on two parallel machines, the IBM SP and the SGI Origin2000. We then propose extensions to the Java I/O library that address the deficiencies in the Java I/O API and improve performance dramatically. The extensions add bulk (array) I/O operations to Java, thereby removing much of the overhead currently associated with array I/O in Java. We have implemented the extensions in two ways: in a standard JVM using the Java Native Interface (JNI) and in a high-performance parallel dialect of Java called Titanium. We describe the two implementations and present performance results that demonstrate the benefits of the proposed extensions.

  20. Dynamic Call Admission Control Scheme Based on Predictive User Mobility Behavior for Cellular Networks

    NASA Astrophysics Data System (ADS)

    Intarasothonchun, Silada; Thipchaksurat, Sakchai; Varakulsiripunth, Ruttikorn; Onozato, Yoshikuni

In this paper, we propose a modified scheme of MSODB and PMS, called Predictive User Mobility Behavior (PUMB), to improve the performance of resource reservation and call admission control in cellular networks. In this scheme, bandwidth is allocated more efficiently to neighboring cells on the basis of key mobility parameters in order to provide QoS guarantees for transferring traffic. Visit probabilities are used to form a cluster of cells, the shadow cluster, that a mobile unit is likely to visit. When a mobile unit changes direction and migrates to a cell that does not belong to its shadow cluster, we can support it by making efficient use of predicted nonconforming calls. At the same time, to ensure the continuity of ongoing calls with better utilization of resources, bandwidth is borrowed from predicted nonconforming calls and existing adaptive calls without affecting the minimum QoS guarantees. The performance of PUMB is demonstrated by simulation results in terms of new call blocking probability, handoff call dropping probability, bandwidth utilization, call success probability, and overhead message transmission when the arrival rate and speed of mobile units are varied. Our results show that PUMB outperforms MSODB and PMS under different traffic conditions.
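Schemes such as PUMB build on the classic guard-channel admission rule, under which a few channels are reserved so that handoff calls are favored over new calls. A minimal sketch of that baseline rule follows (hypothetical capacities; this is not the PUMB algorithm itself):

```python
def admit(call_type, busy, capacity, guard):
    """Guard-channel CAC: handoff calls may use all `capacity` channels,
    while new calls may only use `capacity - guard`, so that handoffs
    are dropped less often than new calls are blocked."""
    if call_type == "handoff":
        return busy < capacity
    return busy < capacity - guard

# With 10 channels and 2 guard channels, at 8 busy channels a new call
# is blocked while a handoff call is still admitted.
assert admit("new", busy=8, capacity=10, guard=2) is False
assert admit("handoff", busy=8, capacity=10, guard=2) is True
```

Predictive schemes like PUMB refine this idea by sizing the reservation per cell from mobility predictions instead of using a fixed guard count.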

  1. Simultaneous detection and classification of breast masses in digital mammograms via a deep learning YOLO-based CAD system.

    PubMed

    Al-Masni, Mohammed A; Al-Antari, Mugahed A; Park, Jeong-Min; Gi, Geon; Kim, Tae-Yeon; Rivera, Patricio; Valarezo, Edwin; Choi, Mun-Taek; Han, Seung-Moo; Kim, Tae-Seong

    2018-04-01

Automatic detection and classification of masses in mammograms remain a major challenge, and play a crucial role in assisting radiologists towards accurate diagnosis. In this paper, we propose a novel Computer-Aided Diagnosis (CAD) system based on one of the regional deep learning techniques: a ROI-based Convolutional Neural Network (CNN) called You Only Look Once (YOLO). Although most previous studies deal only with the classification of masses, our proposed YOLO-based CAD system handles detection and classification simultaneously in one framework. The proposed CAD system contains four main stages: preprocessing of mammograms, feature extraction utilizing deep convolutional networks, mass detection with confidence scores, and finally mass classification using Fully Connected Neural Networks (FC-NNs). In this study, we utilized 600 original mammograms from the Digital Database for Screening Mammography (DDSM) and 2,400 augmented mammograms, with the information of the masses and their types, in training and testing our CAD system. The trained YOLO-based CAD system detects the masses and then classifies their types as benign or malignant. Our results with five-fold cross-validation tests show that the proposed CAD system detects the mass location with an overall accuracy of 99.7%. The system also distinguishes between benign and malignant lesions with an overall accuracy of 97%. Our proposed system even works on some challenging breast cancer cases where the masses lie over the pectoral muscles or dense regions. Copyright © 2018 Elsevier B.V. All rights reserved.
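Detection accuracy in CAD systems of this kind is conventionally scored by the intersection-over-union (IoU) between predicted and ground-truth boxes. A minimal IoU helper (a standard metric, not code from the paper):

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)   # intersection corners
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter)

# Two 2x2 boxes offset by one unit overlap in a 1x1 region: IoU = 1/7.
score = iou((0, 0, 2, 2), (1, 1, 3, 3))
```

A predicted mass is typically counted as correctly localized when its IoU with the annotated region exceeds a chosen threshold (often 0.5).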

  2. Evaluating the Generality and Limits of Blind Return-Oriented Programming Attacks

    DTIC Science & Technology

    2015-12-01

We consider a recently proposed information disclosure vulnerability called blind return-oriented programming (BROP). Under certain conditions, this…

  3. Efficient Algorithms for Handling Nondeterministic Automata

    NASA Astrophysics Data System (ADS)

    Vojnar, Tomáš

Finite (word, tree, or omega) automata play an important role in different areas of computer science, including, for instance, formal verification. Often, deterministic automata are used, for which traditional algorithms for important operations such as minimisation and inclusion checking are available. However, the use of deterministic automata implies a need to determinise the nondeterministic automata that often arise during various computations, even when the computations start with deterministic automata. Unfortunately, determinisation is a very expensive step, since deterministic automata may be exponentially bigger than the original nondeterministic automata. That is why it appears advantageous to avoid determinisation and work directly with nondeterministic automata. This, however, brings a need to implement operations traditionally done on deterministic automata on nondeterministic automata instead; in particular, this is the case for inclusion checking and minimisation (or rather, reduction of the size of automata). In the talk, we review several recently proposed techniques for inclusion checking on nondeterministic finite word and tree automata as well as Büchi automata. These techniques are based on the use of so-called antichains, possibly combined with suitable simulation relations (and, in the case of Büchi automata, the so-called Ramsey-based or rank-based approaches). Further, we discuss techniques for reducing the size of nondeterministic word and tree automata using quotienting based on the recently proposed notion of mediated equivalences. The talk is based on several joint works with Parosh Aziz Abdulla, Ahmed Bouajjani, Yu-Fang Chen, Peter Habermehl, Lisa Kaati, Richard Mayr, Tayssir Touili, Lorenzo Clemente, Lukáš Holík, and Chih-Duo Hong.
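The antichain technique mentioned in the talk prunes the explored macrostates to the subset-minimal ones: a state set need not be explored if a subset of it has already been seen, because the subset accepts at least as little. A minimal sketch of that pruning step (illustrative only; real implementations add simulation-based subsumption):

```python
def antichain_insert(antichain, new_set):
    """Insert `new_set` into an antichain of subset-minimal sets, the
    core pruning step of antichain-based inclusion checking: the set is
    discarded if a subset of it is already present, and inserting it
    evicts every superset. Returns (antichain, inserted?)."""
    for s in antichain:
        if s <= new_set:                  # a smaller witness is already kept
            return antichain, False
    pruned = [s for s in antichain if not (new_set <= s)]
    pruned.append(new_set)
    return pruned, True

ac = []
ac, added1 = antichain_insert(ac, {1, 2, 3})
ac, added2 = antichain_insert(ac, {1, 2})     # evicts its superset {1, 2, 3}
ac, added3 = antichain_insert(ac, {1, 2, 4})  # subsumed by {1, 2}: discarded
```

In an inclusion check, each kept set pairs a state of one automaton with a macrostate of the other; the antichain keeps the search frontier exponentially smaller than full determinisation in many practical cases.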

  4. 2. Photocopy of photograph of Hornet Ranger Station. Original on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopy of photograph of Hornet Ranger Station. Original on file with the Payette National Forest, Supervisor's Office, McCall, Idaho. VIEW OF RESIDENTIAL AREA, CA. 1936. RANGER DWELLING WITH WOODSHED CELLAR AND GARAGE IN BACKGROUND. - Hornet Ranger Station, Forest Service Road No. 50002, Council, Adams County, ID

  5. 47 CFR 74.783 - Station identification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... originating local programming as defined by § 74.701(h) operating over 0.001 kw peak visual power (0.002 kw... visual presentation or a clearly understandable aural presentation of the translator station's call... identification procedures given in § 73.1201 when locally originating programming, as defined by § 74.701(h). The...

  6. Contaminant source identification using semi-supervised machine learning

    NASA Astrophysics Data System (ADS)

    Vesselinov, Velimir V.; Alexandrov, Boian S.; O'Malley, Daniel

    2018-05-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).
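The NMF decomposition at the heart of NMFk factorizes the observation matrix into non-negative mixing ratios and source signatures. The following sketch applies classic Lee-Seung multiplicative updates to synthetic mixtures; NMFk's semi-supervised clustering and its estimation of the number of sources are not shown, and all data here are hypothetical.

```python
import numpy as np

def nmf(X, k, n_iter=2000, seed=0):
    """Factor X ~ W @ H with W, H non-negative, via Lee-Seung
    multiplicative updates. In the BSS reading, rows of H are source
    signatures and rows of W are per-sample mixing ratios."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Two hypothetical source signatures mixed in known ratios at 4 wells.
H_true = np.array([[1.0, 0.0, 0.5],
                   [0.0, 1.0, 0.5]])
W_true = np.array([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8], [0.7, 0.3]])
X = W_true @ H_true
W, H = nmf(X, k=2)
err = np.linalg.norm(X - W @ H)
```

Because NMF is only identifiable up to scaling and permutation, NMFk runs many such factorizations for each candidate k and clusters the resulting signatures to pick a robust number of sources.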

  7. Contaminant source identification using semi-supervised machine learning

    DOE PAGES

    Vesselinov, Velimir Valentinov; Alexandrov, Boian S.; O’Malley, Dan

    2017-11-08

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models, which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. Finally, the NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).

  8. Contaminant source identification using semi-supervised machine learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir Valentinov; Alexandrov, Boian S.; O’Malley, Dan

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models, which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. Finally, the NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).

  9. Authenticity preservation with histogram-based reversible data hiding and quadtree concepts.

    PubMed

    Huang, Hsiang-Cheh; Fang, Wai-Chi

    2011-01-01

    With the widespread use of identification systems, establishing authenticity with sensors has become an important research issue. Among the schemes for making authenticity verification based on information security possible, reversible data hiding has attracted much attention during the past few years. With its characteristics of reversibility, the scheme is required to fulfill the goals from two aspects. On the one hand, at the encoder, the secret information needs to be embedded into the original image by some algorithms, such that the output image will resemble the input one as much as possible. On the other hand, at the decoder, both the secret information and the original image must be correctly extracted and recovered, and they should be identical to their embedding counterparts. Under the requirement of reversibility, for evaluating the performance of the data hiding algorithm, the output image quality, named imperceptibility, and the number of bits for embedding, called capacity, are the two key factors to assess the effectiveness of the algorithm. In addition, the size of the side information needed to make decoding possible should also be evaluated. Here we consider using the characteristics of original images for developing our method with better performance. In this paper, we propose an algorithm that has the ability to provide more capacity than conventional algorithms, with similar output image quality after embedding, and comparable side information produced. Simulation results demonstrate the applicability and better performance of our algorithm.
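The histogram-based idea underlying such schemes can be illustrated with the classic histogram-shifting baseline (a sketch of the conventional approach such papers improve on, not the authors' algorithm; the helper names are ours, and the sketch assumes an empty histogram bin exists above the peak, where real schemes record overflow locations as side information):

```python
import numpy as np

def hs_embed(img, bits):
    # Peak bin p of the histogram carries the payload; z is an empty bin
    # above p that absorbs the shift (assumed to exist in this sketch).
    hist = np.bincount(img.ravel(), minlength=256)
    p = int(hist.argmax())
    z = p + 1 + int(np.flatnonzero(hist[p + 1:] == 0)[0])
    assert len(bits) <= hist[p], "payload exceeds peak-bin capacity"
    flat = img.astype(np.int16).ravel().copy()
    carriers = np.flatnonzero(flat == p)           # pixels that will hold bits
    flat[(flat > p) & (flat < z)] += 1             # shift to free bin p + 1
    flat[carriers[:len(bits)]] += np.asarray(bits, dtype=np.int16)
    return flat.reshape(img.shape).astype(np.uint8), (p, z)

def hs_extract(marked, side):
    p, z = side
    flat = marked.astype(np.int16).ravel().copy()
    # Carriers read back in raster order: value p means bit 0, p + 1 means bit 1.
    bits = (flat[(flat == p) | (flat == p + 1)] == p + 1).astype(int)
    flat[(flat > p) & (flat <= z)] -= 1            # undo shift; carriers return to p
    return bits, flat.reshape(marked.shape).astype(np.uint8)

img = np.random.default_rng(7).integers(0, 50, (32, 32), dtype=np.uint8)
marked, side = hs_embed(img, [1, 0, 1, 1, 0])
bits, recovered = hs_extract(marked, side)   # recovered is bit-exact to img
```

The peak-bin count is the capacity and `(p, z)` is the side information, which is exactly the trade-off triangle (imperceptibility, capacity, side information) the abstract describes.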

  10. Sparse learning of stochastic dynamical equations

    NASA Astrophysics Data System (ADS)

    Boninsegna, Lorenzo; Nüske, Feliks; Clementi, Cecilia

    2018-06-01

    With the rapid increase of available data for complex systems, there is great interest in the extraction of physically relevant information from massive datasets. Recently, a framework called Sparse Identification of Nonlinear Dynamics (SINDy) has been introduced to identify the governing equations of dynamical systems from simulation data. In this study, we extend SINDy to stochastic dynamical systems which are frequently used to model biophysical processes. We prove the asymptotic correctness of stochastic SINDy in the infinite data limit, both in the original and projected variables. We discuss algorithms to solve the sparse regression problem arising from the practical implementation of SINDy and show that cross validation is an essential tool to determine the right level of sparsity. We demonstrate the proposed methodology on two test systems, namely, the diffusion in a one-dimensional potential and the projected dynamics of a two-dimensional diffusion process.
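The stochastic extension can be sketched end to end: simulate an SDE, estimate the drift by finite differences, and recover a sparse drift law with sequentially thresholded least squares (a minimal sketch of the SINDy-style regression; the library, parameter values, and threshold are our own choices, and real use would add the cross-validation the authors emphasize):

```python
import numpy as np

# Simulate an Ornstein-Uhlenbeck process dx = -x dt + 0.5 dW (Euler-Maruyama).
rng = np.random.default_rng(0)
dt, n = 1e-3, 500_000
dW = 0.5 * np.sqrt(dt) * rng.standard_normal(n - 1)
x = np.empty(n)
x[0] = 1.0
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + dW[t]

# Candidate drift library and a finite-difference (Kramers-Moyal) drift estimate.
theta = np.column_stack([np.ones(n - 1), x[:-1], x[:-1] ** 2, x[:-1] ** 3])
y = np.diff(x) / dt

def stlsq(A, b, thresh=0.25, n_iter=10):
    """Sequentially thresholded least squares, the regression at SINDy's core."""
    xi = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        xi[np.abs(xi) < thresh] = 0.0        # prune small coefficients
        big = np.abs(xi) >= thresh
        if big.any():                        # refit on the surviving terms
            xi[big] = np.linalg.lstsq(A[:, big], b, rcond=None)[0]
    return xi

xi = stlsq(theta, y)   # a sparse drift estimate, close to [0, -1, 0, 0]
```

The diffusion term would be estimated the same way from the second Kramers-Moyal moment, `np.diff(x) ** 2 / dt`.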

  11. The origins of enzyme kinetics.

    PubMed

    Cornish-Bowden, Athel

    2013-09-02

    The equation commonly called the Michaelis-Menten equation is sometimes attributed to other authors. However, although Victor Henri had derived the equation from the correct mechanism, and Adrian Brown before him had proposed the idea of enzyme saturation, it was Leonor Michaelis and Maud Menten who showed that this mechanism could also be deduced on the basis of an experimental approach that paid proper attention to pH and spontaneous changes in the product after formation in the enzyme-catalysed reaction. By using initial rates of reaction they avoided the complications due to substrate depletion, product accumulation and progressive inactivation of the enzyme that had made attempts to analyse complete time courses very difficult. Their methodology has remained the standard approach to steady-state enzyme kinetics ever since. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
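For reference, the rate law this history concerns, evaluated at illustrative parameter values:

```python
def michaelis_menten(s, vmax, km):
    """Initial rate of an enzyme-catalysed reaction: v0 = Vmax*[S] / (Km + [S])."""
    return vmax * s / (km + s)

# At [S] = Km the initial rate is exactly half of Vmax,
# which is how Km is read off a saturation curve.
v = michaelis_menten(s=2.0, vmax=10.0, km=2.0)   # -> 5.0
```

Working with initial rates, as Michaelis and Menten did, is what lets this single-substrate form ignore substrate depletion and product accumulation.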

  12. Experimental generalized quantum suppression law in Sylvester interferometers

    NASA Astrophysics Data System (ADS)

    Viggianiello, Niko; Flamini, Fulvio; Innocenti, Luca; Cozzolino, Daniele; Bentivegna, Marco; Spagnolo, Nicolò; Crespi, Andrea; Brod, Daniel J.; Galvão, Ernesto F.; Osellame, Roberto; Sciarrino, Fabio

    2018-03-01

    Photonic interference is a key quantum resource for optical quantum computation, and in particular for so-called boson sampling devices. In interferometers with certain symmetries, genuine multiphoton quantum interference effectively suppresses certain sets of events, as in the original Hong–Ou–Mandel effect. Recently, it was shown that some classical and semi-classical models could be ruled out by identifying such suppressions in Fourier interferometers. Here we propose a suppression law suitable for random-input experiments in multimode Sylvester interferometers, and verify it experimentally using 4- and 8-mode integrated interferometers. The observed suppression occurs for a much larger fraction of input–output combinations than what is observed in Fourier interferometers of the same size, and could be relevant to certification of boson sampling machines and other experiments relying on bosonic interference, such as quantum simulation and quantum metrology.

  13. Self-calibration for lensless color microscopy.

    PubMed

    Flasseur, Olivier; Fournier, Corinne; Verrier, Nicolas; Denis, Loïc; Jolivet, Frédéric; Cazier, Anthony; Lépine, Thierry

    2017-05-01

    Lensless color microscopy (also called in-line digital color holography) is a recent quantitative 3D imaging method used in several areas including biomedical imaging and microfluidics. Because such setups target cost-effective and compact designs, the wavelength of the low-end sources used is known only imprecisely, in particular because of their dependence on temperature and power supply voltage. This imprecision is the source of biases during the reconstruction step. An additional source of error is the crosstalk phenomenon, i.e., the mixture in color sensors of signals originating from different color channels. We propose to use a parametric inverse problem approach to achieve self-calibration of a digital color holographic setup. This process provides an estimation of the central wavelengths and crosstalk. We show that taking the crosstalk phenomenon into account in the reconstruction step improves its accuracy.

  14. Quasi-multi-pulse voltage source converter design with two control degrees of freedom

    NASA Astrophysics Data System (ADS)

    Vural, A. M.; Bayindir, K. C.

    2015-05-01

    In this article, the design details of a quasi-multi-pulse voltage source converter (VSC) switched at the line frequency of 50 Hz are given in a step-by-step process. The proposed converter is comprised of four 12-pulse converter units, which is suitable for the simulation of single-/multi-converter flexible alternating current transmission system devices as well as high voltage direct current systems operating at the transmission level. The magnetic interface of the converter is an original design, with all parameters given for 100 MVA operation. The so-called two-angle control method is adopted to control the voltage magnitude and the phase angle of the converter independently. PSCAD simulation results verify both four-quadrant converter operation and closed-loop control of the converter operated as a static synchronous compensator (STATCOM).

  15. Geometric constrained variational calculus. III: The second variation (Part II)

    NASA Astrophysics Data System (ADS)

    Massa, Enrico; Luria, Gianvittorio; Pagani, Enrico

    2016-03-01

    The problem of minimality for constrained variational calculus is analyzed within the class of piecewise differentiable extremaloids. A fully covariant representation of the second variation of the action functional based on a family of local gauge transformations of the original Lagrangian is proposed. The necessity of pursuing a local adaptation process, rather than the global one described in [1] is seen to depend on the value of certain scalar attributes of the extremaloid, here called the corners’ strengths. On this basis, both the necessary and the sufficient conditions for minimality are worked out. In the discussion, a crucial role is played by an analysis of the prolongability of the Jacobi fields across the corners. Eventually, in the appendix, an alternative approach to the concept of strength of a corner, more closely related to Pontryagin’s maximum principle, is presented.

  16. Learning object-to-class kernels for scene classification.

    PubMed

    Zhang, Lei; Zhen, Xiantong; Shao, Ling

    2014-08-01

    High-level image representations have drawn increasing attention in visual recognition, e.g., scene classification, since the invention of the object bank. The object bank represents an image as a response map of a large number of pretrained object detectors and has achieved superior performance for visual recognition. In this paper, based on the object bank representation, we propose the object-to-class (O2C) distances to model scene images. In particular, four variants of O2C distances are presented, and with the O2C distances, we can represent the images using the object bank by lower-dimensional but more discriminative spaces, called distance spaces, which are spanned by the O2C distances. Due to the explicit computation of O2C distances based on the object bank, the obtained representations can possess more semantic meaning. To combine the discriminant ability of the O2C distances to all scene classes, we further propose to kernelize the distance representation for the final classification. We have conducted extensive experiments on four benchmark data sets, UIUC-Sports, Scene-15, MIT Indoor, and Caltech-101, which demonstrate that the proposed approaches can significantly improve the original object bank approach and achieve state-of-the-art performance.
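One plausible reading of an O2C distance (our own minimal sketch; the paper defines four variants over object-bank response maps): the distance from an image's object-bank response vector to a class is taken as the minimum distance to that class's training responses, and the per-class distances span the low-dimensional distance space:

```python
import numpy as np

def o2c_distance(resp, class_resps):
    """Distance from one image's object-bank response vector to a class,
    here the minimum Euclidean distance to that class's training responses
    (one simple variant; the paper presents four)."""
    return min(np.linalg.norm(resp - r) for r in class_resps)

# Toy object-bank responses: each vector holds detector scores for one image.
beach = [np.array([0.9, 0.1, 0.0]), np.array([0.8, 0.2, 0.1])]
forest = [np.array([0.1, 0.9, 0.7]), np.array([0.0, 0.8, 0.9])]
query = np.array([0.85, 0.15, 0.05])

# The query image in "distance space": one coordinate per scene class.
feats = [o2c_distance(query, beach), o2c_distance(query, forest)]
```

A kernel over these distance-space features (e.g. an RBF on `feats`) would then feed the final classifier, mirroring the kernelization step the abstract describes.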

  17. The Structural and Functional Organization of Cognition

    PubMed Central

    Snow, Peter J.

    2016-01-01

    This article proposes that what have been historically and contemporarily defined as different domains of human cognition are served by one of four functionally- and structurally-distinct areas of the prefrontal cortex (PFC). Their contributions to human intelligence are as follows: (a) BA9 enables our emotional intelligence, engaging the psychosocial domain; (b) BA47 enables our practical intelligence, engaging the material domain; (c) BA46 (or BA46-9/46) enables our abstract intelligence, engaging the hypothetical domain; and (d) BA10 enables our temporal intelligence, engaging in planning within any of the other three domains. Given their unique contributions to human cognition, it is proposed that these areas be called the social (BA9), material (BA47), abstract (BA46-9/46) and temporal (BA10) mind. The evidence that BA47 participates strongly in verbal and gestural communication suggests that language evolved primarily as a consequence of the extreme selective pressure for practicality; an observation supported by the functional connectivity between BA47 and orbital areas that negatively reinforce lying. It is further proposed that the abstract mind (BA46-9/46) is the primary seat of metacognition, charged with creating adaptive behavioral strategies by generating higher-order concepts (hypotheses) from lower-order concepts originating from the other three domains of cognition. PMID:27799901

  18. Cosmic Ray Acceleration by a Versatile Family of Galactic Wind Termination Shocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bustard, Chad; Zweibel, Ellen G.; Cotter, Cory, E-mail: bustard@wisc.edu

    2017-01-20

    There are two distinct breaks in the cosmic ray (CR) spectrum: the so-called “knee” around 3 × 10^15 eV and the so-called “ankle” around 10^18 eV. Diffusive shock acceleration (DSA) at supernova remnant (SNR) shock fronts is thought to accelerate galactic CRs to energies below the knee, while an extragalactic origin is presumed for CRs with energies beyond the ankle. CRs with energies between 3 × 10^15 and 10^18 eV, which we dub the “shin,” have an unknown origin. It has been proposed that DSA at galactic wind termination shocks, rather than at SNR shocks, may accelerate CRs to these energies. This paper uses the galactic wind model of Bustard et al. to analyze whether galactic wind termination shocks may accelerate CRs to shin energies within a reasonable acceleration time and whether such CRs can subsequently diffuse back to the Galaxy. We argue for acceleration times on the order of 100 Myr rather than a few billion years, as assumed in some previous works, and we discuss prospects for magnetic field amplification at the shock front. Ultimately, we generously assume that the magnetic field is amplified to equipartition. This formalism allows us to obtain analytic formulae, applicable to any wind model, for CR acceleration. Even with generous assumptions, we find that very high wind velocities are required to set up the necessary conditions for acceleration beyond 10^17 eV. We also estimate the luminosities of CRs accelerated by outflow termination shocks, including estimates for the Milky Way wind.

  19. 78 FR 18377 - Self-Regulatory Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Adopt a New Order Type Called the ``Auto-Ex Only'' Order March 19, 2013. On January... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt a new order type called the...

  20. A Probabilistic Model of Spin and Spin Measurements

    NASA Astrophysics Data System (ADS)

    Niehaus, Arend

    2016-01-01

    Several theoretical publications on the Dirac equation published during the last decades have shown that an interpretation is possible which ascribes the origin of electron spin and magnetic moment to an autonomous circular motion of the point-like charged particle around a fixed centre. More recent publications have suggested an extension of the original so-called "Zitterbewegung Interpretation" of quantum mechanics, in which the spin results from an average of instantaneous spin vectors over a Zitterbewegung period. We argue that the corresponding autonomous motion of the electron should, if it is real, determine non-relativistic spin measurements. Such a direct connection with the established formal quantum mechanical description of spin measurements, into which spin is introduced as a "non-classical" quantity, has, to our knowledge, not been reported. In the present work we show that, under certain "model assumptions" concerning the proposed autonomous motion, results of spin measurements, including measurements of angular correlations in singlet systems, can indeed be correctly described using classical probabilities. The success of the model is evidence for the "reality" of the assumed autonomous motion. The resulting model violates the Bell inequalities to the same extent as quantum mechanics.

  1. [Language and reality: the origin of man].

    PubMed

    Maturana, H

    1989-07-01

    The author proposes: 1. That a lineage of living systems is constituted by the reproductive conservation of a manner of living under the form of an ontogenic phenotype. 2. That language is a manner of living in recurrent consensual coordinations of consensual coordinations of actions. 3. That the human manner of living entails, among other things, a braiding of languaging and emotioning that we call conversation. 4. That human beings arise in the history of bipedal primates with the origin of language, and the constitution of a lineage defined by the conservation of an ontogenic phenotype that includes conversations as part of it. 5. That the magnitude of the involvement of the brain and the anatomy of the larynx and face in speech as our main manner of languaging indicates that language cannot have arisen later than two to three million years ago. 6. That rationality pertains to the operational coherences of languaging and that different rational domains are constituted by different basic notions that are accepted a priori, that is, on preference. 7. That responsibility and freedom are a function of our awareness of the participation of our emotions (preferences) in the constitution of the rational domains in which we operate.

  2. The revival of General Relativity at Princeton: Daring Conservatism

    NASA Astrophysics Data System (ADS)

    Brill, Dieter; Blum, Alexander

    2018-01-01

    After General Relativity was established in essentially its present form in 1915, it was celebrated as a great success of mathematical physics. But the initial hopes for this theory as a basis for all of physics began to fade over the next several decades, as General Relativity was relegated to the margins of theoretical physics. Its fortunes began to rise in the 1950s with a revival of interest and research that over time made gravitational physics the thriving research field it is today. One center of this renaissance was Princeton, where two relative newcomers explored new and different approaches to gravitational physics. Robert Dicke showed that gravity is not as inaccessible to experiment as was thought, and John Wheeler propelled it into the mainstream by proposing highly original and imaginative consequences of Einstein's theory. We will concentrate on these ideas that, in his characteristically intriguing style, Wheeler called "Daring Conservatism" - a term well known to his associates, but one he never mentioned in print. With the aid of unpublished manuscripts and notes we will explore Daring Conservatism's origin and motivation, its successes and failures, and the legacy it left behind.

  3. Deep Investigation of Arabidopsis thaliana Junk DNA Reveals a Continuum between Repetitive Elements and Genomic Dark Matter

    PubMed Central

    Maumus, Florian; Quesneville, Hadi

    2014-01-01

    Eukaryotic genomes contain highly variable amounts of DNA with no apparent function. This so-called junk DNA is composed of two components: repeated and repeat-derived sequences (together referred to as the repeatome), and non-annotated sequences also known as genomic dark matter. Because of their high duplication rates as compared to other genomic features, transposable elements are predominant contributors to the repeatome and the products of their decay is thought to be a major source of genomic dark matter. Determining the origin and composition of junk DNA is thus important to help understanding genome evolution as well as host biology. In this study, we have used a combination of tools enabling to show that the repeatome from the small and reducing A. thaliana genome is significantly larger than previously thought. Furthermore, we present the concepts and results from a series of innovative approaches suggesting that a significant amount of the A. thaliana dark matter is of repetitive origin. As a tentative standard for the community, we propose a deep compendium annotation of the A. thaliana repeatome that may help addressing farther genome evolution as well as transcriptional and epigenetic regulation in this model plant. PMID:24709859

  4. MCM Paradox: Abundance of Eukaryotic Replicative Helicases and Genomic Integrity.

    PubMed

    Das, Mitali; Singh, Sunita; Pradhan, Satyajit; Narayan, Gopeshwar

    2014-01-01

    As a crucial component of the DNA replication licensing system, the minichromosome maintenance (MCM) 2-7 complex acts as the eukaryotic DNA replicative helicase. The six related MCM proteins form a heterohexamer and bind with ORC, CDC6, and Cdt1 to form the prereplication complex. Although the MCMs are well known as replicative helicases, their overabundance and distribution patterns on chromatin present a paradox called the "MCM paradox." Several approaches have been taken to solve the MCM paradox and describe the purpose of excess MCMs distributed beyond the replication origins. Alternative functions of these MCMs, other than as a helicase, have also been proposed. This review focuses on several models and concepts generated to solve the MCM paradox coinciding with their helicase function and provides insight into the concept that excess MCMs are meant for licensing dormant origins as a backup during replication stress. Finally, we extend our view towards the effect of alteration of MCM levels. Though an excess MCM constituent is needed for normal cells to withstand stress, there must be a delineation of the threshold level in normal and malignant cells. This review also outlines future prospects for better understanding MCM biology.

  5. MCM Paradox: Abundance of Eukaryotic Replicative Helicases and Genomic Integrity

    PubMed Central

    Das, Mitali; Singh, Sunita; Pradhan, Satyajit

    2014-01-01

    As a crucial component of the DNA replication licensing system, the minichromosome maintenance (MCM) 2–7 complex acts as the eukaryotic DNA replicative helicase. The six related MCM proteins form a heterohexamer and bind with ORC, CDC6, and Cdt1 to form the prereplication complex. Although the MCMs are well known as replicative helicases, their overabundance and distribution patterns on chromatin present a paradox called the “MCM paradox.” Several approaches have been taken to solve the MCM paradox and describe the purpose of excess MCMs distributed beyond the replication origins. Alternative functions of these MCMs, other than as a helicase, have also been proposed. This review focuses on several models and concepts generated to solve the MCM paradox coinciding with their helicase function and provides insight into the concept that excess MCMs are meant for licensing dormant origins as a backup during replication stress. Finally, we extend our view towards the effect of alteration of MCM levels. Though an excess MCM constituent is needed for normal cells to withstand stress, there must be a delineation of the threshold level in normal and malignant cells. This review also outlines future prospects for better understanding MCM biology. PMID:25386362

  6. Sequence analyses reveal that a TPR-DP module, surrounded by recombinable flanking introns, could be at the origin of eukaryotic Hop and Hip TPR-DP domains and prokaryotic GerD proteins.

    PubMed

    Hernández Torres, Jorge; Papandreou, Nikolaos; Chomilier, Jacques

    2009-05-01

    The co-chaperone Hop [heat shock protein (HSP) organising protein] is known to bind both Hsp70 and Hsp90. Hop comprises three repeats of a tetratricopeptide repeat (TPR) domain, each consisting of three TPR motifs. The first and last TPR domains are followed by a domain containing several dipeptide (DP) repeats called the DP domain. These analyses suggest that the hop genes result from successive recombination events of an ancestral TPR-DP module. From a hydrophobic cluster analysis of homologous Hop protein sequences derived from gene families, we can postulate that shifts in the open reading frames are at the origin of the present sequences. Moreover, these shifts can be related to the presence or absence of biological function. We propose to extend the family of Hop co-chaperones into the kingdom of bacteria, as several structurally related genes have been identified by hydrophobic cluster analysis. We also provide evidence of common structural characteristics between hop and hip genes, suggesting a shared precursor of ancestral TPR-DP domains.

  7. Tool for Merging Proposals Into DSN Schedules

    NASA Technical Reports Server (NTRS)

    Khanampornpan, Teerapat; Kwok, John; Call, Jared

    2008-01-01

    A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
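The conflict check that merge7da enables can be sketched as a plain interval-overlap test (our own simplified record layout; the real 7da and C1/C2 formats are DSN-specific and far richer than these tuples):

```python
from datetime import datetime

# Hypothetical simplified booking records: (antenna, start, end).
def overlaps(a, b):
    """Two bookings conflict if they use the same antenna at overlapping times."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def find_conflicts(schedule, proposal):
    """Pair every proposed booking against the current schedule."""
    return [(s, p) for s in schedule for p in proposal if overlaps(s, p)]

schedule = [("DSS-14", datetime(2008, 3, 1, 10), datetime(2008, 3, 1, 14))]
proposal = [("DSS-14", datetime(2008, 3, 1, 13), datetime(2008, 3, 1, 16)),
            ("DSS-43", datetime(2008, 3, 1, 13), datetime(2008, 3, 1, 16))]

conflicts = find_conflicts(schedule, proposal)   # only the DSS-14 request collides
```

A merged schedule, as merge7da produces, would simply be the union of the two lists once `conflicts` comes back empty.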

  8. Video Guidance Sensor and Time-of-Flight Rangefinder

    NASA Technical Reports Server (NTRS)

    Bryan, Thomas; Howard, Richard; Bell, Joseph L.; Roe, Fred D.; Book, Michael L.

    2007-01-01

    A proposed video guidance sensor (VGS) would be based mostly on the hardware and software of a prior Advanced VGS (AVGS), with some additions to enable it to function as a time-of-flight rangefinder (in contradistinction to a triangulation or image-processing rangefinder). It would typically be used at distances of the order of 2 or 3 kilometers, where a typical target would appear in a video image as a single blob, making it possible to extract the direction to the target (but not the orientation of the target or the distance to the target) from a video image of light reflected from the target. As described in several previous NASA Tech Briefs articles, an AVGS system is an optoelectronic system that provides guidance for automated docking of two vehicles. In the original application, the two vehicles are spacecraft, but the basic principles of design and operation of the system are applicable to aircraft, robots, objects maneuvered by cranes, or other objects that may be required to be aligned and brought together automatically or under remote control. In a prior AVGS system of the type upon which the now-proposed VGS is largely based, the tracked vehicle is equipped with one or more passive targets that reflect light from one or more continuous-wave laser diode(s) on the tracking vehicle, a video camera on the tracking vehicle acquires images of the targets in the reflected laser light, the video images are digitized, and the image data are processed to obtain the direction to the target. The design concept of the proposed VGS does not call for any memory or processor hardware beyond that already present in the prior AVGS, but does call for some additional hardware and some additional software. It also calls for assignment of some additional tasks to two subsystems that are parts of the prior VGS: a field-programmable gate array (FPGA) that generates timing and control signals, and a digital signal processor (DSP) that processes the digitized video images. The additional timing and control signals generated by the FPGA would cause the VGS to alternate between an imaging (direction-finding) mode and a time-of-flight (range-finding) mode and would govern operation in the range-finding mode.

  9. 77 FR 73500 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change Relating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-10

    ... Trading of Shares of the Horizons S&P 500 Covered Call ETF, Horizons S&P Financial Select Sector Covered Call ETF, and Horizons S&P Energy Select Sector Covered Call ETF Under NYSE Arca Equities Rule 5.2(j)(3... and trade shares (``Shares'') of the Horizons S&P 500 Covered Call ETF, Horizons S&P Financial Select...

  10. An introductory orientation to clinical pathology core and on-call responsibilities.

    PubMed

    Pappas, A A; Drew, M J; Flick, J; Fink, L; Fuller, G L; Hough, A J

    1994-05-01

    An introductory 4-week orientation for clinical pathology is described. There were 76 hours of lectures, 74 hours of conferences, and 68 hours of laboratories for a total of 221 hours. During the orientation, all calls handled by the residents were evaluated as to resolution, patient outcome, and interaction required. Eighty calls were received during the orientation from 57 technologists (71%), 16 physicians (20%), and seven nurses (9%). The calls concerned the following areas: blood banking, 37 (46%); hematology, 21 (27%); chemistry, 14 (18%); microbiology, five (6%); and administration, three (4%). Sixty percent of the calls were consultative and 40% were supervisory. Ninety-nine percent were handled appropriately by the residents. Patient outcome was moderately or significantly affected in 44% of all calls, divided between 67% of all consultative calls and 9% of all supervisory calls. Significant pathologist interaction was required in 49% of all calls, divided between 71% of the consultative calls and 16% of the supervisory calls. Using this integrated, dynamic system of resident instruction, on-call experience, and evaluation, residents quickly gain confidence in handling call, didactic clinical consultation, and patient management. The orientation and on-call system described provides a relevant and dynamic framework for resident education.

  11. 75 FR 8279 - Airworthiness Directives; The Boeing Company Model 747 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... airplanes. The original NPRM would have superseded an existing AD that currently requires repetitive... inspection of the modified area. The original NPRM proposed to continue to require those actions using revised service information. For certain airplanes, the original NPRM proposed to require new repetitive...

  12. 75 FR 34516 - Bureau of Educational and Cultural Affairs; Edmund S. Muskie Graduate Fellowship Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    .... Muskie Graduate Fellowship Program Notice: Correction to original Request for Grant Proposals. SUMMARY... revision to the original Request for Grant Proposals (RFGP) for the Edmund S. Muskie Graduate Fellowship... the original announcement remain the same. Additional Information Interested organizations should...

  13. 76 FR 69328 - Proposed Collection; Comment Request; Race and National Origin Identification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-08

    ... DEPARTMENT OF THE TREASURY Proposed Collection; Comment Request; Race and National Origin... INFORMATION: OMB Number: 1505-0195. Type of Review: Revision of a currently approved collection. Title: Race...Connector, is used to capture race and national origin information electronically from an applicant. The...

  14. 1. VIEW LOOKING NORTHWEST AT BUILDING 444 UNDER CONSTRUCTION. ORIGINALLY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW LOOKING NORTHWEST AT BUILDING 444 UNDER CONSTRUCTION. ORIGINALLY CALLED PLANT A, BUILDING 444 WAS ONE OF THE FIRST BUILDINGS CONSTRUCTED AT THE ROCKY FLATS PLANT. (4/15/52) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  15. Photocopy of original drawing showing Wing A (drawing located at ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of original drawing showing Wing A (drawing located at NAWS China Lake, Division of Public Works). J.T. STAFFORD-J.H. DAVIES- H.L. GOGERTY: DISPENSARY, SICK CALL AND ADMINISTRATION, FLOOR PLAN AND ELEVATIONS - Naval Ordnance Test Station Inyokern, Dispensary, Main Site, Lauritsen Road at McIntyre Street, Ridgecrest, Kern County, CA

  16. Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls

    NASA Astrophysics Data System (ADS)

    Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.

    2015-10-01

    Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.

  17. 25 CFR 224.62 - May a final proposed TERA differ from the original proposed TERA?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false May a final proposed TERA differ from the original proposed TERA? 224.62 Section 224.62 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR ENERGY AND MINERALS TRIBAL ENERGY RESOURCE AGREEMENTS UNDER THE INDIAN TRIBAL ENERGY DEVELOPMENT AND SELF...

  18. 76 FR 2297 - Framework for Next Generation 911 Deployment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-13

    ...: Parties who choose to file by paper must file an original and four copies of each filing. If more than one... determination, call routing, and call signaling in each case. 26. NG911 also provides far more flexibility to... politicians that ``the current communications landscape is a far cry from the one for which the current 9-1-1...

  19. Calling and Vocation at Work: Definitions and Prospects for Research and Practice

    ERIC Educational Resources Information Center

    Dik, Bryan J.; Duffy, Ryan D.

    2009-01-01

    The purpose of this article is to initiate an effort to establish the constructs calling and vocation within counseling psychology. First, updated definitions of calling and vocation, developed with an eye toward stimulating research and providing useful practice applications, are proposed. Next, the authors explain how the constructs apply to the…

  20. A general class of multinomial mixture models for anuran calling survey data

    USGS Publications Warehouse

    Royle, J. Andrew; Link, W.A.

    2005-01-01

    We propose a general framework for modeling anuran abundance using data collected from commonly used calling surveys. The data generated from calling surveys are indices of calling intensity (vocalization of males) that do not have a precise link to actual population size and are sensitive to factors that influence anuran behavior. We formulate a model for calling-index data in terms of the maximum potential calling index that could be observed at a site (the 'latent abundance class'), given its underlying breeding population, and we focus attention on estimating the distribution of this latent abundance class. A critical consideration in estimating the latent structure is imperfect detection, which causes the observed abundance index to be less than or equal to the latent abundance class. We specify a multinomial sampling model for the observed abundance index that is conditional on the latent abundance class. Estimation of the latent abundance class distribution is based on the marginal likelihood of the index data, having integrated over the latent class distribution. We apply the proposed modeling framework to data collected as part of the North American Amphibian Monitoring Program (NAAMP).
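    The marginal likelihood the authors describe (an observed calling index conditional on a latent abundance class, integrated over the latent class distribution) can be sketched numerically. The class prior and detection matrix below are made-up illustrative numbers, not NAAMP estimates:

```python
# Sketch of a Royle-Link style marginal likelihood for a calling index.
# pi[k]    : prior probability that a site's latent abundance class is k
# det[k][y]: probability of observing index y given latent class k;
#            imperfect detection forces y <= k, so det is lower-triangular.

pi = [0.4, 0.3, 0.2, 0.1]  # illustrative latent-class distribution

det = [  # rows: latent class k = 0..3, columns: observed index y = 0..3
    [1.0, 0.0, 0.0, 0.0],
    [0.4, 0.6, 0.0, 0.0],
    [0.2, 0.3, 0.5, 0.0],
    [0.1, 0.2, 0.3, 0.4],
]

def marginal_prob(y: int) -> float:
    """P(observed index = y), integrating over the latent class."""
    return sum(pi[k] * det[k][y] for k in range(len(pi)))

probs = [marginal_prob(y) for y in range(4)]
print(probs)       # marginal distribution of the observed index
print(sum(probs))  # -> 1.0, a valid probability distribution
```

    Estimation then amounts to choosing `pi` (and the detection parameters) to maximize this marginal likelihood over the observed indices.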

  1. Proposal and application of a regional methodology of comparative risk assessment for potentially contaminated sites.

    PubMed

    Marzocchini, Manrico; Tatàno, Fabio; Moretti, Michela Simona; Antinori, Caterina; Orilisi, Stefano

    2018-06-05

    A possible approach for determining soil and groundwater quality criteria for contaminated sites is the comparative risk assessment. Originating from but not limited to Italian interest in a decentralised (regional) implementation of comparative risk assessment, this paper first addresses the proposal of an original methodology called CORIAN REG-M, which was created with initial attention to the context of potentially contaminated sites in the Marche Region (Central Italy). To deepen the technical-scientific knowledge and applicability of the comparative risk assessment, the following characteristics of the CORIAN REG-M methodology appear to be relevant: the simplified but logical assumption of three categories of factors (source and transfer/transport of potential contamination, and impacted receptors) within each exposure pathway; the adaptation to quality and quantity of data that are available or derivable at the given scale of concern; the attention to a reliable but unsophisticated modelling; the achievement of a conceptual linkage to the absolute risk assessment approach; and the potential for easy updating and/or refining of the methodology. Further, the application of the CORIAN REG-M methodology to some case-study sites located in the Marche Region indicated the following: a positive correlation can be expected between air and direct contact pathway scores, as well as between individual pathway scores and the overall site scores based on a root-mean-square algorithm; the exposure pathway, which presents the highest variability of scores, tends to be dominant at sites with the highest computed overall site scores; and the adoption of a root-mean-square algorithm can be expected to emphasise the overall site scoring.
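    The root-mean-square aggregation mentioned for the overall site score can be sketched in a few lines; the pathway scores below are invented for the example and do not reproduce CORIAN REG-M's actual factors or weights:

```python
# Illustrative root-mean-square aggregation of exposure-pathway scores
# into an overall site score. The scores below are made up; the RMS form
# is why a single dominant pathway tends to drive the site ranking.
import math

def overall_site_score(pathway_scores):
    """Root mean square of the pathway scores."""
    n = len(pathway_scores)
    return math.sqrt(sum(s * s for s in pathway_scores) / n)

site_a = [10.0, 20.0, 70.0]  # one dominant pathway
site_b = [35.0, 33.0, 32.0]  # evenly spread pathways
print(round(overall_site_score(site_a), 2))  # -> 42.43
print(round(overall_site_score(site_b), 2))  # -> 33.36
```

    Compared with a plain mean, the RMS emphasises the high-variability pathway, matching the paper's observation that such pathways dominate the highest-scoring sites.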

  2. Securing Provenance of Distributed Processes in an Untrusted Environment

    NASA Astrophysics Data System (ADS)

    Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi

    Recently, there has been much concern about the provenance of distributed processes, that is, the documentation of the origin of an object in a distributed system and of the processes that produced it. Provenance has many applications in the forms of medical records, documentation of processes in computer systems, recording the origin of data in the cloud, and documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) in which each node represents an entity and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of correct nodes, additions of fake nodes and edges, and unauthorized access to sensitive nodes and edges. In this paper, we propose an integrity mechanism for the provenance graph using digital signatures involving three parties: the process executors, who are responsible for creating the nodes; a provenance owner, who records the nodes in the provenance store; and a trusted party that we call the Trusted Counter Server (TCS), which records the number of nodes stored by the provenance owner. We show that the mechanism can detect integrity problems in the provenance graph, namely unauthorized and malicious “authorized” updates, even if all the parties except the TCS collude to update the provenance. In this scheme, the TCS needs only minimal storage (linear in the number of provenance owners). To protect confidentiality and allow efficient access-control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism and perform experiments to measure the performance of both mechanisms.
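    The counting idea behind the TCS can be sketched as follows: the owner stores signed nodes, the TCS records only how many nodes the owner has committed, and a verifier detects deletions by comparing the count. All names are invented for the sketch, and an HMAC stands in for the paper's digital signatures:

```python
# Toy sketch of the Trusted Counter Server (TCS) idea: the TCS stores
# only a per-owner node count, yet deletion of committed provenance
# nodes becomes detectable. HMAC is a stand-in for real signatures.
import hashlib, hmac

KEY = b"process-executor-key"  # assumption: one shared key for the toy

def signed_node(payload: bytes):
    return payload, hmac.new(KEY, payload, hashlib.sha256).digest()

class ProvenanceStore:
    def __init__(self):
        self.nodes = []
    def append(self, node):
        self.nodes.append(node)

class TrustedCounterServer:
    """Stores one integer per owner: the number of committed nodes."""
    def __init__(self):
        self.count = 0
    def record(self):
        self.count += 1

def verify(store, tcs) -> bool:
    if len(store.nodes) != tcs.count:  # deletion/insertion detected
        return False
    return all(hmac.compare_digest(sig,
                                   hmac.new(KEY, p, hashlib.sha256).digest())
               for p, sig in store.nodes)  # forged nodes detected

store, tcs = ProvenanceStore(), TrustedCounterServer()
for payload in [b"read input", b"transform", b"write output"]:
    store.append(signed_node(payload))
    tcs.record()

print(verify(store, tcs))  # -> True
store.nodes.pop()          # a colluding owner deletes a node...
print(verify(store, tcs))  # -> False: the TCS count no longer matches
```

    The point of the construction is the storage asymmetry: the owner holds the full graph, while the TCS holds a single counter per owner, which is why its storage is linear in the number of owners.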

  3. Vecuum: identification and filtration of false somatic variants caused by recombinant vector contamination.

    PubMed

    Kim, Junho; Maeng, Ju Heon; Lim, Jae Seok; Son, Hyeonju; Lee, Junehawk; Lee, Jeong Ho; Kim, Sangwoo

    2016-10-15

    Advances in sequencing technologies have remarkably lowered the detection limit of somatic variants to low frequencies. However, calling mutations at this range is still confounded by many factors, including environmental contamination. Vector contamination is a continuously occurring issue and is especially problematic because vector inserts are hardly distinguishable from the sample sequences. Such inserts, which may harbor polymorphisms and engineered functional mutations, can result in calling false variants at the corresponding sites. Numerous vector-screening methods have been developed, but none can handle contamination from inserts because they focus on vector backbone sequences alone. We developed a novel method, Vecuum, that identifies vector-originated reads and the resultant false variants. Since vector inserts are generally constructed from intron-less cDNAs, Vecuum identifies vector-originated reads by inspecting the clipping patterns at exon junctions. False variant calls are further detected based on the biased distribution of mutant alleles to vector-originated reads. Tests on simulated and spike-in experimental data validated that Vecuum could detect 93% of vector contaminants and could remove up to 87% of variant-like false calls with 100% precision. Application to public sequence datasets demonstrated the utility of Vecuum in detecting false variants resulting from various types of external contamination. A Java-based implementation of the method is available at http://vecuum.sourceforge.net/. Contact: swkim@yuhs.ac. Supplementary data are available at Bioinformatics online.
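    The filtration criterion (a variant whose mutant alleles sit almost exclusively on vector-like reads is suspect) can be sketched as a simple fraction test. The read representation and the 0.9 threshold are illustrative assumptions, not Vecuum's actual statistical test:

```python
# Toy sketch of Vecuum-style filtering: if the mutant allele at a site
# is carried mostly by reads flagged as vector-originated (e.g. clipped
# at exon junctions), the call is likely a contamination artifact.
# The threshold and data layout are illustrative assumptions.

def is_false_call(reads, bias_threshold=0.9):
    """reads: list of (carries_mutant_allele, looks_vector_originated)."""
    mutant = [vec for mut, vec in reads if mut]
    if not mutant:
        return False
    vector_fraction = sum(mutant) / len(mutant)
    return vector_fraction >= bias_threshold

# Mutant allele seen almost only on vector-like reads -> flagged false.
contaminated = [(True, True)] * 9 + [(True, False)] + [(False, False)] * 20
print(is_false_call(contaminated))  # -> True (0.9 vector fraction)

# Mutant allele spread over ordinary reads -> kept as a genuine variant.
genuine = [(True, False)] * 8 + [(True, True)] * 2 + [(False, False)] * 20
print(is_false_call(genuine))       # -> False (0.2 vector fraction)
```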

  4. Cultural Resources Survey of Greenwood Bend and Iowa Point Revetment, Mississippi River M-293.1 to 280-L

    DTIC Science & Technology

    1993-10-01

    as Mesoamerica (Neuman 1984:218). Sometime after A.D. 1000, the Plaquemine phenomenon, originally defined by the Medora Site (16WBR1), continued the...the surface in the study area. Commonly called the Tunica Hills, it corresponds closely with the area originally delineated as the Citronelle...of dissection and structural influence, the original geomorphic expression of the surface has been obliterated, and depositional environment is best

  5. [Lectins from Sambucus nigra L inflorescences: isolation and investigation of biological activity using procaryotic test-systems].

    PubMed

    Karpova, I S; Korets'ka, N V; Pal'chykovs'ka, L H; Nehruts'ka, V V

    2007-01-01

    Isolation of lectins from extracts of Sambucus nigra inflorescences and of pollen material was performed using isoelectric focusing without carrier ampholytes (autofocusing). Fractions active in agglutination tests, with differing carbohydrate specificity, were subjected to SDS-PAGE. The major lectin found in whole inflorescences was GalNAc-specific and is proposed to be a heterotetramer with subunits of about 30 and 33 kDa; it was called SNAflu-I. At least two other lectins were present in the pollen material and are supposed to consist of identical subunits. The major positively charged lectin was Glc/Man-specific, with a subunit of 26 kDa, and was called SNApol-I. The other pollen component (SNApol-II) was Gal-specific, with a subunit of about 20 kDa. In order to elucidate the cell targets sensitive to the S. nigra lectins' activity, the combined effects of the lectins and transcription inhibitors of phenazine origin on B. subtilis cell growth were studied. Only SNApol-I demonstrated antagonistic activity against these inhibitors in vivo. This lectin, but not SNAflu-I, can also inhibit transcription in vitro. It is supposed that lectins from the same source may act in different directions on cell metabolism; in particular, one of the common targets may be DNA-dependent synthesis of RNA.

  6. 34. DIABLO POWERHOUSE: ORIGINAL EQUIPMENT WESTINGHOUSE TYPE J RHEOSTAT. ALTHOUGH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    34. DIABLO POWERHOUSE: ORIGINAL EQUIPMENT WESTINGHOUSE TYPE J RHEOSTAT. ALTHOUGH NOW CONSIDERED OBSOLETE, THE RHEOSTAT IS RETAINED AS BACK-UP EQUIPMENT AND HAS BEEN CALLED INTO SERVICE IN RECENT YEARS WHEN MORE MODERN EQUIPMENT FAILED, 1989. - Skagit Power Development, Diablo Powerhouse, On Skagit River, 6.1 miles upstream from Newhalem, Newhalem, Whatcom County, WA

  7. 75 FR 34950 - Walnuts Grown in California; Changes to the Quality Regulations for Shelled Walnuts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... called ``meal.'' Walnut meal is sold into the market for industrial use, such as in commercial bakery... both the end products and the meal derived from the original lot of shelled walnuts. Providing information about the original lot of walnuts from which the end products and meal were derived assures...

  8. 75 FR 51926 - Walnuts Grown in California; Changes to the Quality Regulations for Shelled Walnuts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-24

    ... process and are called ``meal.'' Walnut meal is sold into the market for industrial use, such as in... both the end products and the meal derived from the original lot of shelled walnuts. Providing information about the original lot of walnuts from which the end products and meal were derived assures...

  9. Fiji Hindustani. Working Papers in Linguistics, Vol. 7, No. 3, May-June 1975.

    ERIC Educational Resources Information Center

    Siegel, Jeffrey

    More than 250,000 of Fiji's citizens are descendants of Indian indentured laborers of diverse origins. There are still distinct social groups based on language, religion, and place of origin. However, nearly all Fiji Indians speak one language called Fiji Hindustani. Other languages, such as Gujarati, Panjabi, Tamil, and Telugu, are still spoken,…

  10. Fermilab Today

    Science.gov Websites

    called a jet. A jet is a spray of particles all moving in the same direction and typically originating see in these two-jet (or "dijet") events. If these jets originate from the lighter quarks masses of one jet against the other, and indeed we see that most of the events in our sample have two

  11. Photocopy of original drawing showing Wing A (drawing located at ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of original drawing showing Wing A (drawing located at NAWS China Lake, Division of Public Works). J.T. STAFFORD-J.H. DAVIES-H.L. GOGERTY: DISPENSARY, SICK CALL AND ADMINISTRATION, ROOF FRAMING PLAN AND DETAILS - Naval Ordnance Test Station Inyokern, Dispensary, Main Site, Lauritsen Road at McIntyre Street, Ridgecrest, Kern County, CA

  12. Evolution of regulatory networks towards adaptability and stability in a changing environment

    NASA Astrophysics Data System (ADS)

    Lee, Deok-Sun

    2014-11-01

    Diverse biological networks exhibit universal features distinguished from those of random networks, calling much attention to their origins and implications. Here we propose a minimal evolution model of Boolean regulatory networks, which evolve by selectively rewiring links towards enhancing adaptability to a changing environment and stability against dynamical perturbations. We find that sparse and heterogeneous connectivity patterns emerge, showing qualitative agreement with real transcriptional regulatory networks and metabolic networks. The characteristic scaling behavior of stability reflects the balance between robustness and flexibility. The scaling of fluctuation in the perturbation spread shows a dynamic crossover, which is analyzed by separately investigating the stochasticity of the internal dynamics and the differences in network structure depending on the evolution pathways. Our study delineates how the ambivalent pressure of evolution shapes biological networks, which can be helpful for studying general complex systems interacting with their environments.
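    The selective-rewiring scheme (keep a random rewiring only if it does not worsen stability against dynamical perturbations) can be sketched with a small random Boolean network. The network size, update rule, and stability score below are simplified stand-ins for the paper's model, not a reproduction of it:

```python
# Minimal sketch of evolving a Boolean network by selective rewiring:
# a trial rewiring is kept only if it does not worsen a crude
# instability score (average spread of a one-node perturbation after a
# few synchronous updates). All parameters are illustrative assumptions.
import random

random.seed(1)
N, K, STEPS = 12, 3, 5

inputs = [random.sample(range(N), K) for _ in range(N)]  # wiring
rules = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state, wiring):
    return [rules[i][sum(state[w] << b for b, w in enumerate(wiring[i]))]
            for i in range(N)]

def instability(wiring, trials=30):
    """Mean Hamming distance caused by one flipped node, after STEPS."""
    total = 0
    for _ in range(trials):
        s = [random.randint(0, 1) for _ in range(N)]
        t = list(s)
        t[random.randrange(N)] ^= 1
        for _ in range(STEPS):
            s, t = step(s, inputs := wiring) and step(s, wiring), step(t, wiring)
        total += sum(a != b for a, b in zip(s, t))
    return total / trials

score = instability(inputs)
start = score
for _ in range(200):                 # selective rewiring loop
    i = random.randrange(N)
    trial = [list(w) for w in inputs]
    trial[i] = random.sample(range(N), K)
    new = instability(trial)
    if new <= score:                 # keep only non-worsening rewirings
        inputs, score = trial, new

print(start, "->", score)  # accepted score never increases
```

    Because a rewiring is accepted only when the evaluated score does not increase, the sequence of accepted scores is monotone non-increasing by construction.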

  13. Catheter and Laryngeal Mask Endotracheal Surfactant Therapy: the CALMEST approach as a novel MIST technique.

    PubMed

    Vannozzi, Ilaria; Ciantelli, Massimiliano; Moscuzza, Francesca; Scaramuzzo, Rosa T; Panizza, Davide; Sigali, Emilio; Boldrini, Antonio; Cuttano, Armando

    2017-10-01

    Neonatal respiratory distress syndrome (RDS) is a major cause of mortality and morbidity among preterm infants. Although the INSURE (INtubation, SURfactant administration, Extubation) technique for surfactant replacement therapy is so far the gold-standard method, over the last years new approaches have been studied, i.e. less invasive surfactant administration (LISA) or minimally invasive surfactant therapy (MIST). Here we propose an originally modified MIST, called CALMEST (Catheter And Laryngeal Mask Endotracheal Surfactant Therapy), using a particular laryngeal mask as a guide for a thin catheter to deliver surfactant directly into the trachea. We performed a preliminary study on a mannequin and a subsequent in vivo pilot trial. This novel procedure is quick, effective and well tolerated, and might represent an improvement in reducing neonatal stress. Ultimately, CALMEST offers an alternative approach that could be extremely useful for medical staff with low expertise in laryngoscopy and intubation.

  14. EVIDENCE FOR POLAR X-RAY JETS AS SOURCES OF MICROSTREAM PEAKS IN THE SOLAR WIND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neugebauer, Marcia, E-mail: mneugeb@lpl.arizona.edu

    2012-05-01

    It is proposed that the interplanetary manifestations of X-ray jets observed in solar polar coronal holes during periods of low solar activity are the peaks of the so-called microstreams observed in the fast polar solar wind. These microstreams exhibit velocity fluctuations of ±35 km s⁻¹, higher kinetic temperatures, slightly higher proton fluxes, and slightly higher abundances of the low-first-ionization-potential element iron relative to oxygen ions than the average polar wind. Those properties can all be explained if the fast microstreams result from the magnetic reconnection of bright-point loops, which leads to X-ray jets which, in turn, result in solar polar plumes. Because most of the microstream peaks are bounded by discontinuities of solar origin, jets are favored over plumes for the majority of the microstream peaks.

  15. Order from noise: Toward a social theory of geographic information

    USGS Publications Warehouse

    Poore, B.S.; Chrisman, N.R.

    2006-01-01

    In the so-called Information Age, it is surprising that the concept of information is imprecisely defined and almost taken for granted. Historic and recent geographic information science (GIScience) literature relies on two conflicting metaphors, often espoused by the same author in adjacent paragraphs. The metaphor of invariance, derived from telecommunications engineering, defines information as a thing to be transported without loss through a conduit. Another metaphor, originating in the utopian movements of the 19th century, locates information within a hierarchy of refinement: a stopping place on the path to convert mere data into higher forms of knowledge and perhaps to wisdom. Both metaphors rely on long-forgotten debates outside geography and preclude us from seeing that there are important social and ethical concerns in the relationship between geographic information technologies and society. We examine the conflicts between competing metaphors and propose a social theory of geographic information.

  16. 3D sensor placement strategy using the full-range pheromone ant colony system

    NASA Astrophysics Data System (ADS)

    Shuo, Feng; Jingqing, Jia

    2016-07-01

    An optimized sensor placement strategy is extremely beneficial for ensuring safety and reducing the cost of structural health monitoring (SHM) systems. Sensors must be placed such that important dynamic information is captured while the number of sensors is minimized. Common practice is to select individual sensor directions with 1D sensor placement methods and then install triaxial sensors along those directions; however, this may lead to non-optimal placement of many triaxial sensors. In this paper, a new method, called FRPACS, is proposed based on the ant colony system (ACS) to solve the optimal placement of triaxial sensors: the triaxial sensors are placed as single units in an optimal fashion. The new method is then compared with other algorithms using the Dalian North Bridge. The computational precision and iteration efficiency of FRPACS are greatly improved compared with the original ACS and the EFI method.
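    The ant-colony mechanics underlying FRPACS can be sketched on a toy subset-selection task: pheromone on candidate sensor locations guides ants that each pick a set of positions, and the best ant reinforces its choice. The coverage objective and all parameters below are illustrative assumptions, not the paper's bridge model or its full-range pheromone scheme:

```python
# Toy ant-colony-system sketch for sensor placement: choose m of n
# candidate locations to maximize a simple spread objective. The
# objective and parameters are illustrative, not FRPACS itself.
import random

random.seed(7)
n, m = 20, 5  # candidate locations, sensors to place
coords = [(random.random(), random.random()) for _ in range(n)]

def score(subset):
    """Toy objective: total pairwise spread of the chosen locations."""
    pts = [coords[i] for i in subset]
    return sum(abs(a[0] - b[0]) + abs(a[1] - b[1])
               for i, a in enumerate(pts) for b in pts[i + 1:])

tau = [1.0] * n                 # pheromone per candidate location
best, best_score = None, -1.0
for _ in range(40):             # colony iterations
    for _ant in range(10):
        chosen, pool = [], list(range(n))
        for _ in range(m):      # sample m distinct pheromone-weighted picks
            pick = random.choices(pool, weights=[tau[i] for i in pool])[0]
            pool.remove(pick)
            chosen.append(pick)
        s = score(chosen)
        if s > best_score:
            best, best_score = chosen, s
    tau = [0.9 * t for t in tau]     # evaporation
    for i in best:                   # best ant reinforces its subset
        tau[i] += best_score / m

print(sorted(best), round(best_score, 3))
```

    Placing triaxial sensors "as single units", as FRPACS does, corresponds here to each pick being a whole location rather than one axis direction at a time.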

  17. Future Research Challenges for a Computer-Based Interpretative 3D Reconstruction of Cultural Heritage - A German Community's View

    NASA Astrophysics Data System (ADS)

    Münster, S.; Kuroczyński, P.; Pfarr-Harfst, M.; Grellert, M.; Lengyel, D.

    2015-08-01

    The workgroup for Digital Reconstruction of the Digital Humanities in the German-speaking area association (Digital Humanities im deutschsprachigen Raum e.V.) was founded in 2014 as a cross-disciplinary scientific society dealing with all aspects of digital reconstruction of cultural heritage and currently involves more than 40 German researchers. The workgroup is also dedicated to synchronising and fostering methodological research on these topics. As one preliminary result, a memorandum was created to name urgent research challenges and prospects in condensed form and to assemble a research agenda proposing demands for further research and development activities within the next years. The version presented within this paper was originally created as a contribution to the so-called agenda development process initiated by the German Federal Ministry of Education and Research (BMBF) in 2014 and has been amended during a joint meeting of the digital reconstruction workgroup in November 2014.

  18. The angular structure of jet quenching within a hybrid strong/weak coupling model

    NASA Astrophysics Data System (ADS)

    Casalderrey-Solana, Jorge; Gulhan, Doga Can; Milhano, José Guilherme; Pablos, Daniel; Rajagopal, Krishna

    2017-08-01

    Building upon the hybrid strong/weak coupling model for jet quenching, we incorporate and study the effects of transverse momentum broadening and medium response of the plasma to jets on a variety of observables. For inclusive jet observables, we find little sensitivity to the strength of broadening. To constrain those dynamics, we propose new observables constructed from ratios of differential jet shapes, in which particles are binned in momentum, which are sensitive to the in-medium broadening parameter. We also investigate the effect of the back-reaction of the medium on the angular structure of jets as reconstructed with different cone radii R. Finally we provide results for the so-called "missing-pT", finding a qualitative agreement between our model calculations and data in many respects, although a quantitative agreement is beyond our simplified treatment of the hadrons originating from the hydrodynamic wake.

  19. Simultaneous transmission for an encrypted image and a double random-phase encryption key

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng; Zhou, Xin; Li, Da-Hai; Zhou, Ding-Fu

    2007-06-01

    We propose a method to simultaneously transmit double random-phase encryption key and an encrypted image by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image have been taken in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, a double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adelman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image that carries both the encrypted image and the encoded key is delivered to the receiver. Based on such a method, the receiver can have an acceptable result and secure transmission can be guaranteed by the RSA cipher system.

  20. Simultaneous transmission for an encrypted image and a double random-phase encryption key.

    PubMed

    Yuan, Sheng; Zhou, Xin; Li, Da-hai; Zhou, Ding-fu

    2007-06-20

    We propose a method to simultaneously transmit double random-phase encryption key and an encrypted image by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image have been taken in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, a double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adelman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image that carries both the encrypted image and the encoded key is delivered to the receiver. Based on such a method, the receiver can have an acceptable result and secure transmission can be guaranteed by the RSA cipher system.
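    The key-delivery step described in the two records above (encode the random-phase key with RSA, then carry it in the amplitude of the encrypted image) can be sketched with textbook small-number RSA. The tiny primes, the stand-in key symbols, and the modulation pairing are purely illustrative; real use would require padded RSA and the optical double-random-phase encryption itself:

```python
# Toy sketch of the proposed transmission: RSA-encode a stand-in for
# the random-phase key, then pair the encoded key with the encrypted
# image amplitudes so one transmission carries both. Textbook RSA with
# tiny primes -- illustrative only, not secure.

p, q = 61, 53                        # assumption: classic toy primes
n = p * q                            # modulus, 3233
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

key_symbols = [7, 42, 199]           # stand-in for the phase-key data
encoded_key = [pow(k, e, n) for k in key_symbols]

# "Encrypted image" amplitudes (stand-in values); each amplitude is
# paired with one encoded-key symbol for transmission.
amplitudes = [120, 87, 255]
encoded_image = list(zip(amplitudes, encoded_key))

# Receiver side: strip off the encoded key and decrypt it with d.
received_key = [pow(c, d, n) for _, c in encoded_image]
print(received_key)  # -> [7, 42, 199], the original key symbols
```

    Only the holder of `d` can recover the phase key, which is what lets the RSA layer secure the otherwise symmetric double-random-phase scheme.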

  1. Human Immunodeficiency Virus Playing Hide-and-Seek: Understanding the TFH Cell Reservoir and Proposing Strategies to Overcome the Follicle Sanctuary.

    PubMed

    Leong, Yew Ann; Atnerkar, Anurag; Yu, Di

    2017-01-01

    Human immunodeficiency virus (HIV) infects millions of people worldwide, and new cases continue to emerge. Once infected, the virus cannot be cleared by the immune system and causes acquired immunodeficiency syndrome. A combination antiretroviral therapeutic regimen effectively suppresses viral replication and halts disease progression. The treatment, however, does not eliminate the virus-infected cells, and interruption of treatment inevitably leads to viral rebound. The rebound virus originates from a group of virus-infected cells referred to as the cellular reservoir of HIV. Identifying and eliminating the HIV reservoir will prevent viral rebound and cure HIV infection. In this review, we focus on a recently discovered HIV reservoir in a subset of CD4+ T cells called the follicular helper T (TFH) cells. We describe the potential mechanisms for the emergence of the reservoir in TFH cells, and the strategies to target and eliminate this viral reservoir.

  2. Controls on Cyclic Formation of Quaternary Early Diagenetic Dolomite

    NASA Astrophysics Data System (ADS)

    McCormack, J.; Bontognali, T. R. R.; Immenhauser, A.; Kwiecien, O.

    2018-04-01

    The origin of sedimentary dolomite and the factors that control its formation within the geological record remain speculative. In most models, dolomite formation is linked to evaporative conditions, high water temperature, increasing Mg/Ca ratio, increasing alkalinity, and high amounts of biomass. Here we challenge these archetypal views, by documenting a case example of Quaternary dolomite which formed in Lake Van at constantly low temperature (<4°C) and without direct control of the latter conditions. Dolomite occurs within highstand sediments related to suborbital climate variability (Dansgaard-Oeschger cycles). We propose that dolomite precipitation is a product of a microbially influenced process, triggered by ecological stress, resulting from reventilation of the water-sediment interface. Independently from the validity of this hypothesis, our results call for a reevaluation of the paleoenvironmental conditions often invoked for early diagenetic dolomite-rich intervals within sedimentary sequences and for caution when interpreting time series of subrecent lacustrine carbonates.

  3. Number-theoretic nature of communication in quantum spin systems.

    PubMed

    Godsil, Chris; Kirkland, Stephen; Severini, Simone; Smith, Jamie

    2012-08-03

    The last decade has witnessed substantial interest in protocols for transferring information on networks of quantum mechanical objects. A variety of control methods and network topologies have been proposed, on the basis that transfer with perfect fidelity (i.e., deterministic and without information loss) is impossible through unmodulated spin chains with more than a few particles. Solving the original problem formulated by Bose [Phys. Rev. Lett. 91, 207901 (2003)], we determine the exact number of qubits in unmodulated chains (with an XY Hamiltonian) that permit transfer with a fidelity arbitrarily close to 1, a phenomenon called pretty good state transfer. We prove that this happens if and only if the number of nodes is n = p - 1 or n = 2p - 1, where p is a prime, or n = 2^m - 1. The result highlights the potential of quantum spin system dynamics for reinterpreting questions about the arithmetic structure of integers and, in this case, primality.
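    The number-theoretic condition stated above is easy to check directly; the sketch below (illustrative, not from the paper) tests whether a chain of n qubits falls into one of the three stated families.

    ```python
    def is_prime(k):
        """Trial-division primality test (sufficient for small n)."""
        if k < 2:
            return False
        if k % 2 == 0:
            return k == 2
        f = 3
        while f * f <= k:
            if k % f == 0:
                return False
            f += 2
        return True

    def admits_pretty_good_transfer(n):
        """True iff n = p - 1 or n = 2p - 1 for a prime p, or n = 2**m - 1."""
        if is_prime(n + 1):                               # n = p - 1
            return True
        if (n + 1) % 2 == 0 and is_prime((n + 1) // 2):   # n = 2p - 1
            return True
        # n = 2**m - 1  <=>  n + 1 is a power of two greater than 1
        return (n + 1) & n == 0 and n + 1 > 1
    ```

    For example, chains of 4 (= 5 - 1), 5 (= 2*3 - 1), and 7 (= 2^3 - 1) qubits qualify, while a chain of 8 does not.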

  4. Vehicle Classification Using an Imbalanced Dataset Based on a Single Magnetic Sensor.

    PubMed

    Xu, Chang; Wang, Yingguan; Bao, Xinghe; Li, Fengrong

    2018-05-24

    This paper aims to improve the accuracy of automatic vehicle classifiers for imbalanced datasets. Classification is performed using a single anisotropic magnetoresistive sensor, with vehicles classified into hatchbacks, sedans, buses, and multi-purpose vehicles (MPVs). Using time-domain and frequency-domain features in combination with three common classification algorithms in pattern recognition (the k-nearest neighbor, the support vector machine, and the back-propagation neural network), we develop a novel feature extraction method for vehicle classification. A problem remains, however: the original vehicle magnetic dataset collected is imbalanced, which may lead to inaccurate classification results. With this in mind, we adopt an oversampling approach called SMOTE, which can further boost the performance of classifiers. Experimental results show that the k-nearest neighbor (KNN) classifier with the SMOTE algorithm can reach a classification accuracy of 95.46%, thus minimizing the effect of the imbalance.
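    SMOTE itself is a simple interpolation scheme: each synthetic sample lies on the segment between a minority-class point and one of its k nearest minority-class neighbors. A minimal NumPy sketch (illustrative only; the paper's feature pipeline and parameter choices are not reproduced here):

    ```python
    import numpy as np

    def smote(X_minority, n_new, k=5, rng=None):
        """Minimal SMOTE sketch: synthesize n_new points by interpolating
        minority samples toward one of their k nearest minority neighbors."""
        rng = np.random.default_rng(rng)
        X = np.asarray(X_minority, dtype=float)
        # pairwise distances within the minority class
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        k = min(k, len(X) - 1)
        neigh = np.argsort(d, axis=1)[:, :k]   # k nearest neighbor indices
        synth = []
        for _ in range(n_new):
            i = rng.integers(len(X))           # pick a minority sample
            j = neigh[i, rng.integers(k)]      # pick one of its neighbors
            u = rng.random()                   # interpolation fraction in [0, 1)
            synth.append(X[i] + u * (X[j] - X[i]))
        return np.array(synth)
    ```

    Each synthetic point is a convex combination of two real minority samples, so the oversampled set stays inside the minority class's local geometry.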

  5. [Alternative splicing regulation: implications in cancer diagnosis and treatment].

    PubMed

    Martínez-Montiel, Nancy; Rosas-Murrieta, Nora; Martínez-Contreras, Rebeca

    2015-04-08

    The accurate expression of genetic information is regulated by processes like mRNA splicing, proposed after the discoveries of Phil Sharp and Richard Roberts, who demonstrated the existence of intronic sequences, present in almost every structural eukaryotic gene, which must be precisely removed. This intron removal is called "splicing", and alternative splicing generates different proteins, with different or even antagonistic functions, from a single pre-mRNA. We currently know that alternative splicing is the most important source of protein diversity, given that 70% of human genes undergo alternative splicing and that mutations causing defects in this process could originate up to 50% of genetic diseases, including cancer. When these defects occur in genes involved in cell adhesion, proliferation and cell cycle regulation, there is an impact on cancer progression, raising the opportunity to diagnose and treat some types of cancer according to a particular splicing profile. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  6. Analytical investigation of the faster-is-slower effect with a simplified phenomenological model

    NASA Astrophysics Data System (ADS)

    Suzuno, K.; Tomoeda, A.; Ueyama, D.

    2013-11-01

    We analytically investigate the mechanism of the phenomenon called the "faster-is-slower" effect in pedestrian flow studies with a simplified phenomenological model. It is well known that, when the discharge of self-driven particles through a bottleneck is simulated with the social force model, the flow rate is maximized at a certain strength of the driving force. In this study, we propose a phenomenological and analytical model, grounded in mechanics-based modeling, to reveal the mechanism of the phenomenon. We show that our reduced system, with only a few degrees of freedom, retains the essential properties of the original many-particle system, and that the effect arises from the competition between the driving force and the nonlinear friction in the model. Moreover, we qualitatively predict the parameter dependences of the effect from our model, and they are confirmed numerically using the social force model.

  7. Practical Strategies for Collaboration across Discipline-Based Education Research and the Learning Sciences

    PubMed Central

    Peffer, Melanie; Renken, Maggie

    2016-01-01

    Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, the aim of developing successful interdisciplinary research programs raises important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology: discipline-based education research and the learning sciences. We discuss differences between each discipline's approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. PMID:27881446

  8. Thermodynamic origin of surface melting on ice crystals

    PubMed Central

    Murata, Ken-ichiro; Asakawa, Harutoshi; Nagashima, Ken; Furukawa, Yoshinori; Sazaki, Gen

    2016-01-01

    Since the pioneering prediction of surface melting by Michael Faraday, it has been widely accepted that thin water layers, called quasi-liquid layers (QLLs), homogeneously and completely wet ice surfaces. Contrary to this conventional wisdom, here we demonstrate both theoretically and experimentally that QLLs have more than two wetting states and that there is a first-order wetting transition between them. Furthermore, we find that QLLs appear not only under supersaturated conditions, as recently reported, but also at undersaturation, yet are absent at equilibrium. This means that QLLs are a metastable transient state formed through vapor growth and sublimation of ice, casting serious doubt on the conventional understanding, which presupposes the spontaneous formation of QLLs in ice–vapor equilibrium. We propose a simple but general physical model that consistently explains these aspects of surface melting and QLLs. Our model shows that a unique interfacial potential solely controls both the wetting and the thermodynamic behavior of QLLs. PMID:27791107

  9. Learning-Based Just-Noticeable-Quantization- Distortion Modeling for Perceptual Video Coding.

    PubMed

    Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk

    2018-07-01

    Conventional predictive video coding approaches are reaching the limit of their potential coding efficiency improvements because of severely increasing computational complexity. As an alternative, perceptual video coding (PVC) attempts to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. Previous JND models were built by adding white Gaussian noise or specific signal patterns to the original images, which is not appropriate for finding the thresholds of distortions that reduce signal energy. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. The proposed ERJND model is then extended to two learning-based just-noticeable-quantization-distortion (JNQD) models that can be applied as preprocessing for perceptual video coding. The two JNQD models automatically adjust JND levels based on given quantization step sizes. The first, called LR-JNQD, is based on linear regression and determines its model parameters from extracted handcrafted features. The second, called CNN-JNQD, is based on a convolutional neural network (CNN). To the best of our knowledge, this is the first approach to automatically adjust JND levels according to quantization step sizes when preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to high efficiency video coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation, compared with input without preprocessing.
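    As an illustration of the linear-regression idea behind LR-JNQD (the paper's actual features and training data are not available here; everything below, including the feature names and weights, is synthetic), one can fit JND levels to handcrafted per-block features plus the quantization step size by ordinary least squares:

    ```python
    import numpy as np

    # Synthetic training data: per-block handcrafted features (stand-ins for
    # e.g. luminance and texture measures) plus the quantization step size,
    # paired with JND levels generated from a known linear rule.
    rng = np.random.default_rng(0)
    features = rng.random((200, 3))              # 3 handcrafted features per block
    qstep = rng.uniform(8, 64, size=(200, 1))    # quantization step sizes
    X = np.hstack([features, qstep])
    true_w = np.array([0.5, -0.2, 0.8, 0.05])    # assumed ground-truth weights
    jnd = X @ true_w + 1.0                       # JND levels with bias 1.0

    # LR-JNQD-style fit: ordinary least squares with a bias column
    A = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(A, jnd, rcond=None)

    predicted = A @ w                            # JND level per block
    ```

    On noiseless synthetic data the recovered weights match the generating rule exactly; in practice the regression would be trained against measured JND thresholds.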

  10. Hypothesis Testing as an Act of Rationality

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that we face these problems because we have historically failed to account for a fundamental component of basic logic, namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) logical calculus is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. It also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism; in practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in hypothetico-deductive and/or Bayesian hypothesis testing. References: Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science; Cox (1946) Probability, Frequency and Reasonable Expectation; Knuth (2005) Lattice Duality: The Origin of Probability and Entropy; Grünwald (2006) Bayesian Inconsistency under Misspecification; Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  11. HIV, HCV, HBV, and syphilis among transgender women from Brazil

    PubMed Central

    Bastos, Francisco I.; Bastos, Leonardo Soares; Coutinho, Carolina; Toledo, Lidiane; Mota, Jurema Corrêa; Velasco-de-Castro, Carlos Augusto; Sperandei, Sandro; Brignol, Sandra; Travassos, Tamiris Severino; dos Santos, Camila Mattos; Malta, Monica Siqueira

    2018-01-01

    Abstract Different sampling strategies, analytic alternatives, and estimators have been proposed to better assess the characteristics of different hard-to-reach populations and their respective infection rates (as well as their sociodemographic characteristics, associated harms, and needs) in the context of studies based on respondent-driven sampling (RDS). Despite several methodological advances and hundreds of empirical studies implemented worldwide, some inchoate findings and methodological challenges remain. The in-depth assessment of the local structure of networks and the performance of the available estimators are particularly relevant when the target populations are sparse and highly stigmatized. In such populations, bottlenecks as well as other sources of biases (for instance, due to homophily and/or too sparse or fragmented groups of individuals) may be frequent, affecting the estimates. In the present study, data were derived from a cross-sectional, multicity RDS study, carried out in 12 Brazilian cities with transgender women (TGW). Overall, infection rates for HIV and syphilis were very high, with some variation between different cities. Notwithstanding, findings are of great concern, considering the fact that female TGW are not only very hard-to-reach but also face deeply-entrenched prejudice and have been out of the reach of most therapeutic and preventive programs and projects. We cross-compared findings adjusted using two estimators: the classic estimator usually known as estimator II, originally proposed by Volz and Heckathorn, and a new strategy for adjusting data generated by RDS, partially based on Bayesian statistics, called here, for the sake of this paper, the RDS-B estimator. Adjusted prevalences were cross-compared with estimates generated by non-weighted analyses, using what we term a naïve estimator, or rough estimates. PMID:29794601

  12. HIV, HCV, HBV, and syphilis among transgender women from Brazil: Assessing different methods to adjust infection rates of a hard-to-reach, sparse population.

    PubMed

    Bastos, Francisco I; Bastos, Leonardo Soares; Coutinho, Carolina; Toledo, Lidiane; Mota, Jurema Corrêa; Velasco-de-Castro, Carlos Augusto; Sperandei, Sandro; Brignol, Sandra; Travassos, Tamiris Severino; Dos Santos, Camila Mattos; Malta, Monica Siqueira

    2018-05-01

    Different sampling strategies, analytic alternatives, and estimators have been proposed to better assess the characteristics of different hard-to-reach populations and their respective infection rates (as well as their sociodemographic characteristics, associated harms, and needs) in the context of studies based on respondent-driven sampling (RDS). Despite several methodological advances and hundreds of empirical studies implemented worldwide, some inchoate findings and methodological challenges remain. The in-depth assessment of the local structure of networks and the performance of the available estimators are particularly relevant when the target populations are sparse and highly stigmatized. In such populations, bottlenecks as well as other sources of biases (for instance, due to homophily and/or too sparse or fragmented groups of individuals) may be frequent, affecting the estimates. In the present study, data were derived from a cross-sectional, multicity RDS study, carried out in 12 Brazilian cities with transgender women (TGW). Overall, infection rates for HIV and syphilis were very high, with some variation between different cities. Notwithstanding, findings are of great concern, considering the fact that female TGW are not only very hard-to-reach but also face deeply-entrenched prejudice and have been out of the reach of most therapeutic and preventive programs and projects. We cross-compared findings adjusted using two estimators: the classic estimator usually known as estimator II, originally proposed by Volz and Heckathorn, and a new strategy for adjusting data generated by RDS, partially based on Bayesian statistics, called here, for the sake of this paper, the RDS-B estimator. Adjusted prevalences were cross-compared with estimates generated by non-weighted analyses, using what we term a naïve estimator, or rough estimates.
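    For reference, the RDS-II estimator of Volz and Heckathorn mentioned above weights each respondent by the inverse of his or her reported network degree, so that high-degree (easier to recruit) respondents count for less. A minimal sketch (variable names are ours):

    ```python
    def rds_ii_prevalence(degrees, infected):
        """Volz-Heckathorn RDS-II estimator: degree-weighted prevalence.

        degrees  -- reported network degree of each respondent (d_i > 0)
        infected -- boolean outcome for each respondent
        """
        inv = [1.0 / d for d in degrees]                      # weight 1/d_i
        num = sum(w for w, y in zip(inv, infected) if y)      # weighted positives
        return num / sum(inv)
    ```

    With two respondents of degrees 1 and 4, only the first infected, the naïve (unweighted) prevalence is 0.5, while RDS-II gives (1/1) / (1/1 + 1/4) = 0.8, reflecting that the low-degree respondent stands in for a larger share of the hidden population.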

  13. A Giant in the Shadows: Major General Benjamin Foulois and the Rise of the Army Air Service in World War I

    DTIC Science & Technology

    2010-06-01

    America’s original military aviators, he flew the Army’s first dirigible balloon and its first airplane, learning to fly from early aviation pioneers such...entire first enlistment, Foulois asked everyone to call him “Ben.” No one asked him about the origin of his nickname, and he never volunteered the...during the Mexican Punitive Expedition. As a result, the War Department and Congress increased the original Fiscal Year 1917 aviation budget from

  14. Uncovering the Chemistry of Earth-like Planets

    NASA Astrophysics Data System (ADS)

    Zeng, Li; Jacobsen, Stein; Sasselov, Dimitar D.

    2015-01-01

    We propose to use evidence from our solar system to understand exoplanets, and in particular, to predict their surface chemistry and thereby the possibility of life. An Earth-like planet, born from the same nebula as its host star, is composed primarily of silicate rocks and an iron-nickel metal core, and depleted in volatile content in a systematic manner. The more volatile (easier to vaporize or dissociate into gas form) an element is in an Earth-like planet, the more depleted the element is compared to its host star. After depletion, an Earth-like planet would go through the process of core formation due to heat from radioactive decay and collisions. Core formation depletes a planet's rocky mantle of siderophile (iron-loving) elements, in addition to the volatile depletion. After that, Earth-like planets likely accrete some volatile-rich materials, called 'late veneer'. The late veneer could be essential to the origins of life on Earth and Earth-like planets, as it also delivers the volatiles such as nitrogen, sulfur, carbon and water to the planet's surface, which are crucial for life to occur. We plan to build an integrative model of Earth-like planets from the bottom up. We would like to infer their chemical compositions from their mass-radius relations and their host stars' elemental abundances, and understand the origins of volatile contents (especially water) on their surfaces, and thereby shed light on the origins of life on them.

  15. A new higher performance NGO satellite for direct audio/video broadcast

    NASA Astrophysics Data System (ADS)

    Briskman, Robert D.; Foust, Joseph V.

    2010-03-01

    A three-satellite constellation using non-geostationary orbits (NGO) was launched in the latter half of 2000. It provides direct satellite broadcast audio and video services to over 9 million mobile and fixed subscribers throughout North America. The constellation will be augmented with a geostationary satellite called FM-5 in 2009, providing increased availability to the user with this "Hybrid" constellation. Work has recently started on replacement satellites for the original NGO satellites, the first called FM-6. This new satellite will be placed in a different orbital plane from the original ones, yielding a constellation with further operational improvements. The paper describes the new satellite, which has twice the prime and radio frequency (RF) power of the original and a 9 m diameter aperture transmit antenna whose shaped beam delivers much higher effective isotropic radiated power (EIRP). Other technology advances used in the satellite, such as electric propulsion, precision star sensors, and enhanced-performance lithium-ion batteries, are also described in the paper.

  16. Cultivating Advanced Technical Writing Skills through a Graduate-Level Course on Writing Research Proposals

    ERIC Educational Resources Information Center

    McCarthy, Brian D.; Dempsey, Jillian L.

    2017-01-01

    A graduate-level course on writing original research proposals is introduced to address the uneven preparation in technical writing of new chemistry graduate students. The general course structure features extensive group discussions, small-group activities, and regular in-class…

  17. Free Vibration Analysis of DWCNTs Using CDM and Rayleigh-Schmidt Based on Nonlocal Euler-Bernoulli Beam Theory

    PubMed Central

    2014-01-01

    The free vibration response of double-walled carbon nanotubes (DWCNTs) is investigated. The DWCNTs are modelled as two beams, interacting through van der Waals forces, and the nonlocal Euler-Bernoulli beam theory is used. The governing equations of motion are derived using a variational approach, and the free vibration frequencies are obtained by two different approaches. In the first method, the two coupled nanotubes are discretized by means of the so-called "cell discretization method" (CDM), in which each nanotube is reduced to a set of rigid bars linked together by elastic cells. The resulting discrete system takes into account nonlocal effects, constraint elasticities, and the van der Waals forces. The second proposed approach, belonging to the semianalytical methods, is an optimized version of the classical Rayleigh quotient, as originally proposed by Schmidt; the resulting conditions are solved numerically. Numerical examples, in which the two approaches provide lower and upper bounds to the true values, end the paper, and comparisons of the present results with those from the open literature show excellent agreement. PMID:24715807
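    The Rayleigh-quotient idea behind the second approach can be illustrated on a single simply supported Euler-Bernoulli beam (toy unit parameters, not the coupled DWCNT system of the paper): any admissible trial shape yields an upper bound on the fundamental frequency.

    ```python
    import numpy as np

    def trapezoid(y, x):
        # simple trapezoidal rule, independent of the NumPy version
        return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

    E_I = 1.0    # bending stiffness EI (assumed toy value)
    rho_A = 1.0  # mass per unit length rho*A (assumed toy value)
    L = 1.0      # beam length

    x = np.linspace(0.0, L, 200001)
    w = x * (L - x)                 # trial shape, satisfies w(0) = w(L) = 0
    w_dd = np.full_like(x, -2.0)    # exact second derivative of the trial shape

    # Rayleigh quotient: omega^2 = int EI (w'')^2 dx / int rho*A w^2 dx
    omega_rayleigh = np.sqrt(trapezoid(E_I * w_dd**2, x) /
                             trapezoid(rho_A * w**2, x))
    # exact first frequency of a simply supported beam: pi^2 * sqrt(EI/(rho*A*L^4))
    omega_exact = np.pi**2 * np.sqrt(E_I / (rho_A * L**4))
    ```

    The quadratic trial shape gives omega = sqrt(120) ≈ 10.95, an upper bound about 11% above the exact pi^2 ≈ 9.87; Schmidt's optimization refines the trial family to tighten such bounds.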

  18. Contour-Based Corner Detection and Classification by Using Mean Projection Transform

    PubMed Central

    Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein

    2014-01-01

    Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images. PMID:24590354
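    The parabolic-fit localization step can be illustrated generically (this is the standard three-point subsample peak refinement, not the paper's exact MPT procedure): fit a parabola through a discrete response maximum and its two neighbors, and take the vertex as the refined position.

    ```python
    def parabolic_refine(y, i):
        """Refine a discrete peak at index i by fitting a parabola through
        (i-1, y[i-1]), (i, y[i]), (i+1, y[i+1]); returns the vertex position."""
        y0, y1, y2 = y[i - 1], y[i], y[i + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom == 0.0:
            return float(i)          # flat neighborhood: keep integer position
        return i + 0.5 * (y0 - y2) / denom
    ```

    Sampling an exact parabola recovers its true vertex; on real corner-response curves the refinement typically reduces localization error well below one sample.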

  19. Contour-based corner detection and classification by using mean projection transform.

    PubMed

    Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein

    2014-02-28

    Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images.

  20. Quantum fluctuations and CMB anisotropies in one-bubble open inflation models

    NASA Astrophysics Data System (ADS)

    Yamamoto, Kazuhiro; Sasaki, Misao; Tanaka, Takahiro

    1996-10-01

    We first develop a method to calculate a complete set of mode functions that describe the quantum fluctuations generated in one-bubble open inflation models. We consider two classes of models. One is a single scalar field model proposed by Bucher, Goldhaber, and Turok and by us as an example of the open inflation scenario, and the other is a two-field model such as the ``supernatural'' inflation proposed by Linde and Mezhlumian. In both cases we assume the difference in the vacuum energy density between inside and outside the bubble is negligible. There are two kinds of mode functions. One kind has the usual continuous spectrum and the other has a discrete spectrum with characteristic wavelengths exceeding the spatial curvature scale. The latter can be further divided into two classes in terms of its origin. One is called the de Sitter supercurvature mode, which arises due to the global spacetime structure of de Sitter space, and the other is due to fluctuations of the bubble wall. We calculate the spectrum of quantum fluctuations in these models and evaluate the resulting large angular scale CMB anisotropies. We find there are ranges of model parameters that are consistent with observed CMB anisotropies.

  1. On the Latent Variable Interpretation in Sum-Product Networks.

    PubMed

    Peharz, Robert; Gens, Robert; Pernkopf, Franz; Domingos, Pedro

    2017-10-01

    One of the central themes in sum-product networks (SPNs) is the interpretation of sum nodes as marginalized latent variables (LVs). This interpretation adds syntactic and semantic structure, and allows the application of the EM algorithm and efficient MPE inference. In the literature, the LV interpretation was justified by explicitly introducing the indicator variables corresponding to the LVs' states. However, as pointed out in this paper, this approach conflicts with the completeness condition in SPNs and does not fully specify the probabilistic model. We propose a remedy by modifying the original approach for introducing the LVs, which we call SPN augmentation. We discuss conditional independencies in augmented SPNs, formally establish the probabilistic interpretation of the sum weights, and give an interpretation of augmented SPNs as Bayesian networks. Based on these results, we find a sound derivation of the EM algorithm for SPNs. Furthermore, the Viterbi-style algorithm for MPE proposed in the literature was never proven to be correct. We show that it is indeed correct when applied to selective SPNs, and in particular when applied to augmented SPNs. Our theoretical results are confirmed in experiments on synthetic data and 103 real-world datasets.
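    A toy example helps make the setting concrete. The sketch below (all parameters invented for illustration) builds a minimal complete and decomposable SPN over two binary variables, with a root sum node mixing two product nodes over Bernoulli leaves, and shows that marginalizing a variable amounts to setting its leaf indicators to 1:

    ```python
    # Assumed toy leaf parameters: leaf name -> P(X = 1) for that Bernoulli leaf
    leaves = {
        "a1": 0.9, "a2": 0.2,   # leaves under the first product node
        "b1": 0.3, "b2": 0.7,   # leaves under the second product node
    }

    def leaf(p1, x):
        """Bernoulli leaf value; x = None marginalizes (indicator set to 1)."""
        return 1.0 if x is None else (p1 if x == 1 else 1.0 - p1)

    def spn(x1, x2, w=(0.6, 0.4)):
        """Root sum node over two product nodes (complete: same scope {X1, X2};
        decomposable: each product's children have disjoint scopes)."""
        prod1 = leaf(leaves["a1"], x1) * leaf(leaves["a2"], x2)
        prod2 = leaf(leaves["b1"], x1) * leaf(leaves["b2"], x2)
        return w[0] * prod1 + w[1] * prod2

    # the network defines a normalized distribution over (X1, X2)
    total = sum(spn(x1, x2) for x1 in (0, 1) for x2 in (0, 1))
    ```

    Under the LV interpretation, the root's weights (0.6, 0.4) are the prior of a latent mixture variable selecting which product node generated the data.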

  2. How protein chemists learned about the hydrophobic factor.

    PubMed Central

    Tanford, C.

    1997-01-01

    It is generally accepted today that the hydrophobic force is the dominant energetic factor that leads to the folding of polypeptide chains into compact globular entities. This principle was first explicitly introduced to protein chemists in 1938 by Irving Langmuir, past master in the application of hydrophobicity to other problems, and was enthusiastically endorsed by J.D. Bernal. But both proposal and endorsement came in the course of a debate about a quite different structural principle, the so-called "cyclol hypothesis" proposed by D. Wrinch, which soon proved to be theoretically and experimentally unsupportable. Being a more tangible idea, directly expressed in structural terms, the cyclol hypothesis received more attention than the hydrophobic principle and the latter never actually entered the mainstream of protein science until 1959, when it was thrust into the limelight in a lucid review by W. Kauzmann. A theoretical paper by H.S. Frank and M. Evans, not itself related to protein folding, probably played a major role in the acceptance of the hydrophobicity concept by protein chemists because it provided a crude but tangible picture of the origin of hydrophobicity per se in terms of water structure. PMID:9194199

  3. Contour detection improved by context-adaptive surround suppression.

    PubMed

    Sang, Qiang; Cai, Biao; Chen, Hao

    2017-01-01

    Recently, many image processing applications have taken advantage of a psychophysical and neurophysiological mechanism called "surround suppression" to extract object contours from natural scenes. However, these traditional methods often adopt a single suppression model and a fixed input parameter called the "inhibition level", which must be specified manually. To overcome these drawbacks, we propose a novel model, called "context-adaptive surround suppression", which automatically controls the effect of surround suppression according to local image contextual features measured by a surface estimator based on a local linear kernel. Moreover, a dynamic suppression method and its stopping mechanism are introduced to avoid manual intervention. The proposed algorithm is demonstrated and validated by a broad range of experimental results.
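    The basic surround-suppression operation (the fixed-inhibition-level baseline, not the paper's context-adaptive version, which adapts the suppression strength locally) can be sketched as subtracting an annular average of edge energy from each pixel's response: edges embedded in busy texture get suppressed, while isolated contours survive.

    ```python
    import numpy as np

    def surround_suppress(energy, alpha=1.0, r_in=1, r_out=3):
        """Subtract an annular ('surround') average of edge energy from each
        pixel's response and clamp at zero. alpha is the inhibition level."""
        H, W = energy.shape
        out = np.zeros((H, W), dtype=float)
        yy, xx = np.mgrid[-r_out:r_out + 1, -r_out:r_out + 1]
        ring = (yy**2 + xx**2 > r_in**2) & (yy**2 + xx**2 <= r_out**2)
        pad = np.pad(energy, r_out, mode="edge")
        for i in range(H):
            for j in range(W):
                patch = pad[i:i + 2 * r_out + 1, j:j + 2 * r_out + 1]
                surround = patch[ring].mean()          # annular average
                out[i, j] = max(0.0, energy[i, j] - alpha * surround)
        return out
    ```

    On a uniformly textured patch (energy everywhere) the response is suppressed to zero, while a single isolated contour keeps most of its energy.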

  4. Sports Training Support Method by Self-Coaching with Humanoid Robot

    NASA Astrophysics Data System (ADS)

    Toyama, S.; Ikeda, F.; Yasaka, T.

    2016-09-01

    This paper proposes a new training support method called self-coaching with humanoid robots. The proposed method uses two small, inexpensive humanoid robots because of their availability. One robot, called the target robot, reproduces the motion of a target player, and the other, called the reference robot, reproduces the motion of an expert player. The target player can recognize the target technique from the reference robot and his/her inadequate skill from the target robot. By modifying the motion of the target robot as self-coaching, the target player can gain deeper insight into the technique. Some experimental results show the potential of the new training method, and some issues with the self-coaching interface program are identified as future work.

  5. Self-Criticism: A Measure of Uncompassionate Behaviors Toward the Self, Based on the Negative Components of the Self-Compassion Scale.

    PubMed

    Montero-Marín, Jesús; Gaete, Jorge; Demarzo, Marcelo; Rodero, Baltasar; Lopez, Luiz C Serrano; García-Campayo, Javier

    2016-01-01

    The use of the Self-Compassion Scale (SCS) as a single measure has been pointed out as problematic by many authors, and its originally proposed structure has repeatedly been called into question. The negative facets of this construct are more strongly related to psychopathology than the positive indicators. The aim of this study was to evaluate and compare the different structures proposed for the SCS, including a new measure based only on the negative factors, and to assess the psychometric features of the most plausible solution. The study employed a cross-sectional and cross-cultural design. A sample of Brazilian (n = 406) and Spanish (n = 416) primary care professionals completed the SCS and other questionnaires measuring psychological health-related variables. The SCS factor structure was estimated using confirmatory factor analysis by the maximum likelihood method. Internal consistency was assessed by squaring the correlation between the latent true variable and the observed variables. The relationships between the SCS and other constructs were analyzed using Spearman's r_s. The structure with the best fit comprised the three negative first-order factors of "self-judgment", "isolation" and "over-identification", and one negative second-order factor, named "self-criticism" [CFI = 0.92; RMSEA = 0.06 (90% CI = 0.05-0.07); SRMR = 0.05]. This solution was supported by both samples, presented partial metric invariance [CFI = 0.91; RMSEA = 0.06 (90% CI = 0.05-0.06); SRMR = 0.06], and showed significant correlations with other health-related psychological constructs. Reliability was adequate for all the dimensions (R ≥ 0.70). The original structure proposed for the SCS was not supported by the data. Self-criticism, comprising only the negative SCS factors, might be a measure of uncompassionate behaviors toward the self, with good psychometric properties and practical implications from a clinical point of view, reaching a stable structure and overcoming possible methodological artifacts.

  6. Failure Waves in Glass and Ceramics under Shock Compression

    NASA Astrophysics Data System (ADS)

    Singh Brar, N.

    1999-06-01

The response of various types of glasses (fused silica, borosilicates, soda-lime, and lead filled) to shock wave loading, especially the failure of glass behind the shock wave through the so-called failure wave or front, has been the subject of intense research among a number of investigators. The variations in material properties across this front include complete loss of tensile (spall) strength, loss in shear strength, reduction in acoustic impedance, and opacity to light. Both the stress and velocity histories from VISAR measurements have shown that the failure front propagates at a speed of 1.5 to 2.5 mm/μs, depending on the peak shock stress level. The shear strength [τ = 1/2(σ_x - σ_y)] behind the failure front, determined using embedded transverse gauges, is found to decrease to about 2 GPa for soda-lime, borosilicate, and filled glasses. The optical (high-speed photography) observations also confirm the formation of the failure front. There is general agreement among the various researchers on these observations. However, the three proposed mechanisms for the formation of the failure front are based on totally different formulations. The first, due to Clifton, is based on the process of nucleation of local densification due to shock compression, followed by shear failure around inhomogeneities, resulting in a phase boundary between the comminuted and the intact material. The second, proposed by Grady, involves the transfer of elastic shear strain energy to dilatant strain energy as a result of severe microcracking originating from the impact face. The third, by Espinosa and Brar, proposes that the front is created through shear microcracks, which nucleate and propagate from the impact face, as originally suggested by Kanel. This mechanism is incorporated in a multiple-plane model, and simulations predict the increase in lateral stress and the observed reduction in spall strength behind the failure front. 
Failure front studies, in terms of loss of shear strength, have recently been extended to alumina and SiC ceramics by Bourne et al.

  7. On the Origin and Evolution of Stellar Chromospheres, Coronae and Winds

    NASA Technical Reports Server (NTRS)

    Musielak, Z. E.

    2000-01-01

    This grant was awarded by NASA to The University of Alabama in Huntsville (UAH) to construct state-of-the-art, theoretical, two-component, chromospheric models for single stars of different spectral types and different evolutionary status. In our proposal, we suggested using these models to predict the level of the "basal flux", the observed range of variation of chromospheric activity for a given spectral type, and the decrease of this activity with stellar age. In addition, for red giants and supergiants, we also proposed to construct self-consistent, purely theoretical wind models, and to use these models to investigate the origin of "dividing lines" in the H-R diagram. In the following, we describe our completed work. We have accomplished the first main goal of our proposal by constructing the first purely theoretical, time-dependent and two-component models of stellar chromospheres. The models require specifying only three basic stellar parameters, namely, the effective temperature, gravity and rotation rate, and they take into account non-magnetic and magnetic regions in stellar chromospheres. The non-magnetic regions are heated by acoustic waves generated by the turbulent convection in the stellar subphotospheric layers. The magnetic regions are identified with magnetic flux tubes uniformly distributed over the entire stellar surface and they are heated by longitudinal tube waves generated by turbulent motions in the subphotospheric and photospheric layers. The coverage of the stellar surface by magnetic regions (the so-called filling factor) is estimated for a given rotation rate from an observational relationship. The constructed models are time-dependent and are based on the energy balance between the amount of mechanical energy supplied by waves and radiative losses in strong Ca II and Mg II emission lines. To calculate the amount of wave energy in the non-magnetic regions, we have used the Lighthill-Stein theory for sound generation.

  8. Evolution of the vertebrate Pax4/6 class of genes with focus on its novel member, the Pax10 gene.

    PubMed

    Feiner, Nathalie; Meyer, Axel; Kuraku, Shigehiro

    2014-06-19

    The members of the paired box (Pax) family regulate key developmental pathways in many metazoans as tissue-specific transcription factors. Vertebrate genomes typically possess nine Pax genes (Pax1-9), which are derived from four proto-Pax genes in the vertebrate ancestor that were later expanded through the so-called two-round (2R) whole-genome duplication. A recent study proposed that pax6a genes of a subset of teleost fishes (namely, acanthopterygians) are remnants of a paralog generated in the 2R genome duplication, to be renamed pax6.3, and reported one more group of vertebrate Pax genes (Pax6.2), most closely related to the Pax4/6 class. We propose to designate this new member Pax10 instead and reconstruct the evolutionary history of the Pax4/6/10 class with solid phylogenetic evidence. Our synteny analysis showed that Pax4, -6, and -10 originated in the 2R genome duplications early in vertebrate evolution. The phylogenetic analyses of relationships between teleost pax6a and other Pax4, -6, and -10 genes, however, do not support the proposed hypothesis of an ancient origin of the acanthopterygian pax6a genes in the 2R genome duplication. Instead, we confirmed the traditional scenario that the acanthopterygian pax6a is derived from the more recent teleost-specific genome duplication. Notably, Pax6 is present in all vertebrates surveyed to date, whereas Pax4 and -10 were lost multiple times in independent vertebrate lineages, likely because of their restricted expression patterns: Among Pax6-positive domains, Pax10 has retained expression in the adult retina alone, which we documented through in situ hybridization and quantitative reverse transcription polymerase chain reaction experiments on zebrafish, Xenopus, and anole lizard. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  9. The Use of Original Sources and Its Potential Relation to the Recruitment Problem

    ERIC Educational Resources Information Center

    Jankvist, Uffe Thomas

    2014-01-01

    Based on a study about using original sources with Danish upper secondary students, the paper addresses the potential outcome of such an approach in regard to the so-called recruitment problem to the mathematical sciences. 24 students were exposed to questionnaire questions and 16 of these to follow-up interviews, which form the basis for both a…

  10. On Milne-Barbier-Unsöld relationships

    NASA Astrophysics Data System (ADS)

    Paletou, Frédéric

    2018-04-01

    This short review aims to clarify the origins of the so-called Eddington-Barbier relationships, which relate the emergent specific intensity and the flux to the photospheric source function at specific optical depths. Here we discuss the assumptions behind the original derivation of Barbier (1943). We also point out that Milne had already formulated these two relations in 1921.

  11. Simultaneous Planning and Control for Autonomous Ground Vehicles

    DTIC Science & Technology

    2009-02-01

    these applications is called A* (A-star), and it was originally developed by Hart, Nilsson, and Raphael [HAR68]. Their research presented the formal...sequence, rather than a dynamic programming approach. A* search is a technique originally developed for Artificial Intelligence applications ... developed at the Center for Intelligent Machines and Robotics, serves as a platform for the implementation and testing discussed. autonomous

  12. On the Origins of the Task Mixing Cost in the Cuing Task-Switching Paradigm

    ERIC Educational Resources Information Center

    Rubin, Orit; Meiran, Nachshon

    2005-01-01

    Poorer performance in conditions involving task repetition within blocks of mixed tasks relative to task repetition within blocks of single task is called mixing cost (MC). In 2 experiments exploring 2 hypotheses regarding the origins of MC, participants either switched between cued shape and color tasks, or they performed them as single tasks.…

  13. The Origins of Modernity: Was Autonomous Speech the Critical Factor?

    ERIC Educational Resources Information Center

    Corballis, Michael C.

    2004-01-01

    Although Homo sapiens emerged in Africa some 170,000 years ago, the origins of "modern" behavior, as expressed in technology and art, are attributed to people who migrated out of Africa around 50,000 years ago, creating what has been called a human revolution in Europe and Asia. There is recent evidence that a mutation of the FOXP2 gene (forkhead…

  14. 7 CFR 1737.21 - The completed loan application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... application consists of four parts: (1) A completed RUS Form 490. (2) A market survey called the Area Coverage Survey (ACS). (3) The plan and associated costs for the proposed construction, called the Loan Design (LD...

  15. 7 CFR 1737.21 - The completed loan application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... application consists of four parts: (1) A completed RUS Form 490. (2) A market survey called the Area Coverage Survey (ACS). (3) The plan and associated costs for the proposed construction, called the Loan Design (LD...

  16. 7 CFR 1737.21 - The completed loan application.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... application consists of four parts: (1) A completed RUS Form 490. (2) A market survey called the Area Coverage Survey (ACS). (3) The plan and associated costs for the proposed construction, called the Loan Design (LD...

  17. Consumer Perceptions About Pilot Training: An Emotional Response

    NASA Astrophysics Data System (ADS)

    Rosser, Timothy G.

    Civilian pilot training has followed a traditional path for several decades. With a potential pilot shortage approaching, ICAO proposed a new paradigm in pilot training methodology called the Multi-Crew Pilot License (MPL). This new methodology puts a pilot in the cockpit of an airliner with significantly less flight time experience than the traditional methodology. The purpose of this study was to determine to what extent gender, country of origin, and pilot training methodology affect an aviation consumer's willingness to fly. Additionally, this study attempted to determine what emotions mediate a consumer's decision. This study surveyed participants from India and the United States to measure their willingness to fly using the Willingness to Fly Scale, shown to be valid and reliable by Rice et al. (2015). The scale uses a five-point Likert-type scale. In order to determine the mediating emotions, Ekman and Friesen's (1979) universal emotions (happiness, surprise, fear, disgust, anger, and sadness) were used. Data were analyzed using SPSS. Descriptive statistics are provided for respondents' age and willingness to fly values. An ANOVA was conducted to test the first four hypotheses, and Hayes's (2004, 2008) bootstrapping process was used for the mediation analysis. Results indicated a significant main effect for training, F(1, 972) = 227.76, p < .001, ηp² = 0.190, and for country of origin, F(1, 972) = 28.86, p < .001, ηp² = 0.029, and a two-way interaction was indicated between training and country of origin, F(7, 972) = 46.71, p < .001, ηp² = 0.252. Mediation analysis indicated that the emotions anger, fear, happiness, and surprise mediated the effects of training and country of origin on willingness to fly. The findings of this study are important to designers of MPL training programs and airline marketers.

  18. Chimpanzee vocal signaling points to a multimodal origin of human language.

    PubMed

    Taglialatela, Jared P; Russell, Jamie L; Schaeffer, Jennifer A; Hopkins, William D

    2011-04-20

    The evolutionary origin of human language and its neurobiological foundations has long been the object of intense scientific debate. Although a number of theories have been proposed, one particularly contentious model suggests that human language evolved from a manual gestural communication system in a common ape-human ancestor. Consistent with a gestural origins theory are data indicating that chimpanzees intentionally and referentially communicate via manual gestures, and that the production of manual gestures, in conjunction with vocalizations, activates the chimpanzee Broca's area homologue--a region in the human brain that is critical for the planning and execution of language. However, it is not known whether this activity observed in the chimpanzee Broca's area is the result of the chimpanzees producing manual communicative gestures, communicative sounds, or both. This information is critical for evaluating the theory that human language evolved from a strictly manual gestural system. To this end, we used positron emission tomography (PET) to examine neural metabolic activity in the chimpanzee brain. We collected PET data in 4 subjects, all of whom produced manual communicative gestures. However, 2 of these subjects also produced so-called attention-getting vocalizations directed towards a human experimenter. Interestingly, only the two subjects that produced these attention-getting sounds showed greater mean metabolic activity in the Broca's area homologue as compared to a baseline scan; the two subjects that did not produce attention-getting sounds did not. These data contradict an exclusive "gestural origins" theory, for they suggest that it is vocal signaling that selectively activates the Broca's area homologue in chimpanzees. 
In other words, the activity observed in the Broca's area homologue reflects the production of vocal signals by the chimpanzees, suggesting that this critical human language region was involved in vocal signaling in the common ancestor of both modern humans and chimpanzees.

  19. Analysis of variation matrix array by bilinear least squares-residual bilinearization (BLLS-RBL) for resolving and quantifying of foodstuff dyes in a candy sample.

    PubMed

    Asadpour-Zeynali, Karim; Maryam Sajjadi, S; Taherzadeh, Fatemeh; Rahmanian, Reza

    2014-04-05

    The bilinear least squares (BLLS) method is one of the most suitable algorithms for second-order calibration. The original BLLS method is not applicable to second-order pH-spectral data when an analyte has more than one spectroscopically active species. Bilinear least squares-residual bilinearization (BLLS-RBL) was developed to achieve the second-order advantage for the analysis of complex mixtures. Although the modified method is useful, the pure profiles cannot be obtained; only linear combinations of them can. Moreover, for prediction of the analyte in an unknown sample, the original RBL algorithm may diverge instead of converging to the desired analyte concentrations. Therefore, a Gauss-Newton RBL algorithm should be used, which is not as simple as the original protocol. Also, the analyte concentration can be predicted on the basis of each of the equilibrating species of the component of interest, and these predictions are not exactly the same. The aim of the present work is to tackle the non-uniqueness problem in the second-order calibration of monoprotic acid mixtures and the divergence of RBL. Each pH-absorbance matrix was pretreated by subtracting the first spectrum from the other spectra in the data set to produce a full-rank array called the variation matrix. The variation matrices were then analyzed uniquely by the original BLLS-RBL, which is more parsimonious than its modified counterpart. The proposed method was applied to simulated as well as real data. Sunset yellow and Carmosine, as monoprotic acids, were determined in a candy sample in the presence of unknown interference by this method. Copyright © 2013 Elsevier B.V. All rights reserved.
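
    The variation-matrix pretreatment described above (subtracting the first spectrum from every other spectrum in the pH series) is simple to illustrate; the function name and data here are hypothetical:

    ```python
    def variation_matrix(spectra):
        """Build the variation matrix from a pH-absorbance matrix: subtract
        the first recorded spectrum (first row) from each subsequent row.
        `spectra` is a list of rows, one absorbance spectrum per pH value."""
        first = spectra[0]
        return [[a - f for a, f in zip(row, first)] for row in spectra[1:]]
    ```

    Each resulting row then holds only the absorbance change relative to the first pH point, which is the transformation the abstract credits with producing a full-rank array.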

  20. An MIP model to schedule the call center workforce and organize the breaks

    NASA Astrophysics Data System (ADS)

    Türker, Turgay; Demiriz, Ayhan

    2016-06-01

    In modern economies, companies place a premium on managing their workforce efficiently, especially in the labor-intensive service sector, since services have become a significant portion of these economies. Tour scheduling is an important tool to minimize overall workforce costs while satisfying minimum service level constraints. In this study, we consider the workforce management problem of an inbound call center: satisfying the call demand within short time periods at minimum cost. We propose a mixed-integer programming model to assign workers to daily shifts, to determine the weekly off-days, and to determine the timings of lunch and other daily breaks for each worker. The proposed model has been verified on weekly demand data observed at a specific call center location of a satellite TV operator. The model was run on both 15- and 10-minute demand estimation periods (planning time intervals).

  1. Formation of Compact Ellipticals in the merging star cluster scenario

    NASA Astrophysics Data System (ADS)

    Urrutia Zapata, Fernanda Cecilia; Theory and star formation group

    2018-01-01

    In recent years, extended old stellar clusters have been observed. They are like globular clusters (GCs) but with larger sizes (a limit of Re = 10 pc is currently seen as reasonable). These extended objects (EOs) cover a huge range of mass. Objects at the low-mass end, with masses comparable to normal globular clusters, are called extended clusters or faint fuzzies (Larsen & Brodie 2000), and objects at the high-mass end are called ultra-compact dwarf galaxies (UCDs). Ultra-compact dwarf galaxies are compact objects with luminosities above the brightest known GCs. UCDs are more compact than typical dwarf galaxies but have comparable luminosities. Usually, a lower mass limit of 2 × 10^6 solar masses is applied. Fellhauer & Kroupa (2002a,b) demonstrated that objects like ECs, FFs and UCDs can be the remnants of the merger of star cluster complexes; this scenario is called the merging star cluster scenario. A more concise study was performed by Bruens et al. (2009, 2011). Our work tries to explain the formation of compact ellipticals (cEs). These objects are a comparatively rare class of spheroidal galaxies, possessing very small Re and high central surface brightnesses (Faber 1973). cEs have the same parameters as extended objects, but they are slightly larger than 100 pc and their luminosities are in the range of -11 to -12 mag. The standard formation scenario for these systems proposes a galactic origin: cEs are the result of tidal stripping and truncation of nucleated larger systems, or they could be a natural extension of the class of elliptical galaxies to lower luminosities and smaller sizes. We want to propose a completely new formation scenario for cEs. In our project we try to model cEs in a similar way to UCDs, using the merging star cluster scenario extended to much higher masses and sizes. We think that the early Universe might have produced star bursts sufficiently strong to form cluster complexes which merge into cEs. 
So far it is observationally unknown whether cEs are dark-matter-dominated objects. If our scenario is true, then they would be dark-matter-free, very extended and massive "star clusters".

  2. Multitask TSK fuzzy system modeling by mining intertask common hidden structure.

    PubMed

    Jiang, Yizhang; Chung, Fu-Lai; Ishibuchi, Hisao; Deng, Zhaohong; Wang, Shitong

    2015-03-01

    The classical fuzzy system modeling methods implicitly assume that data are generated from a single task, which is not in accordance with many practical scenarios where data are acquired from the perspective of multiple tasks. Although one can build an individual fuzzy system model for each task, such an individual modeling approach shows poor generalization ability because it ignores the intertask hidden correlation. In order to circumvent this shortcoming, we consider a general framework for preserving the independent information among different tasks and mining the hidden correlation information among all tasks in multitask fuzzy modeling. In this framework, a low-dimensional subspace (structure) is assumed to be shared among all tasks, and hence to be the hidden correlation information among all tasks. Under this framework, a multitask Takagi-Sugeno-Kang (TSK) fuzzy system model called MTCS-TSK-FS (TSK-FS for multiple tasks with common hidden structure), based on the classical L2-norm TSK fuzzy system, is proposed in this paper. The proposed model can not only take advantage of independent sample information from the original space for each task, but also effectively use the intertask common hidden structure among multiple tasks to enhance the generalization performance of the built fuzzy systems. Experiments on synthetic and real-world datasets demonstrate the applicability and distinctive performance of the proposed multitask fuzzy system model in multitask regression learning scenarios.

  3. 77 FR 44291 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-27

    ... volatility. On the day of the monthly expiration of VIX call options, previously purchased VIX call options are cash-settled, and new VIX call options are purchased at the 10 a.m., Central Time asking price... Index. \\9\\ Tail hedging, in the context used by the Index Provider, is the practice of trying to hedge...

  4. 78 FR 15645 - Mandatory Country of Origin Labeling of Beef, Pork, Lamb, Chicken, Goat Meat, Wild and Farm...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-12

    ... Service 7 CFR Parts 60 and 65 [Document No. AMS-LS-13-0004] RIN 0581-AD29 Mandatory Country of Origin... (AMS), USDA. ACTION: Proposed rule. SUMMARY: This proposed rule would amend the Country of Origin....regulations.gov or at the above address during regular business hours. Comments submitted in response to this...

  5. Measuring the EMS patient access time interval and the impact of responding to high-rise buildings.

    PubMed

    Morrison, Laurie J; Angelini, Mark P; Vermeulen, Marian J; Schwartz, Brian

    2005-01-01

    To measure the patient access time interval and characterize its contribution to the total emergency medical services (EMS) response time interval; to compare the patient access time intervals for patients located three or more floors above ground with those less than three floors above or below ground, and specifically in the apartment subgroup; and to identify barriers that significantly impede EMS access to patients in high-rise apartments. An observational study of all patients treated by an emergency medical technician-paramedic (EMT-P) crew was conducted using a trained independent observer to collect time intervals and identify potential barriers to access. Of 118 observed calls, 25 (21%) originated from patients three or more floors above ground. The overall median and 90th percentile (95% confidence interval) patient access time intervals were 1.61 (1.27, 1.91) and 3.47 (3.08, 4.05) minutes, respectively. The median interval was 2.73 (2.22, 3.03) minutes among calls from patients located three or more stories above ground compared with 1.25 (1.07, 1.55) minutes among those at lower levels. The patient access time interval represented 23.5% of the total EMS response time interval among calls originating less than three floors above or below ground and 32.2% of those located three or more stories above ground. The most frequently encountered barriers to access included security code entry requirements, lack of directional signs, and inability to fit the stretcher into the elevator. The patient access time interval is significantly long and represents a substantial component of the total EMS response time interval, especially among ambulance calls originating three or more floors above ground. A number of barriers appear to contribute to delayed paramedic access.

  6. Environmental Assessment for the Tula Peak Road Intersection

    DTIC Science & Technology

    2009-07-01

    essential habitat and as outlined in the Storm Water Pollution Prevention Plan (SWPPP). Long term monitoring will be conducted until the soil is...went beyond the evaluated impact. The actual breach of policy was not discovered until a routine storm water inspection questioned the inadequacy of...project called for disturbing less than one acre, no SWPPP was originally called for. Upon inspection by the Holloman AFB Storm Water manager, the

  7. Airline competition : issues raised by consolidation proposals

    DOT National Transportation Integrated Search

    2001-02-01

    In May 2000, United Airlines (United) proposed to acquire US Airways and divest part of those assets to create a new airline to be called DC Air. More recently, American Airlines (American) has proposed to purchase Trans World Airlines (TWA), along w...

  8. A fuzzy call admission control scheme in wireless networks

    NASA Astrophysics Data System (ADS)

    Ma, Yufeng; Gong, Shenguang; Hu, Xiulin; Zhang, Yunyu

    2007-11-01

    Scarcity of the spectrum resource and the mobility of users make quality of service (QoS) provision a critical issue in wireless networks. This paper presents a fuzzy call admission control scheme to meet QoS requirements. A performance measure is formed as a weighted linear function of the new call and handoff call blocking probabilities. Simulation compares the proposed fuzzy scheme with an adaptive channel reservation scheme. Simulation results show that the fuzzy scheme has more robust performance in terms of the average blocking criterion.
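
    A weighted linear blocking measure of this kind can be sketched in a few lines. The weights below (penalizing handoff drops more heavily than blocked new calls) and the guard-channel admission rule used for comparison are illustrative assumptions, not taken from the paper:

    ```python
    def blocking_cost(p_new_block, p_handoff_block, w_new=1.0, w_handoff=10.0):
        """Weighted linear performance measure over the two blocking
        probabilities; dropping a handoff call is usually considered worse
        than blocking a new call, hence the larger (assumed) weight."""
        return w_new * p_new_block + w_handoff * p_handoff_block

    def admit(call_type, free_channels, guard=2):
        """A simple (hypothetical) guard-channel rule of the kind the fuzzy
        scheme is compared against: handoff calls may use every free channel,
        new calls only the channels beyond the reserved guard band."""
        if call_type == "handoff":
            return free_channels > 0
        return free_channels > guard
    ```

    A controller then tunes its admission decisions (here, the guard size; in the paper, fuzzy rules) to minimize the weighted cost.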

  9. Retrospective analysis of phone queries to an epilepsy clinic hotline.

    PubMed

    Laforme, Anny; Jubinville, Suzie; Gravel, Micheline; Cossette, Patrick; Nguyen, Dang K

    2014-01-01

    We undertook a retrospective study of 5,189 telephone calls made between January 2004 and June 2011 through our adult epilepsy clinic hotline to a single epileptologist initially and two epileptologists from June 2010 onwards. The majority of calls were made by patients themselves (72%), followed by family members (16%) and health care providers (11%). Half of the calls originated from outside the city limits. Most were related to medication (25%), notification of seizures (23%), appointments or tests (12%), and side effects (9%). Half of the workload was generated by 10% of patients. The hotline service appears to respond to needs, with most calls requiring rapid intervention. It is desirable to develop novel approaches to address the needs of high-frequency callers.

  10. Measures to facilitate the reintegration of returning migrant workers: international experiences.

    PubMed

    Lohrmann, R

    1988-06-01

    Bilateral and multilateral measures implemented to assist migrants who return to their country of origin have been designed to respond to a number of different but specific situations. 2 bilateral agreements are briefly described: 1) an agreement between the Federal Republic of Germany and the Republic of Turkey signed in the early 1970s, and 2) an agreement between France and Algeria signed in 1980. 3 different types of multilateral activities are described: 1) the operation of the so-called Return of Talent program by the Intergovernmental Committee for Migration, 2) the Transfer of Know-how Through Expatriate Nationals program of the UN Development Programme, and 3) the elaboration of a model machinery on return migration by the Organization for Economic Cooperation and Development. While the 1st 2 activities are operational programs, by which annually between 1000-2000 professionals are assisted in their permanent return to or temporary sojourn in their developing countries of origin, with the financial support of both the developed and the developing countries concerned, the 3rd initiative is a conceptual effort aimed at assisting governments to implement policy measures designed to make return migration commensurate with national development goals. 3 recent proposals include 1) the proposal for an international labor compensatory facility, 2) an international fund for vocational training, and 3) an international fund for manpower resources. A common factor shared by all these programs is that they have all involved on 1 side industrial receiving countries which feel themselves obliged to observe a number of principles guaranteed by law and which govern employment conditions and working relations. The reintegration measures implemented or proposed in cooperation with them have been adopted in full consideration of the prevailing standards of these countries, as different as they may be from 1 country to another. 
A common consideration has been that the returning migrant should reintegrate in his country of origin as far as possible in conditions allowing the returnee to attain self-sufficiency and social security coverage. However, this underlying context does not necessarily prevail in all world regions where different forms of labor migration take place. Therefore the measures experienced in the relationship of specific countries cannot be easily copied for implementation in other countries. Multilateral measures benefited a rather limited number of individuals only, in many instances skilled and highly skilled migrants.

  11. Binary Multidimensional Scaling for Hashing.

    PubMed

    Huang, Yameng; Lin, Zhouchen

    2017-10-04

    Hashing is a useful technique for fast nearest neighbor search due to its low storage cost and fast query speed. Unsupervised hashing aims at learning binary hash codes for the original features so that the pairwise distances can be best preserved. While several works have targeted this task, the results are not satisfactory, mainly due to oversimplified models. In this paper, we propose a unified and concise unsupervised hashing framework, called Binary Multidimensional Scaling (BMDS), which is able to learn hash codes for distance preservation in both batch and online modes. In the batch mode, unlike most existing hashing methods, we do not need to simplify the model by predefining the form of the hash map. Instead, we learn the binary codes directly, based on the pairwise distances among the normalized original features, by Alternating Minimization. This enables a stronger expressive power of the hash map. In the online mode, we consider the holistic distance relationship between the current query example and those we have already learned, rather than focusing only on the current data chunk. This is useful when the data come in a streaming fashion. Empirical results show that, while being efficient for training, our algorithm outperforms state-of-the-art methods by a large margin in terms of distance preservation, which is practical for real-world applications.
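
    The batch-mode idea, fitting binary codes so that (scaled) Hamming distances match the original pairwise distances, can be illustrated with a toy greedy bit-flip descent. This is only a sketch of the distance-preservation objective; it is not the paper's alternating-minimization algorithm, and every name below is hypothetical:

    ```python
    import math
    import random

    def pairwise_dist(X):
        """Euclidean distance matrix of a list of feature vectors."""
        n = len(X)
        return [[math.sqrt(sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
                 for j in range(n)] for i in range(n)]

    def loss(B, D, scale):
        """Squared error between original distances and scaled Hamming distances."""
        n = len(B)
        total = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                ham = sum(bi != bj for bi, bj in zip(B[i], B[j]))
                total += (D[i][j] - scale * ham) ** 2
        return total

    def bmds_sketch(X, nbits=8, iters=20, seed=0):
        """Greedy coordinate descent on the bits: keep a flip only if it
        lowers the distance-preservation loss."""
        rng = random.Random(seed)
        n = len(X)
        D = pairwise_dist(X)
        dmax = max(max(row) for row in D) or 1.0
        scale = dmax / nbits  # map the full Hamming range onto the distance range
        B = [[rng.choice([0, 1]) for _ in range(nbits)] for _ in range(n)]
        for _ in range(iters):
            improved = False
            for i in range(n):
                for k in range(nbits):
                    before = loss(B, D, scale)
                    B[i][k] ^= 1  # tentatively flip one bit
                    if loss(B, D, scale) < before:
                        improved = True
                    else:
                        B[i][k] ^= 1  # revert a non-improving flip
            if not improved:
                break
        return B
    ```

    Because only improving flips are kept, the loss is non-increasing; real BMDS replaces this brute-force inner loop with a much more efficient alternating update.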

  12. Heat extraction and refrigeration (HEAR) system. Phase I final progress report. [Restaurant kitchens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venable, B.M.

    1983-01-01

    Testing indicates that heat energy available to be recaptured grossly exceeds the capacity of the 1.5 ton medium temperature Freon 12 compressor being utilized. The unit produced 50 pounds of suction pressure with the damper (Figure 4) open and exceeded compressor operational limits with the damper closed. This indicates that the current compressor could be replaced by one of 5 ton capacity since current estimates indicate that 60,000 Btu's are available for recovery. This could be divided between space heating and water heating as required by using separate condensers. There were no real surprises in the feasibility model construction and test phase, and the validity of the assumptions made in the original project description has been established. That is, it has been demonstrated that it is feasible to extract heat from the kitchen exhaust duct in a restaurant and keep the heat pump evaporator clean. It is concluded that work done under this $10,000 grant demonstrated the technical feasibility of the HEAR System. However, additional funding (our original proposal called for a $47,000 grant) would be required to economically evaluate the benefit realized and to advance the HEAR System design to a workable prototype stage.

  13. The ξ/ξ2nd ratio as a test for Effective Polyakov Loop Actions

    NASA Astrophysics Data System (ADS)

    Caselle, Michele; Nada, Alessandro

    2018-03-01

    Effective Polyakov line actions are a powerful tool to study the finite temperature behaviour of lattice gauge theories. They are much simpler to simulate than the original (3+1) dimensional LGTs and are affected by a milder sign problem. However, it is not clear to what extent they really capture the rich spectrum of the original theories, a feature that is of great importance if one aims to address the sign problem. We propose here a simple way to address this issue based on the so-called second moment correlation length ξ2nd. The ratio ξ/ξ2nd between the exponential correlation length and the second moment one is equal to 1 if only a single mass is present in the spectrum, and becomes larger and larger as the complexity of the spectrum increases. Since both ξ and ξ2nd are easy to measure on the lattice, this is an economical and effective way to keep track of the spectrum of the theory. In this respect, we show using both numerical simulations and effective string calculations that this ratio increases dramatically as the temperature decreases. This non-trivial behaviour should be reproduced by the Polyakov loop effective action.
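The diagnostic is easy to reproduce on a toy correlator. Below is a hedged numerical sketch using continuum-style moments on a fine grid (lattice work uses a discrete analogue, and the authors' conventions may differ): for a single exponential the ratio comes out 1, and adding a second, heavier state pushes it above 1.

```python
import numpy as np

def xi_ratio(amplitudes, xis, r_max=200.0, n=200001):
    """ξ/ξ_2nd for a toy correlator G(r) = Σ_m c_m exp(-r/ξ_m)."""
    r = np.linspace(0.0, r_max, n)
    G = sum(c * np.exp(-r / xi) for c, xi in zip(amplitudes, xis))
    # Second-moment length: ξ_2nd² = ∫ r² G dr / (2 ∫ G dr);
    # the uniform grid spacing cancels in the ratio of sums.
    xi2nd = np.sqrt((r**2 * G).sum() / (2.0 * G.sum()))
    return max(xis) / xi2nd        # exponential length = largest ξ_m

print(xi_ratio([1.0], [5.0]))            # single mass: ratio ≈ 1
print(xi_ratio([1.0, 1.0], [5.0, 1.0]))  # richer spectrum: ratio > 1
```

The heavier state lowers ξ2nd (it weights the short-distance part of the correlator) while leaving the exponential length untouched, which is exactly why the ratio tracks the complexity of the spectrum.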

  14. Sequence analyses reveal that a TPR–DP module, surrounded by recombinable flanking introns, could be at the origin of eukaryotic Hop and Hip TPR–DP domains and prokaryotic GerD proteins

    PubMed Central

    Papandreou, Nikolaos; Chomilier, Jacques

    2008-01-01

    The co-chaperone Hop [heat shock protein (HSP) organising protein] is known to bind both Hsp70 and Hsp90. Hop comprises three repeats of a tetratricopeptide repeat (TPR) domain, each consisting of three TPR motifs. The first and last TPR domains are followed by a domain containing several dipeptide (DP) repeats called the DP domain. These analyses suggest that the hop genes result from successive recombination events of an ancestral TPR–DP module. From a hydrophobic cluster analysis of homologous Hop protein sequences derived from gene families, we can postulate that shifts in the open reading frames are at the origin of the present sequences. Moreover, these shifts can be related to the presence or absence of biological function. We propose to extend the family of Hop co-chaperones into the kingdom of bacteria, as several structurally related genes have been identified by hydrophobic cluster analysis. We also provide evidence of common structural characteristics between hop and hip genes, suggesting a shared precursor of ancestral TPR–DP domains. Electronic supplementary material The online version of this article (doi:10.1007/s12192-008-0083-8) contains supplementary material, which is available to authorized users. PMID:18987995

  15. Dimensionality reduction of collective motion by principal manifolds

    NASA Astrophysics Data System (ADS)

    Gajamannage, Kelum; Butail, Sachit; Porfiri, Maurizio; Bollt, Erik M.

    2015-01-01

    While the existence of low-dimensional embedding manifolds has been shown in patterns of collective motion, the current battery of nonlinear dimensionality reduction methods is not amenable to the analysis of such manifolds. This is mainly due to the necessary spectral decomposition step, which limits control over the mapping from the original high-dimensional space to the embedding space. Here, we propose an alternative approach that demands a two-dimensional embedding which topologically summarizes the high-dimensional data. In this sense, our approach is closely related to the construction of one-dimensional principal curves that minimize orthogonal error to data points subject to smoothness constraints. Specifically, we construct a two-dimensional principal manifold directly in the high-dimensional space using cubic smoothing splines, and define the embedding coordinates in terms of geodesic distances. Thus, the mapping from the high-dimensional data to the manifold is defined in terms of local coordinates. Through representative examples, we show that compared to existing nonlinear dimensionality reduction methods, the principal manifold retains the original structure even in noisy and sparse datasets. The principal manifold finding algorithm is applied to configurations obtained from a dynamical system of multiple agents simulating a complex maneuver called predator mobbing, and the resulting two-dimensional embedding is compared with that of a well-established nonlinear dimensionality reduction method.
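A one-dimensional analogue of the manifold step can be sketched with SciPy: fit a cubic smoothing spline through noisy samples of a curve and use arc length along the fit as the embedding coordinate, a stand-in for the paper's geodesic distances. The data, noise level and smoothing factor `s` below are illustrative choices, not the authors' setup:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0 * np.pi, 200)
x = np.sort(t + rng.normal(0.0, 0.02, t.size))  # increasing abscissa
y = np.sin(x) + rng.normal(0.0, 0.05, x.size)   # noisy curve samples

spl = UnivariateSpline(x, y, k=3, s=1.0)        # cubic smoothing spline
xs = np.linspace(x.min(), x.max(), 2000)
ys = spl(xs)
# Cumulative arc length along the fitted curve as embedding coordinate.
arc = np.concatenate([[0.0], np.cumsum(np.hypot(np.diff(xs), np.diff(ys)))])
print(round(float(arc[-1]), 2))                 # total length of the fit
```

The same idea, extended to a two-dimensional spline surface fitted directly in the high-dimensional space, is what lets the paper avoid the spectral decomposition step of standard nonlinear dimensionality reduction.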

  16. Implementing a new EPR lineshape parameter for organic radicals in carbonaceous matter.

    PubMed

    Bourbin, Mathilde; Du, Yann Le; Binet, Laurent; Gourier, Didier

    2013-07-17

    Electron Paramagnetic Resonance (EPR) is a non-destructive, non-invasive technique useful for the characterization of organic moieties in primitive carbonaceous matter related to the origin of life. The classical EPR parameters are the peak-to-peak amplitude, the linewidth and the g factor; however, such parameters turn out not to suffice to fully determine a single EPR line. In this paper, we give the definition and practical implementation of a new EPR parameter based on the signal shape that we call the R10 factor. This parameter was originally defined in the case of a single symmetric EPR line and used as a new dating method for organic matter in the field of exobiology. Combined with the classical EPR parameters, the proposed shape parameter provides a full description of an EPR spectrum and opens the way to novel applications such as dating. Such a parameter is a powerful tool for future EPR studies, not only of carbonaceous matter, but also of any substance whose spectrum exhibits a single symmetric line. The paper is a literate program (written using Noweb within Org-mode, as provided by the Emacs editor) and it also describes the full data analysis pipeline that computes the R10 factor on a real EPR spectrum.

  17. Compositional Models of Hematite-Rich Spherules (Blueberries) at Meridiani Planum, Mars and Constraints on Their Formation

    NASA Astrophysics Data System (ADS)

    Schneider, A.; Mittlefehldt, D.

    2006-10-01

    The Mars Exploration Rover Opportunity discovered hematite-rich spherules ("blueberries") believed to be diagenetic concretions formed in the bedrock in stagnant or slow-moving groundwater. These spherules likely precipitated from solution, but their origins are poorly understood. Three formation mechanisms are possible: inclusive, replacive and displacive. The first would result in a distinct spherule composition compared to the other two. We propose that chemical clues may help to constrain the nature of blueberry formation. We used Alpha Particle X-ray Spectrometer data for undisturbed soils that were blueberry-free and with visible blueberries at the surface in Microscopic Imager images. We made plots of the elements versus iron for the spherule-rich soils and compared them to a mixing line representative of a pure hematite end member spherule (called "the zero model"). This modeled the replacive formation mechanism, in which pure hematite would replace all of the original material. If the spherules grew inclusively, chemical data should reflect a compositional component of the rock grains included during formation. Four models were developed to test for possible compositions of a rock component. These models could not easily explain the APXS data and thus demonstrate that the most plausible rock compositions are not components of blueberries.
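The "zero model" mixing line can be illustrated with a simple two-component calculation. The soil composition below is a hypothetical placeholder, not APXS data; pure hematite (Fe2O3) is about 69.9 wt% Fe:

```python
import numpy as np

# Two-component mixing toward a pure-hematite end member ("zero model"):
# as the spherule mass fraction f grows, every other element is diluted
# linearly while Fe climbs toward the end-member value.
soil = {"Fe": 15.0, "Si": 21.0}   # wt%, hypothetical blueberry-free soil
hematite_fe = 69.9                # wt% Fe in pure Fe2O3
f = np.linspace(0.0, 1.0, 5)      # spherule mass fraction in the mix
fe = (1 - f) * soil["Fe"] + f * hematite_fe
si = (1 - f) * soil["Si"]         # pure-hematite end member carries no Si
print(np.round(fe, 2).tolist())
print(np.round(si, 2).tolist())
```

Plotting each element against Fe for such mixtures gives a straight line to the end member; spherule-rich soils falling off that line would point to a rock component inside the spherules, i.e. inclusive rather than replacive growth.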

  18. CT to Cone-beam CT Deformable Registration With Simultaneous Intensity Correction

    PubMed Central

    Zhen, Xin; Gu, Xuejun; Yan, Hao; Zhou, Linghong; Jia, Xun; Jiang, Steve B.

    2012-01-01

    Computed tomography (CT) to cone-beam computed tomography (CBCT) deformable image registration (DIR) is a crucial step in adaptive radiation therapy. Current intensity-based registration algorithms, such as demons, may fail in the context of CT-CBCT DIR because of inconsistent intensities between the two modalities. In this paper, we propose a variant of demons, called Deformation with Intensity Simultaneously Corrected (DISC), to deal with CT-CBCT DIR. DISC distinguishes itself from the original demons algorithm by performing an adaptive intensity correction step on the CBCT image at every iteration step of the demons registration. Specifically, the intensity correction of a voxel in CBCT is achieved by matching the first and the second moments of the voxel intensities inside a patch around the voxel with those on the CT image. It is expected that such a strategy can remove artifacts in the CBCT image as well as ensure intensity consistency between the two modalities. DISC is implemented on computer graphics processing units (GPUs) in the compute unified device architecture (CUDA) programming environment. The performance of DISC is evaluated on a simulated patient case and on data from six clinical head-and-neck cancer patients. It is found that DISC is robust against the CBCT artifacts and intensity inconsistency and significantly improves the registration accuracy when compared with the original demons. PMID:23032638
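The per-voxel correction step (matching the first and second moments of a local patch to the CT image) can be sketched in two dimensions as follows; the patch radius, epsilon, and the affine test images are illustrative, not from the published GPU implementation:

```python
import numpy as np

def moment_match(cbct, ct, i, j, half=3, eps=1e-6):
    """Correct one CBCT voxel by matching local patch mean/std to CT.

    A sketch of the moment-matching idea behind DISC; `half` sets the
    patch radius and `eps` guards against a zero standard deviation.
    """
    sl = np.s_[max(i - half, 0):i + half + 1, max(j - half, 0):j + half + 1]
    mu_b, sd_b = cbct[sl].mean(), cbct[sl].std()
    mu_t, sd_t = ct[sl].mean(), ct[sl].std()
    return (cbct[i, j] - mu_b) / (sd_b + eps) * sd_t + mu_t

# If CBCT were CT under a local affine intensity change, the correction
# recovers the CT value almost exactly.
rng = np.random.default_rng(0)
ct = rng.normal(0.0, 50.0, size=(32, 32))
cbct = 2.0 * ct + 100.0
err = abs(moment_match(cbct, ct, 16, 16) - ct[16, 16])
print(err)
```

Applying this at every voxel before each demons iteration is what keeps the intensity assumption of demons approximately valid across the two modalities.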

  19. 'Reverse Chemical Evolution': A New Method to Search for Thermally Stable Biopolymers

    NASA Astrophysics Data System (ADS)

    Mitsuzawa, Shigenobu; Yukawa, Tetsuyuki

    2003-04-01

    The primitive sea on Earth may have had high-temperature and high-pressure conditions similar to those in present-day hydrothermal environments. If life originated in the hot sea, thermal stability of the constituent molecules would have been necessary. Thus far, however, it has been reported that biopolymers hydrolyze too rapidly to support life at temperatures of more than 200 °C. We herein propose a novel approach, called reverse chemical evolution, to search for biopolymers notably more stable against thermal decomposition than previously reported. The essence of the approach is that hydrolysis of a protein or functional RNA (m-, t-, r-RNA) at high temperature and high pressure simulating the ancient sea environment may yield thermally stable peptides or RNAs at higher concentrations than other peptides or RNAs. An experimental test hydrolyzing bovine ribonuclease A in aqueous solution at 205 °C and 25 MPa yielded three prominently stable molecules weighing 859, 1030 and 695 Da. They are some tens to hundreds of times more thermally stable than a polyglycine of comparable mass. Sequence analyses of the 859- and 1030-Da molecules revealed that they are a heptapeptide and its homologue, respectively, elongated by two amino acids at the N-terminal region, originally embedded as residues 112-120 in the protein. They consist mainly of hydrophobic amino acids.

  20. 'The Wildest Speculation of All': Lemaître and the Primeval-Atom Universe

    NASA Astrophysics Data System (ADS)

    Kragh, Helge

    Although there is no logical connection between the expanding universe and the idea of a big bang, from a historical perspective the two concepts were intimately connected. Four years after his pioneering work on the expanding universe, Lemaître suggested that the entire universe had originated in a kind of explosive act from what he called a primeval atom and which he likened to a huge atomic nucleus. His theory of 1931 was the first realistic finite-age model based upon relativistic cosmology, but it presupposed a material proto-universe and thus avoided an initial singularity. What were the sources of Lemaître's daring proposal? Well aware that his new cosmological model needed to have testable consequences, he argued that the cosmic rays were fossils of the original radioactive explosion. However, this hypothesis turned out to be untenable. The first big-bang model ever was received with a mixture of indifference and hostility. Why? The answer is not that contemporary cosmologists failed to recognize Lemaître's genius, but rather that his model was scientifically unconvincing. Although Lemaître was indeed the father of big-bang cosmology, his brilliant idea was only turned into a viable cosmological theory by later physicists.

  1. Interactive visual exploration and analysis of origin-destination data

    NASA Astrophysics Data System (ADS)

    Ding, Linfang; Meng, Liqiu; Yang, Jian; Krisp, Jukka M.

    2018-05-01

    In this paper, we propose a visual analytics approach for the exploration of spatiotemporal interaction patterns of massive origin-destination data. Firstly, we visually query the movement database for data at certain time windows. Secondly, we conduct interactive clustering to allow the users to select input variables/features (e.g., origins, destinations, distance, and duration) and to adjust clustering parameters (e.g. distance threshold). The agglomerative hierarchical clustering method is applied for the multivariate clustering of the origin-destination data. Thirdly, we design a parallel coordinates plot for visualizing the precomputed clusters and for further exploration of interesting clusters. Finally, we propose a gradient line rendering technique to show the spatial and directional distribution of origin-destination clusters on a map view. We implement the visual analytics approach in a web-based interactive environment and apply it to real-world floating car data from Shanghai. The experiment results show the origin/destination hotspots and their spatial interaction patterns. They also demonstrate the effectiveness of our proposed approach.
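The interactive clustering step can be approximated offline with SciPy's agglomerative hierarchy and a distance threshold. The 4-D trip features and the threshold below are illustrative stand-ins, not the paper's Shanghai floating-car data or settings:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Toy origin-destination records: one row [o_x, o_y, d_x, d_y] per trip.
# Two synthetic flows between hypothetical hubs.
rng = np.random.default_rng(42)
flow_ab = rng.normal([0.0, 0.0, 10.0, 10.0], 0.5, size=(20, 4))
flow_cd = rng.normal([10.0, 0.0, 0.0, 10.0], 0.5, size=(20, 4))
trips = np.vstack([flow_ab, flow_cd])

Z = linkage(trips, method="average")                # agglomerative hierarchy
labels = fcluster(Z, t=5.0, criterion="distance")   # user-set distance threshold
print(sorted(set(labels.tolist())))
```

In the interactive setting described in the paper, the feature subset (origins, destinations, distance, duration) and the threshold `t` are exactly the knobs the user adjusts before inspecting clusters in the parallel coordinates plot and map view.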

  2. Analysis of preplate splitting and early cortical development illuminates the biology of neurological disease.

    PubMed

    Olson, Eric C

    2014-01-01

    The development of the layered cerebral cortex starts with a process called preplate splitting. Preplate splitting involves the establishment of prospective cortical layer 6 (L6) neurons within a plexus of pioneer neurons called the preplate. The forming layer 6 splits the preplate into a superficial layer of pioneer neurons called the marginal zone and a deeper layer of pioneer neurons called the subplate. Disruptions of this early developmental event by toxin exposure or mutation are associated with neurological disease including severe intellectual disability. This review explores recent findings that reveal the dynamism of gene expression and morphological differentiation during this early developmental period. Over 1000 genes show expression increases of ≥2-fold during this period in differentiating mouse L6 neurons. Surprisingly, 88% of previously identified non-syndromic intellectual-disability (NS-ID) genes are expressed at this time and show an average expression increase of 1.6-fold in these differentiating L6 neurons. This changing genetic program must, in part, support the dramatic cellular reorganizations that occur during preplate splitting. While different models have been proposed for the formation of a layer of L6 cortical neurons within the preplate, original histological studies and more recent work exploiting transgenic mice suggest that the process is largely driven by the coordinated polarization and coalescence of L6 neurons rather than by cellular translocation or migration. The observation that genes associated with forms of NS-ID are expressed during very early cortical development raises the possibility of studying the relevant biological events at a time point when the cortex is small, contains relatively few cell types, and few functional circuits. This review then outlines how explant models may prove particularly useful in studying the consequence of toxin and mutation on the etiology of some forms of NS-ID.

  3. The Biological Implausibility of the Nature-Nurture Dichotomy and What It Means for the Study of Infancy

    ERIC Educational Resources Information Center

    Lewkowicz, David J.

    2011-01-01

    Since the time of the Greeks, philosophers and scientists have wondered about the origins of structure and function. Plato proposed that the origins of structure and function lie in the organism's nature whereas Aristotle proposed that they lie in its nurture. This nature-nurture dichotomy and the emphasis on the origins question has had a…

  4. 75 FR 39613 - Request for Proposals To Accelerate Tariff Elimination and Modify the Rules of Origin Under the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-09

    ... origin. Section 202(o)(2) of the USCFTA Act authorizes the President to proclaim modifications to the..., implementing, and monitoring modified rules of origin; and (3) the level and breadth of interest that..., but proposals may also be submitted at the 6, 4, or 2 digit level where the intent is to cover all...

  5. Compliance with telephone triage advice among adults aged 45 years and older: an Australian data linkage study.

    PubMed

    Tran, Duong Thuy; Gibson, Amy; Randall, Deborah; Havard, Alys; Byrne, Mary; Robinson, Maureen; Lawler, Anthony; Jorm, Louisa R

    2017-08-01

    Middle-aged and older patients are prominent users of telephone triage services for timely access to health information and appropriate referrals. Non-compliance with advice to seek appropriate care could potentially lead to poorer health outcomes among those patients. It is imperative to assess the extent to which middle-aged and older patients follow triage advice and how this varies according to their socio-demographic, lifestyle and health characteristics as well as features of the call. Records of calls to the Australian healthdirect helpline (July 2008-December 2011) were linked to baseline questionnaire data from the 45 and Up Study (participants age ≥ 45 years), records of emergency department (ED) presentations, hospital admissions, and medical consultation claims. Outcomes of the call included compliance with the advice "Attend ED immediately"; "See a doctor (immediately, within 4 hours, or within 24 hours)"; "Self-care"; and self-referral to ED or hospital within 24 h when given self-care or low-urgency care advice. Multivariable logistic regression was used to investigate associations between call outcomes and patient and call characteristics. This study included 8406 adults (age ≥ 45 years) who were subjects of 11,088 calls to the healthdirect helpline. Rates of compliance with the advice "Attend ED immediately", "See a doctor" and "Self-care" were 68.6%, 64.6% and 77.5% respectively, while self-referral to ED within 24 h followed 7.0% of calls. Compliance with the advice "Attend ED immediately" was higher among patients who had three or more positive lifestyle behaviours, called after-hours, or stated that their original intention was to attend ED, while it was lower among those who lived in rural and remote areas or reported high or very high levels of psychological distress. 
Compliance with the advice "See a doctor" was higher in patients who were aged ≥65 years, worked full-time, or lived in socio-economically advantaged areas, when another person made the call on the patient's behalf, and when the original intention was to seek care from an ED or a doctor. It was lower among patients in rural and remote areas and those taking five medications or more. Patients aged ≥65 years were less likely to comply with the advice "Self-care". The rates of self-referral to ED within 24 h were greater in patients from disadvantaged areas, among calls made after-hours or by another person, and when the original intention was to attend ED. Patients who were given a self-care or low-urgency care advice, whose calls concerned bleeding, cardiac, gastrointestinal, head and facial injury symptoms, were more likely to self-refer to ED. Compliance with telephone triage advice among middle-age and older patients varied substantially according to both patient- and call-related factors. Knowledge about the patients who are less likely to comply with telephone triage advice, and about characteristics of calls that may influence compliance, will assist in refining patient triage protocols and referral pathways, training staff and tailoring service design and delivery to achieve optimal patient compliance.

  6. 75 FR 63040 - Airworthiness Directives; McDonnell Douglas Corporation Model DC-10-10, DC-10-10F, DC-10-30, DC...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... transients to the fuel quantity indication system, which could cause voltage levels to go beyond original..., which could cause voltage levels to go beyond original design levels between fuel tank probes and... this material at an NARA facility, call 202-741-6030, or go to http://www.archives.gov/federal_register...

  7. Scientific and Religious Beliefs about the Origin of Life and Life after Death: Validation of a Scale

    ERIC Educational Resources Information Center

    Bautista, Jesús Silva; Escobar, Venazir Herrera; Miranda, Rodolfo Corona

    2017-01-01

    The variety of explanations to questions about the origin of life, life after death or about the role itself of being in the world are built on the rational reflection that integrates the ideology of human beings as well as less rational practices and more emotional ones than in the whole nourish what has been called "beliefs".…

  8. Cancer Stem Cell Theory and the Warburg Effect, Two Sides of the Same Coin?

    PubMed Central

    Pacini, Nicola; Borziani, Fabio

    2014-01-01

    Over the last 100 years, many studies have been performed to determine the biochemical and histopathological phenomena that mark the origin of neoplasms. At the end of the last century, the leading paradigm, which is currently well rooted, considered the origin of neoplasms to be a set of genetic and/or epigenetic mutations, stochastic and independent in a single cell, or rather, a stochastic monoclonal pattern. However, in the last 20 years, two important areas of research have underlined numerous limitations and incongruities of this pattern, the hypothesis of the so-called cancer stem cell theory and a revaluation of several alterations in metabolic networks that are typical of the neoplastic cell, the so-called Warburg effect. Even if this specific “metabolic sign” has been known for more than 85 years, only in the last few years has it been given more attention; therefore, the so-called Warburg hypothesis has been used in multiple and independent surveys. Based on an accurate analysis of a series of considerations and of biophysical thermodynamic events in the literature, we will demonstrate a homogeneous pattern of the cancer stem cell theory, of the Warburg hypothesis and of the stochastic monoclonal pattern; this pattern could contribute considerably as the first basis of the development of a new uniform theory on the origin of neoplasms. Thus, a new possible epistemological paradigm is represented; this paradigm considers the Warburg effect as a specific “metabolic sign” reflecting the stem origin of the neoplastic cell, where, in this specific metabolic order, an essential reason for the genetic instability that is intrinsic to the neoplastic cell is defined. PMID:24857919

  9. Cancer stem cell theory and the warburg effect, two sides of the same coin?

    PubMed

    Pacini, Nicola; Borziani, Fabio

    2014-05-19

    Over the last 100 years, many studies have been performed to determine the biochemical and histopathological phenomena that mark the origin of neoplasms. At the end of the last century, the leading paradigm, which is currently well rooted, considered the origin of neoplasms to be a set of genetic and/or epigenetic mutations, stochastic and independent in a single cell, or rather, a stochastic monoclonal pattern. However, in the last 20 years, two important areas of research have underlined numerous limitations and incongruities of this pattern, the hypothesis of the so-called cancer stem cell theory and a revaluation of several alterations in metabolic networks that are typical of the neoplastic cell, the so-called Warburg effect. Even if this specific "metabolic sign" has been known for more than 85 years, only in the last few years has it been given more attention; therefore, the so-called Warburg hypothesis has been used in multiple and independent surveys. Based on an accurate analysis of a series of considerations and of biophysical thermodynamic events in the literature, we will demonstrate a homogeneous pattern of the cancer stem cell theory, of the Warburg hypothesis and of the stochastic monoclonal pattern; this pattern could contribute considerably as the first basis of the development of a new uniform theory on the origin of neoplasms. Thus, a new possible epistemological paradigm is represented; this paradigm considers the Warburg effect as a specific "metabolic sign" reflecting the stem origin of the neoplastic cell, where, in this specific metabolic order, an essential reason for the genetic instability that is intrinsic to the neoplastic cell is defined.

  10. 78 FR 38750 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-27

    ... (also known as origin code) refers to the participant types listed in Rule 1080.08(b) and Rule 1000(b..., and, therefore, is referring to the participant origin codes in Rule 1080.08(b) only. The proposed...-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing of Proposed Rule Change Relating to Which...

  11. Products purchased from family farming for school meals in the cities of Rio Grande do Sul

    PubMed Central

    Ferigollo, Daniele; Kirsten, Vanessa Ramos; Heckler, Dienifer; Figueredo, Oscar Agustín Torres; Perez-Cassarino, Julian; Triches, Rozane Márcia

    2017-01-01

    ABSTRACT OBJECTIVE This study aims to verify the adequacy profile of the cities of the State of Rio Grande do Sul, Brazil, in relation to the purchase of products of family farming by the Programa Nacional de Alimentação Escolar (PNAE - National Program of School Meals). METHODS This is a quantitative descriptive study, with secondary data analysis (public calls-to-bid). The sample consisted of approximately 10% (n = 52) of the cities in the State, establishing a representation by mesoregion and size of the population. We have assessed the percentage of food purchased from family farming, as well as the type of product, requirements of frequency, delivery points, and presence of prices in 114 notices of public calls-to-bid, in 2013. RESULTS Of the cities analyzed, 71.2% (n = 37) reached 30% of food purchased from family farming. Most public calls-to-bid demanded both products of plant (90.4%; n = 103) and animal origin (79.8%; n = 91). Regarding the degree of processing, fresh products appeared in 92.1% (n = 105) of the public calls-to-bid. In relation to the delivery of products, centralized (49.1%; n = 56) and weekly deliveries (47.4%; n = 54) were the most described. Only 60% (n = 68) of the public calls-to-bid contained the price of products. CONCLUSIONS Most of the cities analyzed have fulfilled what is determined by the legislation of the PNAE. We have found in the public calls-to-bid a wide variety of food, both of plant and animal origin, and most of it is fresh. In relation to the delivery of the products, the centralized and weekly options prevailed. PMID:28225910

  12. 75 FR 43995 - Advisory Committee on Immunization Practices (ACIP)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ...: The conference call will originate at the National Center for Immunization and Respiratory Diseases in.... CONTACT PERSON FOR MORE INFORMATION: Leola Mitchell, National Center for Immunization and Respiratory...

  13. Modification of the random forest algorithm to avoid statistical dependence problems when classifying remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Cánovas-García, Fulgencio; Alonso-Sarría, Francisco; Gomariz-Castillo, Francisco; Oñate-Valdivieso, Fernando

    2017-06-01

    Random forest is a classification technique widely used in remote sensing. One of its advantages is that it produces an estimation of classification accuracy based on the so-called out-of-bag cross-validation method. It is usually assumed that such estimation is not biased and may be used instead of validation based on an external data-set or a cross-validation external to the algorithm. In this paper we show that this is not necessarily the case when classifying remote sensing imagery using training areas with several pixels or objects. According to our results, out-of-bag cross-validation clearly overestimates accuracy, both overall and per class. The reason is that, in a training patch, pixels or objects are not independent (from a statistical point of view) of each other; however, they are split by bootstrapping into in-bag and out-of-bag as if they were really independent. We believe that putting whole patches, rather than pixels/objects, in one set or the other would produce a less biased out-of-bag cross-validation. To deal with the problem, we propose a modification of the random forest algorithm to split training patches instead of the pixels (or objects) that compose them. This modified algorithm does not overestimate accuracy and has no lower predictive capability than the original. When its results are validated with an external data-set, the accuracy is not different from that obtained with the original algorithm. We analysed three remote sensing images with different classification approaches (pixel and object based); in the three cases reported, the modification we propose produces a less biased accuracy estimation.
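The proposed modification (bootstrap whole training patches, so out-of-bag predictions are never made on pixels whose patch is partly in-bag) can be sketched with a manual bagging loop. The patch assignment and data here are synthetic stand-ins, not remote-sensing imagery, and this is not the authors' implementation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Bag *patches* (groups of pixels) instead of individual pixels, so that
# out-of-bag pixels never share a patch with in-bag ones.
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
patch_id = np.repeat(np.arange(40), 10)        # 40 patches of 10 "pixels"
patches = np.unique(patch_id)
rng = np.random.default_rng(0)

votes = np.zeros((len(y), 2))
for _ in range(50):                            # 50 trees
    boot = rng.choice(patches, size=len(patches), replace=True)
    in_bag = np.isin(patch_id, boot)
    tree = DecisionTreeClassifier(random_state=0).fit(X[in_bag], y[in_bag])
    oob = ~in_bag                              # whole patches held out
    if oob.any():
        pred = tree.predict(X[oob])
        votes[np.where(oob)[0], pred] += 1     # accumulate OOB votes

mask = votes.sum(axis=1) > 0
oob_acc = float((votes[mask].argmax(axis=1) == y[mask]).mean())
print(round(oob_acc, 3))
```

With truly correlated pixels inside each patch, per-pixel bootstrapping would leak patch information into the OOB set and inflate this estimate; patch-level bootstrapping removes that leak.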

  14. SHARP Demonstration Flight: Video Broadcast System for Research in Intelligent Flight Characterization and Control

    NASA Technical Reports Server (NTRS)

    Kitts, Christopher

    2001-01-01

    The NASA Ames Research Center (Thermal Protection Materials and Systems Branch) is investigating new ceramic materials for the thermal protection of atmospheric entry vehicles. An incremental approach to proving the capabilities of these materials calls for a lifting entry flight test of a sharp leading edge component on the proposed SHARP (Slender Hypervelocity Aerothermodynamic Research Probe) vehicle. This flight test will establish the aerothermal performance constraint under real lifting entry conditions. NASA Ames has been developing the SHARP test flight with SSDL (responsible for the SHARP S I vehicle avionics), Montana State University (responsible for the SHARP S I vehicle airframe), the Wickman Spacecraft and Propulsion Company (responsible for the sounding rocket and launch operations), and with the SCU Intelligent Robotics Program. The SCU team was added well after the rest of the development team had formed. The SCU role was to assist with the development of a real-time video broadcast system which would relay onboard flight video to a communication groundstation. The SCU team would also assist with general vehicle preparation as well as flight operations. At the time of the submission of the original SCU proposal, a test flight in Wyoming was targeted for September 2000. This date was moved several times into the Fall of 2000. It was then postponed until the Spring of 2001, and later pushed into late Summer 2001. To date, the flight has still not taken place. These project delays resulted in SCU requesting several no-cost extensions to the project. Based on the most recent conversations with the project technical lead, Paul Kolodjiez, the current plan is for the overall SHARP team to assemble what exists of the vehicle, to document the system, and to 'mothball' the vehicle in anticipation of future flight and funding opportunities.

  15. ξ/ξ2nd ratio as a tool to refine effective Polyakov loop models

    NASA Astrophysics Data System (ADS)

    Caselle, Michele; Nada, Alessandro

    2017-10-01

    Effective Polyakov line actions are a powerful tool to study the finite temperature behavior of lattice gauge theories. They are much simpler to simulate than the original lattice model and are affected by a milder sign problem, but it is not clear to which extent they really capture the rich spectrum of the original theories. We propose here a simple way to address this issue based on the so-called second moment correlation length ξ_2nd. The ratio ξ/ξ_2nd between the exponential correlation length and the second moment one is equal to 1 if only a single mass is present in the spectrum, and it becomes larger and larger as the complexity of the spectrum increases. Since both ξ and ξ_2nd are easy to measure on the lattice, this is a cheap and efficient way to keep track of the spectrum of the theory. As an example of the information one can obtain with this tool, we study the behavior of ξ/ξ_2nd in the confining phase of the (D = 3+1) SU(2) gauge theory and show that it is compatible with 1 near the deconfinement transition, but it increases dramatically as the temperature decreases. We also show that this increase can be well understood in the framework of an effective string description of the Polyakov loop correlator. This nontrivial behavior should be reproduced by the Polyakov loop effective action; thus, it represents a stringent and challenging test of existing proposals, and it may be used to fine-tune the couplings and to identify the range of validity of the approximations involved in their construction.
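
    The ratio test above can be illustrated numerically. The following sketch (illustrative only, not the authors' code; the correlator, masses, and amplitudes are made up) builds a toy two-point function as a sum of exponentials, extracts the exponential correlation length from the long-distance decay and the second-moment length from ξ_2nd² = μ₂/(2μ₀), and shows the ratio is ≈ 1 for a single state but grows once a second, heavier state contributes:

```python
import math

def correlator(x, modes):
    """Toy two-point function: a sum of exponentials, one per state.
    `modes` is a list of (amplitude, correlation length) pairs."""
    return sum(c * math.exp(-x / xi) for c, xi in modes)

def xi_exp(modes, x=50.0, dx=0.01):
    """Exponential correlation length from the long-distance decay rate."""
    return dx / math.log(correlator(x, modes) / correlator(x + dx, modes))

def xi_2nd(modes, dx=0.01, xmax=200.0):
    """Second-moment correlation length: xi_2nd^2 = mu2 / (2 * mu0)."""
    mu0 = mu2 = 0.0
    x = 0.0
    while x < xmax:          # simple Riemann sum over the half-line
        g = correlator(x, modes)
        mu0 += g * dx
        mu2 += x * x * g * dx
        x += dx
    return math.sqrt(mu2 / (2.0 * mu0))

single = [(1.0, 5.0)]               # one state: ratio should be ~1
two = [(1.0, 5.0), (3.0, 1.0)]      # add a heavier state: ratio grows
r1 = xi_exp(single) / xi_2nd(single)
r2 = xi_exp(two) / xi_2nd(two)
```

    The exponential length is always set by the lightest state, while the second moment mixes in all states, which is why the ratio is a cheap probe of spectral complexity.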

  16. Structure-adaptive CBCT reconstruction using weighted total variation and Hessian penalties

    PubMed Central

    Shi, Qi; Sun, Nanbo; Sun, Tao; Wang, Jing; Tan, Shan

    2016-01-01

    The exposure of normal tissues to high radiation during cone-beam CT (CBCT) imaging increases the risk of cancer and genetic defects. Statistical iterative algorithms with the total variation (TV) penalty have been widely used for low dose CBCT reconstruction, with state-of-the-art performance in suppressing noise and preserving edges. However, TV is a first-order penalty and sometimes leads to the so-called staircase effect, particularly over regions with smooth intensity transition in the reconstruction images. A second-order penalty known as the Hessian penalty was recently used to replace TV to suppress the staircase effect in CBCT reconstruction at the cost of slightly blurring object edges. In this study, we proposed a new penalty, the TV-H, which combines TV and Hessian penalties for CBCT reconstruction in a structure-adaptive way. The TV-H penalty automatically differentiates the edges, gradual transition and uniform local regions within an image using the voxel gradient, and adaptively weights TV and Hessian according to the local image structures in the reconstruction process. Our proposed penalty retains the benefits of TV, including noise suppression and edge preservation. It also maintains the structures in regions with gradual intensity transition more successfully. A majorization-minimization (MM) approach was designed to optimize the objective energy function constructed with the TV-H penalty. The MM approach employed a quadratic upper bound of the original objective function, and the original optimization problem was changed to a series of quadratic optimization problems, which could be efficiently solved using the Gauss-Seidel update strategy. We tested the reconstruction algorithm on two simulated digital phantoms and two physical phantoms. Our experiments indicated that the TV-H penalty visually and quantitatively outperformed both TV and Hessian penalties. PMID:27699100

  17. Structure-adaptive CBCT reconstruction using weighted total variation and Hessian penalties.

    PubMed

    Shi, Qi; Sun, Nanbo; Sun, Tao; Wang, Jing; Tan, Shan

    2016-09-01

    The exposure of normal tissues to high radiation during cone-beam CT (CBCT) imaging increases the risk of cancer and genetic defects. Statistical iterative algorithms with the total variation (TV) penalty have been widely used for low dose CBCT reconstruction, with state-of-the-art performance in suppressing noise and preserving edges. However, TV is a first-order penalty and sometimes leads to the so-called staircase effect, particularly over regions with smooth intensity transition in the reconstruction images. A second-order penalty known as the Hessian penalty was recently used to replace TV to suppress the staircase effect in CBCT reconstruction at the cost of slightly blurring object edges. In this study, we proposed a new penalty, the TV-H, which combines TV and Hessian penalties for CBCT reconstruction in a structure-adaptive way. The TV-H penalty automatically differentiates the edges, gradual transition and uniform local regions within an image using the voxel gradient, and adaptively weights TV and Hessian according to the local image structures in the reconstruction process. Our proposed penalty retains the benefits of TV, including noise suppression and edge preservation. It also maintains the structures in regions with gradual intensity transition more successfully. A majorization-minimization (MM) approach was designed to optimize the objective energy function constructed with the TV-H penalty. The MM approach employed a quadratic upper bound of the original objective function, and the original optimization problem was changed to a series of quadratic optimization problems, which could be efficiently solved using the Gauss-Seidel update strategy. We tested the reconstruction algorithm on two simulated digital phantoms and two physical phantoms. Our experiments indicated that the TV-H penalty visually and quantitatively outperformed both TV and Hessian penalties.
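
    The structure-adaptive idea in this abstract can be sketched in one dimension. The weighting function below (a Gaussian of the local gradient with a hypothetical scale parameter `tau`) is an illustration of the concept only, not the weighting actually used by the authors, whose formulation operates on 3D voxel gradients inside an iterative reconstruction:

```python
import math

def adaptive_weights(u, tau=0.5):
    """Per-sample weight in [0, 1]: near 1 in smooth or gradual regions
    (second-order term dominates), near 0 at sharp edges (TV dominates).
    The Gaussian form and tau are illustrative assumptions."""
    return [math.exp(-((u[i + 1] - u[i]) / tau) ** 2) for i in range(len(u) - 1)]

def tv_h_penalty(u, tau=0.5):
    """1D sketch of a structure-adaptive blend of first-order (TV) and
    second-order (Hessian-like) finite differences."""
    w = adaptive_weights(u, tau)
    tv = [abs(u[i + 1] - u[i]) for i in range(len(u) - 1)]
    hess = [abs(u[i + 1] - 2 * u[i] + u[i - 1]) for i in range(1, len(u) - 1)]
    total = 0.0
    for i in range(1, len(u) - 1):
        # edges keep the TV term; gradual regions keep the second-order term
        total += (1 - w[i]) * tv[i] + w[i] * hess[i - 1]
    return total

ramp = [0.1 * i for i in range(10)]   # gradual transition: Hessian term active
step = [0.0] * 5 + [1.0] * 5          # sharp edge: TV term active
```

    On the ramp the weights stay near 1, so the (zero) second differences dominate and no staircase-inducing penalty is paid; at the step the weight collapses toward 0 and the edge is penalized by TV alone, which preserves it.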

  18. A Giant in the Shadows: Major General Benjamin Foulois and the Rise of the Army Air Service in World War I

    DTIC Science & Technology

    2013-05-01

    he was 28 years old. As one of America’s original military aviators, he flew the Army’s first dirigible balloon and its first airplane, learning to...training gratis from the Wrights, as the original contract only paid for the training of two pilots, and he received 54 minutes of student flight...first enlistment, Foulois asked everyone to call him Ben. No one asked him about the origin of his nickname, and he never volunteered the information

  19. Generalized Structured Component Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Takane, Yoshio

    2004-01-01

    We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…

  20. Spin transport in lateral structures with semiconducting channel

    NASA Astrophysics Data System (ADS)

    Zainuddin, Abu Naser

    Spintronics is an emerging field of electronics with the potential to be used in future integrated circuits. Spintronic devices are already making their mark in storage technologies, and there are proposals for using spintronic effects in logic technologies as well. So far, major improvements in spintronic effects such as the `spin-valve' effect have been achieved with metals or insulators as channel materials, but much less progress has been made in semiconductors, owing to the difficulty of injecting spins into them, which has only very recently been overcome through the combined efforts of many research groups around the world. The key motivations for semiconductor spintronics are ease of integration with existing semiconductor technology and gate controllability. At present, semiconductor-based spintronic devices are mostly lateral and show very poor performance compared to their metal- or insulator-based vertical counterparts. The objective of this thesis is to analyze these devices based on spin-transport models and simulations. First, a lateral spin-valve device is modeled with a semiclassical approach based on the spin-diffusion equation. After identifying the important issues regarding device performance, a compact circuit-equivalent model is presented that helps improve the device design. It is found that the regions outside the current path also have a significant influence on device performance under certain conditions, an influence that is ordinarily neglected when only charge transport is considered. Next, a modified spin-valve structure is studied in which the spin signal is controlled with a gate between the injecting and detecting contacts. The gate is used to modulate the Rashba spin-orbit coupling of the channel, which in turn modulates the spin-valve signal. The idea of gate-controlled spin manipulation was originally proposed by Datta and Das in 1990 and is called the 'Datta-Das' effect. In this thesis, we have extended the model described in the original proposal to include the influence of channel dimensions on the nature of electron flow and of contact dimensions on the magnitude and phase of the spin-valve signal. To capture the spin-orbit effect, a non-equilibrium Green's function (NEGF) based quantum transport model for the spin-valve device has been developed, which is also explained with a simple theoretical treatment based on the stationary phase approximation. The model is also compared against a recent experiment that demonstrated such a gate-modulated spin-valve effect. This thesis also evaluates the possibility of gate-controlled magnetization reversal, or spin-torque effect, as a means to validate this so-called 'Datta-Das' effect on a more solid footing. Finally, the scope for utilizing topological insulator materials in semiconductor spintronics is discussed as possible future work for this thesis.

  1. 76 FR 63640 - Public Housing Assessment System (PHAS): Proposed Physical Condition Interim Scoring Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... call-for-aid is a system designed to provide elderly residents the opportunity to call for help in the...

  2. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption, where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database, and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC with quad-core 3.2 GHz CPUs and a 10 Mbit/s up/down link. Consequently, the proposed system can be efficiently used in real-life applications.
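
    The decision rule at the core of the system, accept when the Hamming distance between the query and stored binary templates falls below a threshold, is easy to state in plain code. In THRIVE this comparison is carried out over encrypted, homomorphically randomized templates; the sketch below shows only the plaintext matching rule, with made-up 8-bit templates:

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary templates."""
    if len(a) != len(b):
        raise ValueError("templates must have equal length")
    return sum(x != y for x, y in zip(a, b))

def verify(query, stored, threshold):
    """Accept when the query is within `threshold` bits of the stored template."""
    return hamming(query, stored) < threshold

enrolled = [1, 0, 1, 1, 0, 0, 1, 0]
query_ok = [1, 0, 1, 0, 0, 0, 1, 0]    # 1 bit differs: genuine user with noise
query_bad = [0, 1, 0, 0, 1, 1, 0, 1]   # every bit differs: impostor
```

    The threshold trades false accepts against false rejects; the paper's contribution is performing this comparison without ever exposing `enrolled` or `query` in the clear.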

  3. Comprehensive genetic analyses reveal evolutionary distinction of a mouse (Zapus hudsonius preblei) proposed for delisting from the US Endangered Species Act.

    PubMed

    King, Tim L; Switzer, John F; Morrison, Cheryl L; Eackles, Michael S; Young, Colleen C; Lubinski, Barbara A; Cryan, Paul

    2006-12-01

    Zapus hudsonius preblei, listed as threatened under the US Endangered Species Act (ESA), is one of 12 recognized subspecies of meadow jumping mice found in North America. Recent morphometric and phylogenetic comparisons among Z. h. preblei and neighbouring conspecifics questioned the taxonomic status of selected subspecies, resulting in a proposal to delist the Z. h. preblei from the ESA. We present additional analyses of the phylogeographic structure within Z. hudsonius that call into question previously published data (and conclusions) and confirm the original taxonomic designations. A survey of 21 microsatellite DNA loci and 1380 base pairs from two mitochondrial DNA (mtDNA) regions (control region and cytochrome b) revealed that each Z. hudsonius subspecies is genetically distinct. These data do not support the null hypothesis of a homogeneous gene pool among the five subspecies found within the southwestern portion of the species' range. The magnitude of the observed differentiation was considerable and supported by significant findings for nearly every statistical comparison made, regardless of the genome or the taxa under consideration. Structuring of nuclear multilocus genotypes and subspecies-specific mtDNA haplotypes corresponded directly with the disjunct distributions of the subspecies investigated. Given the level of correspondence between the observed genetic population structure and previously proposed taxonomic classification of subspecies (based on the geographic separation and surveys of morphological variation), we conclude that the nominal subspecies surveyed in this study do not warrant synonymy, as has been proposed for Z. h. preblei, Z. h. campestris, and Z. h. intermedius.

  4. Comprehensive genetic analyses reveal evolutionary distinction of a mouse (Zapus hudsonius preblei) proposed for delisting from the US Endangered Species Act

    USGS Publications Warehouse

    King, Timothy L.; Switzer, John F.; Morrison, Cheryl L.; Eackles, Michael S.; Young, Colleen C.; Lubinski, Barbara A.; Cryan, Paul M.

    2006-01-01

    Zapus hudsonius preblei, listed as threatened under the US Endangered Species Act (ESA), is one of 12 recognized subspecies of meadow jumping mice found in North America. Recent morphometric and phylogenetic comparisons among Z. h. preblei and neighbouring conspecifics questioned the taxonomic status of selected subspecies, resulting in a proposal to delist the Z. h. preblei from the ESA. We present additional analyses of the phylogeographic structure within Z. hudsonius that call into question previously published data (and conclusions) and confirm the original taxonomic designations. A survey of 21 microsatellite DNA loci and 1380 base pairs from two mitochondrial DNA (mtDNA) regions (control region and cytochrome b) revealed that each Z. hudsonius subspecies is genetically distinct. These data do not support the null hypothesis of a homogeneous gene pool among the five subspecies found within the southwestern portion of the species' range. The magnitude of the observed differentiation was considerable and supported by significant findings for nearly every statistical comparison made, regardless of the genome or the taxa under consideration. Structuring of nuclear multilocus genotypes and subspecies-specific mtDNA haplotypes corresponded directly with the disjunct distributions of the subspecies investigated. Given the level of correspondence between the observed genetic population structure and previously proposed taxonomic classification of subspecies (based on the geographic separation and surveys of morphological variation), we conclude that the nominal subspecies surveyed in this study do not warrant synonymy, as has been proposed for Z. h. preblei, Z. h. campestris, and Z. h. intermedius. © 2006 The Authors.

  5. Reformulations of the Yang-Mills theory toward quark confinement and mass gap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kondo, Kei-Ichi; Shinohara, Toru; Kato, Seikou

    2016-01-22

    We propose the reformulations of the SU(N) Yang-Mills theory toward quark confinement and mass gap. In fact, we have given a new framework for reformulating the SU(N) Yang-Mills theory using new field variables. This includes the preceding works given by Cho, Faddeev and Niemi, as a special case called the maximal option in our reformulations. The advantage of our reformulations is that the original non-Abelian gauge field variables can be changed into the new field variables such that one of them called the restricted field gives the dominant contribution to quark confinement in the gauge-independent way. Our reformulations can be combined with the SU(N) extension of the Diakonov-Petrov version of the non-Abelian Stokes theorem for the Wilson loop operator to give a gauge-invariant definition for the magnetic monopole in the SU(N) Yang-Mills theory without the scalar field. In the so-called minimal option, especially, the restricted field is non-Abelian and involves the non-Abelian magnetic monopole with the stability group U(N−1). This suggests the non-Abelian dual superconductivity picture for quark confinement. This should be compared with the maximal option: the restricted field is Abelian and involves only the Abelian magnetic monopoles with the stability group U(1)^{N−1}, just like the Abelian projection. We give some applications of this reformulation, e.g., the stability for the homogeneous chromomagnetic condensation of the Savvidy type, the large N treatment for deriving the dimensional transmutation and understanding the mass gap, and also the numerical simulations on a lattice which are given by Dr. Shibata in a subsequent talk.

  6. Reformulations of the Yang-Mills theory toward quark confinement and mass gap

    NASA Astrophysics Data System (ADS)

    Kondo, Kei-Ichi; Kato, Seikou; Shibata, Akihiro; Shinohara, Toru

    2016-01-01

    We propose the reformulations of the SU(N) Yang-Mills theory toward quark confinement and mass gap. In fact, we have given a new framework for reformulating the SU(N) Yang-Mills theory using new field variables. This includes the preceding works given by Cho, Faddeev and Niemi, as a special case called the maximal option in our reformulations. The advantage of our reformulations is that the original non-Abelian gauge field variables can be changed into the new field variables such that one of them called the restricted field gives the dominant contribution to quark confinement in the gauge-independent way. Our reformulations can be combined with the SU(N) extension of the Diakonov-Petrov version of the non-Abelian Stokes theorem for the Wilson loop operator to give a gauge-invariant definition for the magnetic monopole in the SU(N) Yang-Mills theory without the scalar field. In the so-called minimal option, especially, the restricted field is non-Abelian and involves the non-Abelian magnetic monopole with the stability group U(N−1). This suggests the non-Abelian dual superconductivity picture for quark confinement. This should be compared with the maximal option: the restricted field is Abelian and involves only the Abelian magnetic monopoles with the stability group U(1)^{N−1}, just like the Abelian projection. We give some applications of this reformulation, e.g., the stability for the homogeneous chromomagnetic condensation of the Savvidy type, the large N treatment for deriving the dimensional transmutation and understanding the mass gap, and also the numerical simulations on a lattice which are given by Dr. Shibata in a subsequent talk.

  7. Making Research Matter Comment on "Public Spending on Health Service and Policy Research in Canada, the United Kingdom, and the United States: A Modest Proposal".

    PubMed

    Hunter, David J; Frank, John

    2017-08-13

    We offer a UK-based commentary on the recent "Perspective" published in IJHPM by Thakkar and Sullivan. We are sympathetic to the authors' call for increased funding for health service and policy research (HSPR). However, we point out that increasing that investment - in any of the three countries they compare: Canada, the United States and the United Kingdom - will not by itself necessarily lead to any better use of research by health system decision-makers in these settings. We cite previous authors' descriptions of the many factors that tend to make the worlds of researchers and decision-makers into "two solitudes." And we call for changes in the structure and funding of HSPR, particularly the incentives now in place for purely academic publishing, to tackle a widespread reality: most published research in HSPR, as in other applied fields of science, is never read or used by the vast majority of decision-makers working out in the "real world." © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  8. The health-nutrition dimension: a methodological approach to assess the nutritional sustainability of typical agro-food products and the Mediterranean diet.

    PubMed

    Azzini, Elena; Maiani, Giuseppe; Turrini, Aida; Intorre, Federica; Lo Feudo, Gabriella; Capone, Roberto; Bottalico, Francesco; El Bilali, Hamid; Polito, Angela

    2018-08-01

    The aim of this paper is to provide a methodological approach to evaluate the nutritional sustainability of typical agro-food products, representing Mediterranean eating habits and included in the Mediterranean food pyramid. For each group of foods, suitable and easily measurable indicators were identified. Two macro-indicators were used to assess the nutritional sustainability of each product. The first macro-indicator, called 'business distinctiveness', takes into account the application of different regulations and standards regarding quality, safety and traceability as well as the origin of raw materials. The second macro-indicator, called 'nutritional quality', assesses product nutritional quality taking into account the contents of key compounds including micronutrients and bioactive phytochemicals. For each indicator a 0-10 scoring system was set up, with scores from 0 (unsustainable) to 10 (very sustainable), with 5 as a sustainability benchmark value. The benchmark value is the value from which a product can be considered sustainable. A simple formula was developed to produce a sustainability index. The proposed sustainability index could be considered a useful tool to describe both the qualitative and quantitative value of the micronutrients and bioactive phytochemicals present in foodstuffs. This methodological approach can also be applied beyond the Mediterranean, to food products in other world regions. © 2018 Society of Chemical Industry.

  9. HST Peer Review, Where We've Been, Where We Are Now and Possibly Where the Future Lies

    NASA Astrophysics Data System (ADS)

    Blacker, Brett S.; Macchetto, Duccio; Meylan, Georges; Stanghellini, Letizia; van der Marel, Roeland P.

    2002-12-01

    In some eyes, the Phase I proposal selection process is the most important activity handled by the Space Telescope Science Institute (STScI). Proposing for HST and other missions consists of requesting observing time and/or archival research funding. This step is called Phase I, in which the scientific merit of a proposal is considered by a community-based peer-review process. Accepted proposals then proceed through Phase II, where the observations are specified in sufficient detail to enable scheduling on the telescope. Each cycle, the Hubble Space Telescope (HST) Telescope Allocation Committee (TAC) reviews proposals and awards observing time valued at $0.5B, when the total expenditures for HST over its lifetime are figured on an annual basis. This is in fact a very important endeavor that we continue to fine-tune. The process is open to the science community, and we constantly receive comments and praise for it. Several cycles ago we instituted significant changes to address concerns: fewer, broader panels, with redundancy to avoid conflicts of interest; redefinition of the TAC role to focus on larger programs; and incentives for the panels to award time to medium-sized proposals. In the last cycle, we offered new initiatives to enhance the scientific output of the telescope, among them the Hubble Treasury Program, the AR Legacy Program, and the AR Theory Program. This paper outlines the current HST peer review process. We discuss why and how we changed our original system, along with some ideas on where we may go in the future to generate a stronger science program for HST and to reduce the burden on the science community. This paper is an update of the status of the HST peer review process described in the published paper "Evolution of the HST Proposal Selection Process".

  10. Cholera and the Pump on Broad Street: The Life and Legacy of John Snow

    ERIC Educational Resources Information Center

    Ball, Laura

    2009-01-01

    There is still a pump in the Golden Square neighborhood on what was once called Broad Street. It does not work, for it is merely a replica of the original, and like the original its handle is missing. It serves as a curiously simple monument to the events that took place over one hundred years ago, when the real pump supplied water to the Broad…

  11. Towards Scalable 1024 Processor Shared Memory Systems

    NASA Technical Reports Server (NTRS)

    Ciotti, Robert B.; Thigpen, William W. (Technical Monitor)

    2001-01-01

    Over the past 3 years, NASA Ames has been involved in a cooperative effort with SGI to develop the largest single-system-image systems available. Currently a 1024-processor Origin 3000 is under development, with first boot expected later in the summer of 2001. This paper discusses some early results with a 512-processor Origin 3000 system and some arcane IRIX system calls that can dramatically improve scaling performance.

  12. 76 FR 9809 - Notice of a Federal Advisory Committee Meeting Manufactured Housing Consensus Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ..., including regulations specifying the permissible scope and conduct of monitoring in accordance with... Manufactured Housing Program Office Review Log of Proposals Call for Committee Reports Proposals Subcommittees...

  13. 75 FR 62364 - Manti-La Sal National Forest Resource Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... who use telecommunication devices for the deaf (TDD) may call the Federal Information Relay Service... conducted: (1) Consideration of Proposal Forms, (2) Development of Proposal Review Process, (3) Development...

  14. 1990 Clean Air Act Amendment Summary

    EPA Pesticide Factsheets

    In 1989, President George H. W. Bush proposed revisions to the Clean Air Act designed to curb acid rain, urban air pollution, and toxic air emissions. The proposal also called for establishing a national permits program.

  15. 75 FR 4043 - Correction: Proposed Information Collection; Comment Request; Fisheries Certificate of Origin

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    .... Dated: January 21, 2010. Gwellnar Banks, Management Analyst, Office of the Chief Information Officer... Information Collection; Comment Request; Fisheries Certificate of Origin AGENCY: National Oceanic and... the Federal Register (75 FR 2482) on the proposed information collection, Fisheries Certificate of...

  16. 75 FR 45124 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-02

    ... to obtain a copy of the information collection plans, call the SAMHSA Reports Clearance Officer on... clinicians and supervisors, implementation calls and monthly progress reports, and topical workgroups that... evaluate the implementation, expansion, and sustainability of adolescent substance use services developed...

  17. 78 FR 78809 - Rates for Interstate Inmate Calling Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 64 [WC Docket No. 12-375; DA 13-2379] Rates for Interstate Inmate Calling Services AGENCY: Federal Communications Commission. ACTION: Proposed rule... Communications Commission's Web site: http://fjallfoss.fcc.gov/ecfs2/ . Follow the instructions for submitting...

  18. Adaptive threshold control for auto-rate fallback algorithm in IEEE 802.11 multi-rate WLANs

    NASA Astrophysics Data System (ADS)

    Wu, Qilin; Lu, Yang; Zhu, Xiaolin; Ge, Fangzhen

    2012-03-01

    The IEEE 802.11 standard supports multiple rates for data transmission in the physical layer. Nowadays, to improve network performance, a rate adaptation scheme called auto-rate fallback (ARF) is widely adopted in practice. However, the ARF scheme suffers performance degradation in environments with multiple contending nodes. In this article, we propose a novel rate adaptation scheme, ARF with adaptive threshold control. In an environment with multiple contending nodes, the proposed scheme can effectively mitigate the effect of frame collisions on rate adaptation decisions by adaptively adjusting the rate-up and rate-down thresholds according to the current collision level. Simulation results show that the proposed scheme achieves significantly higher throughput than other existing rate adaptation schemes. Furthermore, the simulation results also demonstrate that the proposed scheme can effectively respond to varying channel conditions.
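
    The scheme can be sketched as classic ARF counters whose thresholds scale with an estimated collision level. The rate set, the scaling rule, and how the collision level is estimated are all illustrative assumptions here, not the authors' exact algorithm:

```python
class AdaptiveARF:
    """Sketch of auto-rate fallback with adaptive thresholds. The linear
    threshold scaling with `collision_level` is a made-up illustration of
    the idea, not the algorithm from the article."""
    RATES = [6, 12, 24, 48, 54]  # Mbit/s; an illustrative 802.11 rate set

    def __init__(self, up=10, down=2):
        self.idx = 0
        self.base_up, self.base_down = up, down
        self.successes = self.failures = 0
        self.collision_level = 0.0   # e.g. estimated from RTS/CTS statistics

    def thresholds(self):
        # Under heavy contention, demand more successes before rate-up and
        # tolerate more failures before rate-down, since losses are likely
        # collisions rather than channel errors.
        scale = 1.0 + self.collision_level
        return int(self.base_up * scale), int(self.base_down * scale)

    def on_result(self, ok):
        """Feed one transmission outcome; returns the current rate."""
        up, down = self.thresholds()
        if ok:
            self.successes += 1
            self.failures = 0
            if self.successes >= up and self.idx < len(self.RATES) - 1:
                self.idx += 1            # rate up after enough successes
                self.successes = 0
        else:
            self.failures += 1
            self.successes = 0
            if self.failures >= down and self.idx > 0:
                self.idx -= 1            # rate down after enough failures
                self.failures = 0
        return self.RATES[self.idx]
```

    Plain ARF is the special case `collision_level = 0`; raising it makes the controller slower to misinterpret collision losses as channel degradation.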

  19. A proposal for a worldwide definition of health resort medicine, balneology, medical hydrology and climatology.

    PubMed

    Gutenbrunner, Christoph; Bender, Tamas; Cantista, Pedro; Karagülle, Zeki

    2010-09-01

    Health Resort Medicine, Balneology, Medical Hydrology and Climatology are not fully recognised as independent medical specialties at the international level. Analysing the reasons, we can identify both external (from outside the field) and internal (from inside the field) factors. External factors include, for example, the lack of scientific evidence, the fact that Balneotherapy and Climatotherapy are not used in all countries, and the fact that Health Resort Medicine, Balneology, Medical Hydrology and Climatology focus only on single methods and do not have a comprehensive concept. Internal (implicit) barriers are the lack of internationally accepted terms in the field, the restriction of being allowed to practice the activities only in specific settings, and the trend to use Balneotherapy mainly for wellness concepts. The implicit barriers in particular should be the subject of intense discussion among scientists and specialists. This paper suggests one option to tackle the problem of implicit barriers by making a proposal for a structure and description of the medical field, and to provide some commonly acceptable descriptions of content and terminology. The medical area can be defined as "medicine in health resorts" (or "health resort medicine"). Health resort medicine includes "all medical activities originated and derived in health resorts based on scientific evidence aiming at health promotion, prevention, therapy and rehabilitation". Core elements of health resort interventions are balneotherapy, hydrotherapy, and climatotherapy. Health resort medicine can be used for health promotion, prevention, treatment, and rehabilitation. The use of natural mineral waters, gases and peloids in many countries is called balneotherapy, but other (equivalent) terms exist. Substances used for balneotherapy are medical mineral waters, medical peloids, and natural gases (bathing, drinking, inhalation, etc.).
The use of plain water (tap water) for therapy is called hydrotherapy, and the use of climatic factors for therapy is called climatotherapy. When assessing the effects of health resort medicine, it is important to take other environmental factors into account. These can be classified within the framework of the ICF (International Classification of Functioning, Disability and Health). Examples include receiving health care from specialised, well-trained doctors (cf. ICF-domain: e355), having an environment supporting social contacts (family, peer groups) (cf. ICF-domains: d740, d760), facilities for recreation, cultural activities, leisure and sports (cf. ICF-domain: d920), and access to a health-promoting atmosphere and an environment close to nature (cf. ICF-domain: e210). The scientific field dealing with health resort medicine is called health resort sciences. It includes the medical sciences, psychology, social sciences, technical sciences, chemistry, physics, geography, jurisprudence, etc. Finally, this paper proposes a systematic international discussion of descriptions in the field of Health Resort Medicine, Balneology, Medical Hydrology and Climatology, and discusses short descriptive terms with the goal of achieving internationally accepted distinct terms. This task should be done via a structured consensus process and is of major importance for the publication of scientific results as well as for systematic reviews and meta-analyses.

  20. Efficient multifeature index structures for music data retrieval

    NASA Astrophysics Data System (ADS)

    Lee, Wegin; Chen, Arbee L. P.

    1999-12-01

    In this paper, we propose four index structures for music data retrieval. Based on suffix trees, we develop two index structures, called the combined suffix tree and the independent suffix trees. These methods still show shortcomings for some search functions, so we develop another index, called the Twin Suffix Trees, to overcome these problems. However, the Twin Suffix Trees lack scalability when the amount of music data becomes large. We therefore propose a fourth index, called the Grid-Twin Suffix Trees, to provide scalability and flexibility for large amounts of music data. For each index, we can use different search functions, such as exact search and approximate search, on different music features, such as melody, rhythm, or both. We compare the performance of the different search functions applied to each index structure through a series of experiments.
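    The indexes in this record are suffix-tree variants; as a hedged, minimal stand-in for the exact-search function they support, the sketch below indexes a melody (notes encoded as a character string, an illustrative assumption rather than the paper's representation) with a suffix array, a related structure that answers the same exact substring queries:

    ```python
    # Sketch: suffix-array index over a melody encoded as a note string.
    def build_suffix_array(s):
        """Sort all suffix start positions by suffix text (O(n^2 log n); fine for a sketch)."""
        return sorted(range(len(s)), key=lambda i: s[i:])

    def exact_search(s, sa, pat):
        """Return all positions where `pat` occurs, via binary search on the suffix array."""
        lo, hi = 0, len(sa)
        while lo < hi:                      # lower bound: first suffix >= pat
            mid = (lo + hi) // 2
            if s[sa[mid]:sa[mid] + len(pat)] < pat:
                lo = mid + 1
            else:
                hi = mid
        start, hi = lo, len(sa)
        while lo < hi:                      # upper bound: end of the block starting with pat
            mid = (lo + hi) // 2
            if s[sa[mid]:sa[mid] + len(pat)] == pat:
                lo = mid + 1
            else:
                hi = mid
        return sorted(sa[start:hi])

    melody = "CDECDEFG"                     # hypothetical pitch sequence
    sa = build_suffix_array(melody)
    print(exact_search(melody, sa, "CDE"))  # -> [0, 3]: the motif occurs twice
    ```

    A suffix tree would answer the same query in time proportional to the pattern length; the array form is simply shorter to sketch.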

  1. Genetics Home Reference: familial erythrocytosis

    MedlinePlus

    ... tumors. Another form of acquired erythrocytosis, called polycythemia vera , results from somatic (non-inherited) mutations in other ... haematol.13250. Citation on PubMed Percy MJ, Rumi E. Genetic origins and clinical phenotype of familial and ...

  2. Institutional computing (IC) information session

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Kenneth R; Lally, Bryan R

    2011-01-19

    The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing for our science projects, and that investment is expected to increase further.

  3. Tracing the origin of 'blue Weimaraner' dogs by molecular genetics.

    PubMed

    Gerding, W M; Schreiber, S; Dekomien, G; Epplen, J T

    2011-04-01

    Weimaraner dogs are defined by a light brown coat colour termed grey, including several shadings ranging from silver and deer to mouse grey. In contrast, the so-called blue Weimaraners (BW), with a lightened black-pigmented coat, have been proposed to represent spontaneous revertants in the Weimaraner breed. In order to investigate the genetic determinants of the characteristic grey coat colour versus those of BW, known variation in coat colour genes including TYRP1 and MLPH was analysed in a number of grey and blue dogs. Variations at the B locus cause grey coat colour in Weimaraners via two non-functional TYRP1 copies (bb) including the b(s), b(d) and b(c) alleles. In all BW, at least one functional TYRP1 allele (Bb or BB genotype) was identified. Defined microsatellite alleles in TYRP1 intron 4 are linked to this functional B allele in BW. These alleles were also detected in various other dog breeds, but not in grey Weimaraners. The combination of a dominant trait for blue versus grey together with a specific TYRP1 haplotype in BW suggests that blue coat colour is not the result of spontaneous (back-)mutation in grey Weimaraners. This inference is further supported by the presence of a unique Y-chromosomal haplotype in a male offspring of the supposed ancestor of the BW population, which, according to pedigree information, carries a copy of the original Y chromosome. Thus, molecular genetic analyses of coat colours combined with Y-chromosomal haplotypes allow tracing the origin of atypical dogs in respective canine populations. © 2010 Blackwell Verlag GmbH.

  4. Uncovering the Chemistry of Earth-like Planets

    NASA Astrophysics Data System (ADS)

    Zeng, Li; Sasselov, Dimitar; Jacobsen, Stein

    2015-08-01

    We propose to use the evidence from our solar system to understand exoplanets, and in particular, to predict their surface chemistry and thereby the possibility of life. An Earth-like planet, born from the same nebula as its host star, is composed primarily of silicate rocks and an iron-nickel metal core, and depleted in volatile content in a systematic manner. The more volatile (easier to vaporize or dissociate into gas form) an element is in an Earth-like planet, the more depleted the element is compared to its host star. After depletion, an Earth-like planet would go through the process of core formation due to heat from radioactive decay and collisions. Core formation depletes a planet’s rocky mantle of siderophile (iron-loving) elements, in addition to the volatile depletion. After that, Earth-like planets likely accrete some volatile-rich materials, called “late veneer”. The late veneer could be essential to the origins of life on Earth and Earth-like planets, as it also delivers the volatiles such as nitrogen, sulfur, carbon and water to the planet’s surface, which are crucial for life to occur. Here we build an integrative model of Earth-like planets from the bottom up. Thus the chemical compositions of Earth-like planets could be inferred from their mass-radius relations and their host stars’ elemental abundances, and the origins of volatile contents (especially water) on their surfaces could be understood, and thereby shed light on the origins of life on them. This elemental abundance model could be applied to other rocky exoplanets in exoplanet systems.

  5. Uncovering the Chemistry of Earth-like Planets

    NASA Astrophysics Data System (ADS)

    Zeng, L.; Jacobsen, S. B.; Sasselov, D. D.

    2015-12-01

    We propose to use the evidence from our solar system to understand exoplanets, and in particular, to predict their surface chemistry and thereby the possibility of life. An Earth-like planet, born from the same nebula as its host star, is composed primarily of silicate rocks and an iron-nickel metal core, and depleted in volatile content in a systematic manner. The more volatile (easier to vaporize or dissociate into gas form) an element is in an Earth-like planet, the more depleted the element is compared to its host star. After depletion, an Earth-like planet would go through the process of core formation due to heat from radioactive decay and collisions. Core formation depletes a planet's rocky mantle of siderophile (iron-loving) elements, in addition to the volatile depletion. After that, Earth-like planets likely accrete some volatile-rich materials, called "late veneer". The late veneer could be essential to the origins of life on Earth and Earth-like planets, as it also delivers the volatiles such as nitrogen, sulfur, carbon and water to the planet's surface, which are crucial for life to occur. Here we build an integrative model of Earth-like planets from the bottom up. Thus the chemical compositions of Earth-like planets could be inferred from their mass-radius relations and their host stars' elemental abundances, and the origins of volatile contents (especially water) on their surfaces could be understood, and thereby shed light on the origins of life on them. This elemental abundance model could be applied to other rocky exoplanets in exoplanet systems.

  6. We Ignore the Disciplines

    NASA Astrophysics Data System (ADS)

    Krakauer, David

    I want to begin these proceedings by giving some prominence to an elemental tension in the construction of creative institutions. One origin of tension is described by the pragmatist philosopher Charles Peirce: I do not call the solitary studies of a single man a science. It is only when a group of men, more or less in intercommunication, are aiding and stimulating one another by their understanding of a particular group of studies... that I call their life a science...

  7. CEE/CA: Report calls for decriminalization of sex work.

    PubMed

    Betteridge, Glenn

    2006-04-01

    In December 2005, the Central and Eastern European Harm Reduction Network (CEEHRN) released a report calling for the decriminalization of sex work in the 27 countries of Central and Eastern Europe and Central Asia (CEE/CA). The report brings together a wealth of published and original information concerning sex work, laws regulating sex work, epidemiological data regarding HIV and other sexually transmitted infections (STIs), services available to sex workers, and human rights abuses faced by sex workers.

  8. Origin, development, and evolution of butterfly eyespots.

    PubMed

    Monteiro, Antónia

    2015-01-07

    This article reviews the latest developments in our understanding of the origin, development, and evolution of nymphalid butterfly eyespots. Recent contributions to this field include insights into the evolutionary and developmental origin of eyespots and their ancestral deployment on the wing, the evolution of eyespot number and eyespot sexual dimorphism, and the identification of genes affecting eyespot development and black pigmentation. I also compare features of old and more recently proposed models of eyespot development and propose a schematic for the genetic regulatory architecture of eyespots. Using this schematic I propose two hypotheses for why we observe limits to morphological diversity across these serially homologous traits.

  9. Spitzer Space Telescope proposal process

    NASA Astrophysics Data System (ADS)

    Laine, S.; Silbermann, N. A.; Rebull, L. M.; Storrie-Lombardi, L. J.

    2006-06-01

    This paper discusses the Spitzer Space Telescope General Observer proposal process. Proposals, consisting of the scientific justification, basic contact information for the observer, and observation requests, are submitted electronically using a client-server Java package called Spot. The Spitzer Science Center (SSC) uses a one-phase proposal submission process, meaning that fully planned observations are submitted for most proposals at the time of submission, not months after acceptance. Ample documentation and tools are available to observers on SSC web pages to support the preparation of proposals, including an email-based Helpdesk. Upon submission, proposals are immediately ingested into a database that can be queried at the SSC for program information, statistics, etc., at any time. Large proposals are checked for technical feasibility, and all proposals are checked against duplicates of already approved observations. Output from these tasks is made available to the Time Allocation Committee (TAC) members. At the review meeting, web-based software is used to record reviewer comments and keep track of the voted scores. After the meeting, another Java-based web tool, Griffin, is used to track the approved programs as they go through technical reviews, duplication checks and minor modifications before the observations are released for scheduling. In addition to detailing the proposal process, lessons learned from the first two General Observer proposal calls are discussed.

  10. Detailed temporal structure of communication networks in groups of songbirds.

    PubMed

    Stowell, Dan; Gill, Lisa; Clayton, David

    2016-06-01

    Animals in groups often exchange calls, in patterns whose temporal structure may be influenced by contextual factors such as physical location and the social network structure of the group. We introduce a model-based analysis for temporal patterns of animal call timing, originally developed for networks of firing neurons. This has advantages over cross-correlation analysis in that it can correctly handle common-cause confounds and provides a generative model of call patterns with explicit parameters for the influences between individuals. It also has advantages over standard Markovian analysis in that it incorporates detailed temporal interactions which affect timing as well as sequencing of calls. Further, a fitted model can be used to generate novel synthetic call sequences. We apply the method to calls recorded from groups of domesticated zebra finch (Taeniopygia guttata) individuals. We find that the communication network in these groups has stable structure that persists from one day to the next, and that 'kernels' reflecting the temporal range of influence have a characteristic structure for a calling individual's effect on itself, its partner and on others in the group. We further find characteristic patterns of influences by call type as well as by individual. © 2016 The Authors.
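    The model described in this record fits influence kernels between individuals and can generate synthetic call sequences. Below is a minimal discrete-time sketch of such a generative process with made-up kernel parameters; the authors' actual model form and fitting procedure are not reproduced here:

    ```python
    import math
    import random

    def simulate_calls(kernels, base_rate, n_steps, dt=0.1, seed=1):
        """Discrete-time sketch: bird j's past calls raise bird i's call rate
        through an exponential influence kernel (strength w, decay tau)."""
        random.seed(seed)
        n = len(kernels)
        calls = [[] for _ in range(n)]          # call times per bird
        for step in range(n_steps):
            t = step * dt
            for i in range(n):
                rate = base_rate
                for j in range(n):
                    w, tau = kernels[i][j]
                    rate += sum(w * math.exp(-(t - s) / tau) for s in calls[j])
                # probability of at least one call in this time bin
                if random.random() < 1.0 - math.exp(-max(rate, 0.0) * dt):
                    calls[i].append(t)
        return calls

    # hypothetical 2-bird network: bird 1 strongly answers bird 0, not vice versa
    kernels = [[(0.0, 1.0), (0.0, 1.0)],
               [(5.0, 0.5), (0.0, 1.0)]]
    calls = simulate_calls(kernels, base_rate=0.5, n_steps=500)
    ```

    Fitting such kernels to recorded call times, rather than simulating from chosen ones, is what distinguishes the model-based analysis from plain cross-correlation.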

  11. The last scream: the distress call of a probably extinct Brazilian anuran (Holoaden bradei Lutz, 1958).

    PubMed

    Martinelli, Amanda; Toledo, Luís Felipe

    2016-11-03

    The genus Holoaden Miranda-Ribeiro (Anura, Craugastoridae, Holoadeninae) includes four species endemic to the southeastern Brazilian Atlantic Forest, which occur in cloud forests at high elevations (Lutz 1958, Pombal et al. 2008, Martins & Zaher 2013). Of these, two species are considered threatened by the Ministério do Meio Ambiente (2014): H. bradei is classified as critically endangered (CR) and H. luederwaldti as endangered (EN). Holoaden bradei may already be extinct in the wild, as it has not been recorded in the last 40 years in spite of intense scientific activity within its original distribution range (Rocha & van Sluys 2004). The advertisement call has been described only for Holoaden luederwaldti (Martins 2010). Call descriptions, especially of advertisement calls, are important sources of evidence in taxonomic and phylogenetic studies (Roy 1996, Toledo et al. 2007, Andrade et al. 2016). However, there are other call types (see classification in Toledo et al. 2015) that can be used in the absence of advertisement calls (e.g., Grenat & Martino 2013). We recently had access to a recording, made in the 1960s, of the distress call of H. bradei. We hereby describe this call.

  12. The acoustic adaptation hypothesis in a widely distributed South American frog: Southernmost signals propagate better.

    PubMed

    Velásquez, Nelson A; Moreno-Gómez, Felipe N; Brunetti, Enzo; Penna, Mario

    2018-05-03

    Animal communication occurs in environments that affect the properties of signals as they propagate from senders to receivers. We studied the geographic variation of the advertisement calls of male Pleurodema thaul individuals from eight localities in Chile. Furthermore, by means of signal propagation experiments, we tested the hypothesis that local calls are better transmitted and less degraded than foreign calls (i.e. acoustic adaptation hypothesis). Overall, the advertisement calls varied greatly along the distribution of P. thaul in Chile, and it was possible to discriminate localities grouped into northern, central and southern stocks. Propagation distance affected signal amplitude and spectral degradation in all localities, but temporal degradation was only affected by propagation distance in one out of seven localities. Call origin affected signal amplitude in five out of seven localities and affected spectral and temporal degradation in six out of seven localities. In addition, in northern localities, local calls degraded more than foreign calls, and in southern localities the opposite was observed. The lack of a strict optimal relationship between signal characteristics and environment indicates partial concordance with the acoustic adaptation hypothesis. Inter-population differences in selectivity for call patterns may compensate for such environmental constraints on acoustic communication.

  13. Proposal for the Consolidation of ACPA & NASPA

    ERIC Educational Resources Information Center

    NASPA - Student Affairs Administrators in Higher Education, 2010

    2010-01-01

    This report presents an overview of a proposed consolidated, comprehensive student affairs association (called "New Association" in this report). The purpose of this report is to provide the memberships of ACPA and NASPA with a rationale and a proposed organization, professional development, and governance structure, based upon which the members…

  14. 76 FR 8713 - Pacific Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ... purpose of the joint conference call is to consider any CPS-related fisheries research proposals that will... consider adopting for public review any proposals that are submitted. The CPSMT and CPSAS will discuss any EFP proposals, and will develop statements to be included in the March Council meeting record. Special...

  15. 75 FR 13134 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-18

    ...] Proposed Data Collections Submitted for Public Comment and Recommendations In compliance with the... on proposed data collection projects, the Centers for Disease Control and Prevention (CDC) will... to obtain a copy of the data collection plans and instruments, call 404-639-5960 and send comments to...

  16. CINT - Center for Integrated Nanotechnologies

    Science.gov Websites


  17. 75 FR 8956 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ...] Proposed Data Collections Submitted for Public Comment and Recommendations In compliance with the... on proposed data collection projects, the Centers for Disease Control and Prevention (CDC) will... to obtain a copy of the data collection plans and instruments, call 404-639-5960 and send comments to...

  18. 76 FR 40917 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ...] Proposed Data Collections Submitted for Public Comment and Recommendations In compliance with the... on proposed data collection projects, the Centers for Disease Control and Prevention (CDC) will... to obtain a copy of the data collection plans and instruments, call 404-639-5960 or send comments to...

  19. 77 FR 66467 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ...] Proposed Data Collections Submitted for Public Comment and Recommendations In compliance with the... on proposed data collection projects, the Centers for Disease Control and Prevention (CDC) will... to obtain a copy of the data collection plans and instruments, call 404-639-7570 and send comments to...

  20. 75 FR 13502 - Takes of Marine Mammals Incidental to Specified Activities; Manette Bridge Replacement in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ...), Commerce. ACTION: Notice; proposed incidental harassment authorization; request for comments. SUMMARY: NMFS... proposed action area could be affected by the proposed bridge replacement activities, the WSDOT is seeking... in the concrete has resulted from a process called Alkali Silica Reaction (ASR). ASR causes...

  1. 76 FR 34995 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-15

    ... call volume data from the National Quitline Data Warehouse (NQDW, OMB No. 0920-0856, exp. 7/31/2012...] Proposed Data Collections Submitted for Public Comment and Recommendations In compliance with the... on proposed data collection projects, the Centers for Disease Control and Prevention (CDC) will...

  2. 78 FR 26572 - Rural Call Completion and List of Rural Operating Carrier Numbers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-07

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 64 [WC Docket No. 13-39; DA 13-780] Rural Call Completion and List of Rural Operating Carrier Numbers AGENCY: Federal Communications Commission. ACTION: Proposed rule. SUMMARY: In this document, the Federal Communications Commission's Wireline Competition...

  3. ESA seeks gravitational-wave proposals

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2016-12-01

    The European Space Agency (ESA) has put out a call for European scientists to submit proposals for the first space mission to observe gravitational waves - ripples in the fabric of space-time created by accelerating massive objects.

  4. Training Needs Analysis and Evaluation for New Technologies through the Use of Problem-Based Inquiry

    ERIC Educational Resources Information Center

    Casey, Matthew Scott; Doverspike, Dennis

    2005-01-01

    The analysis of calls to a help desk, in this case calls to a computer help desk, can serve as a rich source of information on the real world problems that individuals are having with the implementation of a new technology. Thus, we propose that an analysis of help desk calls, a form of problem-based inquiry, can serve as a fast and low cost means…

  5. Direct Importance Estimation with Gaussian Mixture Models

    NASA Astrophysics Data System (ADS)

    Yamada, Makoto; Sugiyama, Masashi

    The ratio of two probability densities is called the importance, and its estimation has gathered a great deal of attention recently since the importance can be used for various data processing purposes. In this paper, we propose a new importance estimation method using Gaussian mixture models (GMMs). Our method is an extension of the Kullback-Leibler importance estimation procedure (KLIEP), an importance estimation method using linear or kernel models. An advantage of GMMs is that covariance matrices can also be learned through an expectation-maximization procedure, so the proposed method, which we call the Gaussian mixture KLIEP (GM-KLIEP), is expected to work well when the true importance function has high correlation. Through experiments, we show the validity of the proposed approach.
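    KLIEP fits the importance function directly, which this sketch does not attempt. Instead, as a hedged illustration of the underlying quantity, the code below forms a naive plug-in estimate: fit a single Gaussian (a one-component "mixture") to each sample and take the ratio of the fitted densities. The data and parameters are illustrative assumptions:

    ```python
    import math
    import random
    import statistics

    def fit_gaussian(xs):
        """Fit a single Gaussian by sample mean and (population) std."""
        return statistics.fmean(xs), statistics.pstdev(xs)

    def gauss_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

    def importance(train, test, xs):
        """Plug-in importance w(x) = p_test(x) / p_train(x) from the fitted densities."""
        mu_tr, sd_tr = fit_gaussian(train)
        mu_te, sd_te = fit_gaussian(test)
        return [gauss_pdf(x, mu_te, sd_te) / gauss_pdf(x, mu_tr, sd_tr) for x in xs]

    random.seed(0)
    train = [random.gauss(0.0, 1.0) for _ in range(5000)]  # training density ~ N(0, 1)
    test = [random.gauss(1.0, 1.0) for _ in range(5000)]   # test density ~ N(1, 1)
    w = importance(train, test, [0.0, 1.0])
    print(w[1] > w[0])  # points typical under the test density get larger weights
    ```

    Direct methods such as KLIEP avoid this two-step density estimation, which is exactly why they are preferred when only the ratio is needed.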

  6. Tiu Valles

    NASA Image and Video Library

    2002-11-26

    The ancient, catastrophic floods on Mars, whose origins remain a mystery, produced a channeled and scoured landscape like this one, which is called Tiu Valles and was imaged by NASA Mars Odyssey spacecraft. http://photojournal.jpl.nasa.gov/catalog/PIA04013

  7. Fuzzy object models for newborn brain MR image segmentation

    NASA Astrophysics Data System (ADS)

    Kobashi, Syoji; Udupa, Jayaram K.

    2013-03-01

    Newborn brain MR image segmentation is a challenging problem because of the variety of sizes, shapes, and MR signals, although it is a fundamental step for quantitative radiology of brain MR images. Because of the large difference between the adult brain and the newborn brain, it is difficult to directly apply conventional methods to the newborn brain. Inspired by the original fuzzy object model introduced by Udupa et al. at SPIE Medical Imaging 2011, called the fuzzy shape object model (FSOM) here, this paper introduces the fuzzy intensity object model (FIOM) and proposes a new image segmentation method which combines the FSOM and FIOM into fuzzy connected (FC) image segmentation. The fuzzy object models are built from training datasets in which the cerebral parenchyma is delineated by experts. After registering the FSOM with the image under evaluation, the proposed method roughly recognizes the cerebral parenchyma region based on prior knowledge of location, shape, and MR signal given by the registered FSOM and FIOM. Then, FC image segmentation delineates the cerebral parenchyma using the fuzzy object models. The proposed method has been evaluated on 9 newborn brain MR images using the leave-one-out strategy. The revised age was between -1 and 2 months. Quantitative evaluation using false positive volume fraction (FPVF) and false negative volume fraction (FNVF) has been conducted: a FPVF of 0.75% and an FNVF of 3.75% were achieved. More data collection and testing are underway.

  8. Multi-layer cube sampling for liver boundary detection in PET-CT images.

    PubMed

    Liu, Xinxin; Yang, Jian; Song, Shuang; Song, Hong; Ai, Danni; Zhu, Jianjun; Jiang, Yurong; Wang, Yongtian

    2018-06-01

    Liver metabolic information is considered a crucial diagnostic marker for the diagnosis of fever of unknown origin, and liver recognition is the basis of automatic extraction of metabolic information. However, the poor quality of PET and CT images is a challenge for information extraction and target recognition in PET-CT images. Existing detection methods cannot meet the requirements of liver recognition in PET-CT images, which is a key problem in the big-data analysis of PET-CT images. A novel texture feature descriptor called multi-layer cube sampling (MLCS) is developed for liver boundary detection in low-dose CT and PET images. The cube sampling feature is proposed for extracting more texture information, using a bi-centric voxel strategy. Neighbour voxels are divided into three regions by the centre voxel and the reference voxel in the histogram, and the voxel distribution information is statistically classified as a texture feature. Multi-layer texture features are also used to improve the ability and adaptability of target recognition in volume data. The proposed feature is tested on PET and CT images for liver boundary detection. For the liver in the volume data, the mean detection rate (DR) and mean error rate (ER) reached 95.15% and 7.81% in low-quality PET images, and 83.10% and 21.08% in low-contrast CT images. The experimental results demonstrated that the proposed method is effective and robust for liver boundary detection.

  9. "Pulse pair technique in high resolution NMR" a reprint of the historical 1971 lecture notes on two-dimensional spectroscopy.

    PubMed

    Jeener, Jean; Alewaeters, Gerrit

    2016-05-01

    The review articles published in "Progress in NMR Spectroscopy" are usually invited treatments of topics of current interest, but occasionally the Editorial Board may take an initiative to publish important historical material that is not widely available. The present article represents just such a case. Jean Jeener gave a lecture in 1971 at a summer school in Basko Polje, in what was then called Yugoslavia. As is now widely known, in that lecture Jean Jeener laid down the foundations of two- and higher-dimensional NMR spectroscopy by proposing the homonuclear COSY experiment. Jeener realized that the new proposal would open the door towards protein NMR and molecular structure determinations, but he felt that useful versions of such experiments could not be achieved with the NMR, computer and electronics technology available at that time, so copies of the lecture notes were circulated (the Basko Polje lecture notes by J. Jeener and G. Alewaeters), but no formal publication followed. Fortunately, Ernst, Freeman, Griffin, and many others were more far-sighted and optimistic. An early useful extension was Ernst's proposal, inspired by the Basko Polje lecture, to replace the original projection/reconstruction technique of MRI by the widely adopted Fourier transform method. Later, the pulse method spread over many fields of spectroscopy as soon as the required technology became available. Jean Jeener, Emeritus professor, Université Libre de Bruxelles. Geoffrey Bodenhausen, Ecole Normale Supérieure, Paris. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. 78 FR 36562 - 30-Day Notice of Proposed Information Collection: Home Equity Conversion Mortgage (HECM...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-18

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5683-N-44] 30-Day Notice of Proposed Information Collection: Home Equity Conversion Mortgage (HECM) Insurance Application for the Origination of... Information Collection: Home Equity Conversion Mortgage (HECM) Insurance Application for the Origination of...

  11. 76 FR 40935 - Agency Information Collection Activities: Proposed Collection Comments Requested Race and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ...] Agency Information Collection Activities: Proposed Collection Comments Requested Race and National Origin... collection. (2) Title of the Form/Collection: Race and National Origin Identification. (3) Agency form number.... Other: none. Need for Collection: The information collection is used to maintain Race and National...

  12. From where to what: a neuroanatomically based evolutionary model of the emergence of speech in humans

    PubMed Central

    Poliva, Oren

    2017-01-01

    In the brain of primates, the auditory cortex connects with the frontal lobe via the temporal pole (auditory ventral stream; AVS) and via the inferior parietal lobe (auditory dorsal stream; ADS). The AVS is responsible for sound recognition, and the ADS for sound-localization, voice detection and integration of calls with faces. I propose that the primary role of the ADS in non-human primates is the detection and response to contact calls. These calls are exchanged between tribe members (e.g., mother-offspring) and are used for monitoring location. Detection of contact calls occurs by the ADS identifying a voice, localizing it, and verifying that the corresponding face is out of sight. Once a contact call is detected, the primate produces a contact call in return via descending connections from the frontal lobe to a network of limbic and brainstem regions. Because the ADS of present day humans also performs speech production, I further propose an evolutionary course for the transition from contact call exchange to an early form of speech. In accordance with this model, structural changes to the ADS endowed early members of the genus Homo with partial vocal control. This development was beneficial as it enabled offspring to modify their contact calls with intonations for signaling high or low levels of distress to their mother. Eventually, individuals were capable of participating in yes-no question-answer conversations. In these conversations the offspring emitted a low-level distress call for inquiring about the safety of objects (e.g., food), and his/her mother responded with a high- or low-level distress call to signal approval or disapproval of the interaction. Gradually, the ADS and its connections with brainstem motor regions became more robust and vocal control became more volitional. Speech emerged once vocal control was sufficient for inventing novel calls. PMID:28928931

  13. Controlling Decoherence in Superconducting Qubits: Phenomenological Model and Microscopic Origin of 1/f Noise

    DTIC Science & Technology

    2011-04-28

    quasiparticle poisoning which include a completely novel physical origin of these noises. We also proposed a model for excess low frequency flux noise which...metallic nanomechanical resonators, Phys. Rev. B 81, 184112 (2010). 3) L. Faoro, A. Kitaev and L. B. Ioffe, Quasiparticle poisoning and Josephson current

  14. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    PubMed Central

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear, especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits only from the parametric inverse-probability bagging proposed here. For other classifiers, correction is mostly advantageous, and the methods perform comparably. We discuss the consequences of inappropriate distribution assumptions and the reasons for the different behavior of the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
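
    The inverse-probability idea behind the proposed resampling corrections can be sketched in a few lines: records from strata that were undersampled in the two-phase design are drawn more often, with weight proportional to the inverse of their sampling probability. The function below is a hypothetical illustration, not the paper's sambia implementation, and it omits the stochastic noise the authors add to preserve the covariance structure.

```python
import random

def ip_oversample(records, sampling_prob, n_out, seed=0):
    """Resample with replacement, weighting each record by the inverse of
    its stratum's sampling probability, so the resample approximates the
    original (population) class balance. Hypothetical sketch."""
    rng = random.Random(seed)
    weights = [1.0 / sampling_prob[r["stratum"]] for r in records]
    return rng.choices(records, weights=weights, k=n_out)

# Toy two-phase design: cases fully sampled (p = 1.0), controls at p = 0.1,
# so controls are under-represented ten-fold in the biased sample.
biased = [{"stratum": "case"}] * 50 + [{"stratum": "control"}] * 50
resample = ip_oversample(biased, {"case": 1.0, "control": 0.1}, n_out=2000)
frac_controls = sum(r["stratum"] == "control" for r in resample) / len(resample)
```

    After correction the control fraction approaches the population value 10/11 rather than the biased sample's 1/2.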

  15. 3-D discrete analytical ridgelet transform.

    PubMed

    Helbert, David; Carré, Philippe; Andres, Eric

    2006-12-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform with the discrete analytical geometry theory by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin defined from their orthogonal projections, and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potential of this new discrete transform, we apply the 3-D DART and its extension, the Local-DART (with smooth windowing), to the denoising of 3-D images and color video. These experimental results show that the simple thresholding of the 3-D DART coefficients is efficient.
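
    The notion of arithmetical thickness can be illustrated with the 2-D analogue of these discrete analytical lines (the Reveillès arithmetic line); the 3-D DART builds its radial lines and covering planes from the same kind of double inequality. The helper below is a hedged sketch, not the authors' code.

```python
def on_arithmetic_line(x, y, a, b, mu, omega):
    """Reveillès-style discrete analytical line: the integer points (x, y)
    satisfying mu <= a*x - b*y < mu + omega, where omega is the
    arithmetical thickness controlling how 'thick' the digital line is."""
    return mu <= a * x - b * y < mu + omega

# A "naive" line (omega = max(|a|, |b|)) for direction (a, b) = (2, 3):
pts = [(x, y) for x in range(6) for y in range(6)
       if on_arithmetic_line(x, y, 2, 3, 0, 3)]
```

    Increasing omega thickens the digital line, which is how the transform trades redundancy for adaptability.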

  16. Classification of Microarray Data Using Kernel Fuzzy Inference System

    PubMed Central

    Kumar Rath, Santanu

    2014-01-01

    The DNA microarray classification technique has gained popularity in both research and practice. Real datasets such as microarray data contain a huge number of insignificant and irrelevant features that can obscure useful information. Feature selection retains the features with high relevance and significance, which determine the classification of samples into their respective classes. In this paper, the kernel fuzzy inference system (K-FIS) algorithm is applied to classify microarray data (leukemia), using the t-test as a feature selection method. Kernel functions are used to map the original data points into a higher-dimensional (possibly infinite-dimensional) feature space defined by a (usually nonlinear) function ϕ through a mathematical process called the kernel trick. This paper also presents a comparative study of classification using K-FIS and the support vector machine (SVM) for different sets of features (genes). Performance parameters available in the literature, such as precision, recall, specificity, F-measure, ROC curve, and accuracy, are considered to analyze the efficiency of the classification model. The K-FIS model obtains results similar to those of the SVM model, an indication that the proposed approach relies on the kernel function. PMID:27433543
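
    The kernel trick mentioned here can be made concrete with the Gaussian (RBF) kernel: the inner product in the implicit feature space is computed directly from the input points, without ever constructing ϕ. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def rbf_kernel(u, v, gamma=1.0):
    """Gaussian (RBF) kernel: an inner product in an implicit, possibly
    infinite-dimensional feature space, evaluated without forming phi(u)
    or phi(v) -- the 'kernel trick'."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-gamma * sq_dist)

def gram_matrix(points, gamma=1.0):
    """Pairwise kernel evaluations; kernel methods such as SVM or K-FIS
    operate on this matrix instead of on explicit feature vectors."""
    return [[rbf_kernel(u, v, gamma) for v in points] for u in points]

K = gram_matrix([(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)])
```

    The Gram matrix is symmetric with unit diagonal, which is all a kernel classifier needs from the data.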

  17. Adaptive Suspicious Prevention for Defending DoS Attacks in SDN-Based Convergent Networks

    PubMed Central

    Dao, Nhu-Ngoc; Kim, Joongheon; Park, Minho; Cho, Sungrae

    2016-01-01

    The convergent communication network will play an important role as a single platform to unify heterogeneous networks and integrate emerging technologies and existing legacy networks. Although many feasible solutions have been proposed, they have not become convergent frameworks, since they mainly focused on converting functions between various protocols and interfaces in edge networks, and on handling functions for multiple services in core networks, e.g., the Multi-protocol Label Switching (MPLS) technique. Software-defined networking (SDN), on the other hand, is expected to be the ideal foundation for the convergent network since it can provide a controllable, dynamic, and cost-effective network. However, behind its many advantages, SDN has an inherent structural vulnerability: the centralized control plane. As the brain of the network, the controller manages the whole network, which makes it attractive to attackers. In this context, we propose a novel solution called the adaptive suspicious prevention (ASP) mechanism to protect the controller from the Denial of Service (DoS) attacks that could incapacitate an SDN. The ASP is integrated with the OpenFlow protocol to detect and prevent DoS attacks effectively. Our comprehensive experimental results show that the ASP enhances the resilience of an SDN network against DoS attacks by up to 38%. PMID:27494411

  18. HemoVision: An automated and virtual approach to bloodstain pattern analysis.

    PubMed

    Joris, Philip; Develter, Wim; Jenar, Els; Suetens, Paul; Vandermeulen, Dirk; Van de Voorde, Wim; Claes, Peter

    2015-06-01

    Bloodstain pattern analysis (BPA) is a subspecialty of forensic sciences, dealing with the analysis and interpretation of bloodstain patterns in crime scenes. The aim of BPA is uncovering new information about the actions that took place in a crime scene, potentially leading to a confirmation or refutation of a suspect's statement. A typical goal of BPA is to estimate the flight paths for a set of stains, followed by a directional analysis in order to estimate the area of origin for the stains. The traditional approach, referred to as stringing, consists of attaching a piece of string to each stain, and letting the string represent an approximation of the stain's flight path. Even though stringing has been used extensively, many (practical) downsides exist. We propose an automated and virtual approach, employing fiducial markers and digital images. By automatically reconstructing a single coordinate frame from several images, limited user input is required. Synthetic crime scenes were created and analysed in order to evaluate the approach. Results demonstrate the correct operation and practical advantages, suggesting that the proposed approach may become a valuable asset for practically analysing bloodstain spatter patterns. Accompanying software called HemoVision is currently provided as a demonstrator and will be further developed for practical use in forensic investigations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Adaptive Suspicious Prevention for Defending DoS Attacks in SDN-Based Convergent Networks.

    PubMed

    Dao, Nhu-Ngoc; Kim, Joongheon; Park, Minho; Cho, Sungrae

    2016-01-01

    The convergent communication network will play an important role as a single platform to unify heterogeneous networks and integrate emerging technologies and existing legacy networks. Although many feasible solutions have been proposed, they have not become convergent frameworks, since they mainly focused on converting functions between various protocols and interfaces in edge networks, and on handling functions for multiple services in core networks, e.g., the Multi-protocol Label Switching (MPLS) technique. Software-defined networking (SDN), on the other hand, is expected to be the ideal foundation for the convergent network since it can provide a controllable, dynamic, and cost-effective network. However, behind its many advantages, SDN has an inherent structural vulnerability: the centralized control plane. As the brain of the network, the controller manages the whole network, which makes it attractive to attackers. In this context, we propose a novel solution called the adaptive suspicious prevention (ASP) mechanism to protect the controller from the Denial of Service (DoS) attacks that could incapacitate an SDN. The ASP is integrated with the OpenFlow protocol to detect and prevent DoS attacks effectively. Our comprehensive experimental results show that the ASP enhances the resilience of an SDN network against DoS attacks by up to 38%.

  20. Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units.

    PubMed

    Liu, Yang; Chiaromonte, Francesca; Li, Bing

    2017-06-01

    In many scientific and engineering fields, advanced experimental and computing technologies are producing data that are not just high dimensional, but also internally structured. For instance, statistical units may have heterogeneous origins from distinct studies or subpopulations, and features may be naturally partitioned based on the experimental platforms generating them, or on information available about their roles in a given phenomenon. In a regression analysis, exploiting this known structure in the predictor dimension reduction stage that precedes modeling can be an effective way to integrate diverse data. To pursue this, we propose a novel Sufficient Dimension Reduction (SDR) approach that we call structured Ordinary Least Squares (sOLS). This combines ideas from the existing SDR literature to merge reductions performed within groups of samples and/or predictors. In particular, it leads to a version of OLS for grouped predictors that requires far less computation than recently proposed groupwise SDR procedures, and provides an informal yet effective variable selection tool in these settings. We demonstrate the performance of sOLS by simulation and present a first application to genomic data. The R package "sSDR," publicly available on CRAN, includes all procedures necessary to implement the sOLS approach. © 2016, The International Biometric Society.
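
    One way to picture "OLS for grouped predictors" is to fit an ordinary least-squares projection within each predictor block and keep the fitted linear combination as that block's one-dimensional reduction. This is a loose sketch of the groupwise idea under stated assumptions, not the sOLS estimator itself; all names are illustrative.

```python
import numpy as np

def groupwise_ols_reduction(X, y, groups):
    """For each block of predictor columns, regress y on that block alone
    and keep the fitted linear combination X_g @ beta_g as the block's
    reduction. A sketch of groupwise OLS reduction, not the paper's sOLS."""
    reduced = []
    for cols in groups:
        Xg = X[:, cols]
        beta_g, *_ = np.linalg.lstsq(Xg, y, rcond=None)
        reduced.append(Xg @ beta_g)
    return np.column_stack(reduced)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = X[:, 0] + 2.0 * X[:, 3] + 0.1 * rng.normal(size=100)
# Two predictor blocks; each is collapsed to one derived direction.
Z = groupwise_ols_reduction(X, y, groups=[[0, 1, 2], [3, 4, 5]])
```

    Each column of Z summarizes one predictor block, so downstream modeling sees 2 derived predictors instead of 6.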

  1. Total variation-based method for radar coincidence imaging with model mismatch for extended target

    NASA Astrophysics Data System (ADS)

    Cao, Kaicheng; Zhou, Xiaoli; Cheng, Yongqiang; Fan, Bo; Qin, Yuliang

    2017-11-01

    Originating from traditional optical coincidence imaging, radar coincidence imaging (RCI) is a staring/forward-looking imaging technique. In RCI, the reference matrix must be computed precisely to reconstruct the image as desired; unfortunately, such precision is almost impossible in practice due to model mismatch. Although some conventional sparse recovery algorithms have been proposed to solve the model-mismatch problem, they are inapplicable to nonsparse targets. We therefore derive the signal model of RCI with model mismatch by replacing the sparsity constraint term with total variation (TV) regularization in the sparse total least squares optimization problem; in this manner, we obtain the objective function of RCI with model mismatch for an extended target. A more robust and efficient algorithm called TV-TLS is proposed, in which the objective function is divided into two parts and the perturbation matrix and scattering coefficients are updated alternately. Moreover, because TV regularization can recover a sparse signal or an image with a sparse gradient, the TV-TLS method is also applicable to sparse recovery. Results of numerical experiments demonstrate that, for uniform extended targets, sparse targets, and real extended targets, the algorithm achieves preferred imaging performance both in suppressing noise and in adapting to model mismatch.

  2. Anticlockwise or clockwise? A dynamic Perception-Action-Laterality model for directionality bias in visuospatial functioning.

    PubMed

    Karim, A K M Rezaul; Proulx, Michael J; Likova, Lora T

    2016-09-01

    Orientation bias and directionality bias are two fundamental functional characteristics of the visual system. Reviewing the relevant literature in visual psychophysics and visual neuroscience, we propose here a three-stage model of directionality bias in visuospatial functioning. We call this model the 'Perception-Action-Laterality' (PAL) hypothesis. We analyzed the research findings for a wide range of visuospatial tasks, showing that there are two major directionality trends in perceptual preference: clockwise versus anticlockwise. It appears these preferences are combinatorial, such that a majority of people fall in the first category, demonstrating a preference for stimuli/objects arranged from left-to-right rather than from right-to-left, while people in the second category show an opposite trend. These perceptual biases can guide sensorimotor integration and action, creating two corresponding turner groups in the population. In support of PAL, we propose another model explaining the origins of the biases - how the neurogenetic factors and the cultural factors interact in a biased competition framework to determine the direction and extent of biases. This dynamic model can explain not only the two major categories of biases in terms of direction and strength, but also the unbiased, unreliably biased or mildly biased cases in visuospatial functioning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Crossing Fibers Detection with an Analytical High Order Tensor Decomposition

    PubMed Central

    Megherbi, T.; Kachouane, M.; Oulebsir-Boumghar, F.; Deriche, R.

    2014-01-01

    Diffusion magnetic resonance imaging (dMRI) is the only technique to probe in vivo and noninvasively the fiber structure of human brain white matter. Detecting the crossing of neuronal fibers remains an exciting challenge with an important impact in tractography. In this work, we tackle this challenging problem and propose an original and efficient technique to extract all crossing fibers from diffusion signals. To this end, we start by estimating, from the dMRI signal, the so-called Cartesian tensor fiber orientation distribution (CT-FOD) function, whose maxima correspond exactly to the orientations of the fibers. The fourth order symmetric positive definite tensor that represents the CT-FOD is then analytically decomposed via the application of a new theoretical approach, and this decomposition is used to accurately extract all the fibers orientations. Our proposed high order tensor decomposition based approach is minimal and allows recovering the whole set of crossing fibers without any a priori information on the total number of fibers. Various experiments performed on noisy synthetic data, on phantom diffusion data, and on human brain data validate our approach and clearly demonstrate that it is efficient, robust to noise, and performs favorably in terms of angular resolution and accuracy when compared to some classical and state-of-the-art approaches. PMID:25246940

  4. Midwest Forensics Resource Center Project Summary June 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Baldwin

    The mission of the MFRC Research and Development Program is to provide technological advances in forensic science for the benefit of our regional partners as well as the forensic community at large. Key areas of forensic science need are identified through our interactions with our Midwest partners and our R&D advisory group, as well as through our participation in national meetings in forensic science. Under the sponsorship of the National Institute of Justice, the MFRC solicits proposals for the development of practical and useful technology, instrumentation, and methodology that address needs in areas related to forensic science and its application to operational crime laboratories. The MFRC facilitates proposal development by working to establish partnerships between researchers and our regional partners. The MFRC administers a peer review of the proposals and then funds the selected projects at a cost of approximately $55,000 each, with a 12-month period of performance. The process for selection of these projects includes the following steps: (1) drafting of a call for proposals by MFRC staff, (2) review of the draft call by members of the R&D advisory committee, (3) review and approval of the call by NIJ, (4) issuance of the call to ISU, Ames Laboratory, regional partners, and research organizations, (5) receipt of proposals, (6) review of proposals by the R&D advisory committee, (7) ranking and selection by MFRC staff using advisory committee reviews, with concurrence by NIJ, (8) notification of proposers, (9) receipt and review of progress reports by MFRC, (10) receipt and review of final reports by MFRC, the R&D advisory committee, and NIJ. The decision to fund any specific project is based upon a peer-reviewed call-for-proposal system administered by the MFRC.
    The reviewers are crime laboratory specialists and scientists who are asked to rate the proposals on four criteria: (1) relevance to the mission of the MFRC, (2) technical approach and procedures, (3) capabilities, teaming, and leveraging, and (4) implementation plan. A successful proposal demonstrates knowledge of the background for the research and related work in the field, and includes a research plan with a defined plan to implement the technology to benefit our partners at the crime laboratories. The project summaries are meant to demonstrate the range of research funded by the MFRC, including chemistry, DNA, and patterned evidence. They describe the forensic need the projects serve as well as the benefits derived from the technology, provide a brief description of the technology and the accomplishments to date, and highlight the collaboration with regional partners and the status of the implementation of the technology. These technical summaries represent the development and implementation of practical and useful technology for crime laboratories that the MFRC hopes to accomplish.

  5. On the origin of comets

    NASA Technical Reports Server (NTRS)

    Mendis, A.; Alfven, H.

    1976-01-01

    Physico-chemical processes leading to the dynamic formation and physical evolution of comets are reviewed in relation to the various theories proposing solar, protoplanetary, planetary and interstellar origins. Evidence points to the origin of comets in the growth and agglomeration of small particles from gas and dust at very low temperatures in as-yet-undetermined regions of space.

  6. Rivers and valleys of Pennsylvania, revisited

    NASA Astrophysics Data System (ADS)

    Morisawa, Marie

    1989-09-01

    The 1889 paper by William Morris Davis on the "Rivers and Valleys of Pennsylvania" is a landmark in the history of geomorphology. It was in this manuscript that he set forth what came to be known as the Davisian system of landscape. It is important to understand that Davis' interpretation of landforms was restricted by the geologic paradigms of his day. Uniformitarianism was strongly entrenched and Darwin's theory of evolution had become popularly accepted. The concept of the landmass Appalachia and then current theories on mountain building affected the approach that Davis took in hypothesizing the origin and development of the Folded Appalachian drainage. All of these geologic precepts influenced the formulation and explanation of his theories. In his exposition he adapted, synthesized and embellished on ideas he derived from fellow geologists such as Gilbert, Dutton, Powell, and McGee. A number of the concepts he proposed in the 1889 paper quickly became the bases for geomorphic studies by others: the cycles of river erosion and landscape evolution and the peneplain (here called base level erosion). The cycle of erosion became the model for subsequent geomorphic analyses, and peneplain hunting became a popular sport for geomorphologists. Davis' hypothesis of the origin and development of Pennsylvanian drainage stimulated subsequent discussion and further hypotheses by others. In fact, many of the later theories were refinements and/or elaborations of ideas mentioned in this paper of Davis. He proposed the origin of the drainage as consequent streams, then antecedence, superposition, headward extension of divides by piracy, erosion along lines of weaknesses (faults, easily erodible beds) through resistant ridges and normal fluvial erosion. 
    Thus, the hypotheses of regional superposition (Johnson), extended consequents (Ruedemann), consequents and local superposition (Meyerhoff and Olmstead), the utilization of structural weaknesses in the development of transverse drainage (Thompson; Meyerhoff; Oberlander, among others), and migration of divides (Thompson) had all been suggested by Davis in 1889. Although the concepts of erosion cycles and peneplanation have waned in popularity in recent geomorphic research, the principles of the formation of water and wind gaps, headward migration of divides, stream piracy and the adjustment of streams to structure, so clearly and minutely explained in his 1889 publication, are still viable today.

  7. 7 CFR 1737.50 - Review of completed loan application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 3017); (3) A market survey called the Area Coverage Survey (ACS); (4) The plan and associated costs for the proposed construction, called the Loan Design (LD); (5) Evidence that the borrower is... determine that the system design is acceptable to RUS, that the design is technically correct, that the cost...

  8. 7 CFR 1737.50 - Review of completed loan application.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 3017); (3) A market survey called the Area Coverage Survey (ACS); (4) The plan and associated costs for the proposed construction, called the Loan Design (LD); (5) Evidence that the borrower is... determine that the system design is acceptable to RUS, that the design is technically correct, that the cost...

  9. 7 CFR 1737.50 - Review of completed loan application.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 3017); (3) A market survey called the Area Coverage Survey (ACS); (4) The plan and associated costs for the proposed construction, called the Loan Design (LD); (5) Evidence that the borrower is... determine that the system design is acceptable to RUS, that the design is technically correct, that the cost...

  10. 7 CFR 1737.50 - Review of completed loan application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 3017); (3) A market survey called the Area Coverage Survey (ACS); (4) The plan and associated costs for the proposed construction, called the Loan Design (LD); (5) Evidence that the borrower is... determine that the system design is acceptable to RUS, that the design is technically correct, that the cost...

  11. 7 CFR 1737.50 - Review of completed loan application.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 3017); (3) A market survey called the Area Coverage Survey (ACS); (4) The plan and associated costs for the proposed construction, called the Loan Design (LD); (5) Evidence that the borrower is... determine that the system design is acceptable to RUS, that the design is technically correct, that the cost...

  12. Integrating Collaborative and Decentralized Models to Support Ubiquitous Learning

    ERIC Educational Resources Information Center

    Barbosa, Jorge Luis Victória; Barbosa, Débora Nice Ferrari; Rigo, Sandro José; de Oliveira, Jezer Machado; Rabello, Solon Andrade, Jr.

    2014-01-01

    The application of ubiquitous technologies in the improvement of education strategies is called Ubiquitous Learning. This article proposes the integration between two models dedicated to support ubiquitous learning environments, called Global and CoolEdu. CoolEdu is a generic collaboration model for decentralized environments. Global is an…

  13. Enhancing the Design and Analysis of Flipped Learning Strategies

    ERIC Educational Resources Information Center

    Jenkins, Martin; Bokosmaty, Rena; Brown, Melanie; Browne, Chris; Gao, Qi; Hanson, Julie; Kupatadze, Ketevan

    2017-01-01

    There are numerous calls in the literature for research into the flipped learning approach to match the flood of popular media articles praising its impact on student learning and educational outcomes. This paper addresses those calls by proposing pedagogical strategies that promote active learning in "flipped" approaches and improved…

  14. Modeling and Frequency Tracking of Marine Mammal Whistle Calls

    DTIC Science & Technology

    2009-02-01

    retrieve embedded information from watermarked synthetic whistle calls. Different fundamental frequency watermarking schemes are proposed based on...unmodified frequency contour is relatively constant, there is little frequency separation between information bits, and watermark retrieval requires...

  15. 77 FR 58072 - Finding of Substantial Inadequacy of Implementation Plan; Call for California State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-19

    ... Substantial Inadequacy of Implementation Plan; Call for California State Implementation Plan Revision; South Coast AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule. SUMMARY: In response to a... that the California State Implementation Plan (SIP) for the Los Angeles-South Coast Air Basin (South...

  16. 75 FR 44181 - Mevinphos; Proposed Data Call-in Order for Pesticide Tolerance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-28

    ... are most often collected in a study called the comparative cholinesterase assay (CCA). Since that time....1520 Processing studies Not Required 24 months (tomatoes) 870.6300 Comparative 6 months 12 months... mevinphos including: 1. A developmental neurotoxicity (DNT) study in rats (with expanded protocol to extend...

  17. A Plea for "Close Learning"

    ERIC Educational Resources Information Center

    Newstok, Scott L.

    2013-01-01

    There is a personal, human element to liberal education, what John Henry Newman once called the "living voice, the breathing form, the expressive countenance." Those who cherish personalized instruction would benefit from a phrase to defend and promote the practice. Author Scott Newstok proposes in this article that we begin calling it…

  18. Economic evaluation of the differential benefits of home visits with telephone calls and telephone calls only in transitional discharge support.

    PubMed

    Wong, Frances Kam Yuet; So, Ching; Chau, June; Law, Antony Kwan Pui; Tam, Stanley Ku Fu; McGhee, Sarah

    2015-01-01

    Home visits and telephone calls are two frequently used approaches in transitional care, but their differential economic effects are unknown. We examined the differential economic benefits of home visits with telephone calls and telephone calls only in transitional discharge support, via a cost-effectiveness analysis conducted alongside a randomised controlled trial (RCT). Patients discharged from medical units were randomly assigned to control (control, N = 210), home visits with calls (home, N = 196) and calls only (call, N = 204). Cost-effectiveness analyses were conducted from the societal perspective, comparing monetary benefits and quality-adjusted life years (QALYs) gained. The home arm was less costly but less effective at 28 days and was dominating (less costly and more effective) at 84 days. The call arm was dominating at both 28 and 84 days. The incremental QALY for the home arm was -0.0002/0.0008 (28/84 days), and for the call arm 0.0022/0.0104 (28/84 days). When the three groups were compared, the call arm had a higher probability of being cost-effective at 84 days but not at 28 days (home: 53%, call: 35% (28 days) versus home: 22%, call: 73% (84 days)), measured against the NICE threshold of £20,000. The original RCT showed that the bundled intervention involving home visits and calls was more effective than calls only in reducing hospital readmissions. This study adds a cost perspective to inform policymakers that both home visits with calls and calls only are cost-effective for transitional care support, but calls only have a higher chance of remaining cost-effective for a sustained period after intervention. © The Author 2014. Published by Oxford University Press on behalf of the British Geriatrics Society.
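
    The dominance logic used in such analyses (an arm that is both cheaper and more effective "dominates" and needs no ratio) and the incremental cost-effectiveness ratio (ICER) can be sketched in a few lines. The numbers in the example are hypothetical, not the trial's.

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained,
    relative to a reference arm. Returns the string 'dominant' when the
    new arm is no more costly and no less effective."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"
    return d_cost / d_qaly

# Hypothetical arms versus a control costing 1000 with 0.5 QALYs:
call_arm = icer(950, 0.5104, 1000, 0.5)    # cheaper AND more effective
home_arm = icer(1100, 0.5008, 1000, 0.5)   # costs 100 more for 0.0008 QALYs
```

    The home arm's ratio (125,000 per QALY in this toy example) would then be compared against a willingness-to-pay threshold such as NICE's £20,000.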

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sands, M.; Rees, J.

    A rather simple electronic bench experiment is proposed for obtaining a measure of the impulse energy loss of a stored particle bunch to an rf cavity or other vacuum-chamber structure--the so-called "cavity radiation". The proposed method is analyzed in some detail.

  20. REGIONAL EMAP PROPOSALS

    EPA Science Inventory

    The US EPA's Environmental Assessment and Monitoring Program (EMAP) annually funds regional EMAP (REMAP) projects through each of the regions to support the improvement of monitoring activities by the states. The last call for proposals emphasized the need to support biological m...

  1. Rotation invariants of vector fields from orthogonal moments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Bo; Kostková, Jitka; Flusser, Jan

    Vector field images are a type of new multidimensional data that appear in many engineering areas. Although vector fields can be visualized as images, they differ from graylevel and color images in several aspects. In order to analyze them, special methods and algorithms must be developed from scratch or substantially adapted from the traditional image processing area. Here, we propose a method for the description and matching of vector field patterns under an unknown rotation of the field. Rotation of a vector field is the so-called total rotation, where the action is applied not only to the spatial coordinates but also to the field values. Invariants of vector fields with respect to total rotation, constructed from orthogonal Gaussian–Hermite moments and Zernike moments, are introduced. Their numerical stability is shown to be better than that of the invariants published so far. We demonstrate their usefulness in a real-world template matching application with rotated vector fields.
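
    The "total rotation" defined here, where the rotation R acts on both the coordinates and the field values, v'(x) = R v(R⁻¹x), can be sketched directly; a purely radial field is a handy sanity check because it is invariant under any total rotation. This is a hypothetical illustration of the definition, unrelated to the authors' moment invariants.

```python
import numpy as np

def total_rotation(field, theta):
    """Totally rotate a 2-D vector field: the rotation R acts on the
    spatial coordinates AND on the vector values, v'(x) = R v(R^-1 x)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    Rinv = R.T  # rotation matrices are orthogonal

    def rotated(x):
        return R @ field(Rinv @ x)

    return rotated

def radial(x):
    """Radial field v(x) = x: invariant under total rotation by any angle."""
    return x

rot = total_rotation(radial, np.pi / 3)
x = np.array([1.0, 2.0])
```

    Rotating only the coordinates (an "ordinary" rotation) would not leave the radial field's values unchanged; totality is what the invariants are built around.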

  2. On gauged maximal d  =  8 supergravities

    NASA Astrophysics Data System (ADS)

    Lasso Andino, Óscar; Ortín, Tomás

    2018-04-01

    We study the gauging of maximal d = 8 supergravity using the embedding tensor formalism. We focus on SO(3) gaugings, study all the possible choices of gauge fields and construct explicitly the bosonic actions (including the complicated Chern–Simons terms) for all these choices, which are parametrized by a parameter associated with the 8-dimensional SL(2, {R}) duality group that relates all the possible choices, which are ultimately equivalent from the purely 8-dimensional point of view. Our result proves that the theory constructed by Salam and Sezgin by Scherk–Schwarz compactification of d = 11 supergravity and the theory constructed in Alonso-Alberca (2001 Nucl. Phys. B 602 329) by dimensional reduction of the so-called 'massive 11-dimensional supergravity' proposed by Meessen and Ortín in (1999 Nucl. Phys. B 541 195) are indeed related by an SL(2, {R}) duality, even though they have two completely different 11-dimensional origins.

  3. Progress in understanding the pathogenesis of Langerhans cell histiocytosis: back to Histiocytosis X?

    PubMed Central

    Berres, Marie-Luise; Merad, Miriam; Allen, Carl E.

    2016-01-01

    Langerhans cell histiocytosis (LCH), the most common histiocytic disorder, is characterized by the accumulation of CD1A+/CD207+ mononuclear phagocytes within granulomatous lesions that can affect nearly all organ systems. Historically, LCH has been presumed to arise from transformed or pathologically activated epidermal dendritic cells called Langerhans cells. However, new evidence supports a model in which LCH occurs as a consequence of a misguided differentiation programme of myeloid dendritic cell precursors. Genetic, molecular and functional data implicate activation of the ERK signalling pathway at critical stages in myeloid differentiation as an essential and universal driver of LCH pathology. Based on these findings, we propose that LCH should be re-defined as an inflammatory myeloid neoplasia. Increased understanding of LCH pathogenesis will provide opportunities to optimize and personalize therapy through improved risk-stratification, targeted therapy and assessment of therapy response based on specific molecular features and origin of the pathological myeloid cells. PMID:25430560

  4. Ontogenetic ritualization of primate gesture as a case study in dyadic brain modeling.

    PubMed

    Gasser, Brad; Cartmill, Erica A; Arbib, Michael A

    2014-01-01

    This paper introduces dyadic brain modeling - the simultaneous, computational modeling of the brains of two interacting agents - to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the 'dyad'). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuro-primatology.

  5. PhylArray: phylogenetic probe design algorithm for microarray.

    PubMed

    Militon, Cécile; Rimour, Sébastien; Missaoui, Mohieddine; Biderre, Corinne; Barra, Vincent; Hill, David; Moné, Anne; Gagne, Geneviève; Meier, Harald; Peyretaillade, Eric; Peyret, Pierre

    2007-10-01

    Microbial diversity is still largely unknown in most environments, such as soils. In order to gain access to this microbial 'black box', the development of powerful tools such as microarrays is necessary. However, the reliability of this approach relies on probe efficiency, in particular sensitivity, specificity and explorative power, in order to obtain an image of the microbial communities that is close to reality. We propose a new probe design algorithm that is able to select microarray probes targeting SSU rRNA at any phylogenetic level. This original approach, implemented in a program called 'PhylArray', designs a combination of degenerate and non-degenerate probes for each target taxon. Comparative experimental evaluations indicate that probes designed with PhylArray yield higher sensitivity and specificity than those designed by conventional approaches. Applying the combined PhylArray/GoArrays strategy helps to optimize the hybridization performance of short probes. Finally, hybridizations with environmental targets have shown that the PhylArray strategy can draw attention to even previously unknown bacteria.
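A degenerate probe of the kind PhylArray designs stands for a whole set of concrete sequences via IUPAC degeneracy codes. The following sketch (our illustration of that standard encoding, not part of PhylArray itself) enumerates the set covered by one degenerate probe:

```python
from itertools import product

# IUPAC nucleotide degeneracy codes (subset): each symbol maps to the
# set of concrete bases it covers.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
         "K": "GT", "M": "AC", "N": "ACGT"}

def expand_degenerate(probe):
    """Enumerate every non-degenerate sequence covered by a degenerate probe."""
    return ["".join(p) for p in product(*(IUPAC[b] for b in probe))]

# 'A' x {A,G} x {A,C,G,T} -> 8 concrete probes
print(expand_degenerate("ARN"))
```

The explorative power mentioned above comes from exactly this property: one degenerate probe can hybridize with variants not yet present in sequence databases.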

  6. Multifractal Cross Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Gao, Xing-Lu; Zhou, Wei-Xing; Stanley, H. Eugene

    Complex systems are composed of mutually interacting components and the output values of these components usually exhibit long-range cross-correlations. Using wavelet analysis, we propose a method of characterizing the joint multifractal nature of these long-range cross correlations, a method we call multifractal cross wavelet analysis (MFXWT). We assess the performance of the MFXWT method by performing extensive numerical experiments on the dual binomial measures with multifractal cross correlations and the bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. For binomial multifractal measures, we find the empirical joint multifractality of MFXWT to be in approximate agreement with the theoretical formula. For bFBMs, MFXWT may provide spurious multifractality because of the wide spanning range of the multifractal spectrum. We also apply the MFXWT method to stock market indices, and in pairs of index returns and volatilities we find an intriguing joint multifractal behavior. The tests on surrogate series also reveal that the cross correlation behavior, particularly the cross correlation with zero lag, is the main origin of cross multifractality.
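The binomial measures used as a multifractal test bed above are generated by a simple multiplicative cascade. A minimal sketch of the standard construction (our code, not the authors'):

```python
import numpy as np

def binomial_measure(p, k):
    """Deterministic binomial (multiplicative cascade) measure after k steps:
    each interval splits in two, its mass divided into fractions p and 1-p.
    Returns the mass in each of the 2**k cells at the finest level."""
    m = np.array([1.0])
    for _ in range(k):
        m = np.concatenate([p * m, (1.0 - p) * m])
    return m

mu = binomial_measure(0.3, 10)   # 2**10 = 1024 cells
print(mu.sum())                  # total mass is conserved (= 1.0)
```

Pairs of such measures built with correlated multipliers are what give the "dual binomial measures" their known joint multifractal spectrum, against which MFXWT can be checked.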

  7. Probabilistic Evaluation of Competing Climate Models

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Chatterjee, S.; Heyman, M.; Cressie, N.

    2017-12-01

    A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. Here, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data, over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. We compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set, as an illustration.
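The signal/noise partition plus parametric bootstrap can be caricatured in a few lines. The moving-average split and i.i.d. Gaussian noise model below are our simplifications for illustration, not the paper's actual wavelet-based procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def pseudo_realizations(series, n_boot=100, window=12):
    """Toy version of the evaluation scheme: split a time series into a
    smooth 'signal' (moving average) and residual 'noise', then build
    pseudo-realizations by adding parametric-bootstrap noise (here fitted
    as i.i.d. Gaussian) back onto the signal component."""
    kernel = np.ones(window) / window
    signal = np.convolve(series, kernel, mode="same")
    noise = series - signal
    sigma = noise.std(ddof=1)          # fit the parametric noise model
    return signal + rng.normal(0.0, sigma, size=(n_boot, len(series)))

# Example: a noisy sinusoid standing in for a temperature-anomaly sequence.
t = np.linspace(0, 4 * np.pi, 240)
obs = np.sin(t) + rng.normal(0, 0.2, t.size)
boot = pseudo_realizations(obs)
print(boot.shape)  # (100, 240)
```

Summary statistics computed over such an ensemble give the reference distribution against which the hypothesis of shared climate-scale signals is tested.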

  8. Six Years of Science with the Chandra X-Ray Observatory

    NASA Technical Reports Server (NTRS)

    Weisskopf, Martin

    2005-01-01

    The Chandra X-ray Observatory had its origins in a 1963 proposal led by Riccardo Giacconi that called for a 1-meter diameter, 1-arcsecond class X-ray telescope for studying the Universe in X-rays. We will briefly discuss the history of the mission, the development of the hardware, its testing, and the launch on July 23, 1999. The remainder of the talk will be an admittedly eclectic review of some of the most exciting scientific highlights. These include the detection and identification of the first source seen with Chandra - an unusual Seyfert 1 we nicknamed Leon X-1, the detailed study of the Crab Nebula and its pulsar, and spectacular images of other supernova remnants including a 1-million-second exposure on Cas A. We also will summarize some of the major Chandra findings for normal and active galaxies, and we will illustrate the breadth of science enabled by Chandra observations of clusters of galaxies and their implications for cosmology.

  9. Neighborhood binary speckle pattern for deformation measurements insensitive to local illumination variation by digital image correlation.

    PubMed

    Zhao, Jian; Yang, Ping; Zhao, Yue

    2017-06-01

    The speckle pattern-based nature of digital image correlation (DIC) restricts its application in engineering fields and non-laboratory environments, since serious decorrelation occurs under localized sudden illumination variation. The simple and efficient speckle pattern adjusting and optimizing approach presented in this paper aims to provide a novel speckle pattern robust enough to resist local illumination variation. The new speckle pattern, called the neighborhood binary speckle pattern, is derived from the original speckle pattern by thresholding the pixels of a neighborhood at the value of its central pixel and treating the result as a binary number. The efficiency of the proposed speckle pattern is evaluated in six experimental scenarios. Experimental results indicate that DIC measurements based on the neighborhood binary speckle pattern provide reliable and accurate results, even when the local brightness and contrast of the deformed images have been seriously changed. It is expected that the new speckle pattern will have further potential value in engineering applications.
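The construction resembles the classic local binary pattern transform. A sketch of our reading of it (function name and details are ours, not the paper's):

```python
import numpy as np

def neighborhood_binary(img):
    """Threshold the 8 neighbors of each interior pixel at the central pixel
    value and pack the resulting bits into one byte (an LBP-style transform).
    Returns a map for the interior (shape reduced by the 1-pixel border)."""
    out = np.zeros((img.shape[0] - 2, img.shape[1] - 2), dtype=np.uint8)
    # Offsets of the 8 neighbors, clockwise from top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy:img.shape[0] - 1 + dy,
                       1 + dx:img.shape[1] - 1 + dx]
        out |= (neighbor >= center).astype(np.uint8) << bit
    return out

# A uniform brightness shift leaves every comparison, hence the pattern,
# unchanged - the property that makes the method illumination-robust.
img = np.random.default_rng(1).integers(0, 200, (32, 32)).astype(np.int32)
print(np.array_equal(neighborhood_binary(img), neighborhood_binary(img + 40)))  # prints True
```

Because each bit records only a relative comparison against the central pixel, additive brightness changes (and, for strictly monotone intensity changes, contrast changes too) cannot alter the encoded pattern.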

  10. Designing for Mild Cognitive Impairment (MCI): A Design Anthropological Perspective.

    PubMed

    Collier, Guy; Kayes, Nicola; Reay, Stephen; Bill, Amanda

    2017-01-01

    This paper will present a design anthropological perspective on an ongoing project called 'Living Well with Mild Cognitive Impairment (MCI)'. The project explores how people with MCI (and their families) manage and respond to changes in their memory and thinking. One of the primary aims of this project is to design an online resource that will support people to 'Live Well' within the context of possible cognitive decline. The resource was originally proposed to function as a kind of online community, where users could both share and learn about home-grown strategies for managing the cognitive changes associated with MCI in everyday life. Much of this project has been guided by the methodological approach of design anthropology, which encourages project researchers and stakeholders to critically examine underlying assumptions and conceptual frameworks, which in this case revolve around the disputed MCI category. In this paper we will provide some background to the Living Well project before highlighting a number of key insights attained from design anthropology.

  11. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprising two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later extended by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it would also benefit the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  12. Alleviating bias leads to accurate and personalized recommendation

    NASA Astrophysics Data System (ADS)

    Qiu, Tian; Wang, Tian-Tian; Zhang, Zi-Ke; Zhong, Li-Xin; Chen, Guang

    2013-11-01

    Recommendation bias towards objects has been found to have an impact on personalized recommendation, since objects present heterogeneous characteristics in some network-based recommender systems. In this article, based on a biased heat conduction recommendation algorithm (BHC), which considers the heterogeneity of the target objects, we propose a heterogeneous heat conduction algorithm (HHC) that further takes the heterogeneity of the source objects into account. Tested on three real datasets, Netflix, RYM and MovieLens, the HHC algorithm is found to deliver better recommendations in both accuracy and diversity than two benchmark algorithms, i.e., the original BHC and a hybrid algorithm of heat conduction and mass diffusion (HHM), while not requiring any additional information or parameters. Moreover, the HHC algorithm also improves recommendation accuracy on cold objects, addressing the so-called cold-start problem. Eigenvalue analyses show that the HHC algorithm effectively alleviates the recommendation bias towards objects with different levels of popularity, which helps resolve the accuracy-diversity dilemma.
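For orientation, the plain heat-conduction baseline that BHC and HHC modify can be written in a few lines on a user-object bipartite adjacency matrix. This is a minimal sketch of the standard HeatS process (variable names are ours, and the bias reweightings of BHC/HHC are omitted):

```python
import numpy as np

def heat_conduction_scores(A, user):
    """Plain heat-conduction (HeatS) recommendation scores for one user.
    A is the users x objects adjacency matrix (1 = collected).
    W[a, b] = (1/k_a) * sum_l A[l, a] * A[l, b] / k_l, with k_a the object
    degree and k_l the user degree; scores are W applied to the user's row."""
    k_user = A.sum(axis=1).astype(float)   # user degrees
    k_obj = A.sum(axis=0).astype(float)    # object degrees
    W = (A.T @ (A / k_user[:, None])) / k_obj[:, None]
    return W @ A[user]                     # heat received by each object

# Toy example: 3 users x 4 objects.
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)
print(heat_conduction_scores(A, 0))
```

Because each object's score is normalized by its own degree, HeatS favors low-degree (unpopular) objects; BHC and HHC tune exactly this degree dependence on the target and source sides to balance accuracy against diversity.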

  13. The Back Pocket Map: Social Class and Cultural Capital as Transferable Assets in the Advancement of Second-Generation Immigrants

    PubMed Central

    Fernández-Kelly, Patricia

    2014-01-01

    In this paper I move beyond current understandings of family- and school-related dynamics that explain the educational and occupational success of low-income immigrant children to investigate the role of cultural capital acquired in the country of origin. Class-related forms of knowledge acquired prior to migration can become invaluable assets in areas of destination through the realization of what Pierre Bourdieu calls habitus, that is, a series of embodied predispositions deployed by individuals in their pursuit of set objectives. Although the concept has attracted prolonged attention, the mechanisms by which the habitus is fulfilled remain unspecified. Here, I propose and examine three of those mechanisms: (a) cognitive correspondence, (b) positive emulation, and (c) active recollection. My study shows that class-related resources, like education, self-definition, and remembrance of nation and ancestry, play an important role, shaping youthful expectations and behaviors and protecting the children of low-income immigrants from downward mobility. PMID:25431497

  14. Nauclea latifolia: biological activity and alkaloid phytochemistry of a West African tree.

    PubMed

    Boucherle, Benjamin; Haudecoeur, Romain; Queiroz, Emerson Ferreira; De Waard, Michel; Wolfender, Jean-Luc; Robins, Richard J; Boumendjel, Ahcène

    2016-09-25

    Covering up to 2016. Nauclea latifolia (syn. Sarcocephalus latifolius, Rubiaceae), commonly called the African pincushion tree, is a plant widely used in folk medicine in different regions of Africa for treating a variety of illnesses, including malaria, epilepsy and pain. N. latifolia has drawn the interest not only of traditional healers but also of phytochemists, who have identified a range of bioactive indole alkaloids in its tissue. More recently, following up on the traditional use of extracts in pain management, a bio-guided purification from the roots of the tree led to the identification of the active ingredient as tramadol, available as a synthetic analgesic since the 1970s. The discovery of this compound as a natural phytochemical was highlighted worldwide. This review focuses on the correlation between extracted compounds and pharmacological activities, paying special attention to infectious diseases and neurologically-related disorders. A critical analysis of the data reported so far on the natural origin of tramadol and its proposed biosynthesis is also presented.

  15. Quasi-Optimal Elimination Trees for 2D Grids with Singularities

    DOE PAGES

    Paszyńska, A.; Paszyński, M.; Jopek, K.; ...

    2015-01-01

    We construct quasi-optimal elimination trees for 2D finite element meshes with singularities. These trees minimize the complexity of the solution of the discrete system. The computational cost estimates of the elimination process model the execution of the multifrontal algorithms in serial and in parallel shared-memory executions. Since the meshes considered are a subspace of all possible mesh partitions, we call these minimizers quasi-optimal. We minimize the cost functionals using dynamic programming. Finding these minimizers is more computationally expensive than solving the original algebraic system. Nevertheless, from the insights provided by the analysis of the dynamic programming minima, we propose a heuristic construction of the elimination trees that has cost O(N_e log N_e), where N_e is the number of elements in the mesh. We show that this heuristic ordering has a computational cost similar to that of the quasi-optimal elimination trees found with dynamic programming and outperforms state-of-the-art alternatives in our numerical experiments.

  16. Theoretical and experimental characterization of the DUal-BAse transistor (DUBAT)

    NASA Astrophysics Data System (ADS)

    Wu, Chung-Yu; Wu, Ching-Yuan

    1980-11-01

    A new A-type integrated voltage-controlled differential negative-resistance device, which uses an extra effective base region to form a lateral pnp (npn) bipolar transistor beside the original base region of a vertical npn (pnp) bipolar junction transistor, and is therefore called the DUal-BAse Transistor (DUBAT), is studied both experimentally and theoretically. The DUBAT has three terminals and is fully compatible with existing bipolar integrated-circuit technologies. Based upon the equivalent circuit of the DUBAT, a simple first-order analytical theory is developed, and important device parameters, such as the I-V characteristic, the differential negative resistance, and the peak and valley points, are characterized. One of the proposed integrated structures of the DUBAT, which is similar in structure to I²L but with similarly high density and a normally operated vertical npn transistor, has been successfully fabricated and studied. Comparisons between the experimental data and the theoretical analyses show satisfactory agreement.

  17. A novel global Harmony Search method based on Ant Colony Optimisation algorithm

    NASA Astrophysics Data System (ADS)

    Fouad, Allouani; Boukhetala, Djamel; Boudjema, Fares; Zenger, Kai; Gao, Xiao-Zhi

    2016-03-01

    The Global-best Harmony Search (GHS) is a stochastic optimisation algorithm recently developed, which hybridises the Harmony Search (HS) method with the concept of swarm intelligence in the particle swarm optimisation (PSO) to enhance its performance. In this article, a new optimisation algorithm called GHSACO is developed by incorporating the GHS with the Ant Colony Optimisation algorithm (ACO). Our method introduces a novel improvisation process, which is different from that of the GHS in the following aspects. (i) A modified harmony memory (HM) representation and conception. (ii) The use of a global random switching mechanism to monitor the choice between the ACO and GHS. (iii) An additional memory consideration selection rule using the ACO random proportional transition rule with a pheromone trail update mechanism. The proposed GHSACO algorithm has been applied to various benchmark functions and constrained optimisation problems. Simulation results demonstrate that it can find significantly better solutions when compared with the original HS and some of its variants.
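For context, the classic Harmony Search improvisation loop that GHS and GHSACO extend can be sketched as follows. Parameter names follow the usual HS conventions (HMS, HMCR, PAR, bandwidth); the values and the test function are illustrative defaults of ours, not those of the article:

```python
import random

def harmony_search(obj, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.01, iters=2000):
    """Minimal classic Harmony Search minimizer: keep a memory of good
    solutions, improvise a new harmony per iteration by memory consideration,
    pitch adjustment, or random selection, and replace the worst memory
    member whenever the new harmony is better."""
    random.seed(7)  # fixed seed for reproducibility of this sketch
    hm = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:              # memory consideration
                x = random.choice(hm)[d]
                if random.random() < par:           # pitch adjustment
                    x += random.uniform(-bw, bw) * (hi - lo)
            else:                                   # random selection
                x = random.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        worst = max(hm, key=obj)
        if obj(new) < obj(worst):
            hm[hm.index(worst)] = new
    return min(hm, key=obj)

sphere = lambda v: sum(x * x for x in v)
best = harmony_search(sphere, [(-5.0, 5.0)] * 3)
print(sphere(best))
```

GHS replaces the pitch-adjustment step with a move toward the best harmony in memory, and the article's GHSACO additionally lets an ACO-style pheromone-weighted rule drive the memory consideration.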

  18. A vine copula mixed effect model for trivariate meta-analysis of diagnostic test accuracy studies accounting for disease prevalence.

    PubMed

    Nikoloulopoulos, Aristidis K

    2017-10-01

    A bivariate copula mixed model has recently been proposed to synthesize diagnostic test accuracy studies, and it has been shown to be superior to the standard generalized linear mixed model in this context. Here, we employ trivariate vine copulas to extend the bivariate meta-analysis of diagnostic test accuracy studies by accounting for disease prevalence. Our vine copula mixed model includes the trivariate generalized linear mixed model as a special case and can also operate on the original scale of sensitivity, specificity, and disease prevalence. Our general methodology is illustrated by re-analyzing the data of two published meta-analyses. Our study suggests that there can be an improvement over the trivariate generalized linear mixed model in fit to data, and makes the argument for moving to vine copula random effects models, especially because of their richness, including reflection-asymmetric tail dependence, and their computational feasibility despite being three-dimensional.

  19. New operational technology of intrauterine ventilation the fetus lungs by breathing gas

    NASA Astrophysics Data System (ADS)

    Urakov, A. L.; Nikityuk, D. B.; Urakova, N. A.; Kasankin, A. A.; Chernova, L. V.; Dementiev, V. B.

    2015-11-01

    A new operational technology was developed for the elimination of intrauterine hypoxia and asphyxia of the fetus using endoscopic artificial ventilation of the lungs with breathing gas. For intrauterine ventilation of the fetal lungs, it is proposed to introduce a special breathing mask into the uterus and place it on the head of the fetus using an original endoscopic technique. The breathing mask we developed is connected to an external breathing apparatus by a hose; the device is called the "intrauterine aqualung". The intrauterine aqualung comprises a ventilator and a breathing circuit with a special fold-out breathing mask that is put on the head of the fetus inside the uterus like a mesh hat. Under ultrasound control, techniques were developed for introducing the mask into the uterus through the natural opening in the cervix and for placing the respiratory mask on the head of a fetus in cephalic presentation. Together these constitute a technology for intrauterine ventilation of the fetal lungs with breathing gas.

  20. Enhancement of partial robust M-regression (PRM) performance using Bisquare weight function

    NASA Astrophysics Data System (ADS)

    Mohamad, Mazni; Ramli, Norazan Mohamed; Ghani@Mamat, Nor Azura Md; Ahmad, Sanizah

    2014-09-01

    Partial Least Squares (PLS) regression is a popular regression technique for handling multicollinearity in low- and high-dimensional data, fitting a linear relationship between sets of explanatory and response variables. Several robust PLS methods have been proposed to remedy the classical PLS algorithms, which are easily affected by the presence of outliers. The most recent of these is called partial robust M-regression (PRM). Unfortunately, the monotone weighting function used in the PRM algorithm fails to assign appropriately small weights to large outliers according to their severity. Thus, in this paper, a modified partial robust M-regression is introduced to enhance the performance of the original PRM. A redescending weight function, known as the Bisquare weight function, is recommended to replace the fair function in the PRM. A simulation study is done to assess the performance of the modified PRM, and its efficiency is also tested on both contaminated and uncontaminated simulated data under various percentages of outliers, sample sizes and numbers of predictors.
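The contrast between the monotone fair function and the redescending Bisquare function is easy to see numerically. The tuning constants below are common textbook choices, not necessarily those used in the paper:

```python
def fair_weight(r, c=1.4):
    """Monotone 'fair' weight on a (standardized) residual r: decays slowly
    and never reaches zero, so gross outliers keep non-negligible weight."""
    return 1.0 / (1.0 + abs(r) / c)

def bisquare_weight(r, c=4.685):
    """Redescending Tukey Bisquare weight: smoothly shrinks to exactly zero
    for |r| >= c, fully discarding extreme outliers."""
    if abs(r) >= c:
        return 0.0
    return (1.0 - (r / c) ** 2) ** 2

# Small, moderate, and gross residuals: the fair weight never vanishes,
# while the Bisquare weight drops to zero past the cutoff.
for r in (0.0, 2.0, 10.0):
    print(r, round(fair_weight(r), 3), round(bisquare_weight(r), 3))
```

This is exactly the behavior the modification targets: with the Bisquare function, severe outliers contribute nothing to the PRM fit rather than retaining a residual influence.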
