Sample records for abdome normal estudo

  1. Teaching Normal Birth, Normally

    PubMed Central

    Hotelling, Barbara A

    2009-01-01

    Teaching normal-birth Lamaze classes normally involves considering the qualities that make birth normal and structuring classes to embrace those qualities. In this column, teaching strategies are suggested for classes that unfold naturally, free from unnecessary interventions. PMID:19436595

  2. Specialist bees collect Asteraceae pollen by distinctive abdominal drumming (Osmia) or tapping (Melissodes, Svastra)

    USDA-ARS's Scientific Manuscript database

    Four species of western US Osmia (Cephalosmia) bees that are Asteraceae specialists (oligoleges) were observed to employ a heretofore unappreciated, stereotypical means of collecting pollen, abdominal drumming, to gather pollen from 19 flowering species representing nine tribes of Asteraceae. Abdom...

  3. Selective Nonoperative Management of Penetrating Torso Injury From Combat Fragmentation Wounds

    DTIC Science & Technology

    2008-02-01

    outlines the paradigm of care: “Penetrating injuries below the nipples, above the symphysis pubis, and between the posterior axillary lines must be...abdomen and were hemodynamically stable and without abdominal pain or tenderness. CT scan of some of these casualties revealed fragments in the lumen

  4. Normalized modes at selected points without normalization

    NASA Astrophysics Data System (ADS)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕᵀMϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K, M, that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, and without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has interesting theoretical implications.
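
    A minimal sketch of the mass normalization described in the abstract (not Kausel's residue-based construction): solve the generalized eigenvalue problem and scale each mode so that its modal mass ϕᵀMϕ equals one. The stiffness and mass matrices below are illustrative placeholders, not values from the paper.

        import numpy as np
        from scipy.linalg import eigh

        # Illustrative 3-DOF stiffness and mass matrices (placeholders, not from the paper).
        K = np.array([[ 2., -1.,  0.],
                      [-1.,  2., -1.],
                      [ 0., -1.,  1.]])
        M = np.diag([1.0, 2.0, 1.5])

        # Generalized eigenproblem det(K - lam*M) = 0.
        lam, Phi = eigh(K, M)          # scipy returns Phi with Phi.T @ M @ Phi = I already

        # Explicit normalization, as described in the abstract: divide each raw mode
        # by the square root of its modal mass mu = phi^T M phi.
        Phi_raw = Phi * np.array([1.7, 0.4, 2.3])   # rescale columns to mimic "unscaled" modes
        mu = np.einsum('ij,jk,ki->i', Phi_raw.T, M, Phi_raw)
        Phi_norm = Phi_raw / np.sqrt(mu)

        print(np.allclose(Phi_norm.T @ M @ Phi_norm, np.eye(3)))   # True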

  5. Clarifying Normalization

    ERIC Educational Resources Information Center

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  6. Pre Normal Science and the Transition to Post-Normal Policy

    NASA Astrophysics Data System (ADS)

    Halpern, J. B.

    2015-12-01

    Post-Normal Science as formulated by Funtowicz and Ravetz describes cases where "facts are uncertain, values in dispute, stakes high, and decisions urgent". However, Post-Normal Science is better described as Pre-Normal Science, the stage at which something has been observed, but no one quite knows where it came from, what it means (science) or what to do about it (policy). The initial flailing about to reach a useful understanding is later used by those who oppose action to obfuscate by insisting that still nothing is known, what is known is wrong, or at best that more research is needed. Consider AIDS/HIV, stratospheric ozone, tobacco, acid rain, climate change, etc. As these issues gained attention, we entered the Pre-Normal Science stage. What was the cause? How could they be dealt with? Every idea could be proposed and was. Normal science sorted through them. Many proposers of the discarded theories still clutched them strongly, but they were mostly dismissed within the scientific community. Post-Normal Policy ensues when normal science has reached a consensus and it is clear that action is needed but it is economically or philosophically impossible for some to accept that. The response is to deny the utility of science and scientific judgment, thus the attacks on scientists and scientific panels that provide policy makers with their best scientific advice. Recognizing the division between Pre-Normal Science and Post-Normal Policy and the uses of the former to block action by the latter is useful for understanding the course of controversies that require normal science to influence policy.

  7. Advocating for Normal Birth With Normal Clothes

    PubMed Central

    Waller-Wise, Renece

    2007-01-01

    Childbirth educators need to be aware that the clothes they wear when teaching classes send a nonverbal message to class participants. Regardless of who wears the clothing or what is worn, clothes send a message; thus, both the advantages and disadvantages related to clothing choice should be considered. Ultimately, the message should reflect the values of supporting normal birth. For childbirth educators who are allowed to choose their own apparel to wear in their classes, street clothes may be the benchmark for which to strive. This article discusses the many nonverbal messages that clothes convey and provides support for the choice of street clothes as the dress for the professional childbirth educator; thus, “normal clothes to promote normal birth.” PMID:18408807

  8. Power of tests of normality for detecting contaminated normal samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thode, H.C. Jr.; Smith, L.A.; Finch, S.J.

    1981-01-01

    Seventeen tests of normality or goodness of fit were evaluated for power at detecting a contaminated normal sample. This study used 1000 replications each of samples of size 12, 17, 25, 33, 50, and 100 from six different contaminated normal distributions. The kurtosis test was the most powerful over all sample sizes and contaminations. The Hogg and weighted Kolmogorov-Smirnov tests were second. The Kolmogorov-Smirnov, chi-squared, Anderson-Darling, and Cramer-von-Mises tests had very low power at detecting contaminated normal random variables. Tables of the power of the tests and the power curves of certain tests are given.
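
    A small Monte Carlo sketch in the spirit of the study, reduced in scale and using the sample kurtosis and Kolmogorov-Smirnov statistics as stand-ins for the seventeen tests evaluated: draw samples from a contaminated normal, apply each test, and estimate power as the rejection rate. The mixture parameters are illustrative, not the paper's exact settings.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n, reps, alpha = 50, 1000, 0.05

        def contaminated_normal(size):
            # 90% N(0,1) + 10% N(0,5^2): illustrative contamination
            comp = rng.random(size) < 0.10
            return np.where(comp, rng.normal(0, 5, size), rng.normal(0, 1, size))

        rej_kurt, rej_ks = 0, 0
        for _ in range(reps):
            x = contaminated_normal(n)
            # kurtosis test for normality
            if stats.kurtosistest(x).pvalue < alpha:
                rej_kurt += 1
            # Kolmogorov-Smirnov against a normal with estimated mean/sd (Lilliefors-style, approximate)
            if stats.kstest(x, 'norm', args=(x.mean(), x.std(ddof=1))).pvalue < alpha:
                rej_ks += 1

        print(f"estimated power, kurtosis test: {rej_kurt/reps:.2f}")
        print(f"estimated power, KS test:       {rej_ks/reps:.2f}")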

  9. Interactions between Polygonal Normal Faults and Larger Normal Faults, Offshore Nova Scotia, Canada

    NASA Astrophysics Data System (ADS)

    Pham, T. Q. H.; Withjack, M. O.; Hanafi, B. R.

    2017-12-01

    Polygonal faults, small normal faults with polygonal arrangements that form in fine-grained sedimentary rocks, can influence ground-water flow and hydrocarbon migration. Using well and 3D seismic-reflection data, we have examined the interactions between polygonal faults and larger normal faults on the passive margin of offshore Nova Scotia, Canada. The larger normal faults strike approximately E-W to NE-SW. Growth strata indicate that the larger normal faults were active in the Late Cretaceous (i.e., during the deposition of the Wyandot Formation) and during the Cenozoic. The polygonal faults were also active during the Cenozoic because they affect the top of the Wyandot Formation, a fine-grained carbonate sedimentary rock, and the overlying Cenozoic strata. Thus, the larger normal faults and the polygonal faults were both active during the Cenozoic. The polygonal faults far from the larger normal faults have a wide range of orientations. Near the larger normal faults, however, most polygonal faults have preferred orientations, either striking parallel or perpendicular to the larger normal faults. Some polygonal faults nucleated at the tip of a larger normal fault, propagated outward, and linked with a second larger normal fault. The strike of these polygonal faults changed as they propagated outward, ranging from parallel to the strike of the original larger normal fault to orthogonal to the strike of the second larger normal fault. These polygonal faults hard-linked the larger normal faults at and above the level of the Wyandot Formation but not below it. We argue that the larger normal faults created stress-enhancement and stress-reorientation zones for the polygonal faults. Numerous small, polygonal faults formed in the stress-enhancement zones near the tips of larger normal faults. Stress-reorientation zones surrounded the larger normal faults far from their tips. Fewer polygonal faults are present in these zones, and, more importantly, most polygonal faults

  10. Smooth quantile normalization.

    PubMed

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
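
    For context, a minimal sketch of ordinary (full) quantile normalization, the method that qsmooth generalizes; this is not the qsmooth algorithm itself, which additionally balances group-wise against global reference quantiles. The matrix below is a hypothetical features-by-samples example.

        import numpy as np

        def quantile_normalize(X):
            """Ordinary quantile normalization: force every column (sample)
            to share the same empirical distribution (the mean of the sorted columns)."""
            ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each value within its column
            reference = np.sort(X, axis=0).mean(axis=1)         # mean quantile across samples
            return reference[ranks]

        # Hypothetical data: 5 features x 3 samples with different sequencing depths
        X = np.array([[ 2.,  4.,  8.],
                      [ 1.,  2.,  4.],
                      [ 5., 10., 20.],
                      [ 3.,  6., 12.],
                      [ 4.,  8., 16.]])
        Xn = quantile_normalize(X)
        print(Xn)          # every column now has identical sorted values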

  11. Cortical Thinning in Network-Associated Regions in Cognitively Normal and Below-Normal Range Schizophrenia

    PubMed Central

    Pinnock, Farena; Parlar, Melissa; Hawco, Colin; Hanford, Lindsay; Hall, Geoffrey B.

    2017-01-01

    This study assessed whether cortical thickness across the brain and regionally in terms of the default mode, salience, and central executive networks differentiates schizophrenia patients and healthy controls with normal range or below-normal range cognitive performance. Cognitive normality was defined using the MATRICS Consensus Cognitive Battery (MCCB) composite score (T = 50 ± 10) and structural magnetic resonance imaging was used to generate cortical thickness data. Whole brain analysis revealed that cognitively normal range controls (n = 39) had greater cortical thickness than both cognitively normal (n = 17) and below-normal range (n = 49) patients. Cognitively normal controls also demonstrated greater thickness than patients in regions associated with the default mode and salience, but not central executive networks. No differences on any thickness measure were found between cognitively normal range and below-normal range controls (n = 24) or between cognitively normal and below-normal range patients. In addition, structural covariance between network regions was high and similar across subgroups. Positive and negative symptom severity did not correlate with thickness values. Cortical thinning across the brain and regionally in relation to the default and salience networks may index shared aspects of the psychotic psychopathology that defines schizophrenia with no relation to cognitive impairment. PMID:28348889

  12. Multivariate normality

    NASA Technical Reports Server (NTRS)

    Crutcher, H. L.; Falls, L. W.

    1976-01-01

    Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
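
    A hedged sketch of the chi-square idea extended to several dimensions (one common construction, not necessarily the exact procedure of Crutcher and Falls): under multivariate normality the squared Mahalanobis distances follow a chi-square distribution with p degrees of freedom, so binned distances can be compared against chi-square expectations. The sample below is simulated for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Hypothetical trivariate sample (p = 3, n = 101)
        n, p = 101, 3
        X = rng.multivariate_normal(mean=np.zeros(p), cov=np.eye(p), size=n)

        mu = X.mean(axis=0)
        S = np.cov(X, rowvar=False)
        d2 = np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(S), X - mu)  # squared Mahalanobis distances

        # Chi-square goodness-of-fit on binned distances vs. chi2_p expectations
        k = 5
        edges = stats.chi2.ppf(np.linspace(0, 1, k + 1), df=p)           # equal-probability bins
        observed, _ = np.histogram(d2, bins=edges)
        expected = np.full(k, n / k)
        chi2_stat = ((observed - expected) ** 2 / expected).sum()
        pvalue = stats.chi2.sf(chi2_stat, df=k - 1)
        print(f"chi-square statistic = {chi2_stat:.2f}, p = {pvalue:.3f}")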

  13. Normal Coagulation

    DTIC Science & Technology

    2014-09-04

    [Garbled text extract from Chapter 34, "Normal Coagulation" (Section 7: Bleeding and Clotting), marked confidential until formal publication; recoverable fragments mention vitamin K antagonists, Table 34-1 (Procoagulant...), and Figure 34-4 (Vitamin K-dependent com-...).]

  14. Using the Bologna Score to assess normal delivery healthcare.

    PubMed

    Carvalho, Isaiane da Silva; Brito, Rosineide Santana de

    2016-01-01

    Describing the obstetric care provided in public maternity hospitals during normal labour using the Bologna Score in the city of Natal, Northeastern Brazil. A quantitative cross-sectional study conducted with 314 puerperal women. Data collection was carried out consecutively during the months of March to July 2014. Prenatal care was provided to 95.9% of the mothers, beginning around the 1st trimester of pregnancy (72.3%) and having seven or more consultations (51%). Spontaneous vaginal delivery was planned for 88.2% of the women. All laboring women were assisted by a health professional, mostly by a physician (80.6%), and none of them obtained 5 points on the Bologna Score due to the small percentage of births in non-supine position (0.3%) and absence of a partogram (2.2%). A higher number of episiotomies were observed among primiparous women (75.5%). The score obtained using the Bologna Index was low. Thus, it is necessary to improve and readjust the existing obstetrical model.

  15. New spatial upscaling methods for multi-point measurements: From normal to p-normal

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Li, Xin

    2017-12-01

    Careful attention must be given to determining whether the geophysical variables of interest are normally distributed, since the assumption of a normal distribution may not accurately reflect the probability distribution of some variables. As a generalization of the normal distribution, the p-normal distribution and its corresponding maximum likelihood estimation (the least power estimation, LPE) were introduced in upscaling methods for multi-point measurements. Six methods, including three normal-based methods, i.e., arithmetic average, least square estimation, block kriging, and three p-normal-based methods, i.e., LPE, geostatistics LPE and inverse distance weighted LPE are compared in two types of experiments: a synthetic experiment to evaluate the performance of the upscaling methods in terms of accuracy, stability and robustness, and a real-world experiment to produce real-world upscaling estimates using soil moisture data obtained from multi-scale observations. The results show that the p-normal-based methods produced lower mean absolute errors and outperformed the other techniques due to their universality and robustness. We conclude that introducing appropriate statistical parameters into an upscaling strategy can substantially improve the estimation, especially if the raw measurements are disorganized; however, further investigation is required to determine which parameter is the most effective among variance, spatial correlation information and parameter p.
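
    A rough sketch of the least power estimation (LPE) idea for the simplest upscaling case, a single block mean: estimate the location m that minimizes the sum of |xᵢ - m|ᵖ. The value of p, the data, and the optimizer choice are illustrative; the paper's geostatistical and inverse-distance-weighted variants add spatial weights on top of this.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def lpe_location(x, p):
            """Least power estimate of location: argmin_m sum(|x - m|**p).
            p = 2 gives the mean, p = 1 the median; other p interpolate between them."""
            objective = lambda m: np.sum(np.abs(x - m) ** p)
            res = minimize_scalar(objective, bounds=(x.min(), x.max()), method='bounded')
            return res.x

        # Hypothetical multi-point soil-moisture measurements with one outlier
        x = np.array([0.21, 0.23, 0.22, 0.24, 0.20, 0.45])
        for p in (1.0, 1.5, 2.0):
            print(f"p = {p}: LPE location = {lpe_location(x, p):.3f}")
        print(f"arithmetic mean        = {x.mean():.3f}")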

  16. Visual Memories Bypass Normalization.

    PubMed

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores-neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.

  17. Group normalization for genomic data.

    PubMed

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.

  18. Visual Memories Bypass Normalization

    PubMed Central

    Bloem, Ilona M.; Watanabe, Yurika L.; Kibbe, Melissa M.; Ling, Sam

    2018-01-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores—neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation. PMID:29596038

  19. Group Normalization for Genomic Data

    PubMed Central

    Ghandi, Mahmoud; Beer, Michael A.

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets. PMID:22912661

  20. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    PubMed

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA Polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of the normalization method can have on the results of a ChIP-seq data analysis, their assessment is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant, after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard tools for peak calling such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification. The
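
    A minimal sketch of the "naive" linear normalization mentioned at the end of the abstract: scale the Input (control) counts by the ratio r of total ChIP reads to total Input reads before comparing bins. The count vectors are hypothetical; estimating r with CisGenome, NCIS, or CCAT, as the paper evaluates, would replace the simple ratio below.

        import numpy as np

        # Hypothetical read counts in genomic bins
        chip   = np.array([12,  8, 150,  9, 11, 300,  7, 10])
        input_ = np.array([20, 18,  25, 19, 22,  30, 17, 21])

        # Naive scaling factor: ratio of total read counts (the constant used by e.g. MACS/SICER)
        r = chip.sum() / input_.sum()

        scaled_input = input_ * r
        log_relative_risk = np.log2((chip + 0.5) / (scaled_input + 0.5))   # pseudocount to avoid log(0)
        print(f"r = {r:.3f}")
        print(np.round(log_relative_risk, 2))    # large positive values suggest enrichment (candidate peaks)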

  1. Statokinesigram normalization method.

    PubMed

    de Oliveira, José Magalhães

    2017-02-01

    Stabilometry is a technique that aims to study the body sway of human subjects, employing a force platform. The signal obtained from this technique refers to the position of the foot base ground-reaction vector, known as the center of pressure (CoP). The parameters calculated from the signal are used to quantify the displacement of the CoP over time; there is a large variability, both between and within subjects, which prevents the definition of normative values. The intersubject variability is related to differences between subjects in terms of their anthropometry, in conjunction with their muscle activation patterns (biomechanics); and the intrasubject variability can be caused by a learning effect or fatigue. Age and foot placement on the platform are also known to influence variability. Normalization is the main method used to decrease this variability and to bring distributions of adjusted values into alignment. In 1996, O'Malley proposed three normalization techniques to eliminate the effect of age and anthropometric factors from temporal-distance parameters of gait. These techniques were adopted to normalize the stabilometric signal by some authors. This paper proposes a new method of normalization of stabilometric signals to be applied in balance studies. The method was applied to a data set collected in a previous study, and the results of normalized and nonnormalized signals were compared. The results showed that the new method, if used in a well-designed experiment, can eliminate undesirable correlations between the analyzed parameters and the subjects' characteristics and show only the experimental conditions' effects.

  2. NOAA predicts near-normal or below-normal 2014 Atlantic hurricane season

    Science.gov Websites

    Related link: Atlantic Basin Hurricane Season Outlook Discussion (El Niño/Southern Oscillation, ENSO). NOAA predicts a near-normal or below-normal 2014 Atlantic hurricane season, with El Niño expected to develop. The main driver of this year's outlook is the anticipated development of El Niño this summer.

  3. Normal people working in normal organizations with normal equipment: system safety and cognition in a mid-air collision.

    PubMed

    de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar

    2009-05-01

    A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study of the en route mid-air collision between a commercial carrier and an executive jet, in the clear afternoon Amazon sky in which 154 people lost their lives, that illustrates one response to this challenge. Our focus was on how and why the several safety barriers of a well structured air traffic system melted down enabling the occurrence of this tragedy, without any catastrophic component failure, and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks regarding factors of system day-to-day functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approach and accident models based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.

  4. Sympathetic nerve traffic and baroreflex function in optimal, normal, and high-normal blood pressure states.

    PubMed

    Seravalle, Gino; Lonati, Laura; Buzzi, Silvia; Cairo, Matteo; Quarti Trevano, Fosca; Dell'Oro, Raffaella; Facchetti, Rita; Mancia, Giuseppe; Grassi, Guido

    2015-07-01

    Adrenergic activation and baroreflex dysfunction are common in established essential hypertension, elderly hypertension, masked and white-coat hypertension, resistant hypertension, and obesity-related hypertension. Whether this autonomic behavior is peculiar to established hypertension or is also detectable in the earlier clinical phases of the disease, that is, the high-normal blood pressure (BP) state, is still largely undefined, however. In 24 individuals with optimal BP (age: 37.1  ±  2.1 years, mean  ±  SEM) and in 27 with normal BP and 38 with high-normal BP, age matched with optimal BP, we measured clinic, 24-h and beat-to-beat BP, heart rate (HR), and muscle sympathetic nerve activity (MSNA) at rest and during baroreceptor stimulation and deactivation. Measurements also included anthropometric as well as echocardiographic and homeostasis model assessment (HOMA) index. For similar anthropometric values, clinic, 24-h ambulatory, and beat-to-beat BPs were significantly greater in normal BP than in optimal BP. This was the case when the high-normal BP group was compared to the normal and optimal BP groups. MSNA (but not HR) was also significantly greater in high-normal BP than in normal BP and optimal BP (51.3  ±  2.0 vs. 40.3  ±  2.3 and 41.1 ± 2.6  bursts per 100  heartbeats, respectively, P < 0.01). The sympathetic activation seen in high-normal BP was coupled with an impairment of baroreflex HR control (but not MSNA) and with a significant increase in HOMA Index, which showed a significant direct relationship with MSNA. Thus, independently of which BP the diagnosis is based, high-normal BP is a condition characterized by a sympathetic activation. This neurogenic alteration, which is likely to be triggered by metabolic rather than reflex alterations, might be involved, together with other factors, in the progression of the condition to established hypertension.

  5. Upper-normal waist circumference is a risk marker for metabolic syndrome in normal-weight subjects.

    PubMed

    Okada, R; Yasuda, Y; Tsushita, K; Wakai, K; Hamajima, N; Matsuo, S

    2016-01-01

    To elucidate implication of upper-normal waist circumference (WC), we examined whether the normal range of WC still represents a risk of metabolic syndrome (MetS) or non-adipose MetS components among normal-weight subjects. A total of 173,510 persons (100,386 men and 73,124 women) with normal WC (<90/80 cm in men/women) and body mass index (BMI) of 18.5-24.9 were included. Subjects were categorized as having low, moderate, and upper-normal WC for those with WC < 80, 80-84, and 85-89 cm in men and <70, 70-74, and 75-79 cm in women, respectively. The prevalence of all the non-adipose MetS components (e.g. prediabetes and borderline dyslipidemia) was significantly higher in subjects with upper-normal WC on comparison with those with low WC. Overall, the prevalence of MetS (having three or more of four non-adipose MetS components) gradually increased with increasing WC (12%, 21%, and 27% in men and 11%, 14%, and 19% in women for low, moderate, and upper-normal WC, respectively). Moreover, the risk of having a greater number of MetS components increased in subjects with upper-normal WC compared with those with low WC (odds ratios for the number of one, two, three, and four MetS components: 1.29, 1.81, 2.53, and 2.47 in men and 1.16, 1.55, 1.49, and 2.20 in women, respectively). Upper-normal WC represents a risk for acquiring a greater number of MetS components and the early stage of MetS components (prediabetes and borderline dyslipidemia), after adjusting for BMI, in a large general population with normal WC and BMI. Copyright © 2015 The Italian Society of Diabetology, the Italian Society for the Study of Atherosclerosis, the Italian Society of Human Nutrition, and the Department of Clinical Medicine and Surgery, Federico II University. Published by Elsevier B.V. All rights reserved.

  6. Supervised normalization of microarrays

    PubMed Central

    Mecham, Brigham H.; Nelson, Peter S.; Storey, John D.

    2010-01-01

    Motivation: A major challenge in utilizing microarray technologies to measure nucleic acid abundances is ‘normalization’, the goal of which is to separate biologically meaningful signal from other confounding sources of signal, often due to unavoidable technical factors. It is intuitively clear that true biological signal and confounding factors need to be simultaneously considered when performing normalization. However, the most popular normalization approaches do not utilize what is known about the study, both in terms of the biological variables of interest and the known technical factors in the study, such as batch or array processing date. Results: We show here that failing to include all study-specific biological and technical variables when performing normalization leads to biased downstream analyses. We propose a general normalization framework that fits a study-specific model employing every known variable that is relevant to the expression study. The proposed method is generally applicable to the full range of existing probe designs, as well as to both single-channel and dual-channel arrays. We show through real and simulated examples that the method has favorable operating characteristics in comparison to some of the most highly used normalization methods. Availability: An R package called snm implementing the methodology will be made available from Bioconductor (http://bioconductor.org). Contact: jstorey@princeton.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20363728
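
    A rough sketch of the general idea of model-based normalization with known biological and technical variables (not the snm implementation): fit a per-feature linear model containing both the biological variable of interest and a known technical factor, then remove only the fitted technical component. All data and effect sizes below are simulated for illustration.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical expression matrix: 100 features x 12 samples
        n_feat, n_samp = 100, 12
        group = np.repeat([0, 1], 6)                 # biological variable of interest
        batch = np.tile([0, 1], 6)                   # known technical factor (e.g., processing date)
        Y = (rng.normal(0, 1, (n_feat, n_samp))
             + 0.5 * group                           # true biological signal
             + 2.0 * batch)                          # unwanted batch effect

        # Design with intercept, biological group, and batch
        X = np.column_stack([np.ones(n_samp), group, batch])
        beta, *_ = np.linalg.lstsq(X, Y.T, rcond=None)   # beta: 3 x n_feat

        # Subtract only the fitted technical (batch) component, keeping the biology intact
        Y_norm = Y - np.outer(beta[2], batch)
        print(Y.std(), Y_norm.std())                 # overall spread shrinks once the batch term is removed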

  7. Laser-scanned fluorescence of nonlased/normal, lased/normal, nonlased/carious, and lased/carious enamel

    NASA Astrophysics Data System (ADS)

    Zakariasen, Kenneth L.; Barron, Joseph R.; Paton, Barry E.

    1992-06-01

    Research has shown that low levels of CO2 laser irradiation raise enamel resistance to sub-surface demineralization. Additionally, laser scanned fluorescence analysis of enamel, as well as laser and white light reflection studies, have potential for both clinical diagnosis and comparative research investigations of the caries process. This study was designed to compare laser fluorescence and laser/white light reflection of (1) non-lased/normal with lased/normal enamel and (2) non-lased/normal with non-lased/carious and lased/carious enamel. Specimens were buccal surfaces of extracted third molars, coated with acid resistant varnish except for either two or three 2.25 mm2 windows (two window specimens: non-lased/normal, lased/normal; three window specimens: non-lased/normal, non-lased/carious, lased/carious). Teeth exhibiting carious windows were immersed in a demineralizing solution for twelve days. Non-carious windows were covered with wax during immersion. Following immersion, the wax was removed, and fluorescence and laser/white light reflection analyses were performed on all windows utilizing a custom scanning laser fluorescence spectrometer which focuses light from a 25 mW He-Cd laser at 442 nm through an objective lens onto a cross-section ≥ 3 μm in diameter. For laser/white light reflection analyses, reflected light intensities were measured. A HeNe laser was used for laser light reflection studies. Following analyses, the teeth are sectioned bucco-lingually into 80-μm sections, examined under polarized light microscopy, and the lesions photographed. This permits comparison between fluorescence/reflected light values and the visualized decalcification areas for each section, and thus comparisons between various enamel treatments and normal enamel. The enamel specimens are currently being analyzed.

  8. Spinal cord normalization in multiple sclerosis.

    PubMed

    Oh, Jiwon; Seigo, Michaela; Saidha, Shiv; Sotirchos, Elias; Zackowski, Kathy; Chen, Min; Prince, Jerry; Diener-West, Marie; Calabresi, Peter A; Reich, Daniel S

    2014-01-01

    Spinal cord (SC) pathology is common in multiple sclerosis (MS), and measures of SC-atrophy are increasingly utilized. Normalization reduces biological variation of structural measurements unrelated to disease, but optimal parameters for SC volume (SCV)-normalization remain unclear. Using a variety of normalization factors and clinical measures, we assessed the effect of SCV normalization on detecting group differences and clarifying clinical-radiological correlations in MS. 3T cervical SC-MRI was performed in 133 MS cases and 11 healthy controls (HC). Clinical assessment included expanded disability status scale (EDSS), MS functional composite (MSFC), quantitative hip-flexion strength ("strength"), and vibration sensation threshold ("vibration"). SCV between C3 and C4 was measured and normalized individually by subject height, SC-length, and intracranial volume (ICV). There were group differences in raw-SCV and after normalization by height and length (MS vs. HC; progressive vs. relapsing MS-subtypes, P < .05). There were correlations between clinical measures and raw-SCV (EDSS:r = -.20; MSFC:r = .16; strength:r = .35; vibration:r = -.19). Correlations consistently strengthened with normalization by length (EDSS:r = -.43; MSFC:r = .33; strength:r = .38; vibration:r = -.40), and height (EDSS:r = -.26; MSFC:r = .28; strength:r = .22; vibration:r = -.29), but diminished with normalization by ICV (EDSS:r = -.23; MSFC:r = -.10; strength:r = .23; vibration:r = -.35). In relapsing MS, normalization by length allowed statistical detection of correlations that were not apparent with raw-SCV. SCV-normalization by length improves the ability to detect group differences, strengthens clinical-radiological correlations, and is particularly relevant in settings of subtle disease-related SC-atrophy in MS. SCV-normalization by length may enhance the clinical utility of measures of SC-atrophy. Copyright © 2014 by the American Society of Neuroimaging.

  9. Study of the Local Horizon. (Spanish Title: Estudio del Horizonte Local.) Estudo do Horizonte Local

    NASA Astrophysics Data System (ADS)

    Ros, Rosa M.

    2009-12-01

    The study of the horizon is fundamental to facilitating the first observations of the students at any education center. A simple model, to be built at each center, facilitates the study and comprehension of the rudiments of astronomy. The constructed model also serves as a simple equatorial clock, and other models (horizontal and vertical) may be constructed starting from it.

  10. Normal Stress or Adjustment Disorder?

    MedlinePlus

    What's the difference between normal stress and an adjustment disorder? Answers from Daniel K. Hall-Flavin, M.D. Stress is a normal psychological and physical reaction to ...

  11. Understanding a Normal Distribution of Data.

    PubMed

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?

  12. Cell proliferation in normal epidermis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weinstein, G.D.; McCullough, J.L.; Ross, P.

    1984-06-01

    A detailed examination of cell proliferation kinetics in normal human epidermis is presented. Using tritiated thymidine with autoradiographic techniques, proliferative and differentiated cell kinetics are defined and interrelated. The proliferative compartment of normal epidermis has a cell cycle duration (Tc) of 311 h derived from 3 components: the germinative labeling index (LI), the duration of DNA synthesis (ts), and the growth fraction (GF). The germinative LI is 2.7% +/- 1.2 and ts is 14 h, the latter obtained from a composite fraction of labeled mitoses curve obtained from 11 normal subjects. The GF obtained from the literature and from human skin xenografts to nude mice is estimated to be 60%. Normal-appearing epidermis from patients with psoriasis appears to have a higher proliferation rate. The mean LI is 4.2% +/- 0.9, approximately 50% greater than in normal epidermis. Absolute cell kinetic values for this tissue, however, cannot yet be calculated for lack of other information on ts and GF. A kinetic model for epidermal cell renewal in normal epidermis is described that interrelates the rate of birth/entry, transit, and/or loss of keratinocytes in the 3 epidermal compartments: proliferative, viable differentiated (stratum malpighii), and stratum corneum. Expected kinetic homeostasis in the epidermis is confirmed by the very similar "turnover" rates in each of the compartments that are, respectively, 1246, 1417, and 1490 cells/day/mm2 surface area. The mean epidermal turnover time of the entire tissue is 39 days. The Tc of 311 h in normal cells is 8-fold longer than the psoriatic Tc of 36 h and is necessary for understanding the hyperproliferative pathophysiologic process in psoriasis.
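
    The cell-cycle duration quoted above follows from the three measured quantities; a quick arithmetic check using the standard relation Tc = ts × GF / LI, which appears to be the relation the authors use, since it reproduces the 311 h figure:

        # Reproduce the reported cell cycle duration from the abstract's three components
        LI = 0.027   # germinative labeling index (2.7%)
        ts = 14.0    # duration of DNA synthesis, hours
        GF = 0.60    # growth fraction

        Tc = ts * GF / LI     # assumed relation Tc = ts * GF / LI
        print(f"Tc = {Tc:.0f} h")   # ~311 h, matching the value reported for normal epidermis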

  13. Normal Weight Obesity: A Hidden Health Risk?

    MedlinePlus

    Normal weight obesity: A hidden health risk? Can you be considered obese if you have a normal body weight? Answers from ... considered obese — a condition known as normal weight obesity. Normal weight obesity means you may have the ...

  14. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    PubMed

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Although the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and normal variants can assist in translating the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children with normal developmental structures that have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. ©RSNA, 2015.

  15. Comprehensive non-dimensional normalization of gait data.

    PubMed

    Pinzone, Ornella; Schwartz, Michael H; Baker, Richard

    2016-02-01

    Normalizing clinical gait analysis data is required to remove variability due to physical characteristics such as leg length and weight. This is particularly important for children, where both are associated with age. In most clinical centres, conventional normalization (by mass only) is used, whereas there is a stronger biomechanical argument for non-dimensional normalization. This study used data from 82 typically developing children to compare how the two schemes performed over a wide range of temporal-spatial and kinetic parameters by calculating the coefficients of determination with leg length, weight and height. 81% of the conventionally normalized parameters had a coefficient of determination above the threshold for a statistical association (p<0.05) compared to 23% of those normalized non-dimensionally. All the conventionally normalized parameters exceeding this threshold showed a reduced association with non-dimensional normalization. In conclusion, non-dimensional normalization is more effective than conventional normalization in reducing the effects of height, weight and age in a comprehensive range of temporal-spatial and kinetic parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
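
    A hedged sketch of one widely used non-dimensional scheme of the kind compared here (Hof-style scaling by body mass m, leg length l0, and gravitational acceleration g); the exact scheme used in the study may differ in detail, and the numbers below are hypothetical.

        import math

        g = 9.81          # m/s^2

        def nondimensionalize(mass_kg, leg_len_m, speed_ms, step_len_m, moment_Nm, power_W):
            """Hof-style non-dimensional normalization of common gait parameters (sketch)."""
            return {
                "speed":       speed_ms / math.sqrt(g * leg_len_m),
                "step_length": step_len_m / leg_len_m,
                "moment":      moment_Nm / (mass_kg * g * leg_len_m),
                "power":       power_W / (mass_kg * g ** 1.5 * leg_len_m ** 0.5),
            }

        # Hypothetical child: 30 kg body mass, 0.70 m leg length
        print(nondimensionalize(mass_kg=30, leg_len_m=0.70, speed_ms=1.1,
                                step_len_m=0.55, moment_Nm=25.0, power_W=60.0))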

  16. Hard X-ray spectral study of Z-type sources with HEXTE/RXTE

    NASA Astrophysics Data System (ADS)

    D'Amico, F.; Heindl, W. A.; Rothschild, R. E.

    2003-08-01

    We present the results of an X-ray spectral study of Z-type sources. Z sources are low-mass X-ray binaries (LMXBs) with intermediate magnetic fields (B ~ 10^9 G). This class comprises only six Galactic sources (namely: Sco X-1, GX 349+2, GX 17+2, Cyg X-2, GX 5-1, and GX 340+0). Our analysis concentrates on the hard X-ray band (E ~ 20 keV), up to about 200 keV, the optimal operating range of the "High Energy X-ray Timing Experiment" (HEXTE), one of the three X-ray telescopes on board the Rossi X-ray Timing Explorer (RXTE). Our motivation for this study, a search for hard X-ray tails in Z sources, was the limited knowledge of the emission of these sources in this energy band when compared, for example, with atoll-type sources (also LMXBs). We present the data reduction/analysis and describe how HEXTE measures the background. Special attention is given to this point because of the location of the Z sources and also the problem of contamination by nearby sources. With the exception of Sco X-1, no hard X-ray tail was found for the other sources, despite detections of such tails in some sources by the BeppoSAX satellite. Interpretations of this result will be presented. From the standpoint of this study, we deduce that the production of hard X-ray tails in Z sources is a process triggered when at least one condition is satisfied: the brightness of the thermal component of the spectrum must be above a certain threshold value of ~4×10^36 erg s^-1.

  17. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
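
    A rough sketch of the construction discussed above: plot ordered sample values against normal quantiles and attach an interval to each point. For brevity the sketch uses pointwise 1-α intervals from the beta distribution of order statistics; the paper's contribution is calibrating the intervals so that coverage is simultaneous, which requires a wider, simulation-based adjustment not shown here.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n, alpha = 30, 0.05

        x = np.sort(rng.normal(loc=10, scale=2, size=n))      # observed ordered sample
        i = np.arange(1, n + 1)
        q = stats.norm.ppf((i - 0.5) / n)                     # plotting positions (normal quantiles)

        # Pointwise intervals: the i-th order statistic's percentile ~ Beta(i, n - i + 1),
        # mapped through the normal quantile function and the fitted mean/sd.
        lo_p = stats.beta.ppf(alpha / 2, i, n - i + 1)
        hi_p = stats.beta.ppf(1 - alpha / 2, i, n - i + 1)
        mu, sd = x.mean(), x.std(ddof=1)
        lo = mu + sd * stats.norm.ppf(lo_p)
        hi = mu + sd * stats.norm.ppf(hi_p)

        outside = (x < lo) | (x > hi)
        print(f"probability-plot correlation: {np.corrcoef(q, x)[0, 1]:.3f}")
        print(f"{outside.sum()} of {n} points fall outside their pointwise intervals")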

  18. Rock friction under variable normal stress

    USGS Publications Warehouse

    Kilgore, Brian D.; Beeler, Nicholas M.; Lozos, Julian C.; Oglesby, David

    2017-01-01

    The purpose of this study is to determine the detailed response of shear strength and other fault properties to changes in normal stress at room temperature using dry, initially bare rock surfaces of granite at normal stresses between 5 and 7 MPa. Rapid normal stress changes result in gradual, approximately exponential changes in shear resistance with fault slip. The characteristic length of the exponential change is similar for both increases and decreases in normal stress. In contrast, changes in fault normal displacement and the amplitude of small high-frequency elastic waves transmitted across the surface follow a two stage response consisting of a large immediate and a smaller gradual response with slip. The characteristic slip distance of the small gradual response is significantly smaller than that of shear resistance. The stability of sliding in response to large step decreases in normal stress is well predicted using the shear resistance slip length observed in step increases. Analysis of the shear resistance and slip-time histories suggest nearly immediate changes in strength occur in response to rapid changes in normal stress; these are manifested as an immediate change in slip speed. These changes in slip speed can be qualitatively accounted for using a rate-independent strength model. Collectively, the observations and model show that acceleration or deceleration in response to normal stress change depends on the size of the change, the frictional characteristics of the fault surface, and the elastic properties of the loading system.
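
    A small sketch of the exponential evolution with slip described above (illustrative functional form and parameter values, not the authors' fitted results): after a step in normal stress, shear resistance relaxes toward its new steady-state value over a characteristic slip distance.

        import numpy as np

        mu = 0.6                       # nominal friction coefficient (illustrative)
        dc = 5e-6                      # characteristic slip distance, m (illustrative)
        sigma0, sigma1 = 5e6, 7e6      # normal stress step, Pa (5 -> 7 MPa, as in the experiments)

        slip = np.linspace(0, 30e-6, 7)          # fault slip after the step, m
        tau_old = mu * sigma0                    # shear resistance before the step
        tau_new = mu * sigma1                    # eventual steady-state resistance
        tau = tau_new + (tau_old - tau_new) * np.exp(-slip / dc)   # gradual exponential evolution

        for s, t in zip(slip, tau):
            print(f"slip = {s*1e6:5.1f} um   shear resistance = {t/1e6:.3f} MPa")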

  19. Evaluation of the ASOS impact on climatic normals and assessment of variable-length time periods in calculation of normals

    NASA Astrophysics Data System (ADS)

    Kauffman, Chad Matthew

    The temperature and precipitation that describe the norm of daily, monthly, and seasonal climate conditions are "climate normals." They are usually calculated based on climate data covering a 30-year period, and updated every 10 years. The next update will take place in the year 2001. Because of the advent of the Automated Surface Observations Systems (ASOS) beginning in the early 1990s and the recognized temperature bias between ASOS and the conventional temperature sensors, there is uncertainty about how the ASOS data should be used to calculate the 1971-2000 temperature normal. This study examined the uncertainty and offered a method to minimize it. It showed that the ASOS bias has a measurable impact on the new 30-year temperature normal. The impact varies among stations and climate regions. Some stations with a cooling trend in ASOS temperature have a cooler normal for their temperature, while others with a warming trend have a warmer normal for temperature. These quantitative evaluations of the ASOS effect for stations and regions can be used to reduce ASOS bias in temperature normals. This study also evaluated temperature normals for different length periods and compared them to the 30-year normal. It showed that the difference between the normals is smaller in maritime climate than in continental temperate climate. In the former, the six-year normal describes a similar temperature variation as the 30-year normal does. In the latter, the 18-year normal starts to resemble the temperature variation that the 30-year normal describes. These results provide a theoretical basis for applying different normals in different regions. The study further compared temperature normals for different periods and identified a seasonal shift in climate change in the southwestern U.S., where the summer maximum temperature has shifted to a late summer month and the winter minimum temperature shifted to an early winter month in the past 30 years.

  20. A Statistical Selection Strategy for Normalization Procedures in LC-MS Proteomics Experiments through Dataset Dependent Ranking of Normalization Scaling Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Jacobs, Jon M.

    2011-12-01

    Quantification of LC-MS peak intensities assigned during peptide identification in a typical comparative proteomics experiment will deviate from run-to-run of the instrument due to both technical and biological variation. Thus, normalization of peak intensities across an LC-MS proteomics dataset is a fundamental step in pre-processing. However, the downstream analysis of LC-MS proteomics data can be dramatically affected by the normalization method selected. Current normalization procedures for LC-MS proteomics data are presented in the context of normalization values derived from subsets of the full collection of identified peptides. The distribution of these normalization values is unknown a priori. If they are not independent from the biological factors associated with the experiment the normalization process can introduce bias into the data, which will affect downstream statistical biomarker discovery. We present a novel approach to evaluate normalization strategies, where a normalization strategy includes the peptide selection component associated with the derivation of normalization values. Our approach evaluates the effect of normalization on the between-group variance structure in order to identify candidate normalization strategies that improve the structure of the data without introducing bias into the normalized peak intensities.

  1. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack; Galal, Ken

    1992-01-01

    Methods are presented to normalize the attitude quaternion in two extended Kalman filters (EKF), namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). It is concluded that all the normalization methods work well and yield comparable results. In the AEKF, normalization is not essential, since the data chosen for the test do not have a rapidly varying attitude. In the MEKF, normalization is necessary to avoid divergence of the attitude estimate. All of the methods behave similarly when the spacecraft experiences low angular rates.
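
    The basic operation common to these methods is rescaling the quaternion estimate back to unit norm; a minimal sketch (generic code, not the MEKF/AEKF implementations from the paper):

```python
import numpy as np

def normalize_quaternion(q, eps=1e-12):
    """Rescale an attitude quaternion to unit norm."""
    q = np.asarray(q, dtype=float)
    n = np.linalg.norm(q)
    if n < eps:
        raise ValueError("quaternion norm too small to normalize")
    return q / n

# After an additive filter update, the estimate generally drifts off the unit sphere.
q_est = np.array([0.7072, 0.0011, -0.0008, 0.7069])
q_unit = normalize_quaternion(q_est)
print(q_unit, np.linalg.norm(q_unit))   # norm restored to 1
```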

  2. Relating normalization to neuronal populations across cortical areas

    PubMed Central

    Ruff, Douglas A.; Alberts, Joshua J.; Cohen, Marlene R.

    2016-01-01

    Normalization, which divisively scales neuronal responses to multiple stimuli, is thought to underlie many sensory, motor, and cognitive processes. In every study where it has been investigated, neurons measured in the same brain area under identical conditions exhibit a range of normalization, ranging from suppression by nonpreferred stimuli (strong normalization) to additive responses to combinations of stimuli (no normalization). Normalization has been hypothesized to arise from interactions between neuronal populations, either in the same or different brain areas, but current models of normalization are not mechanistic and focus on trial-averaged responses. To gain insight into the mechanisms underlying normalization, we examined interactions between neurons that exhibit different degrees of normalization. We recorded from multiple neurons in three cortical areas while rhesus monkeys viewed superimposed drifting gratings. We found that neurons showing strong normalization shared less trial-to-trial variability with other neurons in the same cortical area and more variability with neurons in other cortical areas than did units with weak normalization. Furthermore, the cortical organization of normalization was not random: neurons recorded on nearby electrodes tended to exhibit similar amounts of normalization. Together, our results suggest that normalization reflects a neuron's role in its local network and that modulatory factors like normalization share the topographic organization typical of sensory tuning properties. PMID:27358313
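
    For reference, the divisive scaling discussed here is usually written in the standard normalization-model form shown below; the symbols are generic and not taken from this study:

```latex
% Response R_i of unit i to its driving input D_i, divisively scaled by a pool of units:
\[
  R_i \;=\; \frac{\gamma \, D_i^{\,n}}{\sigma^{\,n} + \sum_{j \in \mathrm{pool}} D_j^{\,n}}
\]
% Strong normalization: the pooled sum dominates, so adding a nonpreferred stimulus to the
% pool suppresses R_i. No normalization: the pool term is negligible and responses simply add.
```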

  3. Relating normalization to neuronal populations across cortical areas.

    PubMed

    Ruff, Douglas A; Alberts, Joshua J; Cohen, Marlene R

    2016-09-01

    Normalization, which divisively scales neuronal responses to multiple stimuli, is thought to underlie many sensory, motor, and cognitive processes. In every study where it has been investigated, neurons measured in the same brain area under identical conditions exhibit a range of normalization, ranging from suppression by nonpreferred stimuli (strong normalization) to additive responses to combinations of stimuli (no normalization). Normalization has been hypothesized to arise from interactions between neuronal populations, either in the same or different brain areas, but current models of normalization are not mechanistic and focus on trial-averaged responses. To gain insight into the mechanisms underlying normalization, we examined interactions between neurons that exhibit different degrees of normalization. We recorded from multiple neurons in three cortical areas while rhesus monkeys viewed superimposed drifting gratings. We found that neurons showing strong normalization shared less trial-to-trial variability with other neurons in the same cortical area and more variability with neurons in other cortical areas than did units with weak normalization. Furthermore, the cortical organization of normalization was not random: neurons recorded on nearby electrodes tended to exhibit similar amounts of normalization. Together, our results suggest that normalization reflects a neuron's role in its local network and that modulatory factors like normalization share the topographic organization typical of sensory tuning properties. Copyright © 2016 the American Physiological Society.

  4. Normal evaporation of binary alloys

    NASA Technical Reports Server (NTRS)

    Li, C. H.

    1972-01-01

    In the study of normal evaporation, it is assumed that the evaporating alloy is homogeneous, that the vapor is instantly removed, and that the alloy follows Raoult's law. The differential equation of normal evaporation relating the evaporating time to the final solute concentration is given and solved for several important special cases. Uses of the derived equations are exemplified with a Ni-Al alloy and some binary iron alloys. The accuracy of the predicted results is checked by analyses of actual experimental data on Fe-Ni and Ni-Cr alloys evaporated at 1600 C, and also on the vacuum purification of beryllium. These analyses suggest that the normal evaporation equations presented here give satisfactory results that are accurate to within an order of magnitude of the correct values, even for some highly concentrated solutions. Limited diffusion and the resultant surface solute depletion or enrichment appear important in the extension of this normal evaporation approach.

  5. Fluid involvement in normal faulting

    NASA Astrophysics Data System (ADS)

    Sibson, Richard H.

    2000-04-01

    Evidence of fluid interaction with normal faults comes from their varied role as flow barriers or conduits in hydrocarbon basins and as hosting structures for hydrothermal mineralisation, and from fault-rock assemblages in exhumed footwalls of steep active normal faults and metamorphic core complexes. These last suggest involvement of predominantly aqueous fluids over a broad depth range, with implications for fault shear resistance and the mechanics of normal fault reactivation. A general downwards progression in fault rock assemblages (high-level breccia-gouge (often clay-rich) → cataclasites → phyllonites → mylonite → mylonitic gneiss with the onset of greenschist phyllonites occurring near the base of the seismogenic crust) is inferred for normal fault zones developed in quartzo-feldspathic continental crust. Fluid inclusion studies in hydrothermal veining from some footwall assemblages suggest a transition from hydrostatic to suprahydrostatic fluid pressures over the depth range 3-5 km, with some evidence for near-lithostatic to hydrostatic pressure cycling towards the base of the seismogenic zone in the phyllonitic assemblages. Development of fault-fracture meshes through mixed-mode brittle failure in rock-masses with strong competence layering is promoted by low effective stress in the absence of thoroughgoing cohesionless faults that are favourably oriented for reactivation. Meshes may develop around normal faults in the near-surface under hydrostatic fluid pressures to depths determined by rock tensile strength, and at greater depths in overpressured portions of normal fault zones and at stress heterogeneities, especially dilational jogs. Overpressures localised within developing normal fault zones also determine the extent to which they may reutilise existing discontinuities (for example, low-angle thrust faults). Brittle failure mode plots demonstrate that reactivation of existing low-angle faults under vertical σ1 trajectories is only likely if

  6. The Normal Fetal Pancreas.

    PubMed

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

    The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744; P < .001), with significant correlations also with abdominal circumference and estimated fetal weight (Pearson r = 0.829 and 0.812, respectively; P < .001). Modeled pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.

  7. Normal pressure hydrocephalus

    MedlinePlus

    Ferri FF. Normal pressure hydrocephalus. In: Ferri FF, ed. Ferri's Clinical Advisor 2016. Philadelphia, PA: Elsevier; 2016:chap 648. Rosenberg GA. Brain edema and disorders of cerebrospinal fluid circulation. ...

  8. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  9. 3j Symbols: To Normalize or Not to Normalize?

    ERIC Educational Resources Information Center

    van Veenendaal, Michel

    2011-01-01

    The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…

  10. Normal Aging and Linguistic Decrement.

    ERIC Educational Resources Information Center

    Emery, Olga B.

    A study investigated language patterning, as an indication of synthetic mental activity, in comparison groups of normal pre-middle-aged adults (30-42 years), normal elderly adults (75-93), and elderly adults (71-91) with Alzheimer's dementia. Semiotic theory was used as the conceptual context. Linguistic measures included the Token Test, the…

  11. Quantiles for Finite Mixtures of Normal Distributions

    ERIC Educational Resources Information Center

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
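
    A minimal sketch of computing such a quantile by numerically inverting the mixture CDF (generic code; the component weights and parameters are made up for the example):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def mixture_quantile(p, weights, means, sds):
    """Quantile of a finite mixture of normals via numerical inversion of the mixture CDF."""
    weights, means, sds = map(np.asarray, (weights, means, sds))
    cdf_minus_p = lambda x: np.sum(weights * norm.cdf(x, means, sds)) - p
    lo, hi = (means - 10 * sds).min(), (means + 10 * sds).max()
    return brentq(cdf_minus_p, lo, hi)

# Note the contrast with a single normal: the 95th percentile of this two-component
# mixture is not the 95th percentile of a normal with the same overall mean and variance.
print(round(mixture_quantile(0.95, [0.6, 0.4], [0.0, 3.0], [1.0, 0.5]), 3))
```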

  12. Metabolic Cost, Mechanical Work, and Efficiency during Normal Walking in Obese and Normal-Weight Children

    ERIC Educational Resources Information Center

    Huang, Liang; Chen, Peijie; Zhuang, Jie; Zhang, Yanxin; Walt, Sharon

    2013-01-01

    Purpose: This study aimed to investigate the influence of childhood obesity on energetic cost during normal walking and to determine if obese children choose a walking strategy optimizing their gait pattern. Method: Sixteen obese children with no functional abnormalities were matched by age and gender with 16 normal-weight children. All…

  13. Normalizing Catastrophe: An Educational Response

    ERIC Educational Resources Information Center

    Jickling, Bob

    2013-01-01

    Processes of normalizing assumptions and values have been the subjects of theoretical framing and critique for several decades now. Critique has often been tied to issues of environmental sustainability and social justice. Now, in an era of global warming, there is a rising concern that the results of normalizing of present values could be…

  14. A Normalized Direct Approach for Estimating the Parameters of the Normal Ogive Three-Parameter Model for Ability Tests.

    ERIC Educational Resources Information Center

    Gugel, John F.

    A new method for estimating the parameters of the normal ogive three-parameter model for multiple-choice test items--the normalized direct (NDIR) procedure--is examined. The procedure is compared to a more commonly used estimation procedure, Lord's LOGIST, using computer simulations. The NDIR procedure uses the normalized (mid-percentile)…

  15. Genomic Changes in Normal Breast Tissue in Women at Normal Risk or at High Risk for Breast Cancer

    PubMed Central

    Danforth, David N.

    2016-01-01

    Sporadic breast cancer develops through the accumulation of molecular abnormalities in normal breast tissue, resulting from exposure to estrogens and other carcinogens beginning at adolescence and continuing throughout life. These molecular changes may take a variety of forms, including numerical and structural chromosomal abnormalities, epigenetic changes, and gene expression alterations. To characterize these abnormalities, a review of the literature has been conducted to define the molecular changes in each of the above major genomic categories in normal breast tissue considered to be either at normal risk or at high risk for sporadic breast cancer. This review indicates that normal risk breast tissues (such as reduction mammoplasty) contain evidence of early breast carcinogenesis including loss of heterozygosity, DNA methylation of tumor suppressor and other genes, and telomere shortening. In normal tissues at high risk for breast cancer (such as normal breast tissue adjacent to breast cancer or the contralateral breast), these changes persist, and are increased and accompanied by aneuploidy, increased genomic instability, a wide range of gene expression differences, development of large cancerized fields, and increased proliferation. These changes are consistent with early and long-standing exposure to carcinogens, especially estrogens. A model for the breast carcinogenic pathway in normal risk and high-risk breast tissues is proposed. These findings should clarify our understanding of breast carcinogenesis in normal breast tissue and promote development of improved methods for risk assessment and breast cancer prevention in women. PMID:27559297

  16. Normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  17. Normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1997-06-10

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. 4 figs.

  18. Confirmed viral meningitis with normal CSF findings.

    PubMed

    Dawood, Naghum; Desjobert, Edouard; Lumley, Janine; Webster, Daniel; Jacobs, Michael

    2014-07-17

    An 18-year-old woman presented with a progressively worsening headache, photophobia, feverishness and vomiting. Three weeks previously she had returned to the UK from a trip to Peru. At presentation, she had clinical signs of meningism. On admission, blood tests showed a mild lymphopenia, with a normal C reactive protein and white cell count. Chest X-ray and CT of the head were normal. Cerebrospinal fluid (CSF) microscopy was normal. CSF protein and glucose were in the normal range. MRI of the head and cerebral angiography were also normal. Subsequent molecular testing of CSF detected enterovirus RNA by reverse transcriptase PCR. The patient's clinical syndrome correlated with her virological diagnosis and no other cause of her symptoms was found. Her symptoms were self-limiting and improved with supportive management. This case illustrates an important example of viral central nervous system infection presenting clinically as meningitis but with normal CSF microscopy. 2014 BMJ Publishing Group Ltd.

  19. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water fluctuations. (a) Normal water fluctuations in a natural aquatic system consist of daily, seasonal, and annual...

  20. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water fluctuations. (a) Normal water fluctuations in a natural aquatic system consist of daily, seasonal, and annual...

  1. What's normal? Influencing women's perceptions of normal genitalia: an experiment involving exposure to modified and nonmodified images.

    PubMed

    Moran, C; Lee, C

    2014-05-01

    Examine women's perceptions of what is 'normal' and 'desirable' in female genital appearance. Experiment with random allocation across three conditions. Community. A total of 97 women aged 18-30 years. Women were randomly assigned to view a series of images of (1) surgically modified vulvas or (2) nonmodified vulvas, or (3) no images. They then viewed and rated ten target images of surgically modified vulvas and ten of unmodified vulvas. Women used a four-point Likert scale ('strongly agree' to 'strongly disagree') to rate each target image for 'looks normal' and 'represents society's ideal'. For each woman, we created two summary scores that represented the extent to which she rated the unmodified vulvas as more 'normal' and more 'society's ideal' than the modified vulvas. For ratings of 'normality,' there was a significant effect for condition (F(2,94) = 2.75, P = 0.007, adjusted r² = 0.082): women who had first viewed the modified images rated the modified target vulvas as more normal than the nonmodified vulvas, significantly different from the control group, who rated them as less normal. For ratings of 'society's ideal', there was again a significant effect for condition (F(2,92) = 7.72, P < 0.001, adjusted r² = 0.125); all three groups rated modified target vulvas as more like society's ideal than the nonmodified target vulvas, with the effect significantly strongest for the women who had viewed the modified images. Exposure to images of modified vulvas may change women's perceptions of what is normal and desirable. This may explain why some healthy women seek labiaplasty. © 2013 Royal College of Obstetricians and Gynaecologists.

  2. Valuation of Normal Range of Ankle Systolic Blood Pressure in Subjects with Normal Arm Systolic Blood Pressure.

    PubMed

    Gong, Yi; Cao, Kai-wu; Xu, Jin-song; Li, Ju-xiang; Hong, Kui; Cheng, Xiao-shu; Su, Hai

    2015-01-01

    This study aimed to establish a normal range for ankle systolic blood pressure (SBP). A total of 948 subjects who had normal brachial SBP (90-139 mmHg) at investigation were enrolled. Supine BP of the four limbs was measured simultaneously using four automatic BP measurement devices. The ankle-arm difference (An-a) in SBP on both sides was calculated. Two methods were used for establishing the normal range of ankle SBP: the 99% method was based on the 99% reference range of actual ankle BP, and the An-a method was the sum of An-a and the lower or upper limits of normal arm SBP (90-139 mmHg). Whether on the right or left side, the ankle SBP was significantly higher than the arm SBP (right: 137.1 ± 16.9 vs 119.7 ± 11.4 mmHg, P<0.05). Based on the 99% method, the normal range of ankle SBP was 94~181 mmHg for the total population, 84~166 mmHg for the young (18-44 y), 107~176 mmHg for the middle-aged (45-59 y) and 113~179 mmHg for the elderly (≥ 60 y) group. As the An-a in SBP was 13 mmHg in the young group and 20 mmHg in both the middle-aged and elderly groups, the normal range of ankle SBP based on the An-a method was 103-153 mmHg for young and 110-160 mmHg for middle-aged and elderly subjects. A primary reference for normal ankle SBP was suggested as 100-165 mmHg in the young and 110-170 mmHg in the middle-aged and elderly subjects.

  3. Strength of Gamma Rhythm Depends on Normalization

    PubMed Central

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427

  4. [Normal aging of frontal lobe functions].

    PubMed

    Calso, Cristina; Besnard, Jérémy; Allain, Philippe

    2016-03-01

    Normal aging in individuals is often associated with morphological, metabolic and cognitive changes, which particularly concern the cerebral frontal regions. Starting from the "frontal lobe hypothesis of cognitive aging" (West, 1996), the present review is based on the neuroanatomical model developed by Stuss (2008), introducing four categories of frontal lobe functions: executive control, behavioural and emotional self-regulation and decision-making, energization and meta-cognitive functions. The selected studies only address the changes of one at least of these functions. The results suggest a deterioration of several cognitive frontal abilities in normal aging: flexibility, inhibition, planning, verbal fluency, implicit decision-making, second-order and affective theory of mind. Normal aging seems also to be characterised by a general reduction in processing speed observed during neuropsychological assessment (Salthouse, 1996). Nevertheless many cognitive functions remain preserved such as automatic or non-conscious inhibition, specific capacities of flexibility and first-order theory of mind. Therefore normal aging doesn't seem to be associated with a global cognitive decline but rather with a selective change in some frontal systems, conclusion which should be taken into account for designing caring programs in normal aging.

  5. Normal metal - insulator - superconductor thermometers and coolers with titanium-gold bilayer as the normal metal

    NASA Astrophysics Data System (ADS)

    Räisänen, I. M. W.; Geng, Z.; Kinnunen, K. M.; Maasilta, I. J.

    2018-03-01

    We have fabricated superconductor - insulator - normal metal - insulator - superconductor (SINIS) tunnel junctions in which Al acts as the superconductor, AlOx is the insulator, and the normal metal consists of a thin Ti layer (5 nm) covered with a thicker Au layer (40 nm). We have characterized the junctions by measuring their current-voltage curves between 60 mK and 750 mK. For comparison, the same measurements have been performed for a SINIS junction pair whose normal metal is Cu. The Ti-Au bilayer decreases the SINIS tunneling resistance by an order of magnitude compared to junctions where Cu is used as normal metal, made with the same oxidation parameters. The Ti-Au devices are much more robust against chemical attacks, and their lower tunneling resistance makes them more robust against static charge. More significantly, they exhibit significantly stronger electron cooling than Cu devices with identical fabrication steps, when biased close to the energy gap of the superconducting Al. By using a self-consistent thermal model, we can fit the current-voltage characteristics well, and show an electron cooling from 200 mK to 110 mK, with a non-optimized device.

  6. a Recursive Approach to Compute Normal Forms

    NASA Astrophysics Data System (ADS)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  7. Estudo da região HII galática NGC 2579

    NASA Astrophysics Data System (ADS)

    Riffel, R.; Copetti, M. V. F.

    2003-08-01

    Since the discovery of chemical abundance gradients in spiral galaxies, galactic HII regions have been studied intensively with the aim of determining the shape of the chemical abundance gradient in the Milky Way. However, the shape of the galactic gradient remains controversial, and many HII regions are still unexplored. The galactic HII region NGC 2579 appears in Hα images as a small bright patch roughly 2 arcseconds in diameter, about 20 arcseconds east of RCW 20, and NGC 2579 is often confused with the latter. Despite its high surface brightness, NGC 2579 is a poorly studied object, probably because of problems in identifying it. As part of a project to re-evaluate the chemical abundance gradients of HII regions in the Milky Way, we are carrying out an extensive study of the basic physical properties of the galactic HII region NGC 2579, such as electron temperature, electron density, and chemical composition. We analyzed long-slit spectrophotometric data covering 3700 Å to 7750 Å obtained with the ESO 1.52 m telescope, Chile, in 2002. We determined the electron temperature using the [OIII] line ratio (λ4959+λ5007)/λ4363 and the electron density from the [SII] line ratio λ6716/λ6731. The chemical abundances of O, N, Cl, S, Ne and He were determined. We also carried out a study of UBVRI photometric images obtained in 2000 at the San Pedro Mártir astronomical observatory, Mexico, to identify and classify the ionizing stars of NGC 2579 and to determine the distance to this object.

  8. Neither Hematocrit Normalization nor Exercise Training Restores Oxygen Consumption to Normal Levels in Hemodialysis Patients

    PubMed Central

    Stray-Gundersen, James; Parsons, Dora Beth; Thompson, Jeffrey R.

    2016-01-01

    Patients treated with hemodialysis develop severely reduced functional capacity, which can be partially ameliorated by correcting anemia and through exercise training. In this study, we used perturbations of an erythroid-stimulating agent and exercise training to examine if and where limitation to oxygen transport exists in patients on hemodialysis. Twenty-seven patients on hemodialysis completed a crossover study consisting of two exercise training phases at two hematocrit (Hct) values: 30% (anemic) and 42% (physiologic; normalized by treatment with erythroid-stimulating agent). To determine primary outcome measures of peak power and oxygen consumption (VO2) and secondary measures related to components of oxygen transport and utilization, all patients underwent numerous tests at five time points: baseline, untrained at Hct of 30%, after training at Hct of 30%, untrained at Hct of 42%, and after training at Hct of 42%. Hct normalization, exercise training, or the combination thereof significantly improved peak power and VO2 relative to values in the untrained anemic phase. Hct normalization increased peak arterial oxygen and arteriovenous oxygen difference, whereas exercise training improved cardiac output, citrate synthase activity, and peak tissue diffusing capacity. However, although the increase in arterial oxygen observed in the combination phase reached a value similar to that in healthy sedentary controls, the increase in peak arteriovenous oxygen difference did not. Muscle biopsy specimens showed markedly thickened endothelium and electron–dense interstitial deposits. In conclusion, exercise and Hct normalization had positive effects but failed to normalize exercise capacity in patients on hemodialysis. This effect may be caused by abnormalities identified within skeletal muscle. PMID:27153927

  9. Muscular hypertrophy and atrophy in normal rats provoked by the administration of normal and denervated muscle extracts.

    PubMed

    Agüera, Eduardo; Castilla, Salvador; Luque, Evelio; Jimena, Ignacio; Leiva-Cepas, Fernando; Ruz-Caracuel, Ignacio; Peña, José

    2016-12-01

    This study was conducted to determine the effects of extracts obtained from both normal and denervated muscles on different muscle types. Wistar rats were used and were divided into a control group and four experimental groups. Each experimental group was treated intraperitoneally for 10 consecutive days with a different extract. These extracts were obtained from normal soleus muscle, denervated soleus, normal extensor digitorum longus, and denervated extensor digitorum longus. Following treatment, the soleus and extensor digitorum longus muscles were obtained for study under light and transmission electron microscopy; morphometric parameters and myogenic responses were also analyzed. The results demonstrated that treatment with normal soleus muscle and denervated soleus muscle extracts provoked hypertrophy and increased myogenic activity. In contrast, treatment with extracts from the normal and denervated EDL had a different effect depending on the muscle analyzed. In the soleus muscle it provoked hypertrophy of type I fibers and increased myogenic activity, while in the extensor digitorum longus atrophy of the type II fibers was observed without changes in myogenic activity. This suggests that the muscular responses of atrophy and hypertrophy may depend on different factors related to the muscle type, which could be related to innervation.

  10. Normalization of satellite imagery

    NASA Technical Reports Server (NTRS)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.
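
    A minimal sketch of the first step in such a conversion, turning sensor radiance into top-of-atmosphere reflectance by correcting for solar irradiance and sun angle (generic formula with hypothetical band values; the pixel-by-pixel atmospheric correction applied in the study is not shown):

```python
import math

def toa_reflectance(radiance, esun, sun_elevation_deg, earth_sun_dist_au=1.0):
    """Top-of-atmosphere reflectance: rho = pi * L * d^2 / (ESUN * cos(solar zenith))."""
    solar_zenith = math.radians(90.0 - sun_elevation_deg)
    return math.pi * radiance * earth_sun_dist_au ** 2 / (esun * math.cos(solar_zenith))

# Hypothetical TM-like values: radiance in W/(m^2 sr um), ESUN in W/(m^2 um).
print(round(toa_reflectance(radiance=55.0, esun=1554.0, sun_elevation_deg=35.0), 3))
```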

  11. Evaluation of CT-based SUV normalization

    NASA Astrophysics Data System (ADS)

    Devriese, Joke; Beels, Laurence; Maes, Alex; Van de Wiele, Christophe; Pottel, Hans

    2016-09-01

    The purpose of this study was to determine patients' lean body mass (LBM) and lean tissue (LT) mass using a computed tomography (CT)-based method, and to compare standardized uptake value (SUV) normalized by these parameters to conventionally normalized SUVs. Head-to-toe positron emission tomography (PET)/CT examinations were retrospectively retrieved and semi-automatically segmented into tissue types based on thresholding of CT Hounsfield units (HU). The following HU ranges were used for determination of CT-estimated LBM and LT (LBM_CT and LT_CT): -180 to -7 for adipose tissue (AT), -6 to 142 for LT, and 143 to 3010 for bone tissue (BT). Formula-estimated LBMs were calculated using formulas of James (1976 Research on Obesity: a Report of the DHSS/MRC Group (London: HMSO)) and Janmahasatian et al (2005 Clin. Pharmacokinet. 44 1051-65), and body surface area (BSA) was calculated using the DuBois formula (Dubois and Dubois 1989 Nutrition 5 303-11). The CT segmentation method was validated by comparing total patient body weight (BW) to CT-estimated BW (BW_CT). LBM_CT was compared to formula-based estimates (LBM_James and LBM_Janma). SUVs in two healthy reference tissues, liver and mediastinum, were normalized for the aforementioned parameters and compared to each other in terms of variability and dependence on normalization factors and BW. Comparison of actual BW to BW_CT shows a non-significant difference of 0.8 kg. LBM_James estimates are significantly higher than LBM_Janma with differences of 4.7 kg for female and 1.0 kg for male patients. Formula-based LBM estimates do not significantly differ from LBM_CT, neither for men nor for women. The coefficient of variation (CV) of SUV normalized for LBM_James (SUV_LBM-James) (12.3%) was significantly reduced in liver compared to SUV_BW (15.4%). All SUV variances in mediastinum were significantly reduced (CVs were 11.1-12.2%) compared to SUV_BW (15.5%), except SUV_BSA (15.2%). Only SUV_BW and SUV_LBM-James show
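
    The SUV variants compared above all share the same basic form: tissue activity concentration divided by injected dose per unit of the chosen normalization factor. A minimal sketch with hypothetical numbers (the LBM, BSA, and CT-derived factors themselves would come from the formulas and segmentation described in the abstract):

```python
def suv(activity_conc_kbq_per_ml, injected_dose_mbq, norm_factor_g):
    """SUV = tissue activity concentration / (injected dose / normalization factor)."""
    dose_kbq = injected_dose_mbq * 1000.0
    return activity_conc_kbq_per_ml / (dose_kbq / norm_factor_g)

conc = 5.2                        # kBq/mL, hypothetical liver activity concentration
dose = 300.0                      # MBq injected
bw_g, lbm_g = 85_000.0, 60_000.0  # hypothetical body weight and lean body mass, in grams
print(round(suv(conc, dose, bw_g), 2), round(suv(conc, dose, lbm_g), 2))  # SUV_BW vs SUV_LBM
```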

  12. Normal Psychosexual Development

    ERIC Educational Resources Information Center

    Rutter, Michael

    1971-01-01

    Normal sexual development is reviewed with respect to physical maturation, sexual interests, sex drive, psychosexual competence and maturity, gender role, object choice, children's concepts of sexual differences, sex role preference and standards, and psychosexual stages. Biologic, psychoanalytic and psychosocial theories are briefly considered.…

  13. Visual attention and flexible normalization pools

    PubMed Central

    Schwartz, Odelia; Coen-Cagli, Ruben

    2013-01-01

    Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413

  14. A normalization strategy for comparing tag count data

    PubMed Central

    2012-01-01

    Background: High-throughput sequencing, such as ribonucleic acid sequencing (RNA-seq) and chromatin immunoprecipitation sequencing (ChIP-seq) analyses, enables various features of organisms to be compared through tag counts. Recent studies have demonstrated that the normalization step for RNA-seq data is critical for a more accurate subsequent analysis of differential gene expression. Development of a more robust normalization method is desirable for identifying the true difference in tag count data. Results: We describe a strategy for normalizing tag count data, focusing on RNA-seq. The key concept is to remove data assigned as potential differentially expressed genes (DEGs) before calculating the normalization factor. Several R packages for identifying DEGs are currently available, and each package uses its own normalization method and gene ranking algorithm. We compared a total of eight package combinations: four R packages (edgeR, DESeq, baySeq, and NBPSeq) with their default normalization settings and with our normalization strategy. Many synthetic datasets under various scenarios were evaluated on the basis of the area under the curve (AUC) as a measure for both sensitivity and specificity. We found that packages using our strategy in the data normalization step overall performed well. This result was also observed for a real experimental dataset. Conclusion: Our results showed that the elimination of potential DEGs is essential for more accurate normalization of RNA-seq data. The concept of this normalization strategy can widely be applied to other types of tag count data and to microarray data. PMID:22475125
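
    A minimal sketch of the key idea, removing putative DEGs before computing per-sample normalization factors (a simplified illustration, not the exact algorithm of any of the R packages named above):

```python
import numpy as np

def deg_aware_size_factors(counts, group, drop_fraction=0.3):
    """Normalization factors computed after discarding the most DE-like genes."""
    counts = np.asarray(counts, dtype=float)
    # First pass: crude library-size scaling so samples are roughly comparable.
    scaled = counts / counts.sum(axis=0) * counts.sum(axis=0).mean()
    # Rank genes by a simple between-group log fold change and drop the most extreme.
    g0 = scaled[:, group == 0].mean(axis=1) + 0.5
    g1 = scaled[:, group == 1].mean(axis=1) + 0.5
    lfc = np.abs(np.log2(g1 / g0))
    keep = lfc <= np.quantile(lfc, 1.0 - drop_fraction)   # potential DEGs removed
    # Second pass: recompute factors on the remaining, presumably non-DE, genes.
    size = counts[keep].sum(axis=0)
    return size / size.mean()

rng = np.random.default_rng(2)
counts = rng.poisson(50, size=(1000, 6))
group = np.array([0, 0, 0, 1, 1, 1])
print(np.round(deg_aware_size_factors(counts, group), 3))
```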

  15. Parental Perceptions of the Outcome and Meaning of Normalization

    PubMed Central

    Knafl, Kathleen A.; Darney, Blair G.; Gallo, Agatha M.; Angst, Denise B.

    2010-01-01

    The purpose of this secondary analysis was to identify the meaning of normalization for parents of a child with a chronic genetic condition. The sample was comprised of 28 families (48 parents), selected to reflect two groups: Normalization Present (NP) and Normalization Absent (NA). Constant comparison analysis was used to identify themes characterizing parents' perceptions of the meaning of normalization. The meanings parents attributed to normalization reflected their evaluation of condition management, parenting role, and condition impact, with parents in the NP and NA groups demonstrating distinct patterns of meaning. These meaning patterns are discussed as an outcome of normalization. Providers can play a pivotal role in helping families achieve normalization by providing guidance on how to balance condition management with normal family life. PMID:20108258

  16. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  17. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  18. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  19. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  20. A normality bias in legal decision making.

    PubMed

    Prentice, Robert A; Koehler, Jonathan J

    2003-03-01

    It is important to understand how legal fact finders determine causation and assign blame. However, this process is poorly understood. Among the psychological factors that affect decision makers are an omission bias (a tendency to blame actions more than inactions [omissions] for bad results), and a normality bias (a tendency to react more strongly to bad outcomes that spring from abnormal rather than normal circumstances). The omission and normality biases often reinforce one another when inaction preserves the normal state and when action creates an abnormal state. But what happens when these biases push in opposite directions as they would when inaction promotes an abnormal state or when action promotes a normal state? Which bias exerts the stronger influence on the judgments and behaviors of legal decision makers? The authors address this issue in two controlled experiments. One experiment involves medical malpractice and the other involves stockbroker negligence. They find that jurors pay much more attention to the normality of conditions than to whether those conditions arose through acts or omissions. Defendants who followed a nontraditional medical treatment regime or who chose a nontraditional stock portfolio received more blame and more punishment for bad outcomes than did defendants who obtained equally poor results after recommending a traditional medical regime or a traditional stock portfolio. Whether these recommendations entailed an action or an omission was essentially irrelevant. The Article concludes with a discussion of the implications of a robust normality bias for American jurisprudence.

  1. Normal peer models and autistic children's learning.

    PubMed Central

    Egel, A L; Richman, G S; Koegel, R L

    1981-01-01

    Present research and legislation regarding mainstreaming autistic children into normal classrooms have raised the importance of studying whether autistic children can benefit from observing normal peer models. The present investigation systematically assessed whether autistic children's learning of discrimination tasks could be improved if they observed normal children perform the tasks correctly. In the context of a multiple baseline design, four autistic children worked on five discrimination tasks that their teachers reported were posing difficulty. Throughout the baseline condition the children evidenced very low levels of correct responding on all five tasks. In the subsequent treatment condition, when normal peers modeled correct responses, the autistic children's correct responding increased dramatically. In each case, the peer modeling procedure produced rapid achievement of the acquisition which was maintained after the peer models were removed. These results are discussed in relation to issues concerning observational learning and in relation to the implications for mainstreaming autistic children into normal classrooms. PMID:7216930

  2. CNN-based ranking for biomedical entity normalization.

    PubMed

    Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong

    2017-10-03

    Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions, but rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms traditional rule-based method with state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.

  3. Normal stresses in shear thickening granular suspensions.

    PubMed

    Pan, Zhongcheng; de Cagny, Henri; Habibi, Mehdi; Bonn, Daniel

    2017-05-24

    When subjected to shear, granular suspensions exhibit normal stresses perpendicular to the shear plane but the magnitude and sign of the different components of the normal stresses are still under debate. By performing both oscillatory and rotational rheology measurements on shear thickening granular suspensions and systematically varying the particle diameters and the gap sizes between two parallel-plates, we show that a transition from a positive to a negative normal stress can be observed. We find that frictional interactions which determine the shear thickening behavior of suspensions contribute to the positive normal stresses. Increasing the particle diameters or decreasing the gap sizes leads to a growing importance of hydrodynamic interactions, which results in negative normal stresses. We determine a relaxation time for the system, set by both the pore and the gap sizes, that governs the fluid flow through the inter-particle space. Finally, using a two-fluid model we determine the relative contributions from the particle phase and the liquid phase.

  4. Statistical normalization techniques for magnetic resonance imaging.

    PubMed

    Shinohara, Russell T; Sweeney, Elizabeth M; Goldsmith, Jeff; Shiee, Navid; Mateen, Farrah J; Calabresi, Peter A; Jarso, Samson; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2014-01-01

    While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
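
    One simple normalization of this kind, with a biologically interpretable unit, is z-scoring intensities against a reference-tissue mask; the sketch below is a generic illustration (the mask choice and robust statistics are assumptions, not the specific techniques proposed in the paper):

```python
import numpy as np

def zscore_normalize(image, reference_mask):
    """Express voxel intensities in robust z-score units relative to a reference tissue."""
    ref = image[reference_mask]
    mu = np.median(ref)
    sigma = 1.4826 * np.median(np.abs(ref - mu))   # robust spread (MAD-based)
    return (image - mu) / sigma

rng = np.random.default_rng(3)
img = rng.normal(400, 60, size=(64, 64, 32))       # arbitrary-unit MR intensities
mask = rng.random(img.shape) > 0.98                # hypothetical reference-tissue mask
normalized = zscore_normalize(img, mask)
print(round(float(normalized[mask].mean()), 2))    # reference tissue now centered near 0
```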

  5. Is My Penis Normal? (For Teens)

    MedlinePlus

    ... worried about whether his penis is a normal size. There's a fairly wide range of normal penis sizes — just as there is for every other body part. And just like other parts of the body, how a penis appears at different stages of a guy's life varies quite a ...

  6. A Skew-Normal Mixture Regression Model

    ERIC Educational Resources Information Center

    Liu, Min; Lin, Tsung-I

    2014-01-01

    A challenge associated with traditional mixture regression models (MRMs), which rest on the assumption of normally distributed errors, is determining the number of unobserved groups. Specifically, even slight deviations from normality can lead to the detection of spurious classes. The current work aims to (a) examine how sensitive the commonly…

  7. Correcting the Normalized Gain for Guessing

    ERIC Educational Resources Information Center

    Stewart, John; Stewart, Gay

    2010-01-01

    The normalized gain, "g", has been an important tool for the characterization of conceptual improvement in physics courses since its use in Hake's extensive study on conceptual learning in introductory physics. The normalized gain is calculated from the score on a pre-test administered before instruction and a post-test administered…

  8. Normalized Temperature Contrast Processing in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, as are methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves use of a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
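
    One common way to form a normalized contrast in flash thermography is to divide a pixel's post-flash temperature rise by that of a defect-free reference region; the sketch below uses that generic definition with synthetic cooling curves and is not necessarily the exact formulation of this paper:

```python
import numpy as np

def normalized_contrast(pixel_T, reference_T, pre_flash_frames=5):
    """Pixel temperature rise divided by reference-region rise, after pre-flash subtraction."""
    pixel_rise = pixel_T - pixel_T[:pre_flash_frames].mean()
    ref_rise = reference_T - reference_T[:pre_flash_frames].mean()
    return pixel_rise / np.where(np.abs(ref_rise) < 1e-6, np.nan, ref_rise)

t = np.linspace(0.0, 2.0, 200)
ref = 25 + np.r_[np.zeros(5), 8.0 / np.sqrt(t[5:])]   # sound-area cooling after the flash
pix = 25 + np.r_[np.zeros(5), 9.0 / np.sqrt(t[5:])]   # slower-cooling pixel over a flaw
print(np.nanmax(normalized_contrast(pix, ref)))       # > 1 flags the contrast anomaly
```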

  9. Evaluation of normalization methods in mammalian microRNA-Seq data

    PubMed Central

    Garmire, Lana Xia; Subramaniam, Shankar

    2012-01-01

    Simple total tag count normalization is inadequate for microRNA sequencing data generated from next-generation sequencing technology. However, a systematic evaluation of normalization methods on microRNA sequencing data has so far been lacking. We comprehensively evaluate seven commonly used normalization methods, including global normalization, Lowess normalization, the Trimmed Mean Method (TMM), quantile normalization, scaling normalization, variance stabilization, and the invariant method. We assess these methods on two individual experimental data sets with the empirical statistical metrics of mean square error (MSE) and the Kolmogorov-Smirnov (K-S) statistic. Additionally, we evaluate the methods against results from quantitative PCR validation. Our results consistently show that Lowess normalization and quantile normalization perform the best, whereas TMM, a method applied to RNA-Sequencing normalization, performs the worst. The poor performance of TMM normalization is further evidenced by abnormal results from the test of differential expression (DE) of microRNA-Seq data. Compared with the models used for DE, the choice of normalization method is the primary factor that affects the results of DE. In summary, Lowess normalization and quantile normalization are recommended for normalizing microRNA-Seq data, whereas the TMM method should be used with caution. PMID:22532701
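
    Of the methods compared above, quantile normalization is easy to state compactly: every sample is forced to share the same empirical intensity distribution. A minimal sketch on synthetic counts (illustration only, not the evaluation pipeline of the paper):

```python
import numpy as np

def quantile_normalize(x):
    """Replace each column's values with the mean of the sorted columns, preserving ranks."""
    x = np.asarray(x, dtype=float)
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    reference = np.sort(x, axis=0).mean(axis=1)        # shared reference distribution
    return reference[ranks]

rng = np.random.default_rng(4)
counts = rng.lognormal(3, 1, size=(200, 5)) * np.array([1.0, 1.5, 0.7, 2.0, 1.2])
normalized = quantile_normalize(counts)
print(np.round(normalized.mean(axis=0), 2))            # column means are now identical
```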

  10. Economic values under inappropriate normal distribution assumptions.

    PubMed

    Sadeghi-Sefidmazgi, A; Nejati-Javaremi, A; Moradi-Shahrbabak, M; Miraei-Ashtiani, S R; Amer, P R

    2012-08-01

    The objectives of this study were to quantify the errors in economic values (EVs) for traits affected by cost or price thresholds when skewed or kurtotic distributions of varying degree are assumed to be normal and when data with a normal distribution are subject to censoring. EVs were estimated for a continuous trait with dichotomous economic implications because of a price premium or penalty arising from a threshold ranging between -4 and 4 standard deviations from the mean. To evaluate the impacts of skewness, positive excess kurtosis, and negative excess kurtosis, the standard skew-normal, Pearson, and raised cosine distributions were used, respectively. For the various evaluable levels of skewness and kurtosis, the results showed that EVs can be underestimated or overestimated by more than 100% when price-determining thresholds fall within a range from the mean that might be expected in practice. Estimates of EVs were very sensitive to censoring or missing data. In contrast to practical genetic evaluation, economic evaluation is very sensitive to lack of normality and missing data. Although in some special situations the presence of multiple thresholds may attenuate the combined effect of errors at each threshold point, in practical situations there is a tendency for a few key thresholds to dominate the EV, and there are many situations where errors could be compounded across multiple thresholds. In the development of breeding objectives for non-normal continuous traits influenced by value thresholds, it is necessary to select a transformation that will resolve problems of non-normality or to consider alternative methods that are less sensitive to non-normality.

  11. Univariate normalization of bispectrum using Hölder's inequality.

    PubMed

    Shahbazi, Forooz; Ewald, Arne; Nolte, Guido

    2014-08-15

    Considering that many biological systems including the brain are complex non-linear systems, suitable methods capable of detecting these non-linearities are required to study the dynamical properties of these systems. One of these tools is the third-order cumulant or cross-bispectrum, which is a measure of interfrequency interactions between three signals. For convenient interpretation, interaction measures are most commonly normalized to be independent of constant scales of the signals such that their absolute values are bounded by one, with this limit reflecting perfect coupling. Although many different normalization factors for cross-bispectra have been suggested in the literature, these either do not lead to bounded measures or are themselves dependent on the coupling and not only on the scale of the signals. In this paper we suggest a normalization factor which is univariate, i.e., dependent only on the amplitude of each signal and not on the interactions between signals. Using a generalization of Hölder's inequality, it is proven that the absolute value of this univariate bicoherence is bounded by zero and one. We compared three widely used normalizations to the univariate normalization concerning the significance of bicoherence values gained from resampling tests. Bicoherence values are calculated from real EEG data recorded in an eyes-closed experiment from 10 subjects. The results show slightly more significant values for the univariate normalization but in general, the differences are very small or even vanishing in some subjects. Therefore, we conclude that the normalization factor does not play an important role in the bicoherence values with regard to statistical power, although a univariate normalization is the only normalization factor which fulfills all the required conditions of a proper normalization. Copyright © 2014 Elsevier B.V. All rights reserved.
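
    The kind of bound that makes a univariate normalization work can be written with the generalized Hölder inequality (exponents 3, 3, 3); the notation below is generic and not copied from the paper:

```latex
% Cross-bispectrum magnitude bounded by a product of per-signal (univariate) third moments:
\[
  \bigl|\,\mathbb{E}\!\left[X(f_1)\,Y(f_2)\,Z^{*}(f_1+f_2)\right]\bigr|
  \;\le\;
  \Bigl(\mathbb{E}\bigl[|X(f_1)|^{3}\bigr]\;
        \mathbb{E}\bigl[|Y(f_2)|^{3}\bigr]\;
        \mathbb{E}\bigl[|Z(f_1+f_2)|^{3}\bigr]\Bigr)^{1/3}
\]
% Dividing the left side by the right side therefore yields a bicoherence whose absolute
% value lies between 0 and 1, with a normalizer that depends only on each signal's amplitude.
```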

  12. Score Normalization for Keyword Search

    DTIC Science & Technology

    2016-06-23

    Score Normalization for Keyword Search (original Turkish title: "Anahtar Sözcük Arama için Skor Düzgeleme"). Leda Sarı, Murat Saraçlar, Department of Electrical and Electronics Engineering... Abstract—In this work, keyword search (KWS) is based on a symbolic index that uses a posteriorgram representation of the speech data... For each query, sum-to-one normalization or keyword-specific thresholding is applied to the search results. The effect of these methods on the proposed

  13. Normal forms of Hopf-zero singularity

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  14. Normal stresses in semiflexible polymer hydrogels

    NASA Astrophysics Data System (ADS)

    Vahabi, M.; Vos, Bart E.; de Cagny, Henri C. G.; Bonn, Daniel; Koenderink, Gijsje H.; MacKintosh, F. C.

    2018-03-01

    Biopolymer gels such as fibrin and collagen networks are known to develop tensile axial stress when subject to torsion. This negative normal stress is opposite to the classical Poynting effect observed for most elastic solids including synthetic polymer gels, where torsion provokes a positive normal stress. As shown recently, this anomalous behavior in fibrin gels depends on the open, porous network structure of biopolymer gels, which facilitates interstitial fluid flow during shear and can be described by a phenomenological two-fluid model with viscous coupling between network and solvent. Here we extend this model and develop a microscopic model for the individual diagonal components of the stress tensor that determine the axial response of semiflexible polymer hydrogels. This microscopic model predicts that the magnitude of these stress components depends inversely on the characteristic strain for the onset of nonlinear shear stress, which we confirm experimentally by shear rheometry on fibrin gels. Moreover, our model predicts a transient behavior of the normal stress, which is in excellent agreement with the full time-dependent normal stress we measure.

  15. Forced Normalization: Antagonism Between Epilepsy and Psychosis.

    PubMed

    Kawakami, Yasuhiko; Itoh, Yasuhiko

    2017-05-01

    The antagonism between epilepsy and psychosis has been discussed for a long time. Landolt coined the term "forced normalization" in the 1950s to describe psychotic episodes associated with the remission of seizures and disappearance of epileptiform activity on electroencephalograms in individuals with epilepsy. Since then, neurologists and psychiatrists have been intrigued by this phenomenon. However, although collaborative clinical studies and basic experimental researches have been performed, the mechanism of forced normalization remains unknown. In this review article, we present a historical overview of the concept of forced normalization, and discuss potential pathogenic mechanisms and clinical diagnosis. We also discuss the role of dopamine, which appears to be a key factor in the mechanism of forced normalization. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Cultured normal mammalian tissue and process

    NASA Technical Reports Server (NTRS)

    Goodwin, Thomas J. (Inventor); Prewett, Tacey L. (Inventor); Wolf, David A. (Inventor); Spaulding, Glenn F. (Inventor)

    1993-01-01

    Normal mammalian tissue and the culturing process has been developed for the three groups of organ, structural and blood tissue. The cells are grown in vitro under microgravity culture conditions and form three dimensional cell aggregates with normal cell function. The microgravity culture conditions may be microgravity or simulated microgravity created in a horizontal rotating wall culture vessel.

  17. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, J.; Markley, F. L.; Bar-Itzhack, Itzhack Y.

    1993-01-01

    Attitude determination of spacecraft usually utilizes vector measurements such as Sun, center of Earth, star, and magnetic field direction to update the quaternion which determines the spacecraft orientation with respect to some reference coordinates in the three dimensional space. These measurements are usually processed by an extended Kalman filter (EKF) which yields an estimate of the attitude quaternion. Two EKF versions for quaternion estimation were presented in the literature; namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). In the multiplicative EKF, it is assumed that the error between the correct quaternion and its a-priori estimate is, by itself, a quaternion that represents the rotation necessary to bring the attitude which corresponds to the a-priori estimate of the quaternion into coincidence with the correct attitude. The EKF basically estimates this quotient quaternion and then the updated quaternion estimate is obtained by the product of the a-priori quaternion estimate and the estimate of the difference quaternion. In the additive EKF, it is assumed that the error between the a-priori quaternion estimate and the correct one is an algebraic difference between two four-tuple elements and thus the EKF is set to estimate this difference. The updated quaternion is then computed by adding the estimate of the difference to the a-priori quaternion estimate. If the quaternion estimate converges to the correct quaternion, then, naturally, the quaternion estimate has unity norm. This fact was utilized in the past to obtain superior filter performance by applying normalization to the filter measurement update of the quaternion. It was observed for the AEKF that when the attitude changed very slowly between measurements, normalization merely resulted in a faster convergence; however, when the attitude changed considerably between measurements, without filter tuning or normalization, the quaternion estimate diverged. However, when the
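
    A minimal sketch of the brute-force renormalization step discussed for the additive EKF, assuming nothing about the rest of the filter: the four-component estimate is simply rescaled to unit norm after a measurement update.

        import numpy as np

        def normalize_quaternion(q):
            """Rescale a four-component attitude quaternion to unit norm.

            In an additive EKF the measurement update can pull the estimate
            off the unit sphere; dividing by the Euclidean norm is the
            simplest normalization scheme."""
            q = np.asarray(q, dtype=float)
            n = np.linalg.norm(q)
            if n == 0.0:
                raise ValueError("zero quaternion cannot be normalized")
            return q / n

        # A hypothetical updated estimate that has drifted slightly off unit norm
        q_est = np.array([0.02, -0.01, 0.003, 1.01])
        q_hat = normalize_quaternion(q_est)
        print(q_hat, np.linalg.norm(q_hat))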

  18. Normalization vs. Social Role Valorization: Similar or Different?

    ERIC Educational Resources Information Center

    Kumar, Akhilesh; Singh, Rajani Ranjan; Thressiakutty, A. T.

    2015-01-01

    The radical changes towards services for persons with disabilities were brought by Principle of Normalization, originated in 1969. As a consequence of Normalization, disability as a whole, and intellectual disability in particular, received the attention of the masses and the intelligentsia began advocating normalization ideologies which became…

  19. Flow derivatives and curvatures for a normal shock

    NASA Astrophysics Data System (ADS)

    Emanuel, G.

    2018-03-01

    A detached bow shock wave is strongest where it is normal to the upstream velocity. While the jump conditions across the shock are straightforward, many properties, such as the shock's curvatures and derivatives of the pressure, along and normal to a normal shock, are indeterminate. A novel procedure is introduced for resolving the indeterminacy when the unsteady flow is three-dimensional and the upstream velocity may be nonuniform. Utilizing this procedure, normal shock relations are provided for the nonunique orientation of the flow plane and the corresponding shock's curvatures and, e.g., the downstream normal derivatives of the pressure and the velocity components. These algebraic relations explicitly show the dependence of these parameters on the shock's shape and the upstream velocity gradient. A simple relation, valid only for a normal shock, is obtained for the average curvatures. Results are also obtained when the shock is an elliptic paraboloid shock. These derivatives are both simple and proportional to the average curvature.

  20. GC-Content Normalization for RNA-Seq Data

    PubMed Central

    2011-01-01

    Background Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
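
    A very loose analogue of within-lane, gene-level GC-content normalization is sketched below in Python (the published methods live in the R/Bioconductor package EDASeq and are more refined); the binning-and-rescaling scheme, bin count, and simulated counts are illustrative assumptions only.

        import numpy as np

        def within_lane_gc_normalize(counts, gc_content, n_bins=10):
            """Binned GC-content rescaling of gene-level read counts.

            Genes are stratified into GC-content bins and each gene's count
            is rescaled by its bin's median relative to the overall median.
            This is only a rough stand-in for the regression/full-quantile
            methods implemented in EDASeq."""
            counts = np.asarray(counts, dtype=float)
            gc = np.asarray(gc_content, dtype=float)
            edges = np.quantile(gc, np.linspace(0.0, 1.0, n_bins + 1))
            idx = np.clip(np.digitize(gc, edges[1:-1]), 0, n_bins - 1)
            overall = np.median(counts[counts > 0])
            normalized = counts.copy()
            for b in range(n_bins):
                in_bin = idx == b
                bin_counts = counts[in_bin]
                bin_med = np.median(bin_counts[bin_counts > 0])
                if bin_med > 0:
                    normalized[in_bin] = bin_counts * overall / bin_med
            return normalized

        # Simulated counts that rise with GC content get pulled toward a common scale
        rng = np.random.default_rng(3)
        gc = rng.uniform(0.3, 0.7, 1000)
        counts = rng.poisson(lam=50 * (1 + 2 * (gc - 0.3)))
        print(counts.mean(), within_lane_gc_normalize(counts, gc).mean())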

  1. Normal IQ is possible in Smith-Lemli-Opitz syndrome.

    PubMed

    Eroglu, Yasemen; Nguyen-Driver, Mina; Steiner, Robert D; Merkens, Louise; Merkens, Mark; Roullet, Jean-Baptiste; Elias, Ellen; Sarphare, Geeta; Porter, Forbes D; Li, Chumei; Tierney, Elaine; Nowaczyk, Małgorzata J; Freeman, Kurt A

    2017-08-01

    Children with Smith-Lemli-Opitz syndrome (SLOS) are typically reported to have moderate to severe intellectual disability. This study aims to determine whether normal cognitive function is possible in this population and to describe clinical, biochemical and molecular characteristics of children with SLOS and normal intelligence quotient (IQ). The study included children with SLOS who underwent cognitive testing in four centers. All children with at least one IQ composite score above 80 were included in the study. Six girls and three boys with SLOS were found to have normal or low-normal IQ in a cohort of 145 children with SLOS. Major/multiple organ anomalies and low serum cholesterol levels were uncommon. No correlation between IQ and genotype was evident and no specific developmental profile was observed. Thus, normal or low-normal cognitive function is possible in SLOS. Further studies are needed to elucidate factors contributing to normal or low-normal cognitive function in children with SLOS. © 2017 Wiley Periodicals, Inc.

  2. Normal gravity field in relativistic geodesy

    NASA Astrophysics Data System (ADS)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in the development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and the multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of Earth's gravitational field are referred is a normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are

  3. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  4. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  5. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  6. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  7. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  8. FORCED NORMALIZATION: Epilepsy and Psychosis Interaction

    PubMed Central

    Loganathan, Muruga A.; Enja, Manasa

    2015-01-01

    Forced normalization is the emergence of psychosis following the establishment of seizure control in a patient with previously uncontrolled epilepsy. Two illustrative clinical vignettes are provided of people whose epilepsy was newly controlled and who subsequently developed psychosis; symptoms appeared only after ictal control was attained. For purposes of recognition and differential diagnosis, understanding forced normalization is important in clinical practice. PMID:26155377

  9. Corticocortical feedback increases the spatial extent of normalization.

    PubMed

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.

  10. Normalization of Gravitational Acceleration Models

    NASA Technical Reports Server (NTRS)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.

    2011-01-01

    Unlike the uniform-density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities, by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining the normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
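
    For reference, the conventional "full" normalization factor used to convert an associated Legendre function into its normalized counterpart can be written directly from factorials, as in the sketch below; this textbook factor is shown only to make the term "normalized ALFs" concrete, whereas the Lear and Gottlieb algorithms generate the normalized functions recursively.

        from math import factorial, sqrt

        def full_normalization_factor(n, m):
            """Geodesy-style 'full' normalization factor N_nm such that the
            normalized associated Legendre function is Pbar_nm = N_nm * P_nm."""
            delta = 1 if m == 0 else 0
            return sqrt((2 - delta) * (2 * n + 1) * factorial(n - m) / factorial(n + m))

        # Example: degree 4, order 2
        print(full_normalization_factor(4, 2))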

  11. Normalized Temperature Contrast Processing in Flash Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing for the flash infrared thermography method given by the author in US 8,577,120 B1. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, including converting one from the other. Methods of assessing emissivity of the object, afterglow heat flux, reflection temperature change and temperature video imaging during flash thermography are provided. Temperature imaging and normalized temperature contrast imaging provide certain advantages over pixel-intensity normalized contrast processing by reducing the effect of reflected energy in images and measurements, providing better quantitative data. The subject matter for this paper mostly comes from US 9,066,028 B1 by the author. Examples of normalized image processing video images and normalized temperature processing video images are provided. Examples of surface temperature video images, surface temperature rise video images and simple contrast video images are also provided. Temperature video imaging in flash infrared thermography allows better comparison with flash thermography simulation using commercial software which provides temperature video as the output. Temperature imaging also allows easy comparison of the surface temperature change to the camera temperature sensitivity or noise equivalent temperature difference (NETD) to assess the probability of detection (POD) of anomalies.

  12. Understanding a Normal Distribution of Data (Part 2).

    PubMed

    Maltenfort, Mitchell

    2016-02-01

    Completing the discussion of data normality, advanced techniques for analysis of non-normal data are discussed including data transformation, Generalized Linear Modeling, and bootstrapping. Relative strengths and weaknesses of each technique are helpful in choosing a strategy, but help from a statistician is usually necessary to analyze non-normal data using these methods.
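
    As a small illustration of the bootstrapping option mentioned above (the example data and interval choice are assumptions, not taken from the article), a nonparametric bootstrap can give a confidence interval for the mean of skewed data without invoking normality:

        import numpy as np

        rng = np.random.default_rng(42)

        # A strongly right-skewed (non-normal) sample
        sample = rng.lognormal(mean=1.0, sigma=0.8, size=120)

        # Nonparametric bootstrap of the mean: resample with replacement many times
        boot_means = np.array([
            rng.choice(sample, size=sample.size, replace=True).mean()
            for _ in range(5000)
        ])

        # Percentile 95% confidence interval; no normality assumption required
        ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
        print(f"mean = {sample.mean():.2f}, 95% bootstrap CI = ({ci_low:.2f}, {ci_high:.2f})")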

  13. High-speed digital signal normalization for feature identification

    NASA Technical Reports Server (NTRS)

    Ortiz, J. A.; Meredith, B. D.

    1983-01-01

    A design approach for high-speed normalization of digital signals was developed. A reciprocal look-up table technique is employed, in which a digital value is mapped to its reciprocal via a high-speed memory. This reciprocal is then multiplied with the input signal to obtain the normalized result. Normalization considerably improves the accuracy of certain feature identification algorithms. By using the concept of pipelining, the multispectral sensor data processing rate is limited only by the speed of the multiplier. The breadboard system was found to operate at an execution rate of five million normalizations per second. This design features high precision, reduced hardware complexity, high flexibility, and expandability, which are very important considerations for spaceborne applications. It also accomplishes a high-speed normalization rate essential for real-time data processing.
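
    A software sketch of the reciprocal look-up-table idea is shown below; the table size, data width, and example values are illustrative assumptions, and real implementations realize the table in high-speed memory feeding a hardware multiplier.

        import numpy as np

        # Precomputed reciprocal look-up table for 8-bit values (index 0 left at
        # zero; hardware designs typically trap or saturate on a zero divisor)
        LUT_BITS = 8
        recip_lut = np.zeros(2 ** LUT_BITS)
        recip_lut[1:] = 1.0 / np.arange(1, 2 ** LUT_BITS)

        def normalize(signal, reference):
            """Normalize 'signal' by 'reference': a table read replaces the slow
            divide, leaving only a multiply per sample."""
            signal = np.asarray(signal, dtype=float)
            reference = np.asarray(reference, dtype=np.uint8)
            return signal * recip_lut[reference]

        # Example: ratio of two multispectral channels (illustrative values)
        band_a = np.array([120.0, 200.0, 64.0])
        band_b = np.array([60, 100, 32], dtype=np.uint8)
        print(normalize(band_a, band_b))  # -> [2. 2. 2.]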

  14. Antitissue Transglutaminase Normalization Postdiagnosis in Children With Celiac Disease.

    PubMed

    Isaac, Daniela Migliarese; Rajani, Seema; Yaskina, Maryna; Huynh, Hien Q; Turner, Justine M

    2017-08-01

    Limited pediatric data exist examining the trend and predictors of antitissue transglutaminase (atTG) normalization over time in children with celiac disease (CD). We aimed to evaluate time to normalization of atTG in children after CD diagnosis, and to assess for independent predictors affecting this duration. A retrospective chart review was completed in pediatric patients with CD diagnosed from 2007 to 2014 at the Stollery Children's Hospital Celiac Clinic (Edmonton, Alberta, Canada). The clinical predictors assessed for impact on time to atTG normalization were initial atTG, Marsh score at diagnosis, gluten-free diet compliance (GFDC), age at diagnosis, sex, ethnicity, medical comorbidities, and family history of CD. Kaplan-Meier survival analysis was completed to assess time to atTG normalization, and Cox regression to assess for independent predictors of this time. A total of 487 patients met inclusion criteria. Approximately 80.5% of patients normalized atTG levels. Median normalization time was 407 days for all patients (95% confidence interval [CI: 361-453]), and 364 days for gluten-free diet compliant patients (95% CI [335-393]). Type 1 diabetes mellitus (T1DM) patients took significantly longer to normalize at 1204 days (95% CI [199-2209], P < 0.001). Cox regression demonstrated T1DM (hazard ratio = 0.36 [0.24-0.55], P < 0.001) and higher baseline atTG (hazard ratio = 0.52 [0.43-0.63], P < 0.001) were significant predictors of longer atTG normalization time. GFDC was a significant predictor of earlier normalization (OR = 13.91 [7.86-24.62], P < 0.001). GFDC and lower atTG at diagnosis are predictors of earlier normalization. Patients with T1DM are less likely to normalize atTG levels, with longer normalization time. Additional research and education for higher-risk populations are needed.

  15. Normalization of energy-dependent gamma survey data.

    PubMed

    Whicker, Randy; Chambers, Douglas

    2015-05-01

    Instruments and methods for normalization of energy-dependent gamma radiation survey data to a less energy-dependent basis of measurement are evaluated based on relevant field data collected at 15 different sites across the western United States along with a site in Mongolia. Normalization performance is assessed relative to measurements with a high-pressure ionization chamber (HPIC) due to its "flat" energy response and accurate measurement of the true exposure rate from both cosmic and terrestrial radiation. While analytically ideal for normalization applications, cost and practicality disadvantages have increased demand for alternatives to the HPIC. Regression analysis on paired measurements between energy-dependent sodium iodide (NaI) scintillation detectors (5-cm by 5-cm crystal dimensions) and the HPIC revealed highly consistent relationships among sites not previously impacted by radiological contamination (natural sites). A resulting generalized data normalization factor based on the average sensitivity of NaI detectors to naturally occurring terrestrial radiation (0.56 nGy/h HPIC per nGy/h NaI), combined with the calculated site-specific estimate of cosmic radiation, produced reasonably accurate predictions of HPIC readings at natural sites. Normalization against two potential alternative instruments (a tissue-equivalent plastic scintillator and an energy-compensated NaI detector) did not perform better than the sensitivity adjustment approach at natural sites. Each approach produced unreliable estimates of HPIC readings at radiologically impacted sites, though normalization against the plastic scintillator or energy-compensated NaI detector can address incompatibilities between different energy-dependent instruments with respect to estimation of soil radionuclide levels. The appropriate data normalization method depends on the nature of the site, expected duration of the project, survey objectives, and considerations of cost and practicality.
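
    Using only the quantities reported in the abstract, the normalization reduces to a one-line estimate; the numeric inputs in the example are hypothetical readings, and the approach applies only to uncontaminated (natural) sites.

        def estimate_hpic_exposure(nai_terrestrial_nGy_h, cosmic_nGy_h):
            """HPIC-equivalent exposure rate from an energy-dependent 5x5 cm NaI
            reading: generalized terrestrial sensitivity factor (0.56, from the
            abstract) plus a site-specific cosmic-ray term."""
            return 0.56 * nai_terrestrial_nGy_h + cosmic_nGy_h

        # Hypothetical NaI terrestrial reading and cosmic estimate, in nGy/h
        print(estimate_hpic_exposure(nai_terrestrial_nGy_h=80.0, cosmic_nGy_h=40.0))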

  16. Speaker normalization for chinese vowel recognition in cochlear implants.

    PubMed

    Luo, Xin; Fu, Qian-Jie

    2005-07-01

    Because of the limited spectro-temporal resolution associated with cochlear implants, implant patients often have greater difficulty with multitalker speech recognition. The present study investigated whether multitalker speech recognition can be improved by applying speaker normalization techniques to cochlear implant speech processing. Multitalker Chinese vowel recognition was tested with normal-hearing Chinese-speaking subjects listening to a 4-channel cochlear implant simulation, with and without speaker normalization. For each subject, speaker normalization was referenced to the speaker that produced the best recognition performance under conditions without speaker normalization. To match the remaining speakers to this "optimal" output pattern, the overall frequency range of the analysis filter bank was adjusted for each speaker according to the ratio of the mean third formant frequency values between the specific speaker and the reference speaker. Results showed that speaker normalization provided a small but significant improvement in subjects' overall recognition performance. After speaker normalization, subjects' patterns of recognition performance across speakers changed, demonstrating the potential for speaker-dependent effects with the proposed normalization technique.
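
    The frequency-range adjustment described above can be sketched as a simple rescaling by the F3 ratio; the baseline filter-bank limits and formant values below are made-up placeholders, not parameters from the study.

        def normalized_filterbank_range(low_hz, high_hz, speaker_mean_f3_hz, reference_mean_f3_hz):
            """Rescale the overall analysis filter-bank range for one speaker by the
            ratio of that speaker's mean third-formant frequency to the reference
            ('optimal') speaker's mean F3."""
            ratio = speaker_mean_f3_hz / reference_mean_f3_hz
            return low_hz * ratio, high_hz * ratio

        # Hypothetical values: reference speaker F3 = 2800 Hz, new speaker F3 = 3100 Hz
        print(normalized_filterbank_range(200.0, 7000.0, 3100.0, 2800.0))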

  17. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  18. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 7 2013-01-01 2013-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  19. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 7 2012-01-01 2012-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  20. Rhythm-based heartbeat duration normalization for atrial fibrillation detection.

    PubMed

    Islam, Md Saiful; Ammour, Nassim; Alajlan, Naif; Aboalsamh, Hatim

    2016-05-01

    Screening of atrial fibrillation (AF) for high-risk patients including all patients aged 65 years and older is important for prevention of risk of stroke. Different technologies such as modified blood pressure monitor, single lead ECG-based finger-probe, and smart phone using plethysmogram signal have been emerging for this purpose. All these technologies use irregularity of heartbeat duration as a feature for AF detection. We have investigated a normalization method of heartbeat duration for improved AF detection. AF is an arrhythmia in which heartbeat duration generally becomes irregularly irregular. From a window of heartbeat duration, we estimate the possible rhythm of the majority of heartbeats and normalize duration of all heartbeats in the window based on the rhythm so that we can measure the irregularity of heartbeats for both AF and non-AF rhythms in the same scale. Irregularity is measured by the entropy of distribution of the normalized duration. Then we classify a window of heartbeats as AF or non-AF by thresholding the measured irregularity. The effect of this normalization is evaluated by comparing AF detection performances using duration with the normalization, without normalization, and with other existing normalizations. Sensitivity and specificity of AF detection using normalized heartbeat duration were tested on two landmark databases available online and compared with results of other methods (with/without normalization) by receiver operating characteristic (ROC) curves. ROC analysis showed that the normalization was able to improve the performance of AF detection and it was consistent for a wide range of sensitivity and specificity for use of different thresholds. Detection accuracy was also computed for equal rates of sensitivity and specificity for different methods. Using normalized heartbeat duration, we obtained 96.38% accuracy which is more than 4% improvement compared to AF detection without normalization. The proposed normalization
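
    A toy sketch of the rhythm-based normalization idea follows; the rhythm estimator (a plain median), bin layout, and threshold are simplifying assumptions of this sketch rather than the paper's actual algorithm or tuned values.

        import numpy as np

        def af_irregularity(rr_intervals, n_bins=16, entropy_threshold=2.0):
            """Rhythm-based normalization sketch: estimate the majority rhythm
            (here simply the median RR interval), divide every interval by it,
            and measure irregularity as the Shannon entropy of the normalized
            duration histogram."""
            rr = np.asarray(rr_intervals, dtype=float)
            rhythm = np.median(rr)             # crude majority-rhythm estimate
            normalized = rr / rhythm           # scale-free beat durations
            hist, _ = np.histogram(normalized, bins=n_bins, range=(0.4, 2.0))
            p = hist / hist.sum()
            p = p[p > 0]
            entropy = -np.sum(p * np.log2(p))  # irregularity measure
            return entropy, bool(entropy > entropy_threshold)

        # A regular rhythm versus an irregularly irregular one (simulated RR in seconds)
        regular = 0.8 + 0.02 * np.random.default_rng(1).standard_normal(60)
        irregular = np.random.default_rng(2).uniform(0.5, 1.3, 60)
        print(af_irregularity(regular), af_irregularity(irregular))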

  1. Normal modes of weak colloidal gels

    NASA Astrophysics Data System (ADS)

    Varga, Zsigmond; Swan, James W.

    2018-01-01

    The normal modes and relaxation rates of weak colloidal gels are investigated in calculations using different models of the hydrodynamic interactions between suspended particles. The relaxation spectrum is computed for freely draining, Rotne-Prager-Yamakawa, and accelerated Stokesian dynamics approximations of the hydrodynamic mobility in a normal mode analysis of a harmonic network representing several colloidal gels. We find that the density of states and spatial structure of the normal modes are fundamentally altered by long-ranged hydrodynamic coupling among the particles. Short-ranged coupling due to hydrodynamic lubrication affects only the relaxation rates of short-wavelength modes. Hydrodynamic models accounting for long-ranged coupling exhibit a microscopic relaxation rate λ for each normal mode that scales as l^(-2), where l is the spatial correlation length of the normal mode. For the freely draining approximation, which neglects long-ranged coupling, the microscopic relaxation rate scales as l^(-γ), where γ varies between three and two with increasing particle volume fraction. A simple phenomenological model of the internal elastic response to normal mode fluctuations is developed, which shows that long-ranged hydrodynamic interactions play a central role in the viscoelasticity of the gel network. Dynamic simulations of hard spheres that gel in response to short-ranged depletion attractions are used to test the applicability of the density of states predictions. For particle concentrations up to 30% by volume, the power law decay of the relaxation modulus in simulations accounting for long-ranged hydrodynamic interactions agrees with predictions generated by the density of states of the corresponding harmonic networks as well as experimental measurements. For higher volume fractions, excluded volume interactions dominate the stress response, and the prediction from the harmonic network density of states fails. Analogous to the Zimm model in polymer

  2. Corticocortical feedback increases the spatial extent of normalization

    PubMed Central

    Nassi, Jonathan J.; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T.

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a “normalization pool.” Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing. PMID:24910596

  3. Plasma Electrolyte Distributions in Humans-Normal or Skewed?

    PubMed

    Feldman, Mark; Dickson, Beverly

    2017-11-01

    It is widely believed that plasma electrolyte levels are normally distributed. Statistical tests and calculations using plasma electrolyte data are often reported based on this assumption of normality. Examples include t tests, analysis of variance, correlations and confidence intervals. The purpose of our study was to determine whether plasma sodium (Na+), potassium (K+), chloride (Cl-) and bicarbonate (HCO3-) distributions are indeed normally distributed. We analyzed plasma electrolyte data from 237 consecutive adults (137 women and 100 men) who had normal results on a standard basic metabolic panel which included plasma electrolyte measurements. The skewness of each distribution (as a measure of its asymmetry) was compared to the zero skewness of a normal (Gaussian) distribution. The plasma Na+ distribution was skewed slightly to the right, but the skew was not significantly different from zero skew. The plasma Cl- distribution was skewed slightly to the left, but again the skew was not significantly different from zero skew. On the contrary, both the plasma K+ and HCO3- distributions were significantly skewed to the right (P < 0.01 vs. zero skew). There was also a suggestion from examining frequency distribution curves that the K+ and HCO3- distributions were bimodal. In adults with a normal basic metabolic panel, plasma potassium and bicarbonate levels are not normally distributed and may be bimodal. Thus, statistical methods used to evaluate these 2 plasma electrolytes should be nonparametric tests and not parametric ones that require a normal distribution. Copyright © 2017 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  4. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log normal distribution appeared reasonable because nearly all visual psychological data is plotted on a logarithmic scale. It has the additional advantage that it is bounded to positive values; an important consideration since probability of detection is often plotted in linear coordinates. Review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit the target transfer function and the probability of detection of rectangular targets.

  5. Is Coefficient Alpha Robust to Non-Normal Data?

    PubMed Central

    Sheng, Yanyan; Sheng, Zhaohui

    2011-01-01

    Coefficient alpha has been a widely used measure by which internal consistency reliability is assessed. In addition to essential tau-equivalence and uncorrelated errors, normality has been noted as another important assumption for alpha. Earlier work on evaluating this assumption considered either exclusively non-normal error score distributions, or limited conditions. In view of this and the availability of advanced methods for generating univariate non-normal data, Monte Carlo simulations were conducted to show that non-normal distributions for true or error scores do create problems for using alpha to estimate the internal consistency reliability. The sample coefficient alpha is affected by leptokurtic true score distributions, or skewed and/or kurtotic error score distributions. Increased sample sizes, not test lengths, help improve the accuracy, bias, or precision of using it with non-normal data. PMID:22363306
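
    A small simulation in the same spirit (with arbitrary illustrative parameters, not the paper's design) shows how coefficient alpha can be computed from its variance-ratio definition and inspected under skewed error scores:

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_subjects x k_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

        # Normal true scores plus heavily right-skewed (exponential) error scores
        rng = np.random.default_rng(0)
        n, k = 300, 8
        true_scores = rng.standard_normal((n, 1))
        skewed_errors = rng.exponential(scale=1.0, size=(n, k)) - 1.0
        print(round(cronbach_alpha(true_scores + skewed_errors), 3))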

  6. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. Also this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM, and the resulting method is here denoted as skew-normal IM. This flexible model that includes the usual symmetric normal distribution as a special case is important, allowing continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of the skew-normal IM is assessed via stochastic simulation. The results indicate that the skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.

  7. Attention and normalization circuits in macaque V1

    PubMed Central

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-01-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate to what extent V1 neurons are affected by normalization, which was compared against effects of spatial top down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and on Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. PMID:25757941

  8. Dependence of normal brain integral dose and normal tissue complication probability on the prescription isodose values for γ-knife radiosurgery

    NASA Astrophysics Data System (ADS)

    Ma, Lijun

    2001-11-01

    A recent multi-institutional clinical study suggested possible benefits of lowering the prescription isodose lines for stereotactic radiosurgery procedures. In this study, we investigate the dependence of the normal brain integral dose and the normal tissue complication probability (NTCP) on the prescription isodose values for γ-knife radiosurgery. An analytical dose model was developed for γ-knife treatment planning. The dose model was commissioned by fitting the measured dose profiles for each helmet size. The dose model was validated by comparing its results with the Leksell gamma plan (LGP, version 5.30) calculations. The normal brain integral dose and the NTCP were computed and analysed for an ensemble of treatment cases. The functional dependence of the normal brain integral dose and the NTCP versus the prescribing isodose values was studied for these cases. We found that the normal brain integral dose and the NTCP increase significantly when lowering the prescription isodose lines from 50% to 35% of the maximum tumour dose. Alternatively, the normal brain integral dose and the NTCP decrease significantly when raising the prescribing isodose lines from 50% to 65% of the maximum tumour dose. The results may be used as a guideline for designing future dose escalation studies for γ-knife applications.

  9. Reparo de aneurisma de artéria ilíaca roto em criança [Repair of a ruptured iliac artery aneurysm in a child]

    PubMed Central

    Hoshiko, Fernando Massaru; Zampieri, Elisa Helena Subtil; Dalio, Marcelo Bellini; Dezotti, Nei Rodrigues Alves; Joviliano, Edwaldo Edner

    2017-01-01

    We report the case of a 12-year-old girl who presented to the emergency unit with an acute hemorrhagic abdomen, a pulsatile abdominal mass, and hemodynamic instability. Once the diagnosis of a ruptured right iliac artery aneurysm was confirmed, emergency open surgical repair with extra-anatomic reconstruction was performed, using a small-caliber synthetic graft compatible with the patient's anatomy. Treatment was successful and the child had a favorable short-term outcome.

  10. Spatially tuned normalization explains attention modulation variance within neurons.

    PubMed

    Ni, Amy M; Maunsell, John H R

    2017-09-01

    Spatial attention improves perception of attended parts of a scene, a behavioral enhancement accompanied by modulations of neuronal firing rates. These modulations vary in size across neurons in the same brain area. Models of normalization explain much of this variance in attention modulation with differences in tuned normalization across neurons (Lee J, Maunsell JHR. PLoS One 4: e4651, 2009; Ni AM, Ray S, Maunsell JHR. Neuron 73: 803-813, 2012). However, recent studies suggest that normalization tuning varies with spatial location both across and within neurons (Ruff DA, Alberts JJ, Cohen MR. J Neurophysiol 116: 1375-1386, 2016; Verhoef BE, Maunsell JHR. eLife 5: e17256, 2016). Here we show directly that attention modulation and normalization tuning do in fact covary within individual neurons, in addition to across neurons as previously demonstrated. We recorded the activity of isolated neurons in the middle temporal area of two rhesus monkeys as they performed a change-detection task that controlled the focus of spatial attention. Using the same two drifting Gabor stimuli and the same two receptive field locations for each neuron, we found that switching which stimulus was presented at which location affected both attention modulation and normalization in a correlated way within neurons. We present an equal-maximum-suppression spatially tuned normalization model that explains this covariance both across and within neurons: each stimulus generates equally strong suppression of its own excitatory drive, but its suppression of distant stimuli is typically less. This new model specifies how the tuned normalization associated with each stimulus location varies across space both within and across neurons, changing our understanding of the normalization mechanism and how attention modulations depend on this mechanism. NEW & NOTEWORTHY Tuned normalization studies have demonstrated that the variance in attention modulation size seen across neurons from the same cortical

  11. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... change salinity patterns, alter erosion or sedimentation rates, aggravate water temperature extremes, and... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water...

  12. Facial-Attractiveness Choices Are Predicted by Divisive Normalization.

    PubMed

    Furl, Nicholas

    2016-10-01

    Do people appear more attractive or less attractive depending on the company they keep? A divisive-normalization account, in which representation of stimulus intensity is normalized (divided) by concurrent stimulus intensities, predicts that choice preferences among options increase with the range of option values. In the first experiment reported here, I manipulated the range of attractiveness of the faces presented on each trial by varying the attractiveness of an undesirable distractor face that was presented simultaneously with two attractive targets, and participants were asked to choose the most attractive face. I used normalization models to predict the context dependence of preferences regarding facial attractiveness. The more unattractive the distractor, the more one of the targets was preferred over the other target, which suggests that divisive normalization (a potential canonical computation in the brain) influences social evaluations. I obtained the same result when I manipulated faces' averageness and participants chose the most average face. This finding suggests that divisive normalization is not restricted to value-based decisions (e.g., attractiveness). This new application to social evaluation of normalization, a classic theory, opens possibilities for predicting social decisions in naturalistic contexts such as advertising or dating.
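
    The context dependence described above can be reproduced qualitatively with a few lines of divisive normalization; the semisaturation constant and option values below are illustrative assumptions, not fitted quantities from the experiments.

        import numpy as np

        def normalized_values(option_values, sigma=1.0):
            """Divide each option's value by a semisaturation constant plus the
            summed value of everything presented on the trial."""
            v = np.asarray(option_values, dtype=float)
            return v / (sigma + v.sum())

        # Two attractive targets with a mild vs. a very unattractive distractor:
        # the weaker the distractor, the larger the normalized gap between targets.
        print(normalized_values([8.0, 7.0, 6.0]))  # mild distractor
        print(normalized_values([8.0, 7.0, 1.0]))  # very unattractive distractor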

  13. Normal mode analysis and applications in biological physics.

    PubMed

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.

  14. Meissner effect in normal-superconducting proximity-contact double layers

    NASA Astrophysics Data System (ADS)

    Higashitani, Seiji; Nagai, Katsuhiko

    1995-02-01

    The Meissner effect in normal-superconducting proximity-contact double layers is discussed in the clean limit. The diamagnetic current is calculated using the quasi-classical Green's function. We obtain the quasi-classical Green's function linear in the vector potential in the proximity-contact double layers with a finite reflection coefficient at the interface. It is found that the diamagnetic current in the clean normal layer is constant in space, therefore, the magnetic field linearly decreases in the clean normal layer. We give an explicit expression for the screening length in the clean normal layer and study its temperature dependence. We show that the temperature dependence in the clean normal layer is considerably different from that in the dirty normal layer and agrees with a recent experiment in Au-Nb system.

  15. Helicon normal modes in Proto-MPEX

    NASA Astrophysics Data System (ADS)

    Piotrowicz, P. A.; Caneses, J. F.; Green, D. L.; Goulding, R. H.; Lau, C.; Caughman, J. B. O.; Rapp, J.; Ruzic, D. N.

    2018-05-01

    The Proto-MPEX helicon source has been operating in a high electron density ‘helicon-mode’. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast-wave is believed to be responsible for the ‘helicon-mode’. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that in the regions of operation in which core power deposition is maximum, the slow-wave does not deposit significant power besides directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  16. Helicon normal modes in Proto-MPEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piotrowicz, Pawel A.; Caneses, Juan F.; Green, David L.

    Here, the Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast-wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that in the regions of operation in which core power deposition is maximum, the slow-wave does not deposit significant power besides directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  17. Helicon normal modes in Proto-MPEX

    DOE PAGES

    Piotrowicz, Pawel A.; Caneses, Juan F.; Green, David L.; ...

    2018-05-22

    Here, the Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast-wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that in the regions of operation in which core power deposition is maximum, the slow-wave does not deposit significant power besides directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  18. Normal fault earthquakes or graviquakes

    PubMed Central

    Doglioni, C.; Carminati, E.; Petricca, P.; Riguzzi, F.

    2015-01-01

    Earthquakes are the dissipation of energy through elastic waves. Canonically, this is the elastic energy accumulated during the interseismic period. However, in crustal extensional settings, gravity is the main energy source for hangingwall fault collapse. The gravitational potential energy available is about 100 times larger than the energy corresponding to the observed magnitude, far more than enough to explain the earthquake. Therefore, normal faults have a different mechanism of energy accumulation and dissipation (graviquakes) with respect to other tectonic settings (strike-slip and contractional), where elastic energy allows motion even against gravity. The bigger the involved volume, the larger the magnitude. The steeper the normal fault, the larger the vertical displacement and the larger the seismic energy released. Normal faults activate preferentially at about 60° but can be shallower in low-friction rocks. In rocks with low static friction, the fault may partly creep, dissipating gravitational energy without releasing a great amount of seismic energy. The maximum volume involved by graviquakes is smaller than in the other tectonic settings, the activated fault being at most about three times the hypocentre depth, which explains their higher b-value and the lower magnitude of the largest recorded events. Having a different phenomenology, graviquakes show peculiar precursors. PMID:26169163

  19. On the efficacy of procedures to normalize Ex-Gaussian distributions

    PubMed Central

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2015-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed. Hence, it is acknowledged by many that the normality assumption is not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing such data. Specifically, transformation with parameter lambda = -1 leads to the best results. PMID:25709588
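
    The transformation singled out above corresponds to lambda = -1 in the power-transform family. The sketch below applies such a reciprocal-style transform to simulated Ex-Gaussian reaction times and compares skewness before and after; the simulation parameters and the Box-Cox-style formulation are illustrative assumptions, not values taken from the paper.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Simulated reaction times (seconds): Ex-Gaussian = Gaussian + exponential component.
      rt = rng.normal(0.40, 0.05, size=5000) + rng.exponential(0.15, size=5000)

      def power_transform(x, lam):
          # Box-Cox-style power transform; lam = -1 is the reciprocal-type case.
          return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

      transformed = power_transform(rt, lam=-1)

      # Compare skewness before and after the transform.
      print("skew before:", stats.skew(rt))
      print("skew after: ", stats.skew(transformed))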

  20. On the efficacy of procedures to normalize Ex-Gaussian distributions.

    PubMed

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2014-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed. Hence, it is acknowledged by many that the normality assumption is not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing such data. Specifically, transformation with parameter lambda = -1 leads to the best results.

  1. Normalized Legal Drafting and the Query Method.

    ERIC Educational Resources Information Center

    Allen, Layman E.; Engholm, C. Rudy

    1978-01-01

    Normalized legal drafting, a mode of expressing ideas in legal documents so that the syntax that relates the constituent propositions is simplified and standardized, and the query method, a question-asking activity that teaches normalized drafting and provides practice, are examined. Some examples are presented. (JMD)

  2. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... either attuned to or characterized by these periodic water fluctuations. (b) Possible loss of... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water...

  3. Attention and normalization circuits in macaque V1.

    PubMed

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-04-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate to what extent V1 neurons are affected by normalization, which was compared against effects of spatial top down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and on Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. © 2015 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  4. Compressed normalized block difference for object tracking

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

    Feature extraction is very important for robust and real-time tracking, and compressive sensing provides technical support for real-time feature extraction. However, existing compressive trackers have been based on compressed Haar-like features, and how to compress other, more expressive high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise effectively in the high-dimensional normalized pixel difference (NPD) feature, a normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature can then be obtained by compressing the normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on compressed Haar-like features, in terms of AUC, SR and Precision.
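
    A rough sketch of the two ingredients described above, a block-level normalized difference feature and its compression by a random Gaussian measurement matrix, is given below. The block-pair sampling, the (a - b)/(a + b) form of the difference, and the dense (rather than sparse) projection are simplifying assumptions for illustration, not the exact construction of the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def normalized_block_difference(img, pairs, block=4):
          # NPD-style feature between block means: (a - b) / (a + b).
          # `pairs` is a list of ((r1, c1), (r2, c2)) top-left block coordinates.
          feats = []
          for (r1, c1), (r2, c2) in pairs:
              a = img[r1:r1 + block, c1:c1 + block].mean()
              b = img[r2:r2 + block, c2:c2 + block].mean()
              feats.append(0.0 if a + b == 0 else (a - b) / (a + b))
          return np.asarray(feats)

      # High-dimensional feature vector, compressed by a random Gaussian projection.
      img = rng.integers(0, 256, size=(32, 32)).astype(float)
      pairs = [((rng.integers(0, 28), rng.integers(0, 28)),
                (rng.integers(0, 28), rng.integers(0, 28))) for _ in range(2000)]
      high_dim = normalized_block_difference(img, pairs)

      measurement = rng.normal(size=(50, high_dim.size))   # dense Gaussian matrix for simplicity
      compressed = measurement @ high_dim                  # 50-dimensional CNBD-like feature
      print(compressed.shape)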

  5. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of normal MPI scans may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  6. Tumor vessel normalization after aerobic exercise enhances chemotherapeutic efficacy.

    PubMed

    Schadler, Keri L; Thomas, Nicholas J; Galie, Peter A; Bhang, Dong Ha; Roby, Kerry C; Addai, Prince; Till, Jacob E; Sturgeon, Kathleen; Zaslavsky, Alexander; Chen, Christopher S; Ryeom, Sandra

    2016-10-04

    Targeted therapies aimed at tumor vasculature are utilized in combination with chemotherapy to improve drug delivery and efficacy after tumor vascular normalization. Tumor vessels are highly disorganized with disrupted blood flow impeding drug delivery to cancer cells. Although pharmacologic anti-angiogenic therapy can remodel and normalize tumor vessels, there is a limited window of efficacy and these drugs are associated with severe side effects necessitating alternatives for vascular normalization. Recently, moderate aerobic exercise has been shown to induce vascular normalization in mouse models. Here, we provide a mechanistic explanation for the tumor vascular normalization induced by exercise. Shear stress, the mechanical stimuli exerted on endothelial cells by blood flow, modulates vascular integrity. Increasing vascular shear stress through aerobic exercise can alter and remodel blood vessels in normal tissues. Our data in mouse models indicate that activation of calcineurin-NFAT-TSP1 signaling in endothelial cells plays a critical role in exercise-induced shear stress mediated tumor vessel remodeling. We show that moderate aerobic exercise with chemotherapy caused a significantly greater decrease in tumor growth than chemotherapy alone through improved chemotherapy delivery after tumor vascular normalization. Our work suggests that the vascular normalizing effects of aerobic exercise can be an effective chemotherapy adjuvant.

  7. A Review of Depth and Normal Fusion Algorithms

    PubMed Central

    Štolc, Svorad; Pock, Thomas

    2018-01-01

    Geometric surface information such as depth maps and surface normals can be acquired by various methods such as stereo light fields, shape from shading and photometric stereo techniques. We compare several algorithms which deal with the combination of depth with surface normal information in order to reconstruct a refined depth map. The reasons for performance differences are examined from the perspective of alternative formulations of surface normals for depth reconstruction. We review and analyze methods in a systematic way. Based on our findings, we introduce a new generalized fusion method, which is formulated as a least squares problem and outperforms previous methods in the depth error domain by introducing a novel normal weighting that performs closer to the geodesic distance measure. Furthermore, a novel method is introduced based on Total Generalized Variation (TGV) which further outperforms previous approaches in terms of the geodesic normal distance error and maintains comparable quality in the depth error domain. PMID:29389903

  8. Systemic sclerosis with normal or nonspecific nailfold capillaroscopy.

    PubMed

    Fichel, Fanny; Baudot, Nathalie; Gaitz, Jean-Pierre; Trad, Salim; Barbe, Coralie; Francès, Camille; Senet, Patricia

    2014-01-01

    In systemic sclerosis (SSc), a specific nailfold videocapillaroscopy (NVC) pattern is observed in 90% of cases and seems to be associated with severity and progression of the disease. To describe the characteristics of SSc patients with normal or nonspecific (normal/nonspecific) NVC. In a retrospective cohort study, clinical features and visceral involvements of 25 SSc cases with normal/nonspecific NVC were compared to 63 SSc controls with the SSc-specific NVC pattern. Normal/nonspecific NVC versus SSc-specific NVC pattern was significantly associated with absence of skin sclerosis (32 vs. 6.3%, p = 0.004), absence of telangiectasia (47.8 vs. 17.3%, p = 0.006) and absence of sclerodactyly (60 vs. 25.4%, p = 0.002), and less frequent severe pulmonary involvement (26.3 vs. 58.2%, p = 0.017). Normal/nonspecific NVC in SSc patients appears to be associated with less severe skin involvement and less frequent severe pulmonary involvement. © 2014 S. Karger AG, Basel.

  9. Non-Normality and Testing that a Correlation Equals Zero

    ERIC Educational Resources Information Center

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)

  10. Deformation associated with continental normal faults

    NASA Astrophysics Data System (ADS)

    Resor, Phillip G.

    Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation that are similar to those observed by satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ˜20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  11. Normalization as a canonical neural computation

    PubMed Central

    Carandini, Matteo; Heeger, David J.

    2012-01-01

    There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
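
    The computation described above, dividing each neuron's driven response by a factor that includes the summed activity of a normalization pool, can be illustrated with the standard textbook form of divisive normalization; the exponent and semi-saturation constant below are arbitrary illustrative values, not parameters from the review.

      import numpy as np

      def divisive_normalization(drive, sigma=1.0, n=2.0):
          # Canonical divisive normalization: each response is divided by a common
          # factor that includes the summed (exponentiated) activity of the pool.
          drive = np.asarray(drive, dtype=float)
          return drive ** n / (sigma ** n + np.sum(drive ** n))

      pool = np.array([0.5, 1.0, 2.0, 4.0])
      print(divisive_normalization(pool))          # responses scaled by pooled activity
      print(divisive_normalization(pool * 10.0))   # nearly unchanged: output is roughly invariant to overall input scale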

  12. Resonance Raman of BCC and normal skin

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-hui; Sriramoju, Vidyasagar; Boydston-White, Susie; Wu, Binlin; Zhang, Chunyuan; Pei, Zhe; Sordillo, Laura; Beckman, Hugh; Alfano, Robert R.

    2017-02-01

    The resonance Raman (RR) spectra of basal cell carcinoma (BCC) and normal human skin tissues were analyzed using 532 nm laser excitation. RR spectral differences in vibrational fingerprints distinguished normal from cancerous skin tissue. Standard diagnostic criteria for BCC tissues were established from native RR biomarkers and changes in their peak intensities. Diagnostic algorithms for the classification of BCC versus normal tissue were generated based on an SVM classifier and the PCA statistical method. These statistical methods were used to analyze the RR spectral data collected from skin tissues, yielding a diagnostic sensitivity of 98.7% and specificity of 79% compared with pathological reports.

  13. Bulimia nervosa in overweight and normal-weight women.

    PubMed

    Masheb, Robin; White, Marney A

    2012-02-01

    The aim of the present study was to examine overweight bulimia nervosa (BN) in a community sample of women. Volunteers (n = 1964) completed self-report questionnaires of weight, binge eating, purging, and cognitive features. Participants were classified as overweight (body mass index ≥25) or normal weight (body mass index <25). Rates of BN within the overweight and normal-weight classes did not differ (6.4% vs 7.9%). Of the 131 participants identified as BN, 64% (n = 84) were classified as overweight BN and 36% (n = 47) as normal-weight BN. The overweight BN group had a greater proportion of ethnic minorities and reported significantly less restraint than the normal-weight BN group. Otherwise, the 2 groups reported similarly, even in terms of purging and depression. In summary, rates of BN did not differ between overweight and normal-weight women. Among BN participants, the majority (two thirds) were overweight. Differences in ethnicity and restraint, but little else, were found between overweight and normal-weight BN. Findings from the present study should serve to increase awareness of the weight range and ethnic diversity of BN, and highlight the need to address weight and cultural sensitivity in the identification and treatment of eating disorders. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Normal force and drag force in magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Miao, Chunlin; Shafrir, Shai N.; Lambropoulos, John C.; Jacobs, Stephen D.

    2009-08-01

    The material removal in magnetorheological finishing (MRF) is known to be controlled by shear stress, τ, which equals drag force, Fd, divided by spot area, As. However, it is unclear how the normal force, Fn, affects the material removal in MRF and how the measured ratio of drag force to normal force Fd/Fn, equivalent to coefficient of friction, is related to material removal. This work studies, for the first time for MRF, the normal force and the measured ratio Fd/Fn as a function of material mechanical properties. Experimental data were obtained by taking spots on a variety of materials including optical glasses and hard ceramics with a spot-taking machine (STM). Drag force and normal force were measured with a dual load cell. Drag force decreases linearly with increasing material hardness. In contrast, normal force increases with hardness for glasses, saturating at high hardness values for ceramics. Volumetric removal rate decreases with normal force across all materials. The measured ratio Fd/Fn shows a strong negative linear correlation with material hardness. Hard materials exhibit a low "coefficient of friction". The volumetric removal rate increases with the measured ratio Fd/Fn, which is also correlated with shear stress, indicating that the measured ratio Fd/Fn is a useful measure of material removal in MRF.
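
    The two quantities discussed, shear stress as drag force over spot area and the friction-like ratio Fd/Fn, reduce to simple arithmetic, sketched below with made-up illustrative numbers rather than measurements from the spot-taking machine experiments.

      # Illustrative numbers only: spot area in mm^2, forces in N.
      def mrf_quantities(drag_force_n, normal_force_n, spot_area_mm2):
          shear_stress_kpa = drag_force_n / spot_area_mm2 * 1e3   # N/mm^2 (MPa) converted to kPa
          friction_like_ratio = drag_force_n / normal_force_n     # measured ratio Fd / Fn
          return shear_stress_kpa, friction_like_ratio

      tau, ratio = mrf_quantities(drag_force_n=2.0, normal_force_n=10.0, spot_area_mm2=40.0)
      print(f"shear stress ~ {tau:.1f} kPa, Fd/Fn ~ {ratio:.2f}")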

  15. High-Frequency Normal Mode Propagation in Aluminum Cylinders

    USGS Publications Warehouse

    Lee, Myung W.; Waite, William F.

    2009-01-01

    Acoustic measurements made using compressional-wave (P-wave) and shear-wave (S-wave) transducers in aluminum cylinders reveal waveform features with high amplitudes and with velocities that depend on the feature's dominant frequency. In a given waveform, high-frequency features generally arrive earlier than low-frequency features, typical for normal mode propagation. To analyze these waveforms, the elastic equation is solved in a cylindrical coordinate system for the high-frequency case in which the acoustic wavelength is small compared to the cylinder geometry, and the surrounding medium is air. Dispersive P- and S-wave normal mode propagations are predicted to exist, but owing to complex interference patterns inside a cylinder, the phase and group velocities are not smooth functions of frequency. To assess the normal mode group velocities and relative amplitudes, approximate dispersion relations are derived using Bessel functions. The utility of the normal mode theory and approximations from a theoretical and experimental standpoint are demonstrated by showing how the sequence of P- and S-wave normal mode arrivals can vary between samples of different size, and how fundamental normal modes can be mistaken for the faster, but significantly smaller amplitude, P- and S-body waves from which P- and S-wave speeds are calculated.

  16. ["Normal pressure" hydrocephalus].

    PubMed

    Philippon, Jacques

    2005-03-01

    Normal pressure hydrocephalus (NPH) or, more precisely, chronic adult hydrocephalus, is a complex condition. Although the basic mechanism is an impediment to CSF absorption, the underlying pathology is heterogeneous. In secondary NPH, the disruption of normal CSF pathways, following meningitis or sub-arachnoid haemorrhage, is responsible for ventricular dilatation. However, in about half of the cases, the etiology remains obscure. NPH is more frequently found in elderly people, probably in relation with the increased incidence of cerebrovascular disease. The diagnosis of NPH is based upon a triad of clinical symptoms. The main symptom is gait disturbance, followed by urinary incontinence and various degrees of cognitive change. The latter two symptoms are not prerequisites for the diagnosis. Radiological ventricular dilatation without cortical sulcal enlargement is a key factor, as is substantial clinical improvement after CSF withdrawal (CSF tap test). Other CSF dynamic studies and various imaging investigations have been proposed to improve diagnostic accuracy, but no simple test can predict the results of CSF drainage. The current treatment is ventriculo-peritoneal shunting, ideally using an adjustable valve. Results are directly dependent upon the accuracy of the preoperative diagnosis. Post-surgical complications may be observed in about 10% of cases.

  17. Self-Monitoring of Listening Abilities in Normal-Hearing Children, Normal-Hearing Adults, and Children with Cochlear Implants

    PubMed Central

    Rothpletz, Ann M.; Wightman, Frederic L.; Kistler, Doris J.

    2012-01-01

    Background Self-monitoring has been shown to be an essential skill for various aspects of our lives, including our health, education, and interpersonal relationships. Likewise, the ability to monitor one’s speech reception in noisy environments may be a fundamental skill for communication, particularly for those who are often confronted with challenging listening environments, such as students and children with hearing loss. Purpose The purpose of this project was to determine if normal-hearing children, normal-hearing adults, and children with cochlear implants can monitor their listening ability in noise and recognize when they are not able to perceive spoken messages. Research Design Participants were administered an Objective-Subjective listening task in which their subjective judgments of their ability to understand sentences from the Coordinate Response Measure corpus presented in speech spectrum noise were compared to their objective performance on the same task. Study Sample Participants included 41 normal-hearing children, 35 normal-hearing adults, and 10 children with cochlear implants. Data Collection and Analysis On the Objective-Subjective listening task, the level of the masker noise remained constant at 63 dB SPL, while the level of the target sentences varied over a 12 dB range in a block of trials. Psychometric functions, relating proportion correct (Objective condition) and proportion perceived as intelligible (Subjective condition) to target/masker ratio (T/M), were estimated for each participant. Thresholds were defined as the T/M required to produce 51% correct (Objective condition) and 51% perceived as intelligible (Subjective condition). Discrepancy scores between listeners’ threshold estimates in the Objective and Subjective conditions served as an index of self-monitoring ability. In addition, the normal-hearing children were administered tests of cognitive skills and academic achievement, and results from these measures were compared

  18. Quantitative computed tomography determined regional lung mechanics in normal nonsmokers, normal smokers and metastatic sarcoma subjects.

    PubMed

    Choi, Jiwoong; Hoffman, Eric A; Lin, Ching-Long; Milhem, Mohammed M; Tessier, Jean; Newell, John D

    2017-01-01

    Extra-thoracic tumors send out pilot cells that attach to the pulmonary endothelium. We hypothesized that this could alter regional lung mechanics (tissue stiffening or accumulation of fluid and inflammatory cells) through interactions with host cells. We explored this with serial inspiratory computed tomography (CT) and image matching to assess regional changes in lung expansion. We retrospectively assessed 44 pairs of two serial CT scans on 21 sarcoma patients: 12 without lung metastases and 9 with lung metastases. For each subject, two or more serial inspiratory clinically-derived CT scans were retrospectively collected. Two research-derived control groups were included: 7 normal nonsmokers and 12 asymptomatic smokers with two inspiratory scans taken the same day or one year apart respectively. We performed image registration for local-to-local matching scans to baseline, and derived local expansion and density changes at an acinar scale. Welch two sample t test was used for comparison between groups. Statistical significance was determined with a p value < 0.05. Lung regions of metastatic sarcoma patients (but not the normal control group) demonstrated an increased proportion of normalized lung expansion between the first and second CT. These hyper-expanded regions were associated with, but not limited to, visible metastatic lung lesions. Compared with the normal control group, the percent of increased normalized hyper-expanded lung in sarcoma subjects was significantly increased (p < 0.05). There was also evidence of increased lung "tissue" volume (non-air components) in the hyper-expanded regions of the cancer subjects relative to non-hyper-expanded regions. "Tissue" volume increase was present in the hyper-expanded regions of metastatic and non-metastatic sarcoma subjects. This putatively could represent regional inflammation related to the presence of tumor pilot cell-host related interactions. This new quantitative CT (QCT) method for linking serial

  19. About normal distribution on SO(3) group in texture analysis

    NASA Astrophysics Data System (ADS)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on SO(3) group. CND is a subcase for normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy’s central limit theorem). WND is motivated by CLT in R 3 and mapped to SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of crystallites orientation distribution function in texture analysis.

  20. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    PubMed

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  1. Normal Birth: Two Stories

    PubMed Central

    Scaer, Roberta M.

    2002-01-01

    The author shares two stories: one of a normal birth that took place in a hospital with a nurse-midwife in attendance and another of a home birth unexpectedly shared by many colleagues. Both are told with the goal to inform, inspire, and educate. PMID:17273292

  2. Mutual regulation of tumour vessel normalization and immunostimulatory reprogramming.

    PubMed

    Tian, Lin; Goldstein, Amit; Wang, Hai; Ching Lo, Hin; Sun Kim, Ik; Welte, Thomas; Sheng, Kuanwei; Dobrolecki, Lacey E; Zhang, Xiaomei; Putluri, Nagireddy; Phung, Thuy L; Mani, Sendurai A; Stossi, Fabio; Sreekumar, Arun; Mancini, Michael A; Decker, William K; Zong, Chenghang; Lewis, Michael T; Zhang, Xiang H-F

    2017-04-13

    Blockade of angiogenesis can retard tumour growth, but may also paradoxically increase metastasis. This paradox may be resolved by vessel normalization, which involves increased pericyte coverage, improved tumour vessel perfusion, reduced vascular permeability, and consequently mitigated hypoxia. Although these processes alter tumour progression, their regulation is poorly understood. Here we show that type 1 T helper (TH1) cells play a crucial role in vessel normalization. Bioinformatic analyses revealed that gene expression features related to vessel normalization correlate with immunostimulatory pathways, especially T lymphocyte infiltration or activity. To delineate the causal relationship, we used various mouse models with vessel normalization or T lymphocyte deficiencies. Although disruption of vessel normalization reduced T lymphocyte infiltration as expected, reciprocal depletion or inactivation of CD4+ T lymphocytes decreased vessel normalization, indicating a mutually regulatory loop. In addition, activation of CD4+ T lymphocytes by immune checkpoint blockade increased vessel normalization. TH1 cells that secrete interferon-γ are a major population of cells associated with vessel normalization. Patient-derived xenograft tumours growing in immunodeficient mice exhibited enhanced hypoxia compared to the original tumours in immunocompetent humans, and hypoxia was reduced by adoptive TH1 transfer. Our findings elucidate an unexpected role of TH1 cells in vasculature and immune reprogramming. TH1 cells may be a marker and a determinant of both immune checkpoint blockade and anti-angiogenesis efficacy.

  3. Phenformin-induced Hypoglycaemia in Normal Subjects*

    PubMed Central

    Lyngsøe, J.; Trap-Jensen, J.

    1969-01-01

    Study of the effect of phenformin on the blood glucose level in normal subjects before and during 70 hours of starvation showed a statistically significant hypoglycaemic effect after 40 hours of starvation. This effect was not due to increased glucose utilization. Another finding in this study was a statistically significant decrease in total urinary nitrogen excretion during starvation in subjects given phenformin. These findings show that the hypoglycaemic effect of phenformin in starved normal subjects is due to inhibition of gluconeogenesis. PMID:5780431

  4. Generalized approach for using unbiased symmetric metrics with negative values: normalized mean bias factor and normalized mean absolute error factor

    EPA Science Inventory

    Unbiased symmetric metrics provide a useful measure to quickly compare two datasets, with similar interpretations for both under and overestimations. Two examples include the normalized mean bias factor and normalized mean absolute error factor. However, the original formulations...
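
    The abstract above does not reproduce the equations, but one widely used formulation of these two factors switches the normalizing denominator depending on whether the model over- or underestimates the observations; the sketch below follows that formulation as an assumption rather than as the exact expressions intended here.

      import numpy as np

      def nmbf(model, obs):
          # Normalized mean bias factor (one common formulation).
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          if model.mean() >= obs.mean():
              return model.sum() / obs.sum() - 1.0     # overestimation branch (positive)
          return 1.0 - obs.sum() / model.sum()         # underestimation branch (negative)

      def nmaef(model, obs):
          # Normalized mean absolute error factor (same symmetric idea).
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          denom = obs.sum() if model.mean() >= obs.mean() else model.sum()
          return np.abs(model - obs).sum() / denom

      obs = np.array([1.0, 2.0, 3.0, 4.0])
      model = np.array([0.5, 1.5, 2.5, 3.5])
      print(nmbf(model, obs), nmaef(model, obs))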

  5. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…

  6. [Primary culture of human normal epithelial cells].

    PubMed

    Tang, Yu; Xu, Wenji; Guo, Wanbei; Xie, Ming; Fang, Huilong; Chen, Chen; Zhou, Jun

    2017-11-28

    The traditional primary culture methods for human normal epithelial cells have the disadvantages of low activity of the cultured cells, a low cultivation success rate, and complicated operation. To solve these problems, researchers have carried out many studies on the culture process for human normal primary epithelial cells. In this paper, we mainly introduce methods used for the separation and purification of human normal epithelial cells, such as the tissue separation method, enzyme digestion method, mechanical brushing method, red blood cell lysis method, and Percoll density gradient separation method. We also review methods used for culture and subculture, including serum-free medium combined with low-serum culture, mouse tail collagen coating, and glass culture bottles combined with plastic culture dishes. The biological characteristics of human normal epithelial cells and the methods of immunocytochemical staining and trypan blue exclusion are described. Moreover, the factors affecting aseptic operation, the conditions of the extracellular environment during culture, the number of differential adhesion steps, and the selection and dosage of additives are summarized.

  7. Neuropathological and neuropsychological changes in "normal" aging: evidence for preclinical Alzheimer disease in cognitively normal individuals.

    PubMed

    Hulette, C M; Welsh-Bohmer, K A; Murray, M G; Saunders, A M; Mash, D C; McIntyre, L M

    1998-12-01

    The presence of diffuse or primitive senile plaques in the neocortex of cognitively normal elderly at autopsy has been presumed to represent normal aging. Alternatively, these patients may have developed dementia and clinical Alzheimer disease (AD) if they had survived. In this setting, these patients could be subjects for cognitive or pharmacologic intervention to delay disease onset. We have thus followed a cohort of cognitively normal elderly subjects with a Clinical Dementia Rating (CDR) of 0 at autopsy. Thirty-one brains were examined at postmortem according to Consortium to Establish a Registry for Alzheimer Disease (CERAD) criteria and staged according to Braak. Ten patients were pathologically normal according to CERAD criteria (1a). Two of these patients were Braak Stage II. Seven very elderly subjects exhibited a few primitive neuritic plaques in the cortex and thus represented CERAD 1b. These individuals ranged in age from 85 to 105 years and were thus older than the CERAD 1a group, which ranged in age from 72 to 93. Fourteen patients displayed Possible AD according to CERAD, with ages ranging from 66 to 95. Three of these were Braak Stage I, 4 were Braak Stage II, and 7 were Braak Stage III. The Apolipoprotein E4 allele was over-represented in this Possible AD group. Neuropsychological data were available on 12 individuals. In these 12 individuals, Possible AD at autopsy could be predicted by cognitive deficits in 1 or more areas, including savings scores on memory testing and overall performance on some measures of frontal executive function.

  8. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, Tuan; Panjehpour, Masoud; Overholt, Bergein F.

    1996-01-01

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample.
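
    The processing chain described in the claim, normalizing each spectrum by its integrated area and then subtracting the average of a reference set of normalized normal-tissue spectra, can be written in a few lines; the synthetic spectra below are purely illustrative.

      import numpy as np

      rng = np.random.default_rng(3)

      def normalize_spectrum(wavelength_nm, intensity):
          # Divide each intensity by the integrated area under the spectrum.
          area = np.trapz(intensity, wavelength_nm)
          return intensity / area

      def differential_normalized_spectrum(wavelength_nm, sample, reference_set):
          # Subtract the average normalized spectrum of normal-tissue references
          # from the normalized sample spectrum.
          norm_sample = normalize_spectrum(wavelength_nm, sample)
          norm_refs = np.array([normalize_spectrum(wavelength_nm, r) for r in reference_set])
          return norm_sample - norm_refs.mean(axis=0)

      # Synthetic example: 200 wavelength points, 5 reference spectra.
      wl = np.linspace(450.0, 750.0, 200)
      refs = [np.exp(-((wl - 520.0) / 40.0) ** 2) + 0.01 * rng.random(wl.size) for _ in range(5)]
      sample = np.exp(-((wl - 530.0) / 45.0) ** 2)
      diff = differential_normalized_spectrum(wl, sample, refs)
      print(diff.shape)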

  9. Normal Force and Drag Force in Magnetorheological Finishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, C.; Shafrir, S.N.; Lambropoulos, J.C.

    2010-01-13

    The material removal in magnetorheological finishing (MRF) is known to be controlled by shear stress, tau, which equals drag force, Fd, divided by spot area, As. However, it is unclear how the normal force, Fn, affects the material removal in MRF and how the measured ratio of drag force to normal force Fd/Fn, equivalent to coefficient of friction, is related to material removal. This work studies, for the first time for MRF, the normal force and the measured ratio Fd/Fn as a function of material mechanical properties. Experimental data were obtained by taking spots on a variety of materials including optical glasses and hard ceramics with a spot-taking machine (STM). Drag force and normal force were measured with a dual load cell. Drag force decreases linearly with increasing material hardness. In contrast, normal force increases with hardness for glasses, saturating at high hardness values for ceramics. Volumetric removal rate decreases with normal force across all materials. The measured ratio Fd/Fn shows a strong negative linear correlation with material hardness. Hard materials exhibit a low “coefficient of friction”. The volumetric removal rate increases with the measured ratio Fd/Fn which is also correlated with shear stress, indicating that the measured ratio Fd/Fn is a useful measure of material removal in MRF.

  10. Ultraviolet Spectra of Normal Spiral Galaxies

    NASA Technical Reports Server (NTRS)

    Kinney, Anne

    1997-01-01

    The data related to this grant on the Ultraviolet Spectra of Normal Spiral Galaxies have been entirely reduced and analyzed. It is incorporated into templates of Spiral galaxies used in the calculation of K corrections towards the understanding of high redshift galaxies. The main paper was published in the Astrophysical Journal, August 1996, Volume 467, page 38. The data was also used in another publication, The Spectral Energy Distribution of Normal Starburst and Active Galaxies, June 1997, preprint series No. 1158. Copies of both have been attached.

  11. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytic integrable dynamical systems. In more detail, we prove that any complete analytic integrable diffeomorphism F(x) = Bx + f(x) in (C^n, 0), with B having eigenvalues not of modulus 1 and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ = Ax + f(x) in (C^n, 0), with A having nonzero eigenvalues and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier order et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve those in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in the way that our paper presents the concrete expression of the normal form in a restricted case.

  12. Normal-range verbal-declarative memory in schizophrenia.

    PubMed

    Heinrichs, R Walter; Parlar, Melissa; Pinnock, Farena

    2017-10-01

    Cognitive impairment is prevalent and related to functional outcome in schizophrenia, but a significant minority of the patient population overlaps with healthy controls on many performance measures, including declarative-verbal-memory tasks. In this study, we assessed the validity, clinical, and functional implications of normal-range (NR), verbal-declarative memory in schizophrenia. Performance normality was defined using normative data for 8 basic California Verbal Learning Test (CVLT-II; Delis, Kramer, Kaplan, & Ober, 2000) recall and recognition trials. Schizophrenia patients (n = 155) and healthy control participants (n = 74) were assessed for performance normality, defined as scores within 1 SD of the normative mean on all 8 trials, and assigned to normal- and below-NR memory groups. NR schizophrenia patients (n = 26) and control participants (n = 51) did not differ in general verbal ability, on a reading-based estimate of premorbid ability, across all 8 CVLT-II-score comparisons or in terms of intrusion and false-positive errors and auditory working memory. NR memory patients did not differ from memory-impaired patients (n = 129) in symptom severity, and both patient groups were significantly and similarly disabled in terms of functional status in the community. These results confirm a subpopulation of schizophrenia patients with normal, verbal-declarative-memory performance and no evidence of decline from higher premorbid ability levels. However, NR patients did not experience less severe psychopathology, nor did they show advantage in community adjustment relative to impaired patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Normalization of a chromosomal contact map.

    PubMed

    Cournac, Axel; Marie-Nelly, Hervé; Marbouty, Martial; Koszul, Romain; Mozziconacci, Julien

    2012-08-30

    Chromatin organization has been increasingly studied in relation with its important influence on DNA-related metabolic processes such as replication or regulation of gene expression. Since its original design ten years ago, capture of chromosome conformation (3C) has become an essential tool to investigate the overall conformation of chromosomes. It relies on the capture of long-range trans and cis interactions of chromosomal segments whose relative proportions in the final bank reflect their frequencies of interactions, hence their spatial proximity in a population of cells. The recent coupling of 3C with deep sequencing approaches now allows the generation of high resolution genome-wide chromosomal contact maps. Different protocols have been used to generate such maps in various organisms. This includes mammals, drosophila and yeast. The massive amount of raw data generated by the genomic 3C has to be carefully processed to alleviate the various biases and byproducts generated by the experiments. Our study aims at proposing a simple normalization procedure to minimize the influence of these unwanted but inevitable events on the final results. Careful analysis of the raw data generated previously for budding yeast S. cerevisiae led to the identification of three main biases affecting the final datasets, including a previously unknown bias resulting from the circularization of DNA molecules. We then developed a simple normalization procedure to process the data and allow the generation of a normalized, highly contrasted, chromosomal contact map for S. cerevisiae. The same method was then extended to the first human genome contact map. Using the normalized data, we revisited the preferential interactions originally described between subsets of discrete chromosomal features. Notably, the detection of preferential interactions between tRNA in yeast and CTCF, PolII binding sites in human can vary with the normalization procedure used. We quantitatively reanalyzed the
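
    The study's specific correction terms are not given in the abstract; the sketch below shows only the generic idea of balancing a raw contact map by sequentially normalizing rows and columns so that every bin carries a comparable total signal, one simple way such biases are commonly mitigated, and is an assumption rather than the authors' exact procedure.

      import numpy as np

      def balance_contact_map(raw, n_iter=10):
          # Iteratively normalize rows and then columns of a symmetric contact map
          # so that every bin ends up with a comparable total visibility.
          m = raw.astype(float).copy()
          for _ in range(n_iter):
              m /= m.sum(axis=1, keepdims=True)   # normalize each row
              m /= m.sum(axis=0, keepdims=True)   # then each column
          return 0.5 * (m + m.T)                  # restore symmetry

      rng = np.random.default_rng(2)
      raw = rng.poisson(5, size=(100, 100)).astype(float)
      raw = raw + raw.T                            # symmetric raw counts
      normalized = balance_contact_map(raw)
      print(normalized.sum(axis=1)[:5])            # row sums are now nearly equal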

  14. Normal forms for Hopf-Zero singularities with nonconservative nonlinear part

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh; Sanders, Jan A.

    In this paper we are concerned with the simplest normal form computation of the systems x˙=2xf(x,y2+z2), y˙=z+yf(x,y2+z2), z˙=-y+zf(x,y2+z2), where f is a formal function with real coefficients and without any constant term. These are the classical normal forms of a larger family of systems with Hopf-Zero singularity. Indeed, these are defined such that this family would be a Lie subalgebra for the space of all classical normal form vector fields with Hopf-Zero singularity. The simplest normal forms and simplest orbital normal forms of this family with nonzero quadratic part are computed. We also obtain the simplest parametric normal form of any non-degenerate perturbation of this family within the Lie subalgebra. The symmetry group of the simplest normal forms is also discussed. This is a part of our results in decomposing the normal forms of Hopf-Zero singular systems into systems with a first integral and nonconservative systems.

  15. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
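
    As one concrete illustration of the problem discussed, total sample amount varying from sample to sample, the sketch below applies simple total-sum normalization, scaling each sample so its summed intensity is constant; this is only one of the many normalization methods the review surveys, chosen here for brevity.

      import numpy as np

      def total_sum_normalize(intensity_matrix):
          # Scale each sample (row) so its summed metabolite intensity is constant,
          # removing differences in total sample amount.
          intensity_matrix = np.asarray(intensity_matrix, dtype=float)
          row_sums = intensity_matrix.sum(axis=1, keepdims=True)
          return intensity_matrix / row_sums * row_sums.mean()

      # Rows = samples, columns = metabolite features.
      data = np.array([[10.0, 20.0, 30.0],
                       [ 5.0, 10.0, 15.0]])     # same composition, half the total amount
      print(total_sum_normalize(data))           # the two samples become directly comparable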

  16. Role of the normal gut microbiota.

    PubMed

    Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D

    2015-08-07

    The relation between the gut microbiota and human health is being increasingly recognised. It is now well established that a healthy gut flora is largely responsible for the overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from the esophagus to the rectum all along the individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms, their function and microbe-host interactions in an elaborate manner both in health and disease. The normal gut microbiota imparts specific functions in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan based or meat based); and (3) use of antibiotics or antibiotic-like molecules that are derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and horizontal transfer of resistance genes that could result in a reservoir of organisms with a multidrug resistant gene pool.

  17. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, T.; Panjehpour, M.; Overholt, B.F.

    1996-12-03

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample. 5 figs.

  18. A general approach to double-moment normalization of drop size distributions

    NASA Astrophysics Data System (ADS)

    Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.

    2003-04-01

    Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of the scaling normalization that uses one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization that uses two moments as parameters. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. This provides a unified view of the question of DSD normalization and a good model representation of DSDs. Data analysis shows that, from the point of view of moment estimation, least squares regression is slightly more effective than moment estimation from the normalized average DSD.
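
    A minimal sketch of a two-moment scaling normalization is given below: a characteristic diameter and concentration are built from two moments M_i and M_j of the DSD, and distributions with different parameters collapse onto a common shape h(x). The particular exponents follow the generic two-moment scaling, and treating i = 3, j = 4 as the reference pair is an illustrative choice, not necessarily the one used in the paper.

      import numpy as np

      def double_moment_normalize(D, N_D, i=3, j=4):
          # Two-moment scaling: x = D / Dc and h(x) = N(D) / Nc, with Dc and Nc
          # built from the moments M_i and M_j of the distribution.
          Mi = np.trapz(N_D * D ** i, D)
          Mj = np.trapz(N_D * D ** j, D)
          Dc = (Mj / Mi) ** (1.0 / (j - i))                       # characteristic diameter
          Nc = Mi ** ((j + 1.0) / (j - i)) * Mj ** (-(i + 1.0) / (j - i))
          return D / Dc, N_D / Nc

      # Exponential DSDs with different parameters collapse onto one curve h(x).
      D = np.linspace(0.05, 10.0, 500)                            # drop diameter, mm
      for N0, lam in [(8000.0, 3.0), (2000.0, 2.0)]:
          x, h = double_moment_normalize(D, N0 * np.exp(-lam * D))
          print(np.interp(1.0, x, h))                             # h(1) is nearly identical for both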

  19. Toward the optimization of normalized graph Laplacian.

    PubMed

    Xie, Bo; Wang, Meng; Tao, Dacheng

    2011-04-01

    Normalized graph Laplacian has been widely used in many practical machine learning algorithms, e.g., spectral clustering and semisupervised learning. However, all of them use the Euclidean distance to construct the graph Laplacian, which does not necessarily reflect the inherent distribution of the data. In this brief, we propose a method to directly optimize the normalized graph Laplacian by using pairwise constraints. The learned graph is consistent with equivalence and nonequivalence pairwise relationships, and thus it can better represent similarity between samples. Meanwhile, our approach, unlike metric learning, automatically determines the scale factor during the optimization. The learned normalized Laplacian matrix can be directly applied in spectral clustering and semisupervised learning algorithms. Comprehensive experiments demonstrate the effectiveness of the proposed approach.
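
    For reference, the quantity being optimized starts from the standard symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2} built from a similarity graph; the sketch below constructs it from a Euclidean-distance affinity, the very construction the brief argues can be improved by learning the graph from pairwise constraints. The toy data are illustrative.

      import numpy as np

      def normalized_laplacian(W):
          # Symmetric normalized graph Laplacian: L = I - D^{-1/2} W D^{-1/2}.
          W = np.asarray(W, dtype=float)
          d = W.sum(axis=1)
          d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
          return np.eye(W.shape[0]) - d_inv_sqrt @ W @ d_inv_sqrt

      # Small similarity graph built from Euclidean distances.
      X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
      dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
      W = np.exp(-dist2 / 2.0)
      np.fill_diagonal(W, 0.0)

      L = normalized_laplacian(W)
      eigvals = np.linalg.eigvalsh(L)
      print(eigvals)   # the number of near-zero eigenvalues reflects the number of clusters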

  20. The Effect of Normalization in Violence Video Classification Performance

    NASA Astrophysics Data System (ADS)

    Ali, Ashikin; Senan, Norhalina

    2017-08-01

    Data pre-processing is an important part of data mining, and normalization is a pre-processing stage for many types of problems, especially in video classification. Video classification is challenging because of heterogeneous content, large variations in video quality, and the complex semantic meanings of the concepts involved. To regularize this problem, it is sensible to include a thorough pre-processing stage, since normalization aids the robustness of classification performance. This process scales all numeric variables into a certain range to make them more meaningful for the later phases of the available data mining techniques. This paper therefore examines the effect of two normalization techniques, namely Min-Max normalization and Z-score, on violence video classification performance using a Multi-layer Perceptron (MLP) classifier. Using Min-Max normalization with the range [0,1], the result shows almost 98% accuracy, whereas Min-Max normalization with the range [-1,1] gives 59% accuracy and Z-score gives 50%.
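
    The two pre-processing schemes compared in the study are standard rescalings; a minimal sketch of Min-Max scaling (to [0, 1] or [-1, 1]) and Z-score standardization is given below with an arbitrary toy feature vector.

      import numpy as np

      def min_max(x, lo=0.0, hi=1.0):
          # Rescale a feature to the range [lo, hi].
          x = np.asarray(x, dtype=float)
          return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

      def z_score(x):
          # Standardize a feature to zero mean and unit variance.
          x = np.asarray(x, dtype=float)
          return (x - x.mean()) / x.std()

      feature = np.array([3.0, 7.0, 11.0, 50.0])
      print(min_max(feature, 0, 1))    # the [0, 1] variant reported as most accurate
      print(min_max(feature, -1, 1))
      print(z_score(feature))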

  1. Mitochondrial dysfunction in myocardium obtained from clinically normal dogs, clinically normal anesthetized dogs, and dogs with dilated cardiomyopathy.

    PubMed

    Sleeper, Meg M; Rosato, Bradley P; Bansal, Seema; Avadhani, Narayan G

    2012-11-01

    To compare mitochondrial complex I and complex IV activity in myocardial mitochondria of clinically normal dogs, clinically normal dogs exposed to inhalation anesthesia, and dogs affected with dilated cardiomyopathy. Myocardial samples obtained from 21 euthanized dogs (6 clinically normal [control] dogs, 5 clinically normal dogs subjected to inhalation anesthesia with isoflurane prior to euthanasia, 5 dogs with juvenile-onset dilated cardiomyopathy, and 5 dogs with adult-onset dilated cardiomyopathy). Activity of mitochondrial complex I and complex IV was assayed spectrophotometrically in isolated mitochondria from left ventricular tissue obtained from the 4 groups of dogs. Activity of complex I and complex IV was significantly decreased in anesthetized dogs, compared with activities in the control dogs and dogs with juvenile-onset or adult-onset dilated cardiomyopathy. Inhalation anesthesia disrupted the electron transport chain in the dogs, which potentially led to an outburst of reactive oxygen species that caused mitochondrial dysfunction. Inhalation anesthesia depressed mitochondrial function in dogs, similar to results reported in other species. This effect is important to consider when anesthetizing animals with myocardial disease and suggested that antioxidant treatments may be beneficial in some animals. Additionally, this effect should be considered when designing studies in which mitochondrial enzyme activity will be measured. Additional studies that include a larger number of animals are warranted.

  2. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2018-01-01

    To date, mass spectrometry (MS) data remain inherently biased as a result of reasons ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for the bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from the DNA microarray techniques. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different strategies in normalization are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis has an effect on the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation the most between technical replicates in all examined data sets. Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization performed also systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in the light of the results of our evaluation. © The Author 2016. Published by Oxford University Press.
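
    One of the simpler strategies in the comparison, normalizing each sample against a reference by linear regression, can be sketched on log-intensities as below; this is not Vsn, and the row-mean pseudo-reference is an assumption made only for illustration.

    ```python
    import numpy as np

    def linear_regression_normalize(log_intensity):
        """Normalize each sample (column) of a log-intensity matrix by regressing it
        against a pseudo-reference (row means) and removing the fitted linear trend."""
        ref = np.nanmean(log_intensity, axis=1)              # pseudo-reference profile
        normalized = log_intensity.copy()
        for j in range(log_intensity.shape[1]):
            y = log_intensity[:, j]
            ok = ~np.isnan(y) & ~np.isnan(ref)
            slope, intercept = np.polyfit(ref[ok], y[ok], 1)  # fit y ~ slope*ref + intercept
            normalized[:, j] = (y - intercept) / slope        # map the sample onto the reference scale
        return normalized
    ```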

  3. Verbal Processing Reaction Times in "Normal" and "Poor" Readers.

    ERIC Educational Resources Information Center

    Culbertson, Jack; And Others

    After it had been determined that reaction time (RT) was a sensitive measure of hemispheric dominance in a verbal task performed by normal adult readers, the reaction times of three groups of subjects (20 normal reading college students, 12 normal reading third graders and 11 poor reading grade school students) were compared. Ss were exposed to…

  4. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  5. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  6. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  7. Volume-preserving normal forms of Hopf-zero singularity

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integrals, and it is the space in which our approach operates. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed; such perturbations can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.
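
    For orientation, a Hopf-zero (fold-Hopf) singularity has a linear part with eigenvalues 0 and ±iω; the commonly quoted quadratic truncation of its classical normal form in cylindrical coordinates is sketched below with generic coefficients a, b, c. This is background only, not the volume-preserving parametric normal form derived in the paper.

    ```latex
    % Linear part: eigenvalues 0 and \pm i\omega
    \dot{x} = -\omega y, \qquad \dot{y} = \omega x, \qquad \dot{z} = 0 .

    % Generic quadratic truncation in cylindrical coordinates (x = r\cos\theta,\ y = r\sin\theta)
    \dot{r} = a\, r z, \qquad
    \dot{z} = b\, r^{2} + c\, z^{2}, \qquad
    \dot{\theta} = \omega .
    ```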

  8. An Integrated Approach for RNA-seq Data Normalization.

    PubMed

    Yang, Shengping; Mercante, Donald E; Zhang, Kun; Fang, Zhide

    2016-01-01

    DNA copy number alteration is common in many cancers. Studies have shown that insertion or deletion of DNA sequences can directly alter gene expression, and significant correlation exists between DNA copy number and gene expression. Data normalization is a critical step in the analysis of gene expression generated by RNA-seq technology. Successful normalization reduces/removes unwanted nonbiological variations in the data, while keeping meaningful information intact. However, as far as we know, no attempt has been made to adjust for the variation due to DNA copy number changes in RNA-seq data normalization. In this article, we propose an integrated approach for RNA-seq data normalization. Comparisons show that the proposed normalization can improve power for downstream differentially expressed gene detection and generate more biologically meaningful results in gene profiling. In addition, our findings show that, due to the effects of copy number changes, some housekeeping genes are not always suitable internal controls for studying gene expression. By using information from DNA copy number, the integrated approach succeeds in reducing noise due to both biological and nonbiological causes in RNA-seq data, thus increasing the accuracy of gene profiling.
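
    To make the idea concrete, here is a deliberately naive sketch in which library-size-normalized expression is additionally rescaled by each gene's relative DNA copy number; it only illustrates the general notion of folding copy number into normalization and is not the integrated method proposed in the article (the ploidy baseline of 2 and the flooring at 0.5 are assumptions).

    ```python
    import numpy as np

    def copy_number_aware_normalize(counts, copy_number, ploidy=2.0):
        """counts:      genes x samples raw RNA-seq counts
        copy_number: genes x samples DNA copy-number estimates
        Returns expression corrected for library size and relative copy number."""
        lib_size = counts.sum(axis=0)
        cpm = counts / lib_size * 1e6                        # library-size (CPM) normalization
        cn_ratio = np.maximum(copy_number, 0.5) / ploidy     # relative copy number, floored to avoid /0
        return cpm / cn_ratio                                # remove expression change driven by copy number
    ```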

  9. Pattern Adaptation and Normalization Reweighting.

    PubMed

    Westrick, Zachary M; Heeger, David J; Landy, Michael S

    2016-09-21

    Adaptation to an oriented stimulus changes both the gain and preferred orientation of neural responses in V1. Neurons tuned near the adapted orientation are suppressed, and their preferred orientations shift away from the adapter. We propose a model in which weights of divisive normalization are dynamically adjusted to homeostatically maintain response products between pairs of neurons. We demonstrate that this adjustment can be performed by a very simple learning rule. Simulations of this model closely match existing data from visual adaptation experiments. We consider several alternative models, including variants based on homeostatic maintenance of response correlations or covariance, as well as feedforward gain-control models with multiple layers, and we demonstrate that homeostatic maintenance of response products provides the best account of the physiological data. Adaptation is a phenomenon throughout the nervous system in which neural tuning properties change in response to changes in environmental statistics. We developed a model of adaptation that combines normalization (in which a neuron's gain is reduced by the summed responses of its neighbors) and Hebbian learning (in which synaptic strength, in this case divisive normalization, is increased by correlated firing). The model is shown to account for several properties of adaptation in primary visual cortex in response to changes in the statistics of contour orientation. Copyright © 2016 the authors 0270-6474/16/369805-12$15.00/0.
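
    A toy simulation of the two ingredients named in the abstract, divisive normalization plus a Hebbian-style adjustment of the normalization weights toward target response products, might look like the sketch below; the tuning curves, target ensemble, learning rate, and constants are illustrative assumptions, not the authors' fitted model.

    ```python
    import numpy as np

    n = 16                                    # orientation-tuned units
    prefs = np.linspace(0, np.pi, n, endpoint=False)
    W = np.ones((n, n)) / n                   # divisive normalization weights
    sigma2, eta = 0.1, 0.05                   # semisaturation constant, learning rate

    def drive(theta):
        """Feedforward drive: von Mises-like orientation tuning."""
        return np.exp(np.cos(2 * (theta - prefs)) - 1)

    def responses(theta):
        d = drive(theta)
        return d / (sigma2 + W @ d)           # divisive normalization

    # Target response products measured under a uniform stimulus ensemble
    target = np.mean([np.outer(responses(t), responses(t))
                      for t in np.linspace(0, np.pi, 32)], axis=0)

    # Adapt to a single orientation: weights grow where response products exceed target,
    # suppressing gain near the adapter (homeostatic maintenance of response products).
    adapter = np.pi / 2
    for _ in range(200):
        r = responses(adapter)
        W += eta * (np.outer(r, r) - target)
        W = np.clip(W, 0.0, None)

    adapted = responses(adapter)
    ```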

  10. Fault stability under conditions of variable normal stress

    USGS Publications Warehouse

    Dieterich, J.H.; Linker, M.F.

    1992-01-01

    The stability of fault slip under conditions of varying normal stress is modelled as a spring and slider system with rate- and state-dependent friction. Coupling of normal stress to shear stress is achieved by inclining the spring at an angle to the sliding surface. Linear analysis yields two conditions for unstable slip. The first, of a type previously identified for constant normal stress systems, results in instability if stiffness is below a critical value. Critical stiffness depends on normal stress, constitutive parameters, characteristic sliding distance and the spring angle. Instability of the first type is possible only for velocity-weakening friction. The second condition yields instability if the spring angle is less than −cot⁻¹(μss), where μss is the steady-state sliding friction. The second condition can arise under conditions of velocity strengthening or weakening. Stability fields for finite perturbations are investigated by numerical simulation. -Authors

  11. Bacterial microflora of normal and telangiectatic livers in cattle.

    PubMed

    Stotland, E I; Edwards, J F; Roussel, A J; Simpson, R B

    2001-07-01

    To identify potential bacterial pathogens in normal and telangiectatic livers of mature cattle at slaughter and to identify consumer risk associated with hepatic telangiectasia. 50 normal livers and 50 severely telangiectatic livers. Normal and telangiectatic livers were collected at slaughter for aerobic and anaerobic bacterial culture. Isolates were identified, and patterns of isolation were analyzed. Histologic examination of all livers was performed. Human pathogens isolated from normal and telangiectatic livers included Escherichia coli O157:H7 and group-D streptococci. Most livers in both groups contained bacteria in low numbers; however, more normal livers yielded negative culture results. More group-D streptococci were isolated from the right lobes of telangiectatic livers than from the left lobes, and more gram-negative anaerobic bacteria were isolated from left lobes of telangiectatic livers than from right lobes. All telangiectatic lesions were free of fibrosis, active necrotizing processes, and inflammation. The USDA regulation condemning telangiectatic livers is justified insofar as these livers contain more bacteria than normal livers do; however, normal livers contain similar species of microflora. Development of telangiectasia could not be linked to an infectious process. The finding of E coli O157:H7 in bovine livers suggests that information regarding bacterial content of other offal and muscle may identify sources of this and other potential foodborne pathogens and assist in establishing critical control points for the meat industry.

  12. CEC-normalized clay-water sorption isotherm

    NASA Astrophysics Data System (ADS)

    Woodruff, W. F.; Revil, A.

    2011-11-01

    A normalized clay-water isotherm model based on BET theory and describing the sorption and desorption of the bound water in clays, sand-clay mixtures, and shales is presented. Clay-water sorption isotherms (sorption and desorption) of clayey materials are normalized by their cation exchange capacity (CEC) accounting for a correction factor depending on the type of counterion sorbed on the mineral surface in the so-called Stern layer. With such normalizations, all the data collapse into two master curves, one for sorption and one for desorption, independent of the clay mineralogy, crystallographic considerations, and bound cation type; therefore, neglecting the true heterogeneity of water sorption/desorption in smectite. The two master curves show the general hysteretic behavior of the capillary pressure curve at low relative humidity (below 70%). The model is validated against several data sets obtained from the literature comprising a broad range of clay types and clay mineralogies. The CEC values, derived by inverting the sorption/adsorption curves using a Markov chain Monte Carlo approach, are consistent with the CEC associated with the clay mineralogy.

  13. Vagina: What's Normal, What's Not

    MedlinePlus

    ... some antibiotics increases the risk of a vaginal yeast infection. Birth control and feminine-hygiene products. Barrier ... or change in the normal balance of vaginal yeast and bacteria can cause inflammation of the vagina ( ...

  14. Graph-based normalization and whitening for non-linear data analysis.

    PubMed

    Aaron, Catherine

    2006-01-01

    In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to get a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures using a weighted graph to build a local normalization called "graph-based" normalization. Then we give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
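
    As a rough sketch of the stated principle (a spherical average neighbourhood with unit radius), the snippet below rescales the data so that the mean k-nearest-neighbour distance becomes one, first globally and then locally; the choice of k and the Zelnik-Manor-style local scaling of pairwise distances are stand-ins for, not reproductions of, the paper's weighted-graph construction.

    ```python
    import numpy as np

    def neighborhood_radius(X, k=5):
        """Average distance from each point to its k nearest neighbours."""
        d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)
        return np.sort(d, axis=1)[:, :k].mean(axis=1)

    def global_normalize(X, k=5):
        """Global normalization: rescale the whole data set so that the
        average neighbourhood radius equals one."""
        return X / neighborhood_radius(X, k).mean()

    def locally_scaled_distances(X, k=5):
        """Locally normalized pairwise distances d_ij / sqrt(r_i * r_j),
        in the spirit of giving every point a unit-radius neighbourhood."""
        d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
        r = neighborhood_radius(X, k)
        return d / np.sqrt(np.outer(r, r))
    ```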

  15. Normal modes of a small gamelan gong.

    PubMed

    Perrin, Robert; Elford, Daniel P; Chalmers, Luke; Swallowe, Gerry M; Moore, Thomas R; Hamdan, Sinin; Halkon, Benjamin J

    2014-10-01

    Studies have been made of the normal modes of a 20.7 cm diameter steel gamelan gong. A finite-element model has been constructed and its predictions for normal modes compared with experimental results obtained using electronic speckle pattern interferometry. Agreement was reasonable in view of the lack of precision in the manufacture of the instrument. The results agree with expectations for an axially symmetric system subject to small symmetry breaking. The extent to which the results obey Chladni's law is discussed. Comparison with vibrational and acoustical spectra enabled the identification of the small number of modes responsible for the sound output when played normally. Evidence of non-linear behavior was found, mainly in the form of subharmonics of true modes. Experiments using scanning laser Doppler vibrometry gave satisfactory agreement with the other methods.

  16. Normalization methods in time series of platelet function assays

    PubMed Central

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Abstract Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from the high-dimensional data spaces of temporal multivariate clinical data represented as multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate use of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) over all time points and per time point over all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation, as calculated by the Spearman correlation test, when normalized per assay (test) over all time points. When normalizing per time point over all tests, no correlation could be discerned from the charts, as was also the case when using all data as one dataset for normalization. PMID:27428217
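
    The distinction between normalizing per assay (over all time points) and per time point (over all assays) is easy to express with pandas; the z- and interquartile-range transformations below follow their generic definitions, and the data-frame layout (rows = time points, columns = assays) is an assumption for illustration.

    ```python
    import numpy as np
    import pandas as pd

    # rows = time points, columns = platelet function assays (stand-in data)
    df = pd.DataFrame(np.random.rand(10, 4) * 100,
                      columns=["ADP", "ARA", "COL", "TRAP"])

    def z_transform(x):
        return (x - x.mean()) / x.std()

    def iqr_transform(x):
        return (x - x.median()) / (x.quantile(0.75) - x.quantile(0.25))

    per_assay_z     = df.apply(z_transform, axis=0)   # per assay, over all time points
    per_timepoint_z = df.apply(z_transform, axis=1)   # per time point, over all assays
    per_assay_iqr   = df.apply(iqr_transform, axis=0)
    ```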

  17. Normal Language Skills and Normal Intelligence in a Child with de Lange Syndrome.

    ERIC Educational Resources Information Center

    Cameron, Thomas H.; Kelly, Desmond P.

    1988-01-01

    The subject of this case report is a two-year, seven-month-old girl with de Lange syndrome, normal intelligence, and age-appropriate language skills. She demonstrated initial delays in gross motor skills and in receptive and expressive language but responded well to intensive speech and language intervention, as well as to physical therapy.…

  18. The experience of weight management in normal weight adults.

    PubMed

    Hernandez, Cheri Ann; Hernandez, David A; Wellington, Christine M; Kidd, Art

    2016-11-01

    No prior research has been done with normal weight persons specific to their experience of weight management. The purpose of this research was to discover the experience of weight management in normal weight individuals. Glaserian grounded theory was used. Qualitative data (focus group) and quantitative data (food diary, study questionnaire, and anthropometric measures) were collected. Weight management was an ongoing process of trying to focus on living (family, work, and social), while maintaining their normal weight targets through five consciously and unconsciously used strategies. Despite maintaining normal weights, the nutritional composition of foods eaten was grossly inadequate. These five strategies can be used to develop new weight management strategies that could be integrated into existing weight management programs, or could be developed into novel weight management interventions. Surprisingly, normal weight individuals require dietary assessment and nutrition education to prevent future negative health consequences. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Bayes classification of terrain cover using normalized polarimetric data

    NASA Technical Reports Server (NTRS)

    Yueh, H. A.; Swartz, A. A.; Kong, J. A.; Shin, R. T.; Novak, L. M.

    1988-01-01

    The normalized polarimetric classifier (NPC) which uses only the relative magnitudes and phases of the polarimetric data is proposed for discrimination of terrain elements. The probability density functions (PDFs) of polarimetric data are assumed to have a complex Gaussian distribution, and the marginal PDF of the normalized polarimetric data is derived by adopting the Euclidean norm as the normalization function. The general form of the distance measure for the NPC is also obtained. It is demonstrated that for polarimetric data with an arbitrary PDF, the distance measure of NPC will be independent of the normalization function selected even when the classifier is mistrained. A complex Gaussian distribution is assumed for the polarimetric data consisting of grass and tree regions. The probability of error for the NPC is compared with those of several other single-feature classifiers. The classification error of NPCs is shown to be independent of the normalization function.
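
    A minimal sketch of the normalization step described here, dividing each complex polarimetric feature vector by its Euclidean norm before a simple class-mean decision, is given below; the synthetic two-class data and the nearest-mean rule are simplifying stand-ins, not the NPC distance measure derived in the paper.

    ```python
    import numpy as np

    def normalize(X):
        """Divide each polarimetric feature vector (row) by its Euclidean norm,
        keeping only relative magnitudes and phases."""
        return X / np.linalg.norm(X, axis=1, keepdims=True)

    rng = np.random.default_rng(0)
    grass = normalize(rng.normal(1.0, 0.3, (200, 3)) + 1j * rng.normal(0.0, 0.3, (200, 3)))
    trees = normalize(rng.normal(0.2, 0.3, (200, 3)) + 1j * rng.normal(0.5, 0.3, (200, 3)))

    means = {"grass": grass.mean(axis=0), "trees": trees.mean(axis=0)}

    def classify(x):
        """Assign a normalized sample to the class with the nearest mean."""
        x = x / np.linalg.norm(x)
        return min(means, key=lambda c: np.linalg.norm(x - means[c]))
    ```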

  20. Telomere length in normal and neoplastic canine tissues.

    PubMed

    Cadile, Casey D; Kitchell, Barbara E; Newman, Rebecca G; Biller, Barbara J; Hetler, Elizabeth R

    2007-12-01

    To determine the mean telomere restriction fragment (TRF) length in normal and neoplastic canine tissues. 57 solid-tissue tumor specimens collected from client-owned dogs, 40 samples of normal tissue collected from 12 clinically normal dogs, and blood samples collected from 4 healthy blood donor dogs. Tumor specimens were collected from client-owned dogs during diagnostic or therapeutic procedures at the University of Illinois Veterinary Medical Teaching Hospital, whereas 40 normal tissue samples were collected from 12 control dogs. Telomere restriction fragment length was determined by use of an assay kit. A histologic diagnosis was provided for each tumor by personnel at the Veterinary Diagnostic Laboratory at the University of Illinois. Mean of the mean TRF length for 44 normal samples was 19.0 kilobases (kb; range, 15.4 to 21.4 kb), and the mean of the mean TRF length for 57 malignant tumors was 19.0 kb (range, 12.9 to 23.5 kb). Although the mean of the mean TRF length for tumors and normal tissues was identical, tumor samples had more variability in TRF length. Telomerase, which represents the main mechanism by which cancer cells achieve immortality, is an attractive therapeutic target. The ability to measure telomere length is crucial to monitoring the efficacy of telomerase inhibition. In contrast to many other mammalian species, the length of canine telomeres and the rate of telomeric DNA loss are similar to those reported in humans, making dogs a compelling choice for use in the study of human anti-telomerase strategies.

  1. Normal mode-guided transition pathway generation in proteins

    PubMed Central

    Lee, Byung Ho; Seo, Sangjae; Kim, Min Hyeok; Kim, Youngjin; Jo, Soojin; Choi, Moon-ki; Lee, Hoomin; Choi, Jae Boong

    2017-01-01

    The biological function of proteins is closely related to their structural motion. For instance, structurally misfolded proteins do not function properly. Although we are able to obtain structural information on proteins experimentally, it is still challenging to capture their dynamics, such as transition processes. Therefore, we need a simulation method to predict the transition pathways of a protein in order to understand and study large functional deformations. Here, we present a new simulation method called normal mode-guided elastic network interpolation (NGENI) that performs normal mode analysis iteratively to predict transition pathways of proteins. To be more specific, NGENI obtains displacement vectors that determine intermediate structures by interpolating the distance between two end-point conformations, similar to a morphing method called elastic network interpolation. However, the displacement vector is regarded as a linear combination of the normal mode vectors of each intermediate structure, in order to enhance the physical sense of the proposed pathways. As a result, we can generate transition pathways that are more reasonable both geometrically and thermodynamically. Even when only the lowest normal modes are used rather than all of them, NGENI can still generate reasonable pathways for large deformations in proteins. This study shows that global protein transitions are dominated by collective motion, which means that a few of the lowest normal modes play an important role in this process. NGENI also has considerable merit in terms of computational cost because it can generate transition pathways using only a subset of the degrees of freedom, which conventional methods cannot do. PMID:29020017
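
    The core step, expressing each interpolation displacement as a linear combination of the lowest normal-mode vectors, can be sketched as a least-squares projection; the Hessian input, the number of modes, and the step size below are illustrative assumptions rather than the NGENI implementation.

    ```python
    import numpy as np

    def lowest_modes(hessian, n_modes=10):
        """Lowest-frequency normal modes (eigenvectors of the Hessian),
        skipping the six rigid-body modes."""
        w, v = np.linalg.eigh(hessian)
        return v[:, 6:6 + n_modes]

    def mode_guided_step(x_current, x_target, hessian, n_modes=10, step=0.1):
        """One interpolation step: project the remaining displacement onto the
        lowest normal modes and move a fraction 'step' along that projection."""
        modes = lowest_modes(hessian, n_modes)                 # shape (3N, n_modes)
        dx = (x_target - x_current).ravel()
        coeffs, *_ = np.linalg.lstsq(modes, dx, rcond=None)    # least-squares mode amplitudes
        return x_current + step * (modes @ coeffs).reshape(x_current.shape)
    ```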

  2. Job strain and hypertension in women: Estudo Pro-Saúde (Pro-Health Study).

    PubMed

    Alves, Márcia Guimarães de Mello; Chor, Dóra; Faerstein, Eduardo; Werneck, Guilherme L; Lopes, Claudia S

    2009-10-01

    This study aimed to analyze the association between job strain and hypertension in the female population. A cross-sectional study was performed with 1,819 women who participated in the Estudo Pró-Saúde (Pro-Health Study), in the city of Rio de Janeiro, Southeastern Brazil, between 1999 and 2001. The Brazilian version of the short version of the Job Stress Scale (demand-control model) was used. Overall prevalence of measured hypertension (≥140/90 mmHg and/or antihypertensive drug use) was 24%. Compared to participants with jobs classified as low strain, adjusted prevalence ratios for hypertension in women who performed passive, active, and high-strain jobs were, respectively, 0.93 (95% CI: 0.72;1.20), 1.06 (95% CI: 0.86;1.32) and 1.14 (95% CI: 0.88;1.47). Longitudinal analyses should be performed to clarify the role of these work environment psychosocial characteristics as a determinant of hypertension.

  3. The Influence of Normalization Weight in Population Pharmacokinetic Covariate Models.

    PubMed

    Goulooze, Sebastiaan C; Völler, Swantje; Välitalo, Pyry A J; Calvier, Elisa A M; Aarons, Leon; Krekels, Elke H J; Knibbe, Catherijne A J

    2018-03-23

    In covariate (sub)models of population pharmacokinetic models, most covariates are normalized to the median value; however, for body weight, normalization to 70 kg or 1 kg is often applied. In this article, we illustrate the impact of normalization weight on the precision of population clearance (CLpop) parameter estimates. The influence of normalization weight (70 kg, 1 kg, or median weight) on the precision of the CLpop estimate, expressed as relative standard error (RSE), was illustrated using data from a pharmacokinetic study in neonates with a median weight of 2.7 kg. In addition, a simulation study was performed to show the impact of normalization to 70 kg in pharmacokinetic studies with paediatric or obese patients. The RSE of the CLpop parameter estimate in the neonatal dataset was lowest with normalization to median weight (8.1%), compared with normalization to 1 kg (10.5%) or 70 kg (48.8%). Typical clearance (CL) predictions were independent of the normalization weight used. Simulations showed that the increase in RSE of the CLpop estimate with 70 kg normalization was highest in studies with a narrow weight range and a geometric mean weight away from 70 kg. When, instead of normalizing with median weight, a weight outside the observed range is used, the RSE of the CLpop estimate will be inflated, and should therefore not be used for model selection. Instead, established mathematical principles can be used to calculate the RSE of the typical CL (CLTV) at a relevant weight to evaluate the precision of CL predictions.
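
    The covariate submodel under discussion is the usual power model on body weight; the sketch below shows that the typical clearance prediction at a given weight is the same under different normalization weights, with the allometric exponent of 0.75 used purely as an illustrative assumption.

    ```python
    def clearance(weight, cl_pop, wt_norm, exponent=0.75):
        """Typical clearance under a power covariate model:
        CL = CL_pop * (WT / WT_norm) ** exponent."""
        return cl_pop * (weight / wt_norm) ** exponent

    # The same prediction at 2.7 kg can be parameterized with either normalization
    # weight; only the precision (RSE) of the estimated CL_pop differs.
    cl_median = clearance(2.7, cl_pop=1.0,  wt_norm=2.7)   # CL_pop estimated at the median weight
    cl_70kg   = clearance(2.7, cl_pop=11.5, wt_norm=70.0)  # CL_pop estimated at 70 kg, same prediction
    ```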

  4. Decorin and biglycan of normal and pathologic human corneas

    NASA Technical Reports Server (NTRS)

    Funderburgh, J. L.; Hevelone, N. D.; Roth, M. R.; Funderburgh, M. L.; Rodrigues, M. R.; Nirankari, V. S.; Conrad, G. W.

    1998-01-01

    PURPOSE: Corneas with scars and certain chronic pathologic conditions contain highly sulfated dermatan sulfate, but little is known of the core proteins that carry these atypical glycosaminoglycans. In this study the proteoglycan proteins attached to dermatan sulfate in normal and pathologic human corneas were examined to identify primary genes involved in the pathobiology of corneal scarring. METHODS: Proteoglycans from human corneas with chronic edema, bullous keratopathy, and keratoconus and from normal corneas were analyzed using sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), quantitative immunoblotting, and immunohistology with peptide antibodies to decorin and biglycan. RESULTS: Proteoglycans from pathologic corneas exhibit increased size heterogeneity and binding of the cationic dye alcian blue compared with those in normal corneas. Decorin and biglycan extracted from normal and diseased corneas exhibited similar molecular size distribution patterns. In approximately half of the pathologic corneas, the level of biglycan was elevated an average of seven times above normal, and decorin was elevated approximately three times above normal. The increases were associated with highly charged molecular forms of decorin and biglycan, indicating modification of the proteins with dermatan sulfate chains of increased sulfation. Immunostaining of corneal sections showed an abnormal stromal localization of biglycan in pathologic corneas. CONCLUSIONS: The increased dermatan sulfate associated with chronic corneal pathologic conditions results from stromal accumulation of decorin and particularly of biglycan in the affected corneas. These proteins bear dermatan sulfate chains with increased sulfation compared with normal stromal proteoglycans.

  5. The Normalization of Cannabis Use Among Bangladeshi and Pakistani Youth: A New Frontier for the Normalization Thesis?

    PubMed

    Williams, Lisa; Ralphs, Rob; Gray, Paul

    2017-03-21

    The Asian population in Britain has grown, representing the second largest ethnic group; Bangladeshi, Pakistani, and Indian nationalities are prevalent (Jivraj, 2012 ; Office for National Statistics, 2013 ). Yet, we know relatively little about the nature and extent of their substance use. Jayakody et al. ( 2006 ) argue ethnic minority groups may be influenced by the norms and values of the dominant culture. Given recreational drug use has undergone a process of normalization in Britain (Aldridge et al., 2011 ; Parker et al., 1998 , 2002 ), we explore the degree to which this is occurring in a Bangladeshi and Pakistani community of Muslim faith in Northern England; a group typically assumed to reject substance use because of robust religious and cultural values. To examine the extent, frequency, and nature of substance use, and associated attitudes. A cross-sectional study collecting qualitative data from a sample (N = 43) of adolescents accessing a drug service and a range of professionals working with them during 2014. We also present analyses of routinely collected quantitative client data. Adolescent interviewees reported extensive personal experience smoking skunk cannabis, and professionals working in the community confirmed many young Asians smoked it. Its consumption appeared to be accommodated into the daily lives of young people and the supply of it also showed signs of acceptance. Skunk cannabis may be undergoing a process of normalization within some Asian communities in Britain. Our study has significant implications for the normalization thesis, finding evidence for normalization within a subpopulation that is typically perceived to resist this trend.

  6. Cortical thickness in neuropsychologically near-normal schizophrenia.

    PubMed

    Cobia, Derin J; Csernansky, John G; Wang, Lei

    2011-12-01

    Schizophrenia is a severe psychiatric illness with widespread impairments of cognitive functioning; however, a certain percentage of subjects are known to perform in the normal range on neuropsychological measures. While the cognitive profiles of these individuals have been examined, there has been relatively little attention to the neuroanatomical characteristics of this important subgroup. The aims of this study were to statistically identify schizophrenia subjects with relatively normal cognition, examine their neuroanatomical characteristics relative to their more impaired counterparts using cortical thickness mapping, and to investigate relationships between these characteristics and demographic variables to better understand the nature of cognitive heterogeneity in schizophrenia. Clinical, neuropsychological, and MRI data were collected from schizophrenia (n = 79) and healthy subjects (n = 65). A series of clustering algorithms on neuropsychological scores was examined, and a 2-cluster solution that separated subjects into neuropsychologically near-normal (NPNN) and neuropsychologically impaired (NPI) groups was determined most appropriate. Surface-based cortical thickness mapping was utilized to examine differences in thinning among schizophrenia subtypes compared with the healthy participants. A widespread cortical thinning pattern characteristic of schizophrenia emerged in the NPI group, while NPNN subjects demonstrated very limited thinning relative to healthy comparison subjects. Analysis of illness duration indicated minimal effects on subtype classification and cortical thickness results. Findings suggest a strong link between cognitive impairment and cortical thinning in schizophrenia, where subjects with near-normal cognitive abilities also demonstrate near-normal cortical thickness patterns. While generally supportive of distinct etiological processes for cognitive subtypes, results provide direction for further examination of additional

  7. Um estudo espectrofotométrico da variável cataclísmica V3885 Sgr

    NASA Astrophysics Data System (ADS)

    Ribeiro, F. M. A.; Diaz, M. P.

    2003-08-01

    Cataclysmic variables are close binary systems composed of a red dwarf that transfers matter to a white dwarf; in non-magnetic systems an accretion disc forms around the white dwarf. V3885 Sgr is a cataclysmic variable classified as a nova-like. We present a high-time-resolution spectrophotometric study of V3885 Sgr carried out in the visible region. The observed region is centred on Hα and also covers the HeI 6678 line. The first result of this study is the determination of the orbital period, from radial velocity measurements of the Hα line, as 0.20716071(22) days, resolving inconsistencies regarding this value in the literature and defining a long-term ephemeris for the system. With this period and the radial velocity measurements of the Hα line profile, a mass diagram was constructed, from which we constrain the masses of the stellar components and limit the orbital inclination of the system. Greenstein diagrams were constructed for the Hα and HeI lines, in which the mean spectra in each phase interval are displayed side by side in grey scale, indicating intense emission from the rear part of the disc. From Doppler tomography we obtained radial emissivity profiles of the disc for both the Hα and HeI lines. The results are compared with those of other systems studied with the same technique. Results from flickering tomography of the system will also be presented.

  8. COMS normal operation for Earth Observation mission

    NASA Astrophysics Data System (ADS)

    Cho, Young-Min

    2012-09-01

    Communication Ocean Meteorological Satellite (COMS), for the hybrid mission of meteorological observation, ocean monitoring, and telecommunication service, was launched into geostationary Earth orbit on June 27, 2010, and has been in normal operation service since April 2011. The COMS is located at 128.2° East in the geostationary orbit. In order to perform the three missions, the COMS carries three separate payloads: the Meteorological Imager (MI), the Geostationary Ocean Color Imager (GOCI), and the Ka-band antenna. Each payload is dedicated to one of the three missions. The MI and GOCI perform the Earth observation missions of meteorological observation and ocean monitoring, respectively. For this Earth observation mission the COMS requires daily mission commands from the satellite control ground station, and the daily mission is affected by satellite control activities; for this reason daily mission planning is required. The Earth observation mission operation of COMS is described in terms of mission operation characteristics and mission planning for the normal operation services of meteorological observation and ocean monitoring. The first-year normal operation results after the In-Orbit Test (IOT) are analysed statistically to present the achieved COMS normal operation status for the Earth observation mission.

  9. Normals to a Parabola

    ERIC Educational Resources Information Center

    Srinivasan, V. K.

    2013-01-01

    Given a parabola in the standard form y[superscript 2] = 4ax, corresponding to three points on the parabola, such that the normals at these three points P, Q, R concur at a point M = (h, k), the equation of the circumscribing circle through the three points P, Q, and R provides a tremendous opportunity to illustrate "The Art of Algebraic…
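
    For context, the parametric algebra behind the statement is summarized below; the cubic gives the parameters of the three feet P, Q, R of the concurrent normals, while the circumscribing circle itself is left to the article.

    ```latex
    % Normal to y^2 = 4ax at the point (at^2, 2at):
    y + t x = 2at + at^{3}.

    % Requiring the normal to pass through M = (h, k) gives a cubic in t,
    a t^{3} + (2a - h)\, t - k = 0,
    % whose roots t_1, t_2, t_3 correspond to the feet P, Q, R and satisfy
    t_1 + t_2 + t_3 = 0, \qquad
    t_1 t_2 + t_2 t_3 + t_3 t_1 = \frac{2a - h}{a}, \qquad
    t_1 t_2 t_3 = \frac{k}{a}.
    ```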

  10. Measurements of normal joint angles by goniometry in calves.

    PubMed

    Sengöz Şirin, O; Timuçin Celik, M; Ozmen, A; Avki, S

    2014-01-01

    The aim of this study was to establish normal reference values of the forelimb and hindlimb joint angles in normal Holstein calves. Thirty clinically normal Holstein calves that were free of any detectable musculoskeletal abnormalities were included in the study. A standard transparent plastic goniometer was used to measure maximum flexion, maximum extension, and range-of-motion of the shoulder, elbow, carpal, hip, stifle, and tarsal joints. The goniometric measurements were done on awake calves that were positioned in lateral recumbency. The goniometric values were measured and recorded by two independent investigators. As a result of the study it was concluded that goniometric values obtained from awake calves in lateral recumbency were found to be highly consistent and accurate between investigators (p <0.05). The data of this study acquired objective and useful information on the normal forelimb and hindlimb joint angles in normal Holstein calves. Further studies can be done to predict detailed goniometric values from different diseases and compare them.

  11. Effects of normalization on quantitative traits in association test

    PubMed Central

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
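
    The rank-based transformation evaluated here is commonly implemented as a rank-based inverse normal transformation; a short SciPy sketch is given below, with Blom's offset of 0.375 as an assumed (conventional) choice.

    ```python
    import numpy as np
    from scipy.stats import norm, rankdata

    def rank_inverse_normal(x, offset=0.375):
        """Rank-based inverse normal transformation (Blom's formula):
        map ranks to quantiles of the standard normal distribution."""
        ranks = rankdata(x)                  # average ranks for ties
        n = len(x)
        return norm.ppf((ranks - offset) / (n - 2 * offset + 1))

    trait = np.random.exponential(size=500)  # skewed quantitative trait
    normalized_trait = rank_inverse_normal(trait)
    ```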

  12. Normal keratinized mucosa transplants in nude mice.

    PubMed

    Holmstrup, P; Dabelsteen, E; Reibel, J; Harder, F

    1981-01-01

    Two types of normal keratinized mucosa were transplanted to subcutaneous sites of nude mice of two different strains. 24 intact specimens of clinically normal human palatal mucosa were transplanted to nude mice of the strain nu/nu NC. The transplants were recovered after 42 d with a recovery rate of 96%. Moreover, 22 intact specimens of normal rat forestomach mucosa were transplanted to nude mice of the strain nu/nu BALB/c/BOM. These transplants were recovered after 21 d with a recovery rate of 63%. The histologic features of the transplants were essentially the same as those of the original tissues. However, epithelial outgrowths from the transplants differed with respect to the pattern of keratinization. The outgrowths of human palatal mucosa transplants were essentially unkeratinized, while the outgrowths of the rat forestomach transplants showed continued keratinization.

  13. Masturbation, sexuality, and adaptation: normalization in adolescence.

    PubMed

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  14. Are cancer cells really softer than normal cells?

    PubMed

    Alibert, Charlotte; Goud, Bruno; Manneville, Jean-Baptiste

    2017-05-01

    Solid tumours are often first diagnosed by palpation, suggesting that the tumour is more rigid than its surrounding environment. Paradoxically, individual cancer cells appear to be softer than their healthy counterparts. In this review, we first list the physiological reasons indicating that cancer cells may be more deformable than normal cells. Next, we describe the biophysical tools that have been developed in recent years to characterise and model cancer cell mechanics. By reviewing the experimental studies that compared the mechanics of individual normal and cancer cells, we argue that cancer cells can indeed be considered as softer than normal cells. We then focus on the intracellular elements that could be responsible for the softening of cancer cells. Finally, we ask whether the mechanical differences between normal and cancer cells can be used as diagnostic or prognostic markers of cancer progression. © 2017 Société Française des Microscopies and Société de Biologie Cellulaire de France. Published by John Wiley & Sons Ltd.

  15. Spectra of normal and nutrient-deficient maize leaves

    NASA Technical Reports Server (NTRS)

    Al-Abbas, A. H.; Barr, R.; Hall, J. D.; Crane, F. L.; Baumgardner, M. F.

    1973-01-01

    Reflectance, transmittance and absorptance spectra of normal and six types of nutrient-deficient (N, P, K, S, Mg, and Ca) maize (Zea mays L.) leaves were analyzed at 30 selected wavelengths from 500 to 2600 nm. The analysis of variance showed significant differences in reflectance, transmittance and absorptance in the visible wavelengths among leaf numbers 3, 4, and 5, among the seven treatments, and among the interactions of leaf number and treatments. In the infrared wavelengths only treatments produced significant differences. The chlorophyll content of leaves was reduced in all nutrient-deficient treatments. Percent moisture was increased in S-, Mg-, and N-deficiencies. Polynomial regression analysis of leaf thickness and leaf moisture content showed that these two variables were significantly and directly related. Leaves from the P- and Ca-deficient plants absorbed less energy in the near infrared than the normal plants; S-, Mg-, K-, and N-deficient leaves absorbed more than the normal leaves. Both S- and N-deficient leaves had higher temperatures than normal maize leaves.

  16. Proteoglycans in Leiomyoma and Normal Myometrium

    PubMed Central

    Barker, Nichole M.; Carrino, David A.; Caplan, Arnold I.; Hurd, William W.; Liu, James H.; Tan, Huiqing; Mesiano, Sam

    2015-01-01

    Uterine leiomyomas are common benign pelvic tumors composed of modified smooth muscle cells and a large amount of extracellular matrix (ECM). The proteoglycan composition of the leiomyoma ECM is thought to affect pathophysiology of the disease. To test this hypothesis, we examined the abundance (by immunoblotting) and expression (by quantitative real-time polymerase chain reaction) of the proteoglycans biglycan, decorin, and versican in leiomyoma and normal myometrium and determined whether expression is affected by steroid hormones and menstrual phase. Leiomyoma and normal myometrium were collected from women (n = 17) undergoing hysterectomy or myomectomy. In vitro studies were performed on immortalized leiomyoma (UtLM) and normal myometrial (hTERT-HM) cells with and without exposure to estradiol and progesterone. In leiomyoma tissue, abundance of decorin messenger RNA (mRNA) and protein were 2.6-fold and 1.4-fold lower, respectively, compared with normal myometrium. Abundance of versican mRNA was not different between matched samples, whereas versican protein was increased 1.8-fold in leiomyoma compared with myometrium. Decorin mRNA was 2.4-fold lower in secretory phase leiomyoma compared with proliferative phase tissue. In UtLM cells, progesterone decreased the abundance of decorin mRNA by 1.3-fold. Lower decorin expression in leiomyoma compared with myometrium may contribute to disease growth and progression. As decorin inhibits the activity of specific growth factors, its reduced level in the leiomyoma cell microenvironment may promote cell proliferation and ECM deposition. Our data suggest that decorin expression in leiomyoma is inhibited by progesterone, which may be a mechanism by which the ovarian steroids affect leiomyoma growth and disease progression. PMID:26423601

  17. The classification of normal screening mammograms

    NASA Astrophysics Data System (ADS)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using common lexicon describing normal appearances. Cases were also assessed on their suitability for a single reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 "low", 10 "medium" and nine "high" difficulties. Data was analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for single reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having `dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.

  18. The COBE normalization for standard cold dark matter

    NASA Technical Reports Server (NTRS)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer Satellite (COBE) detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Omega_B and H_0. For sCDM we find the mean value of Q = 19.9 ± 1.5 μK, corresponding to sigma_8 = 1.34 ± 0.10, with the normalization at large scales being B = (8.16 ± 1.04) x 10^5 (Mpc/h)^4; other numbers are given in the table. The measured rms temperature fluctuation smoothed on 10 deg is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of sigma(10 deg) is quite consistent with the best-fitting mean value of Q. The use of the mean value of Q should be preferred over sigma(10 deg), when its value can be determined for a particular theory, since it makes full use of the data.

  19. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    NASA Astrophysics Data System (ADS)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages present a special deviation from the original language. Malay Tweets are widely used by Twitter users, especially in the Malay Archipelago. Thus, it is important to build a normalization system that can translate Malay Tweet language into standard Malay. Several researchers have worked on natural language processing that focuses mainly on normalizing English Twitter messages, while few studies have been done to normalize Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages such as colloquial language, novel words, and interjections into standard Malay. The research will use a language model and an n-gram model.
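
    A minimal sketch of the hybrid idea, a dictionary that proposes standard-Malay candidates for a noisy token, re-ranked by an n-gram language model, is shown below; the tiny dictionary, the bigram counts, and the scoring are hypothetical placeholders rather than the proposed system.

    ```python
    from collections import defaultdict

    # Hypothetical noisy-to-standard dictionary (colloquial token -> candidate standard forms)
    DICTIONARY = {"x": ["tak", "tidak"], "sy": ["saya"], "mkn": ["makan"]}

    # Hypothetical bigram counts standing in for a trained n-gram language model
    BIGRAMS = defaultdict(int, {("saya", "tak"): 7, ("saya", "tidak"): 3, ("tak", "makan"): 5})

    def score(prev, word):
        return BIGRAMS[(prev, word)] + 1          # add-one smoothing

    def normalize(tweet):
        out = []
        for token in tweet.lower().split():
            candidates = DICTIONARY.get(token, [token])   # dictionary lookup
            prev = out[-1] if out else "<s>"
            out.append(max(candidates, key=lambda w: score(prev, w)))  # language-model re-ranking
        return " ".join(out)

    print(normalize("sy x mkn"))    # -> "saya tak makan"
    ```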

  20. Normal stress effects on Knudsen flow

    NASA Astrophysics Data System (ADS)

    Eu, Byung Chan

    2018-01-01

    Normal stress effects are investigated on tube flow of a single-component non-Newtonian fluid under a constant pressure gradient in a constant temperature field. The generalized hydrodynamic equations are employed, which are consistent with the laws of thermodynamics. In the cylindrical tube flow configuration, the solutions of generalized hydrodynamic equations are exactly solvable and the flow velocity is obtained in a simple one-dimensional integral quadrature. Unlike the case of flow in the absence of normal stresses, the flow develops an anomaly in that the flow in the boundary layer becomes stagnant and the thickness of such a stagnant velocity boundary layer depends on the pressure gradient, the aspect ratio of the radius to the length of the tube, and the pressure (or density and temperature) at the entrance of the tube. The volume flow rate formula through the tube is derived for the flow. It generalizes the Knudsen flow rate formula to the case of a non-Newtonian stress tensor in the presence of normal stress differences. It also reduces to the Navier-Stokes theory formula in the low shear rate limit near equilibrium.

  1. Glymphatic MRI in idiopathic normal pressure hydrocephalus

    PubMed Central

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-01-01

    Abstract The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer’s disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 + 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic

  2. Glymphatic MRI in idiopathic normal pressure hydrocephalus.

    PubMed

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-10-01

    The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer's disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 + 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic function. In

  3. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1998-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.

  4. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1998-11-03

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries. 19 figs.

  5. Normal Perceptual Sensitivity Arising From Weakly Reflective Cone Photoreceptors

    PubMed Central

    Bruce, Kady S.; Harmening, Wolf M.; Langston, Bradley R.; Tuten, William S.; Roorda, Austin; Sincich, Lawrence C.

    2015-01-01

    Purpose To determine the light sensitivity of poorly reflective cones observed in retinas of normal subjects, and to establish a relationship between cone reflectivity and perceptual threshold. Methods Five subjects (four male, one female) with normal vision were imaged longitudinally (7–26 imaging sessions, representing 82–896 days) using adaptive optics scanning laser ophthalmoscopy (AOSLO) to monitor cone reflectance. Ten cones with unusually low reflectivity, as well as 10 normally reflective cones serving as controls, were targeted for perceptual testing. Cone-sized stimuli were delivered to the targeted cones and luminance increment thresholds were quantified. Thresholds were measured three to five times per session for each cone in the 10 pairs, all located 2.2 to 3.3° from the center of gaze. Results Compared with other cones in the same retinal area, three of 10 monitored dark cones were persistently poorly reflective, while seven occasionally manifested normal reflectance. Tested psychophysically, all 10 dark cones had thresholds comparable with those from normally reflecting cones measured concurrently (P = 0.49). The variation observed in dark cone thresholds also matched the wide variation seen in a large population (n = 56 cone pairs, six subjects) of normal cones; in the latter, no correlation was found between cone reflectivity and threshold (P = 0.0502). Conclusions Low cone reflectance cannot be used as a reliable indicator of cone sensitivity to light in normal retinas. To improve assessment of early retinal pathology, other diagnostic criteria should be employed along with imaging and cone-based microperimetry. PMID:26193919

  6. Microarray expression profiling in adhesion and normal peritoneal tissues.

    PubMed

    Ambler, Dana R; Golden, Alicia M; Gell, Jennifer S; Saed, Ghassan M; Carey, David J; Diamond, Michael P

    2012-05-01

    To identify molecular markers associated with adhesion and normal peritoneal tissue using microarray expression profiling. Comparative study. University hospital. Five premenopausal women. Adhesion and normal peritoneal tissue samples were obtained from premenopausal women. Ribonucleic acid was extracted using standard protocols and processed for hybridization to Affymetrix Whole Transcript Human Gene Expression Chips. Microarray data were obtained from five different patients, each with adhesion tissue and normal peritoneal samples. Real-time polymerase chain reaction was performed for confirmation using standard protocols. Gene expression in postoperative adhesion and normal peritoneal tissues. A total of 1,263 genes were differentially expressed between adhesion and normal tissues. One hundred seventy-three genes were found to be up-regulated and 56 genes were down-regulated in the adhesion tissues compared with normal peritoneal tissues. The genes were sorted into functional categories according to Gene Ontology annotations. Twenty-six up-regulated genes and 11 down-regulated genes were identified with functions potentially relevant to the pathophysiology of postoperative adhesions. We evaluated and confirmed expression of 12 of these specific genes via polymerase chain reaction. The pathogenesis, natural history, and optimal treatment of postoperative adhesive disease remain open questions. Microarray analysis of adhesions identified specific genes with increased and decreased expression when compared with normal peritoneum. Knowledge of these genes and ontologic pathways with altered expression provides targets for new therapies to treat patients who have or are at risk for postoperative adhesions. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  7. Resistance to antibiotics in the normal flora of animals.

    PubMed

    Sørum, H; Sunde, M

    2001-01-01

    The normal bacterial flora contains antibiotic resistance genes to various degrees, even in individuals with no history of exposure to commercially prepared antibiotics. Several factors seem to increase the number of antibiotic-resistant bacteria in feces. One important factor is the exposure of the intestinal flora to antibacterial drugs. Antibiotics used as feed additives seem to play an important role in the development of antibiotic resistance in normal flora bacteria. The use of avoparcin as a feed additive has demonstrated that an antibiotic considered "safe" is responsible for increased levels of antibiotic resistance in the normal flora enterococci of animals fed with avoparcin and possibly in humans consuming products from these animals. However, other factors like stress from temperature, crowding, and management also seem to contribute to the occurrence of antibiotic resistance in normal flora bacteria. The normal flora of animals has been studied with respect to the development of antibiotic resistance over four decades, but there are few studies with the intestinal flora as the main focus. The results of earlier studies are valuable when viewed against the recent understanding of the mobile genetic elements responsible for bacterial antibiotic resistance. New studies should be undertaken to assess whether the development of antibiotic resistance in the normal flora is directly linked to the dramatic increase in antibiotic resistance of bacterial pathogens. Bacteria of the normal flora, often disregarded scientifically, should be studied with the intention of using them as active protection against infectious diseases and thereby contributing to the overall reduction of the use of antibiotics in both animals and humans.

  8. The significance of early post-exercise ST segment normalization.

    PubMed

    Chow, Rudy; Fordyce, Christopher B; Gao, Min; Chan, Sammy; Gin, Kenneth; Bennett, Matthew

    2015-01-01

    The persistence of ST segment depression in recovery signifies a strongly positive exercise treadmill test (ETT). However, it is unclear if early recovery of ST segments portends a similar prognosis. We sought to determine if persistence of ST depression into recovery correlates with ischemic burden based on myocardial perfusion imaging (MPI). This was a retrospective analysis of 853 consecutive patients referred for exercise MPI at a tertiary academic center over a 24-month period. Patients were stratified into three groups based on the results of the ETT: normal (negative ETT), persistence (positive ETT with >1 mm ST segment depression at 1 minute in recovery) and early normalization (positive ETT with <1 mm ST segment depression at 1 minute in recovery). Summed stress scores (SSSs) were then calculated for each patient, while the coronary anatomy was reported for the subset of patients who received coronary angiograms. A total of 513 patients had a negative ETT, 235 patients met criteria for early normalization, while 105 patients met criteria for persistence. The persistence group had a significantly greater SSS (8.48±7.77) than both the early normalization (4.34±4.98, p<0.001) and normal (4.47±5.31, p<0.001) groups. The SSSs of the early normalization and normal groups were not statistically different and met the prespecified non-inferiority margin (mean difference 0.12, lower 95% CI -0.66, p<0.001). Among the 87 patients who underwent an angiogram, significant three-vessel or left main disease was seen in 39.3% of the persistence group compared with 5.9% of the normal and 7.4% of the early normalization groups. Among patients with an electrically positive ETT, recovery of ST segment depression within 1 minute was associated with a lower SSS than persistence of ST depression beyond 1 minute. Furthermore, early ST segment recovery conferred a similar SSS to patients with a negative ETT. These results suggest that among patients evaluated for chest pain with

  9. Improved Algorithm For Finite-Field Normal-Basis Multipliers

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1989-01-01

    Improved algorithm reduces complexity of calculations that must precede design of Massey-Omura finite-field normal-basis multipliers, used in error-correcting-code equipment and cryptographic devices. Algorithm represents an extension of development reported in "Algorithm To Design Finite-Field Normal-Basis Multipliers" (NPO-17109), NASA Tech Briefs, Vol. 12, No. 5, page 82.
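
    For context on the arithmetic these multipliers exploit (the improved NPO-17109 algorithm itself is not reproduced here): in a normal basis {β, β^2, β^4, ..., β^(2^(m-1))} of GF(2^m), squaring is simply a cyclic shift of the coefficient vector, and a Massey-Omura multiplier obtains every product bit from one bilinear form evaluated on rotated operands. A minimal Python sketch of that structure follows, with a placeholder λ matrix rather than one derived from a real field design:

        import numpy as np

        m = 4  # illustrative field size, GF(2^4)

        def square(a_bits):
            # In a normal basis, the Frobenius map a -> a^2 permutes the basis
            # elements cyclically, so squaring is a rotation of the bit-vector.
            return np.roll(a_bits, 1)

        def massey_omura_product(a_bits, b_bits, lam):
            # Bit k of c = a*b is the same bilinear form a^T.lam.b applied to
            # operands rotated k positions; lam is the design-time "lambda" matrix
            # whose computation the improved algorithm is concerned with.
            c = np.zeros(m, dtype=int)
            for k in range(m):
                a_rot = np.roll(a_bits, -k)
                b_rot = np.roll(b_bits, -k)
                c[k] = int(a_rot @ lam @ b_rot) % 2
            return c

        lam = np.eye(m, dtype=int)  # placeholder only; a real lam comes from the field design
        a = np.array([1, 0, 1, 1])
        print(square(a))                        # [1 1 0 1]: a cyclic shift
        print(massey_omura_product(a, a, lam))  # same bilinear form evaluated at each rotation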

  10. Gang Youth, Substance Use Patterns, and Drug Normalization

    ERIC Educational Resources Information Center

    Sanders, Bill

    2012-01-01

    Gang membership is an indicator of chronic illicit substance use and such patterns of use may have a normalized character. Using epidemiological and qualitative data collected between 2006 and 2007, this manuscript examines the drug normalization thesis among a small sample (n=60) of gang youth aged 16-25 years from Los Angeles. Overall, while…

  11. Empirical evaluation of data normalization methods for molecular classification.

    PubMed

    Huang, Huei-Chung; Qin, Li-Xuan

    2018-01-01

    Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers, an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.

  12. Empirical evaluation of data normalization methods for molecular classification

    PubMed Central

    Huang, Huei-Chung

    2018-01-01

    Background Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers—an increasingly important application of microarrays in the era of personalized medicine. Methods In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. Results In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Conclusion Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy. PMID:29666754
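
    The abstracts above do not name the three normalization methods that were compared, so the following is only a generic illustration: quantile normalization is one method commonly applied to microarray data, and a minimal sketch of it on a features-by-samples matrix looks like this (ties are broken arbitrarily, which is acceptable for a sketch):

        import numpy as np

        def quantile_normalize(X):
            # Force every sample (column) to share the same empirical distribution:
            # rank each entry within its column, then replace it with the mean of
            # the values having that rank across all columns.
            order = np.argsort(X, axis=0)
            ranks = np.argsort(order, axis=0)
            reference = np.sort(X, axis=0).mean(axis=1)
            return reference[ranks]

        # toy example: 5 probes x 3 arrays, with an additive handling shift in array 2
        X = np.array([[5.0, 7.0, 4.0],
                      [2.0, 4.0, 1.0],
                      [3.0, 5.0, 2.0],
                      [4.0, 6.0, 3.0],
                      [1.0, 3.0, 0.0]])
        print(quantile_normalize(X))  # every column now carries identical values per rank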

  13. Indentation stiffness does not discriminate between normal and degraded articular cartilage.

    PubMed

    Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle

    2007-08-01

    Relative indentation characteristics are commonly used for distinguishing between normal healthy and degraded cartilage. The application of this parameter in surgical decision making and an appreciation of articular cartilage biomechanics has prompted us to hypothesise that it is difficult to define a reference stiffness to characterise normal articular cartilage. This hypothesis is tested for validity by carrying out biomechanical indentation of articular cartilage samples that are characterised as visually normal and degraded relative to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% depreciation in the stiffness of individual samples after proteoglycan depletion, they also showed that, when compared to the stiffness of normal samples, only 17% lie outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples, and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices, demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.

  14. Quasi-normal modes from non-commutative matrix dynamics

    NASA Astrophysics Data System (ADS)

    Aprile, Francesco; Sanfilippo, Francesco

    2017-09-01

    We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.

  15. Normalization of urinary drug concentrations with specific gravity and creatinine.

    PubMed

    Cone, Edward J; Caplan, Yale H; Moser, Frank; Robert, Tim; Shelby, Melinda K; Black, David L

    2009-01-01

    Excessive fluid intake can substantially dilute urinary drug concentrations and result in false-negative reports for drug users. Methods for correction ("normalization") of drug/metabolite concentrations in urine have been utilized by anti-doping laboratories, pain monitoring programs, and in environmental monitoring programs to compensate for excessive hydration, but such procedures have not been used routinely in workplace, legal, and treatment settings. We evaluated two drug normalization procedures based on specific gravity and creatinine. These corrections were applied to urine specimens collected from three distinct groups (pain patients, heroin users, and marijuana/cocaine users). Each group was unique in characteristics, study design, and dosing conditions. The results of the two normalization procedures were highly correlated (r=0.94; range, 0.78-0.99). Increases in percent positives by specific gravity and creatinine normalization were small (0.3% and -1.0%, respectively) for heroin users (normally hydrated subjects), modest (4.2-9.8%) for pain patients (unknown hydration state), and substantial (2- to 38-fold increases) for marijuana/cocaine users (excessively hydrated subjects). Despite some limitations, these normalization procedures provide alternative means of dealing with highly dilute, dilute, and concentrated urine specimens. Drug/metabolite concentration normalization by these procedures is recommended for urine testing programs, especially as a means of coping with dilute specimens.
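
    Both corrections are simple algebraic rescalings; a sketch of commonly used forms is given below, where the reference specific gravity of 1.020 and the unit conventions are assumptions for illustration rather than values taken from the paper:

        def sg_normalize(conc, sg_specimen, sg_reference=1.020):
            # Specific-gravity correction: scale by the ratio of "excess" specific
            # gravity of the reference urine to that of the specimen.
            return conc * (sg_reference - 1.0) / (sg_specimen - 1.0)

        def creatinine_normalize(conc_ng_per_ml, creatinine_mg_per_dl):
            # Creatinine correction: express the analyte as ng per mg of creatinine
            # (dividing mg/dL by 100 converts it to mg/mL).
            return conc_ng_per_ml / (creatinine_mg_per_dl / 100.0)

        # toy example: a dilute specimen (specific gravity 1.003, creatinine 20 mg/dL)
        print(sg_normalize(15.0, 1.003))         # about 100 ng/mL after correction
        print(creatinine_normalize(15.0, 20.0))  # 75 ng per mg creatinine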

  16. Protein Degradation in Normal and Beige (Chediak-Higashi) Mice

    PubMed Central

    Lyons, Robert T.; Pitot, Henry C.

    1978-01-01

    The beige mouse, C57BL/6 (bg/bg), is an animal model for the Chediak-Higashi syndrome in man, a disease characterized morphologically by giant lysosomes in most cell types. Half-lives for the turnover of [14C]bicarbonate-labeled total soluble liver protein were determined in normal and beige mice. No significant differences were observed between the normal and mutant strain for both rapidly and slowly turning-over classes of proteins. Glucagon treatment during the time-course of protein degradation had similar effects on both normal and mutant strains and led to the conclusion that the rate of turnover of endogenous intracellular protein in the beige mouse liver does not differ from normal. The rates of uptake and degradation of an exogenous protein were determined in normal and beige mice by intravenously injecting 125I-bovine serum albumin and following, in peripheral blood, the loss with time of phosphotungstic acid-insoluble bovine serum albumin and the parallel appearance of phosphotungstic acid-soluble (degraded) material. No significant differences were observed between beige and normal mice in the uptake by liver lysosomes of 125I-bovine serum albumin (t½ = 3.9 and 2.8 h, respectively). However, it was found that lysosomes from livers of beige mice released phosphotungstic acid-soluble radioactivity at a rate significantly slower than normal (t½ = 6.8 and 3.1 h, respectively). This defect in beige mice could be corrected by chronic administration of carbamyl choline (t½ = 3.5 h), a cholinergic agonist which raises intracellular cyclic GMP levels. However, no significant differences between normal and beige mice were observed either in the ability of soluble extracts of liver and kidney to bind [3H]cyclic GMP in vitro or in the basal levels of cyclic AMP in both tissues. The relevance of these observations to the presumed biochemical defect underlying the Chediak-Higashi syndrome is discussed. PMID:202611

  17. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected... whole milk which such farmer would have sold in the commercial market in each of the pay periods in the...

  18. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassanein, A.; Konkashbaev, I.

    1999-03-15

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte-Carlo methods. The analytical solutions provide insight into the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters.

  19. A comparison of vowel normalization procedures for language variation research

    NASA Astrophysics Data System (ADS)

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels (``vowel-extrinsic'' information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself (``vowel-intrinsic'' information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., ``formant-extrinsic'' F2-F1).

  20. A comparison of vowel normalization procedures for language variation research.

    PubMed

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).
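
    A concrete example of a vowel-extrinsic, formant-intrinsic procedure of the kind that performed best in this comparison is Lobanov's z-score normalization; the sketch below (plain NumPy, speaker labels as strings) is an illustration rather than the study's implementation:

        import numpy as np

        def lobanov_normalize(formants, speakers):
            # For each speaker and each formant separately, subtract the speaker's
            # mean and divide by the speaker's standard deviation, computed over all
            # of that speaker's vowel tokens (vowel-extrinsic, formant-intrinsic).
            formants = np.asarray(formants, dtype=float)   # tokens x formants (F1, F2, ...)
            speakers = np.asarray(speakers)
            out = np.empty_like(formants)
            for s in np.unique(speakers):
                rows = speakers == s
                mu = formants[rows].mean(axis=0)
                sd = formants[rows].std(axis=0)
                out[rows] = (formants[rows] - mu) / sd
            return out

        # toy example: two talkers, three tokens each, columns are F1 and F2 in Hz
        F = [[300, 2300], [600, 1800], [750, 1200],
             [350, 2600], [700, 2000], [850, 1350]]
        spk = ["A", "A", "A", "B", "B", "B"]
        print(lobanov_normalize(F, spk))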

  1. Normalized methodology for medical infrared imaging

    NASA Astrophysics Data System (ADS)

    Vargas, J. V. C.; Brioschi, M. L.; Dias, F. G.; Parolin, M. B.; Mulinari-Brenner, F. A.; Ordonez, J. C.; Colman, D.

    2009-01-01

    A normalized procedure for medical infrared imaging is suggested, and illustrated by a leprosy and hepatitis C treatment follow-up, in order to investigate the effect of concurrent treatment which has not been reported before. A 50-year-old man with indeterminate leprosy and a 20-year history of hepatitis C was monitored for 587 days, starting from the day the patient received treatment for leprosy. Standard therapy for hepatitis C started 30 days later. Both visual observations and normalized infrared imaging were conducted periodically to assess the response to leprosy treatment. The primary end points were effectiveness of the method under different boundary conditions over the period, and rapid assessment of the response to leprosy treatment. The patient achieved sustained hepatitis C virological response 6 months after the end of the treatment. The normalized infrared results demonstrate the leprosy treatment success in spite of the concurrent hepatitis C treatment, since day 87, whereas repigmentation was visually assessed only after day 182, and corroborated with a skin biopsy on day 390. The method detected the effectiveness of the leprosy treatment in 87 days, whereas repigmentation started only in 182 days. Hepatitis C and leprosy treatment did not affect each other.

  2. A Late Babylonian Normal and ziqpu star text

    NASA Astrophysics Data System (ADS)

    Roughton, N. A.; Steele, J. M.; Walker, C. B. F.

    2004-09-01

    The Late Babylonian tablet BM 36609+ is a substantial rejoined fragment of an important and previously unknown compendium of short texts dealing with the use of stars in astronomy. Three of the fragments which constitute BM 36609+ were first identified as containing a catalogue of Babylonian "Normal Stars" (stars used as reference points in the sky to track the movement of the moon and planets) by N. A. Roughton. C. B. F. Walker has been able to join several more fragments to the tablet which have revealed that other sections of the compendium concern a group of stars whose culminations are used for keeping time, known as ziqpu-stars after the Akkadian term for culmination, ziqpu. All the preserved sections on the obverse of BM 36609+ concern ziqpu-stars. On the reverse of the tablet we find several sections concerning Normal Stars. This side begins with a catalogue of Normal Stars giving their positions within zodiacal signs. The catalogue is apparently related to the only other Normal Star catalogue previously known, BM 46083 published by Sachs. In the following we present an edition of BM 36609+ based upon Walker's transliteration of the tablet. Since Sachs' edition of BM 46083, the Normal Star catalogue related to BM 36609+, was based upon a photograph and is incomplete, we include a fresh edition of the tablet. A list of Akkadian and translated star names with identifications is given.

  3. Modeling and simulation of normal and hemiparetic gait

    NASA Astrophysics Data System (ADS)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. Forces derived from a Rayleigh dissipation function, which account for the effect of the tissues during gait, are included. Depending on the value of the coefficient in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data for an adult are used in the simulation; anthropometric data for children can also be used, provided the appropriate anthropometric tables are consulted. Validation of the model includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few of them have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
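
    As a much-reduced illustration of the ingredients described above (Lagrangian dynamics plus a Rayleigh dissipation term whose coefficient is varied to move from normal toward pathological behavior), the sketch below simulates a single-pendulum leg swing; it is a toy model with assumed parameter values, not the paper's full gait model:

        import numpy as np

        def simulate_leg_swing(c, theta0=0.4, steps=2000, dt=0.001, m=8.0, L=0.9, g=9.81):
            # Swing phase of a leg idealized as a single pendulum.  The Lagrange
            # equation with a Rayleigh dissipation function R = 0.5*c*omega^2 gives
            #   m*L^2 * theta'' = -m*g*L*sin(theta) - c*theta'
            # Larger c mimics stronger tissue dissipation (a "pathological" setting).
            theta, omega = theta0, 0.0
            trajectory = []
            for _ in range(steps):
                alpha = (-m * g * L * np.sin(theta) - c * omega) / (m * L ** 2)
                omega += alpha * dt          # semi-implicit Euler integration
                theta += omega * dt
                trajectory.append(theta)
            return np.array(trajectory)

        normal_like = simulate_leg_swing(c=1.0)    # light dissipation
        paretic_like = simulate_leg_swing(c=8.0)   # heavy dissipation, slower swing
        print(normal_like[-1], paretic_like[-1])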

  4. Not Quite Normal: Consequences of Violating the Assumption of Normality in Regression Mixture Models

    ERIC Educational Resources Information Center

    Van Horn, M. Lee; Smith, Jessalyn; Fagan, Abigail A.; Jaki, Thomas; Feaster, Daniel J.; Masyn, Katherine; Hawkins, J. David; Howe, George

    2012-01-01

    Regression mixture models, which have only recently begun to be used in applied research, are a new approach for finding differential effects. This approach comes at the cost of the assumption that error terms are normally distributed within classes. This study uses Monte Carlo simulations to explore the effects of relatively minor violations of…

  5. Notes on power of normality tests of error terms in regression models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Střelec, Luboš

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to make inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
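
    A minimal sketch of the kind of check argued for above, fitting an ordinary least-squares line and testing the residuals for normality, is shown below; the Shapiro-Wilk test is used as one classical option, and the contribution's RT class of robust tests is not reproduced here:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 10, size=200)
        eps = rng.exponential(scale=1.0, size=200) - 1.0   # deliberately skewed errors
        y = 2.0 + 0.5 * x + eps

        # ordinary least squares fit
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta

        # classical normality test applied to the estimated error terms
        stat, pvalue = stats.shapiro(residuals)
        print("OLS estimates:", beta)
        print(f"Shapiro-Wilk on residuals: W={stat:.3f}, p={pvalue:.4f}")
        # a small p-value flags non-normal disturbances and cautions against exact t/F inference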

  6. A brain imaging repository of normal structural MRI across the life course: Brain Images of Normal Subjects (BRAINS).

    PubMed

    Job, Dominic E; Dickie, David Alexander; Rodriguez, David; Robson, Andrew; Danso, Sammy; Pernet, Cyril; Bastin, Mark E; Boardman, James P; Murray, Alison D; Ahearn, Trevor; Waiter, Gordon D; Staff, Roger T; Deary, Ian J; Shenkin, Susan D; Wardlaw, Joanna M

    2017-01-01

    The Brain Images of Normal Subjects (BRAINS) Imagebank (http://www.brainsimagebank.ac.uk) is an integrated repository project hosted by the University of Edinburgh and sponsored by the Scottish Imaging Network: A Platform for Scientific Excellence (SINAPSE) collaborators. BRAINS provides sharing and archiving of detailed normal human brain imaging and relevant phenotypic data already collected in studies of healthy volunteers across the life-course. It particularly focusses on the extremes of age (currently older age, and in future perinatal) where variability is largest, and which are under-represented in existing databanks. BRAINS is a living imagebank where new data will be added when available. Currently BRAINS contains data from 808 healthy volunteers, from 15 to 81 years of age, from 7 projects in 3 centres. Additional completed and ongoing studies of normal individuals from 1st to 10th decades are in preparation and will be included as they become available. BRAINS holds several MRI structural sequences, including T1, T2, T2* and fluid attenuated inversion recovery (FLAIR), available in DICOM (http://dicom.nema.org/); in future Diffusion Tensor Imaging (DTI) will be added where available. Images are linked to a wide range of 'textual data', such as age, medical history, physiological measures (e.g. blood pressure), medication use, cognitive ability, and perinatal information for pre/post-natal subjects. The imagebank can be searched to include or exclude ranges of these variables to create better estimates of 'what is normal' at different ages. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Normal central retinal function and structure preserved in retinitis pigmentosa.

    PubMed

    Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V

    2010-02-01

    To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.

  8. Amniotic fluid cortisol and alpha-fetoprotein in normal and aneuploid pregnancies.

    PubMed

    Drugan, A; Subramanian, M G; Johnson, M P; Evans, M I

    1988-01-01

    Cortisol and alpha-fetoprotein (AFP) levels were measured in amniotic fluid (AF) samples at 15-20 weeks of gestation from 125 normal pregnancies and 29 pregnancies affected by aneuploidy. The normal pregnancy group was further subdivided into 'low' AF-AFP (less than 0.6 MOM, n = 60) and 'normal' AF-AFP (0.6 less than AFP less than 1.4 MOM, n = 65). A significant, inverse, linear correlation was found between cortisol and AF-AFP for both normal AFP and low AFP groups (r = -0.26, and r = -0.4, respectively, p less than 0.05). Gestational age was significantly correlated with both cortisol and AFP levels in the normal pregnancy groups. No difference was found when cortisol levels were compared between the low and normal AFP groups. The correlation between cortisol and AFP in aneuploid pregnancies was not significant (p = 0.37). The strong association between cortisol or AFP and gestational age in normal pregnancy (p less than 0.00001) was lost in trisomic gestation. We conclude that higher cortisol levels do not seem to be the cause of low AFP in normal or aneuploid pregnancies.

  9. Physical Properties of Normal Grade Biodiesel and Winter Grade Biodiesel

    PubMed Central

    Sadrolhosseini, Amir Reza; Moksin, Mohd Maarof; Nang, Harrison Lau Lik; Norozi, Monir; Yunus, W. Mahmood Mat; Zakaria, Azmi

    2011-01-01

    In this study, optical and thermal properties of normal grade and winter grade palm oil biodiesel were investigated. Surface Plasmon Resonance and Photopyroelectric techniques were used to evaluate the samples. The dispersion curve and thermal diffusivity were obtained. The refractive index varies more rapidly with wavelength in normal grade biodiesel than in winter grade palm oil biodiesel, and the thermal diffusivity of winter grade biodiesel is higher than that of normal grade biodiesel. This is attributed to the higher palmitic acid C16:0 content in normal grade than in winter grade palm oil biodiesel. PMID:21731429

  10. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    PubMed

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

    Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalizing method during gait compared with the conventional MVIC method. EMG of lower limb muscles (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) was recorded bilaterally in nine healthy participants (five males, aged 29.7±6.2 years, BMI 22.7±3.3 kg m⁻²) giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with an EMG recording. EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of the normalization tasks was similar for the isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method with no special equipment needed and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
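
    The normalization step itself is a ratio of envelopes; the sketch below (rectification, a moving-average envelope, then division by the mean envelope of the reference contraction) uses an assumed window length and toy signals, and is not the study's processing pipeline:

        import numpy as np

        def envelope(emg, fs, window_s=0.1):
            # Linear envelope: full-wave rectification followed by a moving average.
            win = max(1, int(window_s * fs))
            return np.convolve(np.abs(emg), np.ones(win) / win, mode="same")

        def normalize_gait_emg(gait_emg, reference_emg, fs):
            # Express the gait envelope as a percentage of the reference task
            # (an MVIC or an isoMMT3 hold): divide by the mean reference envelope.
            reference_level = envelope(reference_emg, fs).mean()
            return 100.0 * envelope(gait_emg, fs) / reference_level

        # toy signals: 2 s of gait EMG and 3 s of a reference hold, both sampled at 1 kHz
        fs = 1000
        rng = np.random.default_rng(1)
        t = np.arange(2 * fs) / fs
        gait = rng.normal(0, 0.05, 2 * fs) * (1 + np.sin(2 * np.pi * t))
        hold = rng.normal(0, 0.20, 3 * fs)
        print(normalize_gait_emg(gait, hold, fs).max())   # peak gait activity in % of reference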

  11. Normality of different orders for Cantor series expansions

    NASA Astrophysics Data System (ADS)

    Airey, Dylan; Mance, Bill

    2017-10-01

    Let S \\subseteq {N} have the property that for each k \\in S the set (S - k) \\cap {N} \\setminus S has asymptotic density 0. We prove that there exists a basic sequence Q where the set of numbers Q-normal of all orders in S but not Q-normal of all orders not in S has full Hausdorff dimension. If the function \

  12. Bas-Relief Modeling from Normal Images with Intuitive Styles.

    PubMed

    Ji, Zhongping; Ma, Weiyin; Sun, Xianfang

    2014-05-01

    Traditional 3D model-based bas-relief modeling methods are often limited to model-dependent and monotonic relief styles. This paper presents a novel method for digital bas-relief modeling with intuitive style control. Given a composite normal image, the problem discussed in this paper involves generating a discontinuity-free depth field with high compression of depth data while preserving or even enhancing fine details. In our framework, several layers of normal images are composed into a single normal image. The original normal image on each layer is usually generated from 3D models or through other techniques as described in this paper. The bas-relief style is controlled by choosing a parameter and setting a targeted height for them. Bas-relief modeling and stylization are achieved simultaneously by solving a sparse linear system. Different from previous work, our method can be used to freely design bas-reliefs in normal image space instead of in object space, which makes it possible to use any popular image editing tools for bas-relief modeling. Experiments with a wide range of 3D models and scenes show that our method can effectively generate digital bas-reliefs.
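
    The numerical core the abstract refers to, recovering a depth field from a normal image by solving a sparse linear system, can be sketched as a least-squares integration of the gradients implied by the normals; this generic formulation omits the paper's compression and detail-preservation terms:

        import numpy as np
        from scipy.sparse import lil_matrix
        from scipy.sparse.linalg import lsqr

        def heights_from_normals(nx, ny, nz):
            # Unit normals imply gradients p = -nx/nz and q = -ny/nz; ask that finite
            # differences of the unknown height z match them in a least-squares sense.
            h, w = nz.shape
            p, q = -nx / nz, -ny / nz
            idx = np.arange(h * w).reshape(h, w)
            A = lil_matrix((2 * h * w + 1, h * w))
            b = np.zeros(2 * h * w + 1)
            r = 0
            for i in range(h):
                for j in range(w - 1):          # horizontal difference ~ p
                    A[r, idx[i, j + 1]], A[r, idx[i, j]], b[r] = 1.0, -1.0, p[i, j]
                    r += 1
            for i in range(h - 1):
                for j in range(w):              # vertical difference ~ q
                    A[r, idx[i + 1, j]], A[r, idx[i, j]], b[r] = 1.0, -1.0, q[i, j]
                    r += 1
            A[r, idx[0, 0]] = 1.0               # pin one pixel to fix the free offset
            z = lsqr(A.tocsr(), b)[0]
            return z.reshape(h, w)

        # toy check: normals of the tilted plane z = 0.1*x integrate back to that plane
        h, w = 8, 8
        nx, ny, nz = np.full((h, w), -0.1), np.zeros((h, w)), np.ones((h, w))
        norm = np.sqrt(nx ** 2 + ny ** 2 + nz ** 2)
        print(np.round(heights_from_normals(nx / norm, ny / norm, nz / norm)[0], 2))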

  13. Variance-reduction normalization technique for a compton camera system

    NASA Astrophysics Data System (ADS)

    Kim, S. M.; Lee, J. S.; Kim, J. H.; Seo, H.; Kim, C. H.; Lee, C. S.; Lee, S. J.; Lee, M. C.; Lee, D. S.

    2011-01-01

    For an artifact-free dataset, pre-processing (known as normalization) is needed to correct the inherent non-uniformity of detection properties in the Compton camera, which consists of scattering and absorbing detectors. The detection efficiency depends on the non-uniform detection efficiency of the scattering and absorbing detectors, different incidence angles onto the detector surfaces, and the geometry of the two detectors. The correction factor for each detected position pair, which is referred to as the normalization coefficient, is expressed as a product of factors representing the various variations. The variance-reduction technique (VRT) for a Compton camera (a normalization method) was studied. For the VRT, Compton list-mode data of a planar uniform source of 140 keV were generated with the GATE simulation tool. The projection data of a cylindrical software phantom were normalized with normalization coefficients determined from the non-uniformity map, and then reconstructed by an ordered subset expectation maximization algorithm. The coefficients of variation and percent errors of the 3-D reconstructed images showed that the VRT applied to the Compton camera provides enhanced image quality and an increased recovery rate of uniformity in the reconstructed image.

  14. Effective normalization for copy number variation detection from whole genome sequencing.

    PubMed

    Janevski, Angel; Varadan, Vinay; Kamalakaran, Sitharthan; Banerjee, Nilanjana; Dimitrova, Nevenka

    2012-01-01

    Whole genome sequencing enables a high resolution view of the human genome and provides unique insights into genome structure at an unprecedented scale. There have been a number of tools to infer copy number variation in the genome. These tools, while validated, also include a number of parameters that are configurable to the genome data being analyzed. These algorithms allow for normalization to account for individual and population-specific effects on individual genome CNV estimates, but the impact of these changes on the estimated CNVs is not well characterized. We evaluate in detail the effect of normalization methodologies in two CNV algorithms, FREEC and CNV-seq, using whole genome sequencing data from 8 individuals spanning four populations. We apply FREEC and CNV-seq to a sequencing data set consisting of 8 genomes. We use multiple configurations corresponding to different read-count normalization methodologies in FREEC, and statistically characterize the concordance of the CNV calls between FREEC configurations and the analogous output from CNV-seq. The normalization methodologies evaluated in FREEC are: GC content, mappability and control genome. We further stratify the concordance analysis within genic, non-genic, and a collection of validated variant regions. The GC content normalization methodology generates the highest number of altered copy number regions. Both mappability and control genome normalization reduce the total number and length of copy number regions. Mappability normalization yields Jaccard indices in the 0.07 - 0.3 range, whereas using a control genome normalization yields Jaccard index values around 0.4 with normalization based on GC content. The most critical impact of using mappability as a normalization factor is substantial reduction of deletion CNV calls. The output of another method based on control genome normalization, CNV-seq, resulted in comparable CNV call profiles, and substantial agreement in variable gene and CNV region calls
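
    As an illustration of one of the normalization factors compared (GC content), the sketch below rescales binned read counts by the median count of windows with similar GC fraction; the binning scheme is an arbitrary choice for illustration, not FREEC's implementation:

        import numpy as np

        def gc_normalize(read_counts, gc_fraction, n_bins=20):
            # Divide each window's read count by the median count of the windows in
            # its GC-content bin, flattening GC-driven coverage bias before
            # copy-number estimation (a ratio near 1.0 then suggests normal copy number).
            counts = np.asarray(read_counts, dtype=float)
            gc = np.asarray(gc_fraction, dtype=float)
            bins = np.clip((gc * n_bins).astype(int), 0, n_bins - 1)
            out = np.empty_like(counts)
            for b in range(n_bins):
                mask = bins == b
                if mask.any():
                    med = np.median(counts[mask])
                    out[mask] = counts[mask] / med if med > 0 else 0.0
            return out

        # toy example: GC-rich windows are systematically over-covered; one of them
        # drops to about half its bin's median after normalization, suggesting a loss
        gc = np.array([0.35, 0.36, 0.60, 0.61, 0.62, 0.35])
        counts = np.array([100.0, 102.0, 160.0, 158.0, 80.0, 98.0])
        print(gc_normalize(counts, gc))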

  15. "I Treat Him as a Normal Patient": Unveiling the Normalization Coping Strategy Among Formal Caregivers of Persons With Dementia and Its Implications for Person-Centered Care.

    PubMed

    Bentwich, Miriam Ethel; Dickman, Nomy; Oberman, Amitai; Bokek-Cohen, Ya'arit

    2017-11-01

    Currently, 47 million people worldwide have dementia, often requiring paid care by formal caregivers. Research regarding family caregivers suggests normalization as a model for coping with negative emotional outcomes in caring for a person with dementia (PWD). The study aims to explore whether the normalization coping mechanism exists among formal caregivers, to reveal differences in its application among cross-cultural caregivers, and to examine how this coping mechanism may be related to implementing person-centered care for PWDs. Content analysis of interviews with 20 formal caregivers from three cultural groups (Jews born in Israel [JI], Arabs born in Israel [AI], Russian immigrants [RI]), attending to PWDs. We extracted five normalization modes, revealing that AI caregivers had substantially more utterances of normalization expressions than their colleagues. The normalization modes most commonly expressed by AI caregivers relate to the personhood of PWDs. These normalization modes may enhance formal caregivers' ability to employ person-centered care.

  16. Color normalization for robust evaluation of microscopy images

    NASA Astrophysics Data System (ADS)

    Švihlík, Jan; Kybic, Jan; Habart, David

    2015-09-01

    This paper deals with color normalization of microscopy images of Langerhans islets in order to increase robustness of the islet segmentation to illumination changes. The main application is automatic quantitative evaluation of the islet parameters, useful for determining the feasibility of islet transplantation in diabetes. First, background illumination inhomogeneity is compensated and a preliminary foreground/background segmentation is performed. The color normalization itself is done in either lαβ or logarithmic RGB color spaces, by comparison with a reference image. The color-normalized images are segmented using color-based features and pixel-wise logistic regression, trained on manually labeled images. Finally, relevant statistics such as the total islet area are evaluated in order to determine the success likelihood of the transplantation.
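
    One of the two variants mentioned, normalization in logarithmic RGB space against a reference image, amounts to a per-channel mean and standard-deviation transfer; the sketch below is that generic transfer (the lαβ variant would differ only in the color space used), not the paper's full pipeline:

        import numpy as np

        def log_rgb_normalize(image, reference, eps=1e-6):
            # Match each channel of `image` to `reference` in log-RGB space: z-score
            # the log intensities, then rescale to the reference channel's statistics.
            img = np.log(image.astype(float) + eps)
            ref = np.log(reference.astype(float) + eps)
            out = np.empty_like(img)
            for c in range(3):
                mu_i, sd_i = img[..., c].mean(), img[..., c].std()
                mu_r, sd_r = ref[..., c].mean(), ref[..., c].std()
                out[..., c] = (img[..., c] - mu_i) / (sd_i + eps) * sd_r + mu_r
            return np.clip(np.exp(out) - eps, 0, 255).astype(np.uint8)

        # toy example: an image with a green cast is pulled toward the reference statistics
        rng = np.random.default_rng(0)
        reference = rng.integers(60, 200, size=(64, 64, 3), dtype=np.uint8)
        image = reference.copy()
        image[..., 1] = np.clip(image[..., 1].astype(int) + 40, 0, 255).astype(np.uint8)
        print(log_rgb_normalize(image, reference).mean(axis=(0, 1)))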

  17. Selective attention in normal and impaired hearing.

    PubMed

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  18. Normal mode study of the earth's rigid body motions

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1983-01-01

    In this paper it is shown that the earth's rigid body (rb) motions can be represented by an analytical set of eigensolutions to the equation of motion for elastic-gravitational free oscillations. Thus each degree of freedom in the rb motion is associated with a rb normal mode. Cases of both nonrotating and rotating earth models are studied, and it is shown that the rb modes do incorporate neatly into the earth's system of normal modes of free oscillation. The excitation formulas for the rb modes are also obtained, based on normal mode theory. Physical implications of the results are summarized and the fundamental differences between rb modes and seismic modes are emphasized. In particular, it is ascertained that the Chandler wobble, being one of the rb modes belonging to the rotating earth, can be studied using the established theory of normal modes.

  19. Normal mode Rossby waves observed in the upper stratosphere

    NASA Technical Reports Server (NTRS)

    Hirooka, T.; Hirota, I.

    1985-01-01

    In recent years, observational evidence has been obtained for westward traveling planetary waves in the middle atmosphere with the aid of global data from satellites. There is no doubt that a fair portion of the observed traveling waves can be understood as the manifestation of the normal mode Rossby waves which are theoretically derived from the tidal theory. Some observational aspects of the structure and behavior of the normal mode Rossby waves in the upper stratosphere are reported. The data used are the global stratospheric geopotential thickness and height analyses which are derived mainly from the Stratospheric Sounding Units (SSUs) on board TIROS-N and NOAA satellites. A clear example of the influence of the normal mode Rossby wave on the mean flow is reported. The mechanism considered is interference between the normal mode Rossby wave and the quasi-stationary wave.

  20. Advanced Very High Resolution Radiometer Normalized Difference Vegetation Index Composites

    USGS Publications Warehouse

    ,

    2005-01-01

    The Advanced Very High Resolution Radiometer (AVHRR) is a broad-band scanner with four to six bands, depending on the model. The AVHRR senses in the visible, near-, middle-, and thermal-infrared portions of the electromagnetic spectrum. This sensor is carried on a series of National Oceanic and Atmospheric Administration (NOAA) Polar Orbiting Environmental Satellites (POES), beginning with the Television InfraRed Observation Satellite (TIROS-N) in 1978. Since 1989, the United States Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) has been mapping the vegetation condition of the United States and Alaska using satellite information from the AVHRR sensor. The vegetation condition composites, more commonly called greenness maps, are produced every week using the latest information on the growth and condition of the vegetation. One of the most important aspects of USGS greenness mapping is the historical archive of information dating back to 1989. This historical stretch of information has allowed the USGS to determine a 'normal' vegetation condition. As a result, it is possible to compare the current week's vegetation condition with normal vegetation conditions. An above normal condition could indicate wetter or warmer than normal conditions, while a below normal condition could indicate colder or drier than normal conditions. The interpretation of departure from normal will depend on the season and geography of a region.
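
    The index itself and the departure-from-normal comparison are simple to compute; the sketch below uses a difference from the multi-year mean, which is one common convention rather than the exact definition used in the EROS composites:

        import numpy as np

        def ndvi(nir, red, eps=1e-6):
            # Normalized Difference Vegetation Index from near-infrared and red
            # reflectance; values range from -1 to 1, higher meaning greener vegetation.
            nir = np.asarray(nir, dtype=float)
            red = np.asarray(red, dtype=float)
            return (nir - red) / (nir + red + eps)

        def departure_from_normal(current_ndvi, historical_ndvi_stack):
            # Compare the current composite with the multi-year mean ("normal") for the
            # same week; positive values indicate greener-than-normal conditions.
            return current_ndvi - np.mean(historical_ndvi_stack, axis=0)

        # toy 2x2 scene: the current composite and three prior years of the same week
        current = ndvi([[0.45, 0.40], [0.30, 0.50]], [[0.10, 0.12], [0.20, 0.08]])
        history = np.stack([
            ndvi([[0.40, 0.38], [0.33, 0.48]], [[0.12, 0.13], [0.18, 0.09]]),
            ndvi([[0.42, 0.39], [0.31, 0.49]], [[0.11, 0.12], [0.19, 0.08]]),
            ndvi([[0.41, 0.37], [0.32, 0.47]], [[0.12, 0.14], [0.20, 0.10]]),
        ])
        print(departure_from_normal(current, history))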

  1. Teaching Normal Birth Interactively

    PubMed Central

    Hotelling, Barbara A.

    2004-01-01

    In this column, the author provides examples of teaching strategies that childbirth educators may utilize to illustrate each of the six care practices supported by Lamaze International to promote normal birth: labor begins on its own, freedom of movement throughout labor, continuous labor support, no routine interventions, non-supine (e.g., upright or side-lying) positions for birth, and no separation of mother and baby with unlimited opportunity for breastfeeding. PMID:17273389

  2. Cross Correlation versus Normalized Mutual Information on Image Registration

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Tilton, James C.; Lin, Guoqing

    2016-01-01

    This is the first study to quantitatively assess and compare the cross correlation and normalized mutual information methods used to register images at the subpixel scale. The study shows that the normalized mutual information method is less sensitive to unaligned edges due to spectral response differences than is cross correlation. This characteristic makes normalized mutual information a better candidate for band-to-band registration. Improved band-to-band registration in the data from satellite-borne instruments will result in improved retrievals of key science measurements such as cloud properties, vegetation, snow and fire.
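
    The two similarity measures being compared can both be estimated directly from image intensities; the sketch below shows generic formulations (zero-mean normalized cross correlation, and NMI computed from a joint histogram with an arbitrary bin count), not the registration code used in the study:

        import numpy as np

        def normalized_cross_correlation(a, b):
            # Zero-mean normalized cross correlation between two equally sized images.
            a = a.astype(float).ravel() - a.mean()
            b = b.astype(float).ravel() - b.mean()
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        def normalized_mutual_information(a, b, bins=64):
            # NMI = (H(A) + H(B)) / H(A, B), estimated from a joint intensity histogram.
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)

            def entropy(p):
                p = p[p > 0]
                return -np.sum(p * np.log2(p))

            return (entropy(px) + entropy(py)) / entropy(pxy)

        # toy example: the same scene seen through two different spectral responses
        rng = np.random.default_rng(0)
        band_a = rng.random((128, 128))
        band_b = np.sqrt(band_a) + rng.normal(0, 0.01, band_a.shape)   # nonlinear response
        print(normalized_cross_correlation(band_a, band_b))
        print(normalized_mutual_information(band_a, band_b))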

  3. Return to normality after a radiological emergency.

    PubMed

    Lochard, J; Prêtre, S

    1995-01-01

    Some preliminary considerations on the management of post-accident situations involving large-scale and heavy land contamination are presented. The return to normal, or at least acceptable, living conditions as soon as reasonably achievable, and the prevention of the possible emergence of a post-accident crisis, are of key importance. A scheme is proposed for understanding the dynamics of the various phases after an accident. An attempt is made to characterize some of the parameters driving the acceptability of post-accident situations. Strategies to return to normal living conditions in contaminated areas are considered.

  4. The normalization of deviance in healthcare delivery

    PubMed Central

    Banja, John

    2009-01-01

    Many serious medical errors result from violations of recognized standards of practice. Over time, even egregious violations of standards of practice may become “normalized” in healthcare delivery systems. This article describes what leads to this normalization and explains why flagrant practice deviations can persist for years, despite the importance of the standards at issue. This article also provides recommendations to aid healthcare organizations in identifying and managing unsafe practice deviations before they become normalized and pose genuine risks to patient safety, quality care, and employee morale. PMID:20161685

  5. Reversible grasp reflexes in normal pressure hydrocephalus.

    PubMed

    Thomas, Rhys H; Bennetto, Luke; Silva, Mark T

    2009-05-01

    We present two cases of normal pressure hydrocephalus in combination with grasp reflexes. In both cases the grasp reflexes disappeared following high volume cerebrospinal fluid removal. In one of the cases the grasp reflexes returned over a period of weeks but again resolved following definitive cerebrospinal fluid shunting surgery, and remained absent until final follow up at 9 months. We hypothesise that resolving grasp reflexes following high volume CSF removal has both diagnostic and prognostic value in normal pressure hydrocephalus, encouraging larger studies on the relevance of primitive reflexes in NPH.

  6. Normal and abnormal human vestibular ocular function

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.; Black, F. O.

    1986-01-01

    The major motivation of this research is to understand the role the vestibular system plays in sensorimotor interactions which result in spatial disorientation and motion sickness. A second goal was to explore the range of abnormality as it is reflected in quantitative measures of vestibular reflex responses. The results of a study of vestibular reflex measurements in normal subjects and preliminary results in abnormal subjects are presented in this report. Statistical methods were used to define the range of normal responses, and determine age related changes in function.

  7. Quantitative RNFL attenuation coefficient measurements by RPE-normalized OCT data

    NASA Astrophysics Data System (ADS)

    Vermeer, K. A.; van der Schoot, J.; Lemij, H. G.; de Boer, J. F.

    2012-03-01

    We demonstrate significantly different scattering coefficients of the retinal nerve fiber layer (RNFL) between normal and glaucoma subjects. In clinical care, SD-OCT is routinely used to assess the RNFL thickness for glaucoma management. In this way, the full OCT data set is conveniently reduced to an easy-to-interpret output, matching results from older (non-OCT) instruments. However, OCT provides more data, such as the signal strength itself, which is due to backscattering in the retinal layers. For quantitative analysis, this signal should be normalized to adjust for local differences in the intensity of the beam that reaches the retina. In this paper, we introduce a model that relates the OCT signal to the attenuation coefficient of the tissue. The average RNFL signal (within an A-line) was then normalized based on the observed RPE signal, resulting in normalized RNFL attenuation coefficient maps. These maps showed local defects matching those found in thickness data. The average (normalized) RNFL attenuation coefficient of a fixed band around the optic nerve head was significantly lower in glaucomatous eyes than in normal eyes (3.0 mm⁻¹ vs. 4.9 mm⁻¹, P<0.01, Mann-Whitney test).
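
    A minimal sketch of the per-A-line normalization described here, dividing the mean RNFL signal by the mean RPE signal in the same A-line; the array layout, the segmentation masks, and the names are assumptions rather than the authors' implementation.

        import numpy as np

        def rpe_normalized_rnfl(bscan, rnfl_mask, rpe_mask, eps=1e-9):
            """One RPE-normalized RNFL value per A-line (column) of an OCT B-scan.

            bscan               : 2-D intensity image, depth x A-lines
            rnfl_mask, rpe_mask : boolean masks of the same shape marking the layers
            """
            mean_rnfl = np.where(rnfl_mask, bscan, 0.0).sum(axis=0) / np.maximum(rnfl_mask.sum(axis=0), 1)
            mean_rpe = np.where(rpe_mask, bscan, 0.0).sum(axis=0) / np.maximum(rpe_mask.sum(axis=0), 1)
            return mean_rnfl / (mean_rpe + eps)   # low values flag attenuated (thinned) RNFL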

  8. Evaluating acoustic speaker normalization algorithms: evidence from longitudinal child data.

    PubMed

    Kohn, Mary Elizabeth; Farrington, Charlie

    2012-03-01

    Speaker vowel formant normalization, a technique that controls for variation introduced by physical differences between speakers, is necessary in variationist studies to compare speakers of different ages, genders, and physiological makeup in order to understand non-physiological variation patterns within populations. Many algorithms have been established to reduce variation introduced into vocalic data from physiological sources. The lack of real-time studies tracking the effectiveness of these normalization algorithms from childhood through adolescence inhibits exploration of child participation in vowel shifts. This analysis compares normalization techniques applied to data collected from ten African American children across five time points. Linear regressions compare the reduction in variation attributable to age and gender for each speaker for the vowels BEET, BAT, BOT, BUT, and BOAR. A normalization technique is successful if it maintains variation attributable to a reference sociolinguistic variable, while reducing variation attributable to age. Results indicate that normalization techniques which rely on both a measure of central tendency and range of the vowel space perform best at reducing variation attributable to age, although some variation attributable to age persists after normalization for some sections of the vowel space. © 2012 Acoustical Society of America
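
    The study compares several published algorithms; as one concrete illustration of a technique built on a measure of central tendency and dispersion, the sketch below applies Lobanov z-score normalization within a single speaker (the data layout is an assumption, and Lobanov's method stands in for the unnamed algorithms compared in the study).

        import numpy as np

        def lobanov_normalize(formants):
            """Lobanov (z-score) normalization of one speaker's formant measurements.

            formants: dict mapping a formant name (e.g. 'F1', 'F2') to an array of
            measurements for that speaker. Each formant is centred on the speaker's
            mean and scaled by the speaker's standard deviation, which removes much
            of the variation driven by vocal tract size.
            """
            return {
                name: (np.asarray(values, float) - np.mean(values)) / np.std(values, ddof=1)
                for name, values in formants.items()
            }

    Applied separately at each of the five time points, the normalized values can then be entered into the age and gender regressions described above.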

  9. Developing Normal Turns-Amplitude Clouds for Upper and Lower Limbs.

    PubMed

    Jabre, Joe F; Nikolayev, Sergey G; Babayev, Michael B; Chindilov, Denis V; Muravyov, Anatoly Y

    2016-10-01

    Turns and amplitude analysis (T&A) is a frequently used method for automatic EMG interference pattern analysis. The T&A normal values have only been developed for a limited number of muscles. Our objective was to obtain normal T&A clouds for upper and lower extremity muscles for which no normal values exist in the literature. The T&A normative data using concentric needle electrodes were obtained from 68 men and 56 women aged 20 to 60 years. Normal upper and lower extremity T&A clouds were obtained and are presented in this article. The T&A normal values collected in this study may be used to detect neurogenic and myopathic abnormalities in men and women at low-to-moderate muscle contractions. The effect of turns-amplitude data obtained at high force levels of muscle contraction and its potential to falsely show neurogenic abnormalities are discussed.

  10. Informative graphing of continuous safety variables relative to normal reference limits.

    PubMed

    Breder, Christopher D

    2018-05-16

    Interpreting graphs of continuous safety variables can be complicated because differences in age, gender, and testing-site methodology may give rise to multiple reference limits. Furthermore, data below the lower limit of normal are compressed relative to those points above the upper limit of normal. The objective of this study is to develop a graphing technique that addresses these issues and is visually intuitive. A mock dataset with multiple reference ranges is initially used to develop the graphing technique. Formulas are developed for conditions where data are above the upper limit of normal, normal, below the lower limit of normal, and below the lower limit of normal when the data value equals zero. After the formulae are developed, an anonymized dataset from an actual set of trials for an approved drug is evaluated, comparing the technique developed in this study to standard graphical methods. Formulas are derived for the novel graphing method based on multiples of the normal limits. The formula for values scaled between the upper and lower limits of normal is a novel application of a readily available scaling formula. The formula for the lower limit of normal is novel and addresses the issue of this value potentially being indeterminate when the result to be scaled as a multiple is zero. The formulae and graphing method described in this study provide a visually intuitive way to graph continuous safety data, including laboratory values and vital sign data.
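
    The paper's formulas are not reproduced in this abstract, so the sketch below shows only one plausible scaling of this general kind, mapping each result to a signed value relative to its own reference limits; the treatment of the within-range, below-range, and zero cases is an assumption for illustration, not the published method.

        def scale_to_reference_limits(value, lln, uln, zero_floor=0.05):
            """Express a lab result relative to its lower/upper limits of normal (LLN/ULN).

            Illustrative convention:
              above the ULN  -> value / uln (2.0 means twice the ULN)
              within range   -> (value - lln) / (uln - lln), i.e. mapped onto 0..1
              below the LLN  -> -(lln / value), more negative the further below
            A zero result is clamped to a small fraction of the LLN so the
            below-range expression stays defined.
            """
            if value > uln:
                return value / uln
            if value >= lln:
                return (value - lln) / (uln - lln)
            return -(lln / max(value, zero_floor * lln))

    Because every result is expressed against its own limits, observations governed by different age-, sex-, or site-specific reference ranges can share one axis.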

  11. A normal ano-genital exam: sexual abuse or not?

    PubMed

    Hornor, Gail

    2010-01-01

    Sexual abuse is a problem of epidemic proportions in the United States. Pediatric nurse practitioners (PNPs) are at the forefront of providing care to children and families. The PNP is in a unique position to educate patients and families regarding sexual abuse and dispel common myths associated with sexual abuse. One such myth is that a normal ano-genital examination is synonymous with the absence of sexual abuse. This article will provide primary care providers, including PNPs, with a framework for understanding why a normal ano-genital examination does not negate the possibility of sexual abuse/assault. Normal ano-genital anatomy, changes that occur with puberty, and physical properties related to the genitalia and anus will be discussed. Photos will provide visualization of both normal variants of the pre-pubertal hymen and genitalia as well as changes that occur with puberty. Implications for practice for PNPs will be discussed.

  12. Quasi-Normal Modes of Stars and Black Holes.

    PubMed

    Kokkotas, Kostas D; Schmidt, Bernd G

    1999-01-01

    Perturbations of stars and black holes have been one of the main topics of relativistic astrophysics for the last few decades. They are of particular importance today, because of their relevance to gravitational wave astronomy. In this review we present the theory of quasi-normal modes of compact objects from both the mathematical and astrophysical points of view. The discussion includes perturbations of black holes (Schwarzschild, Reissner-Nordström, Kerr and Kerr-Newman) and relativistic stars (non-rotating and slowly-rotating). The properties of the various families of quasi-normal modes are described, and numerical techniques for calculating quasi-normal modes reviewed. The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.

  13. IRAS far-infrared colours of normal stars

    NASA Technical Reports Server (NTRS)

    Waters, L. B. F. M.; Cote, J.; Aumann, H. H.

    1987-01-01

    The analysis of IRAS observations at 12, 25, 60 and 100 microns of bright stars of spectral type O to M is presented. The objective is to identify the 'normal' stellar population and to characterize it in terms of the relationships between (B-V) and (V-[12]), between (R-I) and (V-[12]), and as a function of spectral type and luminosity class. A well-defined relation is found between the color of normal stars in the visual (B-V), (R-I) and in the IR, which does not depend on luminosity class. Using the (B-V), (V-[12]) relation for normal stars, it is found that B and M type stars show a large fraction of deviating stars, mostly with IR excess that is probably caused by circumstellar material. A comparison of IRAS colors with the Johnson colors as a function of spectral type shows good agreement except for the K0 to M5 type stars. The results will be useful in identifying the deviating stars detected with IRAS.

  14. Effect of transforming growth factor-beta1 on embryonic and posthatch muscle growth and development in normal and low score normal chicken.

    PubMed

    Li, X; Velleman, S G

    2009-02-01

    During skeletal muscle development, transforming growth factor-beta1 (TGF-beta1) is a potent inhibitor of muscle cell proliferation and differentiation. The TGF-beta1 signal is carried by Smad proteins into the cell nucleus, inhibiting the expression of key myogenic regulatory factors including MyoD and myogenin. However, the molecular mechanism by which TGF-beta1 inhibits muscle cell proliferation and differentiation has not been well documented in vivo. The present study investigated the effect of TGF-beta1 on in vivo skeletal muscle growth and development. A chicken line, Low Score Normal (LSN) with reduced muscling and upregulated TGF-beta1 expression, was used and compared to a normal chicken line. The injection of TGF-beta1 at embryonic day (ED) 3 significantly reduced the pectoralis major (p. major) muscle weight in the normal birds at 1 wk posthatch, whereas no significant difference was observed in the LSN birds. The difference between normal and LSN birds in response to TGF-beta1 is likely due to different levels of endogenous TGF-beta1 where the LSN birds have increased TGF-beta1 expression in their p. major muscle at both 17 ED and 6 wk posthatch. Smad3 expression was reduced by TGF-beta1 from 10 ED to 1 wk posthatch in normal p. major muscle. Unlike Smad3, Smad7 expression was not significantly affected by TGF-beta1 until posthatch in both normal and LSN p. major muscle. Expression of MyoD was reduced 35% by TGF-beta1 during embryonic development in normal p. major muscle, whereas LSN p. major muscle showed a delayed decrease at 1 d posthatch in MyoD expression in response to the TGF-beta1 treatment. Myogenin expression was reduced 29% by TGF-beta1 after hatch in normal p. major muscle. In LSN p. major muscle, TGF-beta1 treatment significantly decreased myogenin expression by 43% at 1 d posthatch and 32% at 1 wk posthatch. These data suggested that TGF-beta1 reduced p. major muscle growth by inhibiting MyoD and myogenin expression during both embryonic

  15. What is normal in normal aging? Effects of Aging, Amyloid and Alzheimer’s Disease on the Cerebral Cortex and the Hippocampus

    PubMed Central

    Fjell, Anders M.; McEvoy, Linda; Holland, Dominic; Dale, Anders M.; Walhovd, Kristine B

    2015-01-01

    What can be expected in normal aging, and where does normal aging stop and pathological neurodegeneration begin? With the slow progression of age-related dementias such as Alzheimer’s Disease (AD), it is difficult to distinguish age-related changes from effects of undetected disease. We review recent research on changes of the cerebral cortex and the hippocampus in aging and the borders between normal aging and AD. We argue that prominent cortical reductions are evident in fronto-temporal regions in elderly even with low probability of AD, including regions overlapping the default mode network. Importantly, these regions show high levels of amyloid deposition in AD, and are both structurally and functionally vulnerable early in the disease. This normalcy-pathology homology is critical to understand, since aging itself is the major risk factor for sporadic AD. Thus, rather than necessarily reflecting early signs of disease, these changes may be part of normal aging, and may inform on why the aging brain is so much more susceptible to AD than is the younger brain. We suggest that regions characterized by a high degree of life-long plasticity are vulnerable to detrimental effects of normal aging, and that this age-vulnerability renders them more susceptible to additional, pathological AD-related changes. We conclude that it will be difficult to understand AD without understanding why it preferably affects older brains, and that we need a model that accounts for age-related changes in AD-vulnerable regions independently of AD-pathology. PMID:24548606

  16. The use of normal forms for analysing nonlinear mechanical vibrations

    PubMed Central

    Neild, Simon A.; Champneys, Alan R.; Wagg, David J.; Hill, Thomas L.; Cammarano, Andrea

    2015-01-01

    A historical introduction is given of the theory of normal forms for simplifying nonlinear dynamical systems close to resonances or bifurcation points. The specific focus is on mechanical vibration problems, described by finite degree-of-freedom second-order-in-time differential equations. A recent variant of the normal form method, that respects the specific structure of such models, is recalled. It is shown how this method can be placed within the context of the general theory of normal forms provided the damping and forcing terms are treated as unfolding parameters. The approach is contrasted to the alternative theory of nonlinear normal modes (NNMs) which is argued to be problematic in the presence of damping. The efficacy of the normal form method is illustrated on a model of the vibration of a taut cable, which is geometrically nonlinear. It is shown how the method is able to accurately predict NNM shapes and their bifurcations. PMID:26303917

  17. Normal forms for reduced stochastic climate models

    PubMed Central

    Majda, Andrew J.; Franzke, Christian; Crommelin, Daan

    2009-01-01

    The systematic development of reduced low-dimensional stochastic climate models from observations or comprehensive high-dimensional climate models is an important topic for atmospheric low-frequency variability, climate sensitivity, and improved extended range forecasting. Here techniques from applied mathematics are utilized to systematically derive normal forms for reduced stochastic climate models for low-frequency variables. The use of a few Empirical Orthogonal Functions (EOFs) (also known as Principal Component Analysis, Karhunen–Loève and Proper Orthogonal Decomposition) depending on observational data to span the low-frequency subspace requires the assessment of dyad interactions besides the more familiar triads in the interaction between the low- and high-frequency subspaces of the dynamics. It is shown below that the dyad and multiplicative triad interactions combine with the climatological linear operator interactions to simultaneously produce both strong nonlinear dissipation and Correlated Additive and Multiplicative (CAM) stochastic noise. For a single low-frequency variable the dyad interactions and climatological linear operator alone produce a normal form with CAM noise from advection of the large scales by the small scales and simultaneously strong cubic damping. These normal forms should prove useful for developing systematic strategies for the estimation of stochastic models from climate data. As an illustrative example the one-dimensional normal form is applied below to low-frequency patterns such as the North Atlantic Oscillation (NAO) in a climate model. The results here also illustrate the shortcomings of a recent linear scalar CAM noise model proposed elsewhere for low-frequency variability. PMID:19228943
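
    As a concrete picture of the one-dimensional normal form described above (cubic damping plus correlated additive and multiplicative noise), the sketch below integrates a scalar SDE of that general shape with an Euler-Maruyama scheme; the coefficients are arbitrary placeholders, not values estimated from climate data.

        import numpy as np

        def simulate_cam_normal_form(n_steps=100_000, dt=1e-3, lam=-0.5, b=1.0,
                                     sigma_a=0.3, sigma_m=0.4, seed=0):
            """Integrate dx = (lam*x - b*x**3) dt + (sigma_a + sigma_m*x) dW.

            lam * x             : climatological linear (damped) dynamics
            -b * x**3           : strong cubic damping from the dyad interactions
            sigma_a + sigma_m*x : correlated additive and multiplicative (CAM) noise
            """
            rng = np.random.default_rng(seed)
            x = np.zeros(n_steps)
            for i in range(1, n_steps):
                dw = rng.normal(0.0, np.sqrt(dt))
                drift = lam * x[i - 1] - b * x[i - 1] ** 3
                diffusion = sigma_a + sigma_m * x[i - 1]
                x[i] = x[i - 1] + drift * dt + diffusion * dw
            return x   # its histogram is typically skewed, unlike a purely linear model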

  18. Selective Attention in Normal and Impaired Hearing

    PubMed Central

    Shinn-Cunningham, Barbara G.; Best, Virginia

    2008-01-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention. PMID:18974202

  19. Mean flow generation mechanism by inertial waves and normal modes

    NASA Astrophysics Data System (ADS)

    Will, Andreas; Ghasemi, Abouzar

    2016-04-01

    The mean flow generation mechanism by nonlinearity of the inertial normal modes and inertial wave beams in a rotating annular cavity with longitudinally librating walls in the stable regime is discussed. Inertial normal modes (standing waves) are excited when the libration frequency matches eigenfrequencies of the system. Inertial wave beams are produced by Ekman pumping and suction in a rotating cylinder and form periodic orbits or periodic ray trajectories at selected frequencies. Inertial wave beams emerge as concentrated shear layers in a librating annular cavity, while normal modes appear as global recirculation cells. Both (inertial wave beam and mode) are helical and thus intrinsically non-linear flow structures. No second mode or wave is necessary for non-linearity. We considered the low-order normal modes (1,1), (2,1) and (2,2), which are expected to be excited in planetary objects, and investigate the mean flow generation mechanism using two independent solutions: 1) the analytical solution (Borcia 2012) and 2) the wave component of the flow (ω0 component) obtained from direct numerical simulation (DNS). It is well known that a retrograde bulk mean flow is generated by the Ekman boundary layer and the E^(1/4) Stewartson layer close to the outer cylinder side wall due to libration. At and around the normal mode resonant frequencies we found additionally a prograde azimuthal mean flow (Inertial Normal Mode Mean Flow: INMMF) in the bulk of the fluid. The fluid in the bulk is in geostrophic balance in the absence of the inertial normal modes. However, when INMMF is excited, we found that the geostrophic balance does not hold in the region occupied by INMMF. We hypothesize that INMMF is generated by the nonlinearity of the normal modes or by second-order effects. Expanding the velocity V = (u_r, u_θ, u_z) and pressure p in a power series in ε (the libration amplitude), the Navier-Stokes equations are segregated into the linear and nonlinear parts at orders ε^1 and ε^2.

  20. CUILESS2016: a clinical corpus applying compositional normalization of text mentions.

    PubMed

    Osborne, John D; Neu, Matthew B; Danila, Maria I; Solorio, Thamar; Bethard, Steven J

    2018-01-10

    Traditionally text mention normalization corpora have normalized concepts to single ontology identifiers ("pre-coordinated concepts"). Less frequently, normalization corpora have used concepts with multiple identifiers ("post-coordinated concepts") but the additional identifiers have been restricted to a defined set of relationships to the core concept. This approach limits the ability of the normalization process to express semantic meaning. We generated a freely available corpus using post-coordinated concepts without a defined set of relationships that we term "compositional concepts" to evaluate their use in clinical text. We annotated 5397 disorder mentions from the ShARe corpus to SNOMED CT that were previously normalized as "CUI-less" in the "SemEval-2015 Task 14" shared task because they lacked a pre-coordinated mapping. Unlike the previous normalization method, we do not restrict concept mappings to a particular set of the Unified Medical Language System (UMLS) semantic types and allow normalization to occur to multiple UMLS Concept Unique Identifiers (CUIs). We computed annotator agreement and assessed semantic coverage with this method. We generated the largest clinical text normalization corpus to date with mappings to multiple identifiers and made it freely available. All but 8 of the 5397 disorder mentions were normalized using this methodology. Annotator agreement ranged from 52.4% using the strictest metric (exact matching) to 78.2% using a hierarchical agreement that measures the overlap of shared ancestral nodes. Our results provide evidence that compositional concepts can increase semantic coverage in clinical text. To our knowledge we provide the first freely available corpus of compositional concept annotation in clinical text.
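
    A minimal sketch of a hierarchical agreement measure of the kind described, scoring two annotators' concept sets by the overlap of their expanded ancestor sets; the ontology interface and the choice of Jaccard overlap are assumptions, not the metric used in the study.

        def hierarchical_agreement(concepts_a, concepts_b, ancestors):
            """Agreement between two annotations as overlap of shared ancestral nodes.

            concepts_a, concepts_b : sets of ontology identifiers chosen by each annotator
            ancestors              : function mapping an identifier to the set of its
                                     ancestors, including the identifier itself
            Partially matching mappings that share higher-level nodes still receive
            credit, unlike strict exact matching.
            """
            def expand(concepts):
                expanded = set()
                for concept in concepts:
                    expanded |= ancestors(concept)
                return expanded

            ea, eb = expand(concepts_a), expand(concepts_b)
            if not ea and not eb:
                return 1.0
            return len(ea & eb) / len(ea | eb)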

  1. 14 CFR 1216.306 - Actions normally requiring an EIS.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... normally requiring an EIS. (a) NASA will prepare an EIS for actions with the potential to significantly... action or mitigation of its potentially significant impacts. (b) Typical NASA actions normally requiring... material greater than the quantity for which the NASA Nuclear Flight Safety Assurance Manager may grant...

  2. Normalizing Catastrophe: Sustainability and Scientism

    ERIC Educational Resources Information Center

    Bonnett, Michael

    2013-01-01

    Making an adequate response to our deteriorating environmental situation is a matter of ever increasing urgency. It is argued that a central obstacle to achieving this is the way that scientism has become normalized in our thinking about environmental issues. This is taken to reflect on an underlying "metaphysics of mastery" that vitiates proper…

  3. Metabolic differences between short children with GH peak levels in the lower normal range and healthy children of normal height.

    PubMed

    Tidblad, Anders; Gustafsson, Jan; Marcus, Claude; Ritzén, Martin; Ekström, Klas

    2017-06-01

    Severe growth hormone deficiency (GHD) leads to several metabolic effects in the body ranging from abnormal body composition to biochemical disturbances. However, less is known regarding these parameters in short children with GH peak levels in the lower normal range during provocation tests. Our aim was to study the metabolic profile of this group and compare it with that of healthy children of normal height. Thirty-five pre-pubertal short children (<-2.5 SDS) aged between 7 and 10 years, with peak levels of GH between 7 and 14 μg/L in an arginine insulin tolerance test (AITT), were compared with twelve age- and sex-matched children of normal height. The metabolic profile of the subjects was analysed by blood samples, DEXA, frequently sampled intravenous glucose tolerance test, microdialysis and stable isotope examinations of rates of glucose production and lipolysis. There were no overall significant metabolic differences between the groups. However, in the subgroup analysis, the short children with GH peaks <10 μg/L had significantly lower fasting insulin levels, which also correlated with other metabolic parameters. The short pre-pubertal children with GH peak levels between 7 and 14 μg/L did not differ significantly from healthy children of normal height, but subpopulations within this group showed significant metabolic differences. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Normalization in Lie algebras via mould calculus and applications

    NASA Astrophysics Data System (ADS)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.

  5. Measuring and Estimating Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2013-01-01

    Infrared flash thermography (IRFT) is used to detect void-like flaws in a test object. The IRFT technique involves heating the part surface with a pulse from flash lamps. The post-flash evolution of the part surface temperature is sensed by an IR camera in terms of the pixel intensities of the image. The IR technique involves recording the IR video image data and analyzing the data using the normalized pixel intensity and temperature contrast analysis method to characterize void-like flaws for depth and width. This work introduces a new definition of the normalized IR pixel intensity contrast and normalized surface temperature contrast. A procedure is provided to compute the pixel intensity contrast from the camera pixel intensity evolution data. The pixel intensity contrast and the corresponding surface temperature contrast differ but are related. This work provides a method to estimate the temperature evolution and the normalized temperature contrast from the measured pixel intensity evolution data and some additional measurements made during data acquisition.
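
    For orientation, the sketch below computes the conventional running contrast against a defect-free reference region; it is not the new contrast definition introduced by this work, and the array names are assumptions.

        import numpy as np

        def conventional_normalized_contrast(pixel_ts, reference_ts, eps=1e-9):
            """Conventional normalized contrast evolution for one pixel.

            pixel_ts     : post-flash intensity evolution of the inspected pixel
            reference_ts : intensity evolution of a defect-free (sound) region
            A void-like flaw shows up as a positive excursion whose timing and
            amplitude are related to the flaw's depth and width.
            """
            pixel_ts = np.asarray(pixel_ts, dtype=float)
            reference_ts = np.asarray(reference_ts, dtype=float)
            return (pixel_ts - reference_ts) / (reference_ts + eps)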

  6. Trojan dynamics well approximated by a new Hamiltonian normal form

    NASA Astrophysics Data System (ADS)

    Páez, Rocío Isabel; Locatelli, Ugo

    2015-10-01

    We revisit a classical perturbative approach to the Hamiltonian related to the motions of Trojan bodies, in the framework of the planar circular restricted three-body problem, by introducing a number of key new ideas in the formulation. In some sense, we adapt the approach of Garfinkel to the context of the normal form theory and its modern techniques. First, we make use of Delaunay variables for a physically accurate representation of the system. Therefore, we introduce a novel manipulation of the variables so as to respect the natural behaviour of the model. We develop a normalization procedure over the fast angle which exploits the fact that singularities in this model are essentially related to the slow angle. Thus, we produce a new normal form, i.e. an integrable approximation to the Hamiltonian. We emphasize some practical examples of the applicability of our normalizing scheme, e.g. the estimation of the stable libration region. Finally, we compare the level curves produced by our normal form with surfaces of section provided by the integration of the non-normalized Hamiltonian, with very good agreement. Further precision tests are also provided. In addition, we give a step-by-step description of the algorithm, allowing for extensions to more complicated models.

  7. Normal modes of the shallow water system on the cubed sphere

    NASA Astrophysics Data System (ADS)

    Kang, H. G.; Cheong, H. B.; Lee, C. H.

    2017-12-01

    Spherical harmonics, expressed as the Rossby-Haurwitz waves, are the normal modes of the non-divergent barotropic model. Among the normal modes in a numerical model, the most unstable mode will contaminate the numerical results, and therefore the investigation of normal modes for a given grid system and discretization method is important. The cubed-sphere grid, which consists of six identical faces, has been widely adopted in many atmospheric models. This grid system is non-orthogonal, so that calculation of the normal modes is quite a challenging problem. In the present study, the normal modes of the shallow water system on the cubed sphere, discretized by the spectral element method employing the Gauss-Lobatto Lagrange interpolating polynomials as orthogonal basis functions, are investigated. The algebraic equations for the shallow water equations on the cubed sphere are derived, and the huge global matrix is constructed. The linear system representing the eigenvalue-eigenvector relations is solved by numerical libraries. The normal modes calculated for several horizontal resolutions and Lamb parameters will be discussed and compared to the normal modes from the spherical harmonics spectral method.
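
    Once the spectral-element discretization has been assembled into global matrices, the linear normal modes follow from a generalized eigenvalue problem; the sketch below shows only that final step, with the matrix assembly omitted and the A v = lambda B v form taken as an assumption about how the system is written.

        import numpy as np
        from scipy.linalg import eig

        def normal_modes(A, B):
            """Normal modes of the discretized linear system A v = lambda B v.

            A holds the discretized linearized shallow-water operator and B the mass
            matrix from the Gauss-Lobatto basis; eigenvalues give mode frequencies,
            eigenvectors give mode structures. Returned sorted by |lambda|.
            """
            lam, vec = eig(A, B)
            order = np.argsort(np.abs(lam))
            return lam[order], vec[:, order]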

  8. A Compendium of Canine Normal Tissue Gene Expression

    PubMed Central

    Chen, Qing-Rong; Wen, Xinyu; Khan, Javed; Khanna, Chand

    2011-01-01

    Background Our understanding of disease is increasingly informed by changes in gene expression between normal and abnormal tissues. The release of the canine genome sequence in 2005 provided an opportunity to better understand human health and disease using the dog as clinically relevant model. Accordingly, we now present the first genome-wide, canine normal tissue gene expression compendium with corresponding human cross-species analysis. Methodology/Principal Findings The Affymetrix platform was utilized to catalogue gene expression signatures of 10 normal canine tissues including: liver, kidney, heart, lung, cerebrum, lymph node, spleen, jejunum, pancreas and skeletal muscle. The quality of the database was assessed in several ways. Organ defining gene sets were identified for each tissue and functional enrichment analysis revealed themes consistent with known physio-anatomic functions for each organ. In addition, a comparison of orthologous gene expression between matched canine and human normal tissues uncovered remarkable similarity. To demonstrate the utility of this dataset, novel canine gene annotations were established based on comparative analysis of dog and human tissue selective gene expression and manual curation of canine probeset mapping. Public access, using infrastructure identical to that currently in use for human normal tissues, has been established and allows for additional comparisons across species. Conclusions/Significance These data advance our understanding of the canine genome through a comprehensive analysis of gene expression in a diverse set of tissues, contributing to improved functional annotation that has been lacking. Importantly, it will be used to inform future studies of disease in the dog as a model for human translational research and provides a novel resource to the community at large. PMID:21655323

  9. Log-Normal Turbulence Dissipation in Global Ocean Models

    NASA Astrophysics Data System (ADS)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
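
    A minimal sketch of how approximate log-normality can be summarized from a sample of dissipation values: for an exactly log-normal field the logarithm is Gaussian, so its skewness and excess kurtosis vanish, and the reported departures live in those two moments (variable names are assumptions).

        import numpy as np
        from scipy import stats

        def lognormality_summary(dissipation):
            """Moments of log(dissipation); skewness and excess kurtosis are zero for exact log-normality."""
            log_eps = np.log(np.asarray(dissipation, dtype=float))
            return {
                "mean_log": float(np.mean(log_eps)),
                "std_log": float(np.std(log_eps, ddof=1)),
                "skewness_log": float(stats.skew(log_eps)),
                "excess_kurtosis_log": float(stats.kurtosis(log_eps)),  # Fisher definition
            }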

  10. The Snail Family in Normal and Malignant Haematopoiesis.

    PubMed

    Carmichael, Catherine L; Haigh, Jody J

    2017-01-01

    Snail family proteins are key inducers of the epithelial-mesenchymal transition (EMT), a critical process required for normal embryonic development. They have also been strongly implicated in regulating the EMT-like processes required for tumour cell invasion, migration, and metastasis. Whether these proteins also contribute to normal blood cell development, however, remains to be clearly defined. Increasing evidence supports a role for the Snail family in regulating cell survival, migration, and differentiation within the haematopoietic system, as well as potentially an oncogenic role in the malignant transformation of haematopoietic stem cells. This review will provide a broad overview of the Snail family, including key aspects of their involvement in the regulation and development of solid organ cancer, as well as a discussion on our current understanding of Snail family function during normal and malignant haematopoiesis. © 2017 S. Karger AG, Basel.

  11. Developing Visualization Support System for Teaching/Learning Database Normalization

    ERIC Educational Resources Information Center

    Folorunso, Olusegun; Akinwale, AdioTaofeek

    2010-01-01

    Purpose: In tertiary institution, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in database normalization process. Design/methodology/approach: The model-view-controller architecture…

  12. Morphological Differences Between Seyfert Hosts and Normal Galaxies

    NASA Astrophysics Data System (ADS)

    Shlosman, Isaac

    Using new sub-arcsecond resolution imaging we compare large-scale stellar bar fraction in CfA sample of Seyferts and a closely matched control sample of normal galaxies. We find a difference between the samples on the 2.5σ level. We further compare the axial ratios of bars in all available samples quoted in the literature and find a deficiency of small axial ratio bars in Seyferts compared to normal galaxies.

  13. Comparison of Social Interaction between Cochlear-Implanted Children with Normal Intelligence Undergoing Auditory Verbal Therapy and Normal-Hearing Children: A Pilot Study.

    PubMed

    Monshizadeh, Leila; Vameghi, Roshanak; Sajedi, Firoozeh; Yadegari, Fariba; Hashemi, Seyed Basir; Kirchem, Petra; Kasbi, Fatemeh

    2018-04-01

    A cochlear implant is a device that helps hearing-impaired children by transmitting sound signals to the brain and helping them improve their speech, language, and social interaction. Although various studies have investigated the different aspects of speech perception and language acquisition in cochlear-implanted children, little is known about their social skills, particularly Persian-speaking cochlear-implanted children. Considering the growing number of cochlear implants being performed in Iran and the increasing importance of developing near-normal social skills as one of the ultimate goals of cochlear implantation, this study was performed to compare the social interaction between Iranian cochlear-implanted children who have undergone rehabilitation (auditory verbal therapy) after surgery and normal-hearing children. This descriptive-analytical study compared the social interaction level of 30 children with normal hearing and 30 with cochlear implants who were conveniently selected. The Raven test was administered to the both groups to ensure normal intelligence quotient. The social interaction status of both groups was evaluated using the Vineland Adaptive Behavior Scale, and statistical analysis was performed using Statistical Package for Social Sciences (SPSS) version 21. After controlling age as a covariate variable, no significant difference was observed between the social interaction scores of both the groups (p > 0.05). In addition, social interaction had no correlation with sex in either group. Cochlear implantation followed by auditory verbal rehabilitation helps children with sensorineural hearing loss to have normal social interactions, regardless of their sex.

  14. Speech Rate Normalization and Phonemic Boundary Perception in Cochlear-Implant Users

    ERIC Educational Resources Information Center

    Jaekel, Brittany N.; Newman, Rochelle S.; Goupell, Matthew J.

    2017-01-01

    Purpose: Normal-hearing (NH) listeners rate normalize, temporarily remapping phonemic category boundaries to account for a talker's speech rate. It is unknown if adults who use auditory prostheses called cochlear implants (CI) can rate normalize, as CIs transmit degraded speech signals to the auditory nerve. Ineffective adjustment to rate…

  15. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  16. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  17. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  18. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  19. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  20. Model-Based Normalization of a Fractional-Crystal Collimator for Small-Animal PET Imaging

    PubMed Central

    Li, Yusheng; Matej, Samuel; Karp, Joel S.; Metzler, Scott D.

    2017-01-01

    Previously, we proposed to use a coincidence collimator to achieve fractional-crystal resolution in PET imaging. We have designed and fabricated a collimator prototype for a small-animal PET scanner, A-PET. To compensate for imperfections in the fabricated collimator prototype, collimator normalization, as well as scanner normalization, is required to reconstruct quantitative and artifact-free images. In this study, we develop a normalization method for the collimator prototype based on the A-PET normalization using a uniform cylinder phantom. We performed data acquisition without the collimator for scanner normalization first, and then with the collimator from eight different rotation views for collimator normalization. After a reconstruction without correction, we extracted the cylinder parameters from which we generated expected emission sinograms. Single scatter simulation was used to generate the scattered sinograms. We used the least-squares method to generate the normalization coefficient for each LOR based on measured, expected and scattered sinograms. The scanner and collimator normalization coefficients were factorized by performing two normalizations separately. The normalization methods were also verified using experimental data acquired from A-PET with and without the collimator. In summary, we developed a model-based collimator normalization that can significantly reduce variance and produce collimator normalization with adequate statistical quality within feasible scan time. PMID:29270539

  1. Model-Based Normalization of a Fractional-Crystal Collimator for Small-Animal PET Imaging.

    PubMed

    Li, Yusheng; Matej, Samuel; Karp, Joel S; Metzler, Scott D

    2017-05-01

    Previously, we proposed to use a coincidence collimator to achieve fractional-crystal resolution in PET imaging. We have designed and fabricated a collimator prototype for a small-animal PET scanner, A-PET. To compensate for imperfections in the fabricated collimator prototype, collimator normalization, as well as scanner normalization, is required to reconstruct quantitative and artifact-free images. In this study, we develop a normalization method for the collimator prototype based on the A-PET normalization using a uniform cylinder phantom. We performed data acquisition without the collimator for scanner normalization first, and then with the collimator from eight different rotation views for collimator normalization. After a reconstruction without correction, we extracted the cylinder parameters from which we generated expected emission sinograms. Single scatter simulation was used to generate the scattered sinograms. We used the least-squares method to generate the normalization coefficient for each LOR based on measured, expected and scattered sinograms. The scanner and collimator normalization coefficients were factorized by performing two normalizations separately. The normalization methods were also verified using experimental data acquired from A-PET with and without the collimator. In summary, we developed a model-based collimator normalization that can significantly reduce variance and produce collimator normalization with adequate statistical quality within feasible scan time.
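
    A minimal sketch of the per-LOR least-squares step described here, estimating one normalization coefficient per line of response from the measured, expected, and scattered sinograms stacked over the rotation views; the array shapes and names are assumptions, not the authors' code.

        import numpy as np

        def lor_normalization_coefficients(measured, expected, scattered, eps=1e-12):
            """Least-squares normalization coefficient for each line of response (LOR).

            measured, expected, scattered : arrays of shape (n_views, n_lors)
            For each LOR, choose c minimizing sum_v (c * t_v - expected_v)**2 with
            t = measured - scattered, giving c = sum(t * expected) / sum(t * t).
            """
            trues = measured - scattered
            numerator = (trues * expected).sum(axis=0)
            denominator = (trues * trues).sum(axis=0)
            return numerator / np.maximum(denominator, eps)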

  2. Comparison of CSF Distribution between Idiopathic Normal Pressure Hydrocephalus and Alzheimer Disease.

    PubMed

    Yamada, S; Ishikawa, M; Yamamoto, K

    2016-07-01

    CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating

  3. Differences in otosclerotic and normal human stapedial osteoblast properties are normalized by alendronate in vitro.

    PubMed

    Gronowicz, Gloria; Richardson, Yvonne L; Flynn, John; Kveton, John; Eisen, Marc; Leonard, Gerald; Aronow, Michael; Rodner, Craig; Parham, Kourosh

    2014-10-01

    Identify and compare phenotypic properties of osteoblasts from patients with otosclerosis (OSO), normal bones (HOB), and normal stapes (NSO) to determine a possible cause for OSO hypermineralization and assess any effects of the bisphosphonate, alendronate. OSO (n = 11), NSO (n = 4), and HOB (n = 13) cultures were assayed for proliferation, adhesion, mineralization, and gene expression with and without 10⁻¹⁰ M to 10⁻⁸ M alendronate. Academic hospital. Cultures were matched for age, sex, and passage number. Cell attachment and proliferation + alendronate were determined by Coulter counting cells and assaying tritiated thymidine uptake, respectively. At 7, 14, and 21 days of culture + alendronate, calcium content and gene expression by quantitative reverse transcription-polymerase chain reaction (qRT-PCR) were determined. OSO had significantly more cells adhere but less proliferation than NSO or HOB. Calcification was significantly increased in OSO compared to HOB and NSO. NSO and HOB had similar cell adhesion and proliferation rates. A dose-dependent effect of alendronate on OSO adhesion, proliferation, and mineralization was found, resulting in levels equal to NSO and HOB. All cultures expressed osteoblast-specific genes such as RUNX2, alkaline phosphatase, type I collagen, and osteocalcin. However, osteopontin was dramatically reduced, 9.4-fold at 14 days, in OSO compared to NSO. Receptor activator of nuclear factor κB ligand/osteoprotegerin (RANKL/OPG), important in bone resorption, was elevated in OSO with decreased OPG levels. Alendronate had little effect on gene expression in HOB but in OSO increased osteopontin levels and decreased RANKL/OPG. OSO cultures displayed properties of hypermineralization due to decreased osteopontin (OPN) and also had increased RANKL/OPG, which were normalized by alendronate. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2014.

  4. Predicting normal tissue radiosensitivity

    NASA Astrophysics Data System (ADS)

    Dickson, Jeanette

    Two methods of predicting normal cell radiosensitivity were investigated in different patient groups. Plasma transforming growth factor beta one (TGFbeta1) levels were measured by ELISA, using a commercially available kit. Residual DNA double strand breaks were measured in normal epidermal fibroblasts following 150 Gy. After allowing 24 hours for repair, the DNA damage was assayed using pulsed field gel electrophoresis (PFGE). Pretreatment plasma TGFbeta1 levels were investigated retrospectively in patients with carcinoma of the cervix in relation to tumour control and late morbidity following radiotherapy. Plasma TGFbeta1 levels increased with increasing disease stage. They also correlated with two other known measures of tumour burden, i.e. plasma levels of carcinoma antigen 125 (CA125) and tissue polypeptide antigen (TPA). Elevated pretreatment plasma TGFbeta1 levels predicted for a poor outcome both in terms of local control and overall survival. Plasma TGFbeta1 levels did not predict for the development of radiotherapy morbidity of any grade. In conclusion pre-treatment plasma TGFbeta1 levels predict for tumour burden and tumour outcome in patients with carcinoma of the cervix. Changes in plasma TGFbeta1 levels measured prospectively may predict for radiation morbidity and should be investigated. A prospective study was undertaken in patients with carcinoma of the head and neck region. Changes in plasma TGFbeta1 levels between the start and the end of a course of radical radiotherapy were investigated in relation to the development of acute radiation toxicity. Patients were categorised according to the pattern of response of their TGFbeta1 levels over the course of their treatment. Those patients whose TGFbeta1 levels decreased, but did not normalise during radiotherapy were assigned to category 2. Category 2 predicted for a severe acute reaction, as measured using the LENT SOMA score, with a sensitivity of 33% and a specificity of 100%. The positive predictive

  5. Localized Energy-Based Normalization of Medical Images: Application to Chest Radiography.

    PubMed

    Philipsen, R H H M; Maduskar, P; Hogeweg, L; Melendez, J; Sánchez, C I; van Ginneken, B

    2015-09-01

    Automated quantitative analysis systems for medical images often lack the capability to successfully process images from multiple sources. Normalization of such images prior to further analysis is a possible solution to this limitation. This work presents a general method to normalize medical images and thoroughly investigates its effectiveness for chest radiography (CXR). The method starts with an energy decomposition of the image in different bands. Next, each band's localized energy is scaled to a reference value and the image is reconstructed. We investigate iterative and local application of this technique. The normalization is applied iteratively to the lung fields on six datasets from different sources, each comprising 50 normal CXRs and 50 abnormal CXRs. The method is evaluated in three supervised computer-aided detection tasks related to CXR analysis and compared to two reference normalization methods. In the first task, automatic lung segmentation, the average Jaccard overlap significantly increased from 0.72±0.30 and 0.87±0.11 for both reference methods to with normalization. The second experiment was aimed at segmentation of the clavicles. The reference methods had an average Jaccard index of 0.57±0.26 and 0.53±0.26; with normalization this significantly increased to . The third experiment was detection of tuberculosis related abnormalities in the lung fields. The average area under the Receiver Operating Curve increased significantly from 0.72±0.14 and 0.79±0.06 using the reference methods to with normalization. We conclude that the normalization can be successfully applied in chest radiography and makes supervised systems more generally applicable to data from different sources.
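
    A minimal sketch of the general idea, splitting an image into Gaussian band-pass components and rescaling each band so its local energy matches a reference value; the band scales, the local-energy window, and the reference value are assumptions rather than the published settings.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def energy_normalize(image, sigmas=(1, 2, 4, 8), reference_energy=1.0, eps=1e-9):
            """Scale each band of a Gaussian band-pass decomposition to a reference local energy."""
            image = image.astype(float)
            bands, previous = [], image
            for sigma in sigmas:
                smoothed = gaussian_filter(image, sigma)
                bands.append(previous - smoothed)        # band-pass component
                previous = smoothed
            low_pass = previous                          # residual coarse structure

            normalized = np.zeros_like(image)
            for band in bands:
                local_energy = np.sqrt(gaussian_filter(band ** 2, sigmas[-1])) + eps
                normalized += band * (reference_energy / local_energy)
            return normalized + low_pass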

  6. Calculation of grain boundary normals directly from 3D microstructure images

    DOE PAGES

    Lieberman, E. J.; Rollett, A. D.; Lebensohn, R. A.; ...

    2015-03-11

    The determination of grain boundary normals is an integral part of the characterization of grain boundaries in polycrystalline materials. These normal vectors are difficult to quantify due to the discretized nature of available microstructure characterization techniques. The most common method to determine grain boundary normals is by generating a surface mesh from an image of the microstructure, but this process can be slow, and is subject to smoothing issues. A new technique is proposed, utilizing first order Cartesian moments of binary indicator functions, to determine grain boundary normals directly from a voxelized microstructure image. In order to validate the accuracy of this technique, the surface normals obtained by the proposed method are compared to those generated by a surface meshing algorithm. Specifically, the local divergence between the surface normals obtained by different variants of the proposed technique and those generated from a surface mesh of a synthetic microstructure constructed using a marching cubes algorithm followed by Laplacian smoothing is quantified. Next, surface normals obtained with the proposed method from a measured 3D microstructure image of a Ni polycrystal are used to generate grain boundary character distributions (GBCD) for Σ3 and Σ9 boundaries, and compared to the GBCD generated using a surface mesh obtained from the same image. Finally, the results show that the proposed technique is an efficient and accurate method to determine voxelized fields of grain boundary normals.
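
    A minimal sketch of the moment-based idea: inside a small window centred on a boundary voxel, the first Cartesian moment (centroid) of the grain's binary indicator function points into the grain, so its negation approximates the outward boundary normal (window size, edge handling, and sign convention are assumptions, not the published variants).

        import numpy as np

        def boundary_normal_from_moments(indicator, voxel, half_width=2):
            """Estimate the outward boundary normal at a voxel of a 3-D binary grain image.

            indicator : 3-D array, 1 inside the grain of interest, 0 elsewhere
            voxel     : (i, j, k) index of a voxel on the grain's boundary, assumed to
                        lie at least half_width voxels away from the image border
            """
            i, j, k = voxel
            w = half_width
            window = indicator[i - w:i + w + 1, j - w:j + w + 1, k - w:k + w + 1]
            inside = np.argwhere(window > 0)                     # grain voxels in the window
            offset = inside.mean(axis=0) - np.array([w, w, w])   # first moment, points into the grain
            normal = -offset
            return normal / np.linalg.norm(normal)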

  7. Cy5 total protein normalization in Western blot analysis.

    PubMed

    Hagner-McWhirter, Åsa; Laurin, Ylva; Larsson, Anita; Bjerneld, Erik J; Rönn, Ola

    2015-10-01

    Western blotting is a widely used method for analyzing specific target proteins in complex protein samples. Housekeeping proteins are often used for normalization to correct for uneven sample loads, but these require careful validation since expression levels may vary with cell type and treatment. We present a new, more reliable method for normalization using Cy5-prelabeled total protein as a loading control. We used a prelabeling protocol based on Cy5 N-hydroxysuccinimide ester labeling that produces a linear signal response. We obtained a low coefficient of variation (CV) of 7% between the ratio of extracellular signal-regulated kinase (ERK1/2) target to Cy5 total protein control signals over the whole loading range from 2.5 to 20.0μg of Chinese hamster ovary cell lysate protein. Corresponding experiments using actin or tubulin as controls for normalization resulted in CVs of 13 and 18%, respectively. Glyceraldehyde-3-phosphate dehydrogenase did not produce a proportional signal and was not suitable for normalization in these cells. A comparison of ERK1/2 signals from labeled and unlabeled samples showed that Cy5 prelabeling did not affect antibody binding. By using total protein normalization we analyzed PP2A and Smad2/3 levels with high confidence. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. BIOCHEMISTRY OF NORMAL AND IRRADIATED STRAINS OF HYMENOLEPIS DIMINUTA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fairbairn, D.; Wertheim, G.; Harpur, R.P.

    1961-09-01

    An irradiated strain of H. diminuta was developed in which morphological anomalies persisted for at least 7 generations. This and the normal strain from which it was derived were compared for biochemical differences, which might lead to the discovery of a biochemical lesion. No significant differences were found in fresh weights between normal and irradiated strains of H. diminuta. Samples of H. diminuta were then prepared, and their composition determined. There was a notable loss of carbohydrates by tapeworms during 24 hr of in vivo fasting, amounting to 56% and 62% in the normal and irradiated strains, respectively. On the other hand, lipids increased by 10% and protein by 4% in both strains, which suggests that the substances were not concerned with energy metabolism during starvation. The glycogen of both normal and irradiated strains of H. diminuta obtained from fed or fasted rats, determined directly or after maintenance of the parasites in glucose-saline, accounted for 99% of the alkali-stable carbohydrates, which in turn comprised about 96% of the total carbohydrates. In general, no notable differences in the growth, chemical composition, or gross metabolism between normal and irradiated strains of H. diminuta were recognized. Thus, the morphological changes due to irradiation previously described are the reflection of biochemical events. (H.H.D.)

  9. Computerized mapping of fibrillation in normal ventricular myocardium

    NASA Astrophysics Data System (ADS)

    Chen, Peng-Sheng; Garfinkel, Alan; Weiss, James N.; Karagueuzian, Hrayr S.

    1998-03-01

    It is well known that the ability to fibrillate is intrinsic to a normal ventricle that exceeds a critical mass. The questions we address are how is ventricular fibrillation (VF) initiated and perpetuated in normal myocardium, and why is VF not seen more often in the general population if all ventricles have the ability to fibrillate. To study the mechanisms of VF, we used computerized mapping techniques with up to 512 channels of simultaneous multisite recordings for data acquisition. The data were then processed for dynamic display of the activation patterns and for mathematical analyses of the activation intervals. The results show that in normal ventricles, VF can be initiated by a single strong premature stimulus given during the vulnerable period of the cardiac cycle. The initial activations form a figure-eight pattern. Afterward, VF will perpetuate itself without any outside help. The self-perpetuation itself is due to at least two factors. One is that single wave fronts spontaneously break up into two or more wavelets. The second is that when two wavelets intersect perpendicular to each other, the second wavelet is broken by the residual refractoriness left over from the first wavelet. Mathematical analyses of the patterns of activation during VF revealed that VF is a form of chaos, and that transition from ventricular tachycardia (VT) to VF occurs via the quasiperiodic route. In separate experiments, we found that we can convert VF to VT by tissue size reduction. The physiological mechanism associated with the latter transition appears to be the reduction of the number of reentrant wave fronts and wandering wavelets. Based on these findings, we propose that the reentrant wave fronts and the wandering wavelets serve as the physiological equivalent of coupled oscillators. A minimal number of oscillators is needed for VF to perpetuate itself, and to generate chaotic dynamics; hence a critical mass is required to perpetuate VF. We conclude that VF in normal

  10. Ectodermal fragments from normal frog gastrulae condition substrata to support normal and hybrid mesodermal cell migration in vitro.

    PubMed

    Nakatsuji, N; Johnson, K E

    1984-06-01

    Using time-lapse cinemicrography and scanning electron microscopy, we have shown that normal Rana embryos and gastrulating hybrid embryos have extracellular fibrils on the inner surface of the ectodermal layer. These fibrils are absent prior to gastrulation and appear in increasing numbers during gastrulation. They can also be deposited in vitro where they condition substrata in such a way that normal presumptive mesodermal cells placed on them show extensive attachment and unoriented cell movement. These fibrils are also present in some arrested hybrid embryos, but in reduced numbers, or are lacking in other arrested hybrid embryos. Explanted ectodermal fragments from arrested hybrid embryos fail both to condition culture substrata by the deposition of fibrils and to promote cell attachment and translocation. In contrast, ectodermal fragments from normal embryos can condition culture substrata so as to promote moderate cell attachment and, for one particular gamete combination, even cell translocation of presumptive mesodermal cells taken from arrested hybrid embryos. These results provide new evidence to support the hypothesis that extracellular fibrils represent a system that promotes mesodermal cell migration in amphibian embryos. Differences in the fibrillar system in urodele and anuran embryos are discussed in relation to fundamental differences in the mode of mesodermal cell migration in these two classes of Amphibia.

  11. Optimal consistency in microRNA expression analysis using reference-gene-based normalization.

    PubMed

    Wang, Xi; Gardiner, Erin J; Cairns, Murray J

    2015-05-01

    Normalization of high-throughput molecular expression profiles secures differential expression analysis between samples of different phenotypes or biological conditions, and facilitates comparison between experimental batches. While the same general principles apply to microRNA (miRNA) normalization, there is mounting evidence that global shifts in their expression patterns occur in specific circumstances, which pose a challenge for normalizing miRNA expression data. As an alternative to global normalization, which has the propensity to flatten large trends, normalization against constitutively expressed reference genes presents an advantage through their relative independence. Here we investigated the performance of reference-gene-based (RGB) normalization for differential miRNA expression analysis of microarray expression data, and compared the results with other normalization methods, including: quantile, variance stabilization, robust spline, simple scaling, rank invariant, and Loess regression. The comparative analyses were executed using miRNA expression in tissue samples derived from subjects with schizophrenia and non-psychiatric controls. We proposed a consistency criterion for evaluating methods by examining the overlapping of differentially expressed miRNAs detected using different partitions of the whole data. Based on this criterion, we found that RGB normalization generally outperformed global normalization methods. Thus we recommend the application of RGB normalization for miRNA expression data sets, and believe that this will yield a more consistent and useful readout of differentially expressed miRNAs, particularly in biological conditions characterized by large shifts in miRNA expression.
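
    As a rough illustration of the reference-gene idea only (this generic sketch is not the authors' pipeline; working in log2 space and using a simple mean over reference probes are assumptions made for the example):

        import numpy as np

        def rgb_normalize(log2_expr, reference_rows):
            """Reference-gene-based normalization of a miRNA expression matrix.

            log2_expr      : (miRNAs x samples) array of log2 intensities
            reference_rows : row indices of putatively constitutive reference miRNAs
            Each sample is shifted so its mean reference signal is zero, leaving
            genuine global shifts among the remaining miRNAs intact."""
            ref_level = log2_expr[reference_rows].mean(axis=0)   # per-sample offset
            return log2_expr - ref_level                         # broadcast over rows

        # Toy example: 5 miRNAs x 3 samples, rows 0 and 1 treated as references.
        expr = np.log2(np.array([[100., 210.,  95.],
                                 [ 98., 205., 102.],
                                 [ 50., 400.,  40.],
                                 [ 30.,  60.,  35.],
                                 [ 80., 150.,  70.]]))
        print(rgb_normalize(expr, reference_rows=[0, 1]))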

  12. Shear-coupled grain-boundary migration dependence on normal strain/stress

    NASA Astrophysics Data System (ADS)

    Combe, N.; Mompiou, F.; Legros, M.

    2017-08-01

    In specific conditions, grain-boundary (GB) migration occurs in polycrystalline materials as an alternative vector of plasticity compared to the usual dislocation activity. The shear-coupled GB migration, the expected most efficient GB based mechanism, couples the GB motion to an applied shear stress. Stresses on GB in polycrystalline materials seldom have, however, a unique pure shear component. This work investigates the influence of a normal strain on the shear coupled migration of a Σ13(320)[001] GB in a copper bicrystal using atomistic simulations. We show that the yield shear stress inducing the GB migration strongly depends on the applied normal stress. Beyond this, the application of a normal stress on this GB qualitatively modifies the GB migration: while the Σ13(320)[001] GB shear couples following the 〈110〉 migration mode without normal stress, we report the observation of the 〈010〉 mode under a sufficiently high tensile normal stress. Using the nudged elastic band method, we uncover the atomistic mechanism of this 〈010〉 migration mode and energetically characterize it.

  13. Topological resilience in non-normal networked systems

    NASA Astrophysics Data System (ADS)

    Asllani, Malbor; Carletti, Timoteo

    2018-04-01

    The network of interactions in complex systems strongly influences their resilience and the system capability to resist external perturbations or structural damages and to promptly recover thereafter. The phenomenon manifests itself in different domains, e.g., parasitic species invasion in ecosystems or cascade failures in human-made networks. Understanding the topological features of the networks that affect the resilience phenomenon remains a challenging goal for the design of robust complex systems. We hereby introduce the concept of non-normal networks, namely networks whose adjacency matrices are non-normal, propose a generating model, and show that such a feature can drastically change the global dynamics through an amplification of the system response to exogenous disturbances and eventually impact the system resilience. This early stage transient period can induce the formation of inhomogeneous patterns, even in systems involving a single diffusing agent, providing thus a new kind of dynamical instability complementary to the Turing one. We provide, first, an illustrative application of this result to ecology by proposing a mechanism to mute the Allee effect and, second, we propose a model of virus spreading in a population of commuters moving using a non-normal transport network, the London Tube.
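
    A minimal check of the matrix property the abstract turns on; the three-node directed chain below is a toy example, not one of the authors' networks:

        import numpy as np

        def is_normal_matrix(A, tol=1e-10):
            """A matrix is normal iff it commutes with its conjugate transpose."""
            return np.allclose(A @ A.conj().T, A.conj().T @ A, atol=tol)

        # Toy directed chain 0 -> 1 -> 2: strongly asymmetric, hence non-normal.
        A = np.array([[0., 1., 0.],
                      [0., 0., 1.],
                      [0., 0., 0.]])
        print(is_normal_matrix(A))          # False: non-normal adjacency matrix
        print(is_normal_matrix(A + A.T))    # True: symmetrizing restores normality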

  14. Normalizing the causality between time series.

    PubMed

    Liang, X San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  15. Normalization regulates competition for visual awareness

    PubMed Central

    Ling, Sam; Blake, Randolph

    2012-01-01

    Signals in our brain are in a constant state of competition, including those that vie for motor control, sensory dominance and awareness. To shed light on the mechanisms underlying neural competition, we exploit binocular rivalry, a phenomenon that allows us to probe the competitive process that ordinarily transpires outside of our awareness. By measuring psychometric functions under different states of rivalry, we discovered a pattern of gain changes that are consistent with a model of competition in which attention interacts with normalization processes, thereby driving the ebb and flow between states of awareness. Moreover, we reveal that attention plays a crucial role in modulating competition; without attention, rivalry suppression for high-contrast stimuli is negligible. We propose a framework whereby our visual awareness of competing sensory representations is governed by a common neural computation: normalization. PMID:22884335
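
    The abstract does not state the model explicitly; the snippet below only illustrates the textbook divisive-normalization (contrast-response) form commonly used in this literature, with made-up parameter values, and is not the authors' fitted model:

        import numpy as np

        def contrast_response(c, r_max=1.0, sigma=0.15, n=2.0):
            """Generic divisive normalization: R(c) = r_max * c**n / (c**n + sigma**n)."""
            c = np.asarray(c, dtype=float)
            return r_max * c**n / (c**n + sigma**n)

        contrasts = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
        print(contrast_response(contrasts))              # baseline response gain
        print(contrast_response(contrasts, sigma=0.3))   # suppression modeled as a larger semisaturation constant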

  16. Normalizing the causality between time series

    NASA Astrophysics Data System (ADS)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  17. Shunting for normal pressure hydrocephalus (NPH).

    PubMed

    Esmonde, T; Cooke, S

    2002-01-01

    Since the condition was first described in 1965, the syndrome of normal pressure hydrocephalus (NPH) has conventionally been managed by placement of a cerebrospinal fluid (CSF) shunt. To determine the effectiveness of shunting procedures in promoting stability or improvement in the neurological symptoms and signs of NPH. The trials were identified from a search of the Specialized Register of the Cochrane Dementia and Cognitive Improvement Group on 26 June 2001 using the terms 'shunt*' and 'normal pressure hydrocephalus'. Studies included for analysis were those involving the placement of a CSF shunt for the treatment of NPH as part of a randomized controlled trial. No data matching the selection criteria were found. No randomized controlled trials of shunt placement versus no shunt were found. There is no evidence to indicate whether placement of a shunt is effective in the management of NPH.

  18. Chevron nails: a normal variant in the pediatric population.

    PubMed

    Delano, Sofia; Belazarian, Leah

    2014-01-01

    A 7-month-old girl was evaluated for V-shaped ridging of the fingernails consistent with chevron nails. Chevron nails are a normal variant in the pediatric population that is frequently outgrown. This case nicely demonstrates this normal finding that has so rarely been reported in the literature. © 2013 Wiley Periodicals, Inc.

  19. Akron Perkins Normal School: An Institutional History, 1898-1931.

    ERIC Educational Resources Information Center

    Kline, Melinda J.

    The preparation of elementary school teachers in the middle and late 19th century increasingly included city normal training schools. The city school board of Akron, Ohio, reflected this trend and established its own normal training school in 1898. This paper documents the preservice training of teachers within the city training school, the…

  20. Density- and wavefunction-normalized Cartesian spherical harmonics for l ≤ 20

    DOE PAGES

    Michael, J. Robert; Volkov, Anatoliy

    2015-03-01

    The widely used pseudoatom formalism in experimental X-ray charge-density studies makes use of real spherical harmonics when describing the angular component of aspherical deformations of the atomic electron density in molecules and crystals. The analytical form of the density-normalized Cartesian spherical harmonic functions for up to l ≤ 7 and the corresponding normalization coefficients were reported previously by Paturle & Coppens. It was shown that the analytical form for normalization coefficients is available primarily for l ≤ 4. Only in very special cases is it possible to derive an analytical representation of the normalization coefficients for 4 < l ≤ 7. In most cases for l > 4 the density normalization coefficients were calculated numerically to within seven significant figures. In this study we review the literature on the density-normalized spherical harmonics, clarify the existing notations, use the Paturle–Coppens method in the Wolfram Mathematica software to derive the Cartesian spherical harmonics for l ≤ 20 and determine the density normalization coefficients to 35 significant figures, and generate a Fortran90 code. The article primarily targets researchers who work in the field of experimental X-ray electron density, but may be of some use to all who are interested in Cartesian spherical harmonics.

  1. The impact of sample non-normality on ANOVA and alternative methods.

    PubMed

    Lantz, Björn

    2013-05-01

    In this journal, Zimmerman (2004, 2011) has discussed preliminary tests that researchers often use to choose an appropriate method for comparing locations when the assumption of normality is doubtful. The conceptual problem with this approach is that such a two-stage process makes both the power and the significance of the entire procedure uncertain, as type I and type II errors are possible at both stages. A type I error at the first stage, for example, will obviously increase the probability of a type II error at the second stage. Based on the idea of Schmider et al. (2010), which proposes that simulated sets of sample data be ranked with respect to their degree of normality, this paper investigates the relationship between population non-normality and sample non-normality with respect to the performance of the ANOVA, Brown-Forsythe test, Welch test, and Kruskal-Wallis test when used with different distributions, sample sizes, and effect sizes. The overall conclusion is that the Kruskal-Wallis test is considerably less sensitive to the degree of sample normality when populations are distinctly non-normal and should therefore be the primary tool used to compare locations when it is known that populations are not at least approximately normal. © 2012 The British Psychological Society.
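
    A hedged sketch of the kind of comparison the study describes, using SciPy's stock implementations on simulated, distinctly non-normal (log-normal) samples; the group sizes, distribution, effect size, and seed are arbitrary illustrative choices:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Three groups from a skewed population, with a location shift in the third.
        g1 = rng.lognormal(mean=0.0, sigma=1.0, size=30)
        g2 = rng.lognormal(mean=0.0, sigma=1.0, size=30)
        g3 = rng.lognormal(mean=0.5, sigma=1.0, size=30)

        f_stat, p_anova   = stats.f_oneway(g1, g2, g3)    # classical ANOVA
        h_stat, p_kruskal = stats.kruskal(g1, g2, g3)     # rank-based alternative

        print(f"ANOVA          p = {p_anova:.4f}")
        print(f"Kruskal-Wallis p = {p_kruskal:.4f}")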

  2. Experimental studies of breaking of elastic tired wheel under variable normal load

    NASA Astrophysics Data System (ADS)

    Fedotov, A. I.; Zedgenizov, V. G.; Ovchinnikova, N. I.

    2017-10-01

    The paper analyzes the braking of a vehicle wheel subjected to disturbances of normal load variations. Experimental tests and methods for developing test modes as sinusoidal force disturbances of the normal wheel load were used. Measuring methods for digital and analogue signals were used as well. Stabilization of vehicle wheel braking subjected to disturbances of normal load variations is a topical issue. The paper suggests a method for analyzing wheel braking processes under disturbances of normal load variations. A method to control wheel braking processes subjected to disturbances of normal load variations was developed.

  3. The normal-equivalent: a patient-specific assessment of facial harmony.

    PubMed

    Claes, P; Walters, M; Gillett, D; Vandermeulen, D; Clement, J G; Suetens, P

    2013-09-01

    Evidence-based practice in oral and maxillofacial surgery would greatly benefit from an objective assessment of facial harmony or gestalt. Normal reference faces have previously been introduced, but they describe harmony in facial form as an average only and fail to report on harmonic variations found between non-dysmorphic faces. In this work, facial harmony, in all its complexity, is defined using a face-space, which describes all possible variations within a non-dysmorphic population; this was sampled here, based on 400 healthy subjects. Subsequently, dysmorphometrics, which involves the measurement of morphological abnormalities, is employed to construct the normal-equivalent within the given face-space of a presented dysmorphic face. The normal-equivalent can be seen as a synthetic identical but unaffected twin that is a patient-specific and population-based normal. It is used to extract objective scores of facial discordancy. This technique, along with a comparing approach, was used on healthy subjects to establish ranges of discordancy that are accepted to be normal, as well as on two patient examples before and after surgical intervention. The specificity of the presented normal-equivalent approach was confirmed by correctly attributing abnormality and providing regional depictions of the known dysmorphologies. Furthermore, it proved to be superior to the comparing approach. Copyright © 2013 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  4. Speech rate reduction and "nasality" in normal speakers.

    PubMed

    Brancewicz, T M; Reich, A R

    1989-12-01

    This study explored the effects of reduced speech rate on nasal/voice accelerometric measures and nasality ratings. Nasal/voice accelerometric measures were obtained from normal adults for various speech stimuli and speaking rates. Stimuli included three sentences (one obstruent-loaded, one semivowel-loaded, and one containing a single nasal), and /pv/ syllable trains. Speakers read the stimuli at their normal rate, half their normal rate, and as slowly as possible. In addition, a computer program paced each speaker at rates of 1, 2, and 3 syllables per second. The nasal/voice accelerometric values revealed significant stimulus effects but no rate effects. The nasality ratings of experienced listeners, evaluated as a function of stimulus and speaking rate, were compared to the accelerometric measures. The nasality scale values demonstrated small, but statistically significant, stimulus and rate effects. However, the nasality percepts were poorly correlated with the nasal/voice accelerometric measures.

  5. Sensitivity of Raman spectroscopy to normal patient variability

    NASA Astrophysics Data System (ADS)

    Vargis, Elizabeth; Byrd, Teresa; Logan, Quinisha; Khabele, Dineo; Mahadevan-Jansen, Anita

    2011-11-01

    Many groups have used Raman spectroscopy for diagnosing cervical dysplasia; however, there have been few studies looking at the effect of normal physiological variations on Raman spectra. We assess four patient variables that may affect normal Raman spectra: Race/ethnicity, body mass index (BMI), parity, and socioeconomic status. Raman spectra were acquired from a diverse population of 75 patients undergoing routine screening for cervical dysplasia. Classification of Raman spectra from patients with a normal cervix is performed using sparse multinomial logistic regression (SMLR) to determine if any of these variables has a significant effect. Results suggest that BMI and parity have the greatest impact, whereas race/ethnicity and socioeconomic status have a limited effect. Incorporating BMI and obstetric history into classification algorithms may increase sensitivity and specificity rates of disease classification using Raman spectroscopy. Studies are underway to assess the effect of these variables on disease.

  6. Computing Instantaneous Frequency by normalizing Hilbert Transform

    NASA Technical Reports Server (NTRS)

    Huang, Norden E. (Inventor)

    2005-01-01

    This invention presents Normalized Amplitude Hilbert Transform (NAHT) and Normalized Hilbert Transform (NHT), both of which are new methods for computing Instantaneous Frequency. This method is designed specifically to circumvent the limitation set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert Transform do not agree. Motivation for this method is that straightforward application of the Hilbert Transform followed by taking the derivative of the phase-angle as the Instantaneous Frequency (IF) leads to a common mistake made up to this date. In order to make the Hilbert Transform method work, the data has to obey certain restrictions.
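
    For orientation, this is the straightforward (un-normalized) Hilbert-transform estimate of instantaneous frequency that the abstract cautions about, applied to a clean mono-component chirp where it does behave; the signal parameters are arbitrary:

        import numpy as np
        from scipy.signal import hilbert

        fs = 1000.0                                   # sampling rate, Hz
        t = np.arange(0.0, 2.0, 1.0 / fs)
        x = np.cos(2 * np.pi * (5 * t + 4 * t**2))    # chirp: true IF = 5 + 8*t Hz

        analytic = hilbert(x)                         # analytic signal x + i*H[x]
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.gradient(phase, 1.0 / fs) / (2 * np.pi)

        print(inst_freq[200], inst_freq[1000])        # ~6.6 Hz at t=0.2 s, ~13 Hz at t=1.0 s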

  7. Procedure for normalization of cDNA libraries

    DOEpatents

    Bonaldo, Maria DeFatima; Soares, Marcelo Bento

    1997-01-01

    This invention provides a method to normalize a cDNA library constructed in a vector capable of being converted to single-stranded circles and capable of producing complementary nucleic acid molecules to the single-stranded circles comprising: (a) converting the cDNA library into single-stranded circles; (b) generating complementary nucleic acid molecules to the single-stranded circles; (c) hybridizing the single-stranded circles converted in step (a) with complementary nucleic acid molecules of step (b) to produce partial duplexes to an appropriate Cot; (e) separating the unhybridized single-stranded circles from the hybridized single-stranded circles, thereby generating a normalized cDNA library.

  8. Procedure for normalization of cDNA libraries

    DOEpatents

    Bonaldo, M.D.; Soares, M.B.

    1997-12-30

    This invention provides a method to normalize a cDNA library constructed in a vector capable of being converted to single-stranded circles and capable of producing complementary nucleic acid molecules to the single-stranded circles comprising: (a) converting the cDNA library into single-stranded circles; (b) generating complementary nucleic acid molecules to the single-stranded circles; (c) hybridizing the single-stranded circles converted in step (a) with complementary nucleic acid molecules of step (b) to produce partial duplexes to an appropriate Cot; (e) separating the unhybridized single-stranded circles from the hybridized single-stranded circles, thereby generating a normalized cDNA library. 1 fig.

  9. Computing Instantaneous Frequency by normalizing Hilbert Transform

    DOEpatents

    Huang, Norden E.

    2005-05-31

    This invention presents Normalized Amplitude Hilbert Transform (NAHT) and Normalized Hilbert Transform (NHT), both of which are new methods for computing Instantaneous Frequency. This method is designed specifically to circumvent the limitation set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert Transform do not agree. Motivation for this method is that straightforward application of the Hilbert Transform followed by taking the derivative of the phase-angle as the Instantaneous Frequency (IF) leads to a common mistake made up to this date. In order to make the Hilbert Transform method work, the data has to obey certain restrictions.

  10. Mathematical analysis of the normal anatomy of the aging fovea.

    PubMed

    Nesmith, Brooke; Gupta, Akash; Strange, Taylor; Schaal, Yuval; Schaal, Shlomit

    2014-08-28

    To mathematically analyze anatomical changes that occur in the normal fovea during aging. A total of 2912 spectral-domain optical coherence tomography (SD-OCT) normal foveal scans were analyzed. Subjects were healthy individuals, aged 13 to 97 years, with visual acuity ≥20/40 and without evidence of foveal pathology. Using automated symbolic regression software Eureqa (version 0.98), foveal thickness maps of 390 eyes were analyzed using several measurements: parafoveal retinal thickness at 50 μm consecutive intervals, parafoveal maximum retinal thickness at two points lateral to central foveal depression, distance between two points of maximum retinal thickness, maximal foveal slope at two intervals lateral to central foveal depression, and central length of foveal depression. A unique mathematical equation representing the mathematical analog of foveal anatomy was derived for every decade, between 10 and 100 years. The mathematical regression function for normal fovea followed first order sine curve of level 10 complexity for the second decade of life. The mathematical regression function became more complex with normal aging, up to level 43 complexity (0.085 fit; P < 0.05). Young foveas had higher symmetry (0.92 ± 0.10) along midline, whereas aged foveas had significantly less symmetry (0.76 ± 0.27, P < 0.01) along midline and steeper maximal slopes (29 ± 32°, P < 0.01). Normal foveal anatomical configuration changes with age. Normal aged foveas are less symmetric along midline with steeper slopes. Differentiating between normal aging and pathologic changes using SD-OCT scans may allow early diagnosis, follow-up, and better management of the aging population. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  11. Evaluation of Kurtosis into the product of two normally distributed variables

    NASA Astrophysics Data System (ADS)

    Oliveira, Amílcar; Oliveira, Teresa; Seijas-Macías, Antonio

    2016-06-01

    Kurtosis (κ) is any measure of the "peakedness" of a distribution of a real-valued random variable. We study the evolution of the kurtosis for the product of two normally distributed variables. The product of two normal variables is a very common problem in several areas of study, such as physics, economics, and psychology, among others. Normal variables have a constant value for kurtosis (κ = 3), independently of the value of the two parameters: mean and variance. In fact, the excess kurtosis is defined as κ − 3, and the excess kurtosis of the normal distribution is zero. The product of two normally distributed variables is a function of the parameters of the two variables and the correlation between them, and the range for kurtosis is in [0, 6] for independent variables and in [0, 12] when correlation between them is allowed.
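
    A quick Monte Carlo check of the quantity discussed above; the standard normal parameters and sample size are arbitrary, and scipy.stats.kurtosis returns excess kurtosis by default (fisher=True):

        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(42)
        n = 1_000_000

        x = rng.standard_normal(n)
        y = rng.standard_normal(n)                      # independent of x

        k_excess = kurtosis(x * y)                      # excess kurtosis (normal -> 0)
        print(f"excess kurtosis of X*Y: {k_excess:.2f}")   # ~6, the upper end of the independent-variable range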

  12. CDC6600 subroutine for normal random variables. [RVNORM (RMU, SIG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amos, D.E.

    1977-04-01

    A value y for a uniform variable on (0,1) is generated and a table of 96-percent points for the (0,1) normal distribution is interpolated for a value of the normal variable x(0,1) on 0.02 ≤ y ≤ 0.98. For the tails, the inverse normal is computed by a rational Chebyshev approximation in an appropriate variable. Then X = xσ + μ gives the X(μ, σ) variable.
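
    A modern sketch of the same inverse-transform idea using SciPy; substituting scipy.stats.norm.ppf for the original table-plus-Chebyshev machinery is an assumption of convenience, not a reconstruction of the CDC6600 subroutine:

        import numpy as np
        from scipy.stats import norm

        def rvnorm(mu, sigma, size=1, rng=None):
            """Inverse-transform sampling of N(mu, sigma): draw y ~ U(0,1),
            map it through the inverse standard-normal CDF, then scale and shift."""
            rng = np.random.default_rng() if rng is None else rng
            y = rng.uniform(0.0, 1.0, size=size)
            x = norm.ppf(y)                  # standard normal deviate x(0,1)
            return x * sigma + mu            # X = x*sigma + mu

        samples = rvnorm(mu=5.0, sigma=2.0, size=100_000)
        print(samples.mean(), samples.std())   # approximately 5.0 and 2.0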

  13. Succeeding in the New Normal

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2012-01-01

    Being a college CIO these days must feel a bit like juggling chain saws with one hand while holding a donation cup in the other. It's unlikely to end well, yet it represents the new normal in IT. While campus clients--from administrators to faculty and students--expect the usual raft of tech services, the IT budget simply can't deliver. In this…

  14. Distributive justice and cognitive enhancement in lower, normal intelligence.

    PubMed

    Dunlop, Mikael; Savulescu, Julian

    2014-01-01

    There exists a significant disparity within society between individuals in terms of intelligence. While intelligence varies naturally throughout society, the extent to which this impacts on the life opportunities it affords to each individual is greatly undervalued. Intelligence appears to have a prominent effect over a broad range of social and economic life outcomes. Many key determinants of well-being correlate highly with the results of IQ tests, and other measures of intelligence, and an IQ of 75 is generally accepted as the most important threshold in modern life. The ability to enhance our cognitive capacities offers an exciting opportunity to correct disabling natural variation and inequality in intelligence. Pharmaceutical cognitive enhancers, such as modafinil and methylphenidate, have been shown to have the capacity to enhance cognition in normal, healthy individuals. Perhaps of most relevance is the presence of an 'inverted U effect' for most pharmaceutical cognitive enhancers, whereby the degree of enhancement increases as intelligence levels deviate further below the mean. Although enhancement, including cognitive enhancement, has been much debated recently, we argue that there are egalitarian reasons to enhance individuals with low but normal intelligence. Under egalitarianism, cognitive enhancement has the potential to reduce opportunity inequality and contribute to relative income and welfare equality in the lower, normal intelligence subgroup. Cognitive enhancement use is justifiable under prioritarianism through various means of distribution; selective access to the lower, normal intelligence subgroup, universal access, or paradoxically through access primarily to the average and above average intelligence subgroups. Similarly, an aggregate increase in social well-being is achieved through similar means of distribution under utilitarianism. In addition, the use of cognitive enhancement within the lower, normal intelligence subgroup negates, or at

  15. The Idea of a Normal University in the 21st Century

    ERIC Educational Resources Information Center

    Hayhoe, Ruth; Li, Jun

    2010-01-01

    The establishment of normal colleges and universities is an important component of building a modern country; such institutions possess a value ethos different from that of other universities. The emergence of the Ecole Normale Superieure in Paris and the local normal schools has set a new model for teacher education around the world and promoted values and knowledge…

  16. Analysis of the Revised Trauma Score (RTS) in 200 victims of different trauma mechanisms.

    PubMed

    Alvarez, Bruno Durante; Razente, Danilo Mardegam; Lacerda, Daniel Augusto Mauad; Lother, Nicole Silveira; VON-Bahten, Luiz Carlos; Stahlschmidt, Carla Martinez Menini

    2016-01-01

    To analyze the epidemiological profile and mortality associated with the Revised Trauma Score (RTS) in trauma victims treated at a university hospital. We conducted a descriptive, cross-sectional study of trauma protocols (prospectively collected) from December 2013 to February 2014, including trauma victims admitted in the emergency room of the Cajuru University Hospital. We set up three groups: (G1) penetrating trauma to the abdomen and chest, (G2) blunt trauma to the abdomen and chest, and (G3) traumatic brain injury. The variables we analyzed were: gender, age, day of week, mechanism of injury, type of transportation, RTS, hospitalization time and mortality. We analyzed 200 patients, with a mean age of 36.42 ± 17.63 years, and 73.5% were male. The mean age was significantly lower in G1 than in the other groups (p <0.001). Most (40%) of the visits occurred on weekends and the most common pre-hospital transport service (58%) was the SIATE (Emergency Trauma Care Integrated Service). The hospital stay was significantly higher in G1 compared with the other groups (p <0.01). Regarding mortality, there were 12%, 1.35% and 3.95% of deaths in G1, G2 and G3, respectively. The median RTS among the deaths was 5.49, 7.84 and 1.16, respectively, for the three groups. The majority of patients were young men. RTS was effective in predicting mortality in traumatic brain injury, but failed to predict it in patients suffering from blunt and penetrating trauma.

  17. Physical Development: What's Normal? What's Not?

    MedlinePlus

    Two boys or girls exactly the same age can start or end ... in Girls: What to Expect. Growth in both boys and girls slows considerably soon after puberty is complete.

  18. Normal Forces at Solid-Liquid Interface

    NASA Astrophysics Data System (ADS)

    Das, Ratul

    Adhesion can be defined as the tendency of dissimilar particles or surfaces to cling on to one another. Fields that require knowledge about adhesion interactions at the solid-liquid interface span over a wide spectrum from biotechnological issues such as liquid adhesion to skin tissues, insect feet adhesion to solids, or contact lenses to tear fluid adhesion; filtration issues such as membrane fouling and membrane affinity to different liquids; oil and gas extraction where one needs knowledge of the adhesion of the oil and brine to the rock; fuel cells in which droplets are formed on the electrodes and need to be considered in the system's design; classic chemical engineering industry such as drop adhesion to the mist eliminators in flash drums, or to heat exchangers; and classic surface science such as nano-structured surfaces, self cleaning surfaces, and general wetting phenomena. We execute the Young-Dupre (Y-P) gedanken experiment to establish unique values of work of adhesion rather than a work of adhesion range that the contact angle hysteresis results in. We use the Centrifugal Adhesion Balance (CAB) which allows independent manipulation of normal and lateral forces to induce an increase in the normal force which pulls on a liquid drop while keeping zero lateral force. This method mimics a drop that is subjected to a gravitational force that is gradually increasing. The values obtained for the work of adhesion are independent of drop size and are in agreement with the Y-P estimate. Cyclically varying the normal force, just to prevent the drop flying away from the surface will also enable us to study the Contact Angle Hysteresis for a pendant drop. With this set up, the work of adhesion is not only calculated from experimental normal force measurements, but the found results are also used to provide a venue for calculating the Young equilibrium contact angle, theta0. According to Shanahan and de Gennes, a liquid drop with a non-zero contact angle is

  19. Coronary artery anomalies overview: The normal and the abnormal

    PubMed Central

    Villa, Adriana DM; Sammut, Eva; Nair, Arjun; Rajani, Ronak; Bonamini, Rodolfo; Chiribiri, Amedeo

    2016-01-01

    The aim of this review is to give a comprehensive and concise overview of coronary embryology and normal coronary anatomy, describe common variants of normal and summarize typical patterns of anomalous coronary artery anatomy. Extensive iconography supports the text, with particular attention to images obtained in vivo using non-invasive imaging. We have divided this article into three groups, according to their frequency in the general population: Normal, normal variant and anomaly. Although congenital coronary artery anomalies are relatively uncommon, they are the second most common cause of sudden cardiac death among young athletes and therefore warrant detailed review. Based on the functional relevance of each abnormality, coronary artery anomalies can be classified as anomalies with obligatory ischemia, without ischemia or with exceptional ischemia. The clinical symptoms may include chest pain, dyspnea, palpitations, syncope, cardiomyopathy, arrhythmia, myocardial infarction and sudden cardiac death. Moreover, it is important to also identify variants and anomalies without clinical relevance in their own right as complications during surgery or angioplasty can occur. PMID:27358682

  20. A generalized algorithm to design finite field normal basis multipliers

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1986-01-01

    Finite field arithmetic logic is central in the implementation of some error-correcting coders and some cryptographic devices. There is a need for good multiplication algorithms which can be easily realized. Massey and Omura recently developed a new multiplication algorithm for finite fields based on a normal basis representation. Using the normal basis representation, the design of the finite field multiplier is simple and regular. The fundamental design of the Massey-Omura multiplier is based on a design of a product function. In this article, a generalized algorithm to locate a normal basis in a field is first presented. Using this normal basis, an algorithm to construct the product function is then developed. This design does not depend on particular characteristics of the generator polynomial of the field.
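
    A small self-contained illustration of what "locating a normal basis" means, using GF(2^4) with the primitive polynomial x^4 + x + 1; this brute-force check is only a toy sketch and is not the article's construction or the Massey-Omura product function:

        # Toy GF(2^4) arithmetic; elements are 4-bit integers, reduced modulo x^4 + x + 1.
        MOD, M = 0b10011, 4

        def gf_mul(a, b):
            r = 0
            while b:
                if b & 1:
                    r ^= a
                b >>= 1
                a <<= 1
                if a & (1 << M):        # reduce once the degree reaches M
                    a ^= MOD
            return r

        def gf_pow(a, n):
            r = 1
            while n:
                if n & 1:
                    r = gf_mul(r, a)
                a = gf_mul(a, a)
                n >>= 1
            return r

        def generates_normal_basis(beta):
            """True iff {beta, beta^2, beta^4, beta^8} is linearly independent over
            GF(2), i.e. no nonzero GF(2)-combination of the conjugates XORs to zero."""
            conj = [gf_pow(beta, 1 << i) for i in range(M)]
            for mask in range(1, 1 << M):
                v = 0
                for i in range(M):
                    if mask & (1 << i):
                        v ^= conj[i]
                if v == 0:
                    return False
            return True

        print([b for b in range(1, 16) if generates_normal_basis(b)])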

  1. Instantaneous Normal Modes and the Protein Glass Transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulz, Roland; Krishnan, Marimuthu; Daidone, Isabella

    2009-01-01

    In the instantaneous normal mode method, normal mode analysis is performed at instantaneous configurations of a condensed-phase system, leading to modes with negative eigenvalues. These negative modes provide a means of characterizing local anharmonicities of the potential energy surface. Here, we apply instantaneous normal mode to analyze temperature-dependent diffusive dynamics in molecular dynamics simulations of a small protein (a scorpion toxin). Those characteristics of the negative modes are determined that correlate with the dynamical (or glass) transition behavior of the protein, as manifested as an increase in the gradient with T of the average atomic mean-square displacement at ~220 K. The number of negative eigenvalues shows no transition with temperature. Further, although filtering the negative modes to retain only those with eigenvectors corresponding to double-well potentials does reveal a transition in the hydration water, again, no transition in the protein is seen. However, additional filtering of the protein double-well modes, so as to retain only those that, on energy minimization, escape to different regions of configurational space, finally leads to clear protein dynamical transition behavior. Partial minimization of instantaneous configurations is also found to remove nondiffusive imaginary modes. In summary, examination of the form of negative instantaneous normal modes is shown to furnish a physical picture of local diffusive dynamics accompanying the protein glass transition.
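
    The core numerical step behind an instantaneous normal mode analysis is diagonalizing the (mass-weighted) Hessian at an instantaneous configuration and inspecting the sign of the eigenvalues; the sketch below uses an arbitrary symmetric matrix as a stand-in for a real Hessian, so no force field or protein model is implied:

        import numpy as np

        rng = np.random.default_rng(1)

        # Stand-in for a mass-weighted Hessian at one instantaneous configuration:
        # symmetric, but (unlike at a minimum) not positive definite.
        n = 30                                       # 3N coordinates for a 10-atom toy system
        A = rng.normal(size=(n, n))
        hessian = 0.5 * (A + A.T)                    # symmetrize

        eigvals, eigvecs = np.linalg.eigh(hessian)   # instantaneous normal modes
        n_negative = int((eigvals < 0).sum())        # "imaginary-frequency" modes
        print(f"{n_negative} of {n} instantaneous modes have negative eigenvalues")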

  2. Instantaneous Normal Modes and the Protein Glass Transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Roland; Krishnan, Marimuthu; Daidone, Isabella

    2009-01-01

    In the instantaneous normal mode method, normal mode analysis is performed at instantaneous configurations of a condensed-phase system, leading to modes with negative eigenvalues. These negative modes provide a means of characterizing local anharmonicities of the potential energy surface. Here, we apply instantaneous normal mode to analyze temperature-dependent diffusive dynamics in molecular dynamics simulations of a small protein (a scorpion toxin). Those characteristics of the negative modes are determined that correlate with the dynamical (or glass) transition behavior of the protein, as manifested as an increase in the gradient with T of the average atomic mean-square displacement at 220 K. The number of negative eigenvalues shows no transition with temperature. Further, although filtering the negative modes to retain only those with eigenvectors corresponding to double-well potentials does reveal a transition in the hydration water, again, no transition in the protein is seen. However, additional filtering of the protein double-well modes, so as to retain only those that, on energy minimization, escape to different regions of configurational space, finally leads to clear protein dynamical transition behavior. Partial minimization of instantaneous configurations is also found to remove nondiffusive imaginary modes. In summary, examination of the form of negative instantaneous normal modes is shown to furnish a physical picture of local diffusive dynamics accompanying the protein glass transition.

  3. Normal Anal Examination After Penetration: A Case Report.

    PubMed

    Slingsby, Brett; Goldberg, Amy

    2018-03-01

    Physical findings are rare after anal penetration. Furthermore, children delay in disclosing or are reticent to discuss penetration. A 12-year-old boy presented to medical care multiple times over a several-week period complaining of abdominal pain, bloody diarrhea, and poor appetite. On colonoscopy, he was found to have a cylindrical foreign body (measuring 7 cm tall and 7 cm in diameter) in his rectum, which had been present for at least 2 weeks. He initially denied knowing how the object got into his rectum and later stated that he inserted it himself out of curiosity. One week after the object was removed, follow-up examination using video colposcopy revealed a completely normal anal examination; the patient had a normal anal examination despite known anal penetration and removal of the object. WHY SHOULD AN EMERGENCY PHYSICIAN BE AWARE OF THIS?: Children can have a normal anal examination despite anal penetration, and do not always disclose anal penetration. The aforementioned concepts can be applied to situations related to child sexual abuse in the emergency department, where physical examinations are frequently normal and children delay in disclosing the abuse. When there is concern for sexual abuse, even in the absence of a disclosure or examination findings, patients should be referred for a child abuse pediatrics evaluation if available. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Hydrodynamic lubrication of rigid nonconformal contacts in combined rolling and normal motion

    NASA Technical Reports Server (NTRS)

    Ghosh, M. K.; Hamrock, B. J.; Brewe, D. E.

    1984-01-01

    A numerical solution to the problem of hydrodynamic lubrication of rigid point contacts with an isoviscous, incompressible lubricant was obtained. The hydrodynamic load-carrying capacity under unsteady (or dynamic) conditions arising from the combined effects of squeeze motion superposed upon the entraining motion was determined for both normal approach and separation. Superposed normal motion considerably increases net load-carrying capacity during normal approach and substantially reduces net load-carrying capacity during separation. Geometry was also found to have a significant influence on the dynamic load-carrying capacity. The ratio of dynamic to steady state load-carrying capacity increases with increasing geometry parameter for normal approach and decreases during separation. The cavitation (film rupture) boundary is also influenced significantly by the normal motion, moving downstream during approach and upstream during separation. For sufficiently high normal separation velocity the rupture boundary may even move upstream of the minimum-film-thickness position. Sixty-three cases were used to derive a functional relationship for the ratio of the dynamic to steady state load-carrying capacity in terms of the dimensionless normal velocity parameter (incorporating normal velocity, entraining velocity, and film thickness) and the geometry parameter.

  5. Hydrodynamic lubrication of rigid nonconformal contacts in combined rolling and normal motion

    NASA Technical Reports Server (NTRS)

    Ghosh, M. K.; Hamrock, B. J.; Brewe, D.

    1985-01-01

    A numerical solution to the problem of hydrodynamic lubrication of rigid point contacts with an isoviscous, incompressible lubricant was obtained. The hydrodynamic load-carrying capacity under unsteady (or dynamic) conditions arising from the combined effects of squeeze motion superposed upon the entraining motion was determined for both normal approach and separation. Superposed normal motion considerably increases net load-carrying capacity during normal approach and substantially reduces net load-carrying capacity during separation. Geometry was also found to have a significant influence on the dynamic load-carrying capacity. The ratio of dynamic to steady state load-carrying capacity increases with increasing geometry parameter for normal approach and decreases during separation. The cavitation (film rupture) boundary is also influenced significantly by the normal motion, moving downstream during approach and upstream during separation. For sufficiently high normal separation velocity the rupture boundary may even move upstream of the minimum-film-thickness position. Sixty-three cases were used to derive a functional relationship for the ratio of the dynamic to steady state load-carrying capacity in terms of the dimensionless normal velocity parameter (incorporating normal velocity, entraining velocity, and film thickness) and the geometry parameter.

  6. A new method to real-normalize measured complex modes

    NASA Technical Reports Server (NTRS)

    Wei, Max L.; Allemang, Randall J.; Zhang, Qiang; Brown, David L.

    1987-01-01

    A time domain subspace iteration technique is presented to compute a set of normal modes from the measured complex modes. By using the proposed method, a large number of physical coordinates are reduced to a smaller number of modal or principal coordinates. Subspace free decay time responses are computed using properly scaled complex modal vectors. The companion matrix for the general case of nonproportional damping is then derived in the selected vector subspace. Subspace normal modes are obtained through an eigenvalue solution of the M_N^-1 K_N matrix and transformed back to the physical coordinates to get a set of normal modes. A numerical example is presented to demonstrate the outlined theory.
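
    For context, the target quantities here are the mass-normalized real modes of the undamped system; a minimal SciPy sketch of that standard computation is given below (it illustrates what the method aims to recover and is not the paper's subspace-iteration procedure):

        import numpy as np
        from scipy.linalg import eigh

        # Toy 3-DOF spring-mass system (values are arbitrary).
        M = np.diag([2.0, 1.0, 1.0])
        K = np.array([[ 4., -2.,  0.],
                      [-2.,  4., -2.],
                      [ 0., -2.,  2.]])

        # Generalized eigenproblem K*phi = lambda*M*phi; eigh scales the
        # eigenvectors so that Phi.T @ M @ Phi = I (mass normalization).
        lam, Phi = eigh(K, M)
        print(np.round(Phi.T @ M @ Phi, 6))   # identity: modes are mass-normalized
        print(np.round(Phi.T @ K @ Phi, 6))   # diagonal matrix of eigenvalues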

  7. Quantum turbulence in superfluids with wall-clamped normal component.

    PubMed

    Eltsov, Vladimir; Hänninen, Risto; Krusius, Matti

    2014-03-25

    In Fermi superfluids, such as superfluid (3)He, the viscous normal component can be considered to be stationary with respect to the container. The normal component interacts with the superfluid component via mutual friction, which damps the motion of quantized vortex lines and eventually couples the superfluid component to the container. With decreasing temperature and mutual friction, the internal dynamics of the superfluid component becomes more important compared with the damping and coupling effects from the normal component. As a result profound changes in superfluid dynamics are observed: the temperature-dependent transition from laminar to turbulent vortex motion and the decoupling from the reference frame of the container at even lower temperatures.

  8. Quantum turbulence in superfluids with wall-clamped normal component

    PubMed Central

    Eltsov, Vladimir; Hänninen, Risto; Krusius, Matti

    2014-01-01

    In Fermi superfluids, such as superfluid 3He, the viscous normal component can be considered to be stationary with respect to the container. The normal component interacts with the superfluid component via mutual friction, which damps the motion of quantized vortex lines and eventually couples the superfluid component to the container. With decreasing temperature and mutual friction, the internal dynamics of the superfluid component becomes more important compared with the damping and coupling effects from the normal component. As a result profound changes in superfluid dynamics are observed: the temperature-dependent transition from laminar to turbulent vortex motion and the decoupling from the reference frame of the container at even lower temperatures. PMID:24704879

  9. A childbirth educator speaks out for increased advocacy for normal birth.

    PubMed

    Boyd, Anne

    2006-01-01

    Upon noting that, over the years, normal birth has become less and less a cultural norm in the United States (where cesarean births now approach 30%), a childbirth educator speaks out to say it is time for normal-birth advocates to organize in order to increase efforts at social marketing of normal birth as a cultural norm.

  10. A Childbirth Educator Speaks Out for Increased Advocacy for Normal Birth

    PubMed Central

    Boyd, Anne

    2006-01-01

    Upon noting that, over the years, normal birth has become less and less a cultural norm in the United States (where cesarean births now approach 30%), a childbirth educator speaks out to say it is time for normal-birth advocates to organize in order to increase efforts at social marketing of normal birth as a cultural norm. PMID:17322939

  11. Diagonalization and Jordan Normal Form--Motivation through "Maple"[R]

    ERIC Educational Resources Information Center

    Glaister, P.

    2009-01-01

    Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
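
    The same motivation carries over to any computer algebra system; the snippet below uses Python/SymPy rather than the Maple worksheets discussed in the article, and the matrix is an arbitrary defective example:

        from sympy import Matrix, pprint

        # Eigenvalue 2 has algebraic multiplicity 3 but only two independent
        # eigenvectors, so the matrix is not diagonalizable.
        A = Matrix([[2, 1, 0],
                    [0, 2, 0],
                    [0, 0, 2]])

        P, J = A.jordan_form()      # A = P * J * P**-1
        pprint(J)                   # one 2x2 Jordan block and one 1x1 block for eigenvalue 2
        assert A == P * J * P.inv()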

  12. Asymptotic Normality Through Factorial Cumulants and Partition Identities

    PubMed Central

    Bobecka, Konstancja; Hitczenko, Paweł; López-Blázquez, Fernando; Rempała, Grzegorz; Wesołowski, Jacek

    2013-01-01

    In the paper we develop an approach to asymptotic normality through factorial cumulants. Factorial cumulants arise in the same manner from factorial moments as do (ordinary) cumulants from (ordinary) moments. Another tool we exploit is a new identity for ‘moments’ of partitions of numbers. The general limiting result is then used to (re-)derive asymptotic normality for several models including classical discrete distributions, occupancy problems in some generalized allocation schemes and two models related to negative multinomial distribution. PMID:24591773
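
    For readers unfamiliar with the objects involved, the standard definitions are as follows (stated in generic notation that need not match the paper's):

        % r-th factorial moment of a nonnegative integer-valued random variable X:
        %   \mu_{(r)} = E[X(X-1)\cdots(X-r+1)]
        % Factorial moments are generated by E[(1+t)^X]; factorial cumulants
        % \kappa_{(r)} come from its logarithm, exactly as ordinary cumulants
        % come from \log E[e^{tX}]:
        E\bigl[(1+t)^X\bigr] = \sum_{r \ge 0} \mu_{(r)} \frac{t^r}{r!},
        \qquad
        \log E\bigl[(1+t)^X\bigr] = \sum_{r \ge 1} \kappa_{(r)} \frac{t^r}{r!}.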

  13. The Normal Vulva, Vulvar Examination, and Evaluation Tools.

    PubMed

    Cohen Sacher, Bina

    2015-09-01

    The appearance of the female external genitalia is key for understanding and diagnosing many diseases that women of all ages encounter. Alas, the normal appearance of the vulva is an elusive concept, scarcely represented in textbooks, and the growing number of vulvar cosmetic surgery calls for a review of the normal appearance of the vulva and its diversity. In this paper I will review vulvar embryology, anatomy, the current literature discussing vulvar appearance, and describe meticulous vulvar examination, including the diagnostic tools.

  14. Analysis of quantitative data obtained from toxicity studies showing non-normal distribution.

    PubMed

    Kobayashi, Katsumi

    2005-05-01

    The data obtained from toxicity studies are examined for homogeneity of variance, but, usually, they are not examined for normal distribution. In this study I examined the measured items of a carcinogenicity/chronic toxicity study with rats for both homogeneity of variance and normal distribution. It was observed that many hematology and biochemistry items showed a non-normal distribution. For testing normal distribution of the data obtained from toxicity studies, the data of the concurrent control group may be examined, and for the data that show a non-normal distribution, non-parametric tests with robustness may be applied.
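
    A sketch of the two checks the author recommends, run on simulated data; the endpoint, distribution, group sizes, and significance threshold are invented for illustration:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Simulated biochemistry endpoint: both groups are right-skewed (log-normal).
        control = rng.lognormal(mean=1.0, sigma=0.6, size=20)
        treated = rng.lognormal(mean=1.3, sigma=0.6, size=20)

        # 1. Test the concurrent control group for normality.
        w_stat, p_normal = stats.shapiro(control)
        print(f"Shapiro-Wilk on controls: p = {p_normal:.3f}")

        # 2. If normality is rejected, fall back to a robust non-parametric comparison.
        if p_normal < 0.05:
            stat, p_value = stats.mannwhitneyu(control, treated, alternative="two-sided")
        else:
            stat, p_value = stats.ttest_ind(control, treated)
        print(f"group comparison: p = {p_value:.3f}")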

  15. Minimal Conductance Quantization in a Normal-Metal/Unconventional-Superconductor Junction

    NASA Astrophysics Data System (ADS)

    Ikegaya, Satoshi; Asano, Yasuhiro

    2018-04-01

    We discuss the minimum value of the zero-bias differential conductance in a normal-metal/unconventional-superconductor junction. A numerical simulation demonstrates that the zero-bias conductance is quantized at (4e^2/h) N_ZES in the limit of strong impurity scatterings in the normal-metal. The integer N_ZES represents the number of perfect transmission channels through the junction. By focusing on the chiral symmetry of Hamiltonian, we prove the existence of N_ZES-fold degenerate resonant states in the dirty normal segment.

  16. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals

    NASA Astrophysics Data System (ADS)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-03-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  17. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    PubMed

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  18. Secondary electron emission from electrically charged fluorinated-ethylene-propylene Teflon for normal and non-normal electron incidence. M.S. Thesis; [spacecraft thermal coatings]

    NASA Technical Reports Server (NTRS)

    Budd, P. A.

    1981-01-01

    The secondary electron emission coefficient was measured for a charged polymer (FEP-Teflon) with normally and obliquely incident primary electrons. Theories of secondary emission are reviewed and the experimental data is compared to these theories. Results were obtained for angles of incidence up to 60 deg in normal electric fields of 1500 V/mm. Additional measurements in the range from 50 to 70 deg were made in regions where the normal and tangential fields were approximately equal. The initial input angles and measured output point of the electron beam could be analyzed with computer simulations in order to determine the field within the chamber. When the field is known, the trajectories can be calculated for impacting electrons having various energies and angles of incidence. There was close agreement between the experimental results and the commonly assumed theoretical model in the presence of normal electric fields for angles of incidence up to 60 deg. High angle results obtained in the presence of tangential electric fields did not agree with the theoretical models.

  19. The autistic brain in the context of normal neurodevelopment.

    PubMed

    Ziats, Mark N; Edmonson, Catherine; Rennert, Owen M

    2015-01-01

    The etiology of autism spectrum disorders (ASDs) is complex and largely unclear. Among various lines of inquiry, many have suggested convergence onto disruptions in both neural circuitry and immune regulation/glial cell function pathways. However, the interpretation of the relationship between these two putative mechanisms has largely focused on the role of exogenous factors and insults, such as maternal infection, in activating immune pathways that in turn result in neural network abnormalities. Yet, given recent insights into our understanding of human neurodevelopment, and in particular the critical role of glia and the immune system in normal brain development, it is important to consider these putative pathological processes in their appropriate normal neurodevelopmental context. In this review, we explore the hypothesis that the autistic brain cellular phenotype likely represents intrinsic abnormalities of glial/immune processes constitutively operant in normal brain development that result in the observed neural network dysfunction. We review recent studies demonstrating the intercalated role of neural circuit development, the immune system, and glial cells in the normal developing brain, and integrate them with studies demonstrating pathological alterations in these processes in autism. By discussing known abnormalities in the autistic brain in the context of normal brain development, we explore the hypothesis that the glial/immune component of ASD may instead be related to intrinsic exaggerated/abnormal constitutive neurodevelopmental processes such as network pruning. Moreover, this hypothesis may be relevant to other neurodevelopmental disorders that share genetic, pathologic, and clinical features with autism.

  20. On the Normal Force Mechanotransduction of Human Umbilical Vein Endothelial Cells

    NASA Astrophysics Data System (ADS)

    Vahabikashi, Amir; Wang, Qiuyun; Wilson, James; Wu, Qianhong; Vucbmss Team

    2016-11-01

    In this paper, we report a cellular biomechanics study examining the normal force mechanotransduction of Human Umbilical Vein Endothelial Cells (HUVECs) and its implications for hypertension. Endothelial cells sense mechanical forces and adjust their structure and function accordingly. The mechanotransduction of normal forces plays a vital role in hypertension due to the higher pressure buildup inside blood vessels. Herein, HUVECs were cultured to full confluency and then exposed to different mechanical loadings using a novel microfluidic flow chamber that applies various pressure levels while keeping the shear stress constant. Three groups of cells were examined: the control group (neither shear nor normal stresses), the normal pressure group (10 dyne/cm2 of shear stress and 95 mmHg of pressure), and the hypertensive group (10 dyne/cm2 of shear stress and 142 mmHg of pressure). Cellular responses characterized by RT-PCR indicate that COX-2 was expressed under normal pressure but not high pressure; Mn-SOD was expressed under both normal and high pressure, with a stronger response under normal pressure; FOS and e-NOS did not respond under any condition. The differential behavior of COX-2 and Mn-SOD in response to changes in pressure is instrumental for better understanding the pathogenesis of hypertensive cardiovascular diseases. This research was supported by the National Science Foundation under Award #1511096.

  1. Normalized velocity profiles of field-measured turbidity currents

    USGS Publications Warehouse

    Xu, Jingping

    2010-01-01

    Multiple turbidity currents were recorded in two submarine canyons, with maximum speeds as high as 280 cm/s. For each individual turbidity current measured at a fixed station, its depth-averaged velocity typically decreased over time while its thickness increased. Some turbidity currents gained in speed as they traveled downcanyon, suggesting a possible self-accelerating process. The measured velocity profiles, the first at such high resolution, allowed normalization with various schemes. Empirical functions, obtained from laboratory experiments whose spatial and time scales are two to three orders of magnitude smaller, were found to represent the field data fairly well. The best similarity collapse of the velocity profiles was achieved when the streamwise velocity and the elevation were normalized respectively by the depth-averaged velocity and the turbidity current thickness. This normalization scheme can be generalized to an empirical function Y = exp(-αX^β) for the jet region above the velocity maximum. Confirming theoretical arguments and laboratory results of other studies, the field turbidity currents are Froude-supercritical.
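
    To make the normalization scheme described above concrete, the following Python sketch normalizes a synthetic velocity profile by the depth-averaged velocity and the current thickness and fits Y = exp(-αX^β) to the region above the velocity maximum; the profile, the averaging range and the fitted constants are invented for illustration and are not the paper's field data.

      # Sketch: normalize a velocity profile and fit Y = exp(-alpha * X**beta)
      # to the jet region above the velocity maximum. All numbers are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      z = np.linspace(0.1, 40.0, 80)                  # elevation above the bed (m)
      u = 2.0 * np.exp(-((z - 5.0) / 25.0) ** 2)      # synthetic streamwise velocity (m/s)

      thickness = 30.0                                 # assumed turbidity-current thickness (m)
      in_flow = z <= thickness
      u_avg = np.trapz(u[in_flow], z[in_flow]) / thickness   # depth-averaged velocity

      X, Y = z / thickness, u / u_avg                  # normalized elevation and velocity
      jet = z > z[np.argmax(u)]                        # region above the velocity maximum

      model = lambda x, alpha, beta: np.exp(-alpha * x ** beta)
      (alpha, beta), _ = curve_fit(model, X[jet], Y[jet], p0=(1.0, 2.0))
      print(f"illustrative fit: alpha={alpha:.2f}, beta={beta:.2f}")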

  2. Optical based tactile shear and normal load sensor

    DOEpatents

    Salisbury, Curt Michael

    2015-06-09

    Various technologies described herein pertain to a tactile sensor that senses normal load and/or shear load. The tactile sensor includes a first layer and an optically transparent layer bonded together. At least a portion of the first layer is made of optically reflective material. The optically transparent layer is made of resilient material (e.g., clear silicone rubber). The tactile sensor includes light emitter/light detector pair(s), which respectively detect either normal load or shear load. Light emitter(s) emit light that traverses through the optically transparent layer and reflects off optically reflective material of the first layer, and light detector(s) detect and measure intensity of reflected light. When a normal load is applied, the optically transparent layer compresses, causing a change in reflected light intensity. When shear load is applied, a boundary between optically reflective material and optically absorptive material is laterally displaced, causing a change in reflected light intensity.

  3. Sandstone-filled normal faults: A case study from central California

    NASA Astrophysics Data System (ADS)

    Palladino, Giuseppe; Alsop, G. Ian; Grippa, Antonio; Zvirtes, Gustavo; Phillip, Ruy Paulo; Hurst, Andrew

    2018-05-01

    Despite the potential of sandstone-filled normal faults to significantly influence fluid transmissivity within reservoirs and the shallow crust, they have to date been largely overlooked. Fluidized sand, forcefully intruded along normal fault zones, markedly enhances the transmissivity of faults and, in general, the connectivity between otherwise unconnected reservoirs. Here, we provide a detailed outcrop description and interpretation of sandstone-filled normal faults from different stratigraphic units in central California. Such faults commonly show limited fault throw, cm to dm wide apertures, poorly-developed fault zones and full or partial sand infill. Based on these features and inferences regarding their origin, we propose a general classification that defines two main types of sandstone-filled normal faults. Type 1 form as a consequence of the hydraulic failure of the host strata above a poorly-consolidated sandstone following a significant, rapid increase of pore fluid over-pressure. Type 2 sandstone-filled normal faults form as a result of regional tectonic deformation. These structures may play a significant role in the connectivity of siliciclastic reservoirs, and may therefore be crucial not just for investigation of basin evolution but also in hydrocarbon exploration.

  4. A model of the normal and null states of pulsars

    NASA Astrophysics Data System (ADS)

    Jones, P. B.

    1981-12-01

    A solvable three-dimensional polar cap model of pair creation and charged particle acceleration has been derived. There are no free parameters of significance apart from the polar surface magnetic flux density. The parameter determining the acceleration potential difference has been obtained by calculation of elementary nuclear and electromagnetic processes. Solutions of the model exist for both normal and null states of a pulsar, and the instability in the normal state leading to the normal to null transition has been identified. The predicted necessary condition for the transition is entirely consistent with observation.

  5. A model of the normal and null states of pulsars

    NASA Astrophysics Data System (ADS)

    Jones, P. B.

    A solvable three-dimensional polar cap model of pair creation and charged particle acceleration is derived. There are no free parameters of significance apart from the polar surface magnetic flux density. The parameter determining the acceleration potential difference was obtained by calculation of elementary nuclear and electromagnetic processes. Solutions of the model exist for both normal and null states of a pulsar, and the instability in the normal state leading to the normal to null transition is identified. The predicted necessary condition for the transition is entirely consistent with observation.

  6. Repetition priming of words and nonwords in Alzheimer's disease and normal aging

    PubMed Central

    Ober, Beth A.; Shenaut, Gregory K.

    2014-01-01

    Objective This study examines the magnitude and direction of nonword and word lexical decision repetition priming effects in Alzheimer’s disease (AD) and normal aging, focusing specifically on the negative priming effect sometimes observed with repeated nonwords. Method Probable Alzheimer's disease (AD) patients (30), elderly normal controls (34), and young normal controls (49) participated in a repetition priming experiment using low-frequency words and word-like nonwords with a letter-level orthographic orienting task at study followed by a lexical decision test phase. Results Although participants' reaction times were longer in AD compared to elderly normal, and elderly normal compared to young normal, the repetition priming effect and the degree to which the repetition priming effect was reversed for nonwords compared to words was unaffected by AD or normal aging. Conclusion AD patients, like young and elderly normal participants, are able to modify (in the case of words) and create (in the case of nonwords) long-term memory traces for lexical stimuli, based on a single orthographic processing trial. The nonword repetition results are discussed from the perspective of new vocabulary learning commencing with a provisional lexical memory trace created after orthographic encoding of a novel word-like letter string. PMID:25000325

  7. A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as emissivity of the object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
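
    The specific contrast definitions developed in the paper are not reproduced here; purely as a sketch of what a normalized image-contrast computation can look like, the following Python code assumes an image sequence with a few pre-flash frames, a hypothetical defect pixel and a hypothetical sound (defect-free) region.

      # Sketch of a normalized image-contrast calculation for flash thermography.
      # Definitions are illustrative; the paper's specific normalization is not reproduced.
      import numpy as np

      def normalized_contrast(frames, defect_xy, sound_region, pre_flash=5):
          """frames: (t, y, x) image sequence; defect_xy: (row, col) of the suspect pixel;
          sound_region: boolean mask of a defect-free area; pre_flash: frames before the flash."""
          baseline = frames[:pre_flash].mean(axis=0)           # pre-flash (ambient) image
          rise = frames - baseline                              # intensity rise after the flash
          defect = rise[:, defect_xy[0], defect_xy[1]]          # defect-pixel evolution
          sound = rise[:, sound_region].mean(axis=1)            # sound-area evolution
          # Normalized contrast: difference of evolutions scaled by the sound response,
          # so factors common to both pixels (illumination, gain) largely cancel.
          return (defect - sound) / np.where(sound == 0, np.nan, sound)

      # Example with synthetic data
      rng = np.random.default_rng(0)
      frames = rng.normal(100.0, 0.5, size=(60, 32, 32))
      frames[5:] += 20.0 * np.exp(-np.arange(55) / 20.0)[:, None, None]   # post-flash decay
      mask = np.zeros((32, 32), bool)
      mask[:8, :8] = True
      contrast = normalized_contrast(frames, (16, 16), mask)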

  8. A high performance normally closed solenoid-actuated cold valve.

    PubMed

    Taminiau, I A J; Benningshof, O W B; Jochemsen, R

    2009-08-01

    An electromagnetically driven normally closed valve for liquid helium is presented, which is meant to regulate the input flow to a 1 K pot. An earlier design is modified to be normally closed (not actuated) and tuned for durability and reliability. A new feature is presented which prevents seat deformation at room temperature and provides comfort and durability for intensive use.

  9. A systematic assessment of normalization approaches for the Infinium 450K methylation platform.

    PubMed

    Wu, Michael C; Joubert, Bonnie R; Kuan, Pei-fen; Håberg, Siri E; Nystad, Wenche; Peddada, Shyamal D; London, Stephanie J

    2014-02-01

    The Illumina Infinium HumanMethylation450 BeadChip has emerged as one of the most popular platforms for genome-wide profiling of DNA methylation. While the technology is widespread, systematic technical biases are believed to be present in the data. For example, this array incorporates two different chemical assays, i.e., Type I and Type II probes, which exhibit different technical characteristics and potentially complicate the computational and statistical analysis. Several normalization methods have been introduced recently to adjust for possible biases. However, there is considerable debate within the field on which normalization procedure should be used and indeed whether normalization is even necessary. Yet despite the importance of the question, there has been little comprehensive comparison of normalization methods. We sought to systematically compare several popular normalization approaches using the Norwegian Mother and Child Cohort Study (MoBa) methylation data set and the technical replicates analyzed with it as a case study. We assessed both the reproducibility between technical replicates following normalization and the effect of normalization on association analysis. Results indicate that the raw data are already highly reproducible; some normalization approaches can slightly improve reproducibility, but others may introduce more variability into the data. Results also suggest that differences in association analysis after applying different normalizations are not large when the signal is strong, but when the signal is more modest, different normalizations can yield very different numbers of findings that meet a weaker statistical significance threshold. Overall, our work provides a useful, objective assessment of the effectiveness of key normalization methods.
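
    None of the 450K-specific normalization methods compared in the paper are reimplemented here; as a minimal illustration of what a between-sample normalization does, the Python sketch below applies generic quantile normalization to a hypothetical probes-by-samples matrix so that every sample shares the same intensity distribution.

      # Minimal quantile normalization across samples (columns): each column is forced
      # to the same reference distribution (the mean of the sorted columns).
      # Generic illustration only, not any of the specific 450K methods compared in the paper.
      import numpy as np

      def quantile_normalize(x):
          order = np.argsort(x, axis=0)                 # per-column ranks
          ref = np.sort(x, axis=0).mean(axis=1)         # reference distribution
          out = np.empty_like(x, dtype=float)
          for j in range(x.shape[1]):
              out[order[:, j], j] = ref                 # assign reference values by rank
          return out

      beta = np.random.default_rng(1).random((1000, 12))   # hypothetical probes x samples matrix
      beta_qn = quantile_normalize(beta)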

  10. Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.

    PubMed

    Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M

    2016-10-07

    Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples including both serum and plasma. Samples were randomized across 96-well sample plates, processed, and analyzed in assay plates, respectively. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
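
    The authors' implementation is the R package MDimNormn; the Python sketch below is only a rough illustration of the MA idea under simplifying assumptions: each sample is compared against the across-sample average on MA coordinates and an intensity-dependent trend is removed with a lowess fit (the input matrix and the lowess span are invented, and the paper's exact multi-MA and MA-loess procedures are not reproduced).

      # Sketch of MA-style normalization against the plate average (not the MDimNormn code).
      # M = sample log-intensity minus the across-sample mean log-intensity, A = that mean;
      # a lowess fit of M on A per sample is subtracted to remove intensity-dependent trends.
      import numpy as np
      from statsmodels.nonparametric.smoothers_lowess import lowess

      def ma_loess_normalize(x, frac=0.3):
          """x: (analytes, samples) matrix of raw intensities (hypothetical input)."""
          logx = np.log2(x)
          A = logx.mean(axis=1)                          # average log-intensity per analyte
          out = np.empty_like(logx)
          for j in range(logx.shape[1]):
              M = logx[:, j] - A                         # sample-vs-average log ratio
              trend = lowess(M, A, frac=frac, return_sorted=False)
              out[:, j] = A + (M - trend)                # remove the fitted trend, keep A
          return 2.0 ** out

      raw = np.random.default_rng(2).lognormal(5, 1, size=(300, 96))
      normalized = ma_loess_normalize(raw)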

  11. Characteristics of global organic matrix in normal and pimpled chicken eggshells.

    PubMed

    Liu, Z; Song, L; Zhang, F; He, W; Linhardt, R J

    2017-10-01

    The organic matrix from normal and pimpled calcified chicken eggshells was dissociated into acid-insoluble, water-insoluble, and facultative-soluble (both acid- and water-soluble) components, to understand the influence of shell matrix on eggshell qualities. A linear correlation was shown among these 3 matrix components in normal eggshells but was not observed in pimpled eggshells. In pimpled eggshells, the percentage contents of all 4 groups of matrix (the total matrix, acid-insoluble matrix, water-insoluble matrix, and facultative-soluble matrix) were significantly higher than in normal eggshells. The amounts of both total matrix and acid-insoluble matrix in individual pimpled calcified shells were high, even though their weight was much lower than that of a normal eggshell. In both normal and pimpled eggshells, the calcified eggshell weight and shell thickness significantly and positively correlated with the amounts of all 4 groups of matrix in an individual calcified shell. In normal eggshells, the calcified shell thickness and shell breaking strength showed no significant correlations with the percentage contents of all 4 groups of matrix. In normal eggshells, only the shell membrane weight significantly correlated with the constituent ratios of both acid-insoluble matrix and facultative-soluble matrix in the whole matrix. In pimpled eggshells, 3 variables (calcified shell weight, shell thickness, and breaking strength) were significantly correlated with the constituent proportions of both acid-insoluble matrix and facultative-soluble matrix. This study suggests that mechanical properties of normal eggshells may not depend linearly on the organic matrix content in the calcified eggshells and that pimpled eggshells might result from the disequilibrium enrichment of some proteins with negative effects. © 2017 Poultry Science Association Inc.

  12. Quaternion normalization in additive EKF for spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Bar-Itzhack, I. Y.; Deutschmann, J.; Markley, F. L.

    1991-01-01

    This work introduces, examines, and compares several quaternion normalization algorithms, which are shown to be an effective stage in the application of the additive extended Kalman filter (EKF) to spacecraft attitude determination based on vector measurements. Two new normalization schemes are introduced. They are compared with one another and with the known brute-force normalization scheme, and their efficiency is examined. Simulated satellite data are used to demonstrate the performance of all three schemes. A fourth scheme is suggested for future research. Although the schemes were tested for spacecraft attitude determination, the conclusions are general and hold for attitude determination of any three-dimensional body whenever the estimation is based on vector measurements, uses an additive EKF, and represents the attitude by a quaternion.
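
    The two new normalization schemes introduced in the paper are not reproduced here; the brute-force scheme they are compared against amounts to rescaling the updated quaternion estimate to unit norm after each measurement update, as in this minimal Python sketch (the quaternion values are made up).

      # Brute-force quaternion normalization after an additive-EKF measurement update:
      # the updated quaternion is simply rescaled to unit norm.
      import numpy as np

      def normalize_quaternion(q, eps=1e-12):
          n = np.linalg.norm(q)
          if n < eps:
              raise ValueError("degenerate quaternion")
          return q / n

      q_updated = np.array([0.71, 0.02, -0.03, 0.70])   # hypothetical post-update estimate
      q_attitude = normalize_quaternion(q_updated)       # unit quaternion used as the attitude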

  13. Patterns of pulmonary maturation in normal and abnormal pregnancy.

    PubMed

    Goldkrand, J W; Slattery, D S

    1979-03-01

    Fetal pulmonary maturation may be a variable event depending on various feto-maternal environmental and biochemical influences. The patterns of maturation were studied in 211 amniotic fluid samples from 123 patients (normal 55; diabetes 23; Rh sensitization 19; preeclampsia 26). The phenomenon of globule formation from the amniotic fluid lipid extract and its relation to pulmonary maturity was utilized for this analysis. Validation of this technique is presented. A normal curve was constructed from 22 to 42 weeks' gestation and compared to the abnormal pregnancies. Patients with class A, B, and C diabetes and Rh-sensitized pregnancies had delayed pulmonary maturation. Patients with class D diabetes and preeclampsia paralleled the normal course of maturation. A discussion of these results and their possible cause is presented.

  14. Normal-incidence quantum cascade detector coupled by nanopore structure

    NASA Astrophysics Data System (ADS)

    Liu, Jianqi; Wang, Fengjiao; Zhai, Shenqiang; Zhang, Jinchuan; Liu, Shuman; Liu, Junqi; Wang, Lijun; Liu, Fengqi; Wang, Zhanguo

    2018-04-01

    A normal-incidence quantum cascade detector coupled by a nanopore array structure (NPS) is demonstrated. The NPS is fabricated on top of an In0.53Ga0.47As contact layer by inductively coupled plasma etching using anodic aluminum oxide as a mask. Because of the nonuniform volume fraction at different areas of the device mesa, the NPS acts as subwavelength random gratings. Normal-incidence light can be scattered into random oblique directions for inter-sub-band transition absorption. With normal incidence, the responsivities of the device reach 24 mA/W at 77 K and 15.7 mA/W at 300 K, which are enhanced 2.23 and 1.96 times, respectively, compared with that of the 45°-edge device.

  15. 40 CFR 406.50 - Applicability; description of the normal rice milling subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... normal rice milling subcategory. 406.50 Section 406.50 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS GRAIN MILLS POINT SOURCE CATEGORY Normal Rice Milling Subcategory § 406.50 Applicability; description of the normal rice milling subcategory. The...

  16. An Assessment of Normalized Difference Skin Index Robustness in Aquatic Environments

    DTIC Science & Technology

    2014-03-27

    Acronyms used include NDSI (Normalized Difference Skin Index), NDVI (Normalized Difference Vegetation Index), NIR (Near-Infrared), and SAR (Search and Rescue). Vegetation and water-bearing objects with high scatter tend to have NDSI values similar to human skin, potentially causing false positives in certain environments. Thesis by Alice W. Chan, First Lieutenant, USAF (AFIT-ENG-14-M-17).

  17. Heart rate control in normal and aborted-SIDS infants.

    PubMed

    Pincus, S M; Cummins, T R; Haddad, G G

    1993-03-01

    Approximate entropy (ApEn), a mathematical formula quantifying regularity in data, was applied to heart rate data from normal and aborted-sudden infant death syndrome (SIDS) infants. We distinguished quiet from rapid-eye-movement (REM) sleep via the following three criteria, refining the notion of REM as more "variable": 1) REM sleep has greater overall variability (0.0374 +/- 0.0138 vs. 0.0205 +/- 0.0090 s, P < 0.005); 2) REM sleep is less stationary (StatAv = 0.742 +/- 0.110) than quiet sleep (StatAv = 0.599 +/- 0.159, P < 0.03); 3) after normalization to overall variability, REM sleep is more regular (ApEnsub = 1.224 +/- 0.092) than quiet sleep (ApEnsub = 1.448 +/- 0.071, P < 0.0001). Fifty percent of aborted-SIDS infants showed greater ApEn instability across quiet sleep than any normal infant exhibited, suggesting that autonomic regulation of heart rate occasionally becomes abnormal in a high-risk subject. There was an association between low ApEn values and aborted-SIDS events; 5 of 14 aborted-SIDS infants had at least one quiet sleep epoch with an ApEn value below the minimum of 45 normal-infant ApEn values.
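
    Approximate entropy is defined as ApEn(m, r) = Φ^m(r) - Φ^(m+1)(r), where Φ^m(r) averages the log of the fraction of templates of length m lying within tolerance r of each other. A minimal Python implementation for an interbeat-interval series follows; the parameter choices (m = 2, r = 0.2 × SD) are common defaults rather than necessarily those of the study, and the RR series is synthetic.

      # Minimal approximate entropy ApEn(m, r) for a 1-D series such as interbeat intervals.
      import numpy as np

      def apen(x, m=2, r_frac=0.2):
          x = np.asarray(x, dtype=float)
          r = r_frac * x.std()

          def phi(m):
              n = len(x) - m + 1
              templates = np.array([x[i:i + m] for i in range(n)])
              # Chebyshev distance between every pair of templates
              d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
              c = (d <= r).sum(axis=1) / n              # includes self-matches, as in ApEn
              return np.log(c).mean()

          return phi(m) - phi(m + 1)

      rr = np.random.default_rng(3).normal(0.6, 0.03, size=300)   # synthetic RR intervals (s)
      print(apen(rr))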

  18. A prospective study of posturography in normal older people.

    PubMed

    Baloh, R W; Corona, S; Jacobson, K M; Enrietto, J A; Bell, T

    1998-04-01

    The aim was to follow posturographic measurements over time in a group of normal older subjects to see whether sway increases with aging and whether sway is greater in those with deteriorating balance and falls. Seventy-two community-dwelling older people (age range 79-91 years), who initially had normal neurological evaluations, were followed with three yearly follow-up examinations. Outcome measures were amplitude and velocity of sway on static and dynamic posturography, Tinetti gait and balance scores, and reports of falls. Velocity of sway on dynamic tests increased significantly during the 3 years of follow-up. The percentage increase in sway was about the same in the anterior-posterior and medial-lateral directions and with eyes open and eyes closed. Subjects with low Tinetti scores had higher sway amplitude and velocity, particularly on dynamic tests, but no measure of sway was significantly different in those who reported falls compared with those who did not. Sway increases in normal subjects over time, and sway is greater in older subjects with deteriorating balance compared with those with normal balance. Sway was not greater in those who fell compared with those who did not fall, probably because falls are highly dependent on individual behavior.

  19. 75 FR 35098 - Federal Employees' Retirement System; Normal Cost Percentages

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... OFFICE OF PERSONNEL MANAGEMENT Federal Employees' Retirement System; Normal Cost Percentages...' Retirement System (FERS) Act of 1986. DATES: The revised normal cost percentages are effective at the... retirement system intended to cover most Federal employees hired after 1983. Most Federal employees hired...

  20. The Battle of "The Normal Heart."

    ERIC Educational Resources Information Center

    Rottman, Larry

    1990-01-01

    The history of the controversy over Southwest Missouri State University's production of "The Normal Heart," a play about acquired immune deficiency syndrome, is chronicled and concern is expressed about the resurgence of bitterness and hatred in the debate over academic freedom, even within the academic community. (MSE)

  1. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1996-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  2. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1996-01-09

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form. The method comprises: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. 4 figs.

  3. TaggerOne: joint named entity recognition and normalization with semi-Markov Models

    PubMed Central

    Leaman, Robert; Lu, Zhiyong

    2016-01-01

    Motivation: Text mining is increasingly used to manage the accelerating pace of the biomedical literature. Many text mining applications depend on accurate named entity recognition (NER) and normalization (grounding). While high performing machine learning methods trainable for many entity types exist for NER, normalization methods are usually specialized to a single entity type. NER and normalization systems are also typically used in a serial pipeline, causing cascading errors and limiting the ability of the NER system to directly exploit the lexical information provided by the normalization. Methods: We propose the first machine learning model for joint NER and normalization during both training and prediction. The model is trainable for arbitrary entity types and consists of a semi-Markov structured linear classifier, with a rich feature approach for NER and supervised semantic indexing for normalization. We also introduce TaggerOne, a Java implementation of our model as a general toolkit for joint NER and normalization. TaggerOne is not specific to any entity type, requiring only annotated training data and a corresponding lexicon, and has been optimized for high throughput. Results: We validated TaggerOne with multiple gold-standard corpora containing both mention- and concept-level annotations. Benchmarking results show that TaggerOne achieves high performance on diseases (NCBI Disease corpus, NER f-score: 0.829, normalization f-score: 0.807) and chemicals (BioCreative 5 CDR corpus, NER f-score: 0.914, normalization f-score 0.895). These results compare favorably to the previous state of the art, notwithstanding the greater flexibility of the model. We conclude that jointly modeling NER and normalization greatly improves performance. Availability and Implementation: The TaggerOne source code and an online demonstration are available at: http://www.ncbi.nlm.nih.gov/bionlp/taggerone Contact: zhiyong.lu@nih.gov Supplementary information: Supplementary data are

  4. Usefulness of Maintaining a Normal Electrocardiogram Over Time for Predicting Cardiovascular Health.

    PubMed

    Soliman, Elsayed Z; Zhang, Zhu-Ming; Chen, Lin Y; Tereshchenko, Larisa G; Arking, Dan; Alonso, Alvaro

    2017-01-15

    We hypothesized that maintaining a normal electrocardiogram (ECG) status over time is associated with low cardiovascular (CV) disease in a dose-response fashion and subsequently could be used to monitor programs aimed at promoting CV health. This analysis included 4,856 CV disease-free participants from the Atherosclerosis Risk in Communities study who had a normal ECG at baseline (1987 to 1989) and complete electrocardiographic data in subsequent 3 visits (1990 to 1992, 1993 to 1995, and 1996 to 1998). Participants were classified based on maintaining their normal ECG status during these 4 visits into "maintained," "not maintained," or "inconsistent" normal ECG status as defined by the Minnesota ECG classification. CV disease events (coronary heart disease, heart failure, and stroke) were adjudicated from Atherosclerosis Risk in Communities visit-4 through 2010. Over a median follow-up of 13.2 years, 885 CV disease events occurred. The incidence rate of CV disease events was lowest among study participants who maintained a normal ECG status, followed by those with an inconsistent pattern, and then those who did not maintain their normal ECG status (trend p value <0.001). Similarly, the greater the number of visits with a normal ECG status, the lower was the incidence rate of CV disease events (trend p value <0.001). Maintaining (vs not maintaining) a normal ECG status was associated with a lower risk of CV disease, which was lower than that observed in those with inconsistent normal ECG pattern (trend p value <0.01). In conclusion, maintaining a normal ECG status over time is associated with low risk of CV disease in a dose-response fashion, suggesting its potential use as a monitoring tool for programs promoting CV health. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Cytologic features of the normal pineal gland of adults.

    PubMed

    Jiménez-Heffernan, José A; Bárcena, Carmen; Agra, Carolina; Asunción, Alfonso

    2015-08-01

    It is well known that the histology of normal pineal gland may resemble not only pineal tumors but also gliomas, owing to its cellularity which is much greater than that of normal white or gray matter. Our recent experience with a case in which part of a normal gland was submitted for intraoperative consultation, together with the scarcity of cytologic descriptions, led us to perform a cyto-histologic correlation study. In addition to the intraoperative case, we collected five pineal glands from consecutive adult autopsies. During the squash procedure, we often noted the presence of calcified grains. Smears were hypercellular, distributed in tissue fibrillary fragments and as numerous single cells, with crystalline structures. Pineal gland cells (pineocytes) were large, round, epithelioid with ill-defined cytoplasms and moderate nuclear pleomorphism. Spindle cells with greater fibrillary quality were less common. One of the most remarkable findings seen in all cases was the presence of cytoplasmic pigment. Histological evaluation and immunohistochemical staining confirmed that the tissue was normal pineal gland. The histology showed a characteristic lobular aspect and frequent corpora arenacea. The pigment seen cytologically was also encountered in histology and corresponded to lipofuscin. Cytologic features of the pineal gland are peculiar when compared to other normal structures of the central nervous system. These features correlate closely with what is seen on histology. In an adequate clinical context, and in combination with frozen sections, cytology allows a specific recognition of the pineal gland during intraoperative pathologic consultations. © 2015 Wiley Periodicals, Inc.

  6. Vascular corrosion casting of normal and pre-eclamptic placentas.

    PubMed

    Yin, Geping; Chen, Ming; Li, Juan; Zhao, Xiaoli; Yang, Shujun; Li, Xiuyun; Yuan, Zheng; Wu, Aifang

    2017-12-01

    Pre-eclampsia is an important cause of maternal and fetal morbidity and mortality that is associated with decreased placental perfusion. In the present study, vascular corrosion casting was used to investigate the differences in structural changes of the fetoplacental vasculature between normal and pre-eclamptic placentas. An improved epoxy resin vascular casting technique was used in the present study. Casting media were infused into 40 normal and 40 pre-eclamptic placentas through umbilical arteries and veins in order to construct three-dimensional fetoplacental vasculatures. The number of branches, diameter, morphology and peripheral artery-to-vein ratio were measured for each specimen. The results indicated that the venous system of normal placentas was divided into 5-7 grades of branches and the volume of the vascular bed was 155.5±45.3 ml. In severe pre-eclamptic placentas, the volume was 106.4±36.1 ml, which was significantly lower compared with normal placentas (P<0.01). The venous system of pre-eclamptic placentas was divided into 4-5 grades of branches, which was much more sparse compared with normal placentas. In addition, the diameters of grade 1-3 veins and grade 2-3 arteries were significantly smaller in severe pre-eclampsia (P<0.05). In conclusion, pre-eclamptic placentas displayed a decreased volume of vascular bed, smaller diameters of grade 1-3 veins and grade 2-3 arteries, and an increased peripheral artery-to-vein ratio, which may be a cause of the placental dysfunction during severe pre-eclampsia.

  7. A Normalization Framework for Emotional Attention

    PubMed Central

    Zhang, Xilin; Japee, Shruti; Safiullah, Zaid; Ungerleider, Leslie G.

    2016-01-01

    The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects’ attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention. PMID:27870851

  8. A Normalization Framework for Emotional Attention.

    PubMed

    Zhang, Xilin; Japee, Shruti; Safiullah, Zaid; Mlynaryk, Nicole; Ungerleider, Leslie G

    2016-11-01

    The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects' attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention.
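
    The normalization model of attention referred to in both records divides an attended stimulus drive by a pooled suppressive drive. The one-dimensional Python sketch below, with invented Gaussian widths, illustrates how a narrow attention field (relative to the stimulus) tends to scale responses (response gain) while a broad one shifts the contrast response (contrast gain); it is a schematic reading of the model, not the authors' code.

      # 1-D sketch of the normalization model of attention:
      # R = (stimulus drive * attention field) / (sigma + pooled suppression).
      # All Gaussian widths and the semi-saturation constant are made up.
      import numpy as np

      def gaussian(x, mu, sigma):
          return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

      def population_response(contrast, attn_width, x=np.linspace(-30, 30, 601),
                              stim_width=3.0, pool_width=10.0, sigma=0.1):
          drive = contrast * gaussian(x, 0.0, stim_width)        # stimulus drive
          attention = 1.0 + gaussian(x, 0.0, attn_width)         # attention field at the stimulus
          excitatory = drive * attention
          kernel = gaussian(x, 0.0, pool_width)                  # suppressive pooling over space
          suppressive = np.convolve(excitatory, kernel / kernel.sum(), mode="same")
          return excitatory / (sigma + suppressive)

      for c in (0.05, 0.2, 0.8):
          narrow = population_response(c, attn_width=2.0).max()
          broad = population_response(c, attn_width=20.0).max()
          print(f"contrast {c:.2f}: narrow-attention peak {narrow:.2f}, broad {broad:.2f}")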

  9. [Surgical treatment of normal pressure hydrocephalus].

    PubMed

    Svendsen, F; Hugdahl, K; Wester, K

    2001-05-30

    Normal pressure hydrocephalus (NPH) is an important diagnosis to keep in mind, since NPH rather than a neurodegenerative disease may be the cause of a patient's ataxia, urinary incontinence and dementia. Clinical improvement, sometimes a complete reversal of the symptoms, may be seen after a simple surgical procedure. In this prospective study, eight consecutively shunted patients were tested with a walking test and a cognitive test battery pre- and postoperatively. Improvement 3-4 months after the shunting procedure suggests that NPH was present in six of the eight patients. Walking ability improved after surgery, also in patients with severe dementia. Severe dementia caused by NPH is hardly reversible, though cognitive tests may indicate some improvement. However, early surgical treatment of NPH in patients not classified as demented on the Mini Mental Status Test may bring improvement in some cognitive functions. Patients with both clinical and radiological signs of normal pressure hydrocephalus should be offered a shunting procedure.

  10. ProNormz--an integrated approach for human proteins and protein kinases normalization.

    PubMed

    Subramani, Suresh; Raja, Kalpana; Natarajan, Jeyakumar

    2014-02-01

    Recognizing and normalizing protein name mentions in biomedical literature is a challenging task and important for text mining applications such as protein-protein interaction extraction, pathway reconstruction and many more. In this paper, we present ProNormz, an integrated approach for human protein (HP) tagging and normalization. In Homo sapiens, a large number of biological processes are regulated by post-translational phosphorylation carried out by a large human gene family called protein kinases. Recognition and normalization of human protein kinases (HPKs) is therefore considered important for extracting the underlying information on their regulatory mechanisms from biomedical literature. ProNormz distinguishes HPKs from other HPs in addition to tagging and normalization. To our knowledge, ProNormz is the first normalization system available to distinguish HPKs from other HPs in addition to the gene normalization task. ProNormz incorporates a specialized synonym dictionary for human proteins and protein kinases, a set of 15 string matching rules, and a disambiguation module to achieve the normalization. Experimental results on the benchmark BioCreative II training and test datasets show that our integrated approach achieves a fairly good performance and outperforms more sophisticated semantic similarity and disambiguation systems presented in the BioCreative II GN task. As a freely available web tool, ProNormz is useful to developers as an extensible gene normalization implementation, to researchers as a standard for comparing their innovative techniques, and to biologists for normalization and categorization of HP and HPK mentions in biomedical literature. URL: http://www.biominingbu.org/pronormz. Copyright © 2013 Elsevier Inc. All rights reserved.
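
    ProNormz's synonym dictionary and its 15 string-matching rules are not reproduced here; the Python sketch below only illustrates the general dictionary-plus-string-matching pattern such normalizers rely on, with invented dictionary entries and identifiers.

      # Sketch of dictionary-based protein-name normalization (not the ProNormz rules/dictionary).
      # Mentions are matched exactly, then with light string relaxation (case, hyphens, spaces).
      import re

      synonyms = {                      # invented example entries: surface form -> identifier
          "p38 mapk": "HPK:MAPK14",
          "mitogen-activated protein kinase 14": "HPK:MAPK14",
          "tp53": "HP:TP53",
          "tumor protein p53": "HP:TP53",
      }

      def simplify(text):
          text = text.lower()
          text = re.sub(r"[-_/]", " ", text)            # treat hyphens/underscores as spaces
          return re.sub(r"\s+", " ", text).strip()

      relaxed = {simplify(k): v for k, v in synonyms.items()}

      def normalize(mention):
          key = mention.lower().strip()
          if key in synonyms:                            # step 1: exact (case-insensitive) match
              return synonyms[key]
          return relaxed.get(simplify(mention))          # step 2: relaxed string match, else None

      print(normalize("P38-MAPK"))                       # -> HPK:MAPK14
      print(normalize("unknown protein"))                # -> None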

  11. Attention-related changes in correlated neuronal activity arise from normalization mechanisms

    PubMed Central

    Verhoef, Bram-Ernst; Maunsell, John H.R.

    2017-01-01

    Attention is believed to enhance perception by altering the correlations between pairs of neurons. How attention changes neuronal correlations is unknown. Using multi-electrodes in primate visual cortex, we measured spike-count correlations when single or multiple stimuli were presented, and stimuli were attended or unattended. When stimuli were unattended, adding a suppressive, non-preferred, stimulus beside a preferred stimulus increased spike-count correlations between pairs of similarly-tuned neurons, but decreased spike-count correlations between pairs of oppositely-tuned neurons. These changes are explained by a stochastic normalization model containing populations of oppositely-tuned, mutually-suppressive neurons. Importantly, this model also explains why attention decreased (attend preferred stimulus) or increased (attend non-preferred stimulus) correlations: as an indirect consequence of attention-related changes in the inputs to normalization mechanisms. Our findings link normalization mechanisms to correlated neuronal activity and attention, showing that normalization mechanisms shape response correlations and that these correlations change when attention biases normalization mechanisms. PMID:28553943

  12. Magnetic measurements on human erythrocytes: Normal, beta thalassemia major, and sickle

    NASA Astrophysics Data System (ADS)

    Sakhnini, Lama

    2003-05-01

    In this article magnetic measurements were made on human erythrocytes at different hemoglobin states (normal and reduced hemoglobin). Different blood samples: normal, beta thalassemia major, and sickle were studied. Beta thalassemia major and sickle samples were taken from patients receiving lifelong blood transfusion treatment. All samples examined exhibited diamagnetic behavior. Beta thalassemia major and sickle samples showed higher diamagnetic susceptibilities than that for the normal, which was attributed to the increase of membrane to hemoglobin volume ratio of the abnormal cells. Magnetic measurements showed that the erythrocytes in the reduced state showed less diamagnetic response in comparison with erythrocytes in the normal state. Analysis of the paramagnetic component of magnetization curves gave an effective magnetic moment of μeff=7.6 μB per reduced hemoglobin molecule. The same procedure was applied to sickle and beta thalassemia major samples and values for μeff were found to be comparable to that of the normal erythrocytes.

  13. Primary Sarcopenia in Older People with Normal Nutrition.

    PubMed

    Yadigar, S; Yavuzer, H; Yavuzer, S; Cengiz, M; Yürüyen, M; Döventaş, A; Erdinçler, D S

    2016-03-01

    The aim of this study was to investigate the presence of primary sarcopenia in older patients with normal nutrition and to assess the relationships between primary sarcopenia and anthropometric measurements. In this prospective, cross-sectional clinical study, six hundred patients who presented to the Polyclinic of Geriatrics between 2010 and 2011 were evaluated. The 386 patients considered likely to have secondary sarcopenia were excluded from the study. Age, gender, weight, height, BMI, calf and waist circumference, ongoing medications, and additional diseases of the 214 patients included in the study were recorded. The EWGSOP sarcopenia criteria were applied. The 214 cases included in the study comprised 148 female and 66 male subjects. Mean age was 71.8 ± 2.1 years. Sarcopenia was detected in 105 (49%) subjects while 109 (51%) were normal. Sixty-four female (61%) and 41 (39%) male subjects were sarcopenic. The normal group included 84 female (77%) and 25 male (23%) subjects. The incidence of sarcopenia was higher in female patients (p<0.001). No statistically significant difference was detected between the sarcopenic and normal groups with respect to age, height, weight, calf circumference and evaluation tests. Waist circumference was higher in the sarcopenic group than in the normal group (p=0.02). When both groups were analyzed for BMI, 53 (51%) of the 105 sarcopenic patients had a BMI over 30 kg/m2, while 29 (27%) and 23 (22%) patients had a BMI of 25-30 kg/m2 and below 25 kg/m2, respectively. The incidence of sarcopenia was significantly higher in the group with BMI over 30 kg/m2 than in the groups with BMI of 25-30 kg/m2 and below 25 kg/m2 (p=0.01). Because sarcopenia makes older people physically dependent and decreases their quality of life, older people who receive sufficient nutritional support, and obese older people in particular, should be comprehensively investigated for the presence of sarcopenia.

  14. Mean Posterior Corneal Power and Astigmatism in Normal Versus Keratoconic Eyes.

    PubMed

    Feizi, Sepehr; Delfazayebaher, Siamak; Javadi, Mohammad Ali; Karimian, Farid; Ownagh, Vahid; Sadeghpour, Fatemeh

    2018-01-01

    To compare mean posterior corneal power and astigmatism in normal versus keratoconus-affected eyes and to determine the optimal cut-off points to maximize sensitivity and specificity in discriminating keratoconus from normal corneas. A total of 204 normal eyes and 142 keratoconus-affected eyes were enrolled in this prospective comparative study. Mean posterior corneal power and astigmatism were measured using a dual Scheimpflug camera. Correlation coefficients were calculated to assess the relationship between the magnitudes of keratometric and posterior corneal astigmatism in the study groups. Receiver operating characteristic curves were used to compare the sensitivity and specificity of the measured parameters and to identify the optimal cut-off points for discriminating keratoconus from normal corneas. The mean posterior corneal power was -6.29 ± 0.20 D in the normal group and -7.77 ± 0.87 D in the keratoconus group (P < 0.001). The mean magnitudes of the posterior corneal astigmatisms were -0.32 ± 0.15 D and -0.94 ± 0.39 D in the normal and keratoconus groups, respectively (P < 0.001). Significant correlations were found between the magnitudes of keratometric and posterior corneal astigmatism in the normal (r=-0.76, P < 0.001) and keratoconus (r=-0.72, P < 0.001) groups. Mean posterior corneal power and astigmatism were highly reliable characteristics that distinguished keratoconus from normal corneas (area under the curve, 0.99 and 0.95, respectively). The optimal cut-off points of mean posterior corneal power and astigmatism were -6.70 D and -0.54 D, respectively. Mean posterior corneal power and astigmatism measured using a Galilei analyzer might have potential in diagnosing keratoconus. The cut-off points provided can be used for keratoconus screening.
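
    The patient data behind these cut-off points are not available here; purely as an illustration of how such thresholds are typically derived, the Python sketch below builds an ROC curve on synthetic posterior-power values (drawn from the group means and SDs quoted above) and selects the threshold that maximizes Youden's J. It assumes scikit-learn is available.

      # Sketch: choosing a diagnostic cut-off from an ROC curve via Youden's J.
      # Values are synthetic stand-ins for mean posterior corneal power (D);
      # the paper's actual data and cut-offs are not reproduced.
      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(4)
      normal = rng.normal(-6.29, 0.20, 204)          # synthetic normal eyes
      keratoconus = rng.normal(-7.77, 0.87, 142)     # synthetic keratoconic eyes

      values = np.concatenate([normal, keratoconus])
      labels = np.concatenate([np.zeros(204), np.ones(142)])

      # More negative posterior power indicates keratoconus, so use score = -value
      fpr, tpr, thresholds = roc_curve(labels, -values)
      best = np.argmax(tpr - fpr)                    # Youden's J = sensitivity + specificity - 1
      print("AUC:", roc_auc_score(labels, -values))
      print("optimal cut-off (D):", -thresholds[best])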

  15. 46 CFR 112.30-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having an Automatically Connected Storage Battery as the Sole Emergency Power Source § 112.30-3 Normal source for emergency loads. (a) The normal source...

  16. 46 CFR 112.30-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having an Automatically Connected Storage Battery as the Sole Emergency Power Source § 112.30-3 Normal source for emergency loads. (a) The normal source...

  17. 46 CFR 112.30-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having an Automatically Connected Storage Battery as the Sole Emergency Power Source § 112.30-3 Normal source for emergency loads. (a) The normal source...

  18. 46 CFR 112.30-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having an Automatically Connected Storage Battery as the Sole Emergency Power Source § 112.30-3 Normal source for emergency loads. (a) The normal source...

  19. 46 CFR 112.30-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having an Automatically Connected Storage Battery as the Sole Emergency Power Source § 112.30-3 Normal source for emergency loads. (a) The normal source...

  20. 76 FR 32242 - Federal Employees' Retirement System; Normal Cost Percentages

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... OFFICE OF PERSONNEL MANAGEMENT Federal Employees' Retirement System; Normal Cost Percentages...' Retirement System (FERS) Act of 1986. DATES: The revised normal cost percentages are effective at the..., Public Law 99-335, created a new retirement system intended to cover most Federal employees hired after...

  1. Physical activity patterns in morbidly obese and normal-weight women.

    PubMed

    Kwon, Soyang; Mohammad, Jamal; Samuel, Isaac

    2011-01-01

    To compare physical activity patterns between morbidly obese and normal-weight women. Daily physical activity of 18 morbidly obese and 7 normal-weight women aged 30-58 years was measured for 2 days using the Intelligent Device for Energy Expenditure and Activity (IDEEA) device. The obese group spent about 2 hr/day less standing and 30 min/day less walking than did the normal-weight group. Time spent standing (standing time) was positively associated with time spent walking (walking time). Age- and walking time-adjusted standing time did not differ according to weight status. Promoting standing may be a strategy to increase walking.

  2. Food shopping and weight concern. Balancing consumer and body normality.

    PubMed

    Nielsen, Annemette; Holm, Lotte

    2014-11-01

    The desire to achieve a normal, culturally acceptable body is often seen as the main driver of food-consumption practices adopted by individuals who are concerned about their body weight. In social research into weight management self-control is therefore often a central theme. Turning the focus towards practices and values related to food shopping, this study adds to our understanding of central features in perceptions of normality among people with weight concerns. In a qualitative study 25 people who participated in a dietary intervention trial in Denmark were interviewed and five people were observed. The study shows that the aim of achieving a normal body does not eclipse the importance of enacting values linked to ideas of the 'normal consumer'. Using empirical examples, the study illuminates how consumer freedom is attained in ways that are both complementary to, and in conflict with, practices and experiences of controlling food intake. The paper suggests that freedom and control are composite and complementary ideals of normality for people with weight concerns. On the basis of this insight, the authors discuss the contribution the paper makes to existing studies of weight management and food consumption. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Broad Ligament Haematoma Following Normal Vaginal Delivery.

    PubMed

    Ibrar, Faiza; Awan, Azra Saeed; Fatima, Touseef; Tabassum, Hina

    2017-01-01

    A 37-year-old patient presented to the emergency department with a history of normal vaginal delivery followed by the development of abdominal distention, vomiting and constipation over the preceding 3 days. She was para 4 and had had a normal vaginal delivery attended by a traditional birth attendant at a peripheral hospital 3 days earlier. Imaging revealed a heterogeneous complex mass, ascites, pleural effusion, and air-fluid levels with dilated gut loops. Based on pelvic examination by a senior gynaecologist in combination with ultrasound, a clinical diagnosis of broad ligament haematoma was made. However, vomiting and abdominal distention raised suspicion of intestinal obstruction. Due to worsening abdominal distention, an exploratory laparotomy was carried out; a colonic pseudo-obstruction was found and a caecostomy was performed. Timely intervention with a multidisciplinary approach saved the patient's life with minimal morbidity.

  4. Achondroplasia in sibs of normal parents.

    PubMed Central

    Philip, N; Auger, M; Mattei, J F; Giraud, F

    1988-01-01

    A new case of recurrent achondroplasia in sibs of normal parents is reported. Two sisters and a half sister were affected. Various mechanisms can be postulated to account for unexpected recurrence of achondroplasia in the same sibship. Germinal mosaicism and unstable premutation are discussed here. PMID:3236371

  5. Speech Rate Normalization and Phonemic Boundary Perception in Cochlear-Implant Users.

    PubMed

    Jaekel, Brittany N; Newman, Rochelle S; Goupell, Matthew J

    2017-05-24

    Normal-hearing (NH) listeners rate normalize, temporarily remapping phonemic category boundaries to account for a talker's speech rate. It is unknown if adults who use auditory prostheses called cochlear implants (CI) can rate normalize, as CIs transmit degraded speech signals to the auditory nerve. Ineffective adjustment to rate information could explain some of the variability in this population's speech perception outcomes. Phonemes with manipulated voice-onset-time (VOT) durations were embedded in sentences with different speech rates. Twenty-three CI and 29 NH participants performed a phoneme identification task. NH participants heard the same unprocessed stimuli as the CI participants or stimuli degraded by a sine vocoder, simulating aspects of CI processing. CI participants showed larger rate normalization effects (6.6 ms) than the NH participants (3.7 ms) and had shallower (less reliable) category boundary slopes. NH participants showed similarly shallow slopes when presented acoustically degraded vocoded signals, but an equal or smaller rate effect in response to reductions in available spectral and temporal information. CI participants can rate normalize, despite their degraded speech input, and show a larger rate effect compared to NH participants. CI participants may particularly rely on rate normalization to better maintain perceptual constancy of the speech signal.

  6. Speech Rate Normalization and Phonemic Boundary Perception in Cochlear-Implant Users

    PubMed Central

    Newman, Rochelle S.; Goupell, Matthew J.

    2017-01-01

    Purpose Normal-hearing (NH) listeners rate normalize, temporarily remapping phonemic category boundaries to account for a talker's speech rate. It is unknown if adults who use auditory prostheses called cochlear implants (CI) can rate normalize, as CIs transmit degraded speech signals to the auditory nerve. Ineffective adjustment to rate information could explain some of the variability in this population's speech perception outcomes. Method Phonemes with manipulated voice-onset-time (VOT) durations were embedded in sentences with different speech rates. Twenty-three CI and 29 NH participants performed a phoneme identification task. NH participants heard the same unprocessed stimuli as the CI participants or stimuli degraded by a sine vocoder, simulating aspects of CI processing. Results CI participants showed larger rate normalization effects (6.6 ms) than the NH participants (3.7 ms) and had shallower (less reliable) category boundary slopes. NH participants showed similarly shallow slopes when presented acoustically degraded vocoded signals, but an equal or smaller rate effect in response to reductions in available spectral and temporal information. Conclusion CI participants can rate normalize, despite their degraded speech input, and show a larger rate effect compared to NH participants. CI participants may particularly rely on rate normalization to better maintain perceptual constancy of the speech signal. PMID:28395319

  7. Compton profiles of some composite materials normalized by a new method

    NASA Astrophysics Data System (ADS)

    Sankarshan, B. M.; Umesh, T. K.

    2018-03-01

    Recently, we have shown that as a novel approach, in the case of samples which can be treated as pure incoherent scatterers, the effective atomic number Zeff itself could be conveniently used to normalize their un-normalized Compton profiles. In the present investigation, we have attempted to examine the efficacy of this approach. For this purpose, we have first determined the single differential Compton scattering cross sections (SDCS) of the elements C and Al as well as of some H, C, N and O based polymer samples such as bakelite, epoxy, nylon and teflon which are pure incoherent scatterers. The measurements were made at 120° in a goniometer assembly that employs a high resolution high purity germanium detector. The SDCS values were used to obtain the Zeff and the un-normalized Compton profiles. These Compton profiles were separately normalized with their Zeff values (for Compton scattering) as well as with the normalization constant obtained by integrating their Hartree-Fock Biggs et al Compton profiles based on the mixture rule. These two sets of values agreed well within the range of experimental errors, implying that Zeff can be conveniently used to normalize the experimental Compton profiles of pure incoherent scatterers.
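
    As a minimal numerical sketch of the normalization step described above (the function and variable names are illustrative assumptions, not the authors' code), the measured profile is simply rescaled so that its area equals the chosen constant, whether that constant is Zeff or the integral of a tabulated free-atom profile obtained via the mixture rule:

    ```python
    import numpy as np

    def normalize_compton_profile(pz, j_raw, norm_constant):
        """Rescale an un-normalized Compton profile so that its area equals the
        chosen normalization constant (e.g. Z_eff, or the integral of a tabulated
        Hartree-Fock profile combined by the mixture rule)."""
        area = np.trapz(j_raw, pz)             # area under the raw experimental profile
        return j_raw * (norm_constant / area)
    ```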

  8. Dynamic Divisive Normalization Predicts Time-Varying Value Coding in Decision-Related Circuits

    PubMed Central

    LoFaro, Thomas; Webb, Ryan; Glimcher, Paul W.

    2014-01-01

    Normalization is a widespread neural computation, mediating divisive gain control in sensory processing and implementing a context-dependent value code in decision-related frontal and parietal cortices. Although decision-making is a dynamic process with complex temporal characteristics, most models of normalization are time-independent and little is known about the dynamic interaction of normalization and choice. Here, we show that a simple differential equation model of normalization explains the characteristic phasic-sustained pattern of cortical decision activity and predicts specific normalization dynamics: value coding during initial transients, time-varying value modulation, and delayed onset of contextual information. Empirically, we observe these predicted dynamics in saccade-related neurons in monkey lateral intraparietal cortex. Furthermore, such models naturally incorporate a time-weighted average of past activity, implementing an intrinsic reference-dependence in value coding. These results suggest that a single network mechanism can explain both transient and sustained decision activity, emphasizing the importance of a dynamic view of normalization in neural coding. PMID:25429145
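
    The abstract does not give the authors' equations, but a generic time-dependent divisive normalization of the kind it describes can be sketched as a pair of coupled differential equations, with each unit's rate relaxing toward its input divided by a gain that tracks summed network activity. All parameter names and time constants below are illustrative assumptions, not the published model:

    ```python
    import numpy as np

    def dynamic_divisive_normalization(values, sigma=1.0, tau_r=0.05, tau_g=0.5,
                                       dt=0.001, t_end=2.0):
        """Illustrative dynamics: each rate R_i relaxes toward V_i / (sigma + G),
        while the shared gain G relaxes toward the summed activity.  The early
        transient carries value information before the sustained, context-normalized
        level is reached."""
        v = np.asarray(values, dtype=float)
        r = np.zeros_like(v)                  # unit firing rates
        g = 0.0                               # shared normalization signal
        history = []
        for _ in range(int(t_end / dt)):
            r = r + dt * (-r + v / (sigma + g)) / tau_r
            g = g + dt * (-g + r.sum()) / tau_g
            history.append(r.copy())
        return np.array(history)
    ```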

  9. Normalizing Stigmatized Practices: Achieving Co-membership by "Doing Being Ordinary".

    ERIC Educational Resources Information Center

    Lawrence, Samuel G.

    1996-01-01

    Discusses the effect of the interactive accomplishment of conversational normalization. To illuminate this process, this article investigates how the parties to a news interview collaborate to normalize the interviewee's practices in operating a house of prostitution. The methodological impetus for this study involves a variant of conversation…

  10. 28 CFR 42.712 - Exception; normal operation or statutory objective.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Exception; normal operation or statutory objective. 42.712 Section 42.712 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL... Discrimination § 42.712 Exception; normal operation or statutory objective. (a) A recipient may take an action...

  11. Normalization and Implementation of Three Gravitational Acceleration Models

    NASA Technical Reports Server (NTRS)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.; Gottlieb, Robert G.

    2016-01-01

    Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the asphericity of their generating central bodies. The gravitational potential of an aspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities that must be removed to generalize the method and solve for any possible orbit, including polar orbits. Samuel Pines, Bill Lear, and Robert Gottlieb developed three unique algorithms to eliminate these singularities. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear and Gottlieb algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and Associated Legendre Functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
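
    For concreteness, the normalization parameter most commonly attached to Associated Legendre Functions in spherical-harmonic gravity models (the "full" normalization of geodesy) can be written as below. The paper derives such parameters for each specific algorithm, so this is only the standard textbook form, not necessarily the exact convention used there:

    ```python
    from math import factorial, sqrt

    def full_normalization(n, m):
        """Standard 'fully normalized' ALF factor: P_bar_nm = N_nm * P_nm, with
        N_nm = sqrt((2 - delta_0m) * (2n + 1) * (n - m)! / (n + m)!)."""
        delta_0m = 1 if m == 0 else 0
        return sqrt((2 - delta_0m) * (2 * n + 1) * factorial(n - m) / factorial(n + m))
    ```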

  12. Time-invariant component-based normalization for a simultaneous PET-MR scanner.

    PubMed

    Belzunce, M A; Reader, A J

    2016-05-07

    Component-based normalization is a method used to compensate for the sensitivity of each of the lines of response acquired in positron emission tomography. This method consists of modelling the sensitivity of each line of response as a product of multiple factors, which can be classified as time-invariant, time-variant and acquisition-dependent components. Typical time-variant factors are the intrinsic crystal efficiencies, which are needed to be updated by a regular normalization scan. Failure to do so would in principle generate artifacts in the reconstructed images due to the use of out of date time-variant factors. For this reason, an assessment of the variability and the impact of the crystal efficiencies in the reconstructed images is important to determine the frequency needed for the normalization scans, as well as to estimate the error obtained when an inappropriate normalization is used. Furthermore, if the fluctuations of these components are low enough, they could be neglected and nearly artifact-free reconstructions become achievable without performing a regular normalization scan. In this work, we analyse the impact of the time-variant factors in the component-based normalization used in the Biograph mMR scanner, but the work is applicable to other PET scanners. These factors are the intrinsic crystal efficiencies and the axial factors. For the latter, we propose a new method to obtain fixed axial factors that was validated with simulated data. Regarding the crystal efficiencies, we assessed their fluctuations during a period of 230 d and we found that they had good stability and low dispersion. We studied the impact of not including the intrinsic crystal efficiencies in the normalization when reconstructing simulated and real data. Based on this assessment and using the fixed axial factors, we propose the use of a time-invariant normalization that is able to achieve comparable results to the standard, daily updated, normalization factors used in this
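
    A minimal sketch of the component-based factorization described above; the array names, shapes, and the particular set of components are assumptions for illustration, and a real scanner model includes additional geometric and dead-time terms:

    ```python
    import numpy as np

    def lor_normalization(crystal_eff, geometric, axial):
        """Model each line-of-response (LOR) sensitivity as a product of components
        (intrinsic crystal efficiencies, geometric factors, axial factors) and return
        the multiplicative normalization factors, i.e. the reciprocal sensitivities."""
        eps_pairs = np.outer(crystal_eff, crystal_eff)   # efficiency of each crystal pair
        sensitivity = eps_pairs * geometric * axial
        return 1.0 / np.clip(sensitivity, 1e-12, None)   # applied to the measured LOR data
    ```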

  13. Time-invariant component-based normalization for a simultaneous PET-MR scanner

    NASA Astrophysics Data System (ADS)

    Belzunce, M. A.; Reader, A. J.

    2016-05-01

    Component-based normalization is a method used to compensate for the sensitivity of each of the lines of response acquired in positron emission tomography. This method consists of modelling the sensitivity of each line of response as a product of multiple factors, which can be classified as time-invariant, time-variant and acquisition-dependent components. Typical time-variant factors are the intrinsic crystal efficiencies, which are needed to be updated by a regular normalization scan. Failure to do so would in principle generate artifacts in the reconstructed images due to the use of out of date time-variant factors. For this reason, an assessment of the variability and the impact of the crystal efficiencies in the reconstructed images is important to determine the frequency needed for the normalization scans, as well as to estimate the error obtained when an inappropriate normalization is used. Furthermore, if the fluctuations of these components are low enough, they could be neglected and nearly artifact-free reconstructions become achievable without performing a regular normalization scan. In this work, we analyse the impact of the time-variant factors in the component-based normalization used in the Biograph mMR scanner, but the work is applicable to other PET scanners. These factors are the intrinsic crystal efficiencies and the axial factors. For the latter, we propose a new method to obtain fixed axial factors that was validated with simulated data. Regarding the crystal efficiencies, we assessed their fluctuations during a period of 230 d and we found that they had good stability and low dispersion. We studied the impact of not including the intrinsic crystal efficiencies in the normalization when reconstructing simulated and real data. Based on this assessment and using the fixed axial factors, we propose the use of a time-invariant normalization that is able to achieve comparable results to the standard, daily updated, normalization factors used in this

  14. Relationship between hamstring length and gluteus maximus strength with and without normalization.

    PubMed

    Lee, Dong-Kyu; Oh, Jae-Seop

    2018-01-01

    [Purpose] This study assessed the relationship between hamstring length and gluteus maximus (GM) strength with and without normalization by body weight and height. [Subjects and Methods] In total, 34 healthy male subjects volunteered for this study. To measure GM strength, subjects performed maximal hip joint extension with the knee joints flexed to 90° in the prone position. GM strength was normalized for body weight and height. [Results] GM strength with normalization was positively correlated with hamstring length, whereas GM strength without normalization was negatively correlated with hamstring length. [Conclusion] The normalization of GM strength by body weight and height has the potential to lead to more appropriate conclusions and interpretations about its correlation with hamstring length. Hamstring length may be related to GM strength.
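
    The normalization itself is straightforward; a sketch under the assumption that strength is divided by the product of body weight and height (the abstract does not state the exact expression used in the study):

    ```python
    def normalize_gm_strength(strength, body_weight_kg, height_m):
        """Normalize a gluteus maximus strength measurement by body weight and height;
        the exact normalization expression used in the study may differ."""
        return strength / (body_weight_kg * height_m)
    ```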

  15. Stability of strongly nonlinear normal modes

    NASA Astrophysics Data System (ADS)

    Recktenwald, Geoffrey; Rand, Richard

    2007-10-01

    It is shown that a transformation of time can allow the periodic solution of a strongly nonlinear oscillator to be written as a simple cosine function. This enables the stability of strongly nonlinear normal modes in multidegree of freedom systems to be investigated by standard procedures such as harmonic balance.

  16. Research Summaries for Normal Birth

    PubMed Central

    Romano, Amy M.; Goer, Henci

    2007-01-01

    In this column, the authors summarize four research studies that further support the benefits of normal birth. The topics of the studies include the association of cesarean birth with an increased risk of neonatal death; the use of acupuncture and self-hypnosis as effective pain-management strategies; factors associated with amniotic-fluid embolism; and the positive influence of continuous support by lay doulas on obstetric outcomes for low-income women. PMID:18408810

  17. Normalizing biomedical terms by minimizing ambiguity and variability

    PubMed Central

    Tsuruoka, Yoshimasa; McNaught, John; Ananiadou, Sophia

    2008-01-01

    Background One of the difficulties in mapping biomedical named entities, e.g. genes, proteins, chemicals and diseases, to their concept identifiers stems from the potential variability of the terms. Soft string matching is a possible solution to the problem, but its inherent heavy computational cost discourages its use when the dictionaries are large or when real time processing is required. A less computationally demanding approach is to normalize the terms by using heuristic rules, which enables us to look up a dictionary in a constant time regardless of its size. The development of good heuristic rules, however, requires extensive knowledge of the terminology in question and thus is the bottleneck of the normalization approach. Results We present a novel framework for discovering a list of normalization rules from a dictionary in a fully automated manner. The rules are discovered in such a way that they minimize the ambiguity and variability of the terms in the dictionary. We evaluated our algorithm using two large dictionaries: a human gene/protein name dictionary built from BioThesaurus and a disease name dictionary built from UMLS. Conclusions The experimental results showed that automatically discovered rules can perform comparably to carefully crafted heuristic rules in term mapping tasks, and the computational overhead of rule application is small enough that a very fast implementation is possible. This work will help improve the performance of term-concept mapping tasks in biomedical information extraction especially when good normalization heuristics for the target terminology are not fully known. PMID:18426547
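
    The paper learns its rules automatically, but the kind of string-level normalization it replaces can be illustrated with a few hand-written heuristics; the rules and the dictionary entry below are purely hypothetical examples:

    ```python
    import re

    def normalize_term(term: str) -> str:
        """Simple heuristic normalization for constant-time dictionary lookup:
        lower-case, treat common punctuation as separators, then drop whitespace."""
        t = term.lower()
        t = re.sub(r"[-_/().,']", " ", t)   # punctuation acts as a separator
        return re.sub(r"\s+", "", t)        # remove whitespace for exact matching

    lexicon = {normalize_term("IL-2 precursor"): "HYPOTHETICAL:0001"}
    print(lexicon[normalize_term("il 2 precursor")])   # variant spellings map to one key
    ```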

  18. Mechanical properties of normal versus cancerous breast cells

    PubMed Central

    Smelser, Amanda M.; Macosko, Jed C.; O’Dell, Adam P.; Smyre, Scott; Bonin, Keith

    2016-01-01

    A cell’s mechanical properties are important in determining its adhesion, migration, and response to the mechanical properties of its microenvironment and may help explain behavioral differences between normal and cancerous cells. Using fluorescently labeled peroxisomes as microrheological probes, the interior mechanical properties of normal breast cells were compared to a metastatic breast cell line, MDA-MB-231. To estimate the mechanical properties of cell cytoplasms from the motions of their peroxisomes, it was necessary to reduce the contribution of active cytoskeletal motions to peroxisome motion. This was done by treating the cells with blebbistatin, to inhibit myosin II, or with sodium azide and 2-deoxy-D-glucose, to reduce intracellular ATP. Using either treatment, the peroxisomes exhibited normal diffusion or subdiffusion, and their mean squared displacements (MSDs) showed that the MDA-MB-231 cells were significantly softer than normal cells. For these two cell types, peroxisome MSDs in treated and untreated cells converged at high frequencies, indicating that cytoskeletal structure was not altered by the drug treatment. The MSDs from ATP-depleted cells were analyzed by the generalized Stokes–Einstein relation to estimate the interior viscoelastic modulus G* and its components, the elastic shear modulus G′ and viscous shear modulus G″, at angular frequencies between 0.126 and 628rad/s. These moduli are the material coefficients that enter into stress–strain relations and relaxation times in quantitative mechanical models such as the poroelastic model of the interior regions of cancerous and non-cancerous cells. PMID:25929519
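
    For reference, the generalized Stokes-Einstein relation used in this kind of particle-tracking microrheology is usually written in the Laplace (or Fourier) domain as below, with a the probe radius; the study's exact estimator of G' and G'' from the peroxisome MSDs may differ in detail.

    ```latex
    % Generalized Stokes-Einstein relation (standard particle-tracking form):
    % \tilde{G}(s) is the Laplace-domain viscoelastic modulus, a the probe radius,
    % and <\Delta\tilde{r}^2(s)> the Laplace transform of the mean squared displacement.
    \[
      \tilde{G}(s) \;=\; \frac{k_{B}T}{\pi\, a\, s\, \langle \Delta\tilde{r}^{2}(s)\rangle},
      \qquad
      G^{*}(\omega) = G'(\omega) + i\,G''(\omega) = \tilde{G}(s)\big|_{s\,=\,i\omega}.
    \]
    ```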

  19. Multivariate Models for Normal and Binary Responses in Intervention Studies

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Whittaker, Tiffany A.; Chang, Wanchen

    2016-01-01

    Use of multivariate analysis (e.g., multivariate analysis of variance) is common when normally distributed outcomes are collected in intervention research. However, when mixed responses--a set of normal and binary outcomes--are collected, standard multivariate analyses are no longer suitable. While mixed responses are often obtained in…

  20. Friendly Fire: War-Normalizing Metaphors in the Israeli Political Discourse

    ERIC Educational Resources Information Center

    Gavriely-Nuri, Dalia

    2009-01-01

    Combining principles of peace education and political discourse analysis, this study dwells on one powerful metaphorical mechanism engaged in by Israeli political leaders: war-normalizing metaphors, a mechanism for framing war as part of human nature and normal life. Six core semantic fields were identified as particularly useful "raw…

  1. Normalizing Heterosexuality: Mothers' Assumptions, Talk, and Strategies with Young Children

    ERIC Educational Resources Information Center

    Martin, Karin A.

    2009-01-01

    In recent years, social scientists have identified not just heterosexism and homophobia as social problems, but also heteronormativity--the mundane, everyday ways that heterosexuality is privileged and taken for granted as normal and natural. There is little empirical research, however, on how heterosexuality is reproduced and then normalized for…

  2. Um Breve Balanço dos Estudos em Astronomia e Educação no Brasil no Período de 2010 a 2013

    NASA Astrophysics Data System (ADS)

    Goncalves, Erica de Oliveira; Kern, C.

    2014-10-01

    In Brazil, research on the teaching of astronomy in basic education has been gaining prominence. Regarded as an important area of knowledge for students and teachers, astronomy studies have been gaining space in official education documents and in school curricula. Against this background, this work mapped the database of the Biblioteca Digital Brasileira de Teses e Dissertações using the keywords "astronomia" (astronomy) and "educação" (education) for the period 2010 to 2013. To compose what we here call a survey of the field, works were selected and their titles, abstracts, final considerations, and references were analysed, and we identified the epistemological sources current in graduate research during that period. In most of the works surveyed, we identified theoretical frameworks from the areas of physics, science, and astronomy involving discussions of curriculum and pedagogical practices linked to the teaching of astronomy at the primary and secondary levels of basic education and in teacher-education courses.

  3. Multiple imputation in the presence of non-normal data.

    PubMed

    Lee, Katherine J; Carlin, John B

    2017-02-20

    Multiple imputation (MI) is becoming increasingly popular for handling missing data. Standard approaches for MI assume normality for continuous variables (conditionally on the other variables in the imputation model). However, it is unclear how to impute non-normally distributed continuous variables. Using simulation and a case study, we compared various transformations applied prior to imputation, including a novel non-parametric transformation, to imputation on the raw scale and using predictive mean matching (PMM) when imputing non-normal data. We generated data from a range of non-normal distributions, and set 50% to missing completely at random or missing at random. We then imputed missing values on the raw scale, following a zero-skewness log, Box-Cox or non-parametric transformation and using PMM with both type 1 and 2 matching. We compared inferences regarding the marginal mean of the incomplete variable and the association with a fully observed outcome. We also compared results from these approaches in the analysis of depression and anxiety symptoms in parents of very preterm compared with term-born infants. The results provide novel empirical evidence that the decision regarding how to impute a non-normal variable should be based on the nature of the relationship between the variables of interest. If the relationship is linear in the untransformed scale, transformation can introduce bias irrespective of the transformation used. However, if the relationship is non-linear, it may be important to transform the variable to accurately capture this relationship. A useful alternative is to impute the variable using PMM with type 1 matching. Copyright © 2016 John Wiley & Sons, Ltd.
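
    One of the strategies compared above, transforming a skewed variable before imputation and back-transforming afterwards, can be sketched as follows; the imputation routine itself is left as a placeholder, and `impute_fn` and the shift handling are assumptions for illustration:

    ```python
    import numpy as np
    from scipy import stats, special

    def boxcox_impute(x, impute_fn):
        """Box-Cox transform the observed values, impute on the transformed scale,
        then back-transform.  `impute_fn` fills NaNs in a 1-D float array
        (e.g. one draw of a multiple-imputation procedure)."""
        observed = ~np.isnan(x)
        shift = 1.0 - np.nanmin(x)                      # Box-Cox requires positive data
        transformed = np.full_like(x, np.nan)
        transformed[observed], lam = stats.boxcox(x[observed] + shift)
        completed = impute_fn(transformed)              # impute on the transformed scale
        return special.inv_boxcox(completed, lam) - shift
    ```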

  4. Normal-inverse bimodule operation Hadamard transform ion mobility spectrometry.

    PubMed

    Hong, Yan; Huang, Chaoqun; Liu, Sheng; Xia, Lei; Shen, Chengyin; Chu, Yannan

    2018-10-31

    In order to suppress or eliminate the spurious peaks and improve the signal-to-noise ratio (SNR) of Hadamard transform ion mobility spectrometry (HT-IMS), a normal-inverse bimodule operation Hadamard transform - ion mobility spectrometry (NIBOHT-IMS) technique was developed. In this novel technique, a normal and an inverse pseudo random binary sequence (PRBS) were produced in sequential order by an ion gate controller and utilized to control the ion gate of the IMS, and then the normal HT-IMS mobility spectrum and the inverse HT-IMS mobility spectrum were obtained. A NIBOHT-IMS mobility spectrum was gained by subtracting the inverse HT-IMS mobility spectrum from the normal HT-IMS mobility spectrum. Experimental results demonstrate that the NIBOHT-IMS technique can significantly suppress or eliminate the spurious peaks and enhance the SNR when measuring the reactant ions. Furthermore, the gases CHCl3 and CH2Br2 were measured to evaluate the capability of detecting real samples. The results show that the NIBOHT-IMS technique is able to eliminate the spurious peaks and improve the SNR notably, not only for the detection of larger ion signals but also for the detection of small ion signals. Copyright © 2018 Elsevier B.V. All rights reserved.
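
    A schematic of the subtraction idea follows; the Hadamard demodulation is shown here as a circular cross-correlation via FFT, and the sequence conventions and scaling are assumptions, not the instrument's actual processing chain:

    ```python
    import numpy as np

    def demodulate(signal, sequence):
        """Recover a mobility spectrum from a multiplexed drift-time signal by
        circular cross-correlation with the gate modulation sequence (0/1 PRBS)."""
        n = signal.size
        seq = 2.0 * np.asarray(sequence, dtype=float) - 1.0   # map 0/1 -> -1/+1
        return np.fft.irfft(np.fft.rfft(signal) * np.conj(np.fft.rfft(seq)), n)

    def niboht_spectrum(signal_normal, signal_inverse, prbs):
        """Normal-inverse bimodule HT: subtract the spectrum acquired with the
        inverted PRBS from the one acquired with the normal PRBS, so spurious
        peaks common to both acquisitions cancel."""
        return demodulate(signal_normal, prbs) - demodulate(signal_inverse, 1 - prbs)
    ```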

  5. [Objective measurement of normal nasality in the Saxony dialect].

    PubMed

    Müller, R; Beleites, T; Hloucal, U; Kühn, M

    2000-12-01

    In the United States of America, the nasometer was developed by Fletcher as an objective method for measuring nasality. There are no accepted normal values for comparable test materials regarding the German language. The aim of this study was the examination of the auditively normal nasality of Saxon-speaking people with the nasometer. The nasalance of 51 healthy Saxon-speaking test persons with auditively normal nasality was measured with a model 6200 nasometer (Kay-Elemetrics, U.S.A.). The text materials used were the vowels "a", "e", "i", "o", and "u", the sentences "Die Schokolade ist sehr lecker" ("The chocolate is very tasty") and "Nenne meine Mama Mimi" ("Name my mama Mimi"), and the texts of "North wind and sun", "A children's birthday", and an arbitrary selection from Strittmatter. The mean nasalance for the vowels was 17.7%, for the sentence containing no nasal sounds 13.0%, and for the sentence containing many nasal sounds 67.2%. The mean value of the texts was 33-41%. The results for the texts agreed well with the results of Reuter (1997), who examined people from the state of Brandenburg. A range from 20% to 55% is suggested as the normal value for nasalance in the German-speaking area.
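
    Nasalance as reported by nasometry is the ratio of nasal acoustic energy to total (nasal plus oral) acoustic energy, expressed as a percentage; a one-line sketch:

    ```python
    def nasalance_percent(nasal_energy, oral_energy):
        """Nasalance (%) = nasal / (nasal + oral) * 100, as computed in nasometry."""
        return 100.0 * nasal_energy / (nasal_energy + oral_energy)
    ```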

  6. The normalization heuristic: an untested hypothesis that may misguide medical decisions.

    PubMed

    Aberegg, Scott K; O'Brien, James M

    2009-06-01

    Medical practice is increasingly informed by the evidence from randomized controlled trials. When such evidence is not available, clinical hypotheses based on pathophysiological reasoning and common sense guide clinical decision making. One commonly utilized general clinical hypothesis is the assumption that normalizing abnormal laboratory values and physiological parameters will lead to improved patient outcomes. We refer to the general use of this clinical hypothesis to guide medical therapeutics as the "normalization heuristic". In this paper, we operationally define this heuristic and discuss its limitations as a rule of thumb for clinical decision making. We review historical and contemporaneous examples of normalization practices as empirical evidence for the normalization heuristic and to highlight its frailty as a guide for clinical decision making.

  7. Loss of Brain Aerobic Glycolysis in Normal Human Aging.

    PubMed

    Goyal, Manu S; Vlassenko, Andrei G; Blazey, Tyler M; Su, Yi; Couture, Lars E; Durbin, Tony J; Bateman, Randall J; Benzinger, Tammie L-S; Morris, John C; Raichle, Marcus E

    2017-08-01

    The normal aging human brain experiences global decreases in metabolism, but whether this affects the topography of brain metabolism is unknown. Here we describe PET-based measurements of brain glucose uptake, oxygen utilization, and blood flow in cognitively normal adults from 20 to 82 years of age. Age-related decreases in brain glucose uptake exceed that of oxygen use, resulting in loss of brain aerobic glycolysis (AG). Whereas the topographies of total brain glucose uptake, oxygen utilization, and blood flow remain largely stable with age, brain AG topography changes significantly. Brain regions with high AG in young adults show the greatest change, as do regions with prolonged developmental transcriptional features (i.e., neoteny). The normal aging human brain thus undergoes characteristic metabolic changes, largely driven by global loss and topographic changes in brain AG. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Horizontal and sun-normal spectral biologically effective ultraviolet irradiances.

    PubMed

    Parisi, A V; Kimlin, M G

    1999-01-01

    The dependence of the spectral biologically effective solar UV irradiance on the orientation of the receiver with respect to the sun has been determined for relatively cloud-free days at a sub-tropical Southern Hemisphere latitude for the solar zenith angle range 35-64 degrees. For the UV and biologically effective irradiances, the sun-normal to horizontal ratio for the total UV ranges from 1.18 +/- 0.05 to 1.27 +/- 0.06. The sun-normal to horizontal ratio for biologically effective irradiance is dependent on the relative effectiveness of the relevant action spectrum in the UV-A waveband. In contrast to the total UV, the diffuse UV and diffuse biologically effective irradiances are reduced in a sun-normal compared with a horizontal orientation by a factor ranging from 0.70 +/- 0.05 to 0.76 +/- 0.03.

  9. Normal/Modern: Reconstructive Surgery in a Mexican Public Hospital.

    PubMed

    Taylor-Alexander, Samuel

    2017-10-01

    A growing corpus of anthropological scholarship demonstrates how science and medicine in Mexico are imbued by national concerns with modernization. Drawing on ethnographic research in a public hospital located in the south of Mexico City, I unpack one manifestation of this dynamic, which is the conjugation of the normal and the modern in Mexican reconstructive surgery. The aspiration toward normality underlies everyday clinic practices and relationships in this field, including why parents want surgery for their children and how doctors see their patients and their responsibilities toward them. It is also central to the professional ethic of reconstructive surgeons. I argue that the realities of health care provision in Mexico coalesced with this ethic to produce reconstructive surgeons as political subjects. They aimed to modernize craniofacial surgery in Mexico and so make the bodies of craniofacial patients normal.

  10. Design of optimally normal minimum gain controllers by continuation method

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Juang, J.-N.; Kim, Z. C.

    1989-01-01

    A measure of the departure from normality is investigated for system robustness. An attractive feature of the normality index is its simplicity for pole placement designs. To allow a tradeoff between system robustness and control effort, a cost function consisting of the sum of a norm of weighted gain matrix and a normality index is minimized. First- and second-order necessary conditions for the constrained optimization problem are derived and solved by a Newton-Raphson algorithm imbedded into a one-parameter family of neighboring zero problems. The method presented allows the direct computation of optimal gains in terms of robustness and control effort for pole placement problems.

  11. Cystic fibrosis transmembrane conductance regulator is correlated closely with sperm progressive motility and normal morphology in healthy and fertile men with normal sperm parameters.

    PubMed

    Jiang, L-Y; Shan, J-J; Tong, X-M; Zhu, H-Y; Yang, L-Y; Zheng, Q; Luo, Y; Shi, Q-X; Zhang, S-Y

    2014-10-01

    Cystic fibrosis transmembrane conductance regulator (CFTR) has been demonstrated to be expressed in mature spermatozoa and correlated with sperm quality. Sperm CFTR expression in fertile men is higher than that in infertile men suffering from teratospermia, asthenoteratospermia, asthenospermia and oligospermia, but it is unknown whether CFTR is correlated with sperm parameters when sperm parameters are normal. In this study, 282 healthy and fertile men with normal semen parameters were classified into three age groups, group (I): age group of 20-29 years (98 cases, 27.1 ± 6.2), group (II): age group of 30-39 years (142 cases, 33.7 ± 2.6) and group (III): age group of more than or equal to 40 years (42 cases, 44.1 ± 4.6). Sperm concentration, total count and progressive motility were analysed by computer-assisted sperm analysis. Sperm morphology was analysed by modified Papanicolaou staining. Sperm CFTR expression was conducted by indirect immunofluorescence staining. There was a significant positive correlation (P < 0.001) between CFTR expression and sperm progressive motility (r = 0.221) and normal morphology (r = 0.202), but there were no correlations between sperm CFTR expression and semen volume, sperm concentration, sperm total count as well as male age (P > 0.05). Our findings show that CFTR expression is associated with sperm progressive motility and normal morphology in healthy and fertile men with normal sperm parameters, but not associated with the number of spermatozoa and male age. © 2013 Blackwell Verlag GmbH.

  12. 14 CFR § 1216.306 - Actions normally requiring an EIS.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) § 1216.306 Actions normally requiring an EIS. (a) NASA will prepare an EIS for actions with the potential...) Typical NASA actions normally requiring an EIS include: (1) Development and operation of new launch... using a total quantity of radioactive material greater than the quantity for which the NASA Nuclear...

  13. TaggerOne: joint named entity recognition and normalization with semi-Markov Models.

    PubMed

    Leaman, Robert; Lu, Zhiyong

    2016-09-15

    Text mining is increasingly used to manage the accelerating pace of the biomedical literature. Many text mining applications depend on accurate named entity recognition (NER) and normalization (grounding). While high performing machine learning methods trainable for many entity types exist for NER, normalization methods are usually specialized to a single entity type. NER and normalization systems are also typically used in a serial pipeline, causing cascading errors and limiting the ability of the NER system to directly exploit the lexical information provided by the normalization. We propose the first machine learning model for joint NER and normalization during both training and prediction. The model is trainable for arbitrary entity types and consists of a semi-Markov structured linear classifier, with a rich feature approach for NER and supervised semantic indexing for normalization. We also introduce TaggerOne, a Java implementation of our model as a general toolkit for joint NER and normalization. TaggerOne is not specific to any entity type, requiring only annotated training data and a corresponding lexicon, and has been optimized for high throughput. We validated TaggerOne with multiple gold-standard corpora containing both mention- and concept-level annotations. Benchmarking results show that TaggerOne achieves high performance on diseases (NCBI Disease corpus, NER f-score: 0.829, normalization f-score: 0.807) and chemicals (BioCreative 5 CDR corpus, NER f-score: 0.914, normalization f-score 0.895). These results compare favorably to the previous state of the art, notwithstanding the greater flexibility of the model. We conclude that jointly modeling NER and normalization greatly improves performance. The TaggerOne source code and an online demonstration are available at: http://www.ncbi.nlm.nih.gov/bionlp/taggerone zhiyong.lu@nih.gov Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2016. This work is written

  14. Survival After Early and Normal Retirement

    ERIC Educational Resources Information Center

    Haynes, Suzanne G.; And Others

    1978-01-01

    Describes an epidemiological study of the patterns and correlates of survival after early (age 62 to 64) and normal retirement (age 65). Death rates were significantly elevated during the first, fourth, and fifth years after early retirement. Pre-retirement health status was the only significant predictor of survival after early retirement.…

  15. Terre Haute and the Normal School Fire

    ERIC Educational Resources Information Center

    Ferreira, Allen

    1974-01-01

    This paper examines the short history of the Terre Haute Normal School before its tragic burning on April 9, 1888 and relates that story to the course of events immediately following the fire. (Author)

  16. Effect of friction on vibrotactile sensation of normal and dehydrated skin.

    PubMed

    Chen, S; Ge, S; Tang, W; Zhang, J

    2016-02-01

    Vibrotactile sensation is highly dependent on surface mechanical and frictional properties, and dehydration of the skin could change these properties. The aim was to investigate the relationship between friction and vibrotactile sensation for normal and dehydrated skin. Vibrations were first measured during surface exploration using a biomimetic sensor. Piglet skin was used as a human skin model to study the frictional properties of both normal and dehydrated skin, using an atomic force microscope at the nanoscale and a pin-on-disk tribometer at the macroscale. The effect of vibrational frequency on friction and vibrotactile perception was also observed at the nano and macro scales for normal and dehydrated skin. The results indicated that dehydrated skin was less sensitive than normal skin. The coefficient of friction of dehydrated skin is smaller than that of normal skin at both the nano and macro scales. The coefficient of friction increases with increasing scanning frequency. There is a positive correlation between the coefficient of friction and vibrotactile sensation at the nanoscale and macroscale. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Diagnostic imaging features of normal anal sacs in dogs and cats.

    PubMed

    Jung, Yechan; Jeong, Eunseok; Park, Sangjun; Jeong, Jimo; Choi, Ul Soo; Kim, Min-Su; Kim, Namsoo; Lee, Kichang

    2016-09-30

    This study was conducted to provide normal reference features for canine and feline anal sacs using ultrasound, low-field magnetic resonance imaging (MRI) and radiograph contrast as diagnostic imaging tools. A total of ten clinically normal beagle dogs and eight clinically normal cats were included. General radiography with contrast, ultrasonography and low-field MRI scans were performed. The visualization of anal sacs, which are located at distinct sites in dogs and cats, is possible with a contrast study on radiography. Most of the anal sac tissue surface, occasionally appearing as a hyperechoic thin line, was surrounded by the hypoechoic external sphincter muscle on ultrasonography. The normal anal sac contents of dogs and cats had variable echogenicity. Signals of anal sac contents on low-field MRI varied in cats and dogs, and contrast medium using T1-weighted images enhanced the anal sac walls more obviously than that on ultrasonography. In conclusion, this study provides the normal features of anal sacs from dogs and cats on diagnostic imaging. Further studies including anal sac evaluation are expected to investigate disease conditions.

  18. Diagnostic imaging features of normal anal sacs in dogs and cats

    PubMed Central

    Jung, Yechan; Jeong, Eunseok; Park, Sangjun; Jeong, Jimo; Choi, Ul Soo; Kim, Min-Su; Kim, Namsoo

    2016-01-01

    This study was conducted to provide normal reference features for canine and feline anal sacs using ultrasound, low-field magnetic resonance imaging (MRI) and radiograph contrast as diagnostic imaging tools. A total of ten clinically normal beagle dogs and eight clinically normal cats were included. General radiography with contrast, ultrasonography and low-field MRI scans were performed. The visualization of anal sacs, which are located at distinct sites in dogs and cats, is possible with a contrast study on radiography. Most of the anal sac tissue surface, occasionally appearing as a hyperechoic thin line, was surrounded by the hypoechoic external sphincter muscle on ultrasonography. The normal anal sac contents of dogs and cats had variable echogenicity. Signals of anal sac contents on low-field MRI varied in cats and dogs, and contrast medium using T1-weighted images enhanced the anal sac walls more obviously than that on ultrasonography. In conclusion, this study provides the normal features of anal sacs from dogs and cats on diagnostic imaging. Further studies including anal sac evaluation are expected to investigate disease conditions. PMID:26645338

  19. The spectral theorem for quaternionic unbounded normal operators based on the S-spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpay, Daniel, E-mail: dany@math.bgu.ac.il; Kimsey, David P., E-mail: dpkimsey@gmail.com; Colombo, Fabrizio, E-mail: fabrizio.colombo@polimi.it

    In this paper we prove the spectral theorem for quaternionic unbounded normal operators using the notion of S-spectrum. The proof technique consists of first establishing a spectral theorem for quaternionic bounded normal operators and then using a transformation which maps a quaternionic unbounded normal operator to a quaternionic bounded normal operator. With this paper we complete the foundation of spectral analysis of quaternionic operators. The S-spectrum has been introduced to define the quaternionic functional calculus but it turns out to be the correct object also for the spectral theorem for quaternionic normal operators. The lack of a suitable notion of spectrum was a major obstruction to fully understand the spectral theorem for quaternionic normal operators. A prime motivation for studying the spectral theorem for quaternionic unbounded normal operators is given by the subclass of unbounded anti-self-adjoint quaternionic operators which play a crucial role in quaternionic quantum mechanics.
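
    For readers unfamiliar with the notion, the S-spectrum of a (right linear) quaternionic operator T is commonly defined as below; domain subtleties for unbounded T are omitted here.

    ```latex
    % S-spectrum of a quaternionic operator T (H denotes the quaternions):
    \[
      \sigma_{S}(T) \;=\; \bigl\{\, s \in \mathbb{H} \;:\;
          T^{2} - 2\,\operatorname{Re}(s)\,T + |s|^{2} I
          \ \text{does not have a bounded inverse} \,\bigr\}.
    \]
    ```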

  20. [Identification of NMDA receptor in normal bovine ovary and ovum].

    PubMed

    Tachibana, Naoko; Ikeda, Shu-ichi

    2014-01-01

    To clarify the pathogenesis of anti-N-methyl-D-aspartate receptor (NMDAR) encephalitis in patients without ovarian teratoma, we investigated normal human ovary, normal bovine ovary, and bovine ova. Immunohistochemical studies showed that normal human ovary expresses the NR2B epitope in primordial oocytes. By SDS-PAGE and immunoblotting of bovine ovarian tissue and ova, we identified two bands corresponding to NR1 and NR2B. Moreover, reverse-phase liquid chromatography coupled to tandem mass spectrometry revealed peptide fractions of NR1, NR2A, NR2B and NR2C. Immunocytochemistry showed that normal bovine oocytes have a strong affinity for a patient's disease-specific IgG. Anti-NMDAR encephalitis mainly affects young women of reproductive age. Although ovarian teratoma is an important associated tumor, it is present in fewer than 40% of patients. Ovarian teratomas originate from oocytes, so the presence of NMDAR in normal oocytes is central to the claim that the ovary itself is the antigen-presenting tissue, and it also helps explain why young women are predominantly affected by this disease. We conclude that anti-NMDAR encephalitis is a form of autoimmune synaptic encephalitis and that the antigen-presenting tissue is the ovary itself.

  1. Are vaginal symptoms ever normal? a review of the literature.

    PubMed

    Anderson, Matthew; Karasz, Alison; Friedland, Sarah

    2004-11-22

    Vaginal symptoms such as discharge, odor, and itch are among the most common presenting complaints in primary care. We undertook to determine if the symptoms associated with vaginitis (discharge, odor, irritation) occur in normal women. To answer this question, we performed a literature review. We conducted a Medline search using the following terms: "vagina," "vaginal discharge," "secretion," "odors," "discharge," "pruritus," "normal," "irritation," "itch," "physical examination," "healthy," "asymptomatic," "quantity," and "physiology." To find additional references we reviewed textbooks in gynecology, primary care, and physical diagnosis and contacted authors. There are few primary studies, and most are not of high quality. Existing data show that the quantity and quality of vaginal discharge in healthy women vary considerably both across individuals and in the same individual during the menstrual cycle. Most studies indicate that discharge is greatest at midcycle. Vaginal fluid contains malodorants, and one study of intact vaginal fluid found it to be malodorous. Two studies found that normal women reported irritative symptoms in the course of their menstrual cycle. The primary literature indicates that there is a wide variation in the normal vagina and that some of the symptoms associated with vaginal abnormality are found in well women. Both clinicians and their patients would benefit from a better understanding of the range of normal as well as what constitutes a meaningful departure from that range.

  2. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2002-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.
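
    For context, in plane thermoelasticity the Airy stress function generates the in-plane stresses and satisfies a biharmonic equation driven by the temperature field; the standard plane-stress form is shown below. Boley's beam theory builds on relations of this type, though his formulation contains additional terms specific to beams.

    ```latex
    % Plane-stress thermoelasticity with an Airy stress function \varphi:
    \[
      \sigma_{xx} = \frac{\partial^{2}\varphi}{\partial y^{2}}, \qquad
      \sigma_{yy} = \frac{\partial^{2}\varphi}{\partial x^{2}}, \qquad
      \sigma_{xy} = -\frac{\partial^{2}\varphi}{\partial x\,\partial y}, \qquad
      \nabla^{4}\varphi + E\,\alpha\,\nabla^{2}T = 0 .
    \]
    ```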

  3. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2003-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  4. High normal fasting blood glucose is associated with dementia in Chinese elders

    PubMed Central

    Mortimer, J.A.; Borenstein, A.R.; Ding, D.; DeCarli, C.; Zhao, Q.; Copenhaver, C.; Guo, Q.; Chu, S.; Galasko, D.; Salmon, D.P.; Dai, Q.; Wu, Y.; Petersen, R.; Hong, Z.

    2010-01-01

    Background Diabetes is a risk factor for MCI and dementia. However, the association between high normal fasting blood glucose (FBG) and dementia has not been studied. Methods Polytomous logistic regression was used to assess the association of dementia and MCI with FBG in an age- and sex-matched sample of 32 dementia patients, 27 amnestic MCI (aMCI) patients and 31 normal controls (NC). Analyses were repeated for those with normal FBG. Correlations between FBG and cognitive test scores were obtained. Results Controlling for age, sex, education, body mass index, Hachinski Ischemic Score, MRI stroke, and normalized brain, hippocampal and white matter hyperintensity MRI volumes; higher FBG was associated with dementia vs. aMCI status (OR= 3.13; 95% CI:1.28–7.69). This association remained (OR= 7.75; 95% CI:1.10–55.56) when analyses were restricted to subjects with normal FBG. When dementia patients were compared with NC adjusting for age, sex and education a significant association with FBG also was seen (OR=1.83; 95%CI:1.09–3.08), but the association was lost when vascular covariates were added to the model. FBG was not associated with aMCI status vs. NC. Higher FBG was correlated with poorer performance on the Trailmaking Test Part B (p=0.003). The percentage of dementia patients with high normal FBG (90%) was significantly higher than that of aMCI patients with high normal FBG (32.9%) (χ2=13.9, p<0.001). Conclusions Higher FBG was associated with dementia (vs. aMCI) independent of vascular risk factors and MRI indicators of vascular disease, and remained a significant risk factor when analyses were restricted to subjects with normal FBG. The results of this cross-sectional study suggest that a high normal level of FBG may be a risk factor for dementia. PMID:21044774

  5. Prognostic significance of normal-sized ovary in advanced serous epithelial ovarian cancer.

    PubMed

    Paik, E Sun; Kim, Ji Hye; Kim, Tae Joong; Lee, Jeong Won; Kim, Byoung Gie; Bae, Duk Soo; Choi, Chel Hun

    2018-01-01

    We compared survival outcomes of advanced serous type epithelial ovarian cancer (EOC) patients with normal-sized ovaries and enlarged-ovarian tumors by propensity score matching analysis. The medical records of EOC patients treated at Samsung Medical Center between 2002 and 2015 were reviewed retrospectively. We investigated EOC patients with high grade serous type histology and International Federation of Gynecology and Obstetrics (FIGO) stage IIIB, IIIC, or IV who underwent primary debulking surgery (PDS) and adjuvant chemotherapy to identify patients with normal-sized ovaries. Propensity score matching was performed to compare patients with normal-sized ovaries to patients with enlarged-ovarian tumors (ratio, 1:3) according to age, FIGO stage, initial cancer antigen (CA)-125 level, and residual disease status after PDS. Of the 419 EOC patients, 48 patients had normal-sized ovary. Patients with enlarged-ovarian tumor were younger (54.0±10.3 vs. 58.4±9.2 years, p=0.005) than those with normal-sized ovary, and there was a statistically significant difference in residual disease status between the 2 groups. In total cohort with a median follow-up period of 43 months (range, 3-164 months), inferior overall survival (OS) was shown in the normal-sized ovary group (median OS, 71.2 vs. 41.4 months; p=0.003). After propensity score matching, the group with normal-sized ovary showed inferior OS compared to the group with enlarged-ovarian tumor (median OS, 72.1 vs. 41.4 months; p=0.031). In multivariate analysis for OS, normal-sized ovary remained a significant factor. Normal-sized ovary was associated with poor OS compared with the common presentation of enlarged ovaries in EOC, independent of CA-125 level or residual disease. Copyright © 2018. Asian Society of Gynecologic Oncology, Korean Society of Gynecologic Oncology

  6. How do normal faults grow?

    NASA Astrophysics Data System (ADS)

    Jackson, Christopher; Bell, Rebecca; Rotevatn, Atle; Tvedt, Anette

    2016-04-01

    Normal faulting accommodates stretching of the Earth's crust, and it is arguably the most fundamental tectonic process leading to continent rupture and oceanic crust emplacement. Furthermore, the incremental and finite geometries associated with normal faulting dictate landscape evolution, sediment dispersal and hydrocarbon systems development in rifts. Displacement-length scaling relationships compiled from global datasets suggest normal faults grow via a sympathetic increase in these two parameters (the 'isolated fault model'). This model has dominated the structural geology literature for >20 years and underpins the structural and tectono-stratigraphic models developed for active rifts. However, relatively recent analysis of high-quality 3D seismic reflection data suggests faults may grow by rapid establishment of their near-final length prior to significant displacement accumulation (the 'coherent fault model'). The isolated and coherent fault models make very different predictions regarding the tectono-stratigraphic evolution of rift basin, thus assessing their applicability is important. To-date, however, very few studies have explicitly set out to critically test the coherent fault model thus, it may be argued, it has yet to be widely accepted in the structural geology community. Displacement backstripping is a simple graphical technique typically used to determine how faults lengthen and accumulate displacement; this technique should therefore allow us to test the competing fault models. However, in this talk we use several subsurface case studies to show that the most commonly used backstripping methods (the 'original' and 'modified' methods) are, however, of limited value, because application of one over the other requires an a priori assumption of the model most applicable to any given fault; we argue this is illogical given that the style of growth is exactly what the analysis is attempting to determine. We then revisit our case studies and demonstrate

  7. Dynamic divisive normalization predicts time-varying value coding in decision-related circuits.

    PubMed

    Louie, Kenway; LoFaro, Thomas; Webb, Ryan; Glimcher, Paul W

    2014-11-26

    Normalization is a widespread neural computation, mediating divisive gain control in sensory processing and implementing a context-dependent value code in decision-related frontal and parietal cortices. Although decision-making is a dynamic process with complex temporal characteristics, most models of normalization are time-independent and little is known about the dynamic interaction of normalization and choice. Here, we show that a simple differential equation model of normalization explains the characteristic phasic-sustained pattern of cortical decision activity and predicts specific normalization dynamics: value coding during initial transients, time-varying value modulation, and delayed onset of contextual information. Empirically, we observe these predicted dynamics in saccade-related neurons in monkey lateral intraparietal cortex. Furthermore, such models naturally incorporate a time-weighted average of past activity, implementing an intrinsic reference-dependence in value coding. These results suggest that a single network mechanism can explain both transient and sustained decision activity, emphasizing the importance of a dynamic view of normalization in neural coding. Copyright © 2014 the authors 0270-6474/14/3416046-12$15.00/0.

  8. Density- and wavefunction-normalized Cartesian spherical harmonics for l ≤ 20.

    PubMed

    Michael, J Robert; Volkov, Anatoliy

    2015-03-01

    The widely used pseudoatom formalism [Stewart (1976). Acta Cryst. A32, 565-574; Hansen & Coppens (1978). Acta Cryst. A34, 909-921] in experimental X-ray charge-density studies makes use of real spherical harmonics when describing the angular component of aspherical deformations of the atomic electron density in molecules and crystals. The analytical form of the density-normalized Cartesian spherical harmonic functions for up to l ≤ 7 and the corresponding normalization coefficients were reported previously by Paturle & Coppens [Acta Cryst. (1988), A44, 6-7]. It was shown that the analytical form for normalization coefficients is available primarily for l ≤ 4 [Hansen & Coppens, 1978; Paturle & Coppens, 1988; Coppens (1992). International Tables for Crystallography, Vol. B, Reciprocal space, 1st ed., edited by U. Shmueli, ch. 1.2. Dordrecht: Kluwer Academic Publishers; Coppens (1997). X-ray Charge Densities and Chemical Bonding. New York: Oxford University Press]. Only in very special cases it is possible to derive an analytical representation of the normalization coefficients for 4 < l ≤ 7 (Paturle & Coppens, 1988). In most cases for l > 4 the density normalization coefficients were calculated numerically to within seven significant figures. In this study we review the literature on the density-normalized spherical harmonics, clarify the existing notations, use the Paturle-Coppens (Paturle & Coppens, 1988) method in the Wolfram Mathematica software to derive the Cartesian spherical harmonics for l ≤ 20 and determine the density normalization coefficients to 35 significant figures, and computer-generate a Fortran90 code. The article primarily targets researchers who work in the field of experimental X-ray electron density, but may be of some use to all who are interested in Cartesian spherical harmonics.
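
    The two conventions referred to above differ only in the normalization condition imposed on the real spherical harmonics; in the pseudoatom literature they are usually stated as follows (y for wavefunction-normalized and d for density-normalized functions; quoted here for orientation, following the standard convention).

    ```latex
    % Wavefunction- versus density-normalized real spherical harmonics:
    \[
      \int_{\Omega} \left| y_{lm\pm}(\theta,\varphi) \right|^{2} d\Omega = 1,
      \qquad
      \int_{\Omega} \left| d_{lm\pm}(\theta,\varphi) \right| d\Omega =
      \begin{cases} 1, & l = 0,\\[2pt] 2, & l \ge 1. \end{cases}
    \]
    ```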

  9. Linear perturbations of black holes: stability, quasi-normal modes and tails

    NASA Astrophysics Data System (ADS)

    Zhidenko, Alexander

    2009-03-01

    Black holes have characteristic proper oscillations, called quasi-normal modes. The proper oscillations of astrophysical black holes may be observed in the near future with the help of gravitational wave detectors. Quasi-normal modes are also very important in the context of testing the stability of black objects, the anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence and higher dimensional theories, such as the brane-world scenarios and string theory. This dissertation reviews a number of works, which provide a thorough study of the quasi-normal spectrum of a wide class of black holes in four and higher dimensions for fields of various spin and for gravitational perturbations. We have studied numerically the dependence of the quasi-normal modes on a number of factors, such as the presence of the cosmological constant, the Gauss-Bonnet parameter or the aether in the space-time, and the dependence of the spectrum on the parameters of the black hole and the fields under consideration. By analysis of the quasi-normal spectrum, we have studied the stability of higher dimensional Reissner-Nordstrom-de Sitter black holes, Kaluza-Klein black holes with squashed horizons, Gauss-Bonnet black holes and black strings. Special attention is paid to the evolution of massive fields in the background of various black holes. We have considered their quasi-normal ringing and the late-time tails. In addition, we present two new numerical techniques: a generalisation of the Nollert improvement of the Frobenius method for higher dimensional problems and a qualitatively new method, which allows one to calculate quasi-normal frequencies for black holes whose metrics are not known analytically.

  10. Lactate Clearance and Normalization and Prolonged Organ Dysfunction in Pediatric Sepsis.

    PubMed

    Scott, Halden F; Brou, Lina; Deakyne, Sara J; Fairclough, Diane L; Kempe, Allison; Bajaj, Lalit

    2016-03-01

    To evaluate whether lactate clearance and normalization during emergency care of pediatric sepsis is associated with lower rates of persistent organ dysfunction. This was a prospective cohort study of 77 children <18 years of age in the emergency department with infection and acute organ dysfunction per consensus definitions. In consented patients, lactate was measured 2 and/or 4 hours after an initial lactate; persistent organ dysfunction was assessed through laboratory and physician evaluation at 48 hours. A decrease of ≥ 10% from initial to final level was considered lactate clearance; a final level < 2 mmol/L was considered lactate normalization. Relative risk (RR) with 95% CIs, adjusted in a log-binomial model, was used to evaluate associations between lactate clearance/normalization and organ dysfunction. Lactate normalized in 62 (81%) patients and cleared in 70 (91%). The primary outcome, persistent 48-hour organ dysfunction, was present in 32 (42%). Lactate normalization was associated with decreased risk of persistent organ dysfunction (RR 0.46, 0.29-0.73; adjusted RR 0.47, 0.29-0.78); lactate clearance was not (RR 0.70, 0.35-1.41; adjusted RR 0.75, 0.38-1.50). The association between lactate normalization and decreased risk of persistent organ dysfunction was retained in the subgroups with initial lactate ≥ 2 mmol/L and hypotension. In children with sepsis and organ dysfunction, lactate normalization within 4 hours was associated with decreased persistent organ dysfunction. Serial lactate level measurement may provide a useful prognostic tool during the first hours of resuscitation in pediatric sepsis. Copyright © 2016 Elsevier Inc. All rights reserved.
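
    The operational definitions in this study translate directly into a small decision rule (units in mmol/L; threshold values taken from the abstract):

    ```python
    def lactate_status(initial_mmol_l, final_mmol_l):
        """Study definitions: clearance = a decrease of >= 10% from the initial
        lactate; normalization = a final lactate below 2 mmol/L."""
        cleared = (initial_mmol_l - final_mmol_l) / initial_mmol_l >= 0.10
        normalized = final_mmol_l < 2.0
        return {"cleared": cleared, "normalized": normalized}
    ```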

  11. Self-Esteem of Gifted, Normal, and Mild Mentally Handicapped Children.

    ERIC Educational Resources Information Center

    Chiu, Lian-Hwang

    1990-01-01

    Administered Coopersmith Self-Esteem Inventory (SEI) Form B to elementary school students (N=450) identified as gifted, normal, and mild mentally handicapped (MiMH). Results indicated that both the gifted and normal children had significantly higher self-esteem than did the MiMH children, but there were no differences between gifted and normal…

  12. Linguistic Deterioration in Alzheimer's Senile Dementia and in Normal Aging.

    ERIC Educational Resources Information Center

    Emery, Olga Beattie

    A study of language patterning as an indicator of higher cortical process focused on three matched comparison groups: normal pre-middle-aged, normal elderly, and elderly adults with senile dementia Alzheimer's type. In addition to tests of memory, level of cognitive function, and organic deficit, the formal aspects of language were analyzed in…

  13. Frictional response of simulated faults to normal stresses perturbations probed with ultrasonic waves

    NASA Astrophysics Data System (ADS)

    Shreedharan, S.; Riviere, J.; Marone, C.

    2017-12-01

    We report on a suite of laboratory friction experiments conducted on saw-cut Westerly Granite surfaces to probe frictional response to step changes in normal stress and loading rate. The experiments are conducted to illuminate the fundamental processes that yield friction rate and state dependence. We quantify the microphysical frictional response of the simulated fault surfaces to normal stress steps, in the range of 1%-600% step increases and decreases from a nominal baseline normal stress. We measure directly the fault slip rate and account for changes in slip rate with changes in normal stress, and complement mechanical data acquisition by continuously probing the faults with ultrasonic pulses. We conduct the experiments at room temperature and humidity conditions in a servo-controlled biaxial testing apparatus in the double direct shear configuration. The samples are sheared over a range of velocities, from 0.02-100 μm/s. We report observations of a transient shear stress and friction evolution with step increases and decreases in normal stress. Specifically, we show that, at low shear velocities and small increases in normal stress (<5% increase), the shear stress on the fault does not increase instantaneously with the normal stress step, while the ultrasonic wave amplitude and normal displacement do. In other words, the shear stress does not follow the load point stiffness curve. At high shear velocities and larger normal stress steps (>5% increases), the shear stress evolves immediately with normal stress. We show that the excursions in slip rate resulting from the changes in normal stress must be accounted for in order to predict fault strength evolution. Ultrasonic wave amplitudes first increase immediately in response to normal stress steps, then decrease approximately linearly to a new steady-state value, in part due to changes in fault slip rate. Previous descriptions of frictional state evolution during normal stress perturbations have not

  14. Histogram-based normalization technique on human brain magnetic resonance images from different acquisitions.

    PubMed

    Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng

    2015-07-28

    Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters would be used for scanning different subjects or the same subject at a different time, which may result in large intensity variations. This intensity variation will greatly undermine the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we proposed a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality image as input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image will also lie between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared this with the existing intensity normalization method. The results validate that our histogram normalization framework can achieve better results in all the experiments. It is also demonstrated that the brain template with normalization preprocessing is of higher quality than the template with no normalization processing. We have proposed
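
    The two-step procedure above (intensity scaling followed by histogram normalization) can be sketched in a few lines. The following is a minimal Python/NumPy illustration under assumed LIR/HIR values and function names; it is not the authors' implementation, and it matches histograms with a simple quantile mapping rather than the exact stretching used in the paper.

```python
import numpy as np

def intensity_scale(img, lir, hir):
    """Step 1 (IS): linearly rescale the reference image so that its
    intensities lie between the low (LIR) and high (HIR) intensity-region values."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) * (hir - lir) + lir

def histogram_normalize(src, ref, n_quantiles=256):
    """Step 2 (HN): stretch the histogram of the low-quality image `src`
    to match the histogram of the reference image `ref` via quantile mapping."""
    q = np.linspace(0, 1, n_quantiles)
    src_q = np.quantile(src, q)   # quantiles of the input image
    ref_q = np.quantile(ref, q)   # corresponding quantiles of the reference
    return np.interp(src, src_q, ref_q)

# toy usage with synthetic "images"
rng = np.random.default_rng(0)
reference = intensity_scale(rng.normal(100.0, 20.0, (64, 64)), lir=10.0, hir=200.0)
low_quality = rng.normal(80.0, 35.0, (64, 64))
normalized = histogram_normalize(low_quality, reference)
```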

  15. The impact of signal normalization on seizure detection using line length features.

    PubMed

    Logesparan, Lojini; Rodriguez-Villegas, Esther; Casson, Alexander J

    2015-10-01

    Accurate automated seizure detection remains a desirable but elusive target for many neural monitoring systems. While much attention has been given to the different feature extractions that can be used to highlight seizure activity in the EEG, very little formal attention has been given to the normalization that these features are routinely paired with. This normalization is essential in patient-independent algorithms to correct for broad-level differences in the EEG amplitude between people, and in patient-dependent algorithms to correct for amplitude variations over time. It is crucial, however, that the normalization used does not have a detrimental effect on the seizure detection process. This paper presents the first formal investigation into the impact of signal normalization techniques on seizure discrimination performance when using the line length feature to emphasize seizure activity. Comparing five normalization methods, based upon the mean, median, standard deviation, signal peak and signal range, we demonstrate differences in seizure detection accuracy (assessed as the area under a sensitivity-specificity ROC curve) of up to 52 %. This is despite the same analysis feature being used in all cases. Further, changes in performance of up to 22 % are present depending on whether the normalization is applied to the raw EEG itself or directly to the line length feature. Our results highlight the median decaying memory as the best current approach for providing normalization when using line length features, and they quantify the under-appreciated challenge of providing signal normalization that does not impair seizure detection algorithm performance.
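
    To make the quantities being compared concrete, the sketch below computes the line length feature over non-overlapping EEG windows and divides it by an exponentially updated baseline, a simplified stand-in for the median decaying memory normalization favoured in the paper. The signal shape, window size and decay constant are illustrative assumptions, not the authors' code.

```python
import numpy as np

def line_length(signal, win):
    """Line length per non-overlapping window: the sum of absolute
    sample-to-sample differences, a cheap marker of seizure-like activity."""
    d = np.abs(np.diff(signal))
    n = len(d) // win
    return d[:n * win].reshape(n, win).sum(axis=1)

def decaying_memory_normalize(values, decay=0.99):
    """Divide each windowed feature by a slowly adapting baseline
    (a simplified, exponentially weighted stand-in for a decaying-memory median)."""
    baseline = np.median(values[:10])        # initialise from the first windows
    out = np.empty(len(values))
    for i, v in enumerate(values):
        baseline = decay * baseline + (1.0 - decay) * v
        out[i] = v / baseline
    return out

# toy usage: 10 s of synthetic EEG at 256 Hz, 1 s windows
rng = np.random.default_rng(1)
eeg = rng.normal(0.0, 50.0, 2560)
feature = line_length(eeg, win=256)
feature_norm = decaying_memory_normalize(feature)
```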

  16. Are controls in schizophrenia research "normal"?

    PubMed

    Olson, S C; Bornstein, R A; Schwarzkopf, S B; Nasrallah, H A

    1993-03-01

    The psychiatric assessment by structured interview and family history of mental disorder in normal volunteers recruited by advertisement for a study of brain structure and function in psychosis is described. Nine of 51 volunteers (17.6%) who passed a phone screen were excluded after a structured interview for major psychopathology. Of 35 completers, 10 (28.6%) had subthreshold mood or substance use but were included in the study. Only 16 subjects (45%) had a negative family history by FH-RDC. Diagnoses in family members included substance abuse (31%), mood disorder (11%), psychosis (9%), and other/undiagnosed (14%). Ventricular enlargement was evaluated by magnetic resonance imaging in two planes. Ventricular size was bimodally distributed in the males, and the group with larger ventricles was more educated and had higher scores on the 8 (Schizophrenia) scale of the MMPI (F = 5.44, p = .0099). Our results suggest that 'normal' volunteers for psychiatric research have personal or family psychopathology which motivates them to participate. As the sensitivity of biological instrumentation increases, the characteristics of the control group must be anticipated in the design and recruitment.

  17. Declaratoria del IV Congreso Nacional de Educacion Normal (Declaration of the Fourth National Congress on Normal Education).

    ERIC Educational Resources Information Center

    El Maestro, Mexico, 1970

    1970-01-01

    This document is an English-language abstract (approximately 1,500 words) of a declaration drawn up by the participants of the Fourth Mexican National Congress of Normal Education. The declaration points out the importance of teacher training in the educational system, the fundamental problems presently facing this level of studies and the…

  18. Normalization of relative and incomplete temporal expressions in clinical narratives.

    PubMed

    Sun, Weiyi; Rumshisky, Anna; Uzuner, Ozlem

    2015-09-01

    To improve the normalization of relative and incomplete temporal expressions (RI-TIMEXes) in clinical narratives. We analyzed the RI-TIMEXes in temporally annotated corpora and proposed two hypotheses regarding the normalization of RI-TIMEXes in the clinical narrative domain: the anchor point hypothesis and the anchor relation hypothesis. We annotated the RI-TIMEXes in three corpora to study the characteristics of RI-TIMEXes in different domains. This informed the design of our RI-TIMEX normalization system for the clinical domain, which consists of an anchor point classifier, an anchor relation classifier, and a rule-based RI-TIMEX text span parser. We experimented with different feature sets and performed an error analysis for each system component. The annotation confirmed the hypotheses that we can simplify the RI-TIMEX normalization task using two multi-label classifiers. Our system achieves anchor point classification, anchor relation classification, and rule-based parsing accuracy of 74.68%, 87.71%, and 57.2% (82.09% under relaxed matching criteria), respectively, on the held-out test set of the 2012 i2b2 temporal relation challenge. Experiments with feature sets reveal some interesting findings, such as: the verbal tense feature does not inform the anchor relation classification in clinical narratives as much as the tokens near the RI-TIMEX. Error analysis showed that underrepresented anchor point and anchor relation classes are difficult to detect. We formulate the RI-TIMEX normalization problem as a pair of multi-label classification problems. Considering only RI-TIMEX extraction and normalization, the system achieves statistically significant improvement over the RI-TIMEX results of the best systems in the 2012 i2b2 challenge. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Bartter Syndrome with Normal Aldosterone Level: An Unusual Presentation.

    PubMed

    Huque, S S; Rahman, M H; Khatun, S

    2016-04-01

    Bartter syndrome (BS) is a hereditary disease, with an autosomal recessive or autosomal dominant mode of transmission. It is characterized by salt wasting hypochloraemic, hypokalaemic metabolic alkalosis and hyperreninaemia with normal blood pressure. The primary defect is in the thick ascending limb of loop of Henle (TAL). Herein, we report a case that had typical features of BS like severe dehydration, severe hypokalaemia, metabolic alkalosis and failure to thrive but had normal aldosterone level which is very uncommon.

  20. Colony formation by normal and malignant human B-lymphocytes.

    PubMed Central

    Izaguirre, C. A.; Minden, M. D.; Howatson, A. F.; McCulloch, E. A.

    1980-01-01

    A method is described that permits colony formation in culture by B lymphocytes from normal blood and from blood, marrow or lymph nodes of patients with myeloma or lymphoma. The method depends on: (1) exhaustively depleting cell suspensions of T lymphocytes, (2) a medium conditioned by T lymphocytes in the presence of phytohaemagglutinin (PHA-TCM), and (3) irradiated autologous or homologous T lymphocytes. Under these conditions the assay is linear. Cellular development of B lymphocytes can be followed; differentiation to plasma cells is seen in cultures of cells from normal individuals and myeloma patients, but not lymphoma patients. Malignant B lymphocytes in culture produced immunoglobulin of the class identified in the patient's blood, or in freshly obtained cells. We conclude that the assay is suitable for studying the growth, differentiation and regulation of normal and malignant B lymphocytes in culture. PMID:6968572

  1. [Epileptic encephalopathy associated with forced normalization after administration of levetiracetam].

    PubMed

    Kikuchi, Takahiro; Kato, Mitsuhiro; Takahashi, Nobuya; Nakamura, Kazuyuki; Hayasaka, Kiyoshi

    2013-09-01

    Here we report a case of a 10-year-old female with unclassified epileptic encephalopathy who showed forced normalization after administration of levetiracetam (LEV). She initially presented with intractable tonic and myoclonic seizures that were observed about 10 times a day along with frequent multifocal sharp and slow wave complexes on electroencephalography (EEG). We were forced to decrease the topiramate dose because of the appearance of nystagmus, and her myoclonic seizures became worse. We added LEV (250 mg/day) and her tonic and myoclonic seizures disappeared one day after initiation of LEV administration. However, she showed hyporesponsiveness and akinesia. The disappearance of paroxysmal discharges on EEG confirmed the diagnosis of forced normalization. Despite continuous administration of LEV, tonic and myoclonic seizures relapsed within a month but her psychotic symptoms resolved simultaneously. To the best of our knowledge, this is the first reported case of forced normalization after LEV administration. It should be noted that LEV may cause forced normalization although it can be started at an adequate dosage.

  2. Effect of delayed auditory feedback on normal speakers at two speech rates

    NASA Astrophysics Data System (ADS)

    Stuart, Andrew; Kalinowski, Joseph; Rastatter, Michael P.; Lynch, Kerry

    2002-05-01

    This study investigated the effect of short and long auditory feedback delays at two speech rates with normal speakers. Seventeen participants spoke under delayed auditory feedback (DAF) at 0, 25, 50, and 200 ms at normal and fast rates of speech. Participants displayed two to three times more dysfluencies at 200 ms (p<0.05) relative to no delay or the shorter delays. There were significantly more dysfluencies observed at the fast rate of speech (p=0.028). These findings implicate the peripheral feedback system(s) of fluent speakers for the disruptive effects of DAF on normal speech production at long auditory feedback delays. Considering the contrast in fluency/dysfluency exhibited between normal speakers and those who stutter at short and long delays, it appears that speech disruption of normal speakers under DAF is a poor analog of stuttering.

  3. Profilaxia da trombose venosa profunda em cirurgia bariátrica: estudo comparativo com doses diferentes de heparina de baixo peso molecular (Deep venous thrombosis prophylaxis in bariatric surgery: a comparative study with different doses of low-molecular-weight heparin)

    PubMed Central

    Goslan, Carlos José; Baretta, Giórgio Alfredo Pedroso; de Souza, Hemuara Grasiela Pestana; Orsi, Bruna Zanin; Zanoni, Esdras Camargo A.; Lopes, Marco Antonio Gimenez; Engelhorn, Carlos Alberto

    2018-01-01

    Abstract. Background: Bariatric surgery is considered the best option for the treatment of obesity, and these patients are considered at high risk for thromboembolic events. Objectives: To compare different doses of low-molecular-weight heparin (LMWH) for prophylaxis of deep venous thrombosis (DVT) in candidates for bariatric surgery, with respect to the risk of DVT, changes in anti-factor Xa levels, and pre- or postoperative bleeding. Methods: Cross-sectional comparative study of patients undergoing bariatric surgery, divided into two groups that received LMWH doses of 40 mg (control group, CG) and 80 mg (study group, SG). Patients were assessed by vascular ultrasonography and by measurement of KPTT, TAP, platelets and anti-factor Xa. Results: Sixty patients were evaluated, 34 in the CG and 26 in the SG. A significant difference between the SG and the CG was observed only for weight (p = 0.003) and body mass index (p = 0.018). There was no difference in KPTT, TAP, platelets or anti-factor Xa between the groups. No DVT or significant bleeding was detected in either group. Conclusions: There was no statistically significant difference with the use of higher doses of LMWH for DVT prophylaxis in candidates for bariatric surgery with respect to the risk of DVT, anti-factor Xa levels, or pre- or postoperative bleeding.

  4. Normal-pressure hydrocephalus and the saga of the treatable dementias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedland, R.P.

    1989-11-10

    A case study of a 74-year-old woman is presented which illustrates the difficulty of understanding dementing illnesses. A diagnosis of normal-pressure hydrocephalus (NPH) was made because of the development of abnormal gait, with urinary incontinence and severe, diffuse, white matter lesions on the MRI scan. Computed tomographic, MRI scans and positron emission tomographic images of glucose use are presented. The treatable dementias are a large, multifaceted group of illnesses, of which NPH is one. The author proposes a new term for this disorder commonly known as NPH, because the problem with the term normal-pressure hydrocephalus is that the cerebrospinal fluid pressure is not always normal in the disease.

  5. Normalization of mass cytometry data with bead standards

    PubMed Central

    Finck, Rachel; Simonds, Erin F.; Jager, Astraea; Krishnaswamy, Smita; Sachs, Karen; Fantl, Wendy; Pe’er, Dana; Nolan, Garry P.; Bendall, Sean C.

    2013-01-01

    Mass cytometry uses atomic mass spectrometry combined with isotopically pure reporter elements to currently measure as many as 40 parameters per single cell. As with any quantitative technology, there is a fundamental need for quality assurance and normalization protocols. In the case of mass cytometry, the signal variation over time due to changes in instrument performance combined with intervals between scheduled maintenance must be accounted for and then normalized. Here, samples were mixed with polystyrene beads embedded with metal lanthanides, allowing monitoring of mass cytometry instrument performance over multiple days of data acquisition. The protocol described here includes simultaneous measurements of beads and cells on the mass cytometer, subsequent extraction of the bead-based signature, and the application of an algorithm enabling correction of both short- and long-term signal fluctuations. The variation in the intensity of the beads that remains after normalization may also be used to determine data quality. Application of the algorithm to a one-month longitudinal analysis of a human peripheral blood sample reduced the range of median signal fluctuation from 4.9-fold to 1.3-fold. PMID:23512433
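
    The correction idea described above can be sketched briefly: track the bead signature with a running median over acquisition time, then rescale every cell event by the ratio of the global bead level to the local bead level at that event's time. The sketch below uses assumed variable names and a simple running-median smoother; it is an illustration of the approach, not the published algorithm.

```python
import numpy as np

def bead_normalize(event_times, event_signal, bead_times, bead_signal, win=200):
    """Rescale cell-event intensities using bead events measured alongside them."""
    order = np.argsort(bead_times)
    bt, bs = np.asarray(bead_times)[order], np.asarray(bead_signal)[order]
    # running median of bead intensity tracks drifting instrument sensitivity
    smooth = np.array([np.median(bs[max(0, i - win):i + win + 1]) for i in range(len(bs))])
    global_level = np.median(bs)
    # bead level interpolated at each cell event's acquisition time
    local_level = np.interp(event_times, bt, smooth)
    return np.asarray(event_signal) * global_level / local_level

# toy usage: sensitivity decays over a run; beads reveal and correct the drift
rng = np.random.default_rng(2)
t_cells, t_beads = np.sort(rng.uniform(0, 1, 5000)), np.sort(rng.uniform(0, 1, 1000))
drift = lambda t: 1.0 - 0.3 * t
cells = rng.lognormal(3.0, 0.5, 5000) * drift(t_cells)
beads = 500.0 * drift(t_beads) * rng.normal(1.0, 0.02, 1000)
corrected = bead_normalize(t_cells, cells, t_beads, beads)
```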

  6. Strong Bayesian evidence for the normal neutrino hierarchy

    NASA Astrophysics Data System (ADS)

    Simpson, Fergus; Jimenez, Raul; Pena-Garay, Carlos; Verde, Licia

    2017-06-01

    The configuration of the three neutrino masses can take two forms, known as the normal and inverted hierarchies. We compute the Bayesian evidence associated with these two hierarchies. Previous studies found a mild preference for the normal hierarchy, and this was driven by the asymmetric manner in which cosmological data has confined the available parameter space. Here we identify the presence of a second asymmetry, which is imposed by data from neutrino oscillations. By combining constraints on the squared-mass splittings [1] with the limit on the sum of neutrino masses of Σmν < 0.13 eV [2], and using a minimally informative prior on the masses, we infer odds of 42:1 in favour of the normal hierarchy, which is classified as "strong" on the Jeffreys scale. We explore how these odds may evolve in light of higher precision cosmological data, and discuss the implications of this finding with regards to the nature of neutrinos. Finally, the individual masses are inferred to be m1 = 3.80 (+26.2, −3.73) meV; m2 = 8.8 (+18, −1.2) meV; m3 = 50.4 (+5.8, −1.2) meV (95% credible intervals).
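
    The 42:1 figure quoted above is a statement of posterior odds obtained from the ratio of Bayesian evidences. In schematic form (the notation here is generic and assumed for illustration, not copied from the paper):

```latex
% Posterior odds of the normal (NH) over the inverted (IH) hierarchy:
% the Bayes factor (ratio of evidences) times the prior odds.
\frac{P(\mathrm{NH}\mid d)}{P(\mathrm{IH}\mid d)}
  = \frac{P(d\mid \mathrm{NH})}{P(d\mid \mathrm{IH})}
    \times \frac{P(\mathrm{NH})}{P(\mathrm{IH})},
\qquad
P(d\mid M) = \int P(d\mid \theta, M)\,\pi(\theta\mid M)\,\mathrm{d}\theta .
```

    With equal prior odds for the two hierarchies, a Bayes factor of about 42 gives the reported 42:1 preference for the normal hierarchy.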

  7. Pirfenidone normalizes the tumor microenvironment to improve chemotherapy

    PubMed Central

    Papageorgis, Panagiotis; Voutouri, Chrysovalantis; Stylianopoulos, Triantafyllos

    2017-01-01

    Normalization of the tumor microenvironment by selectively targeting components of the tumor extracellular matrix has been recently proposed to have the potential to decompress tumor blood vessels, increase vessel perfusion and, thus, improve drug delivery and the efficacy of cancer therapy. Therefore, we now need to identify safe and well tolerated pharmaceutical agents that are able to remodel the microenvironment of solid tumors and enhance chemotherapy. In this study, we repurposed Pirfenidone, a clinically approved anti-fibrotic drug for the treatment of idiopathic pulmonary fibrosis, to investigate its possible role in tumor microenvironment normalization. Using two orthotopic mammary tumor models, we demonstrate that Pirfenidone reduces collagen and hyaluronan levels and, as a result, significantly increases blood vessel functionality and perfusion and improves the anti-tumor efficacy of doxorubicin. Reduction of extracellular matrix components was mediated via inhibition of TGFβ signaling, reflected in downregulation of TGFβ1, COL1A1, COL3A1, HAS2 and HAS3 expression levels. Our findings provide evidence that repurposing Pirfenidone could be used as a promising strategy to enhance drug delivery to solid tumors by normalizing the tumor microenvironment. PMID:28445938

  8. Charting the Course for Civil Affairs in the New Normal

    DTIC Science & Technology

    2015-07-01

    Charting the Course for Civil Affairs in the New Normal, Vera Zakem and Emily Mushen, July 2015. Report contents include "Challenges for Joint CA in the New Normal Environment" and new requirements for civil affairs. ... deliver essential services in preparation for handing full control back to the host nation government. Other independent civil affairs teams worked

  9. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians

    PubMed Central

    Ghasemi, Asghar; Zahediasl, Saleh

    2012-01-01

    Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
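
    The checks discussed in this commentary are run in SPSS, but the same tests are available in most scientific software. As an analogous illustration (assumed here, not taken from the article), common normality tests can be run in a few lines of Python with SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = rng.normal(loc=10.0, scale=2.0, size=80)

# Shapiro-Wilk: usually preferred for small-to-moderate sample sizes
w_stat, w_p = stats.shapiro(sample)

# Kolmogorov-Smirnov against a normal with parameters estimated from the sample;
# estimating the parameters from the same data makes this p-value approximate
# (a Lilliefors-type correction would be stricter)
ks_stat, ks_p = stats.kstest(sample, 'norm', args=(sample.mean(), sample.std(ddof=1)))

print(f"Shapiro-Wilk p = {w_p:.3f}, Kolmogorov-Smirnov p = {ks_p:.3f}")
# p-values above the chosen alpha (e.g. 0.05) do not reject the normality assumption
```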

  10. Deconstructing Interocular Suppression: Attention and Divisive Normalization.

    PubMed

    Li, Hsin-Hung; Carrasco, Marisa; Heeger, David J

    2015-10-01

    In interocular suppression, a suprathreshold monocular target can be rendered invisible by a salient competitor stimulus presented in the other eye. Despite decades of research on interocular suppression and related phenomena (e.g., binocular rivalry, flash suppression, continuous flash suppression), the neural processing underlying interocular suppression is still unknown. We developed and tested a computational model of interocular suppression. The model included two processes that contributed to the strength of interocular suppression: divisive normalization and attentional modulation. According to the model, the salient competitor induced a stimulus-driven attentional modulation selective for the location and orientation of the competitor, thereby increasing the gain of neural responses to the competitor and reducing the gain of neural responses to the target. Additional suppression was induced by divisive normalization in the model, similar to other forms of visual masking. To test the model, we conducted psychophysics experiments in which both the size and the eye-of-origin of the competitor were manipulated. For small and medium competitors, behavioral performance was consonant with a change in the response gain of neurons that responded to the target. But large competitors induced a contrast-gain change, even when the competitor was split between the two eyes. The model correctly predicted these results and outperformed an alternative model in which the attentional modulation was eye specific. We conclude that both stimulus-driven attention (selective for location and feature) and divisive normalization contribute to interocular suppression.
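
    For reference, the divisive normalization computation invoked in this model can be written in a generic form (a standard textbook formulation with an assumed attentional gain term, not the paper's exact parameterization):

```latex
% Response to the target: stimulus drive divided by a pooled suppressive drive.
% A is an attentional gain applied to the competitor's contribution to the pool.
R_{\mathrm{target}}
  = R_{\max}\,
    \frac{c_{\mathrm{target}}^{\,n}}
         {\sigma^{n} + c_{\mathrm{target}}^{\,n} + A\,c_{\mathrm{competitor}}^{\,n}}
```

    Whether the competitor's contribution effectively rescales the whole response (a response-gain change) or shifts the contrast-response curve (a contrast-gain change) is the distinction against which the behavioural results above are interpreted.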

  12. [Quantification of acetabular coverage in normal adult].

    PubMed

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. A practical AutoLISP program on PC AutoCAD has been developed by us to quantify the acetabular coverage through numerical expression of the images of computed tomography. Thirty adults (60 hips) with normal center-edge angle and acetabular index in plain X ray were randomly selected for serial computed tomography. These slices were prepared with a fixed coordinate system and in continuous sections of 5 mm thickness. The contours of the cartilage of each section were digitized into a PC computer and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio of greater than 80%, an anterior coverage ratio of greater than 75% and a posterior coverage ratio of greater than 80% can be categorized in a normal group. Polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, medial coverage and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  13. DNorm: disease name normalization with pairwise learning to rank.

    PubMed

    Leaman, Robert; Islamaj Dogan, Rezarta; Lu, Zhiyong

    2013-11-15

    Despite the central role of diseases in biomedical research, there have been far fewer attempts to automatically determine which diseases are mentioned in a text (the task of disease name normalization, DNorm) compared with other normalization tasks in biomedical text mining research. In this article we introduce the first machine learning approach for DNorm, using the NCBI disease corpus and the MEDIC vocabulary, which combines MeSH® and OMIM. Our method is a high-performing and mathematically principled framework for learning similarities between mentions and concept names directly from training data. The technique is based on pairwise learning to rank, which has not previously been applied to the normalization task but has proven successful in large optimization problems for information retrieval. We compare our method with several techniques based on lexical normalization and matching, MetaMap and Lucene. Our algorithm achieves 0.782 micro-averaged F-measure and 0.809 macro-averaged F-measure, an increase over the highest performing baseline method of 0.121 and 0.098, respectively. The source code for DNorm is available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/DNorm, along with a web-based demonstration and links to the NCBI disease corpus. Results on PubMed abstracts are available in PubTator: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/PubTator.
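
    The pairwise learning-to-rank idea underlying the method can be sketched compactly: mentions and concept names are vectors, similarity is a bilinear score m^T W c, and W is nudged whenever an incorrect concept outranks the correct one. The names, the plain margin update and the toy vectors below are assumptions for illustration; this is not the DNorm source code.

```python
import numpy as np

def score(W, m, c):
    """Bilinear similarity between a mention vector m and a concept vector c."""
    return m @ W @ c

def pairwise_rank_update(W, m, c_pos, c_neg, lr=0.1, margin=1.0):
    """If the correct concept c_pos does not beat the wrong concept c_neg
    by the margin, nudge W in the direction that widens the gap."""
    if score(W, m, c_pos) - score(W, m, c_neg) < margin:
        W += lr * (np.outer(m, c_pos) - np.outer(m, c_neg))
    return W

# toy usage with 5-dimensional TF-IDF-like vectors
rng = np.random.default_rng(4)
dim = 5
W = np.eye(dim)                       # start from a plain dot-product similarity
mention = rng.random(dim)
correct_concept = rng.random(dim)
wrong_concept = rng.random(dim)
for _ in range(20):
    W = pairwise_rank_update(W, mention, correct_concept, wrong_concept)
```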

  14. Birkhoff Normal Form for Some Nonlinear PDEs

    NASA Astrophysics Data System (ADS)

    Bambusi, Dario

    We consider the problem of extending to PDEs the Birkhoff normal form theorem on Hamiltonian systems close to nonresonant elliptic equilibria. As a model problem we take the nonlinear wave equation with Dirichlet boundary conditions on [0, π], where g is an analytic skew-symmetric function which vanishes for u = 0 and is periodic with period 2π in the x variable. We prove, under a nonresonance condition which is fulfilled for most g's, that for any integer M there exists a canonical transformation that puts the Hamiltonian in Birkhoff normal form up to a remainder of order M. The canonical transformation is well defined in a neighbourhood of the origin of a Sobolev-type phase space of sufficiently high order. Some dynamical consequences are obtained. The technique of proof is applicable to quite general semilinear equations in one space dimension.

  15. Testing the Normality of Residuals.

    DTIC Science & Technology

    1982-09-01

    Ramsey (1969) and Ramsey and Gilbert (1972) investigate tests for detection of regression specification errors such as omitted variables. (Report AD-A120 997, Testing the Normality of Residuals, Mathematics Research Center, University of Wisconsin-Madison, September 1982.)

  16. Multispectral histogram normalization contrast enhancement

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
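
    The decorrelation enhancement amounts to rotating the bands to principal components, equalizing the component variances, and rotating back so the bands keep their original orientation. A minimal NumPy sketch follows (function and variable names are assumed for illustration, not the original implementation):

```python
import numpy as np

def decorrelation_stretch(bands):
    """bands: array of shape (n_pixels, n_bands). Returns decorrelated bands."""
    mean = bands.mean(axis=0)
    X = bands - mean
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)            # principal components
    pcs = X @ evecs                               # rotate to PC coordinates
    pcs /= np.sqrt(evals)                         # equalize component variances
    return pcs @ evecs.T + mean                   # rotate back; correlation removed

# toy usage: three highly correlated "bands"
rng = np.random.default_rng(5)
base = rng.normal(size=(1000, 1))
data = np.hstack([base + 0.1 * rng.normal(size=(1000, 1)) for _ in range(3)])
stretched = decorrelation_stretch(data)
print(np.corrcoef(stretched, rowvar=False).round(3))   # ~identity: interband correlation gone
```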

  17. Validation of Normalizations, Scaling, and Photofading Corrections for FRAP Data Analysis

    PubMed Central

    Kang, Minchul; Andreani, Manuel; Kenworthy, Anne K.

    2015-01-01

    Fluorescence Recovery After Photobleaching (FRAP) has been a versatile tool to study transport and reaction kinetics in live cells. Since the fluorescence data generated by fluorescence microscopy are in a relative scale, a wide variety of scalings and normalizations are used in quantitative FRAP analysis. Scaling and normalization are often required to account for inherent properties of diffusing biomolecules of interest or photochemical properties of the fluorescent tag such as mobile fraction or photofading during image acquisition. In some cases, scaling and normalization are also used for computational simplicity. However, to our best knowledge, the validity of those various forms of scaling and normalization has not been studied in a rigorous manner. In this study, we investigate the validity of various scalings and normalizations that have appeared in the literature to calculate mobile fractions and correct for photofading and assess their consistency with FRAP equations. As a test case, we consider linear or affine scaling of normal or anomalous diffusion FRAP equations in combination with scaling for immobile fractions. We also consider exponential scaling of either FRAP equations or FRAP data to correct for photofading. Using a combination of theoretical and experimental approaches, we show that compatible scaling schemes should be applied in the correct sequential order; otherwise, erroneous results may be obtained. We propose a hierarchical workflow to carry out FRAP data analysis and discuss the broader implications of our findings for FRAP data analysis using a variety of kinetic models. PMID:26017223
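
    A familiar example of the scaling choices examined in this study is double normalization, in which the bleached-region signal is divided by an unbleached reference region to correct observational photofading before being scaled to the pre-bleach level; schematically (notation assumed for illustration, and not necessarily the specific scheme validated here):

```latex
% F(t): mean intensity in the bleached ROI; F_ref(t): intensity of an unbleached
% reference region; "pre" denotes the pre-bleach average.
F_{\mathrm{norm}}(t)
  = \frac{F(t)\,/\,F_{\mathrm{ref}}(t)}
         {F_{\mathrm{pre}}\,/\,F_{\mathrm{ref,\,pre}}}
```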

  18. Spatial frequency discrimination learning in normal and developmentally impaired human vision

    PubMed Central

    Astle, Andrew T.; Webb, Ben S.; McGraw, Paul V.

    2010-01-01

    Perceptual learning effects demonstrate that the adult visual system retains neural plasticity. If perceptual learning holds any value as a treatment tool for amblyopia, trained improvements in performance must generalise. Here we investigate whether spatial frequency discrimination learning generalises within task to other spatial frequencies, and across task to contrast sensitivity. Before and after training, we measured contrast sensitivity and spatial frequency discrimination (at a range of reference frequencies 1, 2, 4, 8, 16 c/deg). During training, normal and amblyopic observers were divided into three groups. Each group trained on a spatial frequency discrimination task at one reference frequency (2, 4, or 8 c/deg). Normal and amblyopic observers who trained at lower frequencies showed a greater rate of within task learning (at their reference frequency) compared to those trained at higher frequencies. Compared to normals, amblyopic observers showed greater within task learning, at the trained reference frequency. Normal and amblyopic observers showed asymmetrical transfer of learning from high to low spatial frequencies. Both normal and amblyopic subjects showed transfer to contrast sensitivity. The direction of transfer for contrast sensitivity measurements was from the trained spatial frequency to higher frequencies, with the bandwidth and magnitude of transfer greater in the amblyopic observers compared to normals. The findings provide further support for the therapeutic efficacy of this approach and establish general principles that may help develop more effective protocols for the treatment of developmental visual deficits. PMID:20832416

  19. A Cancer-Indicative microRNA Pattern in Normal Prostate Tissue

    PubMed Central

    Hellwinkel, Olaf J. C.; Sellier, Christina; Sylvester, Yu-Mi Jessica; Brase, Jan C.; Isbarn, Hendrik; Erbersdobler, Andreas; Steuber, Thomas; Sültmann, Holger; Schlomm, Thorsten; Wagner, Christina

    2013-01-01

    We analyzed the levels of selected micro-RNAs in normal prostate tissue to assess their potential to indicate tumor foci elsewhere in the prostate. Histologically normal prostate tissue samples from 31 prostate cancer patients and two cancer-negative control groups with either unsuspicious or elevated prostate specific antigen (PSA) levels (14 and 17 individuals, respectively) were analyzed. Based on the expression analysis of 157 microRNAs in a pool of prostate tissue samples and information from databases/literature, we selected eight microRNAs for quantification by real-time polymerase chain reactions (RT-PCRs). Selected miRNAs were analyzed in histologically tumor-free biopsy samples from patients and healthy controls. We identified seven microRNAs (miR-124a, miR-146a & b, miR-185, miR-16 and let-7a & b), which displayed significant differential expression in normal prostate tissue from men with prostate cancer compared to both cancer-negative control groups. Four microRNAs (miR-185, miR-16, let-7a and let-7b) continued to discriminate significantly between normal tissues from prostate cancer patients and those of the cancer-negative control group with elevated PSA levels. The transcript levels of these microRNAs were highly indicative of the presence of cancer in the prostate, independently of the PSA level. Our results suggest a microRNA pattern in histologically normal prostate tissue that indicates prostate cancer elsewhere in the organ. PMID:23459235

  20. Somatotype and Body Composition of Normal and Dysphonic Adult Speakers.

    PubMed

    Franco, Débora; Fragoso, Isabel; Andrea, Mário; Teles, Júlia; Martins, Fernando

    2017-01-01

    Voice quality provides information about the anatomical characteristics of the speaker. The patterns of somatotype and body composition can provide essential knowledge to characterize the individuality of voice quality. The aim of this study was to verify if there were significant differences in somatotype and body composition between normal and dysphonic speakers. Cross-sectional study. Anthropometric measurements were taken of a sample of 72 adult participants (40 normal speakers and 32 dysphonic speakers) according to International Society for the Advancement of Kinanthropometry standards, which allowed the calculation of endomorphism, mesomorphism, ectomorphism components, body density, body mass index, fat mass, percentage fat, and fat-free mass. Perception and acoustic evaluations as well as nasoendoscopy were used to assign speakers into normal or dysphonic groups. There were no significant differences between normal and dysphonic speakers in the mean somatotype attitudinal distance and somatotype dispersion distance (in spite of marginally significant differences [P < 0.10] in somatotype attitudinal distance and somatotype dispersion distance between groups) and in the mean vector of the somatotype components. Furthermore, no significant differences were found between groups concerning the mean of percentage fat, fat mass, fat-free mass, body density, and body mass index after controlling by sex. The findings suggested no significant differences in the somatotype and body composition variables, between normal and dysphonic speakers. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  1. Automated quantification of myocardial perfusion SPECT using simplified normal limits.

    PubMed

    Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido

    2005-01-01

    To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 +/- 0.02) was higher than by visual scoring (0.83 +/- 0.03) (P = .039) or standard quantification (0.82 +/- 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 +/- 0.02) than for standard quantification (0.85 +/- 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.

  2. EGSIEM: Combination of GRACE monthly gravity models on normal equation level

    NASA Astrophysics Data System (ADS)

    Meyer, Ulrich; Jean, Yoomin; Jäggi, Adrian; Mayer-Gürr, Torsten; Neumayer, Hans; Lemoine, Jean-Michel

    2016-04-01

    One of the three geodetic services to be realized in the frame of the EGSIEM project is a scientific combination service. Each associated processing center (AC) will follow a set of common processing standards but will apply its own, independent analysis method. Therefore the quality, robustness and reliability of the combined monthly gravity fields are expected to improve significantly compared to the individual solutions. The monthly GRACE gravity fields of all ACs are combined on normal equation level. The individual normal equations are weighted depending on pairwise comparisons of the individual gravity field solutions. To derive these weights and for quality control of the individual contributions, first a combination of the monthly gravity fields on solution level is performed. The concept of weighting and of the combination on normal equation level is introduced, and the formats used for normal equation exchange and gravity field solutions are described. First results of the combination on normal equation level are presented and compared to the corresponding combinations on solution level. EGSIEM has an open data policy and all processing centers of GRACE gravity fields are invited to participate in the combination.
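
    Combination on normal equation level amounts to a weighted stacking of each AC's normal matrix and right-hand side before a single inversion. The sketch below shows the arithmetic with assumed variable names and synthetic data; it is not the EGSIEM software.

```python
import numpy as np

def combine_normal_equations(normals, rhs, weights):
    """Combine per-centre normal equations N_i x = b_i with weights w_i:
    solve (sum_i w_i N_i) x = sum_i w_i b_i."""
    N = sum(w * Ni for w, Ni in zip(weights, normals))
    b = sum(w * bi for w, bi in zip(weights, rhs))
    return np.linalg.solve(N, b)

# toy usage: three "centres" estimating the same 4 parameters
rng = np.random.default_rng(6)
truth = np.array([1.0, -2.0, 0.5, 3.0])
normals, rhs = [], []
for _ in range(3):
    A = rng.normal(size=(50, 4))                 # per-centre design matrix
    y = A @ truth + rng.normal(0.0, 0.1, 50)     # noisy observations
    normals.append(A.T @ A)
    rhs.append(A.T @ y)
weights = [1.0, 0.8, 1.2]                        # e.g. derived from pairwise comparisons
x_combined = combine_normal_equations(normals, rhs, weights)
```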

  3. Dynamic Visual Acuity While Walking in Normals and Labyrinthine-Deficient Patients

    NASA Technical Reports Server (NTRS)

    Hillman, Edward J.; Bloomberg, Jacob J.; McDonald, P. Vernon; Cohen, Helen S.

    1996-01-01

    We describe a new, objective, easily administered test of dynamic visual acuity (DVA) while walking. Ten normal subjects and five patients with histories of severe bilateral vestibular dysfunctions participated in this study. Subjects viewed a visual display of numerals of different font sizes presented on a laptop computer while they stood still and while they walked on a motorized treadmill. Treadmill speed was adapted for 4 of 5 patients. Subjects were asked to identify the numerals as they appeared on the computer screen. Test results were reasonably repeatable in normals. The percent correct responses at each font size dropped slightly while walking in normals and dropped significantly more in patients. Patients performed significantly worse than normals while standing still and while walking. This task may be useful for evaluating post-flight astronauts and vestibularly impaired patients.

  4. Strelka: accurate somatic small-variant calling from sequenced tumor-normal sample pairs.

    PubMed

    Saunders, Christopher T; Wong, Wendy S W; Swamy, Sajani; Becq, Jennifer; Murray, Lisa J; Cheetham, R Keira

    2012-07-15

    Whole genome and exome sequencing of matched tumor-normal sample pairs is becoming routine in cancer research. The consequent increased demand for somatic variant analysis of paired samples requires methods specialized to model this problem so as to sensitively call variants at any practical level of tumor impurity. We describe Strelka, a method for somatic SNV and small indel detection from sequencing data of matched tumor-normal samples. The method uses a novel Bayesian approach which represents continuous allele frequencies for both tumor and normal samples, while leveraging the expected genotype structure of the normal. This is achieved by representing the normal sample as a mixture of germline variation with noise, and representing the tumor sample as a mixture of the normal sample with somatic variation. A natural consequence of the model structure is that sensitivity can be maintained at high tumor impurity without requiring purity estimates. We demonstrate that the method has superior accuracy and sensitivity on impure samples compared with approaches based on either diploid genotype likelihoods or general allele-frequency tests. The Strelka workflow source code is available at ftp://strelka@ftp.illumina.com/. csaunders@illumina.com
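
    The mixture structure described above can be illustrated with a deliberately small toy model: treat the normal sample's allele frequency as germline genotype plus noise, and the tumor sample's allele frequency as a mixture of the normal frequency with a somatic variant fraction, then compare likelihoods. The binomial read model, parameter values and function names below are illustrative assumptions, not Strelka's actual probabilistic model.

```python
from scipy.stats import binom

def tumor_normal_loglik(n_alt, n_depth, t_alt, t_depth, somatic_fraction, noise=0.01):
    """Toy tumor-normal model: the normal sample is homozygous reference plus
    sequencing noise; the tumor is a mixture of the normal allele frequency
    with a heterozygous somatic variant present in `somatic_fraction` of cells."""
    normal_af = noise
    tumor_af = (1.0 - somatic_fraction) * normal_af + somatic_fraction * 0.5
    return (binom.logpmf(n_alt, n_depth, normal_af) +
            binom.logpmf(t_alt, t_depth, tumor_af))

# compare a somatic hypothesis with a no-variant hypothesis at one site
ll_somatic = tumor_normal_loglik(1, 60, 12, 80, somatic_fraction=0.6)
ll_no_variant = tumor_normal_loglik(1, 60, 12, 80, somatic_fraction=0.0)
print("log-likelihood ratio in favour of a somatic call:", ll_somatic - ll_no_variant)
```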

  5. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
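
    The recommended ordering, transform first and adjust for covariates second, can be demonstrated in a short simulation. The Blom-type offset, the skewed outcome model and all variable names below are assumptions for illustration, not the study's own code.

```python
import numpy as np
from scipy.stats import rankdata, norm

def rank_based_int(y, c=3.0 / 8.0):
    """Rank-based inverse normal transformation (Blom offset by default)."""
    r = rankdata(y)
    return norm.ppf((r - c) / (len(y) - 2 * c + 1))

def residualize(y, covariates):
    """Residuals of y after ordinary least-squares regression on the covariates."""
    X = np.column_stack([np.ones(len(y)), covariates])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

rng = np.random.default_rng(7)
covar = rng.normal(size=500)
y = np.exp(0.5 * covar + rng.normal(size=500))      # skewed, covariate-dependent outcome

# recommended: transform first, then adjust for covariates
resid_good = residualize(rank_based_int(y), covar)
# problematic: adjust first, then transform the residuals
resid_bad = rank_based_int(residualize(y, covar))

print(np.corrcoef(resid_good, covar)[0, 1])   # ~0: residuals stay uncorrelated with the covariate
print(np.corrcoef(resid_bad, covar)[0, 1])    # a linear correlation is typically re-introduced
```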

  6. Multiple Regression with Varying Levels of Correlation among Predictors: Monte Carlo Sampling from Normal and Non-Normal Populations.

    ERIC Educational Resources Information Center

    Vasu, Ellen Storey

    1978-01-01

    The effects of the violation of the assumption of normality in the conditional distributions of the dependent variable, coupled with the condition of multicollinearity upon the outcome of testing the hypothesis that the regression coefficient equals zero, are investigated via a Monte Carlo study. (Author/JKS)

  7. Three-dimensional assessment of the normal Japanese glenoid and comparison with the normal French glenoid.

    PubMed

    Mizuno, N; Nonaka, S; Ozaki, R; Yoshida, M; Yoneda, M; Walch, G

    2017-12-01

    In 2014, reverse total shoulder arthroplasty (RSA) was approved in Japan. We were concerned that the base plate might be incompatible with Japanese patients, who are generally smaller than Westerners. Therefore, we investigated the dimensions and morphology of the normal Japanese glenoid and compared them with the normal French glenoid. One hundred Japanese shoulders without glenoid lesions (50 men and 50 women) were investigated and compared with 100 French shoulders (50 men and 50 women). Computed tomography was performed with 3-dimensional image reconstruction and images were analyzed using Glenosys software. Glenoid parameters (width, height, retroversion and inclination) were compared between Japanese and French subjects. In Japanese subjects, the mean glenoid width was 25.5 mm, height was 33.3 mm, retroversion was 2.3° and inclination was 11.6° superiorly. In French subjects, the mean glenoid width was 26.7 mm, height was 35.4 mm, retroversion was 6.0° and inclination was 10.4° superiorly. Glenoid width and height were significantly smaller in Japanese subjects than French subjects (P=0.001 and P<0.001), while retroversion was significantly greater in French subjects (P<0.001). There was no significant difference in inclination. These findings will help surgeons to identify suitable patients for RSA and perform the procedure with appropriate preoperative planning. IV: retrospective or historical series. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  8. Iris Segmentation and Normalization Algorithm Based on Zigzag Collarette

    NASA Astrophysics Data System (ADS)

    Rizky Faundra, M.; Ratna Sulistyaningrum, Dwi

    2017-01-01

    In this paper, we propose an iris segmentation and normalization algorithm based on the zigzag collarette. First, iris images are processed using Canny edge detection to detect the pupil edge; then the center and the radius of the pupil are found with the Hough circle transform. Next, the important part of the iris is isolated based on the zigzag collarette area. Finally, the Daugman rubber sheet model is applied to obtain a fixed-dimension (normalized) iris by transforming Cartesian into polar coordinates, and a thresholding technique is used to remove the eyelid and eyelashes. The experiment is conducted with grayscale eye images taken from the iris database of the Chinese Academy of Sciences Institute of Automation (CASIA). The iris data used are reliable and widely used to study iris biometrics. The results show that a threshold level of 0.3 gives better accuracy than other levels, so the present algorithm can be used for segmentation and normalization of the zigzag collarette with an accuracy of 98.88%.
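
    The normalization step, the Daugman rubber sheet model, remaps the annular iris region between the pupil boundary and the outer iris boundary onto a rectangle of fixed size, so that irises captured at different scales become comparable. The sketch below shows that remapping in NumPy with assumed centres, radii and output dimensions; it illustrates the technique rather than reproducing the authors' implementation.

```python
import numpy as np

def rubber_sheet(image, cx, cy, r_pupil, r_iris, radial_res=64, angular_res=256):
    """Unwrap the annulus between the pupil (r_pupil) and iris (r_iris)
    boundaries, both centred at (cx, cy), into a radial_res x angular_res grid."""
    thetas = np.linspace(0, 2 * np.pi, angular_res, endpoint=False)
    radii = np.linspace(0, 1, radial_res)
    out = np.zeros((radial_res, angular_res), dtype=image.dtype)
    for i, r in enumerate(radii):
        # linear interpolation between the pupil and iris boundaries at each angle
        rho = r_pupil + r * (r_iris - r_pupil)
        xs = np.clip((cx + rho * np.cos(thetas)).astype(int), 0, image.shape[1] - 1)
        ys = np.clip((cy + rho * np.sin(thetas)).astype(int), 0, image.shape[0] - 1)
        out[i, :] = image[ys, xs]
    return out

# toy usage: synthetic eye image, with pupil centre and radii as if found by
# Canny edge detection followed by a Hough circle transform
eye = np.random.default_rng(8).integers(0, 255, (280, 320), dtype=np.uint8)
normalized_iris = rubber_sheet(eye, cx=160, cy=140, r_pupil=30, r_iris=90)
```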

  9. Evaluation of [(201)Tl](III) Vancomycin in normal rats.

    PubMed

    Jalilian, Amir Reza; Hosseini, Mohammad Amin; Majdabadi, Abbas; Saddadi, Fariba

    2008-01-01

    Tl-201 has potential in the preparation of radiolabelled compounds similar to its homologues, like In-111 and radiogallium. In this paper, the recently prepared [(201)Tl](III) vancomycin complex ([(201)Tl](III)VAN) has been evaluated for its biological properties. [(201)Tl](III)VAN was prepared according to the optimized conditions, followed by biodistribution studies in normal rats for up to 52 h. The Staphylococcus aureus specific binding was checked in vitro. The complex was finally injected into normal rats. Tracer SPECT images were obtained in normal animals and compared to those of (67)Ga-citrate. Freshly prepared [(201)Tl](III)VAN batches (radiochemical yield > 99%, radiochemical purity > 98%, specific activity approximately 1.2 Ci/mmol) showed a similar biodistribution to that of unlabeled vancomycin. The microorganism binding ratios were 3 and 9 for tracer (201)Tl(3+) and tracer (201)Tl(III)DTPA, respectively, suggesting the preservation of the tracer bioactivity. As a nonspecific cell penetrating tracer, [(201)Tl](III)DTPA was used.

  10. Correction of Bowtie-Filter Normalization and Crescent Artifacts for a Clinical CBCT System.

    PubMed

    Zhang, Hong; Kong, Vic; Huang, Ke; Jin, Jian-Yue

    2017-02-01

    To present our experiences in understanding and minimizing bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a clinical cone beam computed tomography system. Bowtie-filter position and profile variations during gantry rotation were studied. Two previously proposed strategies (A and B) were applied to the clinical cone beam computed tomography system to correct bowtie-filter crescent artifacts. Physical calibration and analytical approaches were used to minimize the norm phantom misalignment and to correct for bowtie-filter normalization artifacts. A combined procedure to reduce bowtie-filter crescent artifacts and bowtie-filter normalization artifacts was proposed and tested on a norm phantom, CatPhan, and a patient and evaluated using standard deviation of Hounsfield unit along a sampling line. The bowtie-filter exhibited not only a translational shift but also an amplitude variation in its projection profile during gantry rotation. Strategy B was better than strategy A slightly in minimizing bowtie-filter crescent artifacts, possibly because it corrected the amplitude variation, suggesting that the amplitude variation plays a role in bowtie-filter crescent artifacts. The physical calibration largely reduced the misalignment-induced bowtie-filter normalization artifacts, and the analytical approach further reduced bowtie-filter normalization artifacts. The combined procedure minimized both bowtie-filter crescent artifacts and bowtie-filter normalization artifacts, with Hounsfield unit standard deviation being 63.2, 45.0, 35.0, and 18.8 Hounsfield unit for the best correction approaches of none, bowtie-filter crescent artifacts, bowtie-filter normalization artifacts, and bowtie-filter normalization artifacts + bowtie-filter crescent artifacts, respectively. The combined procedure also demonstrated reduction of bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a CatPhan and a patient. We have developed a step

  11. Gestational trophoblastic neoplasia after spontaneous human chorionic gonadotropin normalization following molar pregnancy evacuation.

    PubMed

    Braga, Antonio; Maestá, Izildinha; Matos, Michelle; Elias, Kevin M; Rizzo, Julianna; Viggiano, Maurício Guilherme Campos

    2015-11-01

    To evaluate the risk of gestational trophoblastic neoplasia (GTN) after spontaneous human chorionic gonadotropin (hCG) normalization in postmolar follow-up. Retrospective chart review of 2284 consecutive cases of hydatidiform mole with spontaneous normalization of hCG following uterine evacuation treated at one of five Brazilian reference centers from January 2002 to June 2013. After hCG normalization, GTN occurred in 10/2284 patients (0.4%; 95% CI 0.2%-0.8%). GTN developed in 9/1424 patients (0.6%; 95% CI 0.3%-1.2%) after a complete hydatidiform mole, in 1/849 patients (0.1%; 95% CI <0.01%-0.7%) after a partial hydatidiform mole, and in 0/13 patients (0%; 95% CI 0%-27%) after a twin molar pregnancy. The median time to GTN diagnosis after hCG normalization was 18 months, and no diagnoses were made before six months of postmolar surveillance. Patients who required more than 56 days to achieve a normal hCG value had a ten-fold increased risk of developing GTN after hCG normalization (9/1074; 0.8%; 95% CI 0.4%-1.6%) compared to those who reached a normal hCG level in fewer than 56 days (1/1210; 0.08%; 95% CI <0.01%-0.5%; p=0.008). All patients presented with symptoms at the time of GTN diagnosis. GTN after spontaneous hCG normalization following molar pregnancy is exceedingly rare, and the few patients who do develop GTN after achieving a normal hCG value are likely to be diagnosed after completing the commonly recommended six months of postmolar surveillance. Current recommendations for surveillance after hCG normalization should be revisited. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Site-Specific Differentiation of Fibroblasts in Normal and Scleroderma Skin

    DTIC Science & Technology

    2010-06-01

    Site-Specific Differentiation of Fibroblasts in Normal and Scleroderma Skin. Principal Investigator: Howard Y. Chang, M.D., Ph.D. (2010). Subject terms: scleroderma, fibroblasts, gene expression.

  13. Differentiation of Normal and Malignant Breast Tissues using Infrared Spectroscopy

    NASA Astrophysics Data System (ADS)

    Mehrotra, Ranjana; Jangir, Deepak Kumar; Gupta, Alka; Kandpal, H. C.

    2008-11-01

    Infrared spectra of carcinomatous tissues and their normal counterparts were collected in the 600 cm-1 to 4000 cm-1 region. Fourier Transform Infrared (FTIR) data of infiltrating ductal carcinoma of the breast with different grades of malignancy from patients of different age groups were analyzed. The infrared spectra demonstrate significant spectral differences between sections of normal and malignant breast tissue. In particular, changes in the frequency and intensity of the protein, nucleic acid and glycogen bands were observed. This allows a qualitative and semi-quantitative evaluation of the changes in proliferation activity from normal to diseased tissue. The findings establish a framework for additional studies, which may make it possible to relate the diseased state to its infrared spectra.

  14. [Lower urinary tract dysfunction in normal pressure hydrocephalus: Review of the literature].

    PubMed

    Bey, E; Nicot, B; Casez, O; Le Normand, L

    2016-12-01

    Lower urinary tract dysfunction in normal pressure hydrocephalus has received little attention from the scientific community. The aim of this review article was to discuss diagnostic and therapeutic options for these patients. A literature review of MedLine publications on urinary incontinence in normal pressure hydrocephalus was conducted. The following keywords were used: "hydrocephalus, normal pressure" and "bladder dysfunction" or "urinary incontinence" or "overactive bladder" or "urinary bladder, neurogenic". Prospective and retrospective studies as well as previous reviews were analyzed. Urinary symptoms in normal pressure hydrocephalus mainly take the form of overactive bladder, which is a significant burden for affected patients. Isolated overactive bladder is more frequent (64%) than urinary incontinence (57%). Detrusor overactivity is seen in 95.2% of cases. Neurosurgery improves urinary symptoms in 61.5% of patients. Bladder recovery after surgery correlates with increased mid-cingulate perfusion, probably reflecting functional restoration of the mid-cingulate cortex, which normally inhibits the micturition reflex. Medical options, whether or not combined with surgery, include anticholinergic drugs that do not cross the blood-brain barrier, transcutaneous electrical nerve stimulation and sacral neuromodulation. Urinary symptoms in normal pressure hydrocephalus currently receive insufficient attention. This article highlights the importance of harmonizing neuro-urological practices in the pre-therapeutic evaluation of patients suffering from normal pressure hydrocephalus. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  15. Investigation into the Use of Normal and Half-Normal Plots for Interpreting Results from Screening Experiments.

    DTIC Science & Technology

    1987-03-25

    by Lloyd (1952) using generalized least squares instead of ordinary least squares, and by Wilk, Gnanadesikan, and Freeny (1963) using a maximum...plot. The half-normal distribution is a special case of the gamma distribution proposed by Wilk, Gnanadesikan, and Huyett (1962). VARIATIONS ON THE... Gnanadesikan, R. Probability plotting methods for the analysis of data. Biometrika, 1968, 55, 1-17. This paper describes and discusses graphical techniques
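
    The record above survives only as fragments, but its topic, half-normal plots for screening experiments, follows a standard recipe: order the absolute effect estimates and plot them against half-normal quantiles, so that inactive effects fall on a rough line through the origin and active effects stand out. The sketch below is a generic illustration of that recipe, not code from the report, and the example effect values are invented.

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

def half_normal_plot(effects):
    """Half-normal plot of factor-effect estimates from a screening design:
    ordered absolute effects against half-normal quantiles.  Points that leave
    the near-linear bulk suggest active factors."""
    abs_eff = np.sort(np.abs(np.asarray(effects, dtype=float)))
    m = len(abs_eff)
    q = stats.halfnorm.ppf((np.arange(1, m + 1) - 0.5) / m)  # plotting positions
    plt.scatter(q, abs_eff)
    plt.xlabel("Half-normal quantile")
    plt.ylabel("|effect estimate|")
    plt.show()

# Invented contrasts from an unreplicated two-level screening design
half_normal_plot([0.4, -0.2, 5.1, 0.7, -3.8, 0.1, 0.9])
```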

  16. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    PubMed

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
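
    Step (iii) of the strategy mentions probabilistic quotient normalization (PQN). The sketch below illustrates the general PQN idea only, dividing each sample by the median of its feature-wise quotients against a reference spectrum; it is not the authors' implementation, and the reference choice and toy data are assumptions.

```python
import numpy as np

def pqn_normalize(X, reference=None):
    """Probabilistic quotient normalization (generic sketch).
    X: (samples x features) intensity matrix.
    reference: reference spectrum; defaults to the feature-wise median across samples."""
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = np.median(X, axis=0)
    # Feature-wise quotients against the reference (skip zero-intensity features)
    with np.errstate(divide="ignore", invalid="ignore"):
        quotients = np.where(reference > 0, X / reference, np.nan)
    # Per-sample dilution factor = median quotient; divide it out
    dilution = np.nanmedian(quotients, axis=1, keepdims=True)
    return X / dilution

# Hypothetical matrix: 3 urine samples x 5 features, sample 2 twice as concentrated
X = np.array([[10, 20, 30, 40, 50],
              [20, 40, 60, 80, 100],
              [11, 19, 31, 42, 48]])
print(pqn_normalize(X))
```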

  17. EMG normalization to study muscle activation in cycling.

    PubMed

    Rouffet, David M; Hautier, Christophe A

    2008-10-01

    The value of the electromyography (EMG) signal is sensitive to many physiological and non-physiological factors. The purpose of the present study was to determine whether the torque-velocity test (T-V) can be used to normalize EMG signals into a framework of biological significance. Peak EMG amplitude of gluteus maximus (GMAX), vastus lateralis (VL), rectus femoris (RF), biceps femoris long head (BF), gastrocnemius medialis (GAS) and soleus (SOL) was calculated for nine subjects during isometric maximal voluntary contractions (IMVC) and torque-velocity bicycling tests (T-V). The reference EMG signals obtained from the IMVC and T-V bicycling tests were then used to normalize the amplitude of the EMG signals collected for 15 different submaximal pedaling conditions. The results of this study showed that the repeatability of the measurements between IMVC (from 10% to 23%) and T-V (from 8% to 20%) was comparable. The peak EMG amplitude of VL was 99+/-43% higher (p<0.001) when measured during T-V. Moreover, the inter-individual variability of the EMG patterns calculated for submaximal cycling exercises differed significantly when using the T-V bicycling normalization method (GMAX: 0.33+/-0.16 vs. 1.09+/-0.04, VL: 0.07+/-0.02 vs. 0.64+/-0.14, SOL: 0.07+/-0.03 vs. 1.00+/-0.07, RF: 1.21+/-0.20 vs. 0.92+/-0.13, BF: 1.47+/-0.47 vs. 0.84+/-0.11). It was concluded that the T-V bicycling test is less time- and energy-consuming than, and as repeatable as, IMVC tests for measuring peak EMG amplitude. Furthermore, this normalization method limits the impact of non-physiological factors on EMG amplitude, allowing better quantification of the activation level of lower limb muscles and of the variability of the EMG patterns during submaximal bicycling exercises.
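
    The core normalization step, expressing submaximal EMG amplitude relative to the peak EMG obtained in a reference test (IMVC or T-V), can be sketched as follows. The envelope filter settings, array names and random toy signals are our own assumptions, not those of the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def linear_envelope(emg, fs, lowpass_hz=9.0):
    """Rectify the raw EMG and low-pass filter it to obtain a linear envelope."""
    b, a = butter(2, lowpass_hz / (fs / 2), btype="low")
    return filtfilt(b, a, np.abs(emg - np.mean(emg)))

def normalize_emg(emg_submax, emg_reference, fs):
    """Express submaximal EMG as a percentage of the peak envelope amplitude
    recorded during a reference test (e.g., IMVC or a torque-velocity sprint)."""
    env_sub = linear_envelope(emg_submax, fs)
    peak_ref = np.max(linear_envelope(emg_reference, fs))
    return 100.0 * env_sub / peak_ref

# Hypothetical 2-s recordings sampled at 1 kHz
fs = 1000
t = np.arange(0, 2, 1 / fs)
ref = np.random.randn(t.size) * 1.0   # stand-in for the reference (T-V) recording
sub = np.random.randn(t.size) * 0.3   # stand-in for a submaximal pedaling recording
print(normalize_emg(sub, ref, fs).max())  # peak activation in % of the reference
```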

  18. Normal Isocurvature Surfaces and Special Isocurvature Circles (SIC)

    NASA Astrophysics Data System (ADS)

    Manoussakis, Gerassimos; Delikaraoglou, Demitris

    2010-05-01

    An isocurvature surface of a gravity field is a surface on which the value of the plumblines' curvature is constant. Here we study the isocurvature surfaces of the Earth's normal gravity field. The normal gravity field is a symmetric gravity field, so the isocurvature surfaces are surfaces of revolution. But even in this case the relations needed to study them are not simple at all. Therefore, to study an isocurvature surface we make special assumptions to form a vector equation which holds only for a small coordinate patch of the isocurvature surface. Yet from the definition of the isocurvature surface and the properties of the normal gravity field it is possible to express very interesting global geometrical properties of these surfaces without invoking surface differential calculus. The gradient of the plumblines' curvature function is perpendicular to an isocurvature surface. If P is a point of an isocurvature surface and "Φ" is the angle of the gradient of the plumblines' curvature with the equatorial plane, then this gradient points along the direction in which the curvature of the plumbline decreases or increases the most, and is therefore related to the strength of the normal gravity field. We will show that this direction is constant along a line of curvature of the isocurvature surface and that this line is an isocurvature circle. In addition, we will show that on each isocurvature surface there is at least one isocurvature circle along which the direction of maximum variation of the plumblines' curvature function is parallel to the equatorial plane of the ellipsoid of revolution. This circle is defined as a Special Isocurvature Circle (SIC). Finally, we shall prove that all these SICs lie on a special surface of revolution, the so-called SIC surface. That is to say, a SIC is not an isolated curve in three-dimensional space.

  19. Blood Ferrokinetics in Normal Man*

    PubMed Central

    Hosain, Fazle; Marsaglia, George; Finch, Clement A.

    1967-01-01

    The clearance of radioiron from plasma and its appearance in circulating erythrocytes in normal subjects are studied. The importance of correcting for plasma iron fluctuations and for mean body hematocrit is illustrated. The data are analyzed by probability theory to determine relationships between intravascular and extravascular iron. Two refluxes are described, one of about 7 particles of every 100 leaving the plasma, and the second of about 23. The return times of these are about 5 hours and 8 days, respectively. PMID:6018746

  20. Normalization matters: tracking the best strategy for sperm miRNA quantification.

    PubMed

    Corral-Vazquez, Celia; Blanco, Joan; Salas-Huetos, Albert; Vidal, Francesca; Anton, Ester

    2017-01-01

    What is the most reliable normalization strategy for sperm microRNA (miRNA) quantitative Reverse Transcription Polymerase Chain Reactions (qRT-PCR) using singleplex assays? The use of the average expression of hsa-miR-100-5p and hsa-miR-30a-5p as a sperm miRNA qRT-PCR data normalizer is suggested as an optimal strategy. Mean-centering methods are the most reliable normalization strategies for miRNA high-throughput expression analyses. Nevertheless, specific trustworthy reference controls must be established in singleplex sperm miRNA qRT-PCRs. Cycle threshold (Ct) values from previously published sperm miRNA expression profiles were normalized using four approaches: (i) Mean-Centering Restricted (MCR) method (taken as the reference strategy); (ii) expression of the small nuclear RNA RNU6B; (iii) expression of four miRNAs selected by the Concordance Correlation Restricted (CCR) algorithm: hsa-miR-100-5p, hsa-miR-146b-5p, hsa-miR-92a-3p and hsa-miR-30a-5p; (iv) the combination of two of these miRNAs that achieved the highest proximity to MCR. Expression profile data from 736 sperm miRNAs were taken from previously published studies performed in fertile donors (n = 10) and infertile patients (n = 38). For each tested normalizer molecule, expression ubiquity and uniformity across the different samples and populations were assessed as indispensable requirements for being considered valid candidates. The reliability of the different normalizing strategies was compared to MCR based on the set of differentially expressed miRNAs (DE-miRNAs) detected between populations, the corresponding predicted targets and the associated enriched biological processes. All tested normalizers were found to be ubiquitous and non-differentially expressed between populations. RNU6B was the least uniformly expressed candidate across samples. Data normalization through RNU6B led to dramatically misguided results when compared to MCR outputs, with a null prediction of target genes and enriched