Sample records for type pearson code

  1. Intricate Li-Sn Disorder in Rare-Earth Metal-Lithium Stannides. Crystal Chemistry of RE3Li4-xSn4+x (RE = La-Nd, Sm; x < 0.3) and Eu7Li8-xSn10+x (x ≈ 2.0).

    PubMed

    Suen, Nian-Tzu; Guo, Sheng-Ping; Hoos, James; Bobev, Svilen

    2018-05-07

    Reported are the syntheses, crystal structures, and electronic structures of six rare-earth metal-lithium stannides with the general formulas RE3Li4-xSn4+x (RE = La-Nd, Sm) and Eu7Li8-xSn10+x. These new ternary compounds have been synthesized by high-temperature reactions of the corresponding elements. Their crystal structures have been established using single-crystal X-ray diffraction methods. The RE3Li4-xSn4+x phases crystallize in the orthorhombic body-centered space group Immm (No. 71) with the Zr3Cu4Si4 structure type (Pearson code oI22), and the Eu7Li8-xSn10+x phase crystallizes in the orthorhombic base-centered space group Cmmm (No. 65) with the Ce7Li8Ge10 structure type (Pearson code oC50). Both structures can be considered as part of the [RESn2]n[RELi2Sn]m homologous series, wherein the structures are intergrowths of imaginary RESn2 (AlB2-like structure type) and RELi2Sn (MgAl2Cu-like structure type) fragments. Close examination of the structures indicates complex occupational Li-Sn disorder, apparently governed by the drive of the structure to achieve an optimal number of valence electrons. This conclusion based on experimental results is supported by detailed electronic structure calculations, carried out using the tight-binding linear muffin-tin orbital method.
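    The Pearson codes quoted above (oI22, oC50) follow a fixed convention: a lowercase letter for the crystal family, an uppercase letter for the Bravais centering, and the number of atoms in the conventional unit cell. A minimal Python sketch of that decoding (the function name and return format are our own):

```python
# Decode a Pearson symbol such as "oI22" into its three components.
FAMILY = {"a": "triclinic", "m": "monoclinic", "o": "orthorhombic",
          "t": "tetragonal", "h": "hexagonal", "c": "cubic"}
CENTERING = {"P": "primitive", "C": "base-centered", "A": "base-centered",
             "B": "base-centered", "S": "base-centered",
             "I": "body-centered", "F": "face-centered", "R": "rhombohedral"}

def decode_pearson(symbol: str) -> tuple[str, str, int]:
    """Split a Pearson symbol into (crystal family, centering, atoms/cell)."""
    family = FAMILY[symbol[0]]          # lowercase crystal-family letter
    centering = CENTERING[symbol[1]]    # uppercase Bravais centering letter
    atoms = int(symbol[2:])             # atom count in the conventional cell
    return family, centering, atoms
```

    For example, the oI22 of the Zr3Cu4Si4 type decodes to an orthorhombic, body-centered cell with 22 atoms.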

  2. A product Pearson-type VII density distribution

    NASA Astrophysics Data System (ADS)

    Nadarajah, Saralees; Kotz, Samuel

    2008-01-01

    The Pearson-type VII distributions (containing the Student's t distributions) are becoming increasingly prominent and are being considered as competitors to the normal distribution. Motivated by real examples in decision sciences, Bayesian statistics, probability theory and physics, a new Pearson-type VII distribution is introduced by taking the product of two Pearson-type VII pdfs. Various structural properties of this distribution are derived, including its cdf, moments, mean deviation about the mean, mean deviation about the median, entropy, asymptotic distribution of the extreme order statistics, maximum likelihood estimates and the Fisher information matrix. Finally, an application to a Bayesian testing problem is illustrated.
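    The construction described, taking the product of two Pearson-type VII pdfs and renormalizing, can be sketched numerically. The parameterization below (shape m, scale a, location lam) is one common form of the Pearson VII density; the paper's exact parameterization may differ:

```python
import math
from scipy import integrate

def pearson7(x, m, a=1.0, lam=0.0):
    # Normalized Pearson type VII density with shape m > 0.5.
    c = math.gamma(m) / (a * math.sqrt(math.pi) * math.gamma(m - 0.5))
    return c * (1.0 + ((x - lam) / a) ** 2) ** (-m)

def product_pearson7(m1, m2):
    # Product of two Pearson VII pdfs, renormalized by numerical quadrature.
    z, _ = integrate.quad(lambda x: pearson7(x, m1) * pearson7(x, m2),
                          -math.inf, math.inf)
    return lambda x: pearson7(x, m1) * pearson7(x, m2) / z
```

    For m1 = m2 = 1 each factor is a Cauchy density, the simplest Pearson VII case.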

  3. The Effect of Explicit Instruction for Story Grammar Code Strategy on Third Graders' Reading Comprehension

    ERIC Educational Resources Information Center

    De Nigris, Rosemarie Previti

    2017-01-01

    The hypothesis of the study was that explicit gradual release of responsibility (GRR) comprehension instruction (Pearson & Gallagher, 1983; Fisher & Frey, 2008) with the researcher-created Story Grammar Code (SGC) strategy would significantly increase third graders' comprehension of narrative fiction and nonfiction text. SGC comprehension…

  4. Testing the significance of a correlation with nonnormal data: comparison of Pearson, Spearman, transformation, and resampling approaches.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2012-09-01

    It is well known that when data are nonnormally distributed, a test of the significance of Pearson's r may inflate Type I error rates and reduce power. Statistics textbooks and the simulation literature provide several alternatives to Pearson's correlation. However, the relative performance of these alternatives has been unclear. Two simulation studies were conducted to compare 12 methods, including Pearson, Spearman's rank-order, transformation, and resampling approaches. With most sample sizes (n ≥ 20), Type I and Type II error rates were minimized by transforming the data to a normal shape prior to assessing the Pearson correlation. Among transformation approaches, a general purpose rank-based inverse normal transformation (i.e., transformation to rankit scores) was most beneficial. However, when samples were both small (n ≤ 10) and extremely nonnormal, the permutation test often outperformed other alternatives, including various bootstrap tests.
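    The rank-based inverse normal (rankit) transformation the study found most beneficial can be sketched as follows; the data and sample size here are illustrative, not those of the simulations:

```python
import numpy as np
from scipy import stats

def rankit(x):
    # Rank-based inverse normal transformation: map ranks r to the
    # standard-normal quantiles of (r - 0.5) / n.
    ranks = stats.rankdata(x)
    return stats.norm.ppf((ranks - 0.5) / len(x))

rng = np.random.default_rng(0)
x = rng.lognormal(size=50)               # skewed, nonnormal sample
y = x + rng.lognormal(size=50)           # skewed, positively related sample
r_raw = stats.pearsonr(x, y)[0]          # Pearson r on the raw data
r_rin = stats.pearsonr(rankit(x), rankit(y))[0]  # Pearson r after RIN
```

    The transformed scores are, by construction, symmetric normal quantiles, so the Pearson test's normality assumption is much closer to being met.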

  5. cis-trans Germanium chains in the intermetallic compounds ALi1-xInxGe2 and A2(Li1-xInx)2Ge3 (A=Sr, Ba, Eu) - experimental and theoretical studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Tae-Soo; Bobev, Svilen, E-mail: bobev@udel.ed

    Two types of strontium-, barium- and europium-containing germanides have been synthesized using high-temperature reactions and characterized by single-crystal X-ray diffraction. All reported compounds also contain mixed-occupied Li and In atoms, resulting in quaternary phases with narrow homogeneity ranges. The first type comprises EuLi0.91(1)In0.09Ge2, SrLi0.95(1)In0.05Ge2 and BaLi0.99(1)In0.01Ge2, which crystallize in the orthorhombic space group Pnma (BaLi0.9Mg0.1Si2 structure type, Pearson code oP16). The lattice parameters are a=7.129(4)-7.405(4) Å; b=4.426(3)-4.638(2) Å; and c=11.462(7)-11.872(6) Å. The second type includes Eu2Li1.36(1)In0.64Ge3 and Sr2Li1.45(1)In0.55Ge3, which adopt the orthorhombic space group Cmcm (Ce2Li2Ge3 structure type, Pearson code oC28) with lattice parameters a=4.534(2)-4.618(2) Å; b=19.347(8)-19.685(9) Å; and c=7.164(3)-7.260(3) Å. The polyanionic sub-structures in both cases feature one-dimensional Ge chains with alternating Ge-Ge bonds in cis- and trans-conformation. Theoretical studies using the tight-binding linear muffin-tin orbital (LMTO) method provide the rationale for optimizing the overall bonding by diminishing the π-p delocalization along the Ge chains, accounting for the experimentally confirmed substitution of Li for In. Graphical abstract: Presented are the single-crystal structures of two types of closely related intermetallics, as well as their band structures, calculated using the tight-binding linear muffin-tin orbital (TB-LMTO-ASA) method.

  6. A Tunisian patient with Pearson syndrome harboring the 4.977kb common deletion associated to two novel large-scale mitochondrial deletions.

    PubMed

    Ayed, Imen Ben; Chamkha, Imen; Mkaouar-Rebai, Emna; Kammoun, Thouraya; Mezghani, Najla; Chabchoub, Imen; Aloulou, Hajer; Hachicha, Mongia; Fakhfakh, Faiza

    2011-07-29

    Pearson syndrome (PS) is a multisystem disease including refractory anemia, vacuolization of marrow precursors and pancreatic fibrosis. The disease starts during infancy and affects various tissues and organs, and most affected children die before the age of 3 years. Pearson syndrome is caused by de novo large-scale deletions or, more rarely, duplications in the mitochondrial genome. In the present report, we describe a Pearson syndrome patient harboring multiple mitochondrial deletions, which is, to our knowledge, the first case described and studied in Tunisia. In fact, we report the common 4.977 kb deletion and two novel heteroplasmic deletions (5.030 and 5.234 kb) of the mtDNA. These deletions affect several protein-coding and tRNA genes and could lead to defects in mitochondrial polypeptide synthesis and impair oxidative phosphorylation and energy metabolism in the respiratory chain of the studied patient.

  7. Clopper-Pearson bounds from HEP data cuts

    NASA Astrophysics Data System (ADS)

    Berg, B. A.

    2001-08-01

    For the measurement of N_s signals in N events, rigorous confidence bounds on the true signal probability p_exact were established in a classical paper by Clopper and Pearson [Biometrika 26, 404 (1934)]. Here, their bounds are generalized to the HEP situation where cuts on the data tag signals with probability P_s and background data with likelihood P_b
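    The Clopper-Pearson bounds referenced here are the exact binomial confidence limits, conventionally computed from beta-distribution quantiles. A minimal sketch (without the HEP cut generalization the paper develops):

```python
from scipy import stats

def clopper_pearson(k, n, conf=0.95):
    # Exact (Clopper-Pearson) confidence bounds on a binomial proportion,
    # for k successes in n trials, via beta-distribution quantiles.
    alpha = 1.0 - conf
    lower = 0.0 if k == 0 else stats.beta.ppf(alpha / 2, k, n - k + 1)
    upper = 1.0 if k == n else stats.beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lower, upper
```

    Recent SciPy versions expose the same interval as `stats.binomtest(k, n).proportion_ci(method="exact")`.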

  8. The crystal structure of the new ternary antimonide Dy3Cu20+xSb11-x (x≈2)

    NASA Astrophysics Data System (ADS)

    Fedyna, L. O.; Bodak, O. I.; Fedorchuk, A. O.; Tokaychuk, Ya. O.

    2005-06-01

    The new ternary antimonide Dy3Cu20+xSb11-x (x≈2) was synthesized and its crystal structure was determined by direct methods from X-ray powder diffraction data (DRON-3M diffractometer, Cu Kα radiation, R=6.99%, R=12.27%, R=11.55%). The compound crystallizes with its own cubic structure type: space group F-43m, Pearson code cF272, a=16.6150(2) Å, Z=8. The structure of Dy3Cu20+xSb11-x (x≈2) can be derived from the BaHg11 structure type by doubling the lattice parameter and removing 16 atoms. The studied structure was compared with the structures of known compounds that crystallize in the same space group with similar cell parameters.

  9. Spatial trends in Pearson Type III statistical parameters

    USGS Publications Warehouse

    Lichty, R.W.; Karlinger, M.R.

    1995-01-01

    Spatial trends in the statistical parameters (mean, standard deviation, and skewness coefficient) of a Pearson Type III distribution of the logarithms of annual flood peaks for small rural basins (less than 90 km²) are delineated using a climate factor C_T (T = 2-, 25-, and 100-yr recurrence intervals), which quantifies the effects of long-term climatic data (rainfall and pan evaporation) on observed T-yr floods. Maps showing trends in average parameter values demonstrate the geographically varying influence of climate on the magnitude of Pearson Type III statistical parameters. The spatial trends in variability of the parameter values characterize the sensitivity of statistical parameters to the interaction of basin-runoff characteristics (hydrology) and climate. -from Authors

  10. Correlation of Hip Fracture with Other Fracture Types: Toward a Rational Composite Hip Fracture Endpoint

    PubMed Central

    Colón-Emeric, Cathleen; Pieper, Carl F.; Grubber, Janet; Van Scoyoc, Lynn; Schnell, Merritt L; Van Houtven, Courtney Harold; Pearson, Megan; Lafleur, Joanne; Lyles, Kenneth W.; Adler, Robert A.

    2016-01-01

    Purpose: With ethical requirements leading to the enrollment of lower-risk subjects, osteoporosis trials are underpowered to detect reductions in hip fractures. Different skeletal sites have different levels of fracture risk and response to treatment. We sought to identify fracture sites that cluster with hip fracture at higher than expected frequency; if these sites respond to treatment similarly, then a composite fracture endpoint could provide a better estimate of hip fracture reduction. Methods: Cohort study using Veterans Affairs and Medicare administrative data. Male Veterans (n=5,036,536) aged 50-99 years receiving VA primary care between 1999 and 2009 were included. Fractures were ascertained using ICD9 and CPT codes and classified by skeletal site. Pearson correlation coefficients, logistic regression and kappa statistics were used to describe the correlation between each fracture type and hip fracture within individuals, without regard to the timing of the events. Results: 595,579 (11.8%) men suffered 1 or more fractures and 179,597 (3.6%) suffered 2 or more fractures during the time under study. Of those with one or more fractures, rib was the most common site (29%), followed by spine (22%), hip (21%) and femur (20%). The fracture types most highly correlated with hip fracture were pelvic/acetabular (Pearson correlation coefficient 0.25, p<0.0001), femur (0.15, p<0.0001), and shoulder (0.11, p<0.0001). Conclusions: Pelvic, acetabular, femur, and shoulder fractures cluster with hip fractures within individuals at greater than expected frequency. If we observe similar treatment risk reductions within that cluster, subsequent trials could consider use of a composite endpoint to better estimate hip fracture risk. PMID:26151123
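    For binary fracture indicators, the Pearson correlation coefficient used in this study reduces to the phi coefficient. A toy sketch with hypothetical prevalence figures (not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 10_000
hip = rng.random(n) < 0.03                      # hip-fracture indicator
# Hypothetical rates: pelvic fracture far more likely given a hip fracture.
pelvic = rng.random(n) < np.where(hip, 0.20, 0.02)

# Pearson r between two 0/1 indicators equals the phi coefficient.
r, p = stats.pearsonr(hip.astype(float), pelvic.astype(float))
```

    A positive r here reflects within-person clustering of the two fracture types, exactly the quantity the study tabulates.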

  11. Smaller Satellite Operations Near Geostationary Orbit

    DTIC Science & Technology

    2007-09-01

    At the time, this was considered a very difficult task, due to the complexity involved with creating computer code to autonomously perform... computer systems and even permanently damage equipment. Depending on the solar cycle, solar weather will be properly characterized and modeled to... Wayne Tomasi. Electronic Communications Systems. Upper Saddle River: Pearson Education, 2004.

  12. Method of estimating flood-frequency parameters for streams in Idaho

    USGS Publications Warehouse

    Kjelstrom, L.C.; Moffatt, R.L.

    1981-01-01

    Skew coefficients for the log-Pearson type III distribution are generalized on the basis of some similarity of floods in the Snake River basin and other parts of Idaho. Generalized skew coefficients aid in shaping flood-frequency curves because skew coefficients computed from gaging stations having relatively short periods of peak-flow record can be unreliable. Generalized skew coefficients can be obtained for a gaging station from one of three maps in this report. The map to be used depends on whether (1) snowmelt floods are dominant (generally when more than 20 percent of the drainage area is above 6,000 feet altitude), (2) rainstorm floods are dominant (generally when the mean altitude is less than 3,000 feet), or (3) either snowmelt or rainstorm floods can be the annual maximum discharge. For the latter case, frequency curves constructed using separate arrays of each type of runoff can be combined into one curve, which, for some stations, is significantly different from the frequency curve constructed using only annual maximum discharges. For 269 gaging stations, flood-frequency curves that include the generalized skew coefficients in the computation of the log-Pearson type III equation tend to fit the data better than previous analyses. Frequency curves for ungaged sites can be derived by estimating three statistics of the log-Pearson type III distribution. The mean and standard deviation of logarithms of annual maximum discharges are estimated by regression equations that use basin characteristics as independent variables. Skew coefficient estimates are the generalized skews. The log-Pearson type III equation is then applied with the three estimated statistics to compute the discharge at selected exceedance probabilities. Standard errors at the 2-percent exceedance probability range from 41 to 90 percent. (USGS)
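    The final estimation step, applying the log-Pearson type III equation with the three estimated statistics, can be sketched with SciPy's pearson3 distribution; the site statistics below are hypothetical:

```python
from scipy import stats

def lp3_discharge(mean_log, std_log, skew, exceed_prob):
    # Quantile of the log-Pearson type III distribution at a given annual
    # exceedance probability, converted back from log10 space to discharge.
    q_log = stats.pearson3.ppf(1.0 - exceed_prob, skew,
                               loc=mean_log, scale=std_log)
    return 10.0 ** q_log

# Hypothetical statistics for an ungaged site: mean and standard deviation
# from regression equations, skew from a generalized-skew map.
q2pct = lp3_discharge(mean_log=2.8, std_log=0.25, skew=0.3, exceed_prob=0.02)
```

    The 2-percent exceedance probability corresponds to the 50-year flood; smaller exceedance probabilities give larger design discharges.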

  13. Using type IV Pearson distribution to calculate the probabilities of underrun and overrun of lists of multiple cases.

    PubMed

    Wang, Jihan; Yang, Kai

    2014-07-01

    An efficient operating room needs little underutilised and little overutilised time to achieve optimal cost efficiency. The probabilities of underrun and overrun of lists of cases can be estimated from a well defined duration distribution of the lists. The aim was to propose a method of predicting the probabilities of underrun and overrun of lists of cases using the Type IV Pearson distribution to support case scheduling. Six years of data were collected. The first 5 years of data were used to fit distributions and estimate parameters; the data from the last year were used as testing data to validate the proposed methods. The percentiles of the duration distribution of lists of cases were calculated by the Type IV Pearson distribution and the t-distribution. Monte Carlo simulation was conducted to verify the accuracy of percentiles defined by the proposed methods. The setting was operating rooms in the John D. Dingell VA Medical Center, United States, from January 2005 to December 2011. The outcome measure was the difference between the proportion of lists of cases completed within the percentiles of the proposed duration distribution and the corresponding percentiles. Compared with the t-distribution, the proposed distribution is 8.31% (0.38) more accurate on average and 14.16% (0.19) more accurate in calculating the probabilities at the 10th and 90th percentiles of the distribution, which are a major concern of operating room schedulers. The absolute deviations between the percentiles defined by the Type IV Pearson distribution and those from Monte Carlo simulation varied from 0.20 min (0.01) to 0.43 min (0.03). Values are mean (SEM). Operating room schedulers can rely on the most recent 10 cases with the same combination of surgeon and procedure(s) for distribution parameter estimation to plan lists of cases. The proposed Type IV Pearson distribution is more accurate than the t-distribution for estimating the probabilities of underrun and overrun of lists of cases. However, as not all individual case durations followed log-normal distributions, there was some deviation from the true duration distribution of the lists.
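    SciPy has no built-in Pearson Type IV distribution, but the Monte Carlo verification step described above can be sketched: simulate list durations as sums of log-normal case durations (the parameters here are assumed, not the study's) and read off the 10th and 90th percentiles:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical list of 4 cases whose durations (minutes) are log-normal,
# median about 60 min each.
mu, sigma, n_cases = np.log(60.0), 0.4, 4
lists = rng.lognormal(mu, sigma, size=(100_000, n_cases)).sum(axis=1)

# Underrun / overrun thresholds at the percentiles the schedulers care about.
p10, p90 = np.percentile(lists, [10, 90])
```

    Percentiles from a fitted parametric distribution (Pearson IV or t) can then be compared against these simulated values, as the study does.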

  14. Disability in physical education textbooks: an analysis of image content.

    PubMed

    Táboas-Pais, María Inés; Rey-Cao, Ana

    2012-10-01

    The aim of this paper is to show how images of disability are portrayed in physical education textbooks for secondary schools in Spain. The sample was composed of 3,316 images published in 36 textbooks by 10 publishing houses. A content analysis was carried out using a coding scheme based on categories employed in other similar studies and adapted to the requirements of this study with additional categories. The variables were camera angle, gender, type of physical activity, field of practice, space, and level. Univariate and bivariate descriptive analyses were also carried out. The Pearson chi-square statistic was used to identify associations between the variables. Results showed a noticeable imbalance between people with disabilities and people without disabilities, and women with disabilities were less frequently represented than men with disabilities. People with disabilities were depicted as participating in a very limited variety of segregated, competitive, and elite sports activities.
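    The Pearson chi-square test of association used in the study can be sketched on a hypothetical 2x2 table of image counts (the figures below are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical counts: rows = images with / without depicted disability,
# columns = men / women depicted.
table = np.array([[12, 5],
                  [1650, 1649]])

# Pearson chi-square test of independence between the two variables.
chi2, p, dof, expected = stats.chi2_contingency(table)
```

    A small p-value indicates an association between the two coded variables, the same criterion applied across the study's categories.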

  15. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
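    Most of the distributions tabulated by these Fortran routines have direct SciPy equivalents today; for example, the Pearson Type III quantile/probability pair (the mapping onto scipy.stats.pearson3 is our own, not the report's):

```python
from scipy import stats

# Pearson Type III in standard form, parameterized by its skew coefficient.
skew = 0.5
x95 = stats.pearson3.ppf(0.95, skew)   # 95th-percentile quantile
back = stats.pearson3.cdf(x95, skew)   # round-trip through the cdf
```

    The same module covers the beta, chi-square, gamma, normal and Weibull distributions listed in the report.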

  16. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  17. Nd3Ge1.18In0.82 and Sm3Ge1.33In0.67 - New ternary indides with La3GeIn type structure

    NASA Astrophysics Data System (ADS)

    Kravets, Oksana; Nychyporuk, Galyna; Muts, Ihor; Hlukhyy, Viktor; Pöttgen, Rainer; Zaremba, Vasyl'

    2014-06-01

    New indides, Nd3Ge1.18In0.82 and Sm3Ge1.33In0.67, were synthesized from the elements by arc-melting and subsequent annealing at 870 K. Single crystals were grown through special annealing procedures in sealed tantalum tubes in a resistance furnace. Both compounds were investigated on the basis of X-ray powder and single-crystal data: La3GeIn type structure, Pearson code tI80, space group I4/mcm; a = 1200.1(1), c = 1562.8(1) pm, wR2 = 0.0781, 716 F² values, 34 variables for Nd3Ge1.18In0.82 and a = 1184.7(2), c = 1537.0(3) pm, wR2 = 0.0305, 911 F² values, 34 variables for Sm3Ge1.33In0.67. The crystal chemistry of Nd3Ge1.18In0.82 is discussed from a geometrical point of view and in terms of LMTO band structure calculations.

  18. A shift from significance test to hypothesis test through power analysis in medical research.

    PubMed

    Singh, G

    2006-01-01

    Medical research literature, until recently, exhibited substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test approach, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and conclude in their own ways. The advancement of computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
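    A power calculation of the kind described, under a normal approximation for a two-sided two-sample comparison, can be sketched as follows (effect size, standard deviation and group size are illustrative):

```python
from scipy import stats

def power_two_sample(delta, sigma, n, alpha=0.05):
    # Approximate power of a two-sided two-sample z test, n per group,
    # for a true mean difference delta and common SD sigma.
    z_crit = stats.norm.ppf(1 - alpha / 2)
    ncp = abs(delta) / (sigma * (2.0 / n) ** 0.5)   # noncentrality
    return stats.norm.cdf(ncp - z_crit) + stats.norm.cdf(-ncp - z_crit)

power = power_two_sample(delta=0.5, sigma=1.0, n=64)  # mid-sized effect
```

    Reporting such a power figure alongside the P value is precisely the blend of the Fisher and Neyman-Pearson approaches the abstract describes.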

  19. SPSS and SAS programs for comparing Pearson correlations and OLS regression coefficients.

    PubMed

    Weaver, Bruce; Wuensch, Karl L

    2013-09-01

    Several procedures that use summary data to test hypotheses about Pearson correlations and ordinary least squares regression coefficients have been described in various books and articles. To our knowledge, however, no single resource describes all of the most common tests. Furthermore, many of these tests have not yet been implemented in popular statistical software packages such as SPSS and SAS. In this article, we describe all of the most common tests and provide SPSS and SAS programs to perform them. When they are applicable, our code also computes 100 × (1 - α)% confidence intervals corresponding to the tests. For testing hypotheses about independent regression coefficients, we demonstrate one method that uses summary data and another that uses raw data (i.e., Potthoff analysis). When the raw data are available, the latter method is preferred, because use of summary data entails some loss of precision due to rounding.
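    One of the common tests in this family, comparing two Pearson correlations from independent samples via Fisher's r-to-z transformation, can be sketched as follows (this is a generic textbook version, not the authors' SPSS/SAS code):

```python
import math
from scipy import stats

def compare_independent_r(r1, n1, r2, n2):
    # Fisher r-to-z test of H0: rho1 == rho2 for independent samples.
    z1, z2 = math.atanh(r1), math.atanh(r2)          # r-to-z transform
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of z1 - z2
    z = (z1 - z2) / se
    return z, 2.0 * stats.norm.sf(abs(z))            # two-sided p-value
```

    The summary data needed are just the two correlations and their sample sizes, matching the summary-data workflow the article emphasizes.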

  20. Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells

    DTIC Science & Technology

    2015-01-15

    serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and... services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool...answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves

  1. Time-scale effects on the gain-loss asymmetry in stock indices

    NASA Astrophysics Data System (ADS)

    Sándor, Bulcsú; Simonsen, Ingve; Nagy, Bálint Zsolt; Néda, Zoltán

    2016-08-01

    The gain-loss asymmetry observed in the inverse statistics of stock indices is present for logarithmic return levels over 2%, and it is the result of the non-Pearson-type autocorrelations in the index. These non-Pearson-type correlations can also be viewed as functionally dependent daily volatilities extending over a finite time interval. A generalized time-window shuffling method is used to show the existence of such autocorrelations. Their characteristic time scale proves to be smaller (less than 25 trading days) than was previously believed. It is also found that this characteristic time scale has decreased with the appearance of program trading in stock market transactions. Connections with the leverage effect are also established.
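    The distinction drawn here, absent Pearson-type (linear) autocorrelation but persistent volatility, and its removal by shuffling, can be illustrated with a toy return series whose volatility switches between blocks (all parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
# Toy index returns: independent signs, but volatility alternates between
# high and low 100-day blocks (a stand-in for volatility clustering).
vol = np.where((np.arange(n) // 100) % 2 == 0, 0.02, 0.005)
r = vol * rng.standard_normal(n)

def acf1(x):
    # Lag-1 Pearson autocorrelation.
    x = x - x.mean()
    return float(x[:-1] @ x[1:] / (x @ x))

lin = acf1(r)                                  # ~0: no linear autocorrelation
nonlin = acf1(np.abs(r))                       # > 0: volatility clustering
destroyed = acf1(rng.permutation(np.abs(r)))   # shuffling removes it
```

    Shuffling the series destroys the nonlinear dependence, which is the logic behind the time-window shuffling test in the paper.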

  2. Classification of probability densities on the basis of Pearson's curves with application to coronal heating simulations

    NASA Astrophysics Data System (ADS)

    Podladchikova, O.; Lefebvre, B.; Krasnoselskikh, V.; Podladchikov, V.

    An important task for the problem of coronal heating is to produce reliable evaluations of the statistical properties of energy release and eruptive events such as micro- and nanoflares in the solar corona. Different types of distributions for the peak flux, peak count rate, pixel intensities, total energy flux, emission measure increases, or waiting times have appeared in the literature. This raises the question of a precise evaluation and classification of such distributions. For this purpose, we use the method proposed by K. Pearson at the beginning of the last century, based on the relationship between the first four moments of the distribution. Pearson's technique encompasses and classifies a broad range of distributions, including some of those which have appeared in the literature on coronal heating. This technique is successfully applied to simulated data from the model of Krasnoselskikh et al. (2002). It provides successful fits to the empirical distributions of the dissipated energy, and classifies them as a function of model parameters such as dissipation mechanism and threshold.
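    Pearson's classification uses the first four moments through β1 (squared skewness) and β2 (kurtosis); one standard summary is the criterion κ, whose sign and magnitude select the main types. A sketch (the κ formula is the classical one; the sample is illustrative):

```python
import numpy as np
from scipy import stats

def pearson_kappa(x):
    # Pearson's classification criterion from the first four sample moments:
    # beta1 = squared skewness, beta2 = (non-excess) kurtosis.
    b1 = stats.skew(x) ** 2
    b2 = stats.kurtosis(x, fisher=False)
    return (b1 * (b2 + 3.0) ** 2) / (
        4.0 * (4.0 * b2 - 3.0 * b1) * (2.0 * b2 - 3.0 * b1 - 6.0))

# kappa < 0 -> Type I (beta-like); 0 < kappa < 1 -> Type IV; kappa > 1 -> Type VI.
rng = np.random.default_rng(5)
k_beta = pearson_kappa(rng.beta(2.0, 5.0, size=200_000))
```

    A bounded, skewed sample such as the beta draw above lands in the Type I region (κ < 0), matching Pearson's chart of the (β1, β2) plane.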

  3. Rare-earth metal gallium silicides via the gallium self-flux method. Synthesis, crystal structures, and magnetic properties of RE(Ga1–xSix)₂ (RE=Y, La–Nd, Sm, Gd–Yb, Lu)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darone, Gregory M.; Hmiel, Benjamin; Zhang, Jiliang

    Fifteen ternary rare-earth metal gallium silicides have been synthesized using molten Ga as a flux. They have been structurally characterized by single-crystal and powder X-ray diffraction and form three different structures: the early to mid-late rare-earth metals RE=La–Nd, Sm, Gd–Ho, Yb and Y form compounds with empirical formulae RE(GaxSi1–x)₂ (0.38≤x≤0.63), which crystallize with the tetragonal α-ThSi₂ structure type (space group I4₁/amd, No. 141; Pearson symbol tI12). The compounds of the late rare-earth metals crystallize with the orthorhombic α-GdSi₂ structure type (space group Imma, No. 74; Pearson symbol oI12), with refined empirical formula REGaxSi2–x–y (RE=Ho, Er, Tm; 0.33≤x≤0.40, 0.10≤y≤0.18). LuGa₀.₃₂₍₁₎Si₁.₄₃₍₁₎ crystallizes with the orthorhombic YbMn₀.₁₇Si₁.₈₃ structure type (space group Cmcm, No. 63; Pearson symbol oC24). Structural trends are reviewed and analyzed; the magnetic susceptibilities of the grown single crystals are presented. Graphical abstract: This article details the exploration of the RE–Ga–Si ternary system with the aim to systematically investigate the structural "boundaries" between the α-ThSi₂ and α-GdSi₂-type structures, and studies of the magnetic properties of the newly synthesized single-crystalline materials. Highlights: • Light rare-earth gallium silicides crystallize in the α-ThSi₂ structure type. • Heavy rare-earth gallium silicides crystallize in the α-GdSi₂ structure type. • LuGaSi crystallizes in a defect variant of the YbMn₀.₁₇Si₁.₈₃ structure type.

  4. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran.

    PubMed

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-09-01

    The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of professional ethic codes has been developed based on the human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance regarding nursing ethic codes from nurses' and patients' perspectives. A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics (independent t-test, ANOVA and Pearson correlation coefficient) in SPSS 13. Most of the nurses were female, married and educated at the BS degree level, and 86.4% of them were aware of ethic codes; also, 91.9% of nurses and 41.8% of patients reported that nurses respect ethic codes. Nurses' and patients' perspectives on ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints of ethical performance. According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed. On the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  5. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    PubMed Central

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-01-01

    Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of professional ethic codes has been developed based on the human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance regarding nursing ethic codes from nurses' and patients' perspectives. Methods: A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics (independent t-test, ANOVA and Pearson correlation coefficient) in SPSS 13. Results: Most of the nurses were female, married and educated at the BS degree level, and 86.4% of them were aware of ethic codes; also, 91.9% of nurses and 41.8% of patients reported that nurses respect ethic codes. Nurses' and patients' perspectives on ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints of ethical performance. Conclusion: According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed. On the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives. PMID:25276730

  6. The Relationship between Teacher Efficacy and Reading Program Type in West Virginia Elementary Schools

    ERIC Educational Resources Information Center

    Harvey, Patricia Lee

    2009-01-01

    This study, based on Bandura's social cognitive theory, explored the two dimensions of teacher efficacy among reading program types (Harcourt; Houghton Mifflin; MacMillan McGraw Hill; Pearson Scott Foresman; and, Other) and selected demographic factors (school enrollment size; student ethnicity; school district of urban, rural, and suburban;…

  7. The Robustness of LISREL Estimates in Structural Equation Models with Categorical Variables.

    ERIC Educational Resources Information Center

    Ethington, Corinna A.

    This study examined the effect of type of correlation matrix on the robustness of LISREL maximum likelihood and unweighted least squares structural parameter estimates for models with categorical manifest variables. Two types of correlation matrices were analyzed; one containing Pearson product-moment correlations and one containing tetrachoric,…

  8. Evaluation of the best fit distribution for partial duration series of daily rainfall in Madinah, western Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.

    2014-09-01

    Rainfall frequency analysis is an essential tool for the design of water-related infrastructure: it can be used to predict flood magnitudes for a given frequency of extreme rainfall events. This study analyses the application of rainfall partial duration series (PDS) in the fast-growing city of Madinah in western Saudi Arabia. Several statistical distributions were applied (Normal, Log-Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, and Log-Pearson Type III), with their parameters estimated by the method of L-moments. Several model selection criteria were also applied: the Akaike Information Criterion (AIC), the Corrected Akaike Information Criterion (AICc), the Bayesian Information Criterion (BIC) and the Anderson-Darling Criterion (ADC). The analysis identified the Generalized Extreme Value as the best-fit statistical distribution for the Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
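    The selection step in studies like this can be sketched compactly. The toy example below is not the paper's procedure (it fits by the method of moments rather than L-moments, uses synthetic Gumbel-distributed "annual maxima" with made-up parameters, and compares only two candidates), but it shows the shape of an AIC-based ranking:

```python
import math
import random

def normal_loglik(x, mu, sigma):
    # log-likelihood of an i.i.d. Normal(mu, sigma) sample
    n = len(x)
    return (-n / 2) * math.log(2 * math.pi * sigma ** 2) \
        - sum((v - mu) ** 2 for v in x) / (2 * sigma ** 2)

def gumbel_loglik(x, loc, scale):
    # log-likelihood of an i.i.d. Gumbel (Extreme Value type I) sample
    z = [(v - loc) / scale for v in x]
    return -len(x) * math.log(scale) - sum(zi + math.exp(-zi) for zi in z)

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

random.seed(1)
# synthetic "annual maxima", drawn from a Gumbel distribution via the
# inverse CDF (loc = 10, scale = 3 are illustrative values, not data)
sample = [10 - 3 * math.log(-math.log(random.random())) for _ in range(500)]

mu = sum(sample) / len(sample)
sd = (sum((v - mu) ** 2 for v in sample) / (len(sample) - 1)) ** 0.5
# method-of-moments Gumbel parameters (0.5772 is Euler's constant)
scale = sd * 6 ** 0.5 / math.pi
loc = mu - 0.5772 * scale

aic_normal = aic(normal_loglik(sample, mu, sd), 2)
aic_gumbel = aic(gumbel_loglik(sample, loc, scale), 2)
best = "Gumbel" if aic_gumbel < aic_normal else "Normal"
```

With Gumbel-generated data the Gumbel fit should win the AIC comparison; AICc, BIC, and Anderson-Darling would each add their own sample-size penalty or tail weighting on top of the same log-likelihoods.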

  9. Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.

    PubMed

    Yin, Guosheng; Ma, Yanyuan

    2013-01-01

    The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any other explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE injects exactly the right amount of randomness into the test statistic and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE and then constructing the bin counts from the original data. We examine the size and power of the new model diagnostic procedure in simulation studies and illustrate it with a real data set.
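    A minimal sketch of the procedure, under loud assumptions: the model here is Poisson (chosen purely for illustration), the bins are fixed in advance, and the key step is that the expected counts use the MLE refit on a bootstrap resample while the observed counts come from the original data:

```python
import math
import random

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def pearson_stat(data, lam, bins):
    # Pearson X^2 over predefined bins, with expected counts from `lam`
    n = len(data)
    stat = 0.0
    for lo, hi in bins:
        observed = sum(lo <= v <= hi for v in data)
        expected = n * sum(poisson_pmf(k, lam) for k in range(lo, hi + 1))
        stat += (observed - expected) ** 2 / expected
    return stat

def rpois(lam):
    # Knuth's Poisson sampler
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

random.seed(0)
data = [rpois(3.0) for _ in range(300)]
bins = [(0, 1), (2, 2), (3, 3), (4, 4), (5, 30)]  # last bin absorbs the tail

mle = sum(data) / len(data)                 # MLE from the original data
boot = [random.choice(data) for _ in data]  # nonparametric bootstrap sample
boot_mle = sum(boot) / len(boot)            # MLE refit on the bootstrap sample

# bin counts from the ORIGINAL data, expectations from the BOOTSTRAP MLE:
stat_boot = pearson_stat(data, boot_mle, bins)
```

Per the paper's argument, `stat_boot` can be referred to a chi-squared reference distribution, whereas the same statistic built with `mle` has no explicit reference distribution.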

  10. Standardized Pearson type 3 density function area tables

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.; Helm, F. R.; Sugg, M.

    1971-01-01

    Tables constituting an extension of similar tables published in 1936 are presented in report form. Single- and triple-parameter gamma functions are discussed. The tables should interest persons concerned with the development and use of numerical analysis and evaluation methods.

  11. Tests of Independence in Contingency Tables with Small Samples: A Comparison of Statistical Power.

    ERIC Educational Resources Information Center

    Parshall, Cynthia G.; Kromrey, Jeffrey D.

    1996-01-01

    Power and Type I error rates were estimated for contingency tables with small sample sizes for the following four types of tests: (1) Pearson's chi-square; (2) chi-square with Yates's continuity correction; (3) the likelihood ratio test; and (4) Fisher's Exact Test. Various marginal distributions, sample sizes, and effect sizes were examined. (SLD)
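    For a 2×2 table, the first three of those statistics and a two-sided Fisher exact p-value can be written out directly from the cell counts; a self-contained sketch (the example table is hypothetical, not data from the study):

```python
import math

def expected_2x2(a, b, c, d):
    # expected counts under independence, given the margins
    n = a + b + c + d
    r1, r2, c1, c2 = a + b, c + d, a + c, b + d
    return [r1 * c1 / n, r1 * c2 / n, r2 * c1 / n, r2 * c2 / n]

def pearson_chi2(a, b, c, d, yates=False):
    # Pearson's chi-square, optionally with Yates's continuity correction
    corr = 0.5 if yates else 0.0
    return sum((abs(o - e) - corr) ** 2 / e
               for o, e in zip([a, b, c, d], expected_2x2(a, b, c, d)))

def likelihood_ratio(a, b, c, d):
    # G statistic: 2 * sum(O * ln(O/E)) over non-empty cells
    return 2 * sum(o * math.log(o / e)
                   for o, e in zip([a, b, c, d], expected_2x2(a, b, c, d))
                   if o > 0)

def fisher_exact(a, b, c, d):
    # two-sided exact p: sum hypergeometric probabilities of all tables
    # (with the same margins) no more probable than the observed one
    r1, r2, c1 = a + b, c + d, a + c
    n = r1 + r2
    def prob(x):
        return math.comb(r1, x) * math.comb(r2, c1 - x) / math.comb(n, c1)
    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-9))

table = (10, 2, 3, 9)  # hypothetical small-sample 2x2 counts
x2 = pearson_chi2(*table)
x2_yates = pearson_chi2(*table, yates=True)
g2 = likelihood_ratio(*table)
p_fisher = fisher_exact(*table)
```

As the comparison above suggests, the Yates correction shrinks the statistic (making the test more conservative), while Fisher's test conditions on the margins instead of relying on a chi-squared approximation.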

  12. Revisiting Pearson's climate and forest type studies on the Fort Valley Experimental Forest

    Treesearch

    Joseph E. Crouse; Margaret M. Moore; Peter Fule

    2008-01-01

    Five weather station sites were established in 1916 by Fort Valley personnel along an elevational gradient from the Experimental Station to near the top of the San Francisco Peaks to investigate the factors that controlled and limited forest types. The stations were located in the ponderosa pine, Douglas-fir, limber pine, Engelmann spruce, and Engelmann spruce/...

  13. Revisiting Pearson's climate and forest type studies on the Fort Valley Experimental Forest (P-53)

    Treesearch

    Joseph E. Crouse; Margaret M. Moore; Peter Z. Fule

    2008-01-01

    Five weather station sites were established in 1916 by Fort Valley personnel along an elevational gradient from the Experimental Station to near the top of the San Francisco Peaks to investigate the factors that controlled and limited forest types. The stations were located in the ponderosa pine, Douglas-fir, limber pine, Engelmann spruce, and Engelmann spruce/...

  14. MP estimation applied to platykurtic sets of geodetic observations

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Zbigniew

    2017-06-01

    MP estimation is a method for estimating location parameters when the probabilistic models of observations differ from the normal distribution in kurtosis or asymmetry. The system of Pearson's distributions is the probabilistic basis for the method. So far, the method has been applied and analyzed mostly for leptokurtic or mesokurtic distributions (Pearson's distributions of types IV or VII), which predominate in practice. Analyses of geodetic and astronomical observations show that we may also deal with sets that have moderate asymmetry or a small negative excess kurtosis. Asymmetry may result from the influence of many small systematic errors that were not eliminated during preprocessing of the data. The excess kurtosis can be related to a higher or lower frequency (relative to the Hagen hypothesis) of elementary errors that are close to zero. Considering this, the paper focuses on estimation with the Pearson platykurtic distributions of types I or II. The paper presents the solution of the corresponding optimization problem and its basic properties. Although platykurtic distributions are rare in practice, it is interesting to find out what results MP estimation can provide for such observation distributions. The numerical tests presented in the paper are rather limited; however, they allow us to draw some general conclusions.

  15. Emotional Availability Scale Among Three U.S. Race/Ethnic Groups.

    PubMed

    Derscheid, Della J; Fogg, Louis F; Julion, Wrenetha; Johnson, Mary E; Tucker, Sharon; Delaney, Kathleen R

    2018-05-01

    This study used a cross-sectional design to conduct a subgroup psychometric analysis of the Emotional Availability Scale among matched Hispanic ( n = 20), African American ( n = 20), and European American ( n = 10) English-speaking mother-child dyads in the United States. Differences by race/ethnicity were tested ( p < .05) among (a) Emotional Availability Scale dimensions with ANOVA, and (b) relationships of Emotional Availability Scale dimensions with select Dyadic Parent-Child Interaction Coding System variables with Pearson correlation and matched moderated regression. Internal consistency was .950 (Cronbach's α; N = 50). No significant differences in the six Emotional Availability Scale dimension scores by race/ethnicity emerged. Two Dyadic Parent-Child Interaction Coding System behaviors predicted two Emotional Availability Scale dimensions each for Hispanic and African American mother-child dyads. Results suggest emotional availability similarity among race/ethnic subgroups with few predictive differences of emotional availability dimensions by specific behaviors for Hispanic and African American subgroups.

  16. Flood frequency analysis using optimization techniques : final report.

    DOT National Transportation Integrated Search

    1992-10-01

    this study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...

  17. Nearly ideal binary communication in squeezed channels

    NASA Astrophysics Data System (ADS)

    Paris, Matteo G.

    2001-07-01

    We analyze the effect of squeezing the channel in binary communication based on Gaussian states. We show that for coding on pure states, squeezing increases the detection probability at a fixed size of the strategy, actually saturating the optimal bound already at moderate signal energy. Using the Neyman-Pearson lemma for fuzzy hypothesis testing, we can also analyze the case of mixed states and find the optimal amount of squeezing that can be effectively employed. The result is that optimally squeezed channels are robust against signal mixing and largely improve the strategy power in comparison with coherent ones.
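    The Neyman-Pearson construction invoked here is easy to state in its classical (non-quantum, non-fuzzy) form: at a fixed test size, thresholding the likelihood ratio maximizes the detection probability. A toy sketch for two unit-variance Gaussian hypotheses (purely illustrative; it does not model the squeezed channel itself):

```python
import statistics

def np_gaussian_test(alpha, mu):
    """Neyman-Pearson test of H0: N(0,1) vs H1: N(mu,1), mu > 0.
    The likelihood ratio is monotone in x, so the size-alpha test
    rejects when x exceeds the (1 - alpha) quantile under H0."""
    nd = statistics.NormalDist()
    threshold = nd.inv_cdf(1 - alpha)
    detection = 1 - nd.cdf(threshold - mu)  # power under H1
    return threshold, detection

t1, p1 = np_gaussian_test(0.05, 1.0)
t2, p2 = np_gaussian_test(0.05, 2.0)  # stronger signal, higher power at same size
```

The threshold depends only on the size, while the detection probability grows with the separation of the hypotheses, mirroring how squeezing enlarges the effective signal separation in the paper's setting.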

  18. Peak-flow frequency estimates through 1994 for gaged streams in South Dakota

    USGS Publications Warehouse

    Burr, M.J.; Korkow, K.L.

    1996-01-01

    Annual peak-flow data are listed for 250 continuous-record and crest-stage gaging stations in South Dakota. Peak-flow frequency estimates for selected recurrence intervals ranging from 2 to 500 years are given for 234 of these 250 stations. The log-Pearson Type III procedure was used to compute the frequency relations for the 234 stations, which in 1994 included 105 active and 129 inactive stations. The log-Pearson Type III procedure is recommended by the Hydrology Subcommittee of the Interagency Advisory Committee on Water Data (1982), "Guidelines for Determining Flood Flow Frequency." No peak-flow frequency estimates are given for 16 of the 250 stations because: (1) the data set showed extreme variability; (2) more than 20 percent of years had no flow; (3) annual peak flows represented large outflow from a spring; (4) the peak-flow record subsequent to reservoir regulation was insufficient; or (5) peak-flow records were combined with records from nearby stations.
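    The core of the log-Pearson Type III procedure can be sketched from the moments of the log-transformed peaks together with the Wilson-Hilferty frequency-factor approximation (an approximation to the K-values tabulated in the cited guidelines; the synthetic peaks below are illustrative, not South Dakota data):

```python
import math
import random
import statistics

def lp3_quantile(peaks, return_period):
    """T-year peak-flow estimate from a log-Pearson Type III fit:
    method of moments on log10-transformed peaks, with the
    Wilson-Hilferty approximation for the frequency factor K."""
    logs = [math.log10(q) for q in peaks]
    n = len(logs)
    mean = sum(logs) / n
    sd = statistics.stdev(logs)
    # bias-corrected sample skew of the logs
    skew = (n / ((n - 1) * (n - 2))) * sum(((x - mean) / sd) ** 3 for x in logs)
    z = statistics.NormalDist().inv_cdf(1 - 1 / return_period)
    if abs(skew) < 1e-9:
        k = z
    else:
        k = (2 / skew) * ((1 + skew * z / 6 - skew ** 2 / 36) ** 3 - 1)
    return 10 ** (mean + k * sd)

random.seed(4)
# synthetic annual peak flows, lognormal for illustration
peaks = [10 ** random.gauss(2.0, 0.3) for _ in range(60)]
q10 = lp3_quantile(peaks, 10)    # 10-year (recurrence interval) estimate
q100 = lp3_quantile(peaks, 100)  # 100-year estimate
```

The full guidelines add refinements this sketch omits, such as regional skew weighting and outlier tests.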

  19. A modified weighted function method for parameter estimation of Pearson type three distribution

    NASA Astrophysics Data System (ADS)

    Liang, Zhongmin; Hu, Yiming; Li, Binquan; Yu, Zhongbo

    2014-04-01

    In this paper, an unconventional method called the Modified Weighted Function (MWF) is presented for conventional moment estimation of a probability distribution function. The aim of MWF is to shift estimation of the coefficient of variation (CV) and coefficient of skewness (CS) from higher-order moment computations to first-order moment calculations. The estimators for CV and CS of the Pearson type three distribution (PE3) were derived by weighting the moments of the distribution with two weight functions constructed by combining two negative exponential-type functions. The selection of these weight functions was based on two considerations: (1) to relate the weight functions to sample size, reflecting the relationship between the quantity of sample information and the role of the weight function, and (2) to allocate more weight to data close to medium-tail positions in a sample series ranked in ascending order. A Monte-Carlo experiment was conducted to simulate a large number of samples from which the statistical properties of MWF were investigated. For the PE3 parent distribution, results of MWF were compared to those of the original Weighted Function (WF) and linear moments (L-M). The results indicate that MWF was superior to WF and slightly better than L-M in terms of statistical unbiasedness and effectiveness. In addition, the robustness of MWF, WF, and L-M was compared in a Monte-Carlo experiment in which samples were drawn from the Log-Pearson type three distribution (LPE3), the three-parameter Log-Normal distribution (LN3), and the Generalized Extreme Value distribution (GEV), respectively, but all were treated as samples from the PE3 distribution. The results show that, in terms of statistical unbiasedness, no method possesses an overwhelming advantage among MWF, WF, and L-M, while in terms of statistical effectiveness, MWF is superior to WF and L-M.
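    For context, the conventional moment parameterization of PE3 that both WF and MWF target can be written down in a few lines (a sketch of the textbook gamma-family relations only, not of the MWF weighting itself):

```python
def pe3_params(mean, cv, cs):
    """Shape/scale/location of a Pearson type III (shifted gamma)
    distribution from its mean, coefficient of variation, and skew."""
    sd = cv * mean
    shape = 4 / cs ** 2            # since the skew of a gamma is 2 / sqrt(shape)
    scale = sd * cs / 2            # signed: a negative skew flips the tail
    location = mean - shape * scale
    return shape, scale, location

def pe3_moments(shape, scale, location):
    """Inverse map, useful as a round-trip check."""
    mean = location + shape * scale
    sd = abs(scale) * shape ** 0.5
    cs = (2 / shape ** 0.5) * (1 if scale > 0 else -1)
    return mean, sd / mean, cs

# e.g. mean 9, CV 4/9, skew 1  ->  shape 4, scale 2, location 1
params = pe3_params(9.0, 4 / 9, 1.0)
```

Any estimation scheme (WF, MWF, or L-moments) ultimately feeds estimates of mean, CV, and CS into relations of this form.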

  20. Asymptotic confidence intervals for the Pearson correlation via skewness and kurtosis.

    PubMed

    Bishara, Anthony J; Li, Jiexiang; Nash, Thomas

    2018-02-01

    When bivariate normality is violated, the default confidence interval of the Pearson correlation can be inaccurate. Two new methods were developed based on the asymptotic sampling distribution of Fisher's z' under the general case where bivariate normality need not be assumed. In Monte Carlo simulations, the most successful of these methods relied on the (Vale & Maurelli, 1983, Psychometrika, 48, 465) family to approximate a distribution via the marginal skewness and kurtosis of the sample data. In Simulation 1, this method provided more accurate confidence intervals of the correlation in non-normal data, at least as compared to no adjustment of the Fisher z' interval, or to adjustment via the sample joint moments. In Simulation 2, this approximate distribution method performed favourably relative to common non-parametric bootstrap methods, but its performance was mixed relative to an observed imposed bootstrap and two other robust methods (PM1 and HC4). No method was completely satisfactory. An advantage of the approximate distribution method, though, is that it can be implemented even without access to raw data if sample skewness and kurtosis are reported, making the method particularly useful for meta-analysis. Supporting information includes R code. © 2017 The British Psychological Society.
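    For reference, the default interval the paper starts from is the Fisher z' transform with standard error 1/sqrt(n - 3); a sketch of that unadjusted interval only (the paper's contribution, replacing this variance with one driven by sample skewness and kurtosis, is not reproduced here):

```python
import math
import random
import statistics

def pearson_r(x, y):
    # sample Pearson product-moment correlation
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def fisher_ci(r, n, conf=0.95):
    # default CI: z' = atanh(r), SE = 1/sqrt(n - 3), back-transform with tanh
    z = math.atanh(r)
    se = 1 / math.sqrt(n - 3)
    crit = statistics.NormalDist().inv_cdf(1 - (1 - conf) / 2)
    return math.tanh(z - crit * se), math.tanh(z + crit * se)

random.seed(7)
x = [random.gauss(0, 1) for _ in range(100)]
y = [0.5 * xi + random.gauss(0, 1) for xi in x]  # correlated synthetic data
r = pearson_r(x, y)
lo, hi = fisher_ci(r, len(x))
```

Under non-normality the coverage of this default interval can be inaccurate, which is exactly the problem the adjusted variants address.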

  1. Epidemiology of injuries in Belgium: contribution of hospital data for surveillance.

    PubMed

    Senterre, Christelle; Levêque, Alain; Di Pierdomenico, Lionel; Dramaix-Wilmet, Michèle; Pirson, Magali

    2014-01-01

    This study investigated injuries in terms of occurrence and of patient and hospital-stay characteristics. In total, 17,370 stays with at least one E code were investigated, based on data from 13 Belgian hospitals. Pearson's chi-square and Kruskal-Wallis tests were used to assess variations between the distributions of the investigated factors according to injury type. The major injuries were accidental falls, transport injuries, and self-inflicted injuries. There were more men in the transport injuries group, and the accidental falls group was older. For transport injuries, there were more arrivals with the support of a mobile intensive care unit and/or a paramedic intervention team, whereas a general practitioner was more often involved for accidental falls. In three-quarters of cases, the primary diagnosis was related to injury and poisoning. The median length of stay was nearly one week; for accidental falls, it was three times higher. The median cost, from the social security point of view, for all injuries was € 1377, with a higher median cost in the falls group. This study based on hospital data provides important information both on factors associated with injuries and on the hospital costs they generate.

  2. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    The primary purpose of our study was to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using validated software. Two observers interpreted the SPECT/CT findings in all patients twice, in random order, with a six-week interval between interpretations. Semi-quantitative and quantitative measurements were compared in terms of reliability, and the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions that should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliability for all femoral and acetabular regions, independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between the two measurement methods was shown for the distal femur, the proximal femur, and the acetabular cup. The factorial cluster analysis showed that the anatomical regions may be summarized into three distinct regions: the proximal femur, the distal femur, and the acetabular cup. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable, independent of the measurement method used. Three clinically relevant anatomical regions (proximal femoral, distal femoral, acetabular) were identified.

  3. Rate of occurrence, gross appearance, and age relation of hyperostosis frontalis interna in females: a prospective autopsy study.

    PubMed

    Nikolić, Slobodan; Djonić, Danijela; Zivković, Vladimir; Babić, Dragan; Juković, Fehim; Djurić, Marija

    2010-09-01

    The aim of our study was to determine the rate of occurrence and appearance of hyperostosis frontalis interna (HFI) in females and the correlation of this phenomenon with ageing. The sample included 248 deceased females: 45 with different types of HFI and 203 without HFI, with average ages of 68.3 ± 15.4 years (range, 19-93) and 58.2 ± 20.2 years (range, 10-101), respectively. According to our results, the rate of HFI was 18.14%. The older the woman, the higher the possibility of HFI occurring (Pearson correlation 0.211, N = 248, P = 0.001), but the type of HFI did not correlate with age (Pearson correlation 0.229, N = 45, P = 0.131). The frontal and temporal bones were significantly thicker in women with HFI than in women without it (t = -10.490, DF = 246, P = 0.000, and t = -5.658, DF = 246, P = 0.000, respectively), and these bones became thicker with ageing (Pearson correlation 0.178, N = 248, P = 0.005, and 0.303, N = 248, P = 0.000, respectively). The best predictors of HFI occurrence were, respectively, frontal bone thickness, temporal bone thickness, and age (Wald coeff. = 35.487, P = 0.000; Wald coeff. = 3.288, P = 0.070; and Wald coeff. = 2.727, P = 0.099). Diagnosis of HFI depends not only on frontal bone thickness, but also on the waviness of the internal plate of the frontal bone, as well as the involvement of the inner bone surface.

  4. A Study of the Attitudes of Married Minuteman Crewmembers and Their Wives Concerning Female Minuteman Crewmembers

    DTIC Science & Technology

    1978-12-01

    female crew. The crewmembers were about evenly split as to type of crew pairing. The author recommended using an all-female crew pairing plan when...obtained so that the respondents could be assigned to various subpopulations during the analysis. Data obtained provided information about: Type of...respondent is made. There are many types of correlations that can be calculated but the particular one employed by SPSS is Pearson's correlation. A

  5. PEBBED Uncertainty and Sensitivity Analysis of the CRP-5 PBMR DLOFC Transient Benchmark with the SUSA Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2011-01-01

    The need for a defendable and systematic uncertainty and sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This report summarizes the results of the initial investigations performed with SUSA, utilizing a typical High Temperature Reactor benchmark (the IAEA CRP-5 PBMR 400MW Exercise 2) and the PEBBED-THERMIX suite of codes. The following steps were performed as part of the uncertainty and sensitivity analysis: 1. Eight PEBBED-THERMIX model input parameters were selected for inclusion in the uncertainty study: the total reactor power, inlet gas temperature, decay heat, and the specific heat capacity and thermal conductivity of the fuel, pebble bed and reflector graphite. 2. The input parameter variations and probability density functions were specified, and a total of 800 PEBBED-THERMIX model calculations were performed, divided into 4 sets of 100 and 2 sets of 200 Steady State and Depressurized Loss of Forced Cooling (DLOFC) transient calculations each. 3. The steady state and DLOFC maximum fuel temperature, as well as the daily pebble fuel load rate data, were supplied to SUSA as model output parameters of interest. The 6 data sets were statistically analyzed to determine the 5% and 95% percentile values for each of the 3 output parameters with a 95% confidence level, and typical statistical indicators were also generated (e.g. Kendall, Pearson and Spearman coefficients). 4. A SUSA sensitivity study was performed to obtain correlation data between the input and output parameters, and to identify the primary contributors to the output data uncertainties. It was found that the uncertainties in the decay heat and in the pebble bed and reflector thermal conductivities were responsible for the bulk of the propagated uncertainty in the DLOFC maximum fuel temperature. It was also determined that the two-standard-deviation (2σ) uncertainty on the maximum fuel temperature was between ±58 °C (3.6%) and ±76 °C (4.7%) on a mean value of 1604 °C. These values depended mostly on the selection of the distribution types, and not on the number of model calculations above the required Wilks criteria (a (95%, 95%) statement would usually require 93 model runs).
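    The Wilks run count quoted at the end (93 runs for a two-sided (95%, 95%) statement) follows from order statistics alone and is easy to reproduce (a sketch of the standard first-order Wilks formulas, not of SUSA itself):

```python
def wilks_one_sided(coverage=0.95, confidence=0.95):
    # smallest n such that the sample maximum exceeds the `coverage`
    # quantile with probability >= `confidence`: 1 - coverage**n >= confidence
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

def wilks_two_sided(coverage=0.95, confidence=0.95):
    # smallest n such that [min, max] covers at least `coverage` of the
    # distribution with probability >= `confidence` (Beta(n-1, 2) tail)
    n = 2
    while 1 - n * coverage ** (n - 1) + (n - 1) * coverage ** n < confidence:
        n += 1
    return n

runs_one = wilks_one_sided()  # 59 runs for a one-sided (95%, 95%) statement
runs_two = wilks_two_sided()  # 93 runs, matching the figure quoted above
```

This is why adding more model runs beyond the Wilks minimum tightens the percentile estimates only slowly, while the choice of input distributions dominates the result.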

  6. Probability Density Functions of Observed Rainfall in Montana

    NASA Technical Reports Server (NTRS)

    Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.

    1995-01-01

    The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis on large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis easily allow development of radar reflectivity factor (and by extension rain rate) distributions. Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, a single PDF would exist for all cases, or many PDFs would share the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases would validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89 percent of the observed distributions. Further analysis indicates that the Type 1 curve does approximate the shape of the distributions but quantitatively does not produce a great fit.
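    Elderton-style selection comes down to Pearson's κ criterion, computed from the sample's squared skewness (β1) and kurtosis (β2): κ < 0 indicates Type I, 0 < κ < 1 Type IV, and κ > 1 Type VI. A sketch using a synthetic bounded, skewed sample (a beta distribution standing in for the rain-rate data):

```python
import random

def pearson_kappa(data):
    """Pearson's curve-selection criterion from sample central moments."""
    n = len(data)
    mean = sum(data) / n
    mu2 = sum((x - mean) ** 2 for x in data) / n
    mu3 = sum((x - mean) ** 3 for x in data) / n
    mu4 = sum((x - mean) ** 4 for x in data) / n
    b1 = mu3 ** 2 / mu2 ** 3   # squared skewness
    b2 = mu4 / mu2 ** 2        # kurtosis
    kappa = b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))
    return b1, b2, kappa

random.seed(2)
# bounded, positively skewed synthetic sample (illustrative stand-in)
sample = [random.betavariate(2, 5) for _ in range(2000)]
b1, b2, kappa = pearson_kappa(sample)
curve_type = "I" if kappa < 0 else ("IV" if kappa < 1 else "VI")
```

Bounded, skewed samples like this one land in the κ < 0 region, which is what makes Type I (a generalized beta) the natural candidate for rain-rate distributions.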

  7. Relative validity of a web-based food frequency questionnaire for patients with type 1 and type 2 diabetes in Denmark

    PubMed Central

    Bentzen, S M R; Knudsen, V K; Christiensen, T; Ewers, B

    2016-01-01

    Background: Diet has an important role in the management of diabetes. However, little is known about dietary intake in Danish diabetes patients. A food frequency questionnaire (FFQ) focusing on the nutrients most relevant in diabetes, including carbohydrates, dietary fibres and simple sugars, was developed and validated. Objectives: To examine the relative validity of nutrients calculated by a web-based food frequency questionnaire for patients with diabetes. Design: The FFQ was validated against a 4-day pre-coded food diary (FD). Intakes of nutrients were calculated, means of intake were compared, and cross-classifications of individuals according to intake were performed. To assess the agreement between the two methods, Pearson and Spearman's correlation coefficients and weighted kappa coefficients were calculated. Subjects: Ninety patients (64 with type 1 diabetes and 26 with type 2 diabetes) agreed to participate in the study. Twenty-six were excluded from the final study population. Setting: 64 volunteer diabetes patients at the Steno Diabetes Center. Results: Intakes of carbohydrates, simple sugars, dietary fibres and total energy were higher according to the FFQ than the FD. However, the two methods classified nutrient intakes into the same or adjacent quartiles for an average of 82% of the selected nutrients. In general, moderate agreement between the two methods was found. Conclusion: The FFQ was validated for assessment of a range of nutrients. Comparing the intakes of selected nutrients (carbohydrates, dietary fibres and simple sugars), patients were classified correctly according to low and high intakes. The FFQ is a reliable dietary assessment tool for use in research and in the evaluation of patient education for patients with diabetes. PMID:27669176

  8. Young People and Suicide—the College Scene | NIH MedlinePlus the Magazine

    MedlinePlus

    ... why NIMH has been funding research on preventing suicide, depression, and other disorders among college students." Dr. Pearson ... should be of help: MedlinePlus: medlineplus.gov (Type "suicide" in the Search ... of Michigan's Depression Center (research funded, in part, by NIMH): www. ...

  9. The human foramen magnum--normal anatomy of the cisterna magna in adults.

    PubMed

    Whitney, Nathaniel; Sun, Hai; Pollock, Jeffrey M; Ross, Donald A

    2013-11-01

    The goal of this study was to radiologically describe the anatomical characteristics of the cisterna magna (CM) with regard to presence, dimension, and configuration. In this retrospective study, 523 records were reviewed. We defined five CM types, the range of which covered all normal variants found in the study population. Characteristics of the CM were recorded and correlations between various posterior fossa dimensions and CM volume determined. There were 268 female (mean age 50.9 ± 16.9 years) and 255 male (mean age 54.1 ± 15.8 years) patients. CM volume was smaller in females than in males and correlated with age (Pearson correlation, r = 0.1494, p = 0.0006) and gender (unpaired t test, r² = 0.0608, p < 0.0001). Clivus length correlated with CM volume (Pearson correlation, r = 0.211, p < 0.0001) and gender (unpaired t test, r² = 0.2428, p < 0.0001). Tentorial angle did not correlate with CM volume (Pearson correlation, r = -0.0609, p = 0.1642) but did correlate with gender (unpaired t test, r² = 0.0163, p = 0.0035). The anterior-posterior dimension of cerebrospinal fluid anterior to the brainstem correlated with CM volume (Pearson correlation, r = 0.181, p < 0.0001) and gender (unpaired t test, r² = 0.0205, p = 0.001). The anatomical description and simple classification system we define allow for a more precise description of posterior fossa anatomy and could potentially contribute to the understanding of Chiari malformation anatomy and management.

  10. Creation of a retrospective job-exposure matrix using surrogate measures of exposure for a cohort of US career firefighters from San Francisco, Chicago and Philadelphia

    PubMed Central

    Dahm, Matthew M; Bertke, Stephen; Allee, Steve; Daniels, Robert D

    2015-01-01

    Objectives To construct a cohort-specific job-exposure matrix (JEM) using surrogate metrics of exposure for a cancer study on career firefighters from the Chicago, Philadelphia and San Francisco Fire Departments. Methods Departmental work history records, along with data on historical annual fire-runs and hours, were collected from 1950 to 2009 and coded into separate databases. These data were used to create a JEM based on standardised job titles and fire apparatus assignments using several surrogate exposure metrics to estimate firefighters’ exposure to the combustion byproducts of fire. The metrics included duration of exposure (cumulative time with a standardised exposed job title and assignment), fire-runs (cumulative events of potential fire exposure) and time at fire (cumulative hours of potential fire exposure). Results The JEM consisted of 2298 unique job titles alongside 16 174 fire apparatus assignments from the three departments, which were collapsed into 15 standardised job titles and 15 standardised job assignments. Correlations were found between fire-runs and time at fires (Pearson coefficient=0.92), duration of exposure and time at fires (Pearson coefficient=0.85), and duration of exposure and fire-runs (Pearson coefficient=0.82). Total misclassification rates were found to be between 16–30% when using duration of employment as an exposure surrogate, which has been traditionally used in most epidemiological studies, compared with using the duration of exposure surrogate metric. Conclusions The constructed JEM successfully differentiated firefighters based on gradient levels of potential exposure to the combustion byproducts of fire using multiple surrogate exposure metrics. PMID:26163543

  11. Creation of a retrospective job-exposure matrix using surrogate measures of exposure for a cohort of US career firefighters from San Francisco, Chicago and Philadelphia.

    PubMed

    Dahm, Matthew M; Bertke, Stephen; Allee, Steve; Daniels, Robert D

    2015-09-01

    To construct a cohort-specific job-exposure matrix (JEM) using surrogate metrics of exposure for a cancer study on career firefighters from the Chicago, Philadelphia and San Francisco Fire Departments. Departmental work history records, along with data on historical annual fire-runs and hours, were collected from 1950 to 2009 and coded into separate databases. These data were used to create a JEM based on standardised job titles and fire apparatus assignments using several surrogate exposure metrics to estimate firefighters' exposure to the combustion byproducts of fire. The metrics included duration of exposure (cumulative time with a standardised exposed job title and assignment), fire-runs (cumulative events of potential fire exposure) and time at fire (cumulative hours of potential fire exposure). The JEM consisted of 2298 unique job titles alongside 16,174 fire apparatus assignments from the three departments, which were collapsed into 15 standardised job titles and 15 standardised job assignments. Correlations were found between fire-runs and time at fires (Pearson coefficient=0.92), duration of exposure and time at fires (Pearson coefficient=0.85), and duration of exposure and fire-runs (Pearson coefficient=0.82). Total misclassification rates were found to be between 16-30% when using duration of employment as an exposure surrogate, which has been traditionally used in most epidemiological studies, compared with using the duration of exposure surrogate metric. The constructed JEM successfully differentiated firefighters based on gradient levels of potential exposure to the combustion byproducts of fire using multiple surrogate exposure metrics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  12. Eliciting and Receiving Online Support: Using Computer-Aided Content Analysis to Examine the Dynamics of Online Social Support

    PubMed Central

    Wang, Yi-Chia; Kraut, Robert E; Levine, John M

    2015-01-01

    Background: Although many people with serious diseases participate in online support communities, little research has investigated how participants elicit and provide social support on these sites. Objective: The first goal was to propose and test a model of the dynamic process through which participants in online support communities elicit and provide emotional and informational support. The second was to demonstrate the value of computer coding of conversational data using machine learning techniques (1) by replicating results derived from human-coded data about how people elicit support and (2) by answering questions that are intractable with small samples of human-coded data, namely how exposure to different types of social support predicts continued participation in online support communities. The third was to provide a detailed description of these machine learning techniques to enable other researchers to perform large-scale data analysis in these communities. Methods: Communication among approximately 90,000 registered users of an online cancer support community was analyzed. The corpus comprised 1,562,459 messages organized into 68,158 discussion threads. Amazon Mechanical Turk workers coded (1) 1000 thread-starting messages on 5 attributes (positive and negative emotional self-disclosure, positive and negative informational self-disclosure, questions) and (2) 1000 replies on emotional and informational support. Their judgments were used to train machine learning models that automatically estimated the amount of these 7 attributes in the messages. Across attributes, the average Pearson correlation between human-based judgments and computer-based judgments was .65. Results: Part 1 used human-coded data to investigate relationships between (1) 4 kinds of self-disclosure and question asking in thread-starting posts and (2) the amount of emotional and informational support in the first reply.
Self-disclosure about negative emotions (beta=.24, P<.001), negative events (beta=.25, P<.001), and positive events (beta=.10, P=.02) increased emotional support. However, asking questions depressed emotional support (beta=-.21, P<.001). In contrast, asking questions increased informational support (beta=.38, P<.001), whereas positive informational self-disclosure depressed it (beta=-.09, P=.003). Self-disclosure led to the perception of emotional needs, which elicited emotional support, whereas asking questions led to the perception of informational needs, which elicited informational support. Part 2 used machine-coded data to replicate these results. Part 3 analyzed the machine-coded data and showed that exposure to more emotional support predicted staying in the group 33% longer (hazard ratio=0.67, P<.001), whereas exposure to more informational support predicted leaving the group sooner (hazard ratio=1.05, P<.001). Conclusions: Self-disclosure is effective in eliciting emotional support, whereas question asking is effective in eliciting informational support. Moreover, perceptions that people desire particular kinds of support influence the support they receive. Finally, the type of support people receive affects the likelihood of their staying in or leaving the group. These results demonstrate the utility of machine learning methods for investigating the dynamics of social support exchange in online support communities. PMID:25896033

  13. Eliciting and receiving online support: using computer-aided content analysis to examine the dynamics of online social support.

    PubMed

    Wang, Yi-Chia; Kraut, Robert E; Levine, John M

    2015-04-20

    Although many people with serious diseases participate in online support communities, little research has investigated how participants elicit and provide social support on these sites. The first goal was to propose and test a model of the dynamic process through which participants in online support communities elicit and provide emotional and informational support. The second was to demonstrate the value of computer coding of conversational data using machine learning techniques (1) by replicating results derived from human-coded data about how people elicit support and (2) by answering questions that are intractable with small samples of human-coded data, namely how exposure to different types of social support predicts continued participation in online support communities. The third was to provide a detailed description of these machine learning techniques to enable other researchers to perform large-scale data analysis in these communities. Communication among approximately 90,000 registered users of an online cancer support community was analyzed. The corpus comprised 1,562,459 messages organized into 68,158 discussion threads. Amazon Mechanical Turk workers coded (1) 1000 thread-starting messages on 5 attributes (positive and negative emotional self-disclosure, positive and negative informational self-disclosure, questions) and (2) 1000 replies on emotional and informational support. Their judgments were used to train machine learning models that automatically estimated the amount of these 7 attributes in the messages. Across attributes, the average Pearson correlation between human-based judgments and computer-based judgments was .65. Part 1 used human-coded data to investigate relationships between (1) 4 kinds of self-disclosure and question asking in thread-starting posts and (2) the amount of emotional and informational support in the first reply. 
Self-disclosure about negative emotions (beta=.24, P<.001), negative events (beta=.25, P<.001), and positive events (beta=.10, P=.02) increased emotional support. However, asking questions depressed emotional support (beta=-.21, P<.001). In contrast, asking questions increased informational support (beta=.38, P<.001), whereas positive informational self-disclosure depressed it (beta=-.09, P=.003). Self-disclosure led to the perception of emotional needs, which elicited emotional support, whereas asking questions led to the perception of informational needs, which elicited informational support. Part 2 used machine-coded data to replicate these results. Part 3 analyzed the machine-coded data and showed that exposure to more emotional support predicted staying in the group 33% longer (hazard ratio=0.67, P<.001), whereas exposure to more informational support predicted leaving the group sooner (hazard ratio=1.05, P<.001). Self-disclosure is effective in eliciting emotional support, whereas question asking is effective in eliciting informational support. Moreover, perceptions that people desire particular kinds of support influence the support they receive. Finally, the type of support people receive affects the likelihood of their staying in or leaving the group. These results demonstrate the utility of machine learning methods for investigating the dynamics of social support exchange in online support communities.
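
    The human-to-machine coding pipeline described above can be sketched roughly as follows. The TF-IDF features, the ridge regressor, and the tiny synthetic message set are illustrative assumptions only; the authors' actual models and the 1000-message MTurk training sets were far richer:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

# Tiny synthetic stand-in for the human-coded training data.
messages = [
    "i am so scared about my scan results",
    "feeling hopeful after good news today",
    "what dose did your oncologist recommend",
    "my biopsy came back and i cried all night",
    "so happy the treatment is finally over",
    "does anyone know the side effects of tamoxifen",
    "i feel alone and anxious about chemo",
    "grateful for this community and its advice",
]
# Hypothetical human-coded "negative emotional self-disclosure" scores.
human_scores = np.array([0.9, 0.1, 0.0, 0.8, 0.1, 0.0, 0.9, 0.2])

X = TfidfVectorizer().fit_transform(messages)
model = Ridge(alpha=1.0).fit(X[:6], human_scores[:6])  # train on the coded subset
machine_scores = model.predict(X)                      # score every message

# Agreement between human and machine judgments (the paper reports mean r = .65
# across its 7 attributes on real data).
r = np.corrcoef(human_scores, machine_scores)[0, 1]
print(round(r, 2))
```

    Once a model like this is trained, it can score the full 1.5-million-message corpus, which is what makes the survival analyses in Part 3 feasible.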

  14. Comparison of manually produced and automated cross country movement maps using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Wynn, L. K.

    1985-01-01

    The Image-Based Information System (IBIS) was used to automate the cross country movement (CCM) mapping model developed by the Defense Mapping Agency (DMA). Existing terrain factor overlays and a CCM map, produced by DMA for the Fort Lewis, Washington area, were digitized and reformatted into geometrically registered images. Terrain factor data from Slope, Soils, and Vegetation overlays were entered into IBIS, and were then combined utilizing IBIS-programmed equations to implement the DMA CCM model. The resulting IBIS-generated CCM map was then compared with the digitized manually produced map to test similarity. The numbers of pixels comprising each CCM region were compared between the two map images, and percent agreement between each pair of regional counts was computed. The mean percent agreement equalled 86.21%, with an areally weighted standard deviation of 11.11%. Calculation of Pearson's correlation coefficient yielded +.9997. In some cases, the IBIS-calculated map code differed from the DMA codes: analysis revealed that IBIS had calculated the codes correctly. These highly positive results demonstrate the power and accuracy of IBIS in automating models which synthesize a variety of thematic geographic data.
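
    The comparison statistics can be sketched as follows. The pixel counts are invented, and taking agreement as the ratio of the smaller to the larger regional count is one plausible reading of "percent agreement between regional counts", not DMA's specification:

```python
import numpy as np

# Hypothetical pixel counts per CCM region in the manual (DMA) and
# automated (IBIS) maps.
manual = np.array([12000, 8500, 30000, 4200, 15000], dtype=float)
ibis   = np.array([11400, 9100, 27500, 5000, 14100], dtype=float)

# Percent agreement for each pair of regional counts (assumed definition).
agreement = 100.0 * np.minimum(manual, ibis) / np.maximum(manual, ibis)

weights = manual / manual.sum()                # areally weight by region size
mean_agr = np.average(agreement, weights=weights)
std_agr = np.sqrt(np.average((agreement - mean_agr) ** 2, weights=weights))

r = np.corrcoef(manual, ibis)[0, 1]            # Pearson r of regional counts
print(round(mean_agr, 2), round(std_agr, 2), round(r, 4))
```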

  15. Generalized type II hybrid ARQ scheme using punctured convolutional coding

    NASA Astrophysics Data System (ADS)

    Kallel, Samir; Haccoun, David

    1990-11-01

    A method is presented to construct rate-compatible convolutional (RCC) codes from known high-rate punctured convolutional codes, obtained from the best rate-1/2 codes. The construction method is simple and straightforward, yet still yields good codes. Moreover, low-rate codes can be obtained without any limit on the lowest achievable code rate. Based on the RCC codes, a generalized type-II hybrid ARQ scheme, which combines the benefits of the modified type-II hybrid ARQ strategy of Hagenauer (1988) with the code-combining ARQ strategy of Chase (1985), is proposed and analyzed. With the proposed generalized type-II hybrid ARQ strategy, the throughput increases as the starting coding rate increases; as the channel degrades, it tends to merge with the throughput of rate-1/2 type-II hybrid ARQ schemes with code combining, thus allowing the system to be flexible and adaptive to channel conditions, even under wide noise variations and severe degradations.
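
    The puncturing idea underlying RCC code families can be illustrated with a minimal sketch. The generators (7, 5 octal, constraint length 3) and the puncturing pattern are generic textbook choices, not the specific codes constructed by Kallel and Haccoun:

```python
# Puncturing a rate-1/2 mother convolutional code to a higher rate: the
# building block of a rate-compatible (RCC) code family.

def conv_encode_r12(bits, g0=0b111, g1=0b101):
    """Rate-1/2 feedforward convolutional encoder, constraint length 3."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out.append(bin(state & g0).count("1") % 2)  # parity of taps for g0
        out.append(bin(state & g1).count("1") % 2)  # parity of taps for g1
    return out

def puncture(coded, pattern):
    """Keep coded bit i only where the (cyclic) pattern entry is 1."""
    return [c for i, c in enumerate(coded) if pattern[i % len(pattern)]]

data = [1, 0, 1, 1, 0, 0, 1, 0]
mother = conv_encode_r12(data)               # 16 coded bits -> rate 1/2
high_rate = puncture(mother, [1, 1, 1, 0])   # drop every 4th bit -> rate 2/3
print(len(mother), len(high_rate))           # 16 12
```

    Rate compatibility means the higher-rate codes are obtained by deleting bits from the same mother stream, so a receiver that fails to decode can request only the previously punctured bits and combine them, which is exactly what the generalized type-II hybrid ARQ scheme exploits.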

  16. Neyman-Pearson classification algorithms and NP receiver operating characteristics

    PubMed Central

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-01-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies. PMID:29423442

  17. Neyman-Pearson classification algorithms and NP receiver operating characteristics.

    PubMed

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-02-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies.
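
    The order-statistic thresholding at the heart of the NP umbrella algorithm can be sketched as follows (the paper's implementation lives in the R package nproc; this Python version is a simplified sketch). The threshold is chosen as the k-th order statistic of held-out class-0 scores, with k picked so that the chance of the true type I error exceeding α is at most a tolerance δ; any scoring classifier (logistic regression, SVM, random forest) can supply the scores:

```python
import numpy as np
from scipy.stats import binom

def np_threshold(class0_scores, alpha=0.05, delta=0.05):
    """Smallest order-statistic threshold whose probability of violating the
    type I error bound alpha is at most delta."""
    s = np.sort(class0_scores)
    n = len(s)
    for k in range(1, n + 1):
        # P(type I error of threshold s[k-1] exceeds alpha), bounded by a
        # binomial tail: sum_{j>=k} C(n,j) (1-alpha)^j alpha^(n-j)
        violation = binom.sf(k - 1, n, 1 - alpha)
        if violation <= delta:
            return s[k - 1]
    raise ValueError("n too small to control type I error at this alpha/delta")

rng = np.random.default_rng(1)
scores0 = rng.normal(0.0, 1.0, size=500)   # scores of held-out class-0 points
t = np_threshold(scores0, alpha=0.05, delta=0.05)
print(round(float(t), 2))                  # classify as class 1 only when score > t
```

    Note the contrast with the naive practice the abstract criticizes: simply taking the empirical α-quantile of the class-0 scores controls type I error only on average, while the binomial-tail choice of k controls it with high probability.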

  18. Comparison of two methods of MMPI-2 profile classification.

    PubMed

    Munley, P H; Germain, J M

    2000-10-01

    The present study investigated the extent of agreement of the highest scale method and the best-fit method in matching MMPI-2 profiles to database code-type profiles and considered profile characteristics that may relate to agreement or disagreement of code-type matches by these two methods. A sample of 519 MMPI-2 profiles that had been classified into database profile code types by these two methods was studied. Resulting code-type matches were classified into three groups: identical (30%), similar (39%), and different (31%), and the profile characteristics of profile elevation, dispersion, and profile code-type definition were studied. Profile code-type definition was significantly different across the three groups with identical and similar match profile groups showing greater profile code-type definition and the different group consisting of profiles that were less well-defined.

  19. Analyses of flood-flow frequency for selected gaging stations in South Dakota

    USGS Publications Warehouse

    Benson, R.D.; Hoffman, E.B.; Wipf, V.J.

    1985-01-01

    Analyses of flood flow frequency were made for 111 continuous-record gaging stations in South Dakota with 10 or more years of record. The analyses were developed using the log-Pearson Type III procedure recommended by the U.S. Water Resources Council. The procedure characterizes flood occurrence at a single site as a sequence of annual peak flows. The magnitudes of the annual peak flows are assumed to be independent random variables following a log-Pearson Type III probability distribution, which defines the probability that any single annual peak flow will exceed a specified discharge. By considering only annual peak flows, the flood-frequency analysis becomes the estimation of the log-Pearson annual-probability curve using the record of annual peak flows at the site. The recorded data are divided into two classes: systematic and historic. The systematic record includes all annual peak flows determined in the process of conducting a systematic gaging program at a site. In this program, the annual peak flow is determined for each and every year of the program. The systematic record is intended to constitute an unbiased and representative sample of the population of all possible annual peak flows at the site. In contrast to the systematic record, the historic record consists of annual peak flows that would not have been determined except for evidence indicating their unusual magnitude. Flood information acquired from historical sources almost invariably refers to floods of noteworthy, and hence extraordinary, size. Although historic records form a biased and unrepresentative sample, they can be used to supplement the systematic record. (Author's abstract)

  20. User's Manual for Program PeakFQ, Annual Flood-Frequency Analysis Using Bulletin 17B Guidelines

    USGS Publications Warehouse

    Flynn, Kathleen M.; Kirby, William H.; Hummel, Paul R.

    2006-01-01

    Estimates of flood flows having given recurrence intervals or probabilities of exceedance are needed for design of hydraulic structures and floodplain management. Program PeakFQ provides estimates of instantaneous annual-maximum peak flows having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years (annual-exceedance probabilities of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002, respectively). As implemented in program PeakFQ, the Pearson Type III frequency distribution is fit to the logarithms of instantaneous annual peak flows following Bulletin 17B guidelines of the Interagency Advisory Committee on Water Data. The parameters of the Pearson Type III frequency curve are estimated by the logarithmic sample moments (mean, standard deviation, and coefficient of skewness), with adjustments for low outliers, high outliers, historic peaks, and generalized skew. This documentation provides an overview of the computational procedures in program PeakFQ, provides a description of the program menus, and provides an example of the output from the program.
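
    The core moment fit that PeakFQ performs can be sketched as follows: fit a Pearson Type III distribution to the base-10 logs of annual peaks, then read off flood quantiles. The synthetic 60-year record stands in for a real gaging-station series, and the sketch omits the Bulletin 17B adjustments for outliers, historic peaks, and generalized skew:

```python
import numpy as np
from scipy.stats import pearson3, skew

rng = np.random.default_rng(42)
peaks_cfs = 10 ** rng.normal(3.5, 0.25, size=60)   # 60 years of annual peak flows

# Logarithmic sample moments (mean, standard deviation, skew coefficient)
logs = np.log10(peaks_cfs)
mu, sigma = logs.mean(), logs.std(ddof=1)
g = skew(logs, bias=False)                          # station skew

# Quantiles for selected recurrence intervals (annual-exceedance prob. 1/T)
for T in (2, 10, 100):
    q_log = pearson3.ppf(1 - 1 / T, g, loc=mu, scale=sigma)
    print(f"{T}-yr flood: {10 ** q_log:,.0f} cfs")
```

    scipy's `pearson3` takes the skew coefficient as its shape parameter, which maps directly onto the three logarithmic moments Bulletin 17B prescribes.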

  1. Quadriphase DS-CDMA wireless communication systems employing the generalized detector

    NASA Astrophysics Data System (ADS)

    Tuzlukov, Vyacheslav

    2012-05-01

    The bit-error probability (BEP) performance of asynchronous direct-sequence code-division multiple-access (DS-CDMA) wireless communication systems employing the generalized detector (GD), constructed on the basis of the generalized approach to signal processing in noise, is analyzed. The effects of pulse shaping, quadriphase or direct-sequence quadriphase-shift-keying (DS-QPSK) spreading, and aperiodic spreading sequences are considered in GD-based DS-CDMA and compared with the coherent Neyman-Pearson receiver. An exact BEP expression and several approximations are derived: one using the characteristic function method, a simplified expression for the improved Gaussian approximation (IGA), and the simplified improved Gaussian approximation. Under conditions typically satisfied in practice, and even with a small number of interferers, the standard Gaussian approximation (SGA) for the multiple-access interference component of the GD statistic is shown to give accurate BEP performance. Moreover, the IGA is shown to reduce to the SGA for pulses with zero excess bandwidth. The BEP performance of quadriphase DS-CDMA with the GD is also shown to be superior to that of biphase DS-CDMA. Numerical examples obtained by Monte Carlo simulation illustrate the GD BEP performance for square-root raised-cosine pulses and spreading factors of moderate to large values. Finally, the superiority of GD employment in CDMA systems over the Neyman-Pearson receiver is demonstrated.

  2. A novel fractal image compression scheme with block classification and sorting based on Pearson's correlation coefficient.

    PubMed

    Wang, Jianji; Zheng, Nanning

    2013-09-01

    Fractal image compression (FIC) is an image coding technology based on the local similarity of image structure. It is widely used in many fields such as image retrieval, image denoising, image authentication, and encryption. FIC, however, suffers from high computational complexity in encoding. Although many schemes have been published to speed up encoding, they do not easily satisfy the encoding time or reconstructed image quality requirements. In this paper, a new FIC scheme is proposed based on the fact that the affine similarity between two blocks in FIC is equivalent to the absolute value of Pearson's correlation coefficient (APCC) between them. First, all blocks in the range and domain pools are chosen and classified using an APCC-based block classification method to increase the matching probability. Second, by sorting the domain blocks with respect to APCCs between these domain blocks and a preset block in each class, the matching domain block for a range block can be searched in the selected domain set in which these APCCs are closer to the APCC between the range block and the preset block. Experimental results show that the proposed scheme can significantly speed up the encoding process in FIC while preserving the reconstructed image quality well.
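
    The APCC quantity itself is easy to sketch: the affine match error between two blocks is minimized when the absolute Pearson correlation between them is maximal, so candidate domain blocks can be ranked by APCC instead of exhaustively tested. The random 8x8 blocks below are stand-ins for real image data, and the full scheme's class-based search is reduced here to a single ranking:

```python
import numpy as np

def apcc(a, b):
    """Absolute Pearson correlation coefficient between two image blocks."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return abs(np.corrcoef(a, b)[0, 1])

rng = np.random.default_rng(7)
range_block = rng.integers(0, 256, size=(8, 8))
domains = [rng.integers(0, 256, size=(8, 8)) for _ in range(100)]

# Rank candidate domain blocks by APCC with the range block; in the paper the
# search is further restricted to a class with similar APCC to a preset block.
scores = [apcc(range_block, d) for d in domains]
best = int(np.argmax(scores))
print(best, round(scores[best], 3))
```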

  3. Palliative Care for Hospitalized Patients With Stroke: Results From the 2010 to 2012 National Inpatient Sample.

    PubMed

    Singh, Tarvinder; Peters, Steven R; Tirschwell, David L; Creutzfeldt, Claire J

    2017-09-01

    Substantial variability exists in the use of life-prolonging treatments for patients with stroke, especially near the end of life. This study explores patterns of palliative care utilization and death in hospitalized patients with stroke across the United States. Using the 2010 to 2012 nationwide inpatient sample databases, we included all patients discharged with stroke identified by International Classification of Diseases-Ninth Revision codes. Strokes were subclassified as ischemic, intracerebral, and subarachnoid hemorrhage. We compared demographics, comorbidities, procedures, and outcomes between patients with and without a palliative care encounter (PCE) as defined by the International Classification of Diseases-Ninth Revision code V66.7. The Pearson χ2 test was used for categorical variables. Multivariate logistic regression was used to account for hospital, regional, payer, and medical severity factors to predict PCE use and death. Among 395,411 patients with stroke, PCE was used in 6.2% with an increasing trend over time (P<0.05). We found a wide range in PCE use, with higher rates in patients with older age, hemorrhagic stroke types, women, and white race (all P<0.001). Smaller and for-profit hospitals saw lower rates. Overall, 9.2% of hospitalized patients with stroke died, and PCE was significantly associated with death. Length of stay in decedents was shorter for patients who received PCE. Palliative care use is increasing nationally for patients with stroke, especially in larger hospitals. Persistent disparities in PCE use and mortality exist with regard to age, sex, race, region, and hospital characteristics. Given the variations in PCE use, especially at the end of life, the use of mortality rates as a hospital quality measure is questioned. © 2017 The Authors.

  4. FRACTIONAL PEARSON DIFFUSIONS.

    PubMed

    Leonenko, Nikolai N; Meerschaert, Mark M; Sikorskii, Alla

    2013-07-15

    Pearson diffusions are governed by diffusion equations with polynomial coefficients. Fractional Pearson diffusions are governed by the corresponding time-fractional diffusion equation. They are useful for modeling sub-diffusive phenomena, caused by particle sticking and trapping. This paper provides explicit strong solutions for fractional Pearson diffusions, using spectral methods. It also presents stochastic solutions, using a non-Markovian inverse stable time change.

  5. Correlation Structure of Fractional Pearson Diffusions.

    PubMed

    Leonenko, Nikolai N; Meerschaert, Mark M; Sikorskii, Alla

    2013-09-01

    The stochastic solution to a diffusion equation with polynomial coefficients is called a Pearson diffusion. If the first time derivative is replaced by a Caputo fractional derivative of order less than one, the stochastic solution is called a fractional Pearson diffusion. This paper develops an explicit formula for the covariance function of a fractional Pearson diffusion in steady state, in terms of Mittag-Leffler functions. That formula shows that fractional Pearson diffusions are long range dependent, with a correlation that falls off like a power law, whose exponent equals the order of the fractional derivative.

  6. Zero Pearson coefficient for strongly correlated growing trees.

    PubMed

    Dorogovtsev, S N; Ferreira, A L; Goltsev, A V; Mendes, J F F

    2010-03-01

    We obtained Pearson's coefficient of strongly correlated recursive networks growing by preferential attachment of every new vertex by m edges. We found that the Pearson coefficient is exactly zero in the infinite network limit for the recursive trees (m=1). If the number of connections of new vertices exceeds one (m>1), then the Pearson coefficient in the infinite networks equals zero only when the degree distribution exponent gamma does not exceed 4. We calculated the Pearson coefficient for finite networks and observed a slow power-law-like approach to an infinite network limit. Our findings indicate that Pearson's coefficient strongly depends on size and details of networks, which makes this characteristic virtually useless for quantitative comparison of different networks.
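
    The quantity in question, the Pearson coefficient of the degrees at either end of an edge, can be measured on a simulated preferential-attachment tree (m=1), the case where the paper proves the coefficient vanishes in the infinite-network limit. This is a minimal stub-list simulation in plain Python, not the authors' analytical calculation:

```python
import random

def grow_pa_tree(n, seed=0):
    """Grow a recursive tree: each new vertex attaches to an existing vertex
    chosen with probability proportional to its degree."""
    rng = random.Random(seed)
    edges = [(0, 1)]
    stubs = [0, 1]                  # each appearance = one unit of degree
    for v in range(2, n):
        target = rng.choice(stubs)  # preferential attachment
        edges.append((target, v))
        stubs += [target, v]
    return edges

def degree_pearson(edges):
    """Pearson correlation of degrees over edge endpoints (assortativity)."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # Degrees at the two ends of every edge, counted in both directions.
    xs = [deg[u] for u, v in edges] + [deg[v] for u, v in edges]
    ys = [deg[v] for u, v in edges] + [deg[u] for u, v in edges]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = degree_pearson(grow_pa_tree(20000))
print(round(r, 3))   # approaches 0 slowly as the tree grows
```

    Repeating this for increasing n exhibits the slow power-law-like approach to the infinite-network limit that the paper describes, which is why finite-size values of the coefficient are unreliable for comparing networks.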

  7. Can poison control data be used for pharmaceutical poisoning surveillance?

    PubMed

    Naun, Christopher A; Olsen, Cody S; Dean, J Michael; Olson, Lenora M; Cook, Lawrence J; Keenan, Heather T

    2011-05-01

    To determine the association between the frequencies of pharmaceutical exposures reported to a poison control center (PCC) and those seen in the emergency department (ED). A statewide population-based retrospective comparison of frequencies of ED pharmaceutical poisonings with frequencies of pharmaceutical exposures reported to a regional PCC. ED poisonings, identified by International Classification of Diseases, Ninth Revision (ICD-9) codes, were grouped into substance categories. Using a reproducible algorithm facilitated by probabilistic linkage, codes from the PCC classification system were mapped into the same categories. A readily identifiable subset of PCC calls was selected for comparison. Correlations between frequencies of quarterly exposures by substance categories were calculated using Pearson correlation coefficients and partial correlation coefficients with adjustment for seasonality. PCC-reported exposures correlated with ED poisonings in nine of 10 categories. Partial correlation coefficients (r(p)) indicated strong associations (r(p)>0.8) for three substance categories that underwent large changes in their incidences (opiates, benzodiazepines, and muscle relaxants). Six substance categories were moderately correlated (r(p)>0.6). One category, salicylates, showed no association. Limitations: Imperfect overlap between ICD-9 and PCC codes may have led to miscategorization. Substances without changes in exposure frequency have inadequate variability to detect association using this method. PCC data are able to effectively identify trends in poisonings seen in EDs and may be useful as part of a pharmaceutical poisoning surveillance system. The authors developed an algorithm-driven technique for mapping American Association of Poison Control Centers codes to ICD-9 codes and identified a useful subset of poison control exposures for analysis.

  8. On the Evolution of the Standard Genetic Code: Vestiges of Critical Scale Invariance from the RNA World in Current Prokaryote Genomes

    PubMed Central

    José, Marco V.; Govezensky, Tzipe; García, José A.; Bobadilla, Juan R.

    2009-01-01

    Herein, two genetic codes through which the primeval RNA code could have given rise to the standard genetic code (SGC) are derived. One of them, called extended RNA code type I, consists of all codons of the type RNY (purine-any base-pyrimidine) plus codons obtained by considering the RNA code but in the second (NYR type) and third (YRN type) reading frames. The extended RNA code type II comprises all codons of the type RNY plus codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR type) nucleotide bases. In order to test whether putative nucleotide sequences in the RNA World and in both extended RNA codes share the same scaling and statistical properties as those encountered in current prokaryotes, we used the genomes of four Eubacteria and three Archaea. For each prokaryote, we obtained their respective genomes obeying the RNA code or the extended RNA codes types I and II. In each case, we estimated the scaling properties of triplet sequences via a renormalization group approach, and we calculated the frequency distributions of distances for each codon. Remarkably, the scaling properties of the distance series of some codons from the RNA code and most codons from both extended RNA codes turned out to be identical or very close to the scaling properties of codons of the SGC. To test the robustness of these results, we show, via computer simulation experiments, that random mutations of current genomes, at a rate of 10^-10 per site per year over three billion years, were not enough to destroy the observed patterns. Therefore, we conclude that most current prokaryotes may still contain relics of the primeval RNA World and that both extended RNA codes may well represent two plausible evolutionary paths between the RNA code and the current SGC. PMID:19183813

  9. Novel Multidimensional Cross-Correlation Data Comparison Techniques for Spectroscopic Discernment in a Volumetrically Sensitive, Moderating Type Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Hoshor, Cory; Young, Stephan; Rogers, Brent; Currie, James; Oakes, Thomas; Scott, Paul; Miller, William; Caruso, Anthony

    2014-03-01

    A novel application of the Pearson Cross-Correlation to neutron spectral discernment in a moderating type neutron spectrometer is introduced. This cross-correlation analysis will be applied to spectral response data collected through both MCNP simulation and empirical measurement by the volumetrically sensitive spectrometer for comparison in 1, 2, and 3 spatial dimensions. The spectroscopic analysis methods discussed will be demonstrated to discern various common spectral and monoenergetic neutron sources.

  10. iGC: an integrated analysis package of gene expression and copy number alteration.

    PubMed

    Lai, Yi-Pin; Wang, Liang-Bo; Wang, Wei-An; Lai, Liang-Chuan; Tsai, Mong-Hsun; Lu, Tzu-Pin; Chuang, Eric Y

    2017-01-14

    With the advancement in high-throughput technologies, researchers can simultaneously investigate gene expression and copy number alteration (CNA) data from individual patients at a lower cost. Traditional analysis methods analyze each type of data individually and integrate their results using Venn diagrams. Challenges arise, however, when the results are irreproducible and inconsistent across multiple platforms. To address these issues, one possible approach is to concurrently analyze both gene expression profiling and CNAs in the same individual. We have developed an open-source R/Bioconductor package (iGC). Multiple input formats are supported and users can define their own criteria for identifying differentially expressed genes driven by CNAs. The analysis of two real microarray datasets demonstrated that the CNA-driven genes identified by the iGC package showed significantly higher Pearson correlation coefficients with their gene expression levels and copy numbers than those genes located in a genomic region with CNA. Compared with the Venn diagram approach, the iGC package showed better performance. The iGC package is effective and useful for identifying CNA-driven genes. By simultaneously considering both comparative genomic and transcriptomic data, it can provide better understanding of biological and medical questions. The iGC package's source code and manual are freely available at https://www.bioconductor.org/packages/release/bioc/html/iGC.html .

  11. An Investigation of Integrative and Independent Listening Test Tasks in a Computerised Academic English Test

    ERIC Educational Resources Information Center

    Wei, Wei; Zheng, Ying

    2017-01-01

    This research provided a comprehensive evaluation and validation of the listening section of a newly introduced computerised test, Pearson Test of English Academic (PTE Academic). PTE Academic contains 11 item types assessing academic listening skills either alone or in combination with other skills. First, task analysis helped identify skills…

  12. College Students’ Perceived Differences Between the Terms Real Meal, Meal, and Snack

    PubMed Central

    Banna, Jinan; Richards, Rickelle; Brown, Lora Beth

    2017-01-01

    Objective To assess qualitatively and quantitatively college students’ perceived differences between a real meal, meal, and snack. Design A descriptive study design was used to administer an 11-item online survey to college students. Setting Two university campuses in the western US. Participants Pilot testing was conducted with 20 students. The final survey was completed by 628 ethnically diverse students. Main Outcome Measures Students’ perceptions of the terms real meal, meal, and snack. Analysis Three researchers coded the data independently, reconciled differences via conference calls, and agreed on a final coding scheme. Data were reevaluated based on the coding scheme. Means, frequencies, Pearson chi-square, and t test statistics were used. Results More than half of students perceived a difference between the terms real meal and meal. Most (97.6%) perceived a difference between the terms meal and snack. A marked difference in the way students defined these terms was evident, with a real meal deemed nutritious and healthy and meeting dietary recommendations, compared with meals, which were considered anything to eat. Conclusions and Implications These findings suggest that the term real meal may provide nutrition educators with a simple phrase to use in educational campaigns to promote healthful food intake among college students. PMID:27993555

  13. Genotyping and drug resistance patterns of M. tuberculosis strains in Pakistan.

    PubMed

    Tanveer, Mahnaz; Hasan, Zahra; Siddiqui, Amna R; Ali, Asho; Kanji, Akbar; Ghebremicheal, Solomon; Hasan, Rumina

    2008-12-24

    The incidence of tuberculosis in Pakistan is 181/100,000 population. However, information about the transmission and geographical prevalence of Mycobacterium tuberculosis strains, their evolutionary genetics and their drug resistance remains limited. Our objective was to determine the clonal composition, evolutionary genetics and drug resistance of M. tuberculosis isolates from different regions of the country. M. tuberculosis strains isolated (2003-2005) from specimens submitted to the laboratory through collection units nationwide were included. Drug susceptibility testing was performed and strains were spoligotyped. Of 926 M. tuberculosis strains studied, 721 (78%) were grouped into 59 "shared types", while 205 (22%) were identified as "orphan" spoligotypes. Amongst the predominant genotypes, 61% were Central Asian strains (CAS; including CAS1, CAS sub-families and orphan Pak clusters), 4% East African-Indian (EAI), 3% Beijing, 2% poorly defined TB strains (T), 2% Haarlem and 0.2% LAM. TbD1 analysis (M. tuberculosis specific deletion 1) confirmed that CAS1 was of "modern" origin while EAI isolates belonged to "ancestral" strain types. Prevalence of the CAS1 clade was significantly higher in Punjab (P < 0.01, Pearson's chi-square test) than in the Sindh, North West Frontier and Balochistan provinces. Forty-six percent of isolates were sensitive to the five first-line antibiotics tested, 45% were rifampicin-resistant and 50% isoniazid-resistant. MDR was significantly associated with Beijing strains (P = 0.01, Pearson's chi-square test) and EAI (P = 0.001, Pearson's chi-square test), but not with the CAS family. Our results show variation in the prevalent M. tuberculosis strains, with a greater association of CAS1 with the Punjab province. The fact that the prevalent CAS genotype was not associated with drug resistance is encouraging. It further suggests that an effective treatment and control programme should succeed in reducing the tuberculosis burden in Pakistan.
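    The association tests reported here are Pearson's chi-square on contingency tables of strain lineage versus phenotype. A minimal sketch of the statistic (the 2x2 counts below are hypothetical, not the study's data):

    ```python
    import numpy as np

    def pearson_chi2(table):
        """Pearson's chi-square statistic and degrees of freedom for an
        r x c contingency table: sum of (observed - expected)^2 / expected."""
        obs = np.asarray(table, dtype=float)
        row = obs.sum(axis=1, keepdims=True)
        col = obs.sum(axis=0, keepdims=True)
        exp = row @ col / obs.sum()          # expected counts under independence
        stat = float(((obs - exp) ** 2 / exp).sum())
        df = (obs.shape[0] - 1) * (obs.shape[1] - 1)
        return stat, df

    # Hypothetical 2x2 table: rows = lineage (Beijing, other), cols = (MDR, non-MDR).
    stat, df = pearson_chi2([[12, 18], [40, 20]])
    # Compare stat against the critical value (3.84 at alpha = 0.05, df = 1).
    ```

    In practice one would obtain the p-value from the chi-square distribution with `df` degrees of freedom (e.g. via `scipy.stats.chi2_contingency`).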

  14. The Battle Between the Biometricians and the Mendelians: How Sir Francis Galton's Work Caused his Disciples to Reach Conflicting Conclusions About the Hereditary Mechanism

    NASA Astrophysics Data System (ADS)

    Gillham, Nicholas W.

    2015-01-01

    Francis Galton, Charles Darwin's cousin, had wide and varied interests. They ranged from exploration and travel writing to fingerprinting and the weather. After reading Darwin's On the Origin of Species, Galton reached the conclusion that it should be possible to improve the human stock through selective breeding, as was the case for domestic animals and cultivated plants. Much of the latter half of Galton's career was devoted to trying to devise methods to distinguish men of good stock and then to show that these qualities were inherited. But along the way he invented two important statistical methods: regression and correlation. He also discovered regression to the mean. This led Galton to believe that evolution could not proceed by the small steps envisioned by Darwin, but must proceed by discontinuous changes. Galton's book Natural Inheritance (1889) served as the inspiration for Karl Pearson, W.F.R. Weldon and William Bateson. Pearson and Weldon were interested in continuously varying characters and the application of statistical techniques to their study. Bateson was fascinated by discontinuities and the role they might play in evolution. Galton proposed his Law of Ancestral Heredity in the last decade of the nineteenth century. At first this seemed to work well as an explanation for continuously varying traits of the type that interested Pearson and Weldon. In contrast, Bateson had published a book on discontinuously varying traits so he was in a position to understand and embrace Mendel's principles of inheritance when they were rediscovered in 1900. The subsequent battle between Weldon and Pearson, the biometricians, and Bateson, the Mendelian, went on acrimoniously for several years at the beginning of the twentieth century before Mendelian theory finally won out.

  15. Neyman-Pearson biometric score fusion as an extension of the sum rule

    NASA Astrophysics Data System (ADS)

    Hube, Jens Peter

    2007-04-01

    We define the biometric performance invariance under strictly monotonic functions on match scores as normalization symmetry. We use this symmetry to clarify the essential difference between the standard score-level fusion approaches of sum rule and Neyman-Pearson. We then express Neyman-Pearson fusion assuming match scores defined using false acceptance rates on a logarithmic scale. We show that by stating Neyman-Pearson in this form, it reduces to sum rule fusion for ROC curves with logarithmic slope. We also introduce a one parameter model of biometric performance and use it to express Neyman-Pearson fusion as a weighted sum rule.

  16. Genetic hotels for the standard genetic code: evolutionary analysis based upon novel three-dimensional algebraic models.

    PubMed

    José, Marco V; Morgado, Eberto R; Govezensky, Tzipe

    2011-07-01

    Herein, we rigorously develop novel 3-dimensional algebraic models called Genetic Hotels of the Standard Genetic Code (SGC). We start by considering the primeval RNA genetic code, which consists of the 16 codons of type RNY (purine-any base-pyrimidine). Using simple algebraic operations, we show how the RNA code could have evolved toward the current SGC via two different intermediate evolutionary stages called Extended RNA codes type I and II. By rotations or translations of the subset RNY, we arrive at the SGC via the former (type I) or the latter (type II), respectively. Biologically, the Extended RNA code type I consists of all codons of the type RNY plus the codons obtained by considering the RNA code in the second (NYR type) and third (YRN type) reading frames. The Extended RNA code type II comprises all codons of the type RNY plus the codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR type) nucleotide bases. Since the dimensions of remarkable subsets of the Genetic Hotels are not necessarily integers, we also introduce the concept of algebraic fractal dimension. A general decoding function which maps each codon to its corresponding amino acid or stop signal is also derived. The Phenotypic Hotel of amino acids is also illustrated. The proposed evolutionary paths are discussed in terms of existing theories of the evolution of the SGC. The adoption of 3-dimensional models of the Genetic and Phenotypic Hotels will facilitate the understanding of the biological properties of the SGC.

  17. Types of homes and ways of life: a territorial analysis of the environmental determinants that factor into the proliferation of malaria vectors in the rural region of Allada in Benin.

    PubMed

    Lysaniuk, Benjamin; Ladsous, Roman; Tabeaud, Martine; Cottrell, Gilles; Pennetier, Cédric; Garcia, André

    2015-01-01

    Anthropogenic factors, as well as environmental factors, can explain fine-scale spatial differences in vector densities and seasonal variations in malaria. In this pilot study, numbers of Anopheles gambiae were quantified in concessions in a rural area of southern Benin, West Africa, in order to establish whether vector number and human factors, such as habitat and living practices, are related. The courtyard homes of 64 concessions (houses and private yards) were systematically and similarly photographed. Predefined features in the photographed items were extracted by applying an analysis grid that listed vector resting sites or potential breeding sites, as well as more general information about the building materials used. These data were analysed with respect to entomological data (number of mosquitoes caught per night) using the Kruskal-Wallis test, Pearson correlation coefficients, and analysis of covariance (ANCOVA). Three recurrent habitat/household types and living practices were identified that corresponded to different standards of living. These were related to the average number of mosquitoes captured per night (type I=0.88 anopheles/night; type II=0.85; type III=0.55), but the differences were not statistically significant (Kruskal-Wallis test; p=0.41). There was no significant relationship between the number of potential breeding sites and the number of mosquitoes caught (Pearson's correlation coefficient=-0.09, p=0.53). ANCOVA of building materials and numbers of openings did not explain variation in the number of mosquitoes caught. Three dwelling types were identified using predetermined socio-environmental characteristics, but no association was found in this study between vector number and habitat characteristics, as had been suspected.

  18. A New Family of Solvable Pearson-Dirichlet Random Walks

    NASA Astrophysics Data System (ADS)

    Le Caër, Gérard

    2011-07-01

    An n-step Pearson-Gamma random walk in ℝ^d starts at the origin and consists of n independent steps with gamma-distributed lengths and uniform orientations. The gamma distribution of each step length has a shape parameter q>0. Constrained random walks of n steps in ℝ^d are obtained from the latter walks by imposing that the sum of the step lengths equal a fixed value. Simple closed-form expressions were obtained in particular for the distribution of the endpoint of such constrained walks for any d≥d₀ and any n≥2 when q is either q=d/2-1 (d₀=3) or q=d-1 (d₀=2) (Le Caër in J. Stat. Phys. 140:728-751, 2010). When the total walk length is chosen, without loss of generality, to be equal to 1, the constrained step lengths have a Dirichlet distribution whose parameters are all equal to q, and the associated walk is thus named a Pearson-Dirichlet random walk. The density of the endpoint position of an n-step planar walk of this type (n≥2), with q=d=2, was shown recently to be a weighted mixture of 1+floor(n/2) endpoint densities of planar Pearson-Dirichlet walks with q=1 (Beghin and Orsingher in Stochastics 82:201-229, 2010). This result is generalized to any walk space dimension and any number of steps n≥2 when the parameter of the Pearson-Dirichlet random walk is q=d>1. We rely on the connection between an unconstrained random walk and a constrained one, both having the same n and the same q=d, to obtain a closed-form expression for the endpoint density. The latter is a weighted mixture of 1+floor(n/2) densities with simple forms, equivalently expressed as a product of a power and a Gauss hypergeometric function. The weights are products of factors which depend on both d and n, and of Bessel numbers independent of d.
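    The walk described above is easy to simulate: draw Dirichlet(q, ..., q) step lengths (so they sum to 1) and independent uniform directions on the unit sphere. A small sketch under those assumptions (a simulation aid, not the paper's analytical result):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def pearson_dirichlet_endpoint(n, d, q, rng):
        """Endpoint of an n-step Pearson-Dirichlet random walk in R^d.

        Step lengths are Dirichlet(q, ..., q), hence sum to 1; directions
        are independent and uniform on the unit sphere (normalized Gaussians).
        """
        lengths = rng.dirichlet([q] * n)            # shape (n,), sums to 1
        dirs = rng.normal(size=(n, d))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        return lengths @ dirs                       # vector sum of the steps

    # One planar 3-step walk with q = d = 2, the case treated analytically above.
    end = pearson_dirichlet_endpoint(n=3, d=2, q=2.0, rng=rng)
    # Total walk length is 1, so the endpoint always lies in the closed unit disk.
    ```

    Repeating the draw many times and histogramming ||end|| gives an empirical check of the closed-form endpoint density.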

  19. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles of both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
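    A girth-4 cycle in a parity-check matrix corresponds to two columns that share ones in two or more rows, which is the condition the joint design above eliminates. A generic detection sketch (illustrative only; the paper's type I/II classification for quasi-cyclic structure is not reproduced here):

    ```python
    import numpy as np

    def has_girth4_cycle(H):
        """True if the Tanner graph of binary parity-check matrix H contains a
        length-4 cycle, i.e. some pair of columns shares ones in >= 2 rows."""
        H = np.asarray(H) % 2
        overlap = H.T @ H                 # overlap[i, j] = rows where cols i, j both have a 1
        np.fill_diagonal(overlap, 0)      # ignore a column paired with itself
        return bool((overlap >= 2).any())

    # Columns 0 and 1 share ones in rows 0 and 1 -> a 4-cycle exists.
    H_bad = [[1, 1, 0],
             [1, 1, 1],
             [0, 0, 1]]
    # No pair of columns shares more than one row -> girth > 4.
    H_good = [[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]]
    ```

    Short cycles degrade iterative (belief-propagation) decoding, which is why girth-4-free designs decode better.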

  20. Correlation Between Posttraumatic Growth and Posttraumatic Stress Disorder Symptoms Based on Pearson Correlation Coefficient: A Meta-Analysis.

    PubMed

    Liu, An-Nuo; Wang, Lu-Lu; Li, Hui-Ping; Gong, Juan; Liu, Xiao-Hong

    2017-05-01

    The literature on posttraumatic growth (PTG) is burgeoning, and inconsistencies in the reported relationship between PTG and posttraumatic stress disorder (PTSD) symptoms have become a focal point of attention. This meta-analysis therefore explores the relationship between PTG and PTSD symptoms through the Pearson correlation coefficient. A systematic search of the literature from January 1996 to November 2015 was completed. We retrieved reports on 63 studies involving 26,951 patients. The weighted correlation coefficient revealed an effect size of 0.22 with a 95% confidence interval of 0.18 to 0.25. The meta-analysis provides evidence that PTG may be positively correlated with PTSD symptoms and that this correlation may be modified by age, trauma type, and time since trauma. Accordingly, people with high levels of PTG should not be ignored; rather, they should continue to receive help to alleviate their PTSD symptoms.
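    Pooling Pearson correlations across studies is commonly done by Fisher z-transforming each r, taking an inverse-variance-weighted mean (weight n - 3), and back-transforming. A minimal fixed-effect sketch (the study values below are hypothetical; the paper does not state which pooling model it used):

    ```python
    import math

    def pooled_r(rs, ns):
        """Fixed-effect pooled Pearson r via Fisher's z transform.

        Each study's z = atanh(r) has approximate variance 1/(n - 3),
        so studies are weighted by n - 3 before back-transforming.
        """
        zs = [math.atanh(r) for r in rs]
        ws = [n - 3 for n in ns]
        z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
        return math.tanh(z_bar)

    # Hypothetical study-level correlations and sample sizes.
    r_pooled = pooled_r([0.15, 0.25, 0.30], [200, 150, 100])
    ```

    A random-effects model would add a between-study variance term to each weight; the transform-and-back-transform skeleton stays the same.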

  1. National Underground Mines Inventory

    DTIC Science & Technology

    1983-10-01

    system is well designed to minimize water accumulation on the drift levels. In many areas, sufficient water has accumulated to make the use of boots a...four characters designate field office. 17-18 State Code Pic 99 FIPS code for state in which mine is located. 19-21 County Code Pic 999 FIPS code for... Designates a general product class based on SIC code. 28-29 Mine Type Pic 99 Metal/Nonmetal mine type code. Based on subunit operations code and canvass code

  2. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    PubMed

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm with commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess a constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of automatic design by software. The proposed method provides a smart tool for realizing various functional devices and systems automatically.

  3. 50 CFR Table 1c to Part 679 - Product Type Codes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Product Type Codes 1c Table 1c to Part..., Table 1c Table 1c to Part 679—Product Type Codes Description Code Ancillary product.A product, such as... the highest recovery rate. P Reprocessed or rehandled product.A product, such as meal, that results...

  4. Measurement of tibial tuberosity-trochlear groove distance: evaluation of inter- and intraobserver correlation dependent on the severity of trochlear dysplasia.

    PubMed

    Dornacher, Daniel; Reichel, Heiko; Lippacher, Sabine

    2014-10-01

    Excessive tibial tuberosity-trochlear groove distance (TT-TG) is considered one of the major risk factors in patellofemoral instability (PFI). TT-TG characterises the lateralisation of the tibial tuberosity and the medialisation of the trochlear groove in the case of trochlear dysplasia. The aim of this study was to assess the inter- and intraobserver reliability of the measurement of TT-TG as a function of the grade of trochlear dysplasia. Magnetic resonance imaging (MRI) scans of 99 consecutive knee joints were analysed retrospectively. Of these, 61 knee joints presented with a history of PFI and 38 had no symptoms of PFI. After synopsis of the axial MRI scans with true lateral radiographs of the knee, the 61 knees presenting with PFI were assessed in terms of trochlear dysplasia. The knees were classified according to the four-type system described by Dejour. For interobserver correlation of the TT-TG measurements in trochlear dysplasia, we found Pearson's correlation coefficients of r=0.89 (type A), r=0.90 (type B), r=0.74 (type C) and r=0.62 (type D). For intraobserver correlation, we calculated r=0.89 (type A), r=0.91 (type B), r=0.77 (type C) and r=0.71 (type D), respectively. Pearson's correlation coefficient for the measurement of TT-TG in normal knees was r=0.87 for interobserver correlation and r=0.90 for intraobserver correlation. Decreasing inter- and intraobserver correlation for the measurement of TT-TG with increasing severity of trochlear dysplasia was detected. In our opinion, the measurement of TT-TG is of significance in low-grade trochlear dysplasia. The decision to perform a distal realignment procedure based on a pathological TT-TG in the presence of high-grade trochlear dysplasia should be reassessed carefully. Retrospective study, Level II.

  5. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity-check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular repeat-accumulate (IRA) codes.

  6. Practical scheme for optimal measurement in quantum interferometric devices

    NASA Astrophysics Data System (ADS)

    Takeoka, Masahiro; Ban, Masashi; Sasaki, Masahide

    2003-06-01

    We apply a Kennedy-type detection scheme, which was originally proposed for a binary communications system, to interferometric sensing devices. We show that the minimum detectable perturbation of the proposed system reaches the ultimate precision bound which is predicted by quantum Neyman-Pearson hypothesis testing. To provide concrete examples, we apply our interferometric scheme to phase shift detection by using coherent and squeezed probe fields.

  7. [Calculating Pearson residual in logistic regressions: a comparison between SPSS and SAS].

    PubMed

    Xu, Hao; Zhang, Tao; Li, Xiao-song; Liu, Yuan-yuan

    2015-01-01

    To compare the results of Pearson residual calculations in logistic regression models using SPSS and SAS. We reviewed Pearson residual calculation methods and used two sets of data to test logistic models constructed in SPSS and SAS. One model contained a small number of covariates relative to the number of observations; the other contained a number of covariates similar to the number of observations. The two software packages produced similar Pearson residual estimates when the number of covariates was similar to the number of observations, but the results differed when the number of observations was much greater than the number of covariates. The two software packages produce different Pearson residual results, especially when the models contain a small number of covariates. Further studies are warranted.
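    For a binary logistic model, the ungrouped Pearson residual for each observation is the standardized difference between the outcome and the fitted probability. A minimal sketch of that definition (illustrative values; grouped-data variants, which are one source of the cross-package differences discussed above, divide by sqrt(m * p * (1 - p)) for m trials per covariate pattern):

    ```python
    import math

    def pearson_residuals(y, p):
        """Ungrouped Pearson residuals for a binary logistic model:
        r_i = (y_i - p_i) / sqrt(p_i * (1 - p_i))."""
        return [(yi - pi) / math.sqrt(pi * (1 - pi)) for yi, pi in zip(y, p)]

    # Observed 0/1 outcomes and fitted probabilities (hypothetical values).
    res = pearson_residuals([1, 0, 1, 0], [0.8, 0.3, 0.6, 0.1])
    ```

    Summing the squared residuals gives the Pearson chi-square goodness-of-fit statistic for the model.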

  8. Revised catalog of types of CODES applications implemented using linked state data : crash outcome data evaluation system (CODES)

    DOT National Transportation Integrated Search

    2000-06-01

    The purpose of the Revised Catalog of Types of CODES Applications Implemented Using Linked : State Data (CODES) is to inspire the development of new applications for linked data that support : efforts to reduce death, disability, severity, and health...

  9. Karl Pearson and eugenics: personal opinions and scientific rigor.

    PubMed

    Delzell, Darcie A P; Poliak, Cathy D

    2013-09-01

    The influence of personal opinions and biases on scientific conclusions is a threat to the advancement of knowledge. Expertise and experience do not render one immune to this temptation. In this work, one of the founding fathers of statistics, Karl Pearson, is used as an illustration of how even the most talented among us can produce misleading results when inferences are made without caution or reference to potential bias and other analysis limitations. A study performed by Pearson on British Jewish schoolchildren is examined in light of ethical and professional statistical practice. The methodology used and the inferences made by Pearson and his coauthor are sometimes questionable and offer insight into how Pearson's support of eugenics and his own British nationalism could have influenced his often careless and far-fetched inferences. A short background on Pearson's work and beliefs is provided, along with an in-depth examination of the authors' overall experimental design and statistical practices. In addition, the portions of the study regarding intelligence and tuberculosis are discussed in more detail, along with historical reactions to the work.

  10. Efficient Type Representation in TAL

    NASA Technical Reports Server (NTRS)

    Chen, Juan

    2009-01-01

    Certifying compilers generate proofs for low-level code that guarantee safety properties of the code. Type information is an essential part of safety proofs, but the size of type information remains a concern for certifying compilers in practice. This paper demonstrates type representation techniques in a large-scale compiler that achieve both concise type information and efficient type checking. In our 200,000-line certifying compiler, the size of type information is about 36% of the size of pure code and data for our benchmarks, the most compact result we are aware of. The type checking time is about 2% of the compilation time.

  11. Protein Solvent-Accessibility Prediction by a Stacked Deep Bidirectional Recurrent Neural Network.

    PubMed

    Zhang, Buzhong; Li, Linqing; Lü, Qiang

    2018-05-25

    Residue solvent accessibility is closely related to the spatial arrangement and packing of residues. Predicting the solvent accessibility of a protein is an important step toward understanding its structure and function. In this work, we present a deep learning method to predict residue solvent accessibility, based on a stacked deep bidirectional recurrent neural network applied to sequence profiles. To capture more long-range sequence information, a merging operator was proposed for combining bidirectional information from hidden nodes into outputs. Three types of merging operators were used in our improved model, with a long short-term memory network serving as the hidden computing node. The training database was constructed from 7361 proteins extracted from the PISCES server using a cut-off of 25% sequence identity. Sequence-derived features including the position-specific scoring matrix, physical properties, physicochemical characteristics, conservation score and protein coding were used to represent each residue. Using this method, predicted values of continuous relative solvent-accessible area were obtained, and these values were then transformed into binary states with predefined thresholds. Our experimental results showed that our deep learning method improved prediction quality relative to current methods, with mean absolute error and Pearson's correlation coefficient values of 8.8% and 74.8%, respectively, on the CB502 dataset, and 8.2% and 78%, respectively, on the Manesh215 dataset.

  12. Narrative Characteristics of Genocide Testimonies Predict Posttraumatic Stress Disorder Symptoms Years Later

    PubMed Central

    Ng, Lauren C.; Ahishakiye, Naphtal; Miller, Donald E.; Meyerowitz, Beth E.

    2015-01-01

    Cognitive theories of posttraumatic stress disorder (PTSD) suggest that trauma narratives that make greater use of somatosensory, perceptual, and negative emotion words may be indicators of greater risk of PTSD symptoms (Ehlers & Clark, 2000). The purpose of this study was to analyze whether the way that survivors of the 1994 Rwandan Genocide against the Tutsi naturally construct genocide testimonies predicts PTSD symptoms six years later. One hundred orphaned heads of household (OHH) who were members of a community association gave testimonies about their genocide experiences in 2002. In 2008, PTSD symptoms of 61 of the original OHH were assessed using a genocide specific version of the Impact of Events Scale-Revised (Weiss & Marmar, 2004). Experienced genocide events were coded from the genocide testimonies, and the types of words used in the testimonies were analyzed using the Linguistic Inquiry and Word Count program (Pennebaker, Chung, Ireland, Gonzales, & Booth, 2007). Pearson correlations and path analyses assessed the relationships between variables. After accounting for genocide events, touching positively predicted avoidance, and sadness negatively predicted hyperarousal. Sensory descriptions of traumatic experiences in trauma narratives may signify higher risk for mental health problems, while expressions of sadness may indicate emotional processing and better mental health. Analyzing genocide testimonies may help identify survivors at the highest risk of developing PTSD symptoms, even among a group of survivors who have arguably suffered some of the most severe genocide experiences. PMID:25793398

  13. Federal Logistics Information Systems. FLIS Procedures Manual. Document Identifier Code Input/Output Formats (Variable Length). Volume 9.

    DTIC Science & Technology

    1997-04-01

    DATA COLLABORATORS 0001N B NQ 8380 NUMBER OF DATA RECEIVERS 0001N B NQ 2533 AUTHORIZED ITEM IDENTIFICATION DATA COLLABORATOR CODE 0002X B 03 18 TD...01 NC 8268 DATA ELEMENT TERMINATOR CODE 0001X VT 9505 TYPE OF SCREENING CODE 0001A 01 NC 8268 DATA ELEMENT TERMINATOR CODE 0001X VT 4690 OUTPUT DATA... 9505 TYPE OF SCREENING CODE 0001A 2 89 2910 REFERENCE NUMBER CATEGORY CODE (RNCC) 0001X 2 89 4780 REFERENCE NUMBER VARIATION CODE (RNVC) 0001N 2 89

  14. [Neonatal Pearson syndrome: two case studies].

    PubMed

    Collin-Ducasse, H; Maillotte, A-M; Monpoux, F; Boutté, P; Ferrero-Vacher, C; Paquis, V

    2010-01-01

    Among the etiologies of anemia in the newborn, those related to mitochondrial cytopathies are rare. Pearson syndrome is mostly diagnosed during infancy and is characterized by refractory sideroblastic anemia with vacuolization of marrow progenitor cells and exocrine pancreatic dysfunction. We describe two cases of Pearson syndrome diagnosed in the early neonatal period following severe macrocytic aregenerative anemia. Bone marrow aspiration revealed sideroblastic anemia and vacuolization of erythroblastic precursors. The diagnosis was confirmed by genetic analysis revealing a deletion in the mitochondrial DNA. These two newborns received monthly transfusions. Five other newborns suffering from Pearson syndrome with various clinical symptoms were found in the literature. Pearson syndrome, rarely diagnosed in newborns, should be suspected in the presence of macrocytic aregenerative anemia; the work-up requires a bone marrow aspirate followed by genetic analysis of a blood sample. Copyright 2009 Elsevier Masson SAS. All rights reserved.

  15. Corneal endothelial dysfunction in Pearson syndrome.

    PubMed

    Kasbekar, Shivani A; Gonzalez-Martin, Jose A; Shafiq, Ayad E; Chandna, Arvind; Willoughby, Colin E

    2013-01-01

    Mitochondrial disorders are associated with well recognized ocular manifestations. Pearson syndrome is an often fatal, multisystem, mitochondrial disorder that causes variable bone marrow, hepatic, renal and pancreatic exocrine dysfunction. Phenotypic progression of ocular disease in a 12-year-old male with Pearson syndrome is described. This case illustrates phenotypic drift from Pearson syndrome to Kearns-Sayre syndrome given the patient's longevity. Persistent corneal endothelial failure was noted in addition to ptosis, chronic external ophthalmoplegia and mid-peripheral pigmentary retinopathy. We propose that corneal edema resulting from corneal endothelial metabolic pump failure occurs within a spectrum of mitochondrial disorders.

  16. Multi-stage decoding for multi-level block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    Various types of multistage decoding for multilevel block modulation codes, in which the decoding of a component code at each stage can be either soft-decision or hard-decision, maximum-likelihood or bounded-distance, are discussed. Error performance of the codes is analyzed for a memoryless additive channel based on various types of multistage decoding, and upper bounds on the probability of an incorrect decoding are derived. It was found that, if the component codes of a multilevel modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. The difference in performance between suboptimum multistage soft-decision maximum-likelihood decoding of a modulation code and single-stage optimum decoding of the overall code is very small: only a fraction of a dB loss in SNR at a block error probability of 10^-6. Multistage decoding of multilevel modulation codes thus offers a way to achieve the best of three worlds: bandwidth efficiency, coding gain, and decoding complexity.

  17. A new triclinic modification of the pyrochlore-type KOs{sub 2}O{sub 6} superconductor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katrych, S.; Gu, Q.F.; Bukowski, Z.

    2009-03-15

    A new modification of KOs{sub 2}O{sub 6}, the representative of a new structural type (Pearson symbol aP18, a=5.5668(1) A, b=6.4519(2) A, c=7.2356(2) A, {alpha}=65.377(3){sup o}, {beta}=70.572(3){sup o}, {gamma}=75.613(2){sup o}, space group P-1, no. 2), was synthesized employing a high-pressure technique. Its structure was determined by single-crystal X-ray diffraction. The structure can be described as two OsO{sub 6} octahedral chains related to each other through inversion and forming large voids with K atoms inside. Quantum chemical calculations were performed on the novel compound and the structurally related cubic compound. A high-pressure X-ray study showed that the cubic KOs{sub 2}O{sub 6} phase was stable up to 32.5(2) GPa at room temperature.

  18. A comparison of moment-based methods of estimation for the log Pearson type 3 distribution

    NASA Astrophysics Data System (ADS)

    Koutrouvelis, I. A.; Canavos, G. C.

    2000-06-01

    The log Pearson type 3 distribution is a very important model in statistical hydrology, especially for modeling annual flood series. In this paper we compare the various methods based on moments for estimating quantiles of this distribution. Besides the methods of direct and mixed moments which were found most successful in previous studies and the well-known indirect method of moments, we develop generalized direct moments and generalized mixed moments methods and a new method of adaptive mixed moments. The last method chooses the orders of two moments for the original observations by utilizing information contained in the sample itself. The results of Monte Carlo experiments demonstrated the superiority of this method in estimating flood events of high return periods when a large sample is available and in estimating flood events of low return periods regardless of the sample size. In addition, a comparison of simulation and asymptotic results shows that the adaptive method may be used for the construction of meaningful confidence intervals for design events based on the asymptotic theory even with small samples. The simulation results also point to the specific members of the class of generalized moments estimates which maintain small values for bias and/or mean square error.
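
    The indirect method of moments referred to above fits a Pearson type 3 distribution to the logarithms of the data by matching their first three sample moments. A minimal sketch (illustrative only, not the paper's implementation; the flow values are made up):

```python
import math

def lp3_moments_fit(flows):
    """Indirect method of moments: fit a Pearson type 3 distribution
    to the base-10 logarithms of the data (hypothetical helper)."""
    y = [math.log10(q) for q in flows]
    n = len(y)
    m = sum(y) / n
    s = math.sqrt(sum((v - m) ** 2 for v in y) / (n - 1))
    # Sample skewness with the usual small-sample factor n/((n-1)(n-2)).
    g = n / ((n - 1) * (n - 2)) * sum((v - m) ** 3 for v in y) / s ** 3
    alpha = 4.0 / g ** 2     # shape
    beta = s * g / 2.0       # scale (its sign follows the skew)
    tau = m - 2.0 * s / g    # location
    return (alpha, beta, tau), (m, s, g)
```

    By construction tau + alpha*beta equals the log-mean and alpha*beta**2 the log-variance, so the fitted distribution reproduces the first three sample moments of the logs.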

  19. Log Pearson type 3 quantile estimators with regional skew information and low outlier adjustments

    USGS Publications Warehouse

    Griffis, V.W.; Stedinger, Jery R.; Cohn, T.A.

    2004-01-01

    The recently developed expected moments algorithm (EMA) [Cohn et al., 1997] does as well as maximum likelihood estimations at estimating log‐Pearson type 3 (LP3) flood quantiles using systematic and historical flood information. Needed extensions include use of a regional skewness estimator and its precision to be consistent with Bulletin 17B. Another issue addressed by Bulletin 17B is the treatment of low outliers. A Monte Carlo study compares the performance of Bulletin 17B using the entire sample with and without regional skew with estimators that use regional skew and censor low outliers, including an extended EMA estimator, the conditional probability adjustment (CPA) from Bulletin 17B, and an estimator that uses probability plot regression (PPR) to compute substitute values for low outliers. Estimators that neglect regional skew information do much worse than estimators that use an informative regional skewness estimator. For LP3 data the low outlier rejection procedure generally results in no loss of overall accuracy, and the differences between the MSEs of the estimators that used an informative regional skew are generally modest in the skewness range of real interest. Samples contaminated to model actual flood data demonstrate that estimators which give special treatment to low outliers significantly outperform estimators that make no such adjustment.
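
    The combination of station and regional skew referred to above can be sketched as an inverse-MSE weighted average, in the style of Bulletin 17B (a simplified illustration; the 0.302 default is Bulletin 17B's nationwide map MSE for the generalized skew, and the example numbers below are invented):

```python
def weighted_skew(station_skew, station_mse, regional_skew, regional_mse=0.302):
    """Inverse-MSE weighting of station and regional (generalized) skew,
    in the style of Bulletin 17B. Each skew is weighted by the MSE of
    the other, so the less precise estimate gets the smaller weight."""
    return ((regional_mse * station_skew + station_mse * regional_skew)
            / (regional_mse + station_mse))
```

    With equal MSEs the result is the plain average of the two skews; as the station record shortens and its MSE grows, the weight shifts toward the regional value.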

  20. Log Pearson type 3 quantile estimators with regional skew information and low outlier adjustments

    NASA Astrophysics Data System (ADS)

    Griffis, V. W.; Stedinger, J. R.; Cohn, T. A.

    2004-07-01

    The recently developed expected moments algorithm (EMA) [Cohn et al., 1997] does as well as maximum likelihood estimations at estimating log-Pearson type 3 (LP3) flood quantiles using systematic and historical flood information. Needed extensions include use of a regional skewness estimator and its precision to be consistent with Bulletin 17B. Another issue addressed by Bulletin 17B is the treatment of low outliers. A Monte Carlo study compares the performance of Bulletin 17B using the entire sample with and without regional skew with estimators that use regional skew and censor low outliers, including an extended EMA estimator, the conditional probability adjustment (CPA) from Bulletin 17B, and an estimator that uses probability plot regression (PPR) to compute substitute values for low outliers. Estimators that neglect regional skew information do much worse than estimators that use an informative regional skewness estimator. For LP3 data the low outlier rejection procedure generally results in no loss of overall accuracy, and the differences between the MSEs of the estimators that used an informative regional skew are generally modest in the skewness range of real interest. Samples contaminated to model actual flood data demonstrate that estimators which give special treatment to low outliers significantly outperform estimators that make no such adjustment.

  1. MMPI--2 Code-Type Congruence of Injured Workers

    ERIC Educational Resources Information Center

    Livingston, Ronald B.; Jennings, Earl; Colotla, Victor A.; Reynolds, Cecil R.; Shercliffe, Regan J.

    2006-01-01

    In this study, the authors examined the stability of Minnesota Multiphasic Personality Inventory--2 (J. N. Butcher, W. G. Dahlstrom, J. R. Graham, A. Tellegen, & B. Kaemmer, 1989) code types in a sample of 94 injured workers with a mean test-retest interval of 21.3 months (SD = 14.1). Congruence rates for undefined code types were 34% for…

  2. Forest regeneration research at Fort Valley

    Treesearch

    L. J. (Pat) Heidmann

    2008-01-01

    When G. A. Pearson arrived at Fort Valley to establish the first Forest Service Experiment Station he found many open park-like stands similar to those in Figure 1. Within two years, Pearson had outlined the major factors detrimental to the establishment of ponderosa pine seedlings (Pearson 1910). During the next almost 40 years, he wrote many articles on methods of...

  3. Forest regeneration research (P-53)

    Treesearch

    Leroy J. (Pat) Heidmann

    2008-01-01

    When G. A. Pearson arrived at Fort Valley to establish the first Forest Service Experiment Station he found many open park-like stands similar to those in Figure 1. Within two years, Pearson had outlined the major factors detrimental to the establishment of ponderosa pine seedlings (Pearson 1910). During the next almost 40 years, he wrote many articles on methods of...

  4. 78 FR 25465 - Notice of Realty Action: Modified Competitive Auction of Public Lands in Lincoln County, NV

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... County Commission supports a request by Lee Pearson for a modified-competitive sale of the 26.39 acre parcels. Mr. Pearson presently resides and conducts a cattle ranching operation on the private land that... dislocation of existing users, the BLM authorized officer has determined Lee Pearson as the designated bidder...

  5. Genotyping and drug resistance patterns of M. tuberculosis strains in Pakistan

    PubMed Central

    Tanveer, Mahnaz; Hasan, Zahra; Siddiqui, Amna R; Ali, Asho; Kanji, Akbar; Ghebremicheal, Solomon; Hasan, Rumina

    2008-01-01

    Background The incidence of tuberculosis in Pakistan is 181/100,000 population. However, information about transmission and geographical prevalence of Mycobacterium tuberculosis strains and their evolutionary genetics as well as drug resistance remains limited. Our objective was to determine the clonal composition, evolutionary genetics and drug resistance of M. tuberculosis isolates from different regions of the country. Methods M. tuberculosis strains isolated (2003–2005) from specimens submitted to the laboratory through collection units nationwide were included. Drug susceptibility testing was performed and strains were spoligotyped. Results Of 926 M. tuberculosis strains studied, 721 (78%) were grouped into 59 "shared types", while 205 (22%) were identified as "Orphan" spoligotypes. Amongst the predominant genotypes, 61% were Central Asian strains (CAS; including CAS1, CAS sub-families and Orphan Pak clusters), 4% East African-Indian (EAI), 3% Beijing, 2% poorly defined TB strains (T), 2% Haarlem, and 0.2% LAM. TbD1 (M. tuberculosis specific deletion 1) analysis also confirmed that CAS1 was of "modern" origin while EAI isolates belonged to "ancestral" strain types. Prevalence of the CAS1 clade was significantly higher in Punjab (P < 0.01, Pearson's chi-square test) than in the Sindh, North West Frontier Province and Balochistan provinces. Forty-six percent of isolates were sensitive to the five first-line antibiotics tested, 45% were rifampicin resistant, and 50% isoniazid resistant. MDR was significantly associated with Beijing strains (P = 0.01, Pearson's chi-square test) and EAI (P = 0.001, Pearson's chi-square test), but not with the CAS family. Conclusion Our results show variation in prevalent M. tuberculosis strains, with a greater association of CAS1 with the Punjab province. The fact that the prevalent CAS genotype was not associated with drug resistance is encouraging. It further suggests that an effective treatment and control programme should be successful in reducing the tuberculosis burden in Pakistan. PMID:19108722

  6. Exploring Type and Amount of Parent Talk during Individualized Family Service Plan Meetings

    ERIC Educational Resources Information Center

    Ridgley, Robyn; Snyder, Patricia; McWilliam, R. A.

    2014-01-01

    We discuss the utility of a coding system designed to evaluate the amount and type of parent talk during individualized family service plan (IFSP) meetings. The iterative processes used to develop the "Parent Communication Coding System" (PCCS) and its associated codes are described. In addition, we explored whether PCCS codes could be…

  7. Information-theoretic approach to lead-lag effect on financial markets

    NASA Astrophysics Data System (ADS)

    Fiedor, Paweł

    2014-08-01

    Recently the interest of researchers has shifted from the analysis of synchronous relationships of financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis are concentrated mostly on Pearson's correlation coefficient and consequently intraday lead-lag relationships (where one of the variables in a pair is time-lagged) are also associated with them. Under the Efficient-Market Hypothesis such relationships are not possible as all information is embedded in the prices, but in real markets we find such dependencies. In this paper we analyse lead-lag relationships of financial instruments and extend known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but also can lead to a simpler procedure of statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only on an intraday level, but also for daily stock returns, which have usually been ignored.
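
    The contrast drawn above between Pearson's correlation coefficient and mutual information can be shown with a toy example (invented data, not the paper's NYSE 100 series): a noiseless quadratic dependence has zero Pearson correlation but large mutual information.

```python
import math
from collections import Counter

def pearson_r(x, y):
    # Plain Pearson product-moment correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def mutual_information(x, y):
    """Plug-in mutual information estimate (in bits) for discrete series."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

x = [-2, -1, 0, 1, 2] * 20
y = [v * v for v in x]   # deterministic but purely non-linear dependence
```

    Here pearson_r(x, y) is exactly 0 while mutual_information(x, y) is about 1.52 bits: the linear measure misses a dependence that the information-theoretic one captures.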

  8. 77 FR 66601 - Electronic Tariff Filings; Notice of Change to eTariff Type of Filing Codes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Tariff Filings; Notice of Change to eTariff Type of Filing Codes Take notice that, effective November 18, 2012, the list of available eTariff Type of Filing Codes (TOFC) will be modified to include a new TOFC... Energy's regulations. Tariff records included in such filings will be automatically accepted to be...

  9. "It was a young man's life": G. A. Pearson

    Treesearch

    Susan D. Olberding

    2008-01-01

    The nation's initial USFS research site commenced in a rustic cabin in the midst of northern Arizona's expansive ponderosa pine forest. Gustaf A. Pearson was the first in a distinguished line of USFS scientists to live and study there. A visitor to Fort Valley today often wishes he could have stood in Pearson's large boots (he was said to have enormous...

  10. Memories of Fort Valley from 1938 to 1942

    Treesearch

    Frank H. Wadsworth

    2008-01-01

    This delightful essay records Frank Wadsworth's early forestry career at FVEF in the late 1930s. Frank married Margaret Pearson, G.A. and May Pearson's daughter, in 1941. Pearson believed Frank could not continue to work for him because of nepotism rules, so Frank and Margaret moved to San Juan, Puerto Rico in 1942 where Frank continued his forestry career....

  11. An analysis of high-performing science students' preparation for collegiate science courses

    NASA Astrophysics Data System (ADS)

    Walter, Karen

    This mixed-method study surveyed first-year high-performing science students who had taken high-level courses such as International Baccalaureate (IB), Advanced Placement (AP), and honors science courses in high school to determine their perception of preparation for academic success at the collegiate level. The study surveyed 52 students from an honors college campus along with their professors. The students reported that they felt better prepared for academic success at the collegiate level by taking these courses in high school (p < .001). There was a significant negative correlation between perception of preparation and student GPA for honors science courses (n = 55, Pearson's r = -0.336), while AP courses (n = 47, Pearson's r = 0.0016) and IB courses (n = 17, Pearson's r = -0.2716) demonstrated no significant correlation between perception of preparation and GPA. Students reported various themes that helped or hindered their perception of academic success once at the collegiate level. The themes that reportedly helped students were preparedness, different types of learning, and teacher qualities. In post-hoc responses, students reported that more lab time, rigorous coursework, better teachers, and better study techniques helped prepare them for academic success at the collegiate level. Students further reported on qualities of teachers and teaching that helped foster their academic abilities at the collegiate level, including teacher knowledge, caring, teaching style, and expectations. Reasons for taking high-level science courses in high school included boosting GPA, college credit, challenge, and getting into better colleges.

  12. Multi-stage decoding for multi-level block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1991-01-01

    In this paper, we investigate various types of multi-stage decoding for multi-level block modulation codes, in which the decoding of a component code at each stage can be either soft-decision or hard-decision, maximum likelihood or bounded-distance. Error performance of codes is analyzed for a memoryless additive channel based on various types of multi-stage decoding, and upper bounds on the probability of an incorrect decoding are derived. Based on our study and computation results, we find that, if component codes of a multi-level modulation code and types of decoding at various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, we find that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum decoding of the overall code is very small: only a fraction of dB loss in SNR at the probability of an incorrect decoding for a block of 10(exp -6). Multi-stage decoding of multi-level modulation codes really offers a way to achieve the best of three worlds, bandwidth efficiency, coding gain, and decoding complexity.

  13. Evaluation of the influence of blood glucose level on oral candidal colonization in complete denture wearers with Type-II Diabetes Mellitus: An in vivo Study.

    PubMed

    Ganapathy, Dhanraj Muthuveera; Joseph, Sajeesh; Ariga, Padma; Selvaraj, Anand

    2013-01-01

    Candidal colonization in complete denture wearers is a commonly encountered condition that worsens in the presence of untreated Diabetes Mellitus. The aim of this study was to evaluate the correlation between oral candidiasis in denture-bearing mucosa and elevated blood glucose levels in complete denture wearers, and to evaluate the effect of oral hypoglycemic drug therapy in controlling oral candidal colonization in the denture-bearing mucosa of complete denture wearers with Type II Diabetes Mellitus. This prospective observational study involved 15 complete denture wearers with Type II Diabetes Mellitus. Samples were collected before and after the oral hypoglycemic drug intervention by swabbing the rugal surfaces of the palatal mucosa; the swabs were cultured, and the density of the candidal colonies formed was analyzed and interpreted as colony-forming units (CFU) per mL. The candidal CFU counts and the corresponding pre- and post-prandial blood glucose levels were estimated, analyzed and compared using Karl Pearson correlation analysis and the paired t-test (α = 0.05). The Karl Pearson correlation analysis showed a positive correlation between the blood glucose levels (PPS and FBS) and candidal colonization (CFU) (P < 0.05). The mean values of all the variables were analyzed using the paired t-test. There was a significant reduction in the mean blood glucose levels (P < 0.001) and the mean CFU values (P < 0.001) following oral hypoglycemic drug therapy. A positive correlation was observed between oral candidiasis in complete denture-bearing mucosa and elevated blood glucose levels, and oral hypoglycemic drug therapy had a positive effect in controlling oral candidal colonization in complete denture wearers with Type II Diabetes Mellitus.
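
    The two statistics used in the study above can be sketched in a few lines (an illustrative implementation with invented numbers, not the study's data):

```python
import math

def pearson_r(x, y):
    # Karl Pearson product-moment correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                           * sum((b - my) ** 2 for b in y))

def paired_t(before, after):
    """t statistic for paired samples (df = n - 1): mean difference
    divided by the standard error of the differences."""
    d = [a - b for a, b in zip(before, after)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((v - md) ** 2 for v in d) / (n - 1))
    return md / (sd / math.sqrt(n))
```

    For example, paired_t([200, 220, 240, 260], [180, 190, 200, 215]) is about 6.09; a large positive t indicates a reduction after the intervention (invented numbers).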

  14. Trend of Occupational Injuries/Diseases in Pakistan: Index Value Analysis of Injured Employed Persons from 2001-02 to 2012-13.

    PubMed

    Abbas, Mohsin

    2015-09-01

    The present study aimed to analyze the index value trends of injured employed persons (IEPs) covered in Pakistan Labour Force Surveys from 2001-02 to 2012-13. The index value method, based on reference years and reference groups, was used to analyze the IEP trends in terms of different criteria such as gender, area, employment status, industry type, occupational group, type of injury, injured body part, and treatment received. A Pearson correlation coefficient analysis was also performed to investigate the inter-relationship of the different occupational variables. The values of IEP increased over the studied years in industry divisions such as agriculture, forestry, hunting, and fishing, followed by the manufacturing and construction divisions. People in major occupations (such as skilled agricultural and fishery workers) and elementary (unskilled) occupations were found to be at an increasing risk of occupational injuries/diseases. Occupational injuries such as sprains or strains, superficial injuries, and dislocations increased during the studied years, and injuries to major body parts such as the upper and lower limbs also showed an increasing trend. The types of treatment received, including hospitalization and no treatment, were found to decrease. The Pearson correlation coefficient analysis of IEP by gender, area, treatment received, occupational group, and employment status suggests that the increase can be attributed to inadequate health care facilities, especially in rural areas. The increasing trend in IEP as a percentage of total employed persons due to agrarian activities shows that there is a need to improve health care setups in rural areas of Pakistan.

  15. Trend of Occupational Injuries/Diseases in Pakistan: Index Value Analysis of Injured Employed Persons from 2001–02 to 2012–13

    PubMed Central

    Abbas, Mohsin

    2015-01-01

    Background The present study aimed to analyze the index value trends of injured employed persons (IEPs) covered in Pakistan Labour Force Surveys from 2001–02 to 2012–13. Methods The index value method, based on reference years and reference groups, was used to analyze the IEP trends in terms of different criteria such as gender, area, employment status, industry type, occupational group, type of injury, injured body part, and treatment received. A Pearson correlation coefficient analysis was also performed to investigate the inter-relationship of the different occupational variables. Results The values of IEP increased over the studied years in industry divisions such as agriculture, forestry, hunting, and fishing, followed by the manufacturing and construction divisions. People in major occupations (such as skilled agricultural and fishery workers) and elementary (unskilled) occupations were found to be at an increasing risk of occupational injuries/diseases. Occupational injuries such as sprains or strains, superficial injuries, and dislocations increased during the studied years, and injuries to major body parts such as the upper and lower limbs also showed an increasing trend. The types of treatment received, including hospitalization and no treatment, were found to decrease. The Pearson correlation coefficient analysis of IEP by gender, area, treatment received, occupational group, and employment status suggests that the increase can be attributed to inadequate health care facilities, especially in rural areas. Conclusion The increasing trend in IEP as a percentage of total employed persons due to agrarian activities shows that there is a need to improve health care setups in rural areas of Pakistan. PMID:26929831

  16. Comparisons of two moments‐based estimators that utilize historical and paleoflood data for the log Pearson type III distribution

    USGS Publications Warehouse

    England, John F.; Salas, José D.; Jarrett, Robert D.

    2003-01-01

    The expected moments algorithm (EMA) [Cohn et al., 1997] and the Bulletin 17B [Interagency Committee on Water Data, 1982] historical weighting procedure (B17H) for the log Pearson type III distribution are compared by Monte Carlo computer simulation for cases in which historical and/or paleoflood data are available. The relative performance of the estimators was explored for three cases: fixed‐threshold exceedances, a fixed number of large floods, and floods generated from a different parent distribution. EMA can effectively incorporate four types of historical and paleoflood data: floods where the discharge is explicitly known, unknown discharges below a single threshold, floods with unknown discharge that exceed some level, and floods with discharges described in a range. The B17H estimator can utilize only the first two types of historical information. Including historical/paleoflood data in the simulation experiments significantly improved the quantile estimates in terms of mean square error and bias relative to using gage data alone. EMA performed significantly better than B17H in nearly all cases considered. B17H performed as well as EMA for estimating X100 in some limited fixed‐threshold exceedance cases. EMA performed comparatively much better in other fixed‐threshold situations, for the single large flood case, and in cases when estimating extreme floods equal to or greater than X500. B17H did not fully utilize historical information when the historical period exceeded 200 years. Robustness studies using GEV‐simulated data confirmed that EMA performed better than B17H. Overall, EMA is preferred to B17H when historical and paleoflood data are available for flood frequency analysis.

  17. Comparisons of two moments-based estimators that utilize historical and paleoflood data for the log Pearson type III distribution

    NASA Astrophysics Data System (ADS)

    England, John F.; Salas, José D.; Jarrett, Robert D.

    2003-09-01

    The expected moments algorithm (EMA) [Cohn et al., 1997] and the Bulletin 17B [Interagency Committee on Water Data, 1982] historical weighting procedure (B17H) for the log Pearson type III distribution are compared by Monte Carlo computer simulation for cases in which historical and/or paleoflood data are available. The relative performance of the estimators was explored for three cases: fixed-threshold exceedances, a fixed number of large floods, and floods generated from a different parent distribution. EMA can effectively incorporate four types of historical and paleoflood data: floods where the discharge is explicitly known, unknown discharges below a single threshold, floods with unknown discharge that exceed some level, and floods with discharges described in a range. The B17H estimator can utilize only the first two types of historical information. Including historical/paleoflood data in the simulation experiments significantly improved the quantile estimates in terms of mean square error and bias relative to using gage data alone. EMA performed significantly better than B17H in nearly all cases considered. B17H performed as well as EMA for estimating X100 in some limited fixed-threshold exceedance cases. EMA performed comparatively much better in other fixed-threshold situations, for the single large flood case, and in cases when estimating extreme floods equal to or greater than X500. B17H did not fully utilize historical information when the historical period exceeded 200 years. Robustness studies using GEV-simulated data confirmed that EMA performed better than B17H. Overall, EMA is preferred to B17H when historical and paleoflood data are available for flood frequency analysis.

  18. Circulating growth factors data associated with insulin secretagogue use in women with incident breast cancer.

    PubMed

    Wintrob, Zachary A P; Hammel, Jeffrey P; Nimako, George K; Gaile, Dan P; Forrest, Alan; Ceacareanu, Alice C

    2017-04-01

    Oral drugs stimulating insulin production may impact growth factor levels. The data presented shows the relationship between pre-existing insulin secretagogues use, growth factor profiles at the time of breast cancer diagnosis and subsequent cancer outcomes in women diagnosed with breast cancer and type 2 diabetes mellitus. A Pearson correlation analysis evaluating the relationship between growth factors stratified by diabetes pharmacotherapy and controls is also provided.

  19. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information

  20. Team interaction during surgery: a systematic review of communication coding schemes.

    PubMed

    Tiferes, Judith; Bisantz, Ann M; Guru, Khurshid A

    2015-05-15

    Communication problems have been systematically linked to human errors in surgery and a deep understanding of the underlying processes is essential. Although a number of tools exist to assess nontechnical skills, methods to study communication and other team-related processes are far from being standardized, making comparisons challenging. We conducted a systematic review to analyze methods used to study events in the operating room (OR) and to develop a synthesized coding scheme for OR team communication. Six electronic databases were accessed to search for articles that collected individual events during surgery and included detailed coding schemes. Additional articles were added based on cross-referencing. That collection was then classified based on type of events collected, environment type (real or simulated), number of procedures, type of surgical task, team characteristics, method of data collection, and coding scheme characteristics. All dimensions within each coding scheme were grouped based on emergent content similarity. Categories drawn from articles, which focused on communication events, were further analyzed and synthesized into one common coding scheme. A total of 34 of 949 articles met the inclusion criteria. The methodological characteristics and coding dimensions of the articles were summarized. A priori coding was used in nine studies. The synthesized coding scheme for OR communication included six dimensions as follows: information flow, period, statement type, topic, communication breakdown, and effects of communication breakdown. The coding scheme provides a standardized coding method for OR communication, which can be used to develop a priori codes for future studies especially in comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Multi-stage decoding of multi-level modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.

    1991-01-01

    Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and types of decoding at various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. Particularly, it is shown that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum soft-decision decoding of the code is very small, only a fraction of dB loss in signal to noise ratio at a bit error rate (BER) of 10(exp -6).

  2. Pearson's Functions to Describe FSW Weld Geometry

    NASA Astrophysics Data System (ADS)

    Lacombe, D.; Gutierrez-Orrantia, M. E.; Coupard, D.; Tcherniaeff, S.; Girot, F.

    2011-01-01

    Friction stir welding (FSW) is a relatively new joining technique, particularly for aluminium alloys that are difficult to fusion weld. In this study, the geometry of the weld has been investigated and modelled using Pearson's functions. It has been demonstrated that the Pearson's parameters (mean, standard deviation, skewness, kurtosis and geometric constant) can be used to characterize the weld geometry and the tensile strength of the weld assembly. Pearson's parameters and process parameters are strongly correlated, allowing a process control procedure to be defined for FSW assemblies that makes radiographic or ultrasonic inspection unnecessary. Finally, an optimisation using a Generalized Gradient Method allows the weld geometry that maximises the assembly tensile strength to be determined.
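
    As background on the Pearson system of curves invoked above (standard Pearson-system material, not taken from the paper): the family member is conventionally selected from the squared skewness beta1 and the kurtosis beta2 via Pearson's criterion kappa, sketched below.

```python
def pearson_kappa(beta1, beta2):
    """Pearson's criterion kappa from squared skewness (beta1) and
    kurtosis (beta2). Undefined when 2*beta2 - 3*beta1 - 6 == 0,
    e.g. for the normal distribution (beta1 = 0, beta2 = 3)."""
    return (beta1 * (beta2 + 3) ** 2
            / (4 * (4 * beta2 - 3 * beta1) * (2 * beta2 - 3 * beta1 - 6)))

def pearson_main_type(beta1, beta2):
    # Main types only; boundary values fall into transition families.
    k = pearson_kappa(beta1, beta2)
    if k < 0:
        return "I"
    if 0 < k < 1:
        return "IV"
    if k > 1:
        return "VI"
    return "transition"
```

    For example, beta1 = 1 and beta2 = 5 give kappa = 16/17, which selects a Type IV curve.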

  3. General practitioners’ justifications for therapeutic inertia in cardiovascular prevention: an empirically grounded typology

    PubMed Central

    Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Vaillant-Roussel, Hélène; Pouchain, Denis; Yaouanc, Virginie; Aubin-Auger, Isabelle; Mercier, Alain; Rusch, Emmanuel; Remmen, Roy; Vermeire, Etienne; Hendrickx, Kristin

    2016-01-01

    Objective To construct a typology of general practitioners’ (GPs) responses regarding their justification of therapeutic inertia in cardiovascular primary prevention for high-risk patients with hypertension. Design Empirically grounded construction of typology. Types were defined by attributes derived from the qualitative analysis of GPs’ reported reasons for inaction. Participants 256 GPs randomised in the intervention group of a cluster randomised controlled trial. Setting GPs members of 23 French Regional Colleges of Teachers in General Practice, included in the EffectS of a multifaceted intervention on CArdiovascular risk factors in high-risk hyPErtensive patients (ESCAPE) trial. Data collection and analysis The database consisted of 2638 written responses given by the GPs to an open-ended question asking for the reasons why drug treatment was not changed as suggested by the national guidelines. All answers were coded using constant comparison analysis. A matrix analysis of codes per GP allowed the construction of a response typology, where types were defined by codes as attributes. Initial coding and definition of types were performed independently by two teams. Results Initial coding resulted in a list of 69 codes in the final codebook, representing 4764 coded references in the question responses. A typology including seven types was constructed. 100 GPs were allocated to one and only one of these types, while 25 GPs did not provide enough data to allow classification. Types (numbers of GPs allocated) were: ‘optimists’ (28), ‘negotiators’ (20), ‘checkers’ (15), ‘contextualisers’ (13), ‘cautious’ (11), ‘rounders’ (8) and ‘scientists’ (5). For the 36 GPs that provided 50 or more coded references, analysis of the code evolution over time and across patients showed a consistent belonging to the initial type for any given GP. 
Conclusion This typology could provide GPs with some insight into their general ways of considering changes in the treatment/management of cardiovascular risk factors and guide design of specific physician-centred interventions to reduce inappropriate inaction. Trial registration number NCT00348855. PMID:27178974

  4. How accurate is the Pearson r-from-Z approximation? A Monte Carlo simulation study.

    PubMed

    Hittner, James B; May, Kim

    2012-01-01

    The Pearson r-from-Z approximation estimates the sample correlation (as an effect size measure) from the ratio of two quantities: the standard normal deviate equivalent (Z-score) corresponding to a one-tailed p-value divided by the square root of the total (pooled) sample size. The formula has utility in meta-analytic work when reports of research contain minimal statistical information. Although simple to implement, the accuracy of the Pearson r-from-Z approximation has not been empirically evaluated. To address this omission, we performed a series of Monte Carlo simulations. Results indicated that in some cases the formula did accurately estimate the sample correlation. However, when sample size was very small (N = 10) and effect sizes were small to small-moderate (ds of 0.1 and 0.3), the Pearson r-from-Z approximation was very inaccurate. Detailed figures that provide guidance as to when the Pearson r-from-Z formula will likely yield valid inferences are presented.
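The approximation as described in the abstract is simple enough to state in a few lines of Python; this is a minimal sketch (the function name and the use of the standard library's `NormalDist` are my own choices, not the authors'):

```python
from math import sqrt
from statistics import NormalDist

def r_from_z(p_one_tailed, n_total):
    """Pearson r-from-Z approximation: the standard normal deviate (Z)
    corresponding to a one-tailed p-value, divided by the square root
    of the total (pooled) sample size."""
    z = NormalDist().inv_cdf(1.0 - p_one_tailed)
    return z / sqrt(n_total)

# One-tailed p = 0.05 with N = 100: z is about 1.645, so r is about 0.16.
print(round(r_from_z(0.05, 100), 3))
```

As the simulations in the abstract warn, such an estimate degrades badly for very small N and small effect sizes.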

  5. The Large Space Structures Technology Program

    DTIC Science & Technology

    1992-04-01

    Organization and Plan--The LSSTP was initiated in July 1985. It was conceived by Jerome Pearson, who, as leader of the Vibration Group, was responsible... Jerome Pearson was named project manager and Terry Hertz from the Analysis and Optimization Branch was his deputy. The technical disciplines and the...continued until the end of 1990. The LSSTP was originally managed by Jerome Pearson, in addition to his responsibilities as Vibration Group leader. Terry

  6. "It was a young man's life": G.A. Pearson (P-53)

    Treesearch

    Susan D. Olberding

    2008-01-01

    The nation's initial USFS research site commenced in a rustic cabin in the midst of northern Arizona’s expansive ponderosa pine forest. Gustaf A. Pearson was the first in a distinguished line of USFS scientists to live and study there. A visitor to Fort Valley today often wishes he could have stood in Pearson's large boots (he was said to have enormous feet)...

  7. Analytic posteriors for Pearson's correlation coefficient.

    PubMed

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  8. Memories of Fort Valley From 1938 to 1942 (P-53)

    Treesearch

    Frank H. Wadsworth

    2008-01-01

    This delightful essay records Frank Wadsworth’s early forestry career at FVEF in the late 1930s. Frank married Margaret Pearson, G.A. and May Pearson’s daughter, in 1941. Pearson believed Frank could not continue to work for him because of nepotism rules, so Frank and Margaret moved to San Juan, Puerto Rico in 1942 where Frank continued his forestry career. His...

  9. Thermodynamic Analysis of the Combustion of Metallic Materials

    NASA Technical Reports Server (NTRS)

    Wilson, D. Bruce; Stoltzfus, Joel M.

    2000-01-01

Two types of computer codes are available to assist in the thermodynamic analysis of metallic materials combustion. One type of code calculates phase equilibrium data and is represented by CALPHAD. The other type calculates chemical reaction equilibria and is represented by the Gordon-McBride code. The first has seen significant application for alloy-phase diagrams, but only recently has it been considered for oxidation systems. The Gordon-McBride code has been applied to the combustion of metallic materials. Both codes are limited by their treatment of non-ideal solutions and by the fact that they treat volatile and gaseous species as ideal. This paper examines the significance of these limitations for the combustion of metallic materials. In addition, the applicability of linear free-energy relationships for solid-phase oxidation, and their possible extension to liquid-phase systems, is examined.

  10. Comparison between air pollution concentrations measured at the nearest monitoring station to the delivery hospital and those measured at stations nearest the residential postal code regions of pregnant women in Fukuoka.

    PubMed

    Michikawa, Takehiro; Morokuma, Seiichi; Nitta, Hiroshi; Kato, Kiyoko; Yamazaki, Shin

    2017-06-13

Numerous earlier studies examining the association of air pollution with maternal and foetal health estimated maternal exposure to air pollutants based on the women's residential addresses. However, residential addresses, which are personally identifiable information, are not always obtainable. Since a majority of pregnant women reside near their delivery hospitals, the concentrations of air pollutants at the respective delivery hospitals may be surrogate markers of pollutant exposure at home. We compared air pollutant concentrations measured at the monitoring station nearest to Kyushu University Hospital with those measured at the monitoring stations closest to the respective residential postal code regions of pregnant women in Fukuoka. Aggregated postal code data for the home addresses of pregnant women who delivered at Kyushu University Hospital in 2014 were obtained from the hospital. For each of the study's 695 women who resided in Fukuoka Prefecture, we assigned pollutant concentrations measured at the monitoring station nearest to the hospital and pollutant concentrations measured at the monitoring station nearest to their respective residential postal code regions. Among the 695 women, 584 (84.0%) resided in the proximity of either the monitoring station nearest to the hospital or one of the four other stations (the nearest stations to their respective residential postal code regions) in Fukuoka city. Pearson's correlation for daily mean concentrations among the monitoring stations in Fukuoka city was strong for fine particulate matter (PM 2.5), suspended particulate matter (SPM), and photochemical oxidants (Ox) (coefficients ≥0.9), but moderate for coarse particulate matter (obtained by subtracting the PM 2.5 from the SPM concentrations), nitrogen dioxide, and sulphur dioxide. Hospital-based and residence-based concentrations of PM 2.5, SPM, and Ox were comparable.
For PM 2.5 , SPM, and Ox, exposure estimation based on the delivery hospital is likely to approximate that based on the home of pregnant women.

  11. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Baier, W.G.

    1997-01-01

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
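The iterative procedure described above can be sketched in Python. A loud caveat: this illustration fits a two-parameter normal distribution (think log-transformed flows) rather than the paper's three-parameter log-Pearson type III, so the upper-truncated-normal formulas below stand in for the exact expected-moment expressions; `ema_fit`, its arguments, and the sample data are all hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def ema_fit(systematic, historical_peaks, n_below, threshold, tol=1e-9):
    """Sketch of the expected moments algorithm (EMA) iteration,
    simplified to a two-parameter normal in log space.  `n_below` is the
    number of historical-period years known to have stayed below
    `threshold`."""
    n_sys = len(systematic)
    n_total = n_sys + len(historical_peaks) + n_below
    # Step 1: initial parameter estimates from the systematic record alone.
    mu = sum(systematic) / n_sys
    var = sum((q - mu) ** 2 for q in systematic) / n_sys
    for _ in range(200):
        # Step 2: expected moments of the below-threshold historical years
        # under the current fit (upper-truncated normal formulas).
        sigma = sqrt(var)
        beta = (threshold - mu) / sigma
        lam = NormalDist().pdf(beta) / NormalDist().cdf(beta)
        e1 = mu - sigma * lam                             # E[X | X < T]
        e2 = var * (1 - beta * lam - lam ** 2) + e1 ** 2  # E[X^2 | X < T]
        # Step 3: update the moments with observed + expected contributions.
        new_mu = (sum(systematic) + sum(historical_peaks)
                  + n_below * e1) / n_total
        new_var = ((sum(q * q for q in systematic)
                    + sum(q * q for q in historical_peaks)
                    + n_below * e2) / n_total - new_mu ** 2)
        done = abs(new_mu - mu) < tol and abs(new_var - var) < tol
        mu, var = new_mu, new_var
        if done:
            break  # Step 4: repeat until the algorithm converges.
    return mu, sqrt(var)

# Hypothetical log-flow data: 8 gauged years, plus one historical peak
# known to be the only exceedance of threshold 3.0 in 91 historical years.
mu, sigma = ema_fit([2.0, 2.3, 1.9, 2.6, 2.2, 2.4, 2.1, 2.0],
                    historical_peaks=[3.1], n_below=90, threshold=3.0)
```

The 90 below-threshold years enter the moment sums only through their expected values under the current fit, which is exactly what lets EMA use historical information without knowing those years' flows.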

  12. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    NASA Astrophysics Data System (ADS)

    Cohn, T. A.; Lane, W. L.; Baier, W. G.

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.

  13. Zero Pearson coefficient for strongly correlated growing trees

    NASA Astrophysics Data System (ADS)

    Dorogovtsev, S. N.; Ferreira, A. L.; Goltsev, A. V.; Mendes, J. F. F.

    2010-03-01

We obtained Pearson's coefficient of strongly correlated recursive networks growing by preferential attachment of every new vertex by m edges. We found that the Pearson coefficient is exactly zero in the infinite network limit for the recursive trees (m = 1). If the number of connections of new vertices exceeds one (m > 1), then the Pearson coefficient in the infinite networks equals zero only when the degree distribution exponent γ does not exceed 4. We calculated the Pearson coefficient for finite networks and observed a slow power-law-like approach to the infinite network limit. Our findings indicate that Pearson's coefficient strongly depends on size and details of networks, which makes this characteristic virtually useless for quantitative comparison of different networks.
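The quantity being studied, the Pearson (assortativity) coefficient of a preferential-attachment tree, can be simulated in a few lines. This is a generic sketch, not the authors' code; the growth routine, the finite size n = 5000, and the seed are illustrative choices.

```python
import random

def preferential_tree(n, seed=0):
    """Grow a recursive tree: each new vertex attaches by m = 1 edge to
    an existing vertex chosen with probability proportional to degree."""
    random.seed(seed)
    edges = [(0, 1)]
    stubs = [0, 1]  # each vertex listed once per incident edge
    for new in range(2, n):
        target = random.choice(stubs)   # degree-proportional choice
        edges.append((target, new))
        stubs += [target, new]
    return edges

def degree_pearson(edges):
    """Pearson correlation of the degrees at the two ends of every edge
    (the assortativity coefficient studied in the paper)."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # Symmetrize: count each edge in both directions.
    xs = [deg[u] for u, v in edges] + [deg[v] for u, v in edges]
    ys = [deg[v] for u, v in edges] + [deg[u] for u, v in edges]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = degree_pearson(preferential_tree(5000))  # finite-size value only
```

Per the abstract, the finite-size value drifts toward zero only slowly with n, which is why a single simulated network of modest size says little on its own.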

  14. Strengths and limitations of the NATALI code for aerosol typing from multiwavelength Raman lidar observations

    NASA Astrophysics Data System (ADS)

    Nicolae, Doina; Talianu, Camelia; Vasilescu, Jeni; Nicolae, Victor; Stachlewska, Iwona S.

    2018-04-01

A Python code was developed to automatically retrieve the aerosol type (and its predominant component in the mixture) from EARLINET's 3 backscatter and 2 extinction data. The typing relies on artificial neural networks trained to identify the most probable aerosol type from a set of mean-layer intensive optical parameters. This paper presents the use and limitations of the code with respect to the quality of the input lidar profiles, as well as the assumptions made in the aerosol model.

  15. Synthesis, Crystal and Electronic Structures of the Pnictides AE 3TrPn 3 (AE = Sr, Ba; Tr = Al, Ga; Pn = P, As)

    DOE PAGES

    Stoyko, Stanislav; Voss, Leonard; He, Hua; ...

    2015-09-24

New ternary arsenides AE 3TrAs 3 (AE = Sr, Ba; Tr = Al, Ga) and their phosphide analogs Sr 3GaP 3 and Ba 3AlP 3 have been prepared by reactions of the respective elements at high temperatures. Single-crystal X-ray diffraction studies reveal that Sr 3AlAs 3 and Ba 3AlAs 3 adopt the Ba 3AlSb 3-type structure (Pearson symbol oC56, space group Cmce, Z = 8). This structure is also realized for Sr 3GaP 3 and Ba 3AlP 3. Likewise, the compounds Sr 3GaAs 3 and Ba 3GaAs 3 crystallize with the Ba 3GaSb 3-type structure (Pearson symbol oP56, space group Pnma, Z = 8). Both structures are made up of isolated pairs of edge-shared AlPn 4 and GaPn 4 tetrahedra (Pn = pnictogen, i.e., P or As), separated by the alkaline-earth Sr 2+ and Ba 2+ cations. In both cases, there are no homoatomic bonds; hence, regardless of the slightly different atomic arrangements, both structures can be rationalized as valence-precise [AE 2+] 3[Tr 3+][Pn 3-] 3, or rather [AE 2+] 6[Tr 2Pn 6] 12-, i.e., as Zintl phases.

  16. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

Considering the dependent relationship among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of hindcast data for wave height, wind speed, and current velocity in the Bohai Sea are sampled for a case study. Four distributions, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions, are candidate models for the marginal distributions of wave height, wind speed, and current velocity; the Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models, and platform responses (base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
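As a sketch of how one of the named copulas couples two variables, the bivariate Clayton copula can be sampled by the standard conditional-inverse method. This is a generic illustration, not the paper's fitting procedure; the pairing of the two uniforms with wave height and wind speed, and the value theta = 2, are hypothetical.

```python
import random

def clayton_sample(theta, n, seed=42):
    """Draw n pairs (u, v) from a bivariate Clayton copula by the
    conditional-inverse method; u and v would be the uniform marginal
    ranks of, e.g., wave height and wind speed (hypothetical pairing)."""
    random.seed(seed)
    pairs = []
    for _ in range(n):
        u, w = random.random(), random.random()
        # Invert the conditional distribution C(v | u) at the uniform w.
        v = ((w ** (-theta / (1.0 + theta)) - 1.0)
             * u ** (-theta) + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

def kendall_tau_clayton(theta):
    """Closed-form Kendall's tau of the Clayton copula: theta / (theta + 2)."""
    return theta / (theta + 2.0)

# theta = 2 implies tau = 0.5 -- strong lower-tail dependence, the kind
# relevant when extremes of two metocean variables tend to co-occur.
pairs = clayton_sample(theta=2.0, n=2000)
```

Feeding such copula samples through the fitted Pearson Type III marginal quantile functions would then produce dependent (wave, wind) pairs in physical units.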

  17. 48 CFR 246.710-70 - Warranty attachment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Enterprise Identifier Code Type 0-9—GS1 Company Prefix. D—CAGE. LB—ATIS-0322000. LH—EHIBCC. RH—HIBCC. UN—DUNS... Guarantor Enterprise Identifier Code Type 0-9—GS1 Company Prefix. D—CAGE. LB—ATIS-0322000. LH—EHIBCC. RH... returns Name ** Address line 1 ** Address line 2 ** City/county ** State/province ** Postal code...

  18. Building codes : obstacle or opportunity?

    Treesearch

    Alberto Goetzl; David B. McKeever

    1999-01-01

    Building codes are critically important in the use of wood products for construction. The codes contain regulations that are prescriptive or performance related for various kinds of buildings and construction types. A prescriptive standard might dictate that a particular type of material be used in a given application. A performance standard requires that a particular...

  19. Color coding of control room displays: the psychocartography of visual layering effects.

    PubMed

    Van Laar, Darren; Deshe, Ofer

    2007-06-01

    To evaluate which of three color coding methods (monochrome, maximally discriminable, and visual layering) used to code four types of control room display format (bars, tables, trend, mimic) was superior in two classes of task (search, compare). It has recently been shown that color coding of visual layers, as used in cartography, may be used to color code any type of information display, but this has yet to be fully evaluated. Twenty-four people took part in a 2 (task) x 3 (coding method) x 4 (format) wholly repeated measures design. The dependent variables assessed were target location reaction time, error rates, workload, and subjective feedback. Overall, the visual layers coding method produced significantly faster reaction times than did the maximally discriminable and the monochrome methods for both the search and compare tasks. No significant difference in errors was observed between conditions for either task type. Significantly less perceived workload was experienced with the visual layers coding method, which was also rated more highly than the other coding methods on a 14-item visual display quality questionnaire. The visual layers coding method is superior to other color coding methods for control room displays when the method supports the user's task. The visual layers color coding method has wide applicability to the design of all complex information displays utilizing color coding, from the most maplike (e.g., air traffic control) to the most abstract (e.g., abstracted ecological display).

  20. Motion artifact detection and correction in functional near-infrared spectroscopy: a new hybrid method based on spline interpolation method and Savitzky-Golay filtering.

    PubMed

    Jahani, Sahar; Setarehdan, Seyed K; Boas, David A; Yücel, Meryem A

    2018-01-01

    Motion artifact contamination in near-infrared spectroscopy (NIRS) data has become an important challenge in realizing the full potential of NIRS for real-life applications. Various motion correction algorithms have been used to alleviate the effect of motion artifacts on the estimation of the hemodynamic response function. While smoothing methods, such as wavelet filtering, are excellent in removing motion-induced sharp spikes, the baseline shifts in the signal remain after this type of filtering. Methods, such as spline interpolation, on the other hand, can properly correct baseline shifts; however, they leave residual high-frequency spikes. We propose a hybrid method that takes advantage of different correction algorithms. This method first identifies the baseline shifts and corrects them using a spline interpolation method or targeted principal component analysis. The remaining spikes, on the other hand, are corrected by smoothing methods: Savitzky-Golay (SG) filtering or robust locally weighted regression and smoothing. We have compared our new approach with the existing correction algorithms in terms of hemodynamic response function estimation using the following metrics: mean-squared error, peak-to-peak error ([Formula: see text]), Pearson's correlation ([Formula: see text]), and the area under the receiver operator characteristic curve. We found that spline-SG hybrid method provides reasonable improvements in all these metrics with a relatively short computational time. The dataset and the code used in this study are made available online for the use of all interested researchers.
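The Savitzky-Golay step of the hybrid can be illustrated with the classic 5-point quadratic smoothing kernel. This is a generic SG sketch, not the authors' implementation (which additionally corrects baseline shifts via spline interpolation or targeted PCA before smoothing); the impulse "signal" is a made-up example.

```python
def savgol5(y):
    """5-point quadratic Savitzky-Golay smoothing using the classic
    convolution coefficients [-3, 12, 17, 12, -3] / 35; the two samples
    at each edge are left unsmoothed for simplicity."""
    c = [-3, 12, 17, 12, -3]
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(ci * y[i + k - 2] for k, ci in enumerate(c)) / 35.0
    return out

signal = [0.0] * 11
signal[5] = 1.0  # a residual sharp spike left after baseline correction
smoothed = savgol5(signal)  # spike attenuated to 17/35 of its height
```

This division of labor, spline-type methods for baseline shifts and SG-type smoothing for residual spikes, is precisely the rationale for the hybrid method in the abstract.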

  1. Narrative characteristics of genocide testimonies predict posttraumatic stress disorder symptoms years later.

    PubMed

    Ng, Lauren C; Ahishakiye, Naphtal; Miller, Donald E; Meyerowitz, Beth E

    2015-05-01

Cognitive theories of posttraumatic stress disorder (PTSD) suggest that trauma narratives that make greater use of somatosensory, perceptual, and negative emotion words may be indicators of greater risk of PTSD symptoms (Ehlers & Clark, 2000). The purpose of this study was to analyze whether the way that survivors of the 1994 Rwandan Genocide against the Tutsi naturally construct genocide testimonies predicts PTSD symptoms 6 years later. One hundred orphaned heads of household (OHH) who were members of a community association gave testimonies about their genocide experiences in 2002. In 2008, PTSD symptoms of 61 of the original OHH were assessed using a genocide-specific version of the Impact of Events Scale-Revised (Weiss & Marmar, 1997). Experienced genocide events were coded from the genocide testimonies, and the types of words used in the testimonies were analyzed using the Linguistic Inquiry and Word Count program (Pennebaker, Chung, Ireland, Gonzales, & Booth, 2007). Pearson correlations and path analyses assessed the relationships between variables. After accounting for genocide events, touching positively predicted avoidance, and sadness negatively predicted hyperarousal. Sensory descriptions of traumatic experiences in trauma narratives may signify higher risk for mental health problems, whereas expressions of sadness may indicate emotional processing and better mental health. Analyzing genocide testimonies may help identify survivors at the highest risk of developing PTSD symptoms, even among a group of survivors who have arguably suffered some of the most severe genocide experiences. (c) 2015 APA, all rights reserved.

  2. 75 FR 15726 - Polyvinyl Alcohol From Taiwan; Determination

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

... Vice Chairman Pearson and Commissioners Okun and Lane dissented, having determined that there is no... remand, Vice Chairman Pearson and Commissioners Okun and Lane reaffirmed their negative preliminary...

  3. General practitioners' justifications for therapeutic inertia in cardiovascular prevention: an empirically grounded typology.

    PubMed

    Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Vaillant-Roussel, Hélène; Pouchain, Denis; Yaouanc, Virginie; Aubin-Auger, Isabelle; Mercier, Alain; Rusch, Emmanuel; Remmen, Roy; Vermeire, Etienne; Hendrickx, Kristin

    2016-05-13

    To construct a typology of general practitioners' (GPs) responses regarding their justification of therapeutic inertia in cardiovascular primary prevention for high-risk patients with hypertension. Empirically grounded construction of typology. Types were defined by attributes derived from the qualitative analysis of GPs' reported reasons for inaction. 256 GPs randomised in the intervention group of a cluster randomised controlled trial. GPs members of 23 French Regional Colleges of Teachers in General Practice, included in the EffectS of a multifaceted intervention on CArdiovascular risk factors in high-risk hyPErtensive patients (ESCAPE) trial. The database consisted of 2638 written responses given by the GPs to an open-ended question asking for the reasons why drug treatment was not changed as suggested by the national guidelines. All answers were coded using constant comparison analysis. A matrix analysis of codes per GP allowed the construction of a response typology, where types were defined by codes as attributes. Initial coding and definition of types were performed independently by two teams. Initial coding resulted in a list of 69 codes in the final codebook, representing 4764 coded references in the question responses. A typology including seven types was constructed. 100 GPs were allocated to one and only one of these types, while 25 GPs did not provide enough data to allow classification. Types (numbers of GPs allocated) were: 'optimists' (28), 'negotiators' (20), 'checkers' (15), 'contextualisers' (13), 'cautious' (11), 'rounders' (8) and 'scientists' (5). For the 36 GPs that provided 50 or more coded references, analysis of the code evolution over time and across patients showed a consistent belonging to the initial type for any given GP. 
This typology could provide GPs with some insight into their general ways of considering changes in the treatment/management of cardiovascular risk factors and guide design of specific physician-centred interventions to reduce inappropriate inaction. NCT00348855. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  4. Measuring User Similarity Using Electric Circuit Analysis: Application to Collaborative Filtering

    PubMed Central

    Yang, Joonhyuk; Kim, Jinwook; Kim, Wonjoon; Kim, Young Hwan

    2012-01-01

We propose a new technique of measuring user similarity in collaborative filtering using electric circuit analysis. Electric circuit analysis is used to measure the potential differences between nodes on an electric circuit. In this paper, by applying this method to transaction networks comprising users and items, i.e., the user–item matrix, and by using the full information about the relationship structure of users from the perspective of item adoption, we overcome the limitations of one-to-one similarity calculation approaches, such as the Pearson correlation, Tanimoto coefficient, and Hamming distance, in collaborative filtering. We found that electric circuit analysis can be successfully incorporated into recommender systems and has the potential to significantly enhance predictability, especially when combined with user-based collaborative filtering. We also propose four types of hybrid algorithms that combine the Pearson correlation method and electric circuit analysis. One of the algorithms exceeds the performance of traditional collaborative filtering by 37.5% at most. This work opens new opportunities for interdisciplinary research between physics and computer science and the development of new recommendation systems. PMID:23145095

  5. Measuring user similarity using electric circuit analysis: application to collaborative filtering.

    PubMed

    Yang, Joonhyuk; Kim, Jinwook; Kim, Wonjoon; Kim, Young Hwan

    2012-01-01

We propose a new technique of measuring user similarity in collaborative filtering using electric circuit analysis. Electric circuit analysis is used to measure the potential differences between nodes on an electric circuit. In this paper, by applying this method to transaction networks comprising users and items, i.e., the user-item matrix, and by using the full information about the relationship structure of users from the perspective of item adoption, we overcome the limitations of one-to-one similarity calculation approaches, such as the Pearson correlation, Tanimoto coefficient, and Hamming distance, in collaborative filtering. We found that electric circuit analysis can be successfully incorporated into recommender systems and has the potential to significantly enhance predictability, especially when combined with user-based collaborative filtering. We also propose four types of hybrid algorithms that combine the Pearson correlation method and electric circuit analysis. One of the algorithms exceeds the performance of traditional collaborative filtering by 37.5% at most. This work opens new opportunities for interdisciplinary research between physics and computer science and the development of new recommendation systems.
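The one-to-one Pearson baseline that the circuit-analysis method is compared against can be sketched as follows; the rating dictionaries and item names are hypothetical, and real recommenders add significance weighting and other refinements.

```python
def pearson_similarity(ratings_a, ratings_b):
    """Classic one-to-one Pearson similarity between two users, computed
    over their co-rated items only -- the baseline approach whose
    limitations the paper's circuit-analysis method addresses."""
    common = set(ratings_a) & set(ratings_b)
    if len(common) < 2:
        return 0.0
    xs = [ratings_a[i] for i in common]
    ys = [ratings_b[i] for i in common]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0.0 or sy == 0.0:
        return 0.0
    return cov / (sx * sy)

a = {"item1": 5, "item2": 3, "item3": 4}
b = {"item1": 4, "item2": 2, "item3": 5}
print(round(pearson_similarity(a, b), 3))  # -> 0.655
```

Because this score uses only the two users' co-rated items, it ignores the rest of the adoption network, which is exactly the information the electric-circuit formulation brings in.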

  6. Method for stationarity-segmentation of spike train data with application to the Pearson cross-correlation.

    PubMed

    Quiroga-Lombard, Claudio S; Hass, Joachim; Durstewitz, Daniel

    2013-07-01

    Correlations among neurons are supposed to play an important role in computation and information coding in the nervous system. Empirically, functional interactions between neurons are most commonly assessed by cross-correlation functions. Recent studies have suggested that pairwise correlations may indeed be sufficient to capture most of the information present in neural interactions. Many applications of correlation functions, however, implicitly tend to assume that the underlying processes are stationary. This assumption will usually fail for real neurons recorded in vivo since their activity during behavioral tasks is heavily influenced by stimulus-, movement-, or cognition-related processes as well as by more general processes like slow oscillations or changes in state of alertness. To address the problem of nonstationarity, we introduce a method for assessing stationarity empirically and then "slicing" spike trains into stationary segments according to the statistical definition of weak-sense stationarity. We examine pairwise Pearson cross-correlations (PCCs) under both stationary and nonstationary conditions and identify another source of covariance that can be differentiated from the covariance of the spike times and emerges as a consequence of residual nonstationarities after the slicing process: the covariance of the firing rates defined on each segment. Based on this, a correction of the PCC is introduced that accounts for the effect of segmentation. We probe these methods both on simulated data sets and on in vivo recordings from the prefrontal cortex of behaving rats. Rather than for removing nonstationarities, the present method may also be used for detecting significant events in spike trains.

  7. IntNetLncSim: an integrative network analysis method to infer human lncRNA functional similarity

    PubMed Central

    Hu, Yang; Yang, Haixiu; Zhou, Chen; Sun, Jie; Zhou, Meng

    2016-01-01

    Increasing evidence indicates that long non-coding RNAs (lncRNAs) are involved in various biological processes and complex diseases through their communication with mRNAs/miRNAs. Exploiting interactions between lncRNAs and mRNAs/miRNAs to infer lncRNA functional similarity (LFS) is an effective way to explore the function of lncRNAs and predict novel lncRNA-disease associations. In this article, we proposed an integrative framework, IntNetLncSim, to infer LFS by modeling the information flow in an integrated network that comprises both lncRNA-related transcriptional and post-transcriptional information. The performance of IntNetLncSim was evaluated by investigating the relationship of LFS with the similarity of lncRNA-related mRNA sets (LmRSets) and miRNA sets (LmiRSets). As a result, LFS by IntNetLncSim was significantly positively correlated with the LmRSet (Pearson correlation r²=0.8424) and LmiRSet (Pearson correlation r²=0.2601). In particular, the performance of IntNetLncSim is superior to several previous methods. When applying the LFS to identify novel lncRNA-disease relationships, we achieved an area under the ROC curve of 0.7300 for experimentally verified lncRNA-disease associations based on leave-one-out cross-validation. Furthermore, highly ranked lncRNA-disease associations confirmed by literature mining demonstrated the excellent performance of IntNetLncSim. Finally, a web-accessible system was provided for querying LFS and potential lncRNA-disease relationships: http://www.bio-bigdata.com/IntNetLncSim. PMID:27323856

  8. IntNetLncSim: an integrative network analysis method to infer human lncRNA functional similarity.

    PubMed

    Cheng, Liang; Shi, Hongbo; Wang, Zhenzhen; Hu, Yang; Yang, Haixiu; Zhou, Chen; Sun, Jie; Zhou, Meng

    2016-07-26

    Increasing evidence indicates that long non-coding RNAs (lncRNAs) are involved in various biological processes and complex diseases through their communication with mRNAs/miRNAs. Exploiting interactions between lncRNAs and mRNAs/miRNAs to infer lncRNA functional similarity (LFS) is an effective way to explore the function of lncRNAs and predict novel lncRNA-disease associations. In this article, we proposed an integrative framework, IntNetLncSim, to infer LFS by modeling the information flow in an integrated network that comprises both lncRNA-related transcriptional and post-transcriptional information. The performance of IntNetLncSim was evaluated by investigating the relationship of LFS with the similarity of lncRNA-related mRNA sets (LmRSets) and miRNA sets (LmiRSets). As a result, LFS by IntNetLncSim was significantly positively correlated with the LmRSet (Pearson correlation r²=0.8424) and LmiRSet (Pearson correlation r²=0.2601). In particular, the performance of IntNetLncSim is superior to several previous methods. When applying the LFS to identify novel lncRNA-disease relationships, we achieved an area under the ROC curve of 0.7300 for experimentally verified lncRNA-disease associations based on leave-one-out cross-validation. Furthermore, highly ranked lncRNA-disease associations confirmed by literature mining demonstrated the excellent performance of IntNetLncSim. Finally, a web-accessible system was provided for querying LFS and potential lncRNA-disease relationships: http://www.bio-bigdata.com/IntNetLncSim.

  9. ACSNSQIP Risk Calculator in Indian Patients Undergoing Surgery for Head and Neck Cancers: Is It Valid?

    PubMed

    Subramaniam, Narayana; Balasubramanian, Deepak; Rka, Pradeep; Murthy, Samskruthi; Rathod, Priyank; Vidhyadharan, Sivakumar; Thankappan, Krishnakumar; Iyer, Subramania

    2018-06-01

    Pre-operative assessment is vital to determine patient-specific risks and minimize them in order to optimize surgical outcomes. The American College of Surgeons National Surgical Quality Improvement Program (ACSNSQIP) Surgical Risk Calculator is the most comprehensive surgical risk assessment tool available. We performed this study to determine the validity of the ACSNSQIP calculator when used to predict surgical complications in a cohort of patients with head and neck cancer treated in an Indian tertiary care center. Retrospective data were collected for 150 patients with head and neck cancer who underwent surgery in the Department of Head and Neck Oncology, Amrita Institute of Medical Sciences, Kochi, in the year 2016. The predicted outcome data were compared with actual documented outcome data for the variables mentioned. Brier's score was used to estimate the predictive value of the risk assessment generated. Pearson's r coefficient was utilized to validate the prediction of length of hospital stay. Brier's score for the entire calculator was 0.32 (not significant). Additionally, when the score was determined for individual parameters (surgical site infection, pneumonia, etc.), none were significant. Pearson's r value for length of stay was also not significant (p = .632). The ACSNSQIP risk assessment tool did not accurately reflect surgical outcomes in our cohort of Indian patients. Although it is the most comprehensive tool available at present, modifications that may improve accuracy include allowing for input of multiple procedure codes, risk stratifying for previous radiation or surgery, and better risk assessment for microvascular flap reconstruction.

  10. Construction of type-II QC-LDPC codes with fast encoding based on perfect cyclic difference sets

    NASA Astrophysics Data System (ADS)

    Li, Ling-xiang; Li, Hai-bing; Li, Ji-bi; Jiang, Hua

    2017-09-01

    In view of the problems that the encoding complexity of quasi-cyclic low-density parity-check (QC-LDPC) codes is high and the minimum distance is not large enough, which degrades error-correction performance, new irregular type-II QC-LDPC codes based on perfect cyclic difference sets (CDSs) are constructed. The parity-check matrices of these type-II QC-LDPC codes consist of zero matrices (weight 0), circulant permutation matrices (CPMs, weight 1), and circulant matrices with weight 2 (W2CMs). The introduction of W2CMs in the parity-check matrices makes it possible to achieve a larger minimum distance, which improves the error-correction performance of the codes. The Tanner graphs of these codes are free of girth-4 cycles, so they have excellent decoding convergence characteristics. In addition, because the parity-check matrices have a quasi-dual-diagonal structure, a fast encoding algorithm can reduce the encoding complexity effectively. Simulation results show that the new type-II QC-LDPC codes achieve excellent error-correction performance and exhibit no error floor over the additive white Gaussian noise (AWGN) channel with sum-product algorithm (SPA) iterative decoding.
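    The building blocks named above are easy to construct explicitly. The sketch below, with an invented lifting size and shift values, builds a circulant permutation matrix (CPM) and a weight-2 circulant (W2CM) as the superposition of two distinct CPMs; it illustrates the matrix structure only, not the paper's CDS-based construction.

```python
import numpy as np

def cpm(L, s):
    """L x L circulant permutation matrix: the identity cyclically shifted by s."""
    return np.roll(np.eye(L, dtype=int), s, axis=1)

def w2cm(L, s1, s2):
    """Weight-2 circulant matrix: superposition of two distinct CPMs."""
    assert s1 % L != s2 % L, "shifts must differ, or entries collide"
    return cpm(L, s1) + cpm(L, s2)

H_block = w2cm(5, 1, 3)       # invented L=5 lifting size and shifts 1, 3
print(H_block.sum(axis=0))    # every column has weight 2
print(H_block.sum(axis=1))    # every row has weight 2
```

    A useful property of CPMs is that matrix multiplication adds the shifts modulo L, which is what makes girth conditions on the Tanner graph reducible to arithmetic on the shift values.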

  11. 2 × 2 Tables: a note on Campbell's recommendation.

    PubMed

    Busing, F M T A; Weaver, B; Dubois, S

    2016-04-15

    For 2 × 2 tables, Egon Pearson's N - 1 chi-squared statistic is theoretically more sound than Karl Pearson's chi-squared statistic, and provides more accurate p values. Moreover, Egon Pearson's N - 1 chi-squared statistic is equal to the Mantel-Haenszel chi-squared statistic for a single 2 × 2 table, and as such, is often available in statistical software packages like SPSS, SAS, Stata, or R, which facilitates compliance with Ian Campbell's recommendations. Copyright © 2015 John Wiley & Sons, Ltd.
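    The relationship between the two statistics is a simple rescaling by (N - 1)/N, which can be verified directly. The table below is invented for illustration.

```python
import numpy as np

def pearson_chi2(table):
    """Karl Pearson's chi-squared statistic for a 2 x 2 table."""
    t = np.asarray(table, dtype=float)
    expected = np.outer(t.sum(axis=1), t.sum(axis=0)) / t.sum()
    return float(((t - expected) ** 2 / expected).sum())

def n_minus_1_chi2(table):
    """Egon Pearson's 'N - 1' chi-squared: Karl Pearson's statistic
    scaled by (N - 1) / N, as recommended by Campbell for 2 x 2 tables."""
    n = float(np.asarray(table).sum())
    return pearson_chi2(table) * (n - 1.0) / n

table = [[10, 20], [15, 5]]  # invented counts, N = 50
print(pearson_chi2(table), n_minus_1_chi2(table))
```

    Both statistics are referred to the chi-squared distribution with one degree of freedom; the N - 1 version simply shrinks the test statistic slightly, which matters most for small N.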

  12. Dataset on growth factor levels and insulin use in patients with diabetes mellitus and incident breast cancer.

    PubMed

    Wintrob, Zachary A P; Hammel, Jeffrey P; Nimako, George K; Gaile, Dan P; Forrest, Alan; Ceacareanu, Alice C

    2017-04-01

    Growth factor profiles could be influenced by the utilization of exogenous insulin. The data presented show the relationship between pre-existing use of injectable insulin in women diagnosed with breast cancer and type 2 diabetes mellitus, the growth factor profiles at the time of breast cancer diagnosis, and subsequent cancer outcomes. A Pearson correlation analysis evaluating the relationship between growth factors, stratified by insulin use and controls, is also provided.

  13. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    DTIC Science & Technology

    1981-12-01

    file.library-unit{.subunit).SYMAP Statement Map: library-file. library-unit.subunit).SMAP Type Map: 1 ibrary.fi le. 1 ibrary-unit{.subunit). TMAP The library...generator SYMAP Symbol Map code generator SMAP Updated Statement Map code generator TMAP Type Map code generator A.3.5 The PUNIT Command The P UNIT...Core.Stmtmap) NAME Tmap (Core.Typemap) END Example A-3 Compiler Command Stream for the Code Generator Texas Instruments A-5 Ada Optimizing Compiler

  14. Techniques for estimating flood-peak discharges of rural, unregulated streams in Ohio

    USGS Publications Warehouse

    Koltun, G.F.; Roberts, J.W.

    1990-01-01

    Multiple-regression equations are presented for estimating flood-peak discharges having recurrence intervals of 2, 5, 10, 25, 50, and 100 years at ungaged sites on rural, unregulated streams in Ohio. The average standard errors of prediction for the equations range from 33.4% to 41.4%. Peak discharge estimates determined by log-Pearson Type III analysis using data collected through the 1987 water year are reported for 275 streamflow-gaging stations. Ordinary least-squares multiple-regression techniques were used to divide the State into three regions and to identify a set of basin characteristics that help explain station-to-station variation in the log-Pearson estimates. Contributing drainage area, main-channel slope, and storage area were identified as suitable explanatory variables. Generalized least-squares procedures, which include historical flow data and account for differences in the variance of flows at different gaging stations, spatial correlation among gaging station records, and variable lengths of station record, were used to estimate the regression parameters. Weighted peak-discharge estimates computed as a function of the log-Pearson Type III and regression estimates are reported for each station. A method is provided to adjust regression estimates for ungaged sites by use of weighted and regression estimates for a gaged site located on the same stream. Limitations and shortcomings cited in an earlier report on the magnitude and frequency of floods in Ohio are addressed in this study. Geographic bias is no longer evident for the Maumee River basin of northwestern Ohio. No bias is found to be associated with the forested-area characteristic for the range used in the regression analysis (0.0 to 99.0%), nor is this characteristic significant in explaining peak discharges.
Surface-mined area likewise is not significant in explaining peak discharges, and the regression equations are not biased when applied to basins having approximately 30% or less surface-mined area. Analyses of residuals indicate that the equations tend to overestimate flood-peak discharges for basins having approximately 30% or more surface-mined area. (USGS)
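    The log-Pearson Type III analysis referenced above follows a standard recipe: take base-10 logarithms of the annual peaks, compute their mean, standard deviation, and skew, and apply a frequency factor K for the chosen recurrence interval. The sketch below uses the Wilson-Hilferty approximation for K and an invented peak-flow series; it illustrates the general method, not the report's station analyses.

```python
import numpy as np
from statistics import NormalDist

def lp3_quantile(peaks, return_period):
    """Log-Pearson Type III flood quantile from an annual-peak series,
    using the Wilson-Hilferty approximation for the frequency factor K.
    A sketch of the textbook method, not the USGS station computations."""
    logs = np.log10(np.asarray(peaks, dtype=float))
    n = len(logs)
    mean, std = logs.mean(), logs.std(ddof=1)
    # bias-corrected sample skew of the log discharges
    g = n / ((n - 1) * (n - 2)) * (((logs - mean) / std) ** 3).sum()
    z = NormalDist().inv_cdf(1.0 - 1.0 / return_period)
    if abs(g) < 1e-8:
        k = z  # zero skew reduces to the normal frequency factor
    else:
        k = (2.0 / g) * ((1.0 + g * z / 6.0 - g**2 / 36.0) ** 3 - 1.0)
    return 10.0 ** (mean + k * std)

# hypothetical annual peak discharges (cfs), invented for illustration
peaks = [1200, 3400, 2100, 980, 5600, 1800, 2600, 4100, 1500, 3000,
         2200, 1700, 2900, 3600, 1300]
print(round(lp3_quantile(peaks, 100)))  # 100-year peak estimate
```

    A real analysis (e.g., per USGS guidelines) would additionally weight the station skew with a regional skew and treat historical and low-outlier data, which this sketch omits.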

  15. 3. Historic American Buildings Survey, Elmer R. Pearson, Photographer, 1968 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Historic American Buildings Survey, Elmer R. Pearson, Photographer, 1968 ELEVATION, LOOKING NORTHWEST. - Shaker Centre Family, Broom Shop, East side of Oxford Road, White Water Park, Hamilton County, OH

  16. 76 FR 14372 - Glenn/Colusa County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... agenda items contact Eduardo Olmedo, DFO, 825 N. Humboldt Ave., Willows, CA 95988 or Laurie Pearson..., Stonyford, CA 95979. FOR FURTHER INFORMATION CONTACT: Laurie Pearson, Glenn/Colusa RAC Coordinator, USDA...

  17. Can evaluation of a dental procedure at the outset of learning predict later performance at the preclinical level? A pilot study.

    PubMed

    Polyzois, Ioannis; Claffey, Noel; McDonald, Albhe; Hussey, David; Quinn, Frank

    2011-05-01

    The purpose of this study was to examine the effectiveness of conventional pre-clinical training in dentistry and to determine if evaluation of a dental procedure at the beginning of dental training can be a predictor for future performance. A group of second year dental students with no previous experience in operative dentistry were asked to prepare a conventional class I cavity on a lower first molar typodont. Their first preparation was carried out after an introductory lecture and a demonstration and their second at the end of conventional training. The prepared typodonts were coded and blindly scored for the traditional assessment criteria of outline form, retention form, smoothness, cavity depth and cavity margin angulation. Once the codes were broken, a paired t-test was used to compare the difference between the means of before and after scores (P<0.0001) and a Pearson's linear correlation to test the association (r=0.4). From the results of this study, we could conclude that conventional preclinical training results in a significant improvement in the manual skills of the dental students and that the dental procedure used had only a limited predictive value for later performance at the preclinical level. © 2011 John Wiley & Sons A/S.

  18. [Medication error management climate and perception for system use according to construction of medication error prevention system].

    PubMed

    Kim, Myoung Soo

    2012-08-01

    The purpose of this cross-sectional study was to examine the current status of IT-based medication error prevention system construction and the relationships among system construction, medication error management climate, and perception of system use. The participants were 124 patient safety chief managers working for 124 hospitals with over 300 beds in Korea. The characteristics of the participants, the construction status and perception of the systems (electronic pharmacopoeia, electronic drug dosage calculation system, computer-based patient safety reporting, and bar-code system), and the medication error management climate were measured in this study. The data were collected between June and August 2011. Descriptive statistics, partial Pearson correlation, and MANCOVA were used for data analysis. Electronic pharmacopoeias were constructed in 67.7% of participating hospitals, computer-based patient safety reporting systems in 50.8%, and electronic drug dosage calculation systems in 32.3%. Bar-code systems showed the lowest construction rate, at 16.1% of Korean hospitals. Higher rates of construction of IT-based medication error prevention systems were associated with a more positive error management climate. Supportive strategies for improving perceptions of IT-based system use would encourage further system construction and promote a positive error management climate.

  19. The Code of the Street and Violent Versus Property Crime Victimization.

    PubMed

    McNeeley, Susan; Wilcox, Pamela

    2015-01-01

    Previous research has shown that individuals who adopt values in line with the code of the street are more likely to experience violent victimization (e.g., Stewart, Schreck, & Simons, 2006). This study extends this literature by examining the relationship between the street code and multiple types of violent and property victimization. This research investigates the relationship between street code-related values and 4 types of victimization (assault, breaking and entering, theft, and vandalism) using Poisson-based multilevel regression models. Belief in the street code was associated with higher risk of experiencing assault, breaking and entering, and vandalism, whereas theft victimization was not related to the street code. The results suggest that the code of the street influences victimization broadly--beyond violence--by increasing behavior that provokes retaliation from others in various forms.

  20. Software metrics: The quantitative impact of four factors on work rates experienced during software development. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Gaffney, J. E., Jr.; Judge, R. W.

    1981-01-01

    A model of a software development process is described. The software development process is seen to consist of a sequence of activities, such as 'program design' and 'module development' (or coding). A manpower estimate is made by multiplying code size by the rates (man months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing up the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values for each of nine principal software development activities was assessed. Four factors were identified which account for 39% of the observed productivity variation.
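    The estimation model described above reduces to a multiply-and-sum over activity rates. The minimal sketch below uses invented activity names and rates for illustration; the paper's actual rates are not reproduced here.

```python
# Activity rates in man-months per thousand lines of code (KLOC).
# These names and values are hypothetical, chosen only to show the model.
RATES = {
    "program design": 0.8,
    "module development (coding)": 1.2,
    "unit test": 0.9,
    "integration": 0.6,
}

def manpower_estimate(kloc, activities=RATES):
    """Effort = code size (KLOC) x sum of per-activity rates."""
    return kloc * sum(activities.values())

print(manpower_estimate(20))  # 20 KLOC at 3.5 MM/KLOC -> 70.0 man-months
```

    In the paper's framework, the four identified factors (organization, software product type, computer type, and code type) would shift the per-activity rates, which is how they account for part of the observed productivity variation.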

  1. Software Cost Estimating,

    DTIC Science & Technology

    1982-05-13

    Size Of The Software. A favourite measure for software system size is lines of operational code, or deliverable code (operational code plus...regression models, these conversions are either derived from productivity measures using the "cost per instruction" type of equation or they are...appropriate to different development organisations, different project types, different sets of units for measuring e and s, and different items

  2. 76 FR 35399 - Glenn/Colusa Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ... building to view comments. FOR FURTHER INFORMATION CONTACT: Laurie L. Pearson, Visitor Information...: Laurie L. Pearson, Glenn/Colusa R.A.C. Coordinator, PO Box 160, Stonyford, CA 95979, or by e-mail to...

  3. MaxEnt alternatives to pearson family distributions

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie J.

    2012-05-01

    In a previous MaxEnt conference [11] a method of obtaining MaxEnt univariate distributions under a variety of constraints was presented. The Mathematica function Interpolation[], normally used with numerical data, can also process "semi-symbolic" data, and Lagrange Multiplier equations were solved for a set of symbolic ordinates describing the required MaxEnt probability density function. We apply a more developed version of this approach to finding MaxEnt distributions having prescribed β1 and β2 values, and compare the entropy of the MaxEnt distribution to that of the Pearson family distribution having the same β1 and β2. These MaxEnt distributions do have, in general, greater entropy than the related Pearson distribution. In accordance with Jaynes' Maximum Entropy Principle, these MaxEnt distributions are thus to be preferred to the corresponding Pearson distributions as priors in Bayes' Theorem.

  4. 24 CFR 578.75 - General operations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... assistance under this part must meet State or local building codes, and in the absence of State or local building codes, the International Residential Code or International Building Code (as applicable to the type of structure) of the International Code Council. (2) Services provided with assistance under this...

  5. 24 CFR 578.75 - General operations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... assistance under this part must meet State or local building codes, and in the absence of State or local building codes, the International Residential Code or International Building Code (as applicable to the type of structure) of the International Code Council. (2) Services provided with assistance under this...

  6. Getting out of bed after surgery

    MedlinePlus

    ... to Advanced Skills . 9th ed. New York, NY: Pearson; 2017:chap 13. Smith SF, Duell DJ, Martin ... to Advanced Skills . 9th ed. New York, NY: Pearson; 2017:chap 26. Patient Instructions Gallbladder removal - open - ...

  7. Guidelines for Coding and Entering Ground-Water Data into the Ground-Water Site Inventory Data Base, Version 4.6, U.S. Geological Survey, Washington Water Science Center

    DTIC Science & Technology

    2006-01-01

    collected, code both. Code Type of Analysis Code Type of Analysis A Physical properties I Common ions/trace elements B Common ions J Sanitary analysis and...1) A ground-water site is coded as if it is a single point, not a geographic area or property . (2) Latitude and longitude should be determined at a...terrace from an adjacent upland on one side, and a lowland coast or valley on the other. Due to the effects of erosion, the terrace surface may not be as

  8. Contrasting Five Different Theories of Letter Position Coding: Evidence from Orthographic Similarity Effects

    ERIC Educational Resources Information Center

    Davis, Colin J.; Bowers, Jeffrey S.

    2006-01-01

    Five theories of how letter position is coded are contrasted: position-specific slot-coding, Wickelcoding, open-bigram coding (discrete and continuous), and spatial coding. These theories make different predictions regarding the relative similarity of three different types of pairs of letter strings: substitution neighbors,…

  9. 78 FR 49412 - Personal Flotation Devices Labeling and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ...The Coast Guard proposes to remove references to type codes in its regulations on the carriage and labeling of Coast Guard-approved personal flotation devices (PFDs). PFD type codes are unique to Coast Guard approval and are not well understood by the public. Removing these type codes from our regulations would facilitate future incorporation by reference of new industry consensus standards for PFD labeling that will more effectively convey safety information, and is a step toward harmonization of our regulations with PFD requirements in Canada and in other countries.

  10. [Pearson syndrome. Case report].

    PubMed

    Cammarata-Scalisi, Francisco; López-Gallardo, Ester; Emperador, Sonia; Ruiz-Pesini, Eduardo; Da Silva, Gloria; Camacho, Nolis; Montoya, Julio

    2011-09-01

    Among the etiologies of anemia in infancy, mitochondrial cytopathies are infrequent. Pearson syndrome is diagnosed principally during the initial stages of life and is characterized by refractory sideroblastic anemia with vacuolization of marrow progenitor cells, exocrine pancreatic dysfunction, and variable neurologic, hepatic, renal, and endocrine failures. We report the case of a 14-month-old girl evaluated in a multicentric study, with clinical and molecular diagnosis of Pearson syndrome and the 4,977-base-pair common deletion of mitochondrial DNA. This entity has been associated with diverse phenotypes within the broad clinical spectrum of mitochondrial disease.

  11. Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bing; Zhang, Jian; Chen, Yan

    This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017 NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA continues to develop the NYStretch-Energy Code 2018 edition. To support the effort, PNNL conducted energy simulation analysis to quantify the energy savings of proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is the 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the on-going code development process.
    PNNL analyzed these measures using whole building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method to conduct the energy saving analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and stretch code in all climate zones.

  12. Correlating Species and Spectral Diversity using Remote Sensing in Successional Fields in Virginia

    NASA Astrophysics Data System (ADS)

    Aneece, I.; Epstein, H. E.

    2015-12-01

    Conserving biodiversity can help preserve ecosystem properties and function. As the increasing prevalence of invasive plant species threatens biodiversity, advances in remote sensing technology can help monitor invasive species and their effects on ecosystems and plant communities. To assess whether we could study the effects of invasive species on biodiversity using remote sensing, we asked whether species diversity was positively correlated with spectral diversity, and whether correlations differed among spectral regions along the visible and near-infrared range. To answer these questions, we established community plots in secondary successional fields at the Blandy Experimental Farm in northern Virginia and collected vegetation surveys and ground-level hyperspectral data from 350 to 1025 nm wavelengths. Pearson correlation analysis revealed a positive correlation between spectral diversity and species diversity in the visible ranges of 350-499 nm (Pearson correlation=0.69, p=0.01), 500-589 nm (Pearson=0.64, p=0.03), and 590-674 nm (Pearson=0.70, p=0.01), slight positive correlation in the red edge range of 675-754 nm (Pearson=0.56, p=0.06), and no correlation in the near-infrared ranges of 755-924 nm (Pearson=-0.06, p=0.85) and 925-1025 nm (Pearson=0.30, p=0.34). These differences in correlations across spectral regions may be due to the elements that contribute to signatures in those regions and spectral data transformation methods. To investigate the role of pigment variability in these correlations, we estimated chlorophyll, carotenoid, and anthocyanin concentrations of five dominant species in the plots using vegetation indices. Although interspecific variability in pigment levels exceeded intraspecific variability, chlorophyll (F value=118) was more varied within species than carotenoids (F=322) and anthocyanins (F=126), perhaps contributing to the lack of correlation between species diversity and spectral diversity in the red edge region. 
Interspecific differences in pigment levels, however, make it possible to differentiate species remotely.

  13. Engine dynamic analysis with general nonlinear finite element codes. Part 2: Bearing element implementation overall numerical characteristics and benchmaking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Fertis, J.; Zeid, I.; Lam, P.

    1982-01-01

    Finite element codes are used in modelling the rotor-bearing-stator structure common to the turbine industry. Engine dynamic simulation is achieved by developing strategies that enable the use of available finite element codes. The bearing elements developed are benchmarked by incorporation into a general-purpose code (ADINA); the numerical characteristics of finite-element-type rotor-bearing-stator simulations are evaluated through the use of various types of explicit/implicit numerical integration operators. The overall numerical efficiency of the procedure is improved.

  14. The identification of incident cancers in UK primary care databases: a systematic review.

    PubMed

    Rañopa, Michael; Douglas, Ian; van Staa, Tjeerd; Smeeth, Liam; Klungel, Olaf; Reynolds, Robert; Bhaskaran, Krishnan

    2015-01-01

    UK primary care databases are frequently used in observational studies with cancer outcomes. We aimed to systematically review methods used by such studies to identify and validate incident cancers of the breast, colorectum, and prostate. Medline and Embase (1980-2013) were searched for UK primary care database studies with incident breast, colorectal, or prostate cancer outcomes. Data on the methods used for case ascertainment were extracted and summarised. Questionnaires were sent to corresponding authors to obtain details about case ascertainment. Eighty-four studies of breast (n = 51), colorectal (n = 54), and prostate cancer (n = 31) were identified; 30 examined >1 cancer type. Among the 84 studies, 57 defined cancers using only diagnosis codes, while 27 required further evidence such as chemotherapy. Few studies described methods used to create cancer code lists (n = 5); or made lists available directly (n = 5). Twenty-eight code lists were received on request from study authors. All included malignant neoplasm diagnosis codes, but there was considerable variation in the specific codes included which was not explained by coding dictionary changes. Code lists also varied in terms of other types of codes included, such as in-situ, cancer morphology, history of cancer, and secondary/suspected/borderline cancer codes. In UK primary care database studies, methods for identifying breast, colorectal, and prostate cancers were often unclear. Code lists were often unavailable, and where provided, we observed variation in the individual codes and types of codes included. Clearer reporting of methods and publication of code lists would improve transparency and reproducibility of studies. Copyright © 2014 John Wiley & Sons, Ltd.

  15. 31. Historic American Buildings Survey E. R. Pearson, Photographer 1972 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. Historic American Buildings Survey E. R. Pearson, Photographer 1972 CLOTHES ROOM, FIRST ATTIC, SOUTHEAST CORNER, LOOKING EAST - Shaker Centre Family Dwelling House, West side of U.S. Route 68, South Union, Logan County, KY

  16. 75 FR 68618 - Diamond Sawblades and Parts Thereof From China and Korea

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Pearson dissent, having determined that an industry in the United States is not materially injured or... remand, Vice Chairman Pearson and Commissioners Okun and Lane voted in the negative. On January 13, 2009...

  17. KSC-04PD-2193

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. From left, Carl Benoit, senior national science consultant, Pearson Scott Foresman; Paul McFall, president, Pearson Scott Foresman; Dr. Adena Williams Loston, NASA chief education officer; and James Lippe, science product manager, Pearson Scott Foresman, participate in the unveiling of 'The Science in Space Challenge' at the Doubletree Hotel in Orlando, Fla. The national challenge program is sponsored by NASA and Pearson Scott Foresman, publisher of pre-K through grade six educational books. To participate in the challenge, teachers may submit proposals, on behalf of their students, for a science and technology investigation. Astronauts will conduct the winning projects on a Space Shuttle mission or on the International Space Station, while teachers and students follow along via television or the Web. For more information about the announcement, see the news release at http://www.nasa.gov/home/hqnews/2004/oct/HQ_04341_publication.html.

  18. KSC-04PD-2194

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. From left, NASA astronaut Patrick Forrester; Paul McFall, president, Pearson Scott Foresman; Dr. Adena Williams Loston, NASA chief education officer; James Lippe, science product manager, Pearson Scott Foresman; and Carl Benoit, senior national science consultant, Pearson Scott Foresman, participate in the unveiling of 'The Science in Space Challenge' at the Doubletree Hotel in Orlando, Fla. The national challenge program is sponsored by NASA and Pearson Scott Foresman, publisher of pre-K through grade six educational books. To participate in the challenge, teachers may submit proposals, on behalf of their students, for a science and technology investigation. Astronauts will conduct the winning projects on a Space Shuttle mission or on the International Space Station, while teachers and students follow along via television or the Web. For more information about the announcement, see the news release at http://www.nasa.gov/home/hqnews/2004/oct/HQ_04341_publication.html.

  19. Weather Types, temperature and relief relationship in the Iberian Peninsula: A regional adiabatic processes under directional weather types

    NASA Astrophysics Data System (ADS)

    Peña Angulo, Dhais; Trigo, Ricardo; Cortesi, Nicola; Gonzalez-Hidalgo, Jose Carlos

    2016-04-01

    We have analyzed, at the monthly scale, the spatial distribution of the Pearson correlation between the monthly means of maximum (Tmax) and minimum (Tmin) temperatures and weather types (WTs) in the Iberian Peninsula (IP), representing the results on a high-spatial-resolution grid (10 km x 10 km) from the MOTEDAS dataset (Gonzalez-Hidalgo et al., 2015a). The WT classification was that developed by Jenkinson and Collison, adapted to the Iberian Peninsula by Trigo and DaCamara, using Sea Level Pressure data from the NCAR/NCEP Reanalysis dataset (period 1951-2010). The spatial distribution of Pearson correlations shows a clear zonal gradient in Tmax under the zonal advection produced in westerly (W) and easterly (E) flows, with negative correlations along the coast from which the air mass arrives but positive correlations in inland areas. The same is true under North-West (NW), North-East (NE), South-West (SW) and South-East (SE) WTs. These spatial gradients are consistent with the location of the main mountain chains and offer an example of regional adiabatic phenomena affecting the entire IP (Peña-Angulo et al., 2015b). Such gradients have not been observed in Tmin. We suggest that Tmin values are less sensitive to changes in Sea Level Pressure and more related to local factors. These directional WTs occur on more than 10 days per month on average and could be a valuable tool for downscaling processes. González-Hidalgo J.C., Peña-Angulo D., Brunetti M., Cortesi, C. (2015a): MOTEDAS: a new monthly temperature database for mainland Spain and the trend in temperature (1951-2010). International Journal of Climatology 31, 715-731. DOI: 10.1002/joc.4298 Peña-Angulo, D., Trigo, R., Cortesi, C., González-Hidalgo, J.C. (2015b): The influence of weather types on the monthly average maximum and minimum temperatures in the Iberian Peninsula. Submitted to Hydrology and Earth System Sciences.

  20. Quantum error-correcting codes from algebraic geometry codes of Castle type

    NASA Astrophysics Data System (ADS)

    Munuera, Carlos; Tenório, Wanderson; Torres, Fernando

    2016-10-01

    We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.
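    As a reminder of the CSS recipe the abstract relies on (this is the general construction, stated from the standard literature; it says nothing about the paper's specific Castle-code parameters): two nested classical linear codes yield a quantum stabilizer code whose parameters are determined by the pair.

```latex
% Nested classical linear codes C_2 \subset C_1 \subseteq \mathbb{F}_q^n,
% of dimensions k_2 < k_1, give the quantum code
\mathrm{CSS}(C_1, C_2) = [[\, n,\; k_1 - k_2,\; d \,]]_q,
\qquad
d = \min\{\operatorname{wt}(c) : c \in (C_1 \setminus C_2) \cup (C_2^{\perp} \setminus C_1^{\perp})\}.
```

    Good quantum codes thus reduce to finding nested classical codes with large minimum weight outside the smaller code, which is where families with rich algebraic structure, such as algebraic geometry codes, come in.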

  1. Comparing the Pearson and Spearman correlation coefficients across distributions and sample sizes: A tutorial using simulations and empirical data.

    PubMed

    de Winter, Joost C F; Gosling, Samuel D; Potter, Jeff

    2016-09-01

    The Pearson product–moment correlation coefficient (r_p) and the Spearman rank correlation coefficient (r_s) are widely used in psychological research. We compare r_p and r_s on 3 criteria: variability, bias with respect to the population value, and robustness to an outlier. Using simulations across low (N = 5) to high (N = 1,000) sample sizes we show that, for normally distributed variables, r_p and r_s have similar expected values but r_s is more variable, especially when the correlation is strong. However, when the variables have high kurtosis, r_p is more variable than r_s. Next, we conducted a sampling study of a psychometric dataset featuring symmetrically distributed data with light tails, and of 2 Likert-type survey datasets, 1 with light-tailed and the other with heavy-tailed distributions. Consistent with the simulations, r_p had lower variability than r_s in the psychometric dataset. In the survey datasets with heavy-tailed variables in particular, r_s had lower variability than r_p, and often corresponded more accurately to the population Pearson correlation coefficient (R_p) than r_p did. The simulations and the sampling studies showed that variability in terms of standard deviations can be reduced by about 20% by choosing r_s instead of r_p. In comparison, increasing the sample size by a factor of 2 results in a 41% reduction of the standard deviations of r_s and r_p. In conclusion, r_p is suitable for light-tailed distributions, whereas r_s is preferable when variables feature heavy-tailed distributions or when outliers are present, as is often the case in psychological research. PsycINFO Database Record (c) 2016 APA, all rights reserved
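    The outlier-robustness contrast described above can be sketched in a few lines of plain Python (an illustrative implementation, not the authors' simulation code; Spearman's coefficient is simply Pearson's applied to ranks):

```python
def pearson(x, y):
    """Pearson product-moment correlation r_p."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def ranks(v):
    """Ranks starting at 1, with ties given their average rank."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(v):
        j = i
        while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation r_s: Pearson applied to the ranks."""
    return pearson(ranks(x), ranks(y))

# A perfectly monotone relationship with one extreme outlier:
x = [1, 2, 3, 4, 5, 100]
y = [1, 2, 3, 4, 5, 6]
# spearman(x, y) is exactly 1.0, while pearson(x, y) is pulled well below 1.
```

    Because ranking caps the influence of any single extreme value, r_s stays at 1.0 for the monotone data above while r_p is dragged down by the outlier.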

  2. Identification and Classification of Orthogonal Frequency Division Multiple Access (OFDMA) Signals Used in Next Generation Wireless Systems

    DTIC Science & Technology

    2012-03-01

    advanced antenna systems AMC adaptive modulation and coding AWGN additive white Gaussian noise BPSK binary phase shift keying BS base station BTC ...QAM-16, and QAM-64, and coding types include convolutional coding (CC), convolutional turbo coding (CTC), block turbo coding ( BTC ), zero-terminating

  3. Using QR Codes to Differentiate Learning for Gifted and Talented Students

    ERIC Educational Resources Information Center

    Siegle, Del

    2015-01-01

    QR codes are two-dimensional square patterns capable of encoding information ranging from web addresses to links to YouTube videos. The codes save typing time and eliminate errors from entering addresses incorrectly. These codes make learning with technology easier for students and engage them motivationally in new ways.

  4. The 'Brick Wall' radio loss approximation and the performance of strong channel codes for deep space applications at high data rates

    NASA Technical Reports Server (NTRS)

    Shambayati, Shervin

    2001-01-01

    In order to evaluate the performance of strong channel codes in the presence of imperfect carrier phase tracking for residual-carrier BPSK modulation, this paper develops an approximate 'brick wall' model that is independent of the channel code type at high data rates. It is shown that this approximation is reasonably accurate (less than 0.7 dB for low FERs for the (1784,1/6) code and less than 0.35 dB for low FERs for the (5920,1/6) code). Based on the approximation's accuracy, it is concluded that the effects of imperfect carrier tracking are more or less independent of the channel code type for strong channel codes. Therefore, the advantage that one strong channel code has over another with perfect carrier tracking translates to nearly the same advantage under imperfect carrier tracking conditions. This allows link designers to incorporate the projected performance of strong channel codes into their design tables without worrying about their behavior in the face of imperfect carrier phase tracking.

  5. 17 CFR 17.00 - Information to be furnished by futures commission merchants, clearing members and foreign brokers.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 2 AN Exchange Code. 30 1 AN Put or Call. 31 5 AN Commodity Code (1). 36 8 AN Expiration Date (1). 44... Commodity Code (2). 71 8 AN Expiration Date (2). 79 2 Reserved. 80 1 AN Record Type. 1 AN—Alpha—numeric, N—Numeric, S—Signed numeric. (2) Field definitions are as follows: (i) Report type. This report format will...

  6. Analysis of a two-dimensional type 6 shock-interference pattern using a perfect-gas code and a real-gas code

    NASA Technical Reports Server (NTRS)

    Bertin, J. J.; Graumann, B. W.

    1973-01-01

    Numerical codes were developed to calculate the two dimensional flow field which results when supersonic flow encounters double wedge configurations whose angles are such that a type 4 pattern occurs. The flow field model included the shock interaction phenomena for a delta wing orbiter. Two numerical codes were developed, one which used the perfect gas relations and a second which incorporated a Mollier table to define equilibrium air properties. The two codes were used to generate theoretical surface pressure and heat transfer distributions for velocities from 3,821 feet per second to an entry condition of 25,000 feet per second.

  7. Portable Just-in-Time Specialization of Dynamically Typed Scripting Languages

    NASA Astrophysics Data System (ADS)

    Williams, Kevin; McCandless, Jason; Gregg, David

    In this paper, we present a portable approach to JIT compilation for dynamically typed scripting languages. At runtime we generate ANSI C code and use the system's native C compiler to compile this code. The C compiler runs on a separate thread to the interpreter allowing program execution to continue during JIT compilation. Dynamic languages have variables which may change type at any point in execution. Our interpreter profiles variable types at both whole method and partial method granularity. When a frequently executed region of code is discovered, the compilation thread generates a specialized version of the region based on the profiled types. In this paper, we evaluate the level of instruction specialization achieved by our profiling scheme as well as the overall performance of our JIT.
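    The profiling idea described above can be sketched as follows (a hypothetical illustration in Python rather than the authors' interpreter internals; the names TypeProfile and HOT_THRESHOLD are invented for this sketch):

```python
from collections import Counter, defaultdict

HOT_THRESHOLD = 1000  # hypothetical: executions before a region counts as hot

class TypeProfile:
    """Per-region profile of the types observed for each variable."""

    def __init__(self):
        self.hits = 0
        self.types = defaultdict(Counter)

    def record(self, env):
        """Called on each execution of the region; env maps variable
        names to their current values."""
        self.hits += 1
        for name, value in env.items():
            self.types[name][type(value).__name__] += 1

    def is_hot(self):
        return self.hits >= HOT_THRESHOLD

    def specialization(self):
        """The dominant type per variable; a JIT of the kind described
        would emit C code specialized to these types, guarded by
        runtime type checks with fallback to the interpreter."""
        return {name: c.most_common(1)[0][0]
                for name, c in self.types.items()}
```

    Once a region is hot, the compilation thread can generate C code assuming the dominant types, while the guards preserve correctness for variables that later change type.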

  8. Burnout Syndrome Among Health Care Students: The Role of Type D Personality.

    PubMed

    Skodova, Zuzana; Lajciakova, Petra; Banovcinova, Lubica

    2016-07-18

    The aim of this study was to examine the effect of Type D personality, along with other personality traits (resilience and sense of coherence), on burnout syndrome and its counterpart, engagement, among students of nursing, midwifery, and psychology. A cross-sectional study was conducted on 97 university students (91.9% females; M age = 20.2 ± 1.49 years). A Type D personality subscale, School Burnout Inventory, Utrecht Work Engagement Scale, Sense of Coherence Questionnaire, and Baruth Protective Factor Inventory were used. Linear regression models, Student's t test, and Pearson's correlation analysis were employed. Negative affectivity, a dimension of Type D personality, was a significant personality predictor for burnout syndrome (β = .54; 95% CI = [0.33, 1.01]). The only significant personality predictor of engagement was a sense of coherence. Students who were identified as having Type D personality characteristics scored significantly higher on the burnout syndrome questionnaire (t = -2.58, p < .01). In health care professions, personality predictors should be addressed to prevent burnout. © The Author(s) 2016.

  9. Accelerated Stability Studies on Dried Extracts of Centella asiatica Through Chemical, HPLC, HPTLC, and Biological Activity Analyses.

    PubMed

    Kaur, Ishtdeep; Suthar, Nancy; Kaur, Jasmeen; Bansal, Yogita; Bansal, Gulshan

    2016-10-01

    Regulatory guidelines recommend systematic stability studies on a herbal product to establish its shelf life. In the present study, commercial extracts (Types I and II) and a freshly prepared extract (Type III) of Centella asiatica were subjected to accelerated stability testing for 6 months. Control and stability samples were evaluated for organoleptics, pH, moisture, total phenolic content (TPC), asiatic acid, kaempferol, and high-performance thin-layer chromatography fingerprints, and for antioxidant and acetylcholinesterase inhibitory activities. Marker contents, TPC, and both activities of each extract decreased in stability samples with respect to the control. These losses were maximal in the Type I extract and minimal in the Type III extract. The higher stability of the Type III extract might be attributed to the additional phytoconstituents and/or preservatives in it. Pearson correlation analysis of the results suggested that TPC, asiatic acid, and kaempferol can be taken as chemical markers to assess the chemical and therapeutic shelf lives of herbal products containing Centella asiatica. © The Author(s) 2016.

  10. Clinical manifestations and management of four children with Pearson syndrome.

    PubMed

    Tumino, Manuela; Meli, Concetta; Farruggia, Piero; La Spina, Milena; Faraci, Maura; Castana, Cinzia; Di Raimondo, Vincenzo; Alfano, Marivana; Pittalà, Annarita; Lo Nigro, Luca; Russo, Giovanna; Di Cataldo, Andrea

    2011-12-01

    Pearson marrow-pancreas syndrome is a fatal disorder mostly diagnosed during infancy and caused by mutations of mitochondrial DNA. We hereby report on four children affected by Pearson syndrome with hematological disorders at onset. The disease was fatal to three of them and the fourth one, who received hematopoietic stem cell transplantation, died of secondary malignancy. In this latter patient transplantation corrected hematological and non-hematological issues like metabolic acidosis, and we therefore argue that it could be considered as a useful option in an early stage of the disease. Copyright © 2011 Wiley Periodicals, Inc.

  11. Pearson Syndrome, A Medical Diagnosis Difficult to Sustain Without Genetic Testing.

    PubMed

    Sur, Lucia; Floca, Emanuela; Samasca, Gabriel; Lupan, Iulia; Aldea, Cornel; Sur, Genel

    2018-03-01

    The detection of sideroblastic anemia in a newborn may suggest developing Pearson syndrome. The prognosis of these patients is severe and death occurs in the first 3 years of life, so it is important to find new ways of diagnosis. Case Presentation: In the case of our patient the diagnosis was supported only at the age of 5 months, highlighting the difficulties of diagnosis at this age. The diagnosis of Pearson syndrome with neonatal onset is difficult to sustain or even impossible at that age. This diagnosis can be confirmed and supported during disease progression.

  12. Heavy-tailed fractional Pearson diffusions.

    PubMed

    Leonenko, N N; Papić, I; Sikorskii, A; Šuvak, N

    2017-11-01

    We define heavy-tailed fractional reciprocal gamma and Fisher-Snedecor diffusions by a non-Markovian time change in the corresponding Pearson diffusions. Pearson diffusions are governed by the backward Kolmogorov equations with space-varying polynomial coefficients and are widely used in applications. The corresponding fractional reciprocal gamma and Fisher-Snedecor diffusions are governed by the fractional backward Kolmogorov equations and have heavy-tailed marginal distributions in the steady state. We derive the explicit expressions for the transition densities of the fractional reciprocal gamma and Fisher-Snedecor diffusions and strong solutions of the associated Cauchy problems for the fractional backward Kolmogorov equation.
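    For orientation (a standard formulation from the Pearson-diffusion literature, with coefficient names chosen here for illustration): a Pearson diffusion has linear drift and quadratic squared diffusion coefficient, and the fractional variant replaces the time derivative in the backward Kolmogorov equation with a Caputo derivative of order between 0 and 1.

```latex
% Pearson diffusion: linear drift, quadratic squared diffusion coefficient
dX_t = (a_0 + a_1 X_t)\,dt + \sqrt{b_0 + b_1 X_t + b_2 X_t^2}\;dW_t

% Backward Kolmogorov equation for u(x,t)
\frac{\partial u}{\partial t}
  = (a_0 + a_1 x)\frac{\partial u}{\partial x}
  + \frac{b_0 + b_1 x + b_2 x^2}{2}\,\frac{\partial^2 u}{\partial x^2}

% Fractional version: Caputo time derivative of order 0 < \alpha < 1
\frac{\partial^\alpha u}{\partial t^\alpha}
  = (a_0 + a_1 x)\frac{\partial u}{\partial x}
  + \frac{b_0 + b_1 x + b_2 x^2}{2}\,\frac{\partial^2 u}{\partial x^2}
```

    The reciprocal gamma and Fisher-Snedecor cases correspond to particular sign and root configurations of the quadratic coefficient, which is what produces their heavy-tailed stationary distributions.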

  13. Pearson disease in an infant presenting with severe hypoplastic anemia, normal pancreatic function, and progressive liver failure.

    PubMed

    Shapira, Adi; Konopnicki, Muriel; Hammad-Saied, Mohammed; Shabad, Evelyn

    2014-07-01

    Pearson disease is a rare, usually fatal, mitochondrial disorder affecting primarily the bone marrow and the exocrine pancreas. We report a previously healthy 10-week-old girl who presented with profound macrocytic anemia followed by pancytopenia, synthetic liver dysfunction with liver steatosis, and metabolic acidosis with high lactate levels. She had no pancreatic involvement. Multiple cytoplasmic vacuoles in myelocytes and monocytes were seen upon microscopic evaluation of the bone marrow. Genetic analysis of the mitochondrial genome revealed a 5 kbp deletion, thus establishing the diagnosis of Pearson disease.

  14. "Describing our whole experience": the statistical philosophies of W. F. R. Weldon and Karl Pearson.

    PubMed

    Pence, Charles H

    2011-12-01

    There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton's footsteps. I argue for two related theses in light of this standard interpretation, based on a reading of several sources in which Weldon, independently of Pearson, reflects on his own motivations. First, while Pearson does approach statistics from this "Galtonian" perspective, he is, consistent with his positivist philosophy of science, utilizing statistics to simplify the highly variable data of biology. Weldon, on the other hand, is brought to statistics by a rich empiricism and a desire to preserve the diversity of biological data. Secondly, we have here a counterexample to the claim that divergence in motivation will lead to a corresponding separation in methodology. Pearson and Weldon, despite embracing biometry for different reasons, settled on precisely the same set of statistical tools for the investigation of evolution. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods

    NASA Astrophysics Data System (ADS)

    Piao, Lin; Fu, Zuntao

    2016-11-01

    Cross-correlation between pairs of variables has a multi-time-scale character and can be totally different on different time scales (changing from positive correlation to negative), e.g., the associations between mean air temperature and relative humidity over regions east of the Taihang mountains in China. Therefore, correctly unveiling these correlations on different time scales is of great importance, since we generally do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA for short) and Pearson correlation, in quantifying scale-dependent correlations, applying them directly to raw observed records and to artificially generated sequences with known cross-correlation features. The studies show that 1) DCCA-related methods can indeed quantify scale-dependent correlations, but the Pearson method cannot; 2) the correlation features from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; 3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to this ratio. All these features indicate that DCCA-related methods have advantages in correctly quantifying scale-dependent correlations that result from different physical processes.
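    The scale-dependent DCCA coefficient discussed above can be sketched in plain Python (a minimal textbook-style implementation with non-overlapping windows and linear detrending, not the authors' code; variants in the literature differ in window overlap and detrending order):

```python
def _profile(x):
    """Cumulative sum of mean-removed values (the 'profile')."""
    m = sum(x) / len(x)
    out, s = [], 0.0
    for v in x:
        s += v - m
        out.append(s)
    return out

def _detrended_cov(X, Y, n):
    """Average covariance of linear-detrending residuals over
    non-overlapping windows of length n."""
    t = list(range(n))
    tm = sum(t) / n
    stt = sum((ti - tm) ** 2 for ti in t)

    def resid(z):
        zm = sum(z) / n
        slope = sum((ti - tm) * (zi - zm) for ti, zi in zip(t, z)) / stt
        return [zi - (zm + slope * (ti - tm)) for ti, zi in zip(t, z)]

    total, boxes = 0.0, 0
    for start in range(0, len(X) - n + 1, n):
        rx = resid(X[start:start + n])
        ry = resid(Y[start:start + n])
        total += sum(a * b for a, b in zip(rx, ry)) / n
        boxes += 1
    return total / boxes

def rho_dcca(x, y, n):
    """DCCA cross-correlation coefficient at window size n."""
    X, Y = _profile(x), _profile(y)
    return _detrended_cov(X, Y, n) / (
        _detrended_cov(X, X, n) * _detrended_cov(Y, Y, n)) ** 0.5
```

    Sweeping the window size n and recomputing rho_dcca is what exposes correlations that flip sign between fast and slow components, which a single whole-series Pearson coefficient averages away.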

  16. Quantifying distinct associations on different temporal scales: comparison of DCCA and Pearson methods.

    PubMed

    Piao, Lin; Fu, Zuntao

    2016-11-09

    Cross-correlation between pairs of variables has a multi-time-scale character and can be totally different on different time scales (changing from positive correlation to negative), e.g., the associations between mean air temperature and relative humidity over regions east of the Taihang mountains in China. Therefore, correctly unveiling these correlations on different time scales is of great importance, since we generally do not know in advance whether the correlation varies with scale. Here, we compare two methods, Detrended Cross-Correlation Analysis (DCCA for short) and Pearson correlation, in quantifying scale-dependent correlations, applying them directly to raw observed records and to artificially generated sequences with known cross-correlation features. The studies show that 1) DCCA-related methods can indeed quantify scale-dependent correlations, but the Pearson method cannot; 2) the correlation features from DCCA-related methods are robust to contaminating noise, whereas the results from the Pearson method are sensitive to noise; 3) the scale-dependent correlation results from DCCA-related methods are robust to the amplitude ratio between slow and fast components, while the Pearson method may be sensitive to this ratio. All these features indicate that DCCA-related methods have advantages in correctly quantifying scale-dependent correlations that result from different physical processes.

  17. An efficient sensitivity analysis method for modified geometry of Macpherson suspension based on Pearson correlation coefficient

    NASA Astrophysics Data System (ADS)

    Shojaeefard, Mohammad Hasan; Khalkhali, Abolfazl; Yarmohammadisatri, Sadegh

    2017-06-01

    The main purpose of this paper is to propose a new method for designing the Macpherson suspension, based on Sobol indices in terms of the Pearson correlation, which determines the importance of each member for the behaviour of the vehicle suspension. The formulation of the dynamic analysis of the Macpherson suspension system is developed using the suspension members as modified links in order to achieve the desired kinematic behaviour. The mechanical system is replaced with equivalent constrained links, and kinematic laws are then utilised to obtain a new modified geometry of the Macpherson suspension. The equivalent mechanism increases the speed of the analysis and reduces its complexity. The ADAMS/CAR software is utilised to simulate a full vehicle (a Renault Logan) in order to assess the accuracy of the modified geometry model, and an experimental 4-poster test rig is used to validate both the ADAMS/CAR simulation and the analytical geometry model. The Pearson correlation coefficient is applied to analyse the sensitivity of each suspension member with respect to vehicle objective functions such as sprung-mass acceleration; the estimation of the Pearson correlation coefficient between variables is also analysed. The results indicate that the Pearson correlation coefficient is an efficient tool for analysing the vehicle suspension, leading to a better design of the Macpherson suspension system.

  18. Performance Analysis of Hybrid ARQ Protocols in a Slotted Code Division Multiple-Access Network

    DTIC Science & Technology

    1989-08-01

    Convolutional Codes . in Proc Int. Conf. Commun., 21.4.1-21.4.5, 1987. [27] J. Hagenauer. Rate Compatible Punctured Convolutional Codes . in Proc Int. Conf...achieved by using a low rate (r = 0.5), high constraint length (e.g., 32) punctured convolutional code . Code puncturing provides for a variable rate code ...investigated the use of convolutional codes in Type II Hybrid ARQ protocols. The error

  19. Short-Block Protograph-Based LDPC Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher

    2010-01-01

    Short-block low-density parity-check (LDPC) codes of a special type are intended to be especially well suited for potential applications that include transmission of command and control data, cellular telephony, data communications in wireless local area networks, and satellite data communications. [In general, LDPC codes belong to a class of error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.] The codes of the present special type exhibit low error floors, low bit and frame error rates, and low latency (in comparison with related prior codes). These codes also achieve low maximum rate of undetected errors over all signal-to-noise ratios, without requiring the use of cyclic redundancy checks, which would significantly increase the overhead for short blocks. These codes have protograph representations; this is advantageous in that, for reasons that exceed the scope of this article, the applicability of protograph representations makes it possible to design high-speed iterative decoders that utilize belief-propagation algorithms.

  20. Reliability of routinely collected hospital data for child maltreatment surveillance.

    PubMed

    McKenzie, Kirsten; Scott, Debbie A; Waller, Garry S; Campbell, Margaret

    2011-01-05

    Internationally, research on child maltreatment-related injuries has been hampered by a lack of available routinely collected health data to identify cases, examine causes, identify risk factors and explore health outcomes. Routinely collected hospital separation data coded using the International Classification of Diseases and Related Health Problems (ICD) system provide an internationally standardised data source for classifying and aggregating diseases, injuries, causes of injuries and related health conditions for statistical purposes. However, there has been limited research to examine the reliability of these data for child maltreatment surveillance purposes. This study examined the reliability of coding of child maltreatment in Queensland, Australia. A retrospective medical record review and recoding methodology was used to assess the reliability of coding of child maltreatment. A stratified sample of hospitals across Queensland was selected for this study, and a stratified random sample of cases was selected from within those hospitals. In 3.6% of cases the coders disagreed on whether any maltreatment code could be assigned (definite or possible) versus no maltreatment being assigned (unintentional injury), giving a sensitivity of 0.982 and specificity of 0.948. The review of these cases where discrepancies existed revealed that all cases had some indications of risk documented in the records. 15.5% of cases originally assigned a definite or possible maltreatment code were recoded to a more or less definite stratum. In terms of the number and type of maltreatment codes assigned, the auditor assigned a greater number of maltreatment types based on the medical documentation than the original coder assigned (22% of the auditor-coded cases had more than one maltreatment type assigned compared to only 6% of the originally coded data). The maltreatment types which were the most 'under-coded' by the original coder were psychological abuse and neglect.
Cases coded with a sexual abuse code showed the highest level of reliability. Given the increasing international attention being given to improving the uniformity of reporting of child-maltreatment related injuries and the emphasis on the better utilisation of routinely collected health data, this study provides an estimate of the reliability of maltreatment-specific ICD-10-AM codes assigned in an inpatient setting.
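    The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions; a minimal sketch (the cell counts below are hypothetical, chosen only so the rates land near the reported 0.982 and 0.948, and are not the study's actual data):

```python
def sensitivity(tp, fn):
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical agreement counts for illustration only
tp, fn, tn, fp = 165, 3, 91, 5
sens = sensitivity(tp, fn)   # ~0.982: maltreatment cases both coders flagged
spec = specificity(tn, fp)   # ~0.948: unintentional-injury cases both agreed on
```

    Here "positive" means any maltreatment code (definite or possible) was assigned, and "negative" means the case was coded as unintentional injury.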

  1. Reliability of Routinely Collected Hospital Data for Child Maltreatment Surveillance

    PubMed Central

    2011-01-01

    Background Internationally, research on child maltreatment-related injuries has been hampered by a lack of available routinely collected health data to identify cases, examine causes, identify risk factors and explore health outcomes. Routinely collected hospital separation data coded using the International Classification of Diseases and Related Health Problems (ICD) system provide an internationally standardised data source for classifying and aggregating diseases, injuries, causes of injuries and related health conditions for statistical purposes. However, there has been limited research to examine the reliability of these data for child maltreatment surveillance purposes. This study examined the reliability of coding of child maltreatment in Queensland, Australia. Methods A retrospective medical record review and recoding methodology was used to assess the reliability of coding of child maltreatment. A stratified sample of hospitals across Queensland was selected for this study, and a stratified random sample of cases was selected from within those hospitals. Results In 3.6% of cases the coders disagreed on whether any maltreatment code could be assigned (definite or possible) versus no maltreatment being assigned (unintentional injury), giving a sensitivity of 0.982 and specificity of 0.948. The review of these cases where discrepancies existed revealed that all cases had some indications of risk documented in the records. 15.5% of cases originally assigned a definite or possible maltreatment code were recoded to a more or less definite stratum. In terms of the number and type of maltreatment codes assigned, the auditor assigned a greater number of maltreatment types based on the medical documentation than the original coder assigned (22% of the auditor-coded cases had more than one maltreatment type assigned compared to only 6% of the originally coded data).
The maltreatment types which were the most 'under-coded' by the original coder were psychological abuse and neglect. Cases coded with a sexual abuse code showed the highest level of reliability. Conclusion Given the increasing international attention being given to improving the uniformity of reporting of child-maltreatment related injuries and the emphasis on the better utilisation of routinely collected health data, this study provides an estimate of the reliability of maltreatment-specific ICD-10-AM codes assigned in an inpatient setting. PMID:21208411

  2. Rocky Mountain Arsenal Ecological Chemical Data (1984-1985)

    DTIC Science & Technology

    1986-03-01

    Type Wet Areas Code Area LK Lake MT Marshy Type PD Pond C-1 APPENDIX D TISSUE CODES Code Tissue BRA Brain FIL Filet EDP Edible Portion LIV Liver MUS...ctus 472 Croton texensis Croton 473 Cryptantha fendleri Fendler's Cryptantha 474 Cucurbita foetidissima Wild Gourd 475 Cymopterus montanus Pink Cym...adenocaulon Northern willow-herb 484 Eragrostis cilianensis Stinkgrass 485 Eriogonum annuum Tall Eriogonum 486 Erigeron divergens Spreading Fleabane 487

  3. Wound center facility billing: A retrospective analysis of time, wound size, and acuity scoring for determining facility level of service.

    PubMed

    Fife, Caroline E; Walker, David; Farrow, Wade; Otto, Gordon

    2007-01-01

    Outpatient wound center facility reimbursement for Medicare beneficiaries can be a challenge to determine and obtain. To compare methods of calculating facility service levels for outpatient wound centers and to demonstrate the advantages of an acuity-based billing system (one that incorporates components of facility work not reimbursable through procedure codes and that represents an activity-based costing approach to medical billing), a retrospective study of 5,098 patient encounters contained in a wound care-specific electronic medical record database was conducted. Approximately 500 patient visits to the outpatient wound center of a Texas regional hospital between April 2003 and November 2004 were categorized by service level in documentation and facility management software. Visits previously billed using a time-based system were compared to the Centers for Medicare and Medicaid Services' proposed three-tiered wound size-based system. The time-based system also was compared to an acuity-based scoring system. The Pearson correlation coefficient between billed level of service by time and estimated level of service by acuity was 0.442, and the majority of follow-up visits were billed as Level 3 and above (on a 1-to-5 time-based scale), confirming that time is not a surrogate for actual work performed. Wound size also was found to be unrelated to service level (Pearson correlation = 0.017), and 97% of wound areas were < 100 cm². The acuity-based scoring system produced a near-normal distribution of results, producing more mid-range billings than extremes; no other method produced this distribution. Hospital-based outpatient wound centers should develop, review, and refine acuity score-based models on which to determine billed level of service.

  4. Do HCAHPS Doctor Communication Scores Reflect the Communication Skills of the Attending on Record? A Cautionary Tale from a Tertiary-Care Medical Service.

    PubMed

    Velez, Vicente J; Kaw, Roop; Hu, Bo; Frankel, Richard M; Windover, Amy K; Bokar, Dan; Rish, Julie M; Rothberg, Michael B

    2017-06-01

Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) scores measure patient satisfaction with hospital care. It is not known if these reflect the communication skills of the attending physician on record. The Four Habits Coding Scheme (4HCS) is a validated instrument that measures bedside physician communication skills according to 4 habits, namely: investing in the beginning, eliciting the patient's perspective, demonstrating empathy, and investing in the end. To investigate whether the 4HCS correlates with provider HCAHPS scores. Using a cross-sectional design, consenting hospitalist physicians (n = 28) were observed on inpatient rounds during 3 separate encounters. We compared hospitalists' 4HCS scores with their doctor communication HCAHPS scores to assess the degree to which these correlated with inpatient physician communication skills. We performed sensitivity analysis excluding scores returned by patients cared for by more than 1 hospitalist. A total of 1003 HCAHPS survey responses were available. Pearson correlation between 4HCS and doctor communication scores was not significant, at 0.098 (-0.285, 0.455; P = 0.619). Also, no significant correlations were found between each habit and HCAHPS. When including only scores attributable to 1 hospitalist, Pearson correlation between the empathy habit and the HCAHPS respect score was 0.515 (0.176, 0.745; P = 0.005). Between empathy and overall doctor communication, it was 0.442 (0.082, 0.70; P = 0.019). Attending-of-record HCAHPS scores do not correlate with 4HCS. After excluding patients cared for by more than 1 hospitalist, demonstrating empathy did correlate with the doctor communication and respect HCAHPS scores. Journal of Hospital Medicine 2017;12:421-427. © 2017 Society of Hospital Medicine

  5. HIV self-care practices during pregnancy and maternal health outcomes among HIV-positive postnatal mothers aged 18-35 years at Mbuya Nehanda maternity hospital.

    PubMed

    Dodzo, Lilian Gertrude; Mahaka, Hilda Tandazani; Mukona, Doreen; Zvinavashe, Mathilda; Haruzivishe, Clara

    2017-06-01

HIV-related conditions are one of the indirect causes of maternal deaths in Zimbabwe, and the HIV prevalence rate was estimated to be 13.63% in 2009. The study utilised a descriptive correlational design on 80 pregnant women who were HIV positive at Mbuya Nehanda maternity hospital in Harare, Zimbabwe. Participants comprised a random sample of 80 postnatal mothers. Permission to carry out the study was obtained from the respective review boards. Participants signed an informed consent. Data were collected using a structured questionnaire and record review from 1 to 20 March 2012. Interviews were done in a private room and code numbers were used to identify the participants. Completed questionnaires were kept in a lockable cupboard and the researcher had sole access to them. Data were analysed using the Statistical Package for Social Sciences (SPSS) version 12. Descriptive statistics were used to analyse data on demographics, maternal health outcomes and self-care practices. Inferential statistics (Pearson's correlation and regression analysis) were used to analyse the relationship between self-care practices and maternal health outcomes. Self-care practices were good, with a mean score of 8 out of 16; the majority (71.3%) fell within the good category. Maternal outcomes were poor, with a mean score of 28 out of 62 and 67.5% falling in the poor category. Pearson's correlation indicated a weak but significant positive relationship (r = .317, p < .01). Regression analysis yielded R² = .10, implying that self-care practices explained 10% of the variance observed in maternal health outcomes. More research needs to be carried out to identify other variables affecting maternal outcomes in HIV-positive pregnant women.

  6. [Gluten: Is the information available on the Internet valid?]

    PubMed

    Banti, T; Fievet, L; Fabre, A

    2017-10-01

The Internet provides easy access to health information, but the quality and validity of this information vary. The aim was to evaluate the quality of website structures and of the information provided on celiac disease (CD), gluten sensitivity (GS), and wheat allergy (WA). The websites addressing CD, GS, and WA appearing on the first two pages of Google, Yahoo, and Bing from seven selected queries were investigated. We initially assessed the website structures with one instrument (Netscoring) and the presence of certification (the Health On the Net quality label, HONcode). Then we evaluated the content of each website concerning the information about CD, GS, and WA. Our repository was based on the most recent guidelines of the European Society of Pediatric Gastroenterology, Hepatology, and Nutrition (ESPGHAN) and the World Gastroenterology Organization (WGO) published in 2012. The websites were classified into eight categories. One hundred and five websites were included. Twenty-one websites obtained a sufficient score with the Netscoring instrument (average 113.6/312). There was a significant correlation between the referenced websites analyzed and the grades obtained with the Netscoring instrument (Pearson=0.39, P=0.2×10⁻⁵): websites of scientific societies (11.8/18), community websites (9.44/18), and association websites (9.4/18). There was a significant correlation between the results obtained for the websites on CD, GS, and WA and the results obtained for the websites with the Netscoring instruments (Pearson=0.41, P=2.6×10⁻⁶). Only three websites were consistent with the guidelines on CD, GS, and WA. The websites were partially in agreement with the guidelines. To date, the pediatrician remains the main actor in parental guidance concerning gluten information. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  7. Online Radiology Reporting with Peer Review as a Learning and Feedback Tool in Radiology; Implementation, Validity, and Student Impressions.

    PubMed

    McEvoy, Fintan J; Shen, Nicholas W; Nielsen, Dorte H; Buelund, Lene E; Holm, Peter

    2017-02-01

Communicating radiological reports to peers has pedagogical value. Students may be uneasy with the process due to a lack of communication and peer review skills or to their failure to see value in the process. We describe a communication exercise with peer review in an undergraduate veterinary radiology course. The computer code used to manage the course and deliver images online is reported, and we provide links to the executable files. We tested whether undergraduate peer review of radiological reports has validity and describe student impressions of the learning process. Peer review scores for student-generated radiological reports were compared to scores obtained in the summative multiple choice (MCQ) examination for the course. Student satisfaction was measured using a bespoke questionnaire. There was a weak positive correlation (Pearson correlation coefficient = 0.32, p < 0.01) between the peer review scores students received and the scores those students obtained in the MCQ examination. The difference in peer review scores received by students grouped according to their level of course performance (high vs. low) was statistically significant (p < 0.05). No correlation was found between peer review scores awarded by the students and the scores they obtained in the MCQ examination (Pearson correlation coefficient = 0.17, p = 0.14). In conclusion, we have created a realistic radiology imaging exercise with readily available software. The peer review scores are valid in that, to a limited degree, they reflect students' future performance in an examination. Students valued the process of learning to communicate radiological findings but did not fully appreciate the value of peer review.

  8. Methods and results of peak-flow frequency analyses for streamgages in and bordering Minnesota, through water year 2011

    USGS Publications Warehouse

    Kessler, Erich W.; Lorenz, David L.; Sanocki, Christopher A.

    2013-01-01

    Peak-flow frequency analyses were completed for 409 streamgages in and bordering Minnesota having at least 10 systematic peak flows through water year 2011. Selected annual exceedance probabilities were determined by fitting a log-Pearson type III probability distribution to the recorded annual peak flows. A detailed explanation of the methods that were used to determine the annual exceedance probabilities, the historical period, acceptable low outliers, and analysis method for each streamgage are presented. The final results of the analyses are presented.
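
The fitting procedure described above, a log-Pearson Type III distribution applied to annual peak flows, can be sketched with SciPy's `pearson3` distribution: fit the log-transformed peaks, then invert the CDF at the desired annual exceedance probability. The peak-flow values below are hypothetical, and maximum-likelihood fitting is used for brevity (USGS practice follows Bulletin 17B/17C method-of-moments estimation with regional skew weighting, which this sketch does not implement).

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak flows (cfs) for a single streamgage
peaks = np.array([1200, 950, 3100, 780, 2200, 1500, 4100, 980, 1750, 2600,
                  1320, 890, 1980, 3600, 1100])

# Log-Pearson Type III: fit a Pearson Type III distribution to log10(Q)
log_q = np.log10(peaks)
skew, loc, scale = stats.pearson3.fit(log_q)

# Discharge with a 1% annual exceedance probability (the "100-year" flood)
q_1pct = 10 ** stats.pearson3.ppf(1 - 0.01, skew, loc=loc, scale=scale)
print(f"1% AEP discharge: {q_1pct:.0f} cfs")
```

The same `ppf` call with other probabilities (0.5, 0.1, 0.02, ...) yields the full set of selected annual exceedance probabilities.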

  9. Comparison of methods for estimating flood magnitudes on small streams in Georgia

    USGS Publications Warehouse

    Hess, Glen W.; Price, McGlone

    1989-01-01

    The U.S. Geological Survey has collected flood data for small, natural streams at many sites throughout Georgia during the past 20 years. Flood-frequency relations were developed for these data using four methods: (1) observed (log-Pearson Type III analysis) data, (2) rainfall-runoff model, (3) regional regression equations, and (4) map-model combination. The results of the latter three methods were compared to the analyses of the observed data in order to quantify the differences in the methods and determine if the differences are statistically significant.

  10. Finite-block-length analysis in classical and quantum information theory.

    PubMed

    Hayashi, Masahito

    2017-01-01

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.

  11. Finite-block-length analysis in classical and quantum information theory

    PubMed Central

    HAYASHI, Masahito

    2017-01-01

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects. PMID:28302962

  12. Adverse Events Involving Radiation Oncology Medical Devices: Comprehensive Analysis of US Food and Drug Administration Data, 1991 to 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Michael J.; Department of Radiation Oncology, University of California Irvine School of Medicine, Irvine, California; Marshall, Deborah C.

Purpose: Radiation oncology relies on rapidly evolving technology and highly complex processes. The US Food and Drug Administration collects reports of adverse events related to medical devices. We sought to characterize all events involving radiation oncology devices (RODs) from the US Food and Drug Administration's postmarket surveillance Manufacturer and User Facility Device Experience (MAUDE) database, comparing these with non-radiation oncology devices. Methods and Materials: MAUDE data on RODs from 1991 to 2015 were sorted into 4 product categories (external beam, brachytherapy, planning systems, and simulation systems) and 5 device problem categories (software, mechanical, electrical, user error, and dose delivery impact). Outcomes included whether the device was evaluated by the manufacturer, adverse event type, remedial action, problem code, device age, and time since 510(k) approval. Descriptive statistics were performed with linear regression of time-series data. Results for RODs were compared with those for other devices by the Pearson χ² test for categorical data and the 2-sample Kolmogorov-Smirnov test for distributions. Results: There were 4234 ROD and 4,985,698 other device adverse event reports. Adverse event reports increased over time, and events involving RODs peaked in 2011. Most ROD reports involved external beam therapy (50.8%), followed by brachytherapy (24.9%) and treatment planning systems (21.6%). The top problem types were software (30.4%), mechanical (20.9%), and user error (20.4%). RODs differed significantly from other devices in each outcome (P<.001). RODs were more likely to be evaluated by the manufacturer after an event (46.9% vs 33.0%) but less likely to be recalled (10.5% vs 37.9%) (P<.001). Device age and time since 510(k) approval were shorter among RODs (P<.001). Conclusions: Compared with other devices, RODs may experience adverse events sooner after manufacture and market approval. Close postmarket surveillance, improved software design, and manufacturer-user training may help mitigate these events.
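
The two comparison tests named in the abstract (Pearson χ² for categorical outcomes, two-sample Kolmogorov-Smirnov for distributions) can be sketched as follows. The 2×2 counts are reconstructed approximately from the reported totals and percentages (46.9% vs 33.0% manufacturer evaluation), and the device-age samples are purely illustrative.

```python
import numpy as np
from scipy import stats

# 2x2 table: rows = ROD vs other devices, cols = evaluated by the
# manufacturer yes/no (counts approximated from the reported percentages)
table = np.array([[1986, 2248],
                  [1645280, 3340418]])
chi2, p, dof, expected = stats.chi2_contingency(table)

# Two-sample KS test on (purely illustrative) device-age samples, in years
ages_rod = np.array([0.5, 1.2, 2.0, 3.1, 0.8, 1.5, 2.2, 0.9])
ages_other = np.array([2.5, 4.0, 3.2, 5.1, 6.0, 2.8, 4.4, 3.9])
ks_stat, ks_p = stats.ks_2samp(ages_rod, ages_other)

print(f"chi2={chi2:.1f} (dof={dof}, p={p:.3g}); KS D={ks_stat:.2f}, p={ks_p:.3g}")
```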

  13. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    NASA Astrophysics Data System (ADS)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States Standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot quality percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise, the same question arises with consumer risk which is necessarily associated with type II error. The resolution of these questions is new to the literature. 
The article presents R code throughout.
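
The paper's central computational point, that the hypergeometric distribution gives exact acceptance probabilities for sampling without replacement where the binomial is only an approximation, can be illustrated with a small sketch. The single-sampling plan parameters (N, n, c) below are hypothetical, and the sketch is in Python rather than the R the article uses.

```python
from scipy import stats

# Illustrative single-sampling plan: a lot of N items containing D
# defectives; draw n items without replacement, accept if at most c
# defectives are found.
N, n, c = 500, 50, 2

def p_accept(D):
    # Exact acceptance probability from the hypergeometric CDF
    return stats.hypergeom.cdf(c, N, D, n)

def p_accept_binom(D):
    # Binomial approximation (as if sampling with replacement)
    return stats.binom.cdf(c, n, D / N)

for D in (5, 25, 50):
    print(D, round(float(p_accept(D)), 4), round(float(p_accept_binom(D)), 4))
```

Sweeping D from 0 to N traces out the plan's operating characteristic curve, from which producer risk (at the AQL) and consumer risk (at the LTPD) can be read off directly.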

  14. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 microm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
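
A symmetric Pearson type VII profile of the kind used to fit the rocking curve has the form f(x) = A[1 + ((x − x0)/w)²(2^(1/m) − 1)]^(−m), where w is the half-width at half-maximum and m controls the shape between Lorentzian (m = 1) and Gaussian (m → ∞). A minimal curve-fitting sketch on synthetic data (not the authors' code; all parameter values are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def pearson_vii(x, amp, x0, w, m):
    # Symmetric Pearson type VII; w is the HWHM, m the shape exponent
    return amp * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# Synthetic "rocking curve" data in arbitrary units
x = np.linspace(-10, 10, 201)
rng = np.random.default_rng(0)
y = pearson_vii(x, 1.0, 0.5, 2.0, 1.8) + rng.normal(0, 0.01, x.size)

popt, _ = curve_fit(pearson_vii, x, y, p0=[1.0, 0.0, 1.0, 1.0],
                    bounds=([0, -10, 1e-3, 0.5], [10, 10, 10, 50]))
amp_f, x0_f, w_f, m_f = popt
print(f"center={x0_f:.3f}, HWHM={w_f:.3f}, shape m={m_f:.3f}")
```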

  15. Novel four-sided neural probe fabricated by a thermal lamination process of polymer films.

    PubMed

    Shin, Soowon; Kim, Jae-Hyun; Jeong, Joonsoo; Gwon, Tae Mok; Lee, Seung-Hee; Kim, Sung June

    2017-02-15

Ideally, neural probes should have channels with a three-dimensional (3-D) configuration to record the activities of 3-D neural circuits. Many types of 3-D neural probes have been developed; however, most of them were designed as an array of multiple shanks with electrodes located along one side of the shanks. We developed a novel liquid crystal polymer (LCP)-based neural probe with four-sided electrodes. This probe has electrodes on four sides of the shank, i.e., the front, back and two sidewalls. To generate the proposed configuration of the electrodes, we used a thermal lamination process involving LCP films and laser micromachining. The proposed four-sided neural probe was used to successfully perform in vivo multichannel neural recording in the mouse primary somatosensory cortex. The multichannel neural recording showed that the proposed four-sided neural probe can record spiking activities from a more diverse neuronal population than single-sided probes. This was confirmed by a pairwise Pearson correlation coefficient (Pearson's r) analysis and a cross-correlation analysis. The developed four-sided neural probe can be used to record various signals from a complex neural network. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Identifying the Source of Misfit in Item Response Theory Models.

    PubMed

    Liu, Yang; Maydeu-Olivares, Alberto

    2014-01-01

When an item response theory model fails to fit adequately, the items for which the model provides a good fit and those for which it does not must be determined. To this end, we compare the performance of several fit statistics for item pairs with known asymptotic distributions under maximum likelihood estimation of the item parameters: (a) a mean and variance adjustment to bivariate Pearson's X², (b) a bivariate subtable analog to Reiser's (1996) overall goodness-of-fit test, (c) a z statistic for the bivariate residual cross product, and (d) Maydeu-Olivares and Joe's (2006) M2 statistic applied to bivariate subtables. The unadjusted Pearson's X² with heuristically determined degrees of freedom is also included in the comparison. For binary and ordinal data, our simulation results suggest that the z statistic has the best Type I error and power behavior among all the statistics under investigation when the observed information matrix is used in its computation. However, if one has to use the cross-product information, the mean and variance adjusted X² is recommended. We illustrate the use of pairwise fit statistics in 2 real-data examples and discuss possible extensions of the current research in various directions.

  17. Stability of physical activity, fitness components and diet quality indices.

    PubMed

    Mertens, E; Clarys, P; Mullie, P; Lefevre, J; Charlier, R; Knaeps, S; Huybrechts, I; Deforche, B

    2017-04-01

Regular physical activity (PA), a high level of fitness and a high diet quality are positively associated with health. However, information about the stability of fitness components and diet quality indices is limited. This study aimed to evaluate the stability of those parameters. This study includes 652 adults (men: 57.56 (10.28) years; women: 55.90 (8.34) years at follow-up) who participated in 2002-2004 and returned for follow-up at the Policy Research Centre Leuven in 2012-2014. Minutes of sport per day and physical activity level (PAL) were calculated from the Flemish Physical Activity Computerized Questionnaire. Cardiorespiratory fitness (CRF), morphological fitness (MORF; body mass index and waist circumference) and metabolic fitness (METF; blood cholesterol and triglycerides) were used as fitness components. Diet quality indices (Healthy Eating Index-2010 (HEI), Diet Quality Index (DQI), Mediterranean Diet Score (MDS)) were calculated from a diet record. Tracking coefficients were calculated using Pearson/Spearman correlation coefficients (r_Pearson) and intra-class correlation coefficients (r_ICC). In both men (r_Pearson = r_ICC = 0.51) and women (r_Pearson = 0.62, r_ICC = 0.60) PAL showed good stability, while minutes of sport remained stable in women (r_Pearson = r_ICC = 0.57) but less so in men (r_Pearson = r_ICC = 0.45). Most fitness components remained stable (r ≥ 0.50) except some METF components in women. In general the diet quality indices and their components were unstable (r < 0.50). PAL and the majority of the fitness components remained stable, while diet quality was unstable over 10 years. For unstable parameters such as diet quality, measurements are needed at both time points in prospective research.
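
The two kinds of tracking coefficient used in this study can be sketched as follows. The baseline/follow-up values are hypothetical, Pearson's r comes from SciPy, and the ICC shown is the one-way random-effects ICC(1,1) computed from the ANOVA decomposition; the abstract does not specify which ICC form the authors used.

```python
import numpy as np
from scipy import stats

# Hypothetical baseline and 10-year follow-up values of one parameter
baseline = np.array([1.4, 1.6, 1.8, 1.5, 2.0, 1.7, 1.9, 1.3, 1.6, 1.8])
follow_up = np.array([1.5, 1.7, 1.7, 1.6, 2.1, 1.6, 2.0, 1.4, 1.7, 1.9])

# Pearson tracking coefficient
r, p = stats.pearsonr(baseline, follow_up)

# One-way random-effects ICC(1,1) from between/within mean squares
data = np.column_stack([baseline, follow_up])
n, k = data.shape
subj_means = data.mean(axis=1)
grand_mean = data.mean()
ms_between = k * np.sum((subj_means - grand_mean) ** 2) / (n - 1)
ms_within = np.sum((data - subj_means[:, None]) ** 2) / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

print(f"r_Pearson = {r:.2f}, r_ICC = {icc:.2f}")
```

Under the study's convention, values of r at or above 0.50 at both time points would count the parameter as stable.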

  18. The utility of ultrasound and magnetic resonance imaging versus surgery for the characterization of müllerian anomalies in the pediatric and adolescent population.

    PubMed

    Santos, X M; Krishnamurthy, R; Bercaw-Pratt, J L; Dietrich, J E

    2012-06-01

To evaluate the utility of transabdominal ultrasound and magnetic resonance imaging in the evaluation of American Society for Reproductive Medicine (ASRM)-classified müllerian anomalies compared to surgical findings in the pediatric and adolescent population. Retrospective chart review. Tertiary academic center. Thirty-eight patients with müllerian anomalies seen in our pediatric and adolescent gynecology clinic were identified on the basis of ICD-9 codes and of having undergone magnetic resonance imaging at Texas Children's Hospital between 2004 and 2009. None. Correlation among transabdominal ultrasound and magnetic resonance imaging findings with surgical findings. Mean age was 12.2 (± 4.1) years. Twenty-eight patients underwent magnetic resonance imaging and required surgical intervention, and 88.5% demonstrated correlative consistency with surgical findings. Twenty-two patients underwent ultrasound, magnetic resonance imaging, and surgery, which revealed consistency between ultrasound and surgical findings in 59.1% of cases and between magnetic resonance imaging and surgical findings in 90.9%. In ASRM diagnoses evaluated by magnetic resonance imaging, surgical findings correlated in 92% (Pearson 0.89). Overall, 55.2% of patients had a renal malformation. Magnetic resonance imaging is the gold standard imaging modality for müllerian anomalies and is an effective technique for noninvasive evaluation and accurate classification of the type of anomaly in the pediatric and adolescent population. Magnetic resonance imaging should be considered as an adjunct to transabdominal ultrasound to evaluate müllerian anomalies. Copyright © 2012 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  19. Cheiloscopy and dactyloscopy: Do they dictate personality patterns?

    PubMed

    Abidullah, Mohammed; Kumar, M Naveen; Bhorgonde, Kavita D; Reddy, D Shyam Prasad

    2015-01-01

Cheiloscopy and dactyloscopy are both well-established forensic tools used for individual identification in any scenario, be it a crime scene or a civil case. Like finger prints, lip prints are unique and distinguishable for every individual, but their relationship to personality types has not been established beyond the hypothesis that finger prints could explain personality patterns. The study aimed to record lip and finger prints and correlate them with the character/personality of a person. The study sample comprised 200 subjects, 100 males and 100 females, aged between 18 and 30 years. For recording lip prints, brown/pink-colored lipstick was applied to the lips and the subjects were asked to spread it uniformly. Lip prints were traced in the normal rest position on plain white bond paper. For recording finger prints, imprints of the fingers were taken on plain white bond paper using an ink pad. The collected prints were visualized using a magnifying lens. To record the character of each person, a pro forma manual for the multivariable personality inventory by Dr. BC Muthayya was used. The data obtained were subjected to statistical analysis, in particular Pearson's Chi-square test, and correlation/association between the groups was also studied. In males, the predominant lip pattern recorded was Type I with a whorls-type finger pattern, the character being ego ideal, pessimistic, introverted, and dogmatic; in females, the predominant lip pattern recorded was Type II with a loops-type finger pattern, the character being neurotic, need-achieving, and dominant.
Many studies on lip pattern, finger pattern, palatal rugae, etc., for individual identification and gender determination exist, but correlative studies are scanty. This is the first study to correlate these patterns, that is, lip and finger patterns, with the character of a person. With this study we conclude that this correlation can be used as an adjunct in the investigatory process in forensic sciences.

  20. 50 CFR Table 14c - At-sea Operation Type Codes To Be Used as Port Codes for Vessels Matching This Type of Operation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE OFF ALASKA Amendment 80 Program Economic data report (EDR) for the...

  1. Workplace Incivility: Worker and Organizational Antecedents and Outcomes

    ERIC Educational Resources Information Center

    Bartlett, James E., II; Bartlett, Michelle E.; Reio, Thomas G., Jr.

    2008-01-01

    Unresolved workplace conflicts represent the largest reducible costs to an organization (Keenan & Newton, 1985). As incivility increases (Buhler, 2003; Pearson, Andersson, & Wegner, 2001; Pearson & Porath, 2005) more research is being conducted (Tepper, Duffy, Henle, & Lambert, 2006; Vickers, 2006). This review examined antecedents (variables that…

  2. Porting plasma physics simulation codes to modern computing architectures using the libmrc framework

    NASA Astrophysics Data System (ADS)

    Germaschewski, Kai; Abbott, Stephen

    2015-11-01

Available computing power has continued to grow exponentially even after single-core performance saturated in the last decade. The increase has since been driven by greater parallelism, both by using more cores and by exploiting more parallelism within each core, e.g. in GPUs and the Intel Xeon Phi. Adapting existing plasma physics codes is challenging, in particular as there is no single programming model that covers current and future architectures. We will introduce the open-source libmrc framework that has been used to modularize and port three plasma physics codes: The extended MHD code MRCv3 with implicit time integration and curvilinear grids; the OpenGGCM global magnetosphere model; and the particle-in-cell code PSC. libmrc consolidates basic functionality needed for simulations based on structured grids (I/O, load balancing, time integrators), and also introduces a parallel object model that makes it possible to maintain multiple implementations of computational kernels, on e.g. conventional processors and GPUs. It handles data layout conversions and enables us to port performance-critical parts of a code to a new architecture step-by-step, while the rest of the code can remain unchanged. We will show examples of the performance gains and some physics applications.

  3. Performance evaluation for epileptic electroencephalogram (EEG) detection by using Neyman-Pearson criteria and a support vector machine

    NASA Astrophysics Data System (ADS)

    Wang, Chun-mei; Zhang, Chong-ming; Zou, Jun-zhong; Zhang, Jian

    2012-02-01

The diagnosis of several neurological disorders is based on the detection of typical pathological patterns in electroencephalograms (EEGs). This is a time-consuming task requiring significant training and experience. A lot of effort has been devoted to developing automatic detection techniques which might help not only in accelerating this process but also in avoiding disagreement among readers of the same record. In this work, Neyman-Pearson criteria and a support vector machine (SVM) are applied for detecting an epileptic EEG. Decision making is performed in two stages: feature extraction by computing the wavelet coefficients and the approximate entropy (ApEn), and detection by using Neyman-Pearson criteria and an SVM. Then the detection performance of the proposed method is evaluated. Simulation results demonstrate that the wavelet coefficients and the ApEn are features that represent the EEG signals well. By comparison with Neyman-Pearson criteria, an SVM applied to these features achieved higher detection accuracies.
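
The Neyman-Pearson idea underlying the detector, fix the false-alarm probability and maximize detection power via a likelihood-ratio threshold, can be sketched in the simplest Gaussian mean-shift setting. This is illustrative only; the paper applies the criterion to wavelet and ApEn features of EEG, not to raw sample means.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha = 0.05                      # target false-alarm probability
mu0, mu1, sigma, n = 0.0, 1.0, 1.0, 16

# Under H0 the sample mean is N(mu0, sigma^2/n); choose the threshold so
# that P(mean > tau | H0) = alpha. By the Neyman-Pearson lemma, this
# likelihood-ratio test maximizes detection power at that alpha.
tau = mu0 + sigma / np.sqrt(n) * stats.norm.ppf(1 - alpha)

# Monte Carlo check of the false-alarm and detection rates
h0 = rng.normal(mu0, sigma, (10000, n)).mean(axis=1)
h1 = rng.normal(mu1, sigma, (10000, n)).mean(axis=1)
fa = (h0 > tau).mean()
det = (h1 > tau).mean()
print(f"false-alarm rate: {fa:.3f}, detection rate: {det:.3f}")
```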

  4. On the Equivalence of a Likelihood Ratio of Drasgow, Levine, and Zickar (1996) and the Statistic Based on the Neyman-Pearson Lemma of Belov (2016).

    PubMed

    Sinharay, Sandip

    2017-03-01

Levine and Drasgow (1988) suggested an approach based on the Neyman-Pearson lemma to detect examinees whose response patterns are "aberrant" due to cheating, language issues, and so on. Belov (2016) used the approach of Levine and Drasgow (1988) to suggest a statistic based on the Neyman-Pearson lemma (SBNPL) to detect item preknowledge when the investigator knows which items are compromised. This brief report proves that the SBNPL of Belov (2016) is equivalent to a statistic suggested for the same purpose by Drasgow, Levine, and Zickar (1996) 20 years earlier.

  5. An Evaluation of Comparability between NEISS and ICD-9-CM Injury Coding

    PubMed Central

    Thompson, Meghan C.; Wheeler, Krista K.; Shi, Junxin; Smith, Gary A.; Xiang, Huiyun

    2014-01-01

    Objective To evaluate the National Electronic Injury Surveillance System’s (NEISS) comparability with a data source that uses ICD-9-CM coding. Methods A sample of NEISS cases from a children’s hospital in 2008 was selected, and cases were linked with their original medical record. Medical records were reviewed and an ICD-9-CM code was assigned to each case. Cases in the NEISS sample that were non-injuries by ICD-9-CM standards were identified. A bridging matrix between the NEISS and ICD-9-CM injury coding systems, by type of injury classification, was proposed and evaluated. Results Of the 2,890 cases reviewed, 13.32% (n = 385) were non-injuries according to the ICD-9-CM diagnosis. Using the proposed matrix, the comparability of the NEISS with ICD-9-CM coding was favorable among injury cases (κ = 0.87, 95% CI: 0.85–0.88). The distribution of injury types among the entire sample was similar for the two systems, with percentage differences ≥1% for only open wounds or amputation, poisoning, and other or unspecified injury types. Conclusions There is potential for conducting comparable injury research using NEISS and ICD-9-CM data. Due to the inclusion of some non-injuries in the NEISS and some differences in type of injury definitions between NEISS and ICD-9-CM coding, best practice for studies using NEISS data obtained from the CPSC should include manual review of case narratives. Use of the standardized injury and injury type definitions presented in this study will facilitate more accurate comparisons in injury research. PMID:24658100

  6. An evaluation of comparability between NEISS and ICD-9-CM injury coding.

    PubMed

    Thompson, Meghan C; Wheeler, Krista K; Shi, Junxin; Smith, Gary A; Xiang, Huiyun

    2014-01-01

    To evaluate the National Electronic Injury Surveillance System's (NEISS) comparability with a data source that uses ICD-9-CM coding. A sample of NEISS cases from a children's hospital in 2008 was selected, and cases were linked with their original medical record. Medical records were reviewed and an ICD-9-CM code was assigned to each case. Cases in the NEISS sample that were non-injuries by ICD-9-CM standards were identified. A bridging matrix between the NEISS and ICD-9-CM injury coding systems, by type of injury classification, was proposed and evaluated. Of the 2,890 cases reviewed, 13.32% (n = 385) were non-injuries according to the ICD-9-CM diagnosis. Using the proposed matrix, the comparability of the NEISS with ICD-9-CM coding was favorable among injury cases (κ = 0.87, 95% CI: 0.85-0.88). The distribution of injury types among the entire sample was similar for the two systems, with percentage differences ≥1% for only open wounds or amputation, poisoning, and other or unspecified injury types. There is potential for conducting comparable injury research using NEISS and ICD-9-CM data. Due to the inclusion of some non-injuries in the NEISS and some differences in type of injury definitions between NEISS and ICD-9-CM coding, best practice for studies using NEISS data obtained from the CPSC should include manual review of case narratives. Use of the standardized injury and injury type definitions presented in this study will facilitate more accurate comparisons in injury research.
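
    The agreement statistic reported here, Cohen's kappa, corrects the observed agreement between two coding systems for the agreement expected by chance. A minimal sketch (the example categories are invented for illustration, not drawn from the study's data):

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters coding the same cases."""
    n = len(labels_a)
    cats = set(labels_a) | set(labels_b)
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_exp = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

neiss = ["fracture", "open wound", "poisoning", "fracture"]
icd9  = ["fracture", "open wound", "poisoning", "open wound"]
```

    Perfect agreement gives kappa = 1, chance-level agreement gives 0; the study's κ = 0.87 indicates strong comparability among true injury cases.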

  7. Analysis of the Length of Braille Texts in English Braille American Edition, the Nemeth Code, and Computer Braille Code versus the Unified English Braille Code

    ERIC Educational Resources Information Center

    Knowlton, Marie; Wetzel, Robin

    2006-01-01

    This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…

  8. Contrasting LMS Marketing Approaches

    ERIC Educational Resources Information Center

    Carriere, Brain; Challborn, Carl; Moore, James; Nibourg, Theodorus

    2005-01-01

    The first section of this report examines the CourseCompass learning management system (LMS), made available to educators by the Pearson publishing group as a vehicle for the company's extensive content library. The product's features are discussed, and the implications of Pearson's software/textbook "bundling" policy for the integrity of course…

  9. The Evolution of Pearson's Correlation Coefficient

    ERIC Educational Resources Information Center

    Kader, Gary D.; Franklin, Christine A.

    2008-01-01

    This article describes an activity for developing the notion of association between two quantitative variables. By exploring a collection of scatter plots, the authors propose a nonstandard "intuitive" measure of association; and by examining properties of this measure, they develop the more standard measure, Pearson's Correlation Coefficient. The…

  10. Commercialising Comparison: Pearson Puts the TLC in Soft Capitalism

    ERIC Educational Resources Information Center

    Hogan, Anna; Sellar, Sam; Lingard, Bob

    2016-01-01

    This paper provides a critical policy analysis of "The Learning Curve" (TLC) (2012), an initiative developed by the multinational edu-business, Pearson, in conjunction with the Economist Intelligence Unit. "TLC" exemplifies the commercialising of comparison and the efforts of edu-businesses to strategically position themselves…

  11. Large-scale transmission-type multifunctional anisotropic coding metasurfaces in millimeter-wave frequencies

    NASA Astrophysics Data System (ADS)

    Cui, Tie Jun; Wu, Rui Yuan; Wu, Wei; Shi, Chuan Bo; Li, Yun Bo

    2017-10-01

    We propose fast and accurate designs for large-scale, low-profile, transmission-type anisotropic coding metasurfaces with multiple functions in the millimeter-wave band, based on the antenna-array method. Numerical simulation of an anisotropic coding metasurface of size 30λ × 30λ takes only 20 min with the proposed method, a task commercial software cannot handle on personal computers because of its memory demands. To inspect the performance of coding metasurfaces in the millimeter-wave band, the working frequency is chosen as 60 GHz. Based on convolution operations and holographic theory, the proposed multifunctional anisotropic coding metasurface exhibits different effects under y-polarized and x-polarized incidences. This study extends the frequency range of coding metasurfaces, filling the gap between the microwave and terahertz bands, and implies promising applications in millimeter-wave communication and imaging.

  12. Refining the accuracy of validated target identification through coding variant fine-mapping in type 2 diabetes.

    PubMed

    Mahajan, Anubha; Wessel, Jennifer; Willems, Sara M; Zhao, Wei; Robertson, Neil R; Chu, Audrey Y; Gan, Wei; Kitajima, Hidetoshi; Taliun, Daniel; Rayner, N William; Guo, Xiuqing; Lu, Yingchang; Li, Man; Jensen, Richard A; Hu, Yao; Huo, Shaofeng; Lohman, Kurt K; Zhang, Weihua; Cook, James P; Prins, Bram Peter; Flannick, Jason; Grarup, Niels; Trubetskoy, Vassily Vladimirovich; Kravic, Jasmina; Kim, Young Jin; Rybin, Denis V; Yaghootkar, Hanieh; Müller-Nurasyid, Martina; Meidtner, Karina; Li-Gao, Ruifang; Varga, Tibor V; Marten, Jonathan; Li, Jin; Smith, Albert Vernon; An, Ping; Ligthart, Symen; Gustafsson, Stefan; Malerba, Giovanni; Demirkan, Ayse; Tajes, Juan Fernandez; Steinthorsdottir, Valgerdur; Wuttke, Matthias; Lecoeur, Cécile; Preuss, Michael; Bielak, Lawrence F; Graff, Marielisa; Highland, Heather M; Justice, Anne E; Liu, Dajiang J; Marouli, Eirini; Peloso, Gina Marie; Warren, Helen R; Afaq, Saima; Afzal, Shoaib; Ahlqvist, Emma; Almgren, Peter; Amin, Najaf; Bang, Lia B; Bertoni, Alain G; Bombieri, Cristina; Bork-Jensen, Jette; Brandslund, Ivan; Brody, Jennifer A; Burtt, Noël P; Canouil, Mickaël; Chen, Yii-Der Ida; Cho, Yoon Shin; Christensen, Cramer; Eastwood, Sophie V; Eckardt, Kai-Uwe; Fischer, Krista; Gambaro, Giovanni; Giedraitis, Vilmantas; Grove, Megan L; de Haan, Hugoline G; Hackinger, Sophie; Hai, Yang; Han, Sohee; Tybjærg-Hansen, Anne; Hivert, Marie-France; Isomaa, Bo; Jäger, Susanne; Jørgensen, Marit E; Jørgensen, Torben; Käräjämäki, Annemari; Kim, Bong-Jo; Kim, Sung Soo; Koistinen, Heikki A; Kovacs, Peter; Kriebel, Jennifer; Kronenberg, Florian; Läll, Kristi; Lange, Leslie A; Lee, Jung-Jin; Lehne, Benjamin; Li, Huaixing; Lin, Keng-Hung; Linneberg, Allan; Liu, Ching-Ti; Liu, Jun; Loh, Marie; Mägi, Reedik; Mamakou, Vasiliki; McKean-Cowdin, Roberta; Nadkarni, Girish; Neville, Matt; Nielsen, Sune F; Ntalla, Ioanna; Peyser, Patricia A; Rathmann, Wolfgang; Rice, Kenneth; Rich, Stephen S; Rode, Line; Rolandsson, Olov; Schönherr, Sebastian; Selvin, 
Elizabeth; Small, Kerrin S; Stančáková, Alena; Surendran, Praveen; Taylor, Kent D; Teslovich, Tanya M; Thorand, Barbara; Thorleifsson, Gudmar; Tin, Adrienne; Tönjes, Anke; Varbo, Anette; Witte, Daniel R; Wood, Andrew R; Yajnik, Pranav; Yao, Jie; Yengo, Loïc; Young, Robin; Amouyel, Philippe; Boeing, Heiner; Boerwinkle, Eric; Bottinger, Erwin P; Chowdhury, Rajiv; Collins, Francis S; Dedoussis, George; Dehghan, Abbas; Deloukas, Panos; Ferrario, Marco M; Ferrières, Jean; Florez, Jose C; Frossard, Philippe; Gudnason, Vilmundur; Harris, Tamara B; Heckbert, Susan R; Howson, Joanna M M; Ingelsson, Martin; Kathiresan, Sekar; Kee, Frank; Kuusisto, Johanna; Langenberg, Claudia; Launer, Lenore J; Lindgren, Cecilia M; Männistö, Satu; Meitinger, Thomas; Melander, Olle; Mohlke, Karen L; Moitry, Marie; Morris, Andrew D; Murray, Alison D; de Mutsert, Renée; Orho-Melander, Marju; Owen, Katharine R; Perola, Markus; Peters, Annette; Province, Michael A; Rasheed, Asif; Ridker, Paul M; Rivadineira, Fernando; Rosendaal, Frits R; Rosengren, Anders H; Salomaa, Veikko; Sheu, Wayne H-H; Sladek, Rob; Smith, Blair H; Strauch, Konstantin; Uitterlinden, André G; Varma, Rohit; Willer, Cristen J; Blüher, Matthias; Butterworth, Adam S; Chambers, John Campbell; Chasman, Daniel I; Danesh, John; van Duijn, Cornelia; Dupuis, Josée; Franco, Oscar H; Franks, Paul W; Froguel, Philippe; Grallert, Harald; Groop, Leif; Han, Bok-Ghee; Hansen, Torben; Hattersley, Andrew T; Hayward, Caroline; Ingelsson, Erik; Kardia, Sharon L R; Karpe, Fredrik; Kooner, Jaspal Singh; Köttgen, Anna; Kuulasmaa, Kari; Laakso, Markku; Lin, Xu; Lind, Lars; Liu, Yongmei; Loos, Ruth J F; Marchini, Jonathan; Metspalu, Andres; Mook-Kanamori, Dennis; Nordestgaard, Børge G; Palmer, Colin N A; Pankow, James S; Pedersen, Oluf; Psaty, Bruce M; Rauramaa, Rainer; Sattar, Naveed; Schulze, Matthias B; Soranzo, Nicole; Spector, Timothy D; Stefansson, Kari; Stumvoll, Michael; Thorsteinsdottir, Unnur; Tuomi, Tiinamaija; Tuomilehto, Jaakko; Wareham, 
Nicholas J; Wilson, James G; Zeggini, Eleftheria; Scott, Robert A; Barroso, Inês; Frayling, Timothy M; Goodarzi, Mark O; Meigs, James B; Boehnke, Michael; Saleheen, Danish; Morris, Andrew P; Rotter, Jerome I; McCarthy, Mark I

    2018-04-01

    We aggregated coding variant data for 81,412 type 2 diabetes cases and 370,832 controls of diverse ancestry, identifying 40 coding variant association signals (P < 2.2 × 10 -7 ); of these, 16 map outside known risk-associated loci. We make two important observations. First, only five of these signals are driven by low-frequency variants: even for these, effect sizes are modest (odds ratio ≤1.29). Second, when we used large-scale genome-wide association data to fine-map the associated variants in their regional context, accounting for the global enrichment of complex trait associations in coding sequence, compelling evidence for coding variant causality was obtained for only 16 signals. At 13 others, the associated coding variants clearly represent 'false leads' with potential to generate erroneous mechanistic inference. Coding variant associations offer a direct route to biological insight for complex diseases and identification of validated therapeutic targets; however, appropriate mechanistic inference requires careful specification of their causal contribution to disease predisposition.

  13. Are nursing codes of practice ethical?

    PubMed

    Pattison, S

    2001-01-01

    This article provides a theoretical critique from a particular 'ideal type' ethical perspective of professional codes in general and the United Kingdom Central Council for Nursing, Midwifery and Health Visiting (UKCC) Code of professional conduct (reprinted on pp. 77-78) in particular. Having outlined a specific 'ideal type' of what ethically informed and aware practice may be, the article examines the extent to which professional codes may be likely to elicit and engender such practice. Because of their terminological inexactitudes and confusions, their arbitrary values and principles, their lack of helpful ethical guidance, and their exclusion of ordinary moral experience, a number of contemporary professional codes in health and social care can be arraigned as ethically inadequate. The UKCC Code of professional conduct embodies many of these flaws, and others besides. Some of its weaknesses in this respect are anatomized before some tentative suggestions are offered for the reform of codes and the engendering of greater ethical awareness among professionals in the light of greater public ethical concerns and values.

  14. An empirically derived short form of the Hypoglycaemia Fear Survey II.

    PubMed

    Grabman, J; Vajda Bailey, K; Schmidt, K; Cariou, B; Vaur, L; Madani, S; Cox, D; Gonder-Frederick, L

    2017-04-01

    To develop an empirically derived short version of the Hypoglycaemia Fear Survey II that still accurately measures fear of hypoglycaemia. Item response theory methods were used to generate an 11-item version of the Hypoglycaemia Fear Survey from a sample of 487 people with Type 1 or Type 2 diabetes mellitus. Subsequently, this scale was tested on a sample of 2718 people with Type 1 or insulin-treated Type 2 diabetes taking part in DIALOG, a large observational prospective study of hypoglycaemia in France. The short form of the Hypoglycaemia Fear Survey II matched the factor structure of the long form for respondents with both Type 1 and Type 2 diabetes, while maintaining adequate internal reliability on the total scale and all three subscales. The two forms were highly correlated on both the total scale and each subscale (Pearson's R > 0.89). The short form of the Hypoglycaemia Fear Survey II is an important first step in more efficiently measuring fear of hypoglycaemia. Future prospective studies are needed for further validity testing and exploring the survey's applicability to different populations. © 2016 Diabetes UK.

  15. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. Purpose of this type of coding is to achieve data compression in sense that coded data represents original data perfectly (noiselessly) while taking fewer bits to do so. Routines universal because they apply to virtually any "real-world" data source.
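
    Golomb-Rice codes are a representative member of this family of universal integer codes. The sketch below is a generic Python illustration, not a port of the FORTRAN subroutines themselves:

```python
def rice_encode(values, k):
    """Golomb-Rice coding: each nonnegative integer is split into a
    quotient (sent in unary) and k low-order remainder bits."""
    bits = []
    for n in values:
        q, r = n >> k, n & ((1 << k) - 1)
        bits.extend([1] * q + [0])                      # unary quotient
        bits.extend((r >> i) & 1 for i in reversed(range(k)))
    return bits

def rice_decode(bits, count, k):
    out, i = [], 0
    for _ in range(count):
        q = 0
        while bits[i]:                                  # read unary part
            q, i = q + 1, i + 1
        i += 1                                          # skip the 0 terminator
        r = 0
        for _ in range(k):
            r, i = (r << 1) | bits[i], i + 1
        out.append((q << k) | r)
    return out

data = [0, 3, 5, 12]
coded = rice_encode(data, 2)
```

    Small values cost few bits, and the decoder reconstructs the input exactly, which is the "noiseless" compression property described above.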

  16. [Relationship between the ankle-arm index determined by Doppler ultrasonography and cardiovascular outcomes and amputations, in a group of patients with type 2 diabetes mellitus from the Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán].

    PubMed

    Miranda Garduño, Luis Miguel; Bermúdez Rocha, Rocío; Gómez Pérez, Francisco J; Aguilar Salinas, Carlos A

    2011-01-01

    An ankle/arm index < 0.90 or ≥ 1.41 is considered abnormal. This study aimed to investigate the prevalence of peripheral arterial disease through identification of the ankle/arm index using Doppler ultrasound, and the possible association between a pathological ankle/arm index and the micro- and macrovascular complications of diabetes and amputation. The ankle/arm index was determined in outpatient type 2 diabetic subjects; the variables recorded included age and cardiovascular outcomes. To determine whether the ankle/arm index is related to cardiovascular outcomes or to the presence of micro- or macrovascular complications, we computed Pearson correlation coefficients and also used logistic regression to analyze the association between the ankle/arm index and the categorical variables. We calculated the ankle/arm index in 242 patients. The prevalence of an ischemic ankle/arm index (< 0.90) was 13.6%. The Pearson correlation coefficient between a pathological ankle/arm index and cardiovascular outcomes was 0.180 (p = 0.005); amputation, 0.130 (p < 0.05); retinopathy, 0.132 (p < 0.05); and nephropathy, 0.158 (p = 0.01). In logistic regression analysis, the factors associated with a pathological ankle/arm index were age > 51 years, cardiovascular outcomes, and amputation. With the Mann-Whitney U test we found a relationship between a pathological ankle/arm index and amputation (p < 0.05). Diabetic patients have a high prevalence of a pathological ankle/arm index.

  17. The Regular Interaction Pattern among Odorants of the Same Type and Its Application in Odor Intensity Assessment.

    PubMed

    Yan, Luchun; Liu, Jiemin; Jiang, Shen; Wu, Chuandong; Gao, Kewei

    2017-07-13

    The olfactory evaluation function (e.g., odor intensity rating) of the e-nose is one of the most challenging issues in research on odor pollution monitoring. Odor is normally produced by a set of stimuli, and odor interactions among constituents significantly influence the mixture's odor intensity. This study investigated the odor interaction principle in odor mixtures of aldehydes and of esters. A modified vector model (MVM) was then proposed, which successfully demonstrated the similarity of the odor interaction pattern among odorants of the same type. Based on this regular interaction pattern, the MVM distinctly simplifies the odor intensity prediction of odor mixtures, unlike the determined empirical models of conventional approaches, each of which fits only a specific odor mixture. The MVM also provides a way of directly converting constituents' chemical concentrations to their mixture's odor intensity. By combining the MVM with the usual data-processing algorithm of an e-nose, a new e-nose system was established for odor intensity rating. Compared with instrumental analysis and a human assessor, it performed accurately in both quantitative analysis (the Pearson correlation coefficient was 0.999 for individual aldehydes (n = 12), 0.996 for their binary mixtures (n = 36), and 0.990 for their ternary mixtures (n = 60)) and odor intensity assessment (the Pearson correlation coefficient was 0.980 for individual aldehydes (n = 15), 0.973 for their binary mixtures (n = 24), and 0.888 for their ternary mixtures (n = 25)). The observed regular interaction pattern is thus considered an important foundation for accelerating the extensive application of olfactory evaluation in odor pollution monitoring.
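
    The MVM itself is not specified in the abstract, but it builds on the classic vector model of binary odor mixtures, in which the two constituent intensities add like vectors separated by an empirical interaction angle α. A minimal sketch of that base model (the angle values used below are invented for illustration):

```python
import math

def vector_model_intensity(i1, i2, alpha_deg):
    """Classic vector model: perceived intensity of a binary odor
    mixture from the constituent intensities i1, i2 and an empirical
    interaction angle alpha (degrees)."""
    a = math.radians(alpha_deg)
    return math.sqrt(i1 ** 2 + i2 ** 2 + 2 * i1 * i2 * math.cos(a))
```

    At α = 0° the intensities simply add; at larger angles the mixture is weaker than the sum, capturing the suppressive interaction among constituents.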

  18. Flood-frequency characteristics of Wisconsin streams

    USGS Publications Warehouse

    Walker, John F.; Peppler, Marie C.; Danz, Mari E.; Hubbard, Laura E.

    2017-05-22

    Flood-frequency characteristics for 360 gaged sites on unregulated rural streams in Wisconsin are presented for annual exceedance probabilities ranging from 0.2 to 50 percent, using a statewide skewness map developed for this report. Equations relating flood frequency to drainage-basin characteristics were developed by multiple-regression analyses. Flood-frequency characteristics for ungaged sites on unregulated rural streams can be estimated with the equations presented in this report. The State was divided into eight areas of similar physiographic characteristics. The most significant basin characteristics are drainage area, soil saturated hydraulic conductivity, main-channel slope, and several land-use variables. The standard error of prediction for the 1-percent annual exceedance probability equation ranges from 56 to 70 percent for Wisconsin streams; these values are larger than results presented in previous reports. The increase in the standard error of prediction is likely due to increased variability of the annual-peak discharges, which increases the variability in the magnitude of flood peaks at higher frequencies. For each of the unregulated rural streamflow-gaging stations, a weighted estimate was determined from the at-site log Pearson type III analysis and the multiple-regression results. The weighted estimate generally has lower uncertainty than either the log Pearson type III or multiple-regression estimate. For regulated streams, a graphical method for estimating flood-frequency characteristics was developed from the relations of discharge and drainage area for selected annual exceedance probabilities. Graphs for the major regulated streams in Wisconsin are presented in the report.
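
    The at-site log Pearson type III analysis mentioned above can be sketched as follows. This is the textbook procedure with the Wilson-Hilferty approximation for the frequency factor, not the report's exact computation (which also applies a regional skew map and weighting), and the annual peak data are invented:

```python
import math
from statistics import NormalDist

def lp3_flood_quantile(peaks, aep):
    """Flood quantile for a given annual exceedance probability (aep)
    from annual peak discharges, via a log Pearson type III fit."""
    logs = [math.log10(q) for q in peaks]
    n = len(logs)
    mean = sum(logs) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    g = (n / ((n - 1) * (n - 2))) * sum((x - mean) ** 3 for x in logs) / std ** 3
    z = NormalDist().inv_cdf(1 - aep)           # standard normal quantile
    if abs(g) < 1e-9:
        k = z
    else:
        # Wilson-Hilferty approximation to the Pearson III frequency factor
        k = (2 / g) * ((1 + g * z / 6 - g ** 2 / 36) ** 3 - 1)
    return 10 ** (mean + k * std)

peaks = [1200, 980, 1500, 2100, 860, 1750, 1300, 990, 1600, 1100]
q100 = lp3_flood_quantile(peaks, 0.01)   # 1-percent AEP ("100-year") flood
q2 = lp3_flood_quantile(peaks, 0.5)      # 50-percent AEP ("2-year") flood
```

    Lower exceedance probabilities yield larger quantiles, which is why the uncertainty discussion above focuses on the rarer floods.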

  19. Significance of Epicardial and Intrathoracic Adipose Tissue Volume among Type 1 Diabetes Patients in the DCCT/EDIC: A Pilot Study

    PubMed Central

    Budoff, Matthew J.

    2016-01-01

    Introduction Type 1 diabetes (T1DM) patients are at increased risk of coronary artery disease (CAD). This pilot study sought to evaluate the relationship between epicardial adipose tissue (EAT) and intra-thoracic adipose tissue (IAT) volumes and cardio-metabolic risk factors in T1DM. Method EAT/IAT volumes in 100 patients who underwent non-contrast cardiac computed tomography in the Diabetes Control and Complications Trial/Epidemiology of Diabetes Interventions and Complications (DCCT/EDIC) study were measured by a certified reader. Fat was defined as a pixel density of -190 to -30 Hounsfield units. The associations were assessed using Pearson partial correlation and linear regression models adjusted for gender and age, with inverse probability sample weighting. Results The weighted mean age was 43 years (range 32–57) and 53% were male. Adjusted for gender, Pearson correlation analysis showed a significant correlation between age and EAT/IAT volumes (both p < 0.001). After adjusting for gender and age, participants with greater BMI, higher waist-to-hip (WTH) ratio, higher weighted HbA1c, elevated triglyceride level, and a history of an albumin excretion rate of 300 mg/d or greater (AER ≥ 300) or end-stage renal disease (ESRD) had significantly larger EAT/IAT volumes. Conclusion T1DM patients with greater BMI, WTH ratio, weighted HbA1c level, and triglyceride level, and those with AER ≥ 300/ESRD, had significantly larger EAT/IAT volumes. Larger studies are recommended to evaluate the independence of these associations. PMID:27459689

  20. Use of color-coded sleeve shutters accelerates oscillograph channel selection

    NASA Technical Reports Server (NTRS)

    Bouchlas, T.; Bowden, F. W.

    1967-01-01

    Sleeve-type shutters mechanically adjust individual galvanometer light beams onto or away from selected channels on oscillograph papers. In complex test setups, the sleeve-type shutters are color coded to separately identify each oscillograph channel. This technique could be used on any equipment using tubular galvanometer light sources.

  1. The Heart of Great Teaching: Pearson Global Survey of Educator Effectiveness

    ERIC Educational Resources Information Center

    McKnight, Katherine; Graybeal, John; Yarbro, Jessica; Graybeal, Lacey

    2016-01-01

    To contribute to the global discussion about what makes an effective teacher, Pearson surveyed students ages 15-19, teachers, principals, education researchers, education policymakers, and parents of school-aged children in 23 countries (Canada, U.S., Mexico, Brazil, Argentina, Finland, Germany, Poland, England, Morocco, Egypt, South Africa,…

  2. How a Publishing Empire Is Changing Higher Education.

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    2000-01-01

    Discusses the increasing role of the London-based media conglomerate Pearson PLC and its subsidiary FT Knowledge in bringing together various Pearson assets in distance and corporate education. The company is developing partnerships with several elite institutions to develop course material for world-wide marketing. Other initiatives include the…

  3. Measuring Skewness: A Forgotten Statistic?

    ERIC Educational Resources Information Center

    Doane, David P.; Seward, Lori E.

    2011-01-01

    This paper discusses common approaches to presenting the topic of skewness in the classroom, and explains why students need to know how to measure it. Two skewness statistics are examined: the Fisher-Pearson standardized third moment coefficient, and the Pearson 2 coefficient that compares the mean and median. The former is reported in statistical…
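
    The two statistics named in the abstract are easy to state directly; a minimal sketch:

```python
import statistics

def fisher_pearson_skew(xs):
    """Fisher-Pearson standardized third-moment skewness (g1)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def pearson_2_skew(xs):
    """Pearson 2 coefficient: compares the mean and the median."""
    mean = sum(xs) / len(xs)
    return 3 * (mean - statistics.median(xs)) / statistics.stdev(xs)
```

    Both are zero for symmetric data and positive for a right-skewed sample, but they can differ in magnitude, since one is moment-based and the other compares location measures.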

  4. Using the Pearson Distribution for Synthesis of the Suboptimal Algorithms for Filtering Multi-Dimensional Markov Processes

    NASA Astrophysics Data System (ADS)

    Mit'kin, A. S.; Pogorelov, V. A.; Chub, E. G.

    2015-08-01

    We consider a method for constructing a suboptimal filter based on approximating the a posteriori probability density of a multidimensional Markov process by Pearson distributions. The proposed method can be used efficiently for approximating asymmetric, excessive, and finite densities.

  5. Different small, acid-soluble proteins of the alpha/beta type have interchangeable roles in the heat and UV radiation resistance of Bacillus subtilis spores.

    PubMed Central

    Mason, J M; Setlow, P

    1987-01-01

    Spores of Bacillus subtilis strains which carry deletion mutations in one gene (sspA) or two genes (sspA and sspB) which code for major alpha/beta-type small, acid-soluble spore proteins (SASP) are known to be much more sensitive to heat and UV radiation than wild-type spores. This heat- and UV-sensitive phenotype was cured completely or in part by introduction into these mutant strains of one or more copies of the sspA or sspB genes themselves; multiple copies of the B. subtilis sspD gene, which codes for a minor alpha/beta-type SASP; or multiple copies of the SASP-C gene, which codes for a major alpha/beta-type SASP of Bacillus megaterium. These findings suggest that alpha/beta-type SASP play interchangeable roles in the heat and UV radiation resistance of bacterial spores. PMID:3112127

  6. Developing a bivariate spatial association measure: An integration of Pearson's r and Moran's I

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Il

    This research is concerned with developing a bivariate spatial association measure or spatial correlation coefficient, which is intended to capture spatial association among observations in terms of their point-to-point relationships across two spatial patterns. The need for parameterization of the bivariate spatial dependence is precipitated by the realization that aspatial bivariate association measures, such as Pearson's correlation coefficient, do not recognize spatial distributional aspects of data sets. This study devises an L statistic by integrating Pearson's r as an aspatial bivariate association measure and Moran's I as a univariate spatial association measure. The concept of a spatial smoothing scalar (SSS) plays a pivotal role in this task.
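
    The two ingredients being integrated can each be computed in a few lines. The sketch below shows Pearson's r and Moran's I separately; the L statistic itself combines them through the spatial smoothing scalar, whose exact form is not given in the abstract, and the toy data are invented:

```python
import numpy as np

def pearsons_r(x, y):
    """Aspatial bivariate association between two variables."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    zx, zy = x - x.mean(), y - y.mean()
    return (zx @ zy) / np.sqrt((zx @ zx) * (zy @ zy))

def morans_i(x, w):
    """Univariate spatial autocorrelation; w is an n-by-n spatial
    weight matrix with a zero diagonal."""
    x = np.asarray(x, float)
    z = x - x.mean()
    return len(x) / w.sum() * (z @ w @ z) / (z @ z)

# Six sites on a line, adjacent sites as neighbors; the clustered
# pattern [1,1,1,5,5,5] has strong positive spatial autocorrelation.
w = np.zeros((6, 6))
for i in range(5):
    w[i, i + 1] = w[i + 1, i] = 1
x = np.array([1, 1, 1, 5, 5, 5], float)
```

    Pearson's r ignores where the observations sit in space, which is exactly the gap the proposed L statistic is designed to close.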

  7. Contextualisation in the revised dual representation theory of PTSD: a response to Pearson and colleagues.

    PubMed

    Brewin, Chris R; Burgess, Neil

    2014-03-01

    Three recent studies (Pearson, 2012; Pearson, Ross, & Webster, 2012) purported to test the revised dual representation theory of posttraumatic stress disorder (Brewin, Gregory, Lipton, & Burgess, 2010) by manipulating the amount of additional information accompanying traumatic stimulus materials and assessing the effect on subsequent intrusive memories. Here we point out that these studies involve a misunderstanding of the meaning of "contextual" within the theory, such that the manipulation would be unlikely to have had the intended effect and the results are ambiguous with respect to the theory. Past and future experimental tests of the theory are discussed. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. mtDNA Deletion in an Iranian Infant with Pearson Marrow Syndrome.

    PubMed

    Arzanian, Mohammad Taghi; Eghbali, Aziz; Karimzade, Parvaneh; Ahmadi, Mitra; Houshmand, Massoud; Rezaei, Nima

    2010-03-01

    Pearson syndrome (PS) is a rare multisystem mitochondrial disorder of hematopoietic system, characterized by refractory sideroblastic anemia, pancytopenia, exocrine pancreatic insufficiency, and variable neurologic, hepatic, renal, and endocrine failure. We describe a six-month-old female infant with Pearson marrow syndrome who presented with neurological manifestations. She had several episodes of seizures. Hematopoietic abnormalities were macrocytic anemia and neutropenia. Bone marrow aspiration revealed a cellular marrow with marked vacuolization of erythroid and myeloid precursors. Analysis of mtDNA in peripheral blood showed 8.5 kb deletion that was compatible with the diagnosis of PS. PS should be considered in infants with neurologic diseases, in patients with cytopenias, and also in patients with acidosis or refractory anemia.

  9. Pacific Northwest ecoclass codes for seral and potential natural communities.

    Treesearch

    Frederick C. Hall

    1998-01-01

    Lists codes for identification of potential natural communities (plant association, habitat types), their seral status, and vegetation structure in and around the Pacific Northwest. Codes are a six-digit alphanumeric system using the first letter of tree species, life-form, seral status, and structure so that most codes can be directly interpreted. Seven appendices...

  10. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database

    PubMed Central

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-01

    Objective To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. Design A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Setting Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Main outcome measure Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding ‘poor’ quality practices with at least 10% incident patients inaccurately coded between 2004 and 2014. Results Incidence rates and accuracy of coding varied widely between practices and the trends differed according to selected category of code. If diagnosis codes were used, the incidence of type 2 increased sharply until 2004 (when the UK Quality Outcomes Framework was introduced), and then flattened off, until 2009, after which they decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled ‘poor’ quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 after 2009 became less marked and incidence rates were higher. Conclusions In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. PMID:28122831

  11. Pt-Bi Antibonding Interaction: The Key Factor for Superconductivity in Monoclinic BaPt2Bi2.

    PubMed

    Gui, Xin; Xing, Lingyi; Wang, Xiaoxiong; Bian, Guang; Jin, Rongying; Xie, Weiwei

    2018-02-19

    In the search for superconductivity in the BaAu 2 Sb 2 -type monoclinic structure, we have successfully synthesized the new compound BaPt 2 Bi 2 , which crystallizes in the space group P2 1 /m (No. 11; Pearson symbol mP10) according to a combination of powder and single-crystal X-ray diffraction and scanning electron microscopy. A sharp drop in electrical resistivity and a large diamagnetic magnetization below 2.0 K indicate a superconducting ground state. This makes BaPt 2 Bi 2 the first reported superconductor with the monoclinic BaAu 2 Sb 2 -type structure, previously unappreciated for superconductivity. First-principles calculations considering spin-orbit coupling indicate that the Pt-Bi antibonding interaction plays a critical role in inducing superconductivity.

  12. The PLUTO code for astrophysical gasdynamics.

    NASA Astrophysics Data System (ADS)

    Mignone, A.

    Present numerical codes appeal to a consolidated theory based on finite-difference and Godunov-type schemes. In this context we have developed a versatile numerical code, PLUTO, suitable for the solution of high-Mach-number flows in 1, 2, and 3 spatial dimensions and in different systems of coordinates. Different hydrodynamic modules and algorithms may be independently selected to properly describe Newtonian, relativistic, MHD, or relativistic MHD fluids. The modular structure exploits a general framework for integrating a system of conservation laws, built on modern Godunov-type shock-capturing schemes. The code is freely distributed under the GNU public license and is available for download to the astrophysical community at the URL http://plutocode.to.astro.it.

  13. Determination of multi-GNSS pseudo-absolute code biases and verification of receiver tracking technology

    NASA Astrophysics Data System (ADS)

    Villiger, Arturo; Schaer, Stefan; Dach, Rolf; Prange, Lars; Jäggi, Adrian

    2017-04-01

    It is common to handle code biases in the Global Navigation Satellite System (GNSS) data analysis as conventional differential code biases (DCBs): P1-C1, P1-P2, and P2-C2. Due to the increasing number of signals and systems in conjunction with various tracking modes for the different signals (as defined in RINEX3 format), the number of DCBs would increase drastically and the bookkeeping becomes almost unbearable. The Center for Orbit Determination in Europe (CODE) has thus changed its processing scheme to observable-specific signal biases (OSB). This means that for each observation involved all related satellite and receiver biases are considered. The OSB contributions from various ionosphere analyses (geometry-free linear combination) using different observables and frequencies and from clock analyses (ionosphere-free linear combination) are then combined on normal equation level. By this, one consistent set of OSB values per satellite and receiver can be obtained that contains all information needed for GNSS-related processing. This advanced procedure of code bias handling is now also applied to the IGS (International GNSS Service) MGEX (Multi-GNSS Experiment) procedure at CODE. Results for the biases from the legacy IGS solution as well as the CODE MGEX processing (considering GPS, GLONASS, Galileo, BeiDou, and QZSS) are presented. The consistency with the traditional method is confirmed and the new results are discussed regarding the long-term stability. When processing code data, it is essential to know the true observable types in order to correct for the associated biases. CODE has been verifying the receiver tracking technologies for GPS based on estimated DCB multipliers (for the RINEX 2 case). With the change to OSB, the original verification approach was extended to search for the best fitting observable types based on known OSB values. In essence, a multiplier parameter is estimated for each involved GNSS observable type. 
This implies that we could recover, for receivers tracking a combination of signals, even the factors of these combinations. The verification of the observable types is crucial to identify the correct observable types of RINEX 2 data (which does not contain the signal modulation in comparison to RINEX 3). The correct information of the used observable types is essential for precise point positioning (PPP) applications and GNSS ambiguity resolution. Multi-GNSS OSBs and verified receiver tracking modes are essential to get best possible multi-GNSS solutions for geodynamic purposes and other applications.

  14. Pearson's Correlation between Three Variables; Using Students' Basic Knowledge of Geometry for an Exercise in Mathematical Statistics

    ERIC Educational Resources Information Center

    Vos, Pauline

    2009-01-01

    When studying correlations, how do the three bivariate correlation coefficients between three variables relate? After transforming Pearson's correlation coefficient r into a Euclidean distance, undergraduate students can tackle this problem using their secondary school knowledge of geometry (Pythagoras' theorem and similarity of triangles).…
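    The transformation described above can be checked numerically: for variables standardized to zero mean and unit variance, the Euclidean distance d between the two z-score vectors satisfies d = sqrt(2(1 - r)). A minimal sketch (the data and sample size are illustrative assumptions, not from the article):

```python
import numpy as np

# Hypothetical data: two correlated variables (not from the article)
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 0.6 * x + 0.8 * rng.normal(size=1000)

# Standardize both variables to zero mean and unit variance
zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()

r = np.corrcoef(x, y)[0, 1]
# Euclidean distance between the z-score vectors, scaled by 1/sqrt(n)
d = np.linalg.norm(zx - zy) / np.sqrt(len(x))
# The identity d = sqrt(2 * (1 - r)) turns correlations into distances
```

Because the identity is exact, three pairwise correlations can be treated as three pairwise distances, which is what allows the triangle-geometry argument in the article.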

  15. The Advanced Security Operations Corporation Special Weapons and Tactics Initiative: A Business Plan

    DTIC Science & Technology

    2004-12-01

    Kotler, Philip. (2003). A Framework for Marketing Management (2nd ed.). New Jersey: Pearson Education. www.ojp.usdoj.gov/odp/docs/fy04hsgp_appkit.pdf

  16. The Pearson-Readhead AGN Survey

    NASA Astrophysics Data System (ADS)

    Lister, M. L.

    2009-08-01

    The Pearson-Readhead survey of active galactic nuclei (AGN) was the first complete sample to be studied by VLBI, and was the subject of a detailed investigation by the VSOP program. We discuss the scientific findings from this unique survey, and how it has provided robust confirmation of the relativistic beaming model for AGN jets.

  17. Storm surge evolution and its relationship to climate oscillations at Duck, NC

    NASA Astrophysics Data System (ADS)

    Munroe, Robert; Curtis, Scott

    2017-07-01

    Coastal communities experience increased vulnerability during storm surge events through the risk of damage to coastal infrastructure, erosion/deposition, and the endangerment of human life. Policy and planning measures attempt to avoid or mitigate storm surge consequences through building codes and setbacks, beach stabilization, insurance rates, and coastal zoning. The coastal emergency management community and public react and respond on shorter time scales, through temporary protection, emergency stockpiling, and evacuation. This study utilizes time series analysis, the Kolmogorov-Smirnov (K-S) test, Pearson's correlation, and the generalized extreme value (GEV) distribution to make the connection between climate oscillation indices and storm surge characteristics intra-seasonally to inter-annually. Results indicate that an El Niño (+ENSO), a negative phase of the NAO, and a positive phase of the PNA pattern all support longer-duration and hence more powerful surge events, especially in winter. Increased surge duration increases the likelihood of extensive erosion and inland inundation, among other undesirable effects of the surge hazard.
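    The extreme value step of such an analysis can be sketched as follows: fit the generalized extreme value distribution to a series of block maxima, then read a return level off its quantile function. A minimal sketch with hypothetical surge values (not data from the study):

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum surge heights in metres (not study data)
surge = np.array([0.62, 0.81, 0.55, 1.10, 0.73, 0.95, 0.68, 1.32,
                  0.77, 0.88, 0.60, 1.05, 0.71, 0.92, 0.84, 1.21])

# Maximum-likelihood fit of the GEV distribution to the block maxima
shape, loc, scale = stats.genextreme.fit(surge)

# 50-year return level: the height exceeded with probability 1/50 per year
rl50 = stats.genextreme.ppf(1 - 1 / 50, shape, loc=loc, scale=scale)
```

Real analyses would use decades of observed annual maxima and check the sign convention of the fitted shape parameter before interpreting tail behavior.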

  18. Sensitivity Analysis of OECD Benchmark Tests in BISON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Gamble, Kyle; Schmidt, Rodney C.

    2015-09-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on sensitivity analysis of a fuels performance benchmark problem. The benchmark problem was defined by the Uncertainty Analysis in Modeling working group of the Nuclear Science Committee, part of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD). The benchmark problem involved steady-state behavior of a fuel pin in a Pressurized Water Reactor (PWR). The problem was created in the BISON Fuels Performance code. Dakota was used to generate and analyze 300 samples of 17 input parameters defining core boundary conditions, manufacturing tolerances, and fuel properties. There were 24 responses of interest, including fuel centerline temperatures at a variety of locations and burnup levels, fission gas released, axial elongation of the fuel pin, etc. Pearson and Spearman correlation coefficients and Sobol' variance-based indices were used to perform the sensitivity analysis. This report summarizes the process and presents results from this study.

  19. MPIGeneNet: Parallel Calculation of Gene Co-Expression Networks on Multicore Clusters.

    PubMed

    Gonzalez-Dominguez, Jorge; Martin, Maria J

    2017-10-10

    In this work we present MPIGeneNet, a parallel tool that applies Pearson's correlation and Random Matrix Theory to construct gene co-expression networks. It is based on the state-of-the-art sequential tool RMTGeneNet, which provides networks with high robustness and sensitivity at the expense of relatively long runtimes for large-scale input datasets. MPIGeneNet returns the same results as RMTGeneNet but improves the memory management, reduces the I/O cost, and accelerates the two most computationally demanding steps of co-expression network construction by exploiting the compute capabilities of common multicore CPU clusters. Our performance evaluation on two different systems using three typical input datasets shows that MPIGeneNet is significantly faster than RMTGeneNet. As an example, our tool is up to 175.41 times faster on a cluster with eight nodes, each containing two 12-core Intel Haswell processors. The source code of MPIGeneNet, as well as a reference manual, is available at https://sourceforge.net/projects/mpigenenet/.
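    The first and most expensive step such tools parallelize is the all-pairs Pearson correlation of gene expression profiles. A minimal serial sketch of that step (the toy matrix and the hard 0.8 threshold are illustrative assumptions; RMT-based tools choose the cutoff adaptively rather than fixing it):

```python
import numpy as np

# Toy expression matrix: rows = genes, columns = samples (hypothetical data)
rng = np.random.default_rng(1)
expr = rng.normal(size=(5, 20))
expr[1] = expr[0] + 0.05 * rng.normal(size=20)  # make gene 1 track gene 0

# All-pairs Pearson correlation matrix across the gene rows
corr = np.corrcoef(expr)

# Keep edges whose |r| clears the threshold; mask self-loops on the diagonal
adj = (np.abs(corr) > 0.8) & ~np.eye(len(expr), dtype=bool)
```

For thousands of genes the correlation matrix is what a parallel implementation distributes across MPI ranks, since each row block can be computed independently.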

  20. A Bioinformatics-Based Alternative mRNA Splicing Code that May Explain Some Disease Mutations Is Conserved in Animals.

    PubMed

    Qu, Wen; Cingolani, Pablo; Zeeberg, Barry R; Ruden, Douglas M

    2017-01-01

    Deep sequencing of cDNAs made from spliced mRNAs indicates that most coding genes in many animals and plants have pre-mRNA transcripts that are alternatively spliced. In pre-mRNAs, in addition to invariant exons that are present in almost all mature mRNA products, there are at least 6 additional types of exons, such as exons from alternative promoters or with alternative polyA sites, mutually exclusive exons, skipped exons, or exons with alternative 5' or 3' splice sites. Our bioinformatics-based hypothesis is that, in analogy to the genetic code, there is an "alternative-splicing code" in introns and flanking exon sequences that directs alternative splicing of many of the 36 types of introns. In humans, we identified 42 different consensus sequences that are each present in at least 100 human introns. Of the 42 top consensus sequences, 37 are significantly enriched or depleted in at least one of the 36 types of introns. We further supported our hypothesis by showing that 96 out of 96 analyzed human disease mutations that affect RNA splicing, and change alternative splicing from one class to another, can be partially explained by a mutation altering a consensus sequence from one type of intron to that of another type. Some of the alternative splicing consensus sequences, and presumably their small-RNA or protein targets, are evolutionarily conserved across 50 plant and animal species. We also noticed that the introns within a gene usually share the same splicing codes, arguing that one sub-type of spliceosome might process all (or most) of the introns in a given gene. Our work sheds new light on a possible mechanism for generating the tremendous diversity in protein structure by alternative splicing of pre-mRNAs.

  1. The Complete Mitochondrial DNA Sequence of Scenedesmus obliquus Reflects an Intermediate Stage in the Evolution of the Green Algal Mitochondrial Genome

    PubMed Central

    Nedelcu, Aurora M.; Lee, Robert W.; Lemieux, Claude; Gray, Michael W.; Burger, Gertraud

    2000-01-01

    Two distinct mitochondrial genome types have been described among the green algal lineages investigated to date: a reduced–derived, Chlamydomonas-like type and an ancestral, Prototheca-like type. To determine if this unexpected dichotomy is real or is due to insufficient or biased sampling and to define trends in the evolution of the green algal mitochondrial genome, we sequenced and analyzed the mitochondrial DNA (mtDNA) of Scenedesmus obliquus. This genome is 42,919 bp in size and encodes 42 conserved genes (i.e., large and small subunit rRNA genes, 27 tRNA and 13 respiratory protein-coding genes), four additional free-standing open reading frames with no known homologs, and an intronic reading frame with endonuclease/maturase similarity. No 5S rRNA or ribosomal protein-coding genes have been identified in Scenedesmus mtDNA. The standard protein-coding genes feature a deviant genetic code characterized by the use of UAG (normally a stop codon) to specify leucine, and the unprecedented use of UCA (normally a serine codon) as a signal for termination of translation. The mitochondrial genome of Scenedesmus combines features of both green algal mitochondrial genome types: the presence of a more complex set of protein-coding and tRNA genes is shared with the ancestral type, whereas the lack of 5S rRNA and ribosomal protein-coding genes as well as the presence of fragmented and scrambled rRNA genes are shared with the reduced–derived type of mitochondrial genome organization. Furthermore, the gene content and the fragmentation pattern of the rRNA genes suggest that this genome represents an intermediate stage in the evolutionary process of mitochondrial genome streamlining in green algae. [The sequence data described in this paper have been submitted to the GenBank data library under accession no. AF204057.] PMID:10854413

  2. Patterns of behavior in online homework for introductory physics

    NASA Astrophysics Data System (ADS)

    Fredericks, Colin

    Student activity in online homework was obtained from courses in physics in 2003 and 2005. This data was analyzed through a variety of methods, including principal component analysis, Pearson's r correlation, and comparison to performance measures such as detailed exam scores. Through this analysis it was determined which measured homework behaviors were associated with high exam scores and course grades. It was also determined that homework problems requiring analysis can have an impact on certain types of exam problems where traditional homework does not. Suggestions are given for future research and possible use of these methods in other contexts.

  3. The relationship between ego-state and communication skills in medical students.

    PubMed

    Hur, Yera; Cho, A-Ra

    2014-03-01

    The purpose of this study was to examine the relationship between ego-states and communication skills in medical students. A total of 109 medical school students participated in this study, which used a communication skills self-test and the Egogram checklist. The data were analyzed by frequency analysis and Pearson correlation analysis. Ego-state was related to communication skills. In particular, the adapted child ego-state was negatively associated with each sphere of communication skills. Our results suggest that ego-state types should be considered in developing communication skills education programs for medical students.

  4. Methods used to compute low-flow frequency characteristics for continuous-record streamflow stations in Minnesota, 2006

    USGS Publications Warehouse

    Winterstein, Thomas A.; Arntson, Allan D.; Mitton, Gregory B.

    2007-01-01

    The 1-, 7-, and 30-day low-flow series were determined for 120 continuous-record streamflow stations in Minnesota having at least 20 years of continuous record. The 2-, 5-, 10-, 50-, and 100-year statistics were determined for each series by fitting a log Pearson type III distribution to the data. The methods used to determine the low-flow statistics and to construct the plots of the low-flow frequency curves are described. The low-flow series and the low-flow statistics are presented in tables and graphs.
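    The log Pearson type III procedure described above can be sketched as follows: take base-10 logarithms of the annual n-day minima, fit a Pearson type III distribution by the method of moments, and read off the quantile with non-exceedance probability 1/T as the T-year low flow. A minimal sketch with hypothetical flows (the report's analyses use at least 20 years of record per station):

```python
import numpy as np
from scipy import stats

# Hypothetical annual 7-day minimum flows in cfs (not station data)
flows = np.array([12.1, 8.4, 15.0, 9.7, 11.3, 7.9, 13.6, 10.2, 14.8, 9.1,
                  8.8, 12.9, 10.7, 11.9, 7.5, 13.1, 9.9, 10.4, 12.4, 8.2])
logq = np.log10(flows)

# Method-of-moments fit: mean, standard deviation, and skew of the logs
mu, sigma = logq.mean(), logq.std(ddof=1)
g = stats.skew(logq, bias=False)

# 7Q10: the 7-day low flow with a 1-in-10 chance of NOT being exceeded,
# i.e. the 0.1 quantile of the fitted log Pearson type III distribution
q7_10 = 10 ** stats.pearson3.ppf(0.1, g, loc=mu, scale=sigma)
```

SciPy's `pearson3` is parameterized so that `loc` and `scale` are the mean and standard deviation, which makes the method-of-moments fit a direct substitution of the sample statistics.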

  5. An evaluation of procedures to estimate monthly precipitation probabilities

    NASA Astrophysics Data System (ADS)

    Legates, David R.

    1991-01-01

    Many frequency distributions have been used to evaluate monthly precipitation probabilities. Eight of these distributions (including Pearson type III, extreme value, and transform normal probability density functions) are comparatively examined to determine their ability to represent accurately variations in monthly precipitation totals for global hydroclimatological analyses. Results indicate that a modified version of the Box-Cox transform-normal distribution more adequately describes the 'true' precipitation distribution than does any of the other methods. This assessment was made using a cross-validation procedure for a global network of 253 stations for which at least 100 years of monthly precipitation totals were available.

  6. Statistical analysis of landing contact conditions for three lifting body research vehicles

    NASA Technical Reports Server (NTRS)

    Larson, R. R.

    1972-01-01

    The landing contact conditions for the HL-10, M2-F2/F3, and the X-24A lifting body vehicles are analyzed statistically for 81 landings. The landing contact parameters analyzed are true airspeed, peak normal acceleration at the center of gravity, roll angle, and roll velocity. Ground measurement parameters analyzed are lateral and longitudinal distance from intended touchdown, lateral distance from touchdown to full stop, and rollout distance. The results are presented in the form of histograms for frequency distributions and cumulative frequency distribution probability curves with a Pearson Type 3 curve fit for extrapolation purposes.

  7. Correlation between antimicrobial consumption and antimicrobial resistance of Pseudomonas aeruginosa in a hospital setting: a 10-year study.

    PubMed

    Mladenovic-Antic, S; Kocic, B; Velickovic-Radovanovic, R; Dinic, M; Petrovic, J; Randjelovic, G; Mitic, R

    2016-10-01

    Antimicrobial resistance is one of the greatest threats to human health. One of the most important factors leading to the emergence of resistant bacteria is the overuse of antibiotics. The purpose of this study was to investigate the correlation between antimicrobial usage and bacterial resistance of Pseudomonas aeruginosa (P. aeruginosa) over a 10-year period in the Clinical Center Niš, one of the largest tertiary care hospitals in Serbia. We focused on possible relationships between the consumption of carbapenems and beta-lactam antibiotics and the rates of resistance of P. aeruginosa to carbapenems. We recorded the utilization of antibiotics expressed as defined daily doses per 100 bed days (DBD). Bacterial resistance was reported as the percentage of resistant isolates (all resistant and intermediate resistant strains) among all tested isolates. A significant increasing trend in resistance was seen for imipenem (P < 0·05, Spearman ρ = 0·758) and meropenem (P < 0·05, ρ = 0·745). We found a significant correlation between aminoglycoside consumption and resistance to amikacin (P < 0·01, Pearson r = 0·837) and gentamicin (P < 0·01, Pearson r = 0·827). The correlation between the consumption of carbapenems and resistance of P. aeruginosa to imipenem was significant (P < 0·01, Pearson r = 0·795), whereas resistance to meropenem showed a trend towards significance (P > 0·05, Pearson r = 0·607). We found a very good correlation between the use of all beta-lactams and P. aeruginosa resistance to carbapenems (P < 0·01, Pearson r = 0·847 for imipenem and P < 0·05, Pearson r = 0·668 for meropenem). Our data demonstrated a significant increase in antimicrobial resistance to carbapenems, and significant correlations between the consumption of antibiotics, especially carbapenems and beta-lactams, and the rates of antimicrobial resistance of P. aeruginosa to imipenem and meropenem. © 2016 John Wiley & Sons Ltd.
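    Correlations of the kind reported here (e.g., Pearson r = 0·837, P < 0·01) pair the consumption and resistance time series year by year. A minimal sketch with hypothetical values (not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical 10-year series (not the study's data): annual antibiotic
# consumption in DDD/100 bed-days and % resistant P. aeruginosa isolates
consumption = np.array([2.1, 2.4, 2.9, 3.3, 3.8, 4.0, 4.6, 5.1, 5.5, 6.0])
resistance = np.array([12.0, 15.0, 14.0, 19.0, 22.0, 25.0, 24.0,
                       30.0, 33.0, 36.0])

# Pearson's r and its two-sided p-value for the paired yearly values
r, p = stats.pearsonr(consumption, resistance)
```

With only ten yearly points per series, the p-value rests on strong distributional assumptions, which is presumably why the study also reports Spearman's ρ for trend testing.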

  8. Correlation between increasing tissue ischemia and circulating levels of angiogenic growth factors in peripheral artery disease.

    PubMed

    Jalkanen, Juho; Hautero, Olli; Maksimow, Mikael; Jalkanen, Sirpa; Hakovirta, Harri

    2018-04-21

    The aim of the present study was to assess the circulating levels of vascular endothelial growth factor (VEGF) and other suggested therapeutic growth factors in relation to the degree of ischemia in patients with different clinical manifestations of peripheral arterial disease (PAD) according to the Rutherford grades. The study cohort consists of 226 consecutive patients admitted to a Department of Vascular Surgery for elective invasive procedures. PAD patients were grouped according to the Rutherford grades after a clinical assessment. Ankle-brachial pressure indices (ABI) and absolute toe pressure (TP) values were measured. Levels of circulating VEGF, hepatocyte growth factor (HGF), basic fibroblast growth factor (bFGF), and platelet-derived growth factor (PDGF) were measured from serum and analysed against Rutherford grades and peripheral hemodynamic measurements. The levels of VEGF (P = 0.009) and HGF (P < 0.001) increased significantly as the ischaemic burden became more severe according to the Rutherford grades. PDGF behaved in the opposite manner and declined along increasing Rutherford grades (P = 0.004). Significant correlations with Rutherford grades were detected as follows: VEGF (Pearson's correlation = 0.183, P = 0.004), HGF (Pearson's correlation = 0.253, P < 0.001), bFGF (Pearson's correlation = 0.169, P = 0.008), and PDGF (Pearson's correlation = 0.296, P < 0.001). In addition, VEGF had a clear negative correlation with ABI (Pearson's correlation -0.19, P = 0.009) and TP (Pearson's correlation -0.20, P = 0.005) measurements. Our present observations show that the circulating levels of VEGF and other suggested therapeutic growth factors increase significantly along with increasing ischemia. These findings present a new perspective on the anticipated positive effects of gene therapies utilizing VEGF, HGF, and bFGF, because the levels of these growth factors are endogenously high in end-stage PAD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Relationship between perceived social support and self-care behavior in type 2 diabetics: A cross-sectional study

    PubMed Central

    Mohebi, Siamak; Parham, Mahmoud; Sharifirad, Gholamreza; Gharlipour, Zabihollah; Mohammadbeigi, Abolfazl; Rajati, Fatemeh

    2018-01-01

    BACKGROUND: Social support is one of the most influential factors in diabetic self-care. This study aimed to assess social support and its relationship to self-care in type 2 diabetic patients in Qom, Iran. STUDY DESIGN: A cross-sectional study was conducted on 325 diabetics attending the Diabetes Mellitus Association. METHODS: Patients who met the inclusion and exclusion criteria were selected using a random sampling method. Data were collected using the Summary of Diabetes Self-Care Activities and the Multidimensional Scale of Perceived Social Support, together with a hemoglobin A1C test. Data were analyzed using descriptive statistics, the independent t-test, analysis of variance, Pearson correlation, and linear regression, with 0.05 as the critical significance level, in SPSS software. RESULTS: The mean and standard deviation of the self-care and social support scores were 4.31 ± 2.7 and 50.32 ± 11.09, respectively. The mean level of glycosylated hemoglobin (HbA1C) was 7.54. There was a significant difference between the mean scores of self-care behaviors and social support according to gender and marital status (P < 0.05). Regression analysis showed that disease duration was the only variable with a significant effect on the level of HbA1C (P < 0.001). The Pearson correlation coefficient indicated that self-care and social support were significantly correlated (r = 0.489, P < 0.001), and the predictive power of social support was 0.28. Self-care was significantly better in diabetics with HbA1C ≤7%. Patients with higher HbA1C felt less social support, though not significantly so. CONCLUSIONS: This study indicated a relationship between social support and self-care behaviors in type 2 diabetic patients. Interventions that focus on improving social support and self-care may be more effective in improving glycemic control. PMID:29693029

  10. Relationship between perceived social support and self-care behavior in type 2 diabetics: A cross-sectional study.

    PubMed

    Mohebi, Siamak; Parham, Mahmoud; Sharifirad, Gholamreza; Gharlipour, Zabihollah; Mohammadbeigi, Abolfazl; Rajati, Fatemeh

    2018-01-01

    Social support is one of the most influential factors in diabetic self-care. This study aimed to assess social support and its relationship to self-care in type 2 diabetic patients in Qom, Iran. A cross-sectional study was conducted on 325 diabetics attending the Diabetes Mellitus Association. Patients who met the inclusion and exclusion criteria were selected using a random sampling method. Data were collected using the Summary of Diabetes Self-Care Activities and the Multidimensional Scale of Perceived Social Support, together with a hemoglobin A1C test. Data were analyzed using descriptive statistics, the independent t-test, analysis of variance, Pearson correlation, and linear regression, with 0.05 as the critical significance level, in SPSS software. The mean and standard deviation of the self-care and social support scores were 4.31 ± 2.7 and 50.32 ± 11.09, respectively. The mean level of glycosylated hemoglobin (HbA1C) was 7.54. There was a significant difference between the mean scores of self-care behaviors and social support according to gender and marital status (P < 0.05). Regression analysis showed that disease duration was the only variable with a significant effect on the level of HbA1C (P < 0.001). The Pearson correlation coefficient indicated that self-care and social support were significantly correlated (r = 0.489, P < 0.001), and the predictive power of social support was 0.28. Self-care was significantly better in diabetics with HbA1C ≤7%. Patients with higher HbA1C felt less social support, though not significantly so. This study indicated a relationship between social support and self-care behaviors in type 2 diabetic patients. Interventions that focus on improving social support and self-care may be more effective in improving glycemic control.

  11. Multilevel Concatenated Block Modulation Codes for the Frequency Non-selective Rayleigh Fading Channel

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Rhee, Dojun

    1996-01-01

    This paper is concerned with the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel. In this construction, block modulation codes are used as the inner codes. Various types of codes (block or convolutional, binary or nonbinary) are considered as the outer codes. In particular, we focus on the special case in which Reed-Solomon (RS) codes are used as the outer codes. For this special case, a systematic algebraic technique for constructing q-level concatenated block modulation codes is proposed. Codes have been constructed for certain specific values of q and compared with single-level concatenated block modulation codes using the same inner codes. A multilevel closest coset decoding scheme for these codes is proposed.

  12. Geographic Information Systems using CODES linked data (Crash outcome data evaluation system)

    DOT National Transportation Integrated Search

    2001-04-01

    This report presents information about geographic information systems (GIS) and CODES linked data. Section one provides an overview of a GIS and the benefits of linking to CODES. Section two outlines the basic issues relative to the types of map data...

  13. Pan-cancer transcriptomic analysis associates long non-coding RNAs with key mutational driver events

    PubMed Central

    Ashouri, Arghavan; Sayin, Volkan I.; Van den Eynden, Jimmy; Singh, Simranjit X.; Papagiannakopoulos, Thales; Larsson, Erik

    2016-01-01

    Thousands of long non-coding RNAs (lncRNAs) lie interspersed with coding genes across the genome, and a small subset has been implicated as downstream effectors in oncogenic pathways. Here we make use of transcriptome and exome sequencing data from thousands of tumours across 19 cancer types, to identify lncRNAs that are induced or repressed in relation to somatic mutations in key oncogenic driver genes. Our screen confirms known coding and non-coding effectors and also associates many new lncRNAs to relevant pathways. The associations are often highly reproducible across cancer types, and while many lncRNAs are co-expressed with their protein-coding hosts or neighbours, some are intergenic and independent. We highlight lncRNAs with possible functions downstream of the tumour suppressor TP53 and the master antioxidant transcription factor NFE2L2. Our study provides a comprehensive overview of lncRNA transcriptional alterations in relation to key driver mutational events in human cancers. PMID:28959951

  14. A Comparison of Six MMPI Short Forms: Code Type Correspondence and Indices of Psychopathology.

    ERIC Educational Resources Information Center

    Willcockson, James C.; And Others

    1983-01-01

    Compared six Minnesota Multiphasic Personality Inventory (MMPI) short forms with the full-length MMPI for ability to identify code-types and indices of psychopathology in renal dialysis patients (N=53) and paranoid schizophrenics (N=58). Results suggested that the accuracy of the short forms fluctuates for different patient populations and…

  15. 78 FR 72878 - Revisions to Procedural Regulations Governing Filing, Indexing and Service by Oil Pipelines...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-04

    ...-5-000] Revisions to Procedural Regulations Governing Filing, Indexing and Service by Oil Pipelines, Electronic Tariff Filings; Notice of Changes to eTariff Part 341 Type of Filing Codes Order No. 780... available eTariff Type of Filing Codes (TOFC) will be modified as follows: Filing, Indexing and...

  16. A Study on the Correlation of Pertrochanteric Osteoporotic Fracture Severity with the Severity of Osteoporosis.

    PubMed

    Hayer, Prabhnoor Singh; Deane, Anit Kumar Samuel; Agrawal, Atul; Maheshwari, Rajesh; Juyal, Anil

    2016-04-01

    Osteoporosis is a metabolic bone disease caused by progressive bone loss. It is characterized by low Bone Mineral Density (BMD) and structural deterioration of bone tissue leading to bone fragility and increased risk of fractures. When classifying a fracture, high reliability and validity are crucial for successful treatment. Furthermore, a classification system should include severity, method of treatment, and prognosis for any given fracture. Since it is known that treatment significantly influences prognosis, a classification system claiming to include both would be desirable. Since there is no such classification system, which includes both the fracture type and the osteoporosis severity, we tried to find a correlation between fracture severity and osteoporosis severity. The aim of the study was to evaluate whether the AO/ASIF fracture classification system, which indicates the severity of fractures, has any relationship with the bone mineral status in patients with primary osteoporosis. We hypothesized that fracture severity and severity of osteoporosis should show some correlation. An observational analytical study was conducted over a period of one year during which 49 patients were included in the study at HIMS, SRH University, Dehradun. The osteoporosis status of all the included patients with a pertrochanteric fracture was documented using a DEXA scan and T-Score (BMD) was calculated. All patients had a trivial trauma. All the fractures were classified as per AO/ASIF classification. Pearson Correlation between BMD and fracture type was calculated. Data was entered on Microsoft Office Excel version 2007 and Interpretation and analysis of obtained data was done using summary statistics. Pearson Correlation between BMD and fracture type was calculated using the SPSS software version 22.0. The average age of the patients included in the study was 71.2 years and the average bone mineral density was -4.9. 
The correlation between BMD and fracture type was calculated; the r-value obtained was 0.180, indicating a low correlation, and the p-value was 0.215, which was not statistically significant. Statistically, the pertrochanteric fracture configuration as per the AO classification does not correlate with the osteoporosis severity of the patient.
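The correlation analysis described above can be sketched in a few lines of Python. The data below are hypothetical stand-ins (the abstract does not publish patient-level values), pairing DEXA T-scores with ordinal AO fracture-type ranks; `pearson_r` is the standard product-moment formula.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical stand-in data: DEXA T-scores paired with ordinal AO type ranks.
bmd = [-5.1, -4.8, -4.2, -5.5, -3.9, -4.6]
fx_rank = [3, 2, 2, 3, 1, 2]
r = pearson_r(bmd, fx_rank)
# Significance would then be judged from t = r*sqrt(n-2)/sqrt(1-r**2)
# against a t distribution with n-2 degrees of freedom.
```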

  17. Robustness of Two Formulas to Correct Pearson Correlation for Restriction of Range

    ERIC Educational Resources Information Center

Tran, Minh

    2011-01-01

    Many research studies involving Pearson correlations are conducted in settings where one of the two variables has a restricted range in the sample. For example, this situation occurs when tests are used for selecting candidates for employment or university admission. Often after selection, there is interest in correlating the selection variable,…
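The abstract does not name the two correction formulas; one classical candidate for direct restriction on the selection variable is the Thorndike Case II correction, sketched below under that assumption.

```python
import math

def thorndike_case2(r_restricted, sd_restricted, sd_unrestricted):
    """Correct a Pearson r computed in a range-restricted sample, given the
    selection variable's SD in the restricted sample and in the population."""
    u = sd_unrestricted / sd_restricted
    return (r_restricted * u) / math.sqrt(
        1.0 - r_restricted ** 2 + (r_restricted * u) ** 2)
```

With u = 1 (no restriction) the correction is the identity; u > 1 inflates the restricted r back toward its population value.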

  18. Ecological restoration experiments (1992-2007) at the G. A. Pearson Natural Area, Fort Valley Experimental Forest

    Treesearch

    Margaret M. Moore; W. Wallace Covington; Peter Z. Fule; Stephen C. Hart; Thomas E. Kolb; Joy N. Mast; Stephen S. Sackett; Michael R. Wagner

    2008-01-01

    In 1992 an experiment was initiated at the G. A. Pearson Natural Area on the Fort Valley Experimental Forest to evaluate long-term ecosystem responses to two restoration treatments: thinning only and thinning with prescribed burning. Fifteen years of key findings about tree physiology, herbaceous, and ecosystem responses are presented.

  19. Practitioner Perspectives: Children's Use of Technology in the Early Years

    ERIC Educational Resources Information Center

    Formby, Susie

    2014-01-01

    This research, a collaboration between Pearson and the National Literacy Trust, was designed to explore the use of technology by children in the early years. In 2013 Pearson and the National Literacy Trust invited practitioners who work with three to five-year-olds to take part in an online survey to explore how they support children's language…

  20. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality

    ERIC Educational Resources Information Center

    Bishara, Anthony J.; Hittner, James B.

    2015-01-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared…
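The Monte Carlo setup described — sampling correlated pairs, optionally transforming the margins to induce nonnormality, and averaging the sample correlation — can be sketched as follows. The lognormal-style transform and parameter values are illustrative assumptions, not the article's exact conditions.

```python
import math
import random

def pearson_r(x, y):
    """Sample Pearson correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def mean_sample_r(n=30, rho=0.5, reps=2000, skew=False, seed=1):
    """Average sample r over many bivariate-normal replications; skew=True
    exponentiates both margins (lognormal-style nonnormality), which
    typically attenuates the observed correlation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        xs, ys = [], []
        for _ in range(n):
            z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
            x = z1
            y = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
            if skew:
                x, y = math.exp(x), math.exp(y)
            xs.append(x)
            ys.append(y)
        total += pearson_r(xs, ys)
    return total / reps
```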

  1. An Assessment of External Organizational Marketing at Naval Aviation Depot (NADEP) North Island

    DTIC Science & Technology

    2005-12-01

organizational marketing. In his book "A Framework for Marketing Management," marketing expert Philip Kotler addresses marketing as it applies to... Kotler, Philip, A Framework for Marketing Management (Second Edition), Pearson Education, Inc., 2003. ...strengthening its

  2. Prentice Hall/Pearson Literature© (2007-15). What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2017

    2017-01-01

    "Prentice Hall/Pearson Literature©" (2007-15) is an English language arts curriculum designed for students in grades 6-12 that focuses on building reading, vocabulary, literary analysis, and writing skills. It uses passages from fiction and nonfiction texts, poetry, and contemporary digital media. The curriculum is based on a textbook.…

  3. Abuses and Mysteries at the Association of Social Work Boards

    ERIC Educational Resources Information Center

    Woodcock, Ray

    2016-01-01

    Under contract with the Association of Social Work Boards (ASWB), Pearson VUE reportedly performs much of the work of developing and administering the social work licensing exams required by most states. ASWB charges substantial fees for such exams and, after paying Pearson, has been able to bank considerable sums. One of the key contributions to…

  4. Quantitative comparison of the absorption spectra of the gas mixtures in analogy to the criterion of Pearson

    NASA Astrophysics Data System (ADS)

    Kistenev, Yu. V.; Kuzmin, D. A.; Sandykova, E. A.; Shapovalov, A. V.

    2015-11-01

An approach to reducing the space of absorption spectra, based on an original criterion for profile analysis of the spectra, is proposed. The criterion traces back to Pearson's well-known chi-squared test. The introduced criterion makes it possible to quantify differences between spectral curves.
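The abstract does not reproduce the criterion itself; a common chi-squared-type profile distance in this spirit (a hypothetical sketch, not the authors' formula) compares two area-normalized absorption spectra:

```python
def chi2_distance(s1, s2, eps=1e-12):
    """Symmetric chi-squared-type distance between two spectra after
    normalizing each to unit area; 0 for proportional profiles."""
    t1, t2 = sum(s1), sum(s2)
    p = [v / t1 for v in s1]
    q = [v / t2 for v in s2]
    return 0.5 * sum((a - b) ** 2 / (a + b + eps) for a, b in zip(p, q))
```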

  5. Fusarium Osteomyelitis in a Patient With Pearson Syndrome: Case Report and Review of the Literature.

    PubMed

    Hiebert, Rachael M; Welliver, Robert C; Yu, Zhongxin

    2016-10-01

Fusarium species are ubiquitous fungi causing a wide array of infections, including invasive disease in the immunosuppressed. We present a case of Fusarium bone infection in a child with Pearson syndrome and review the literature. Ten cases of Fusarium osteomyelitis have been reported in the past 40 years, and we review their treatments.

  6. A Successful Recipe? Aspects of the Initial Training of Secondary Teachers of Foreign Languages

    ERIC Educational Resources Information Center

    Pearson, Sue; Chambers, Gary

    2005-01-01

    An earlier study (Pearson, 2005) raised concerns about the adequacy of the current Postgraduate Certificate in Education (PGCE) courses in relation to inclusive education and suggested some relevant questions for training providers (and students). In this article, Sue Pearson and Gary Chambers report on a small-scale study involving a group of…

  7. A New Way to Teach (or Compute) Pearson's "r" without Reliance on Cross-Products

    ERIC Educational Resources Information Center

    Huck, Schuyler W.; Ren, Bixiang; Yang, Hongwei

    2007-01-01

    Many students have difficulty seeing the conceptual link between bivariate data displayed in a scatterplot and the statistical summary of the relationship, "r." This article shows how to teach (and compute) "r" such that each datum's direct and indirect influences are made apparent and used in a new formula for calculating Pearson's "r."
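The article's specific formula is not given in this abstract, but one well-known cross-product-free identity illustrates the idea: after standardizing both variables, r = 1 - (1/2n)·Σ(z_x - z_y)², so each point's contribution appears as a squared z-score gap rather than a cross-product.

```python
import math

def r_from_z_gaps(x, y):
    """Pearson's r via r = 1 - (1/(2n)) * sum((z_x - z_y)**2), an identity
    that avoids explicit cross-products (population-SD standardization)."""
    n = len(x)
    def zscores(v):
        m = sum(v) / n
        sd = math.sqrt(sum((a - m) ** 2 for a in v) / n)
        return [(a - m) / sd for a in v]
    zx, zy = zscores(x), zscores(y)
    return 1.0 - sum((a - b) ** 2 for a, b in zip(zx, zy)) / (2 * n)
```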

  8. Comparison of the Mahalanobis distance and Pearson's χ² statistic as measures of similarity of isotope patterns.

    PubMed

    Zamanzad Ghavidel, Fatemeh; Claesen, Jürgen; Burzykowski, Tomasz; Valkenborg, Dirk

    2014-02-01

    To extract a genuine peptide signal from a mass spectrum, an observed series of peaks at a particular mass can be compared with the isotope distribution expected for a peptide of that mass. To decide whether the observed series of peaks is similar to the isotope distribution, a similarity measure is needed. In this short communication, we investigate whether the Mahalanobis distance could be an alternative measure for the commonly employed Pearson's χ(2) statistic. We evaluate the performance of the two measures by using a controlled MALDI-TOF experiment. The results indicate that Pearson's χ(2) statistic has better discriminatory performance than the Mahalanobis distance and is a more robust measure.
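The two similarity measures can be sketched as follows; the diagonal-covariance Mahalanobis variant is a simplifying assumption for illustration, since the paper's actual covariance estimate is not given in the abstract.

```python
import math

def pearson_chi2(observed, expected):
    """Pearson chi-squared statistic comparing an observed peak series with a
    theoretical isotope distribution (both normalized to sum to 1)."""
    to, te = sum(observed), sum(expected)
    o = [v / to for v in observed]
    e = [v / te for v in expected]
    return sum((a - b) ** 2 / b for a, b in zip(o, e) if b > 0)

def mahalanobis_diag(observed, expected, variances):
    """Mahalanobis distance under a simplifying diagonal-covariance
    assumption: per-coordinate squared errors weighted by their variances."""
    return math.sqrt(sum((o - e) ** 2 / v
                         for o, e, v in zip(observed, expected, variances)))
```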

  9. Comparison of the Mahalanobis Distance and Pearson's χ2 Statistic as Measures of Similarity of Isotope Patterns

    NASA Astrophysics Data System (ADS)

    Zamanzad Ghavidel, Fatemeh; Claesen, Jürgen; Burzykowski, Tomasz; Valkenborg, Dirk

    2014-02-01

    To extract a genuine peptide signal from a mass spectrum, an observed series of peaks at a particular mass can be compared with the isotope distribution expected for a peptide of that mass. To decide whether the observed series of peaks is similar to the isotope distribution, a similarity measure is needed. In this short communication, we investigate whether the Mahalanobis distance could be an alternative measure for the commonly employed Pearson's χ2 statistic. We evaluate the performance of the two measures by using a controlled MALDI-TOF experiment. The results indicate that Pearson's χ2 statistic has better discriminatory performance than the Mahalanobis distance and is a more robust measure.

  10. Consumer knowledge and attitudes about genetically modified food products and labelling policy.

    PubMed

    Vecchione, Melissa; Feldman, Charles; Wunderlich, Shahla

    2015-05-01

    The purpose of this study was to examine the relationship between consumer knowledge, attitudes and behaviours towards foods containing genetically modified organisms (GMOs) and the prevalence of GMO labelling in northern New Jersey supermarkets. This cross-sectional study surveyed 331 adults, New Jersey supermarket customers (mean age 26 years old, 79.8% women). The results show a strong, positive correlation between consumer attitudes towards foods not containing GMOs and purchasing behaviour (Pearson's r = 0.701, p < 0.001) with lesser correlations between knowledge and behaviour (Pearson's r = 0.593, p < 0.001) and knowledge and attitudes (Pearson's r = 0.413, p < 0.001). GMO labelling would assist consumers in making informed purchase decisions.

  11. Endothelial necrosis at 1h post-burn predicts progression of tissue injury

    PubMed Central

    Hirth, Douglas; McClain, Steve A.; Singer, Adam J.; Clark, Richard A.F.

    2013-01-01

    Burn injury progression has not been well characterized at the cellular level. To define burn injury progression in terms of cell death, histopathologic spatiotemporal relationships of cellular necrosis and apoptosis were investigated in a validated porcine model of vertical burn injury progression. Cell necrosis was identified by High Mobility Group Box 1 protein and apoptosis by Caspase 3a staining of tissue samples taken 1h, 24h and 7 days post-burn. Level of endothelial cell necrosis at 1h was predictive of level of apoptosis at 24h (Pearson's r=0.87) and of level of tissue necrosis at 7 days (Pearson's r=0.87). Furthermore, endothelial cell necrosis was deeper than interstitial cell necrosis at 1h (p<0.001). Endothelial cell necrosis at 1h divided the zone of injury progression (Jackson's zone of stasis) into an upper subzone with necrotic endothelial cells and initially viable adnexal and interstitial cells at 1h that progressed to necrosis by 24h, and a lower zone with initially viable endothelial cells at 1h, but necrosis and apoptosis of all cell types by 24h. Importantly, this spatiotemporal series of events and rapid progression resembles myocardial infarction and stroke, and implicates mechanisms of these injuries, ischemia, ischemia reperfusion, and programmed cell death, in burn progression. PMID:23627744

  12. Cerebral Glucose Metabolism and Sedation in Brain-injured Patients: A Microdialysis Study.

    PubMed

    Hertle, Daniel N; Santos, Edgar; Hagenston, Anna M; Jungk, Christine; Haux, Daniel; Unterberg, Andreas W; Sakowitz, Oliver W

    2015-07-01

    Disturbed brain metabolism is a signature of primary damage and/or precipitates secondary injury processes after severe brain injury. Sedatives and analgesics target electrophysiological functioning and are as such well-known modulators of brain energy metabolism. Still unclear, however, is how sedatives impact glucose metabolism and whether they differentially influence brain metabolism in normally active, healthy brain and critically impaired, injured brain. We therefore examined and compared the effects of anesthetic drugs under both critical (<1 mmol/L) and noncritical (>1 mmol/L) extracellular brain glucose levels. We performed an explorative, retrospective analysis of anesthetic drug administration and brain glucose concentrations, obtained by bedside microdialysis, in 19 brain-injured patients. Our investigations revealed an inverse linear correlation between brain glucose and both the concentration of extracellular glutamate (Pearson r=-0.58, P=0.01) and the lactate/glucose ratio (Pearson r=-0.55, P=0.01). For noncritical brain glucose levels, we observed a positive linear correlation between midazolam dose and brain glucose (P<0.05). For critical brain glucose levels, extracellular brain glucose was unaffected by any type of sedative. These findings suggest that the use of anesthetic drugs may be of limited value in attempts to influence brain glucose metabolism in injured brain tissue.

  13. Testing for independence in J×K contingency tables with complex sample survey data.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Sinha, Debajyoti; Hevelone, Nathanael; Giovannucci, Edward; Hu, Jim C

    2015-09-01

    The test of independence of row and column variables in a (J×K) contingency table is a widely used statistical test in many areas of application. For complex survey samples, use of the standard Pearson chi-squared test is inappropriate due to correlation among units within the same cluster. Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) proposed an approach in which the standard Pearson chi-squared statistic is multiplied by a design effect to adjust for the complex survey design. Unfortunately, this test fails to exist when one of the observed cell counts equals zero. Even with the large samples typical of many complex surveys, zero cell counts can occur for rare events, small domains, or contingency tables with a large number of cells. Here, we propose Wald and score test statistics for independence based on weighted least squares estimating equations. In contrast to the Rao-Scott test statistic, the proposed Wald and score test statistics always exist. In simulations, the score test is found to perform best with respect to type I error. The proposed method is motivated by, and applied to, post surgical complications data from the United States' Nationwide Inpatient Sample (NIS) complex survey of hospitals in 2008. © 2015, The International Biometric Society.
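The Rao-Scott first-order approach referenced above can be sketched as: compute the ordinary Pearson X² for the J×K table, then scale it by an estimated mean design effect before referring it to the reference distribution. A minimal sketch, with the design effect supplied as a number rather than estimated from survey data:

```python
def pearson_chi2_table(table):
    """Ordinary Pearson chi-squared statistic of independence for a J x K
    contingency table given as a list of rows."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = float(sum(row))
    x2 = 0.0
    for j, r in enumerate(table):
        for k, o in enumerate(r):
            e = row[j] * col[k] / n   # expected count under independence
            x2 += (o - e) ** 2 / e
    return x2

def rao_scott_first_order(x2, design_effect):
    """First-order Rao-Scott correction: deflate X^2 by an estimated mean
    design effect before comparison with the chi-squared reference."""
    return x2 / design_effect
```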

  14. Pelvic floor muscle strength of women consulting at the gynecology outpatient clinics and its correlation with sexual dysfunction: A cross-sectional study.

    PubMed

    Ozdemir, Filiz Ciledag; Pehlivan, Erkan; Melekoglu, Rauf

    2017-01-01

To investigate the pelvic floor muscle strength of women and evaluate its possible correlation with sexual dysfunction. In this cross-sectional study, stratified cluster sampling was used. The Index of Female Sexual Function (IFSF) worksheet was used for questions on sexual function. The pelvic floor muscle strength of subjects was assessed by perineometer. The chi-squared test, logistic regression, and Pearson's correlation analysis were used for the statistical analysis. Four hundred thirty primiparous women, mean age 38.5 years, participated in this study. The average pelvic floor muscle strength was 31.4 ± 9.6 cmH2O and the average IFSF score was 26.5 ± 6.9. Parity (odds ratio, OR = 5.546) and age 40 or higher (OR = 3.484) were found to correlate with pelvic floor muscle weakness (p < 0.05). The factors directly correlated with sexual dysfunction were being overweight (OR = 2.105) and age 40 or higher (OR = 2.451) (p < 0.05). Pearson's correlation analysis showed a statistically significant linear correlation between pelvic floor muscle strength and sexual function (p = 0.001). The results suggest that subjects with decreased pelvic floor muscle strength had a higher frequency of sexual dysfunction.

  15. The effect of code expanding optimizations on instruction cache design

    NASA Technical Reports Server (NTRS)

    Chen, William Y.; Chang, Pohua P.; Conte, Thomas M.; Hwu, Wen-Mei W.

    1991-01-01

It is shown that code expanding optimizations have strong and non-intuitive implications for instruction cache design. Three types of code expanding optimizations are studied: instruction placement, function inline expansion, and superscalar optimizations. Overall, instruction placement reduces the miss ratio of small caches. Function inline expansion improves the performance for small cache sizes, but degrades the performance of medium caches. Superscalar optimizations increase the cache size required for a given miss ratio. On the other hand, they also increase the sequentiality of instruction access, so that a simple load-forward scheme effectively cancels the negative effects. Overall, it is shown that with load forwarding, the three types of code expanding optimizations jointly improve the performance of small caches and have little effect on large caches.

  16. Drug overdose surveillance using hospital discharge data.

    PubMed

    Slavova, Svetla; Bunn, Terry L; Talbert, Jeffery

    2014-01-01

    We compared three methods for identifying drug overdose cases in inpatient hospital discharge data on their ability to classify drug overdoses by intent and drug type(s) involved. We compared three International Classification of Diseases, Ninth Revision, Clinical Modification code-based case definitions using Kentucky hospital discharge data for 2000-2011. The first definition (Definition 1) was based on the external-cause-of-injury (E-code) matrix. The other two definitions were based on the Injury Surveillance Workgroup on Poisoning (ISW7) consensus recommendations for national and state poisoning surveillance using the principal diagnosis or first E-code (Definition 2) or any diagnosis/E-code (Definition 3). Definition 3 identified almost 50% more drug overdose cases than did Definition 1. The increase was largely due to cases with a first-listed E-code describing a drug overdose but a principal diagnosis that was different from drug overdose (e.g., mental disorders, or respiratory or circulatory system failure). Regardless of the definition, more than 53% of the hospitalizations were self-inflicted drug overdoses; benzodiazepines were involved in about 30% of the hospitalizations. The 2011 age-adjusted drug overdose hospitalization rate in Kentucky was 146/100,000 population using Definition 3 and 107/100,000 population using Definition 1. The ISW7 drug overdose definition using any drug poisoning diagnosis/E-code (Definition 3) is potentially the highest sensitivity definition for counting drug overdose hospitalizations, including by intent and drug type(s) involved. As the states enact policies and plan for adequate treatment resources, standardized drug overdose definitions are critical for accurate reporting, trend analysis, policy evaluation, and state-to-state comparison.
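The three case definitions can be sketched as code-list matching over a discharge record. The code stems below are illustrative placeholders, NOT the actual ISW7 or E-code-matrix lists.

```python
# Illustrative placeholder code stems -- not the actual surveillance lists.
OVERDOSE_DX_STEMS = ("960", "961", "962")   # poisoning-by-drug diagnosis stems
OVERDOSE_E_STEMS = ("E850", "E950.0")       # external-cause-of-injury stems

def _matches(code, stems):
    return any(code.startswith(s) for s in stems)

def is_overdose(dx_codes, e_codes, definition):
    """Classify one discharge under the three definitions compared in the
    study: (1) first E-code only, (2) principal diagnosis or first E-code,
    (3) any listed diagnosis or E-code."""
    if definition == 1:
        return bool(e_codes) and _matches(e_codes[0], OVERDOSE_E_STEMS)
    if definition == 2:
        return ((bool(dx_codes) and _matches(dx_codes[0], OVERDOSE_DX_STEMS))
                or (bool(e_codes) and _matches(e_codes[0], OVERDOSE_E_STEMS)))
    return (any(_matches(c, OVERDOSE_DX_STEMS) for c in dx_codes)
            or any(_matches(c, OVERDOSE_E_STEMS) for c in e_codes))
```

A record whose overdose code appears only in a secondary diagnosis position counts under Definition 3 but not under Definitions 1 or 2, which is how the broader definition identifies substantially more cases.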

  17. A systematic literature review of automated clinical coding and classification systems

    PubMed Central

    Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome. PMID:20962126

  18. Code development for ships -- A demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyub, B.; Mansour, A.E.; White, G.

    1996-12-31

A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only hull girder modes are presented in this paper. Code requirements for the other modes will be presented in a future publication. A specific provision of the code will be a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence of failure factors, are considered. This paper provides a summary of safety check expressions for the hull girder modes.
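A safety check expression of the general partial-safety-factor form (a sketch of the format only, not the paper's calibrated provisions) compares factored nominal resistance against the sum of factored nominal load effects:

```python
def safety_check(nominal_resistance, nominal_loads, phi, gammas):
    """Partial-safety-factor check: phi * R_n >= sum(gamma_i * L_i).
    Returns True when the limit state is satisfied."""
    demand = sum(g * load for g, load in zip(gammas, nominal_loads))
    return phi * nominal_resistance >= demand

# Hypothetical numbers: factored capacity 0.9*120 vs. demand 1.2*40 + 1.6*30.
ok = safety_check(120.0, [40.0, 30.0], phi=0.9, gammas=[1.2, 1.6])
```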

  19. A systematic literature review of automated clinical coding and classification systems.

    PubMed

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  20. Entropy-Based Bounds On Redundancies Of Huffman Codes

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.

    1992-01-01

Report presents extension of theory of redundancy of binary prefix codes of Huffman type, including derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes are often closer to 0 than to 1.

  1. Searching the Social Sciences Citation Index on BRS.

    ERIC Educational Resources Information Center

    Janke, Richard V.

    1980-01-01

    Concentrates on describing and illustrating by example the unique BRS features of the online Social Sciences Citation Index. Appendices provide a key to the BRS/SSCI citation elements, BRS standardized language codes, publication type codes, author's classification of BRS/SSCI subject category codes, search examples, and database specifications.…

  2. Mechanical properties of contemporary composite resins and their interrelations.

    PubMed

    Thomaidis, Socratis; Kakaboura, Afrodite; Mueller, Wolf Dieter; Zinelis, Spiros

    2013-08-01

To characterize a spectrum of mechanical properties of four representative types of modern dental resin composites and to investigate possible interrelations. Four composite resins were used, a microhybrid (Filtek Z-250), a nanofill (Filtek Ultimate), a nanohybrid (Majesty Posterior) and an ormocer (Admira). The mechanical properties investigated were Flexural Modulus and Flexural Strength (three point bending), Brinell Hardness, Impact Strength, mode I and mode II fracture toughness employing SENB and Brazilian tests, and Work of Fracture. Fractographic analysis was carried out in an SEM to determine the origin of fracture for specimens subjected to SENB, Brazilian and Impact Strength testing. The results were statistically analyzed employing ANOVA and the Tukey post hoc test (α = 0.05), while Pearson correlation was applied among the mechanical properties. Significant differences were found between the mechanical properties of materials tested apart from mode I fracture toughness measured by the Brazilian test. The latter significantly underestimated the mode I fracture toughness due to analytical limitations and thus its validity is questionable. Fractography revealed that the origin of fracture is located at notches for fracture toughness tests and at the contact surface with the pendulum for Impact Strength testing. Pearson analysis illustrated a strong correlation between modulus of elasticity and hardness (r=0.87) and a weak negative correlation between Work of Fracture and Flexural Modulus (r=-0.46) and Work of Fracture and Hardness (r=-0.44). Weak correlations were also found between Flexural Modulus and Flexural Strength (r=0.40), Flexural Strength and Hardness (r=0.39), and Impact Strength and Hardness (r=0.40). Since the four types of dental resin composite tested exhibited large differences among their mechanical properties, differences in their clinical performance are also anticipated. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  3. Quality of life of gestational trophoblastic neoplasia survivors: a study of patients at the Philippine General Hospital trophoblastic disease section.

    PubMed

    Cagayan, M Stephanie Fay S; Llarena, Raquel T

    2010-01-01

    To evaluate the quality of life (QOL) of patients who were diagnosed with gestational trophoblastic neoplasia (GTN) at the Philippine General Hospital Trophoblastic Disease Section and who were in remission at the time of this study. A cross-sectional descriptive study designed to measure the QOL of all patients diagnosed as having GTN in remission and following up at the Philippine General Hospital Trophoblastic Disease Outpatient Clinic from May-August 2008 (N = 46). This study used the short form 12-question (SF-12) survey forms to evaluate the QOL of patients diagnosed with GTN. Scores from the SF-12 were analyzed using Pearson's correlation. Statistical significance was assumed for p values < 0.05 and 0.01 for all statistical tests. Forty-six GTN survivors included in the study successfully answered all the questions. Using Pearson's correlation of demographic characteristic variables and SF-12 domains, it was found that there was better physical functioning among younger patients, and there was mild limitation in moderate activities during a typical day among older patients. There was a significant positive correlation between educational level and physical functioning. A negative correlation was found between the stage of GTN and patients' general health. In conclusion, the survivors' age, educational level and type of treatment had impact on the QOL among GTN survivors in terms of physical functioning. No relationship was established between the demographic variables and mental status. SF-12 appears to be a reliable instrument, suggesting its potential in measuring health status in GTN survivors. Age, educational attainment and type of treatment were shown to have an impact on the QOL of the surviving GTN patients.

  4. Effect of Cyclic Loading on Micromotion at the Implant-Abutment Interface.

    PubMed

    Karl, Matthias; Taylor, Thomas D

    2016-01-01

    Cyclic loading may cause settling of abutments mounted on dental implants, potentially affecting screw joint stability and implant-abutment micromotion. It was the goal of this in vitro study to compare micromotion of implant-abutment assemblies before and after masticatory simulation. Six groups of abutments (n = 5) for a specific tissue-level implant system with an internal octagon were subject to micromotion measurements. The implant-abutment assemblies were loaded in a universal testing machine, and an apparatus and extensometers were used to record displacement. This was done twice, in the condition in which they were received from the abutment manufacturer and after simulated loading (100,000 cycles; 100 N). Statistical analysis was based on analysis of variance, two-sample t tests (Welch tests), and Pearson product moment correlation (α = .05). The mean values for micromotion ranged from 33.15 to 63.41 μm and from 30.03 to 42.40 μm before and after load cycling. The general trend toward reduced micromotion following load cycling was statistically significant only for CAD/CAM zirconia abutments (P = .036) and for one type of clone abutment (P = .012), with no significant correlation between values measured before and after cyclic loading (Pearson product moment correlation; P = .104). While significant differences in micromotion were found prior to load cycling, no significant difference among any of the abutment types tested could be observed afterward (P > .05 in all cases). A quantifiable settling effect at the implant-abutment interface seems to result from cyclic loading, leading to a decrease in micromotion. This effect seems to be more pronounced in low-quality abutments. For the implant system tested in this study, retightening of abutment screws is recommended after an initial period of clinical use.

  5. Comparison between uroflowmetry and sonouroflowmetry in recording of urinary flow in healthy men.

    PubMed

    Krhut, Jan; Gärtner, Marcel; Sýkora, Radek; Hurtík, Petr; Burda, Michal; Luňáček, Libor; Zvarová, Katarína; Zvara, Peter

    2015-08-01

    To evaluate the accuracy of sonouroflowmetry in recording urinary flow parameters and voided volume. A total of 25 healthy male volunteers (age 18-63 years) were included in the study. All participants were asked to carry out uroflowmetry synchronous with recording of the sound generated by the urine stream hitting the water level in the urine collection receptacle, using a dedicated cell phone. From 188 recordings, 34 were excluded, because of voided volume <150 mL or technical problems during recording. Sonouroflowmetry recording was visualized in a form of a trace, representing sound intensity over time. Subsequently, the matching datasets of uroflowmetry and sonouroflowmetry were compared with respect to flow time, voided volume, maximum flow rate and average flow rate. Pearson's correlation coefficient was used to compare parameters recorded by uroflowmetry with those calculated based on sonouroflowmetry recordings. The flow pattern recorded by sonouroflowmetry showed a good correlation with the uroflowmetry trace. A strong correlation (Pearson's correlation coefficient 0.87) was documented between uroflowmetry-recorded flow time and duration of the sound signal recorded with sonouroflowmetry. A moderate correlation was observed in voided volume (Pearson's correlation coefficient 0.68) and average flow rate (Pearson's correlation coefficient 0.57). A weak correlation (Pearson's correlation coefficient 0.38) between maximum flow rate recorded using uroflowmetry and sonouroflowmetry-recorded peak sound intensity was documented. The present study shows that the basic concept utilizing sound analysis for estimation of urinary flow parameters and voided volume is valid. However, further development of this technology and standardization of recording algorithm are required. © 2015 The Japanese Urological Association.

  6. Effects of Normal and Perturbed Social Play on the Duration and Amplitude of Different Types of Infant Smiles

    ERIC Educational Resources Information Center

    Fogel, Alan; Hsu, Hui-Chin; Shapiro, Alyson F.; Nelson-Goens, G. Christina; Secrist, Cory

    2006-01-01

    Different types of smiling varying in amplitude of lip corner retraction were investigated during 2 mother-infant games--peekaboo and tickle--at 6 and 12 months and during normally occurring and perturbed games. Using Facial Action Coding System (FACS), infant smiles were coded as simple (lip corner retraction only), Duchenne (simple plus cheek…

  7. 30 CFR 206.56 - Transportation allowances-general.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... oil has been determined under § 206.52 or § 206.53 of this subpart at a point (e.g., sales point or... sales type code may not exceed 50 percent of the value of the oil at the point of sale as determined under § 206.52 of this subpart. Transportation costs cannot be transferred between sales type codes or...

  8. Reporting of Sepsis Cases for Performance Measurement Versus for Reimbursement in New York State.

    PubMed

    Prescott, Hallie C; Cope, Tara M; Gesten, Foster C; Ledneva, Tatiana A; Friedrich, Marcus E; Iwashyna, Theodore J; Osborn, Tiffany M; Seymour, Christopher W; Levy, Mitchell M

    2018-05-01

    Under "Rory's Regulations," New York State Article 28 acute care hospitals were mandated to implement sepsis protocols and report patient-level data. This study sought to determine how well cases reported under state mandate align with discharge records in a statewide administrative database. Observational cohort study. First 27 months of mandated sepsis reporting (April 1, 2014, to June 30, 2016). Hospitalizations with sepsis at New York State Article 28 acute care hospitals. Sepsis regulations with mandated reporting. We compared cases reported to the New York State Department of Health Sepsis Clinical Database with discharge records in the Statewide Planning and Research Cooperative System database. We classified discharges as 1) "coded sepsis discharges"-a diagnosis code for severe sepsis or septic shock and 2) "possible sepsis discharges," using Dombrovskiy and Angus criteria. Of 111,816 sepsis cases reported to the New York State Department of Health Sepsis Clinical Database, 105,722 (94.5%) were matched to discharge records in Statewide Planning and Research Cooperative System. The percentage of coded sepsis discharges reported increased from 67.5% in the first quarter to 81.3% in the final quarter of the study period (mean, 77.7%). Accounting for unmatched cases, as many as 82.7% of coded sepsis discharges were potentially reported, whereas at least 17.3% were unreported. Compared with unreported discharges, reported discharges had higher rates of acute organ dysfunction (e.g., cardiovascular dysfunction 63.0% vs 51.8%; p < 0.001) and higher in-hospital mortality (30.2% vs 26.1%; p < 0.001). Hospital characteristics (e.g., number of beds, teaching status, volume of sepsis cases) were similar between hospitals with a higher versus lower percent of discharges reported, p values greater than 0.05 for all. 
Hospitals' percent of discharges reported was not correlated with risk-adjusted mortality of their submitted cases (Pearson correlation coefficient 0.11; p = 0.17). Approximately four of five discharges with a diagnosis code of severe sepsis or septic shock in the Statewide Planning and Research Cooperative System data were reported in the New York State Department of Health Sepsis Clinical Database. Incomplete reporting appears to be driven more by underrecognition than attempts to game the system, with minimal bias to risk-adjusted hospital performance measurement.

  9. Ecological restoration experiments (1992-2007) at the G.A. Pearson Natural Area, Fort Valley Experimental Forest (P-53)

    Treesearch

    Margaret M. Moore; Wallace Covington; Peter Z. Fulé; Stephen C. Hart; Thomas E. Kolb; Joy N. Mast; Stephen S. Sackett; Michael R. Wagner

    2008-01-01

    In 1992 an experiment was initiated at the G. A. Pearson Natural Area on the Fort Valley Experimental Forest to evaluate long-term ecosystem responses to two restoration treatments: thinning only and thinning with prescribed burning. Fifteen years of key findings about tree physiology, herbaceous plant, and ecosystem responses are presented.

  10. Test Review: Wechsler, D. (2014),"Wechsler Intelligence Scale for Children, Fifth Edition: Canadian (WISC-V[superscript CDN])." Toronto, Ontario: Pearson Canada Assessment.

    ERIC Educational Resources Information Center

    Cormier, Damien C.; Kennedy, Kathleen E.; Aquilina, Alexandra M.

    2016-01-01

    The Wechsler Intelligence Scale for Children, Fifth Edition: Canadian (WISC-V[superscript CDN]; Wechsler, 2014) is published by Pearson Canada Assessment. The WISC-V[superscript CDN] is a norm-referenced, individually administered intelligence battery that provides a comprehensive diagnostic profile of the cognitive strengths and weaknesses of…

  11. Comparing Pearson, Spearman and Hoeffding's D measure for gene expression association analysis.

    PubMed

    Fujita, André; Sato, João Ricardo; Demasi, Marcos Angelo Almeida; Sogayar, Mari Cleide; Ferreira, Carlos Eduardo; Miyano, Satoru

    2009-08-01

    DNA microarrays have become a powerful tool to describe gene expression profiles associated with different cellular states, various phenotypes and responses to drugs and other extra- or intra-cellular perturbations. In order to cluster co-expressed genes and/or to construct regulatory networks, definition of distance or similarity between measured gene expression data is usually required, the most common choices being Pearson's and Spearman's correlations. Here, we evaluate these two methods and also compare them with a third one, namely Hoeffding's D measure, which is used to infer nonlinear and non-monotonic associations, i.e. independence in a general sense. By comparing three different variable association approaches, namely Pearson's correlation, Spearman's correlation and Hoeffding's D measure, we aimed at assessing the most appropriate one for each purpose. Using simulations, we demonstrate that the Hoeffding's D measure outperforms Pearson's and Spearman's approaches in identifying nonlinear associations. Our results demonstrate that Hoeffding's D measure is less sensitive to outliers and is a more powerful tool to identify nonlinear and non-monotonic associations. We have also applied Hoeffding's D measure in order to identify new putative genes associated with tp53. Therefore, we propose the Hoeffding's D measure to identify nonlinear associations between gene expression profiles.
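
    The trade-off the paper measures can be reproduced with tie-free implementations of all three statistics; the formula for Hoeffding's D below is the classical one, and the U-shaped sample data are illustrative, not drawn from the paper:

```python
import math

def _ranks(v):
    # Ranks 1..n; assumes no ties.
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def spearman(x, y):
    # Spearman's rho is Pearson's r applied to the ranks.
    return pearson(_ranks(x), _ranks(y))

def hoeffding_d(x, y):
    # Hoeffding's D for tie-free samples, n >= 5; D > 0 signals dependence.
    n = len(x)
    R, S = _ranks(x), _ranks(y)
    Q = [1 + sum(1 for j in range(n) if x[j] < x[i] and y[j] < y[i])
         for i in range(n)]
    d1 = sum((q - 1) * (q - 2) for q in Q)
    d2 = sum((r - 1) * (r - 2) * (s - 1) * (s - 2) for r, s in zip(R, S))
    d3 = sum((r - 2) * (s - 2) * (q - 1) for r, s, q in zip(R, S, Q))
    return (30.0 * ((n - 2) * (n - 3) * d1 + d2 - 2 * (n - 2) * d3)
            / (n * (n - 1) * (n - 2) * (n - 3) * (n - 4)))

# U-shaped (non-monotonic) relationship: linear and rank correlations
# stay small, while Hoeffding's D still detects the dependence.
x = [float(i) for i in range(10)]
y = [(v - 4.4) ** 2 for v in x]
```

    On this sample, Pearson's r is about 0.08 and Spearman's rho about 0.15, while Hoeffding's D is about 0.11, clearly above zero, which is the behaviour the paper exploits.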

  12. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database.

    PubMed

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-25

    To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding 'poor' quality practices with at least 10% incident patients inaccurately coded between 2004 and 2014. Incidence rates and accuracy of coding varied widely between practices and the trends differed according to selected category of code. If diagnosis codes were used, the incidence of type 2 increased sharply until 2004 (when the UK Quality Outcomes Framework was introduced), and then flattened off, until 2009, after which they decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled 'poor' quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 after 2009 became less marked and incidence rates were higher. In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. ArraySolver: an algorithm for colour-coded graphical display and Wilcoxon signed-rank statistics for comparing microarray gene expression data.

    PubMed

    Khan, Haseeb Ahmad

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann-Whitney U test, have been reported for comparing microarray data, whereas the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas ArraySolver appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform.
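
    ArraySolver's core comparison is the Wilcoxon signed-rank test on paired expression values. A minimal sketch of the statistic with a large-sample normal approximation (illustrative numbers; the exact small-n tables a tool like ArraySolver would need for n ≤ 25 are not reproduced here):

```python
import math

def wilcoxon_signed_rank(before, after):
    """Wilcoxon signed-rank statistic for paired samples.

    Returns (w_plus, w_minus, z): the positive and negative rank sums and
    a large-sample normal approximation of the test statistic. Assumes no
    zero differences and no ties among the absolute differences.
    """
    d = [b - a for a, b in zip(before, after)]
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    w_plus = w_minus = 0.0
    for rank, i in enumerate(order, start=1):
        if d[i] > 0:
            w_plus += rank
        else:
            w_minus += rank
    n = len(d)
    mean = n * (n + 1) / 4.0
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    return w_plus, w_minus, (w_plus - mean) / sd

# Hypothetical paired expression values for six genes in two conditions.
control = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
treated = [2.1, 4.2, 6.3, 8.4, 10.5, 12.6]
w_plus, w_minus, z = wilcoxon_signed_rank(control, treated)
```

    With every difference positive, the negative rank sum is zero and the approximate z-score exceeds 1.96, a nominally significant shift.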

  15. Mutational analysis of the multicopy hao gene coding for hydroxylamine oxidoreductase in Nitrosomonas sp. strain ENI-11.

    PubMed

    Yamagata, A; Hirota, R; Kato, J; Kuroda, A; Ikeda, T; Takiguchi, N; Ohtake, H

    2000-08-01

    The ammonia-oxidizing bacterium Nitrosomonas sp. strain ENI-11 contains three copies of the hao gene (hao1, hao2, and hao3) coding for hydroxylamine oxidoreductase (HAO). Three single mutants (hao1::kan, hao2::kan, or hao3::kan) had 68 to 75% of the wild-type growth rate and 58 to 89% of the wild-type HAO activity when grown under the same conditions. A double mutant (hao1::kan and hao3::amp) also had 68% of the wild-type growth and 37% of the wild-type HAO activity.

  16. Cheiloscopy and dactyloscopy: Do they dictate personality patterns?

    PubMed Central

    Abidullah, Mohammed; Kumar, M. Naveen; Bhorgonde, Kavita D.; Reddy, D. Shyam Prasad

    2015-01-01

    Context: Cheiloscopy and dactyloscopy are both well-established forensic tools used for individual identification in any scenario, be it a crime scene or a civil case. Like finger prints, lip prints are unique and distinguishable for every individual. But their relationship to personality types has not been established, except for the hypothesis that finger prints could explain personality patterns. Aims: The study was aimed to record and correlate the lip and finger prints with the character/personality of a person. Settings and Design: The lip and finger prints and character of a person were recorded, and the data obtained were subjected to statistical analysis, in particular Pearson's Chi-square test; the correlation/association between the groups was also studied. Materials and Methods: The study sample comprised 200 subjects, 100 males and 100 females, aged between 18 and 30 years. For recording lip prints, brown/pink-colored lipstick was applied on the lips and the subjects were asked to spread it uniformly over the lips. Lip prints were traced in the normal rest position on a plain white bond paper. For recording the finger prints, imprints of the fingers were taken on a plain white bond paper using an ink pad. The collected prints were visualized using a magnifying lens. To record the character of the person, a pro forma manual for the multivariable personality inventory by Dr. BC Muthayya was used. Statistical Analysis Used: The data obtained were subjected to statistical analysis, in particular Pearson's Chi-square test; the correlation/association between the groups was also studied. Results: In males, the predominant lip pattern recorded was Type I with a whorls-type finger pattern and the character being ego ideal, pessimism, introvert, and dogmatic; whereas in females, the predominant lip pattern recorded was Type II with a loops-type finger pattern and the character being neurotic, need achievers, and dominant. 
Conclusion: Many studies on lip pattern, finger pattern, palatal rugae, etc., for individual identification and gender determination exist, but correlative studies are scanty. This is the first study done on correlating patterns, that is, lip and finger pattern with the character of a person. With this study we conclude that this correlation can be used as an adjunct in the investigatory process in forensic sciences. PMID:26005299

  17. Energy information data base: report number codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with the codes each has used. (RWR)
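
    Since each report number is a code followed by a sequential number, a catalog entry can be split mechanically. A sketch assuming a hyphen-separated format (the pattern and the sample number are illustrative, not taken from the listing):

```python
import re

def split_report_number(report_number):
    """Split a report number of the assumed form CODE-NUMBER into the
    installation/program code and the sequential part; None if it does
    not match."""
    m = re.fullmatch(r"([A-Z][A-Z0-9/]*)-(\d+[A-Za-z0-9.-]*)", report_number)
    return (m.group(1), m.group(2)) if m else None

# "ANL-7023" is an illustrative Argonne-style report number.
parts = split_report_number("ANL-7023")
```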

  18. Effect of degree correlations above the first shell on the percolation transition

    NASA Astrophysics Data System (ADS)

    Valdez, L. D.; Buono, C.; Braunstein, L. A.; Macri, P. A.

    2011-11-01

    The use of degree-degree correlations to model realistic networks, which are characterized by their Pearson's coefficient, has become widespread. However, the effect of how different correlation algorithms produce different results for processes on top of them has not yet been discussed. In this letter, using different correlation algorithms to generate assortative networks, we show that for very assortative networks the behavior of the main observables in percolation processes depends on the algorithm used to build the network. The different algorithms used here introduce different inner structures that are missed by Pearson's coefficient. We explain the different behaviors through a generalization of Pearson's coefficient that allows one to study the correlations at chemical distance l from a root node. We apply our findings to real networks.
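
    A network's Pearson (assortativity) coefficient is Pearson's r computed over the degrees at the two ends of every edge, with each edge counted in both orientations. This sketch shows the standard single-shell coefficient (not the letter's generalized, distance-l version), on an illustrative graph:

```python
import math
from collections import Counter

def degree_assortativity(edges):
    """Pearson correlation of endpoint degrees over all edges of an
    undirected graph, counting each edge in both orientations."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    xs, ys = [], []
    for u, v in edges:
        xs.extend([deg[u], deg[v]])
        ys.extend([deg[v], deg[u]])
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / math.sqrt(sxx * syy)

# A star is maximally disassortative: the hub only touches leaves.
star = [(0, 1), (0, 2), (0, 3)]
```

    For the star the coefficient is exactly -1; networks built to the same coefficient can still differ in the deeper-shell structure the letter studies.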

  19. Expression profiles of long non-coding RNAs located in autoimmune disease-associated regions reveal immune cell-type specificity.

    PubMed

    Hrdlickova, Barbara; Kumar, Vinod; Kanduri, Kartiek; Zhernakova, Daria V; Tripathi, Subhash; Karjalainen, Juha; Lund, Riikka J; Li, Yang; Ullah, Ubaid; Modderman, Rutger; Abdulahad, Wayel; Lähdesmäki, Harri; Franke, Lude; Lahesmaa, Riitta; Wijmenga, Cisca; Withoff, Sebo

    2014-01-01

    Although genome-wide association studies (GWAS) have identified hundreds of variants associated with a risk for autoimmune and immune-related disorders (AID), our understanding of the disease mechanisms is still limited. In particular, more than 90% of the risk variants lie in non-coding regions, and almost 10% of these map to long non-coding RNA transcripts (lncRNAs). lncRNAs are known to show more cell-type specificity than protein-coding genes. We aimed to characterize lncRNAs and protein-coding genes located in loci associated with nine AIDs which have been well-defined by Immunochip analysis and by transcriptome analysis across seven populations of peripheral blood leukocytes (granulocytes, monocytes, natural killer (NK) cells, B cells, memory T cells, naive CD4(+) and naive CD8(+) T cells) and four populations of cord blood-derived T-helper cells (precursor, primary, and polarized (Th1, Th2) T-helper cells). We show that lncRNAs mapping to loci shared between AID are significantly enriched in immune cell types compared to lncRNAs from the whole genome (α <0.005). We were not able to prioritize single cell types relevant for specific diseases, but we observed five different cell types enriched (α <0.005) in five AID (NK cells for inflammatory bowel disease, juvenile idiopathic arthritis, primary biliary cirrhosis, and psoriasis; memory T and CD8(+) T cells in juvenile idiopathic arthritis, primary biliary cirrhosis, psoriasis, and rheumatoid arthritis; Th0 and Th2 cells for inflammatory bowel disease, juvenile idiopathic arthritis, primary biliary cirrhosis, psoriasis, and rheumatoid arthritis). Furthermore, we show that co-expression analyses of lncRNAs and protein-coding genes can predict the signaling pathways in which these AID-associated lncRNAs are involved. 
The observed enrichment of lncRNA transcripts in AID loci implies lncRNAs play an important role in AID etiology and suggests that lncRNA genes should be studied in more detail to interpret GWAS findings correctly. The co-expression results strongly support a model in which the lncRNA and protein-coding genes function together in the same pathways.

  20. MIFT: GIFT Combinatorial Geometry Input to VCS Code

    DTIC Science & Technology

    1977-03-01

    BRL Report No. 1967. MIFT: GIFT Combinatorial Geometry Input to VCS Code. Type of report: Final. A module of the Vehicle Code System (VCS) called MORSE was modified to accept the GIFT combinatorial geometry package. GIFT, as opposed to the geometry package

  1. Memorization of Sequences of Movements of the Right or the Left Hand by Right- and Left-Handers: Vector Coding.

    PubMed

    Bobrova, E V; Bogacheva, I N; Lyakhovetskii, V A; Fabinskaja, A A; Fomina, E V

    2017-01-01

    In order to test the hypothesis of hemisphere specialization for different types of information coding (the right hemisphere, for positional coding; the left one, for vector coding), we analyzed the errors of right- and left-handers during a task involving the memorization of sequences of movements by the left or the right hand, which activates vector coding by changing the order of movements in memorized sequences. The task was first performed by the right or the left hand, then by the opposite hand. It was found that both right- and left-handers use the information about the previous movements of the dominant hand, but not of the non-dominant one. After changing the hand, right-handers use the information about previous movements of the second hand, while left-handers do not. We compared our results with the data of previous experiments, in which positional coding was activated, and concluded that both right- and left-handers use vector coding for memorizing the sequences of their dominant hand and positional coding for memorizing the sequences of the non-dominant hand. No similar patterns of errors were found between right- and left-handers after changing the hand, which suggests that in right- and left-handers the skills are transferred in different ways depending on the type of coding.

  2. Analysis of the Magnitude and Frequency of Peak Discharges for the Navajo Nation in Arizona, Utah, Colorado, and New Mexico

    USGS Publications Warehouse

    Waltemeyer, Scott D.

    2006-01-01

    Estimates of the magnitude and frequency of peak discharges are necessary for the reliable flood-hazard mapping in the Navajo Nation in Arizona, Utah, Colorado, and New Mexico. The Bureau of Indian Affairs, U.S. Army Corps of Engineers, and Navajo Nation requested that the U.S. Geological Survey update estimates of peak discharge magnitude for gaging stations in the region and update regional equations for estimation of peak discharge and frequency at ungaged sites. Equations were developed for estimating the magnitude of peak discharges for recurrence intervals of 2, 5, 10, 25, 50, 100, and 500 years at ungaged sites using data collected through 1999 at 146 gaging stations, an additional 13 years of peak-discharge data since a 1997 investigation, which used gaging-station data through 1986. The equations for estimation of peak discharges at ungaged sites were developed for flood regions 8, 11, high elevation, and 6 and are delineated on the basis of the hydrologic codes from the 1997 investigation. Peak discharges for selected recurrence intervals were determined at gaging stations by fitting observed data to a log-Pearson Type III distribution with adjustments for a low-discharge threshold and a zero skew coefficient. A low-discharge threshold was applied to frequency analysis of 82 of the 146 gaging stations. This application provides an improved fit of the log-Pearson Type III frequency distribution. Use of the low-discharge threshold generally eliminated the peak discharge having a recurrence interval of less than 1.4 years in the probability-density function. Within each region, logarithms of the peak discharges for selected recurrence intervals were related to logarithms of basin and climatic characteristics using stepwise ordinary least-squares regression techniques for exploratory data analysis. 
    Generalized least-squares regression techniques, an improved regression procedure that accounts for time and spatial sampling errors, were then applied to the same data used in the ordinary least-squares regression analyses. The average standard error of prediction for the 100-year peak discharge in region 8 was 53 percent. The average standard error of prediction, which includes average sampling error and average standard error of regression, ranged from 45 to 83 percent for the 100-year flood. The estimated standard error of prediction for a hybrid method for region 11 was large in the 1997 investigation. No distinction of floods produced from a high-elevation region was presented in the 1997 investigation. Overall, the equations based on generalized least-squares regression techniques are considered to be more reliable than those in the 1997 report because of the increased length of record and the improved GIS method. Flood-frequency relations can also be transferred to ungaged sites: peak discharge can be estimated at an ungaged site by direct application of the regional regression equation, or at an ungaged site on a stream that has a gaging station upstream or downstream by using the drainage-area ratio and the drainage-area exponent from the regional regression equation of the respective region.
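
    The log-Pearson Type III fit described above amounts to taking logarithms of the annual peaks, computing their mean, standard deviation, and skew, and applying a frequency factor for the chosen recurrence interval. A sketch using the Wilson-Hilferty approximation for the frequency factor (the peak series is invented; the report's low-discharge threshold and skew adjustments are omitted):

```python
import math
from statistics import NormalDist

def lp3_quantile(peaks, return_period):
    """Peak discharge with the given recurrence interval (years) under a
    log-Pearson Type III fit of an annual peak-discharge series."""
    logs = [math.log10(q) for q in peaks]
    n = len(logs)
    m = sum(logs) / n
    s = math.sqrt(sum((v - m) ** 2 for v in logs) / (n - 1))
    g = (n / ((n - 1) * (n - 2))) * sum(((v - m) / s) ** 3 for v in logs)
    z = NormalDist().inv_cdf(1.0 - 1.0 / return_period)
    if abs(g) < 1e-9:   # zero skew: reduces to the lognormal quantile
        k = z
    else:               # Wilson-Hilferty frequency factor for skew g
        k = (2.0 / g) * ((1.0 + g * z / 6.0 - g * g / 36.0) ** 3 - 1.0)
    return 10.0 ** (m + k * s)
```

    With a symmetric log series the 2-year (median) estimate falls at the geometric mean of the peaks, and higher recurrence intervals move out along the fitted tail.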

  3. BISON and MARMOT Development for Modeling Fast Reactor Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamble, Kyle Allan Lawrence; Williamson, Richard L.; Schwen, Daniel

    2015-09-01

    BISON and MARMOT are two codes under development at the Idaho National Laboratory for engineering scale and lower length scale fuel performance modeling. It is desired to add capabilities for fast reactor applications to these codes. The fast reactor fuel types under consideration are metal (U-Pu-Zr) and oxide (MOX). The cladding types of interest include 316SS, D9, and HT9. The purpose of this report is to outline the proposed plans for code development and provide an overview of the models added to the BISON and MARMOT codes for fast reactor fuel behavior. A brief overview of preliminary discussions on the formation of a bilateral agreement between the Idaho National Laboratory and the National Nuclear Laboratory in the United Kingdom is presented.

  4. New features in the design code Tlie

    NASA Astrophysics Data System (ADS)

    van Zeijts, Johannes

    1993-12-01

    We present features recently installed in the arbitrary-order accelerator design code Tlie. The code uses the MAD input language, and implements programmable extensions modeled after the C language that make it a powerful tool in a wide range of applications: from basic beamline design to high precision-high order design and even control room applications. The basic quantities important in accelerator design are easily accessible from inside the control language. Entities like parameters in elements (strength, current), transfer maps (either in Taylor series or in Lie algebraic form), lines, and beams (either as sets of particles or as distributions) are among the type of variables available. These variables can be set, used as arguments in subroutines, or just typed out. The code is easily extensible with new datatypes.

  5. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to (1) show a plan for using uplink coding and describe its benefits, (2) define possible solutions and their applicability to different types of uplink, including emergency uplink, (3) concur on conclusions so a plan to use the proposed uplink system can begin, (4) identify the need for the development of appropriate technology and its infusion in the DSN, and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  6. Tracking Holland Interest Codes: The Case of South African Field Guides

    ERIC Educational Resources Information Center

    Watson, Mark B.; Foxcroft, Cheryl D.; Allen, Lynda J.

    2007-01-01

    Holland believes that specific personality types seek out matching occupational environments and his theory codes personality and environment according to a six letter interest typology. Since 1985 there have been numerous American studies that have queried the validity of Holland's coding system. Research in South Africa is scarcer, despite…

  7. Drug Overdose Surveillance Using Hospital Discharge Data

    PubMed Central

    Bunn, Terry L.; Talbert, Jeffery

    2014-01-01

    Objectives We compared three methods for identifying drug overdose cases in inpatient hospital discharge data on their ability to classify drug overdoses by intent and drug type(s) involved. Methods We compared three International Classification of Diseases, Ninth Revision, Clinical Modification code-based case definitions using Kentucky hospital discharge data for 2000–2011. The first definition (Definition 1) was based on the external-cause-of-injury (E-code) matrix. The other two definitions were based on the Injury Surveillance Workgroup on Poisoning (ISW7) consensus recommendations for national and state poisoning surveillance using the principal diagnosis or first E-code (Definition 2) or any diagnosis/E-code (Definition 3). Results Definition 3 identified almost 50% more drug overdose cases than did Definition 1. The increase was largely due to cases with a first-listed E-code describing a drug overdose but a principal diagnosis that was different from drug overdose (e.g., mental disorders, or respiratory or circulatory system failure). Regardless of the definition, more than 53% of the hospitalizations were self-inflicted drug overdoses; benzodiazepines were involved in about 30% of the hospitalizations. The 2011 age-adjusted drug overdose hospitalization rate in Kentucky was 146/100,000 population using Definition 3 and 107/100,000 population using Definition 1. Conclusion The ISW7 drug overdose definition using any drug poisoning diagnosis/E-code (Definition 3) is potentially the highest sensitivity definition for counting drug overdose hospitalizations, including by intent and drug type(s) involved. As the states enact policies and plan for adequate treatment resources, standardized drug overdose definitions are critical for accurate reporting, trend analysis, policy evaluation, and state-to-state comparison. PMID:25177055

  8. Comprehensive model for predicting perceptual image quality of smart mobile devices.

    PubMed

    Gong, Rui; Xu, Haisong; Luo, M R; Li, Haifeng

    2015-01-01

    An image quality model for smart mobile devices was proposed based on visual assessments of several image quality attributes. A series of psychophysical experiments was carried out on two kinds of smart mobile devices, i.e., smart phones and tablet computers, in which naturalness, colorfulness, brightness, contrast, sharpness, clearness, and overall image quality were visually evaluated under three lighting environments via the categorical judgment method for various application types of test images. On the basis of Pearson correlation coefficients and factor analysis, the overall image quality could first be predicted from its two constituent attributes with multiple linear regression functions for the different types of images, and mathematical expressions were then built to link the constituent image quality attributes with the physical parameters of smart mobile devices and image appearance factors. The procedure and algorithms are applicable to various smart mobile devices, different lighting conditions, and multiple types of images, and their performance was verified against the visual data.
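
    The two-attribute prediction of overall quality is an ordinary multiple linear regression. A self-contained sketch that solves the normal equations for an intercept plus two attribute weights (the attribute names and ratings are invented for illustration):

```python
def fit_two_predictor_ols(a, b, y):
    """Least-squares fit of y = c0 + c1*a + c2*b via the normal
    equations, solved by Gaussian elimination with partial pivoting."""
    n = len(y)
    cols = [[1.0] * n, list(a), list(b)]          # design matrix [1, a, b]
    A = [[sum(ci * cj for ci, cj in zip(cols[i], cols[j])) for j in range(3)]
         for i in range(3)]                        # X^T X
    rhs = [sum(c * yy for c, yy in zip(cols[i], y)) for i in range(3)]  # X^T y
    for col in range(3):                           # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            A[r] = [arv - f * acv for arv, acv in zip(A[r], A[col])]
            rhs[r] -= f * rhs[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                            # back substitution
        coef[r] = (rhs[r] - sum(A[r][c] * coef[c]
                                for c in range(r + 1, 3))) / A[r][r]
    return coef

# Invented ratings: overall quality is exactly 1 + 2*sharpness + 3*contrast,
# so the fit should recover those coefficients.
sharpness = [0.0, 1.0, 2.0, 0.0, 1.0]
contrast = [0.0, 0.0, 1.0, 2.0, 2.0]
overall = [1.0, 3.0, 8.0, 7.0, 9.0]
coef = fit_two_predictor_ols(sharpness, contrast, overall)
```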

  9. Spiritual Development and Death Attitude in Female Patients With Type II Diabetes

    PubMed Central

    Nozari, Masoumeh; Khalilian, Alireza; Dousti, Yarali

    2014-01-01

    Objective: The present study aimed to investigate the differences in spiritual development dimensions and death attitude profiles, and to determine the association between them, in patients suffering from type II diabetes. Methods: In a cross-sectional study, 100 female outpatients suffering from type II diabetes were recruited at Imam Khomeini Hospital, Sari, Iran. Data were collected through two questionnaires: the Spiritual Assessment Inventory (SAI) and the Death Attitude Profile-Revised (DAPR). Analysis of the data involved analysis of covariance (ANCOVA) with Fisher's Least Significant Difference (LSD) as the post-hoc test, plus the Pearson correlation. Results: There was a statistically significant difference in spiritual development dimensions and death attitude profiles. The results showed that spiritual development was significantly associated with some items of the death attitude profiles. Conclusion: Awareness of God was adequate in the diabetic patients, but the quality of the relationship with God indicated spiritual immaturity. It is necessary to provide instruction to improve patients' death attitudes and subsequent health behavior. PMID:25780376

  10. Reusability of coded data in the primary care electronic medical record: A dynamic cohort study concerning cancer diagnoses.

    PubMed

    Sollie, Annet; Sijmons, Rolf H; Helsper, Charles; Numans, Mattijs E

    2017-03-01

    To assess the quality and reusability of coded cancer diagnoses in routine primary care data, and to identify factors that influence data quality and areas for improvement. A dynamic cohort study in a Dutch network database containing 250,000 anonymized electronic medical records (EMRs) from 52 general practices was performed. Coded data from 2000 to 2011 for the three most common cancer types (breast, colon and prostate cancer) were compared to the Netherlands Cancer Registry. Data quality is expressed in Standardized Incidence Ratios (SIRs): the ratio between the number of coded cases observed in the primary care network database and the expected number of cases based on the Netherlands Cancer Registry. Ratios were multiplied by 100% for readability. The overall SIR was 91.5% (95% CI 88.5-94.5) and showed improvement over the years. SIRs differ between cancer types: from 71.5% for colon cancer in males to 103.9% for breast cancer. There are differences in data quality (SIRs 76.2%-99.7%) depending on the EMR system used, with SIRs up to 232.9% for breast cancer. Frequently observed errors in routine healthcare data can be classified as: lack of integrity checks, inaccurate use and/or lack of codes, and lack of EMR system functionality. Re-users of coded routine primary care EMR data should be aware that 30% of cancer cases can be missed, and that up to 130% of the expected number of cases can appear in the EMR data, i.e., false positives occur. The type of EMR system and the type of cancer influence the quality of coded diagnosis registration. While data quality can be improved (e.g., through better system design and by training EMR system users), re-use should be undertaken only by appropriately trained experts. Copyright © 2016. Published by Elsevier B.V.
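
    The SIR used above is simply the observed count divided by the expected count, scaled to a percentage. A minimal sketch, assuming Byar's approximation for the confidence limits (the abstract does not state which CI method was used):

```python
import math

def sir_percent(observed, expected, z=1.96):
    """Standardized incidence ratio (x100%) with Byar's approximate 95% CI.
    observed: coded cases found in the primary-care network database.
    expected: cases expected from cancer-registry rates.
    The CI method is an illustrative assumption, not the paper's."""
    sir = 100.0 * observed / expected
    lo = 100.0 * observed * (1 - 1 / (9 * observed)
                             - z / (3 * math.sqrt(observed))) ** 3 / expected
    o1 = observed + 1
    hi = 100.0 * o1 * (1 - 1 / (9 * o1)
                       + z / (3 * math.sqrt(o1))) ** 3 / expected
    return sir, lo, hi
```

    With 915 observed against 1000 expected cases this returns an SIR of 91.5% with limits close to the interval quoted in the abstract.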

  11. Using a Nonparametric Bootstrap to Obtain a Confidence Interval for Pearson's "r" with Cluster Randomized Data: A Case Study

    ERIC Educational Resources Information Center

    Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio

    2009-01-01

    A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
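
    The cluster-aware bootstrap described above can be sketched as follows: whole clusters (e.g. classrooms) are resampled rather than individual students, respecting the cluster-randomized design. The function names and the percentile scheme are illustrative assumptions, not the paper's exact procedure.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Plain Pearson product-moment correlation."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def cluster_bootstrap_ci(clusters, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for Pearson's r, resampling whole clusters.
    clusters: list of clusters, each a list of (x, y) pairs."""
    rng = random.Random(seed)
    rs = []
    for _ in range(n_boot):
        resampled = [rng.choice(clusters) for _ in clusters]
        pairs = [p for cluster in resampled for p in cluster]
        xs, ys = zip(*pairs)
        rs.append(pearson_r(xs, ys))
    rs.sort()
    return rs[int(alpha / 2 * n_boot)], rs[int((1 - alpha / 2) * n_boot) - 1]
```

    If zero lies outside the resulting interval, the null hypothesis of no association is rejected at the chosen level.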

  12. Test Review: Kaufman, A. S., & Kaufman, N. L. (2014), "Kaufman Test of Educational Achievement, Third Edition." Bloomington, MN: NCS Pearson

    ERIC Educational Resources Information Center

    Frame, Laura B.; Vidrine, Stephanie M.; Hinojosa, Ryan

    2016-01-01

    The Kaufman Test of Educational Achievement, Third Edition (KTEA-3) is a revised and updated comprehensive academic achievement test (Kaufman & Kaufman, 2014). Authored by Drs. Alan and Nadeen Kaufman and published by Pearson, the KTEA-3 remains an individual achievement test normed for individuals of ages 4 through 25 years, or for those in…

  13. A Psychometric Measurement Model for Adult English Language Learners: Pearson Test of English Academic

    ERIC Educational Resources Information Center

    Pae, Hye K.

    2012-01-01

    The aim of this study was to apply Rasch modeling to an examination of the psychometric properties of the "Pearson Test of English Academic" (PTE Academic). Analyzed were 140 test-takers' scores derived from the PTE Academic database. The mean age of the participants was 26.45 (SD = 5.82), ranging from 17 to 46. Conformity of the participants'…

  14. Persistent collective trend in stock markets

    NASA Astrophysics Data System (ADS)

    Balogh, Emeric; Simonsen, Ingve; Nagy, Bálint Zs.; Néda, Zoltán

    2010-12-01

    Empirical evidence is given for a significant difference in the collective trend of share prices during stock index rising and falling periods. Data on the Dow Jones Industrial Average and its stock components are studied between 1991 and 2008. Pearson-type correlations are computed between the stocks and averaged over stock pairs and time. The results indicate a general trend: whenever the stock index is falling, the stock prices change in a more correlated manner than in periods when the stock index is ascending. A thorough statistical analysis of the data shows that the observed difference is significant, suggesting a constant fear factor among stockholders.

  15. Circulating adipokines data associated with insulin secretagogue use in breast cancer patients.

    PubMed

    Wintrob, Zachary A P; Hammel, Jeffrey P; Nimako, George K; Fayazi, Zahra S; Gaile, Dan P; Forrest, Alan; Ceacareanu, Alice C

    2017-02-01

    Oral drugs stimulating endogenous insulin production (insulin secretagogues) may have detrimental effects on breast cancer outcomes. The data presented shows the relationship between pre-existing insulin secretagogues use, adipokine profiles at the time of breast cancer (BC) diagnosis and subsequent cancer outcomes in women diagnosed with BC and type 2 diabetes mellitus (T2DM). The Pearson correlation analysis evaluating the relationship between adipokines stratified by T2DM pharmacotherapy and controls is also provided. This information is the extension of the data presented and discussed in " Insulin use, adipokine profiles and breast cancer prognosis " (Wintrob et al., in press) [1].

  16. Simple Form of MMSE Estimator for Super-Gaussian Prior Densities

    NASA Astrophysics Data System (ADS)

    Kittisuwan, Pichid

    2015-04-01

    The denoising methods that have become popular in recent years for additive white Gaussian noise (AWGN) are Bayesian estimation techniques, e.g., maximum a posteriori (MAP) and minimum mean square error (MMSE). For super-Gaussian prior densities, it is well known that the MMSE estimator has a complicated form. In this work, we derive the MMSE estimator using a Taylor series expansion and show that the proposed estimator leads to a simple formula. An extension of this estimator to the Pearson type VII prior density is also offered. Experimental results show that the proposed estimator approximates the original MMSE nonlinearity reasonably well.

  17. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    DTIC Science & Technology

    2008-01-01

    complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user’s requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...AFRL-RI-RS-TR-2007-288 Final Technical Report January 2008 SUPERIMPOSED CODE THEORETIC ANALYSIS OF DNA CODES AND DNA COMPUTING

  18. Testing of Error-Correcting Sparse Permutation Channel Codes

    NASA Technical Reports Server (NTRS)

    Shcheglov, Kirill, V.; Orlov, Sergei S.

    2008-01-01

    A computer program performs Monte Carlo direct numerical simulations for testing sparse permutation channel codes, which offer strong error-correction capabilities at high code rates and are considered especially suitable for storage of digital data in holographic and volume memories. A word in a code of this type is characterized by, among other things, a sparseness parameter (M) and a fixed number (K) of 1 or "on" bits in a channel block length of N.
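
    A toy sketch of the codeword structure and Monte Carlo channel noise described above. The actual sparse permutation code construction and its decoder are more involved; the helper names and the simplified structure here are hypothetical.

```python
import random

def random_sparse_word(n, k, rng):
    """A length-n binary block with exactly k 'on' bits, mirroring the fixed
    number K of 1 bits per channel block of length N described above (the
    sparseness/permutation structure is simplified away here)."""
    word = [0] * n
    for i in rng.sample(range(n), k):
        word[i] = 1
    return word

def flip_noise(word, p, rng):
    """Monte Carlo channel model: flip each bit independently with prob p."""
    return [b ^ (rng.random() < p) for b in word]
```

    A direct numerical simulation in this spirit would transmit many such words through `flip_noise` and count how often the decoder recovers them.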

  19. Scientific and Technical Publishing at Goddard Space Flight Center in Fiscal Year 1994

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This publication is a compilation of scientific and technical material that was researched, written, prepared, and disseminated by the Center's scientists and engineers during FY94. It is presented in numerical order of the GSFC author's sponsoring technical directorate; i.e., Code 300 is the Office of Flight Assurance, Code 400 is the Flight Projects Directorate, Code 500 is the Mission Operations and Data Systems Directorate, Code 600 is the Space Sciences Directorate, Code 700 is the Engineering Directorate, Code 800 is the Suborbital Projects and Operations Directorate, and Code 900 is the Earth Sciences Directorate. The publication database contains publication or presentation title, author(s), document type, sponsor, and organizational code. This is the second annual compilation for the Center.

  20. PubMed

    Trinker, Horst

    2011-10-28

    We study the distribution of triples of codewords of codes and ordered codes. Schrijver [A. Schrijver, New code upper bounds from the Terwilliger algebra and semidefinite programming, IEEE Trans. Inform. Theory 51 (8) (2005) 2859-2866] used the triple distribution of a code to establish a bound on the number of codewords based on semidefinite programming. In the first part of this work, we generalize this approach for ordered codes. In the second part, we consider linear codes and linear ordered codes and present a MacWilliams-type identity for the triple distribution of their dual code. Based on the non-negativity of this linear transform, we establish a linear programming bound and conclude with a table of parameters for which this bound yields better results than the standard linear programming bound.

  1. Characterization and first flush analysis in road and roof runoff in Shenyang, China.

    PubMed

    Li, Chunlin; Liu, Miao; Hu, Yuanman; Gong, Jiping; Sun, Fengyun; Xu, Yanyan

    2014-01-01

    As urbanization increases, urban runoff is an increasingly important component of total urban non-point source pollution. In this study, the properties of urban runoff were examined in Shenyang, in northeastern China. Runoff samples from a tiled roof, a concrete roof and a main road were analyzed for key pollutants (total suspended solids (TSS), total nitrogen (TN), total phosphorus (TP), chemical oxygen demand (COD), Pb, Cd, Cr, Cu, Ni, and Zn). The event mean concentration, site mean concentration, M(V) curves (dimensionless cumulative curve of pollutant load with runoff volume), and mass first flush ratio (MFF30) were used to analyze the characteristics of pollutant discharge and the first flush (FF) effect. For all events, the pollutant concentration peaks occurred in the first half-hour after the runoff appeared and preceded the flow peaks. TN is the main pollutant in roof runoff. TSS, TN, TP, Pb, and Cr are the main pollutants in road runoff in Shenyang. There was a significant correlation between TSS and all other pollutants except TN in runoff, which illustrates that TSS is an important carrier of organic matter and heavy metals. TN had strong positive correlations with total rainfall (Pearson's r = 0.927), average rainfall (Pearson's r = 0.995), and maximum rainfall intensity (Pearson's r = 0.991). TP had a strong correlation with rainfall intensity (Pearson's r = 0.940). A significant positive correlation between COD and rainfall duration (Pearson's r = 0.902, significance level = 0.05) was found. The order of FF intensity on different surfaces was concrete roof > tile roof > road. Rainfall duration and the length of the antecedent dry period were positively correlated with the FF. TN tended to exhibit a strong flush for some events. Heavy metals showed a substantially stronger FF than the other pollutants.
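
    The mass first flush ratio used above (MFF30) compares the pollutant mass carried by the first 30% of runoff volume with that volume fraction. A minimal sketch, assuming per-interval volume and mass measurements over a single event:

```python
def mff(volumes, masses, fraction=0.3):
    """Mass first flush ratio: the share of pollutant mass delivered by the
    first `fraction` of runoff volume, divided by that fraction (MFF30 when
    fraction=0.3). volumes/masses: per-interval measurements over an event."""
    total_v, total_m = sum(volumes), sum(masses)
    cum_v = cum_m = 0.0
    for v, m in zip(volumes, masses):
        if (cum_v + v) / total_v >= fraction:
            # interpolate within this interval to stop exactly at `fraction`
            need = fraction * total_v - cum_v
            return (cum_m + m * need / v) / total_m / fraction
        cum_v += v
        cum_m += m
    return (cum_m / total_m) / fraction
```

    An MFF30 of 1 indicates no first flush; values above 1 indicate that early runoff carries a disproportionate share of the pollutant mass.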

  2. Fast-GPU-PCC: A GPU-Based Technique to Compute Pairwise Pearson's Correlation Coefficients for Time Series Data-fMRI Study.

    PubMed

    Eslami, Taban; Saeed, Fahad

    2018-04-20

    Functional magnetic resonance imaging (fMRI) is a non-invasive brain imaging technique, which has been regularly used for studying the brain’s functional activities in the past few years. A widely used measure for capturing functional associations in the brain is Pearson’s correlation coefficient, which is commonly employed for constructing functional networks and studying the dynamic functional connectivity of the brain. These are useful measures for understanding the effects of brain disorders on connectivity among brain regions. fMRI scanners produce a huge number of voxels, and using traditional central processing unit (CPU)-based techniques for computing pairwise correlations is very time consuming, especially when a large number of subjects is being studied. In this paper, we propose a graphics processing unit (GPU)-based algorithm called Fast-GPU-PCC for computing pairwise Pearson’s correlation coefficients. Based on the symmetry of Pearson’s correlation, this approach returns the N(N−1)/2 correlation coefficients located in the strictly upper triangular part of the correlation matrix. Storing the correlations in a one-dimensional array in the proposed order is useful for further processing. Our experiments on real and synthetic fMRI data, for different numbers of voxels and varying lengths of time series, show that the proposed approach outperforms state-of-the-art GPU-based techniques as well as sequential CPU-based versions. We show that Fast-GPU-PCC runs 62 times faster than the CPU-based version and about 2 to 3 times faster than two other state-of-the-art GPU-based methods.
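
    A CPU reference in NumPy for the output layout described above: the strictly upper triangle of the correlation matrix, flattened row by row. Fast-GPU-PCC itself is a GPU implementation, so this only sketches the semantics.

```python
import numpy as np

def pairwise_pcc_upper(ts):
    """Pairwise Pearson correlations of N time series (rows of `ts`),
    returned as the N(N-1)/2 strictly-upper-triangle entries, flattened
    row by row as in the one-dimensional layout described above."""
    z = ts - ts.mean(axis=1, keepdims=True)
    z /= np.linalg.norm(z, axis=1, keepdims=True)   # unit-norm, zero-mean rows
    corr = z @ z.T                                  # full correlation matrix
    return corr[np.triu_indices(ts.shape[0], k=1)]
```

    The flat index of the pair (i, j), i < j, follows row-major order, so the coefficients can be looked up without materializing the full N-by-N matrix.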

  3. Functional connectivity and structural covariance between regions of interest can be measured more accurately using multivariate distance correlation.

    PubMed

    Geerligs, Linda; Cam-Can; Henson, Richard N

    2016-07-15

    Studies of brain-wide functional connectivity or structural covariance typically use measures like the Pearson correlation coefficient, applied to data that have been averaged across voxels within regions of interest (ROIs). However, averaging across voxels may result in biased connectivity estimates when there is inhomogeneity within those ROIs, e.g., sub-regions that exhibit different patterns of functional connectivity or structural covariance. Here, we propose a new measure based on "distance correlation," a test of multivariate dependence of high-dimensional vectors, which allows for both linear and non-linear dependencies. We used simulations to show how distance correlation outperforms Pearson correlation in the face of inhomogeneous ROIs. To evaluate this new measure on real data, we use resting-state fMRI scans and T1 structural scans from 2 sessions on each of 214 participants from the Cambridge Centre for Ageing & Neuroscience (Cam-CAN) project. Pearson correlation and distance correlation showed similar average connectivity patterns, for both functional connectivity and structural covariance. Nevertheless, distance correlation was shown to be 1) more reliable across sessions, 2) more similar across participants, and 3) more robust to different sets of ROIs. Moreover, we found that the similarity between functional connectivity and structural covariance estimates was higher for distance correlation compared to Pearson correlation. We also explored the relative effects of different preprocessing options and motion artefacts on functional connectivity. Because distance correlation is easy to implement and fast to compute, it is a promising alternative to Pearson correlations for investigating ROI-based brain-wide connectivity patterns, for functional as well as structural data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Utilization of standardized patients to evaluate clinical and interpersonal skills of surgical residents.

    PubMed

    Hassett, James M; Zinnerstrom, Karen; Nawotniak, Ruth H; Schimpfhauser, Frank; Dayton, Merril T

    2006-10-01

    This project was designed to determine the growth of interpersonal skills during the first year of a surgical residency. All categorical surgical residents were given a clinical skills examination of abdominal pain using standardized patients during their orientation (T1). The categorical residents were retested after 11 months (T2). The assessment tool was based on a 12-item modified version of the 5-point Likert Interpersonal Scale (IP) used on the National Board of Medical Examiners prototype Clinical Skills Examination and a 24-item, done-or-not-done, history-taking checklist. Residents' self-evaluation scores were compared to standardized patients' assessment scores. Data were analyzed using the Pearson correlation coefficient, Wilcoxon signed rank test, Student t test, and Cronbach alpha. Thirty-eight categorical residents were evaluated at T1 and T2. At T1, in the history-taking exercise, the scores of the standardized patients and residents correlated (Pearson = .541, P = .000). In the interpersonal skills exercise, the scores of the standardized patients and residents did not correlate (Pearson = -0.238, P = .150). At T2, there was a significant improvement in the residents' self-evaluation scores in both the history-taking exercise (t = -3.280, P = .002) and the interpersonal skills exercise (t = 2.506, P = 0.017). In the history-taking exercise, the standardized patients' assessment scores correlated with the residents' self-evaluation scores (Pearson = 0.561, P = .000). In the interpersonal skills exercise, the standardized patients' assessment scores did not correlate with the residents' self-evaluation scores (Pearson = 0.078, P = .646). Surgical residents demonstrate a consistently low level of self-awareness regarding their interpersonal skills. Observed improvement in resident self-evaluation may be a function of growth in self-confidence.

  5. Variable Coded Modulation software simulation

    NASA Astrophysics Data System (ADS)

    Sielicki, Thomas A.; Hamkins, Jon; Thorsen, Denise

    This paper reports on the design and performance of a new Variable Coded Modulation (VCM) system. This VCM system comprises eight of NASA's recommended codes from the Consultative Committee for Space Data Systems (CCSDS) standards, including four turbo and four AR4JA/C2 low-density parity-check codes, together with six modulation types (BPSK, QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK). The signaling protocol for the transmission mode is based on a CCSDS recommendation. The coded modulation may be chosen dynamically, block to block, to optimize throughput.

  6. A high temperature fatigue life prediction computer code based on the total strain version of StrainRange Partitioning (SRP)

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Saltsman, James F.

    1993-01-01

    A recently developed high-temperature fatigue life prediction computer code is presented and an example of its usage given. The code discussed is based on the Total Strain version of Strainrange Partitioning (TS-SRP). Included in this code are procedures for characterizing the creep-fatigue durability behavior of an alloy according to TS-SRP guidelines and predicting cyclic life for complex cycle types for both isothermal and thermomechanical conditions. A reasonably extensive materials properties database is included with the code.

  7. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user-friendly interaction, context-sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first of these codes to be completed, which are presently being incorporated into the KBS, are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  8. Traceability and Quality Control in Traditional Chinese Medicine: From Chemical Fingerprint to Two-Dimensional Barcode.

    PubMed

    Cai, Yong; Li, Xiwen; Li, Mei; Chen, Xiaojia; Hu, Hao; Ni, Jingyun; Wang, Yitao

    2015-01-01

    Chemical fingerprinting is currently a widely used tool that enables rapid and accurate quality evaluation of Traditional Chinese Medicine (TCM). However, chemical fingerprints are not amenable to information storage, recognition, and retrieval, which limits their use in Chinese medicine traceability. In this study, samples of three kinds of Chinese medicines were randomly selected and chemical fingerprints were then constructed by using high performance liquid chromatography. Based on the chemical data, the process of converting a TCM chemical fingerprint into a two-dimensional code is presented; a preprocessing and filtering algorithm is also proposed, aimed at standardizing the large amount of original raw data. In order to determine which type of two-dimensional (2D) code is suitable for storing chemical fingerprint data, currently popular types of 2D codes are analyzed and compared. Results show that QR Code is suitable for recording the TCM chemical fingerprint. The fingerprint information of TCM can thus be converted into a data format that can be stored as a 2D code for traceability and quality control.
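
    The conversion step can be sketched as packing the fingerprint's peak list into a compact text payload small enough for a QR Code symbol. The field layout and rounding below are hypothetical, not the paper's format; the payload could then be rendered with any QR library (e.g. the third-party `qrcode` package).

```python
import base64
import json
import zlib

def fingerprint_payload(peaks):
    """Pack an HPLC fingerprint, given as (retention_time, peak_area) pairs,
    into a compact ASCII payload that fits in a QR Code symbol.
    Hypothetical format; the paper's preprocessing/filtering differ."""
    raw = json.dumps([[round(t, 2), round(a, 1)] for t, a in peaks]).encode()
    return base64.b64encode(zlib.compress(raw)).decode("ascii")

def decode_payload(payload):
    """Inverse of fingerprint_payload, for traceability look-ups."""
    return [tuple(p) for p in json.loads(zlib.decompress(base64.b64decode(payload)))]
```

    Round-tripping through the payload recovers the peak list exactly, which is the property a traceability system needs.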

  9. Effects of Conflicts of Interest on Practice Patterns and Complication Rates in Spine Surgery.

    PubMed

    Cook, Ralph W; Weiner, Joseph A; Schallmo, Michael S; Chun, Danielle S; Barth, Kathryn A; Singh, Sameer K; Hsu, Wellington K

    2017-09-01

    Retrospective cohort study. We sought to determine whether financial relationships with industry had any impact on the operative and/or complication rates of spine surgeons performing fusion surgeries. Recent actions from Congress and the Institute of Medicine have highlighted the importance of conflicts of interest among physicians. Orthopedic surgeons and neurosurgeons have been identified as receiving the highest amount of industry payments among all specialties. No study has yet investigated the potential effects of disclosed industry payments on the quality and choices of patient care. A comprehensive database of spine surgeons in the United States was created, compiling industry payments, operative fusion rates, and complication rates. Practice pattern data were derived from a publicly available Medicare-based database generated from selected CPT codes from 2011 to 2012. Complication rate data from 2009 to 2013 were extracted from the ProPublica Surgeon Scorecard database, which uses postoperative in-hospital mortality and 30-day readmission for designated conditions as complications of surgery. Data regarding industry payments from 2013 to 2014 were derived from the Open Payments website. Surgeons performing <10 fusions, those without complication data, and those whose identity could not be verified through public records were excluded. Pearson correlation coefficients and multivariate regression analyses were used to determine the relationship between industry payments, operative fusion rate, and/or complication rate. A total of 2110 surgeons met the inclusion criteria for our database. The average operative fusion rate was 8.8% (SD 4.8%), whereas the average complication rates for lumbar and cervical fusion were 4.1% and 1.9%, respectively. Pearson correlation analysis revealed a statistically significant but negligible relationship between disclosed payments/transactions and both operative fusion and complication rates. 
Our findings do not support a strong correlation between the payments a surgeon receives from industry and the decision to perform spine fusion or the associated complication rates. The large variability in the rate of fusions performed suggests a poor consensus on the indications for spine fusion surgery. Level of Evidence: 3.

  10. Apolipoprotein E4 influences growth and cognitive responses to micronutrient supplementation in shantytown children from northeast Brazil.

    PubMed

    Mitter, Sumeet S; Oriá, Reinaldo B; Kvalsund, Michelle P; Pamplona, Paula; Joventino, Emanuella Silva; Mota, Rosa M S; Gonçalves, Davi C; Patrick, Peter D; Guerrant, Richard L; Lima, Aldo A M

    2012-01-01

    Apolipoprotein E4 may benefit children during early periods of life when the body is challenged by infection and nutritional decline. We examined whether apolipoprotein E4 affects intestinal barrier function, improving short-term growth and long-term cognitive outcomes in Brazilian shantytown children. A total of 213 Brazilian shantytown children with below-median height-for-age z-scores (HAZ) received 200,000 IU of retinol (every four months), zinc (40 mg twice weekly), or both for one year, with half of each group receiving glutamine supplementation for 10 days. Height-for-age z-scores, weight-for-age z-scores, weight-for-height z-scores, and lactulose:mannitol ratios were assessed during the initial four months of treatment. An average of four years (range 1.4-6.6) later, the children underwent cognitive testing to evaluate non-verbal intelligence, coding, verbal fluency, verbal learning, and delayed verbal learning. Apolipoprotein E4 carriage was determined by PCR analysis for 144 children. Thirty-seven children were apolipoprotein E4(+), with an allele frequency of 13.9%. Significant associations were found for vitamin A and glutamine with intestinal barrier function. Apolipoprotein E4(+) children receiving glutamine presented significant positive Pearson correlations between the change in height-for-age z-scores over four months and delayed verbal learning, along with correlated changes over the same period in weight-for-age z-scores and weight-for-height z-scores associated with non-verbal intelligence quotients. There was a significant correlation between vitamin A supplementation of apolipoprotein E4(+) children and improved delta lactulose/mannitol. Apolipoprotein E4(-) children, regardless of intervention, exhibited negative Pearson correlations between the change in lactulose-to-mannitol ratio over four months and verbal learning and non-verbal intelligence. 
During development, apolipoprotein E4 may function concomitantly with gut-tropic nutrients to benefit immediate nutritional status, which can translate into better long-term cognitive outcomes.

  11. A Dancing Black Hole

    NASA Astrophysics Data System (ADS)

    Shoemaker, Deirdre; Smith, Kenneth; Schnetter, Erik; Fiske, David; Laguna, Pablo; Pullin, Jorge

    2002-04-01

    Recently, stationary black holes have been successfully simulated for up to times of approximately 600-1000M, where M is the mass of the black hole. Considering that the expected burst of gravitational radiation from a binary black hole merger would last approximately 200-500M, black hole codes are approaching the point where simulations of mergers may be feasible. We will present two types of simulations of single black holes obtained with a code based on the Baumgarte-Shapiro-Shibata-Nakamura formulation of the Einstein evolution equations. One type of simulations addresses the stability properties of stationary black hole evolutions. The second type of simulations demonstrates the ability of our code to move a black hole through the computational domain. This is accomplished by shifting the stationary black hole solution to a coordinate system in which the location of the black hole is time dependent.

  12. Colocalization analysis in fluorescence micrographs: verification of a more accurate calculation of Pearson's correlation coefficient.

    PubMed

    Barlow, Andrew L; Macleod, Alasdair; Noppen, Samuel; Sanderson, Jeremy; Guérin, Christopher J

    2010-12-01

    One of the most routine uses of fluorescence microscopy is colocalization, i.e., the demonstration of a relationship between pairs of biological molecules. Frequently this is presented simplistically by the use of overlays of red and green images, with areas of yellow indicating colocalization of the molecules. Colocalization data are rarely quantified and can be misleading. Our results from both synthetic and biological datasets demonstrate that the generation of Pearson's correlation coefficient between pairs of images can overestimate positive correlation and fail to demonstrate negative correlation. We have demonstrated that the calculation of a thresholded Pearson's correlation coefficient, using only intensity values over a determined threshold in both channels, produces numerical values that more accurately describe both synthetic datasets and biological examples. Its use will bring clarity and accuracy to colocalization studies using fluorescence microscopy.
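
    The thresholded coefficient described above restricts Pearson's r to pixels above threshold in both channels; a minimal NumPy sketch, assuming the thresholds are already chosen (threshold selection itself, e.g. by an automated method, is out of scope here):

```python
import numpy as np

def thresholded_pearson(red, green, t_red, t_green):
    """Pearson's r over only those pixels whose intensity exceeds the
    threshold in BOTH channels, per the thresholded-coefficient idea above.
    red/green: 2-D intensity arrays; t_red/t_green: channel thresholds."""
    mask = (red > t_red) & (green > t_green)
    return np.corrcoef(red[mask], green[mask])[0, 1]
```

    Excluding the sub-threshold (background) pixels removes the large population of jointly dark pixels that otherwise inflates the coefficient.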

  13. Global-in-time solutions for the isothermal Matovich-Pearson equations

    NASA Astrophysics Data System (ADS)

    Feireisl, Eduard; Laurençot, Philippe; Mikelić, Andro

    2011-01-01

    In this paper we study the Matovich-Pearson equations describing the process of glass fibre drawing. These equations may be viewed as a 1D-reduction of the incompressible Navier-Stokes equations including free boundary, valid for the drawing of a long and thin glass fibre. We concentrate on the isothermal case without surface tension. Then the Matovich-Pearson equations represent a nonlinearly coupled system of an elliptic equation for the axial velocity and a hyperbolic transport equation for the fluid cross-sectional area. We first prove existence of a local solution, and, after constructing appropriate barrier functions, we deduce that the fluid radius is always strictly positive and that the local solution remains in the same regularity class. This estimate leads to the global existence and uniqueness result for this important system of equations.

  14. Chimeric NP Non Coding Regions between Type A and C Influenza Viruses Reveal Their Role in Translation Regulation

    PubMed Central

    Crescenzo-Chaigne, Bernadette; Barbezange, Cyril; Frigard, Vianney; Poulain, Damien; van der Werf, Sylvie

    2014-01-01

    Exchange of the non coding regions of the NP segment between type A and C influenza viruses was used to demonstrate the importance not only of the proximal panhandle, but also of the initial distal panhandle strength in type specificity. Both elements were found to be compulsory to rescue infectious virus by reverse genetics systems. Interestingly, in type A influenza virus infectious context, the length of the NP segment 5′ NC region once transcribed into mRNA was found to impact its translation, and the level of produced NP protein consequently affected the level of viral genome replication. PMID:25268971

  15. Simulation of a small cold-leg-break experiment at the PMK-2 test facility using the RELAP5 and ATHLET codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezsoel, G.; Guba, A.; Perneczky, L.

    Results of a small-break loss-of-coolant accident experiment, conducted on the PMK-2 integral-type test facility, are presented. The experiment simulated a 1% break in the cold leg of a VVER-440-type reactor. The main phenomena of the experiment are discussed, and in the case of selected events, a more detailed interpretation with the help of measured void fraction, obtained by a special measurement device, is given. Two thermohydraulic computer codes, RELAP5 and ATHLET, are used for posttest calculations. The aim of these calculations is to investigate the code capability for modeling natural circulation phenomena in VVER-440-type reactors. Therefore, the results of the experiment and both calculations are compared. Both codes predict most of the transient events well, with the exception that RELAP5 fails to predict the dryout period in the core. In the experiment, the hot- and cold-leg loop-seal clearing is accompanied by natural circulation instabilities, which can be explained by means of the ATHLET calculation.

  16. Supernova Light Curves and Spectra from Two Different Codes: Supernu and Phoenix

    NASA Astrophysics Data System (ADS)

    Van Rossum, Daniel R; Wollaeger, Ryan T

    2014-08-01

    The observed similarities between light curve shapes from Type Ia supernovae, and in particular the correlation of light curve shape and brightness, have been actively studied for more than two decades. In recent years, hydrodynamic simulations of white dwarf explosions have advanced greatly, and multiple mechanisms that could potentially produce Type Ia supernovae have been explored in detail. The question of which of the proposed mechanisms is (or are) realized in nature remains challenging to answer, but detailed synthetic light curves and spectra from explosion simulations are very helpful and important guidelines towards answering it. We present results from a newly developed radiation transport code, Supernu. Supernu solves the supernova radiative transfer problem using a novel technique based on a hybrid between Implicit Monte Carlo and Discrete Diffusion Monte Carlo. This technique enhances efficiency with respect to traditional Implicit Monte Carlo codes and thus lends itself well to multi-dimensional simulations. We show direct comparisons of light curves and spectra from Type Ia simulations with Supernu versus the legacy Phoenix code.

  17. Identification of significantly mutated regions across cancer types highlights a rich landscape of functional molecular alterations

    PubMed Central

    Araya, Carlos L.; Cenik, Can; Reuter, Jason A.; Kiss, Gert; Pande, Vijay S.; Snyder, Michael P.; Greenleaf, William J.

    2015-01-01

    Cancer sequencing studies have primarily identified cancer-driver genes by the accumulation of protein-altering mutations. An improved method would be annotation-independent, sensitive to unknown distributions of functions within proteins, and inclusive of non-coding drivers. We employed density-based clustering methods in 21 tumor types to detect variably-sized significantly mutated regions (SMRs). SMRs reveal recurrent alterations across a spectrum of coding and non-coding elements, including transcription factor binding sites and untranslated regions mutated in up to ∼15% of specific tumor types. SMRs reveal spatial clustering of mutations at molecular domains and interfaces, often with associated changes in signaling. Mutation frequencies in SMRs demonstrate that distinct protein regions are differentially mutated among tumor types, as exemplified by a linker region of PIK3CA in which biophysical simulations suggest mutations affect regulatory interactions. The functional diversity of SMRs underscores both the varied mechanisms of oncogenic misregulation and the advantage of functionally-agnostic driver identification. PMID:26691984

  18. English-Thai Code-Switching of Teachers in ESP Classes

    ERIC Educational Resources Information Center

    Promnath, Korawan; Tayjasanant, Chamaipak

    2016-01-01

    The term code-switching (CS) that occurs in everyday situations, or naturalistic code-switching, has been a controversial strategy regarding whether it benefits or impedes language learning. The aim of this study was to investigate CS in conversations between teachers and students of ESP classes in order to explore the types and functions of CS…

  19. 78 FR 78705 - Airworthiness Directives; Airbus Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ..., and equipment to perform this type of modification, repair, and access. UAL also stated that certain... Association (ATA) of America Code 25, Equipment/ Furnishings; and Code 53, Fuselage. (e) Reason This AD was...

  20. Evaluation of Antibiotic Residues in Pasteurized and Raw Milk Distributed in the South of Khorasan-e Razavi Province, Iran.

    PubMed

    Moghadam, Mortez Mohammadzadeh; Amiri, Mostafa; Riabi, Hamed Ramezani Awal; Riabi, Hamid Ramezani Awal

    2016-12-01

    The presence of antibiotic residues in milk and other livestock products is a health problem which can endanger public health. Antibiotics are used widely in animal husbandry to treat diseases related to bacterial infections. Antimicrobial drugs have been in use for decades in industry. They are commonly used in livestock facilities to treat mastitis. This study aimed to investigate antibiotic residues in pasteurized milk distributed in schools, in milk collection centers, and in milk production factories in Gonabad city. This cross-sectional study was conducted on 251 samples of commercial pasteurized milk packets distributed in schools (code A), raw milk collection centers in Gonabad city (code B), and pasteurized milk production factories (code C) in Gonabad city. The Copan test kit (Christian Hansen Company, Denmark) was used to monitor antibiotic residues in milk. The data were analysed using the Chi-square test and one-way analysis of variance (ANOVA) to determine significant differences, with SPSS software version 20. The significance level was set at p<0.05. In total, 251 milk samples were collected, of which 143 (57%) were code A, 84 (33.5%) code B and 24 (9.6%) code C. A total of 189 samples (75.2%) were negative and 62 (24.8%) were positive. Of the three types of milk samples, 41 samples (28.7%) of code A, 18 samples (21.4%) of code B and 3 samples (12.5%) of code C were positive. Among the milk samples most contaminated with antibiotics, 17 samples were positive in January, and for code A, 13 samples were positive in the same month. There was no significant difference among the three types of milk (p>0.05). The highest number of milk samples (n=7) contaminated with antibiotics was related to code B (38.5%). Most positive cases were related to code A in winter. Also, there was no significant difference among the three types of contaminated milk regarding year and month (p=0.164 and p=0.917, respectively). Pasteurized milk supplied in the studied city has a high level of contamination owing to the high use of antibiotics. A standard limit needs to be set for the permissible level of antibiotic residues in milk to avoid their harmful effects.
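
    The chi-square comparison reported above can be sketched directly from the counts given in the abstract (41/143, 18/84, and 3/24 positives for codes A, B, and C). This is an illustrative reconstruction, not the authors' SPSS output; the helper function is standard textbook material, and the df=2 critical value (5.991 at alpha=0.05) is a table constant, not a number from the paper.

```python
# Pearson chi-square test of independence on the reported positive/negative
# counts per milk code. Illustrative reconstruction of the abstract's analysis.

def chi_square_statistic(table):
    """Pearson chi-square statistic for a list of [row][col] observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / grand  # expected under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# rows: milk codes A, B, C; columns: [positive, negative] antibiotic residues
observed = [[41, 102], [18, 66], [3, 21]]
stat = chi_square_statistic(observed)
df = (len(observed) - 1) * (len(observed[0]) - 1)  # = 2
CRITICAL_95 = 5.991  # chi-square critical value, df=2, alpha=0.05 (table constant)
significant = stat > CRITICAL_95
```

    The statistic comes out below the critical value, consistent with the abstract's finding of no significant difference among the three milk types (p>0.05).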

  1. Evaluation of Antibiotic Residues in Pasteurized and Raw Milk Distributed in the South of Khorasan-e Razavi Province, Iran

    PubMed Central

    Moghadam, Mortez Mohammadzadeh; Amiri, Mostafa; Riabi, Hamid Ramezani Awal

    2016-01-01

    Introduction The presence of antibiotic residues in milk and other livestock products is a health problem which can endanger public health. Antibiotics are used widely in animal husbandry to treat diseases related to bacterial infections. Antimicrobial drugs have been in use for decades in industry. They are commonly used in livestock facilities to treat mastitis. Aim This study aimed to investigate antibiotic residues in pasteurized milk distributed in schools, in milk collection centers, and in milk production factories in Gonabad city. Materials and Methods This cross-sectional study was conducted on 251 samples of commercial pasteurized milk packets distributed in schools (code A), raw milk collection centers in Gonabad city (code B), and pasteurized milk production factories (code C) in Gonabad city. The Copan test kit (Christian Hansen Company, Denmark) was used to monitor antibiotic residues in milk. The data were analysed using the Chi-square test and one-way analysis of variance (ANOVA) to determine significant differences, with SPSS software version 20. The significance level was set at p<0.05. Results In total, 251 milk samples were collected, of which 143 (57%) were code A, 84 (33.5%) code B and 24 (9.6%) code C. A total of 189 samples (75.2%) were negative and 62 (24.8%) were positive. Of the three types of milk samples, 41 samples (28.7%) of code A, 18 samples (21.4%) of code B and 3 samples (12.5%) of code C were positive. Among the milk samples most contaminated with antibiotics, 17 samples were positive in January, and for code A, 13 samples were positive in the same month. There was no significant difference among the three types of milk (p>0.05). The highest number of milk samples (n=7) contaminated with antibiotics was related to code B (38.5%). Most positive cases were related to code A in winter. Also, there was no significant difference among the three types of contaminated milk regarding year and month (p=0.164 and p=0.917, respectively). Conclusion Pasteurized milk supplied in the studied city has a high level of contamination owing to the high use of antibiotics. A standard limit needs to be set for the permissible level of antibiotic residues in milk to avoid their harmful effects. PMID:28208877

  2. Improved double-multiple streamtube model for the Darrieus-type vertical axis wind turbine

    NASA Astrophysics Data System (ADS)

    Berg, D. E.

    Double streamtube codes model the curved-blade (Darrieus-type) vertical axis wind turbine (VAWT) as a double actuator disk arrangement (one disk for each half of the rotor) and use conservation-of-momentum principles to determine the forces acting on the turbine blades and the turbine performance. Sandia National Laboratories developed a double-multiple streamtube model for the VAWT which incorporates the effects of the incident wind boundary layer, nonuniform velocity between the upwind and downwind sections of the rotor, dynamic stall effects, and local blade Reynolds number variations. The theory underlying this VAWT model is described, as well as the code capabilities. Code results are compared with experimental data from two VAWTs and with the results from another double-multiple streamtube code and a vortex filament code. The effects of neglecting dynamic stall and horizontal wind velocity distribution are also illustrated.
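
    The conservation-of-momentum step at the heart of streamtube models can be illustrated with a minimal single-streamtube sketch: momentum theory relates the thrust coefficient to the axial induction factor through C_T = 4a(1 - a), which streamtube codes solve iteratively. The fixed-point scheme and the thrust coefficient value below are illustrative assumptions, not Sandia's actual algorithm.

```python
# Minimal single-streamtube momentum iteration, illustrating the momentum
# balance behind double-multiple streamtube VAWT codes (illustrative sketch).

def induction_factor(thrust_coefficient, tol=1e-10, max_iter=1000):
    """Solve C_T = 4 a (1 - a) for the axial induction factor a (a < 0.5)."""
    a = 0.0
    for _ in range(max_iter):
        a_new = thrust_coefficient / (4.0 * (1.0 - a))  # fixed-point update
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    return a

a = induction_factor(0.6)          # illustrative thrust coefficient
velocity_ratio = 1.0 - 2.0 * a     # far-wake velocity / free-stream velocity
```

    In a double-multiple streamtube code this balance is struck twice per streamtube, once for the upwind and once for the downwind actuator disk, with blade-element forces supplying the thrust.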

  3. Certification of medical librarians, 1949--1977 statistical analysis.

    PubMed

    Schmidt, D

    1979-01-01

    The Medical Library Association's Code for Training and Certification of Medical Librarians was in effect from 1949 to August 1977, a period during which 3,216 individuals were certified. Statistics on each type of certificate granted each year are provided. Because 54.5% of those granted certification were awarded it in the last three-year, two-month period of the code's existence, these applications are reviewed in greater detail. Statistics on MLA membership, sex, residence, library school, and method of meeting requirements are detailed. Questions relating to certification under the code now in existence are raised.

  4. Certification of medical librarians, 1949--1977 statistical analysis.

    PubMed Central

    Schmidt, D

    1979-01-01

    The Medical Library Association's Code for Training and Certification of Medical Librarians was in effect from 1949 to August 1977, a period during which 3,216 individuals were certified. Statistics on each type of certificate granted each year are provided. Because 54.5% of those granted certification were awarded it in the last three-year, two-month period of the code's existence, these applications are reviewed in greater detail. Statistics on MLA membership, sex, residence, library school, and method of meeting requirements are detailed. Questions relating to certification under the code now in existence are raised. PMID:427287

  5. Cell-assembly coding in several memory processes.

    PubMed

    Sakurai, Y

    1998-01-01

    The present paper discusses why the cell assembly, i.e., an ensemble population of neurons with flexible functional connections, is a tenable view of the basic code for information processes in the brain. The main properties indicating the reality of cell-assembly coding are neuron overlaps among different assemblies and connection dynamics within and among the assemblies. The former can be detected as multiple functions of individual neurons in processing different kinds of information. Individual neurons appear to be involved in multiple information processes. The latter can be detected as changes of functional synaptic connections in processing different kinds of information. Correlations of activity among some of the recorded neurons appear to change in multiple information processes. Recent experiments have compared several different memory processes (tasks) and detected these two main properties, indicating cell-assembly coding of memory in the working brain. The first experiment compared different types of processing of identical stimuli, i.e., working memory and reference memory of auditory stimuli. The second experiment compared identical processes of different types of stimuli, i.e., discriminations of simple auditory, simple visual, and configural auditory-visual stimuli. The third experiment compared identical processes of different types of stimuli with or without temporal processing of stimuli, i.e., discriminations of elemental auditory, configural auditory-visual, and sequential auditory-visual stimuli. Some possible features of the cell-assembly coding, especially "dual coding" by individual neurons and cell assemblies, are discussed for future experimental approaches. Copyright 1998 Academic Press.

  6. 49 CFR 387.323 - Electronic filing of surety bonds, trust fund agreements, certificates of insurance and...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Start field End field Record type 1 Numeric 1=Filing2=Cancellation B 1 1 Insurer number 8 Text FMCSA... Filing type 1 Numeric 1 = BI&PD2 = Cargo 3 = Bond 4 = Trust Fund B 10 10 FMCSA docket number 8 Text FMCSA... 264 265 Insured zip code 9 Numeric (Do not include dash if using 9 digit code) B 266 274 Insured...

  7. 49 CFR 387.323 - Electronic filing of surety bonds, trust fund agreements, certificates of insurance and...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Start field End field Record type 1 Numeric 1=Filing2=Cancellation B 1 1 Insurer number 8 Text FMCSA... Filing type 1 Numeric 1 = BI&PD2 = Cargo 3 = Bond 4 = Trust Fund B 10 10 FMCSA docket number 8 Text FMCSA... 264 265 Insured zip code 9 Numeric (Do not include dash if using 9 digit code) B 266 274 Insured...

  8. Industrial Facility Combustion Energy Use

    DOE Data Explorer

    McMillan, Colin

    2016-08-01

    Facility-level industrial combustion energy use is calculated from greenhouse gas emissions data reported by large emitters (>25,000 metric tons CO2e per year) under the U.S. EPA's Greenhouse Gas Reporting Program (GHGRP, https://www.epa.gov/ghgreporting). The calculation applies EPA default emissions factors to reported fuel use by fuel type. Additional facility information is included with calculated combustion energy values, such as industry type (six-digit NAICS code), location (lat, long, zip code, county, and state), combustion unit type, and combustion unit name. Further identification of combustion energy use is provided by calculating energy end use (e.g., conventional boiler use, co-generation/CHP use, process heating, other facility support) by manufacturing NAICS code. Manufacturing facilities are matched by their NAICS code and reported fuel type with the proportion of combustion fuel energy for each end use category identified in the 2010 Energy Information Administration Manufacturing Energy Consumption Survey (MECS, http://www.eia.gov/consumption/manufacturing/data/2010/). MECS data are adjusted to account for data that were withheld or whose end use was unspecified following the procedure described in Fox, Don B., Daniel Sutter, and Jefferson W. Tester. 2011. The Thermal Spectrum of Low-Temperature Energy Use in the United States, NY: Cornell Energy Institute.
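
    The core back-calculation the dataset description outlines, dividing reported CO2 by a fuel's default emission factor to recover combustion energy, can be sketched as follows. The natural gas factor used here (about 53.06 kg CO2 per MMBtu) is the commonly cited EPA default from 40 CFR Part 98 Table C-1; treat it and the facility figures as illustrative assumptions rather than values from this dataset.

```python
# Back-calculating combustion energy (MMBtu) from reported CO2 mass and a
# fuel-specific default emission factor, per the dataset description. The
# factor below is an assumed EPA default, used for illustration only.

EMISSION_FACTORS_KG_CO2_PER_MMBTU = {
    "natural_gas": 53.06,  # assumed EPA 40 CFR Part 98 Table C-1 default
}

def combustion_energy_mmbtu(co2_kg, fuel_type):
    """Estimate fuel energy use (MMBtu) implied by reported CO2 mass (kg)."""
    return co2_kg / EMISSION_FACTORS_KG_CO2_PER_MMBTU[fuel_type]

# A facility at the GHGRP reporting threshold (25,000 metric tons CO2e),
# assumed here to burn only natural gas:
energy = combustion_energy_mmbtu(25_000 * 1000, "natural_gas")
```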

  9. Fast bi-directional prediction selection in H.264/MPEG-4 AVC temporal scalable video coding.

    PubMed

    Lin, Hung-Chih; Hang, Hsueh-Ming; Peng, Wen-Hsiao

    2011-12-01

    In this paper, we propose a fast algorithm that efficiently selects the temporal prediction type for the dyadic hierarchical-B prediction structure in the H.264/MPEG-4 temporal scalable video coding (SVC). We make use of the strong correlations in prediction type inheritance to eliminate the superfluous computations for the bi-directional (BI) prediction in the finer partitions, 16×8/8×16/8×8, by referring to the best temporal prediction type of 16×16. In addition, we carefully examine the relationship in motion bit-rate costs and distortions between the BI and the uni-directional temporal prediction types. As a result, we construct a set of adaptive thresholds to remove the unnecessary BI calculations. Moreover, for the block partitions smaller than 8×8, either the forward prediction (FW) or the backward prediction (BW) is skipped based upon the information of their 8×8 partitions. Hence, the proposed schemes can efficiently reduce the extensive computational burden in calculating the BI prediction. As compared to the JSVM 9.11 software, our method saves the encoding time from 48% to 67% for a large variety of test videos over a wide range of coding bit-rates and has only a minor coding performance loss. © 2011 IEEE.

  10. What does music express? Basic emotions and beyond.

    PubMed

    Juslin, Patrik N

    2013-01-01

    Numerous studies have investigated whether music can reliably convey emotions to listeners, and, if so, what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of "multiple layers" of musical expression of emotions. The "core" layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this "core" layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions, though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions.

  11. A Framework for Resilient Remote Monitoring

    DTIC Science & Technology

    2014-08-01

    of low-level observables are available, audited, and recorded. This establishes the need for a remote monitoring framework that can integrate with...Security, WS-Policy, SAML, XML Signature, and XML Encryption. Pearson Higher Education, 2004. [3] OMG, “Common Secure Interoperability Protocol...www.darpa.mil/Our_Work/I2O/Programs/Integrated_Cyber_Analysis_System_%28ICAS%29.aspx. [8] D. Miller and B. Pearson, Security information and event man

  12. Exposure and Experience: Additional Criteria for Selecting Future Operational Theater Commanders

    DTIC Science & Technology

    2009-10-23

    American Civil War, WWII and today's conflict. However, for the scope of this paper, a pattern clearly emerges between service in direct observation of...Kaufmann. From Plato to Derrida. Upper Saddle River, New Jersey: Pearson Prentice Hall, 2008. 8 Experience Comparison of Former...Forrest E., and Walter Kaufmann. From Plato to Derrida. Upper Saddle River, New Jersey: Pearson Prentice Hall, 2008. Bell, William Gardner. Center

  13. An Analysis of the Beaufort Sea Thermohaline Structure and Variability, and Its Effects on Acoustic Propagation

    DTIC Science & Technology

    2016-06-01

    AN ANALYSIS OF THE BEAUFORT SEA THERMOHALINE STRUCTURE AND VARIABILITY, AND ITS EFFECTS ON ACOUSTIC PROPAGATION, by Annalise N. Pearson, June 2016 thesis. Approved for public release; distribution is unlimited.

  14. Choosing the Best Correction Formula for the Pearson r[superscript 2] Effect Size

    ERIC Educational Resources Information Center

    Skidmore, Susan Troncoso; Thompson, Bruce

    2011-01-01

    In the present Monte Carlo simulation study, the authors compared bias and precision of 7 sampling error corrections to the Pearson r[superscript 2] under 6 x 3 x 6 conditions (i.e., population ρ values of 0.0, 0.1, 0.3, 0.5, 0.7, and 0.9, respectively; population shapes normal, skewness = kurtosis = 1, and skewness = -1.5 with kurtosis =…

  15. The Effect of Advanced Education on the Retention and Promotion of Army Officers

    DTIC Science & Technology

    2007-03-01

    2 Ronald G. Ehrenberg and Robert S. Smith, Modern Labor Economics: Theory and Public Policy (New York: Pearson Education, Inc, 2006...Ehrenberg and Robert S. Smith, Modern Labor Economics, 9th ed. (New York: Pearson Education, Inc, 2006). 30 G. S. Becker, Human Capital: A Theoretical...their jobs. As a 32 Ronald G. Ehrenberg and Robert S. Smith, Modern Labor Economics, 9th ed. (New

  16. Ellipticity angle of electromagnetic signals and its use for non-energetic detection optimal by the Neyman-Pearson criterion

    NASA Astrophysics Data System (ADS)

    Gromov, V. A.; Sharygin, G. S.; Mironov, M. V.

    2012-08-01

    An interval method of radar signal detection and selection based on a non-energetic polarization parameter, the ellipticity angle, is suggested. The examined method is optimal by the Neyman-Pearson criterion. The probability of correct detection for a preset probability of false alarm is calculated for different signal-to-noise ratios. Recommendations for optimization of the given method are provided.
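
    The Neyman-Pearson recipe referred to in the abstract, fixing the false-alarm probability and maximizing detection probability, can be sketched for a simple Gaussian test statistic. The unit-variance Gaussian model and the signal level below are illustrative assumptions, not the paper's ellipticity-angle statistic.

```python
# Neyman-Pearson threshold setting for a Gaussian test statistic: fix P_FA,
# derive the threshold, then evaluate detection probability for an assumed
# signal shift (illustrative model, not the paper's).
from statistics import NormalDist

noise = NormalDist(mu=0.0, sigma=1.0)

def np_threshold(p_false_alarm):
    """Threshold gamma such that P(statistic > gamma | noise only) = P_FA."""
    return noise.inv_cdf(1.0 - p_false_alarm)

def detection_probability(threshold, signal_mean):
    """P(statistic > threshold) when the signal shifts the statistic's mean."""
    return 1.0 - NormalDist(mu=signal_mean, sigma=1.0).cdf(threshold)

gamma = np_threshold(1e-3)                         # preset false-alarm probability
p_d = detection_probability(gamma, signal_mean=4.0)  # assumed signal level
```

    Sweeping `signal_mean` reproduces the kind of detection-versus-signal/noise curves the abstract describes.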

  17. Pearson syndrome in the neonatal period: two case reports and review of the literature.

    PubMed

    Manea, Elena Maria; Leverger, Guy; Bellmann, Francoise; Stanescu, Popp Alina; Mircea, Adam; Lèbre, Anne-Sophie; Rötig, Agnes; Munnich, Arnold

    2009-12-01

    Pearson syndrome is a multiorgan mitochondrial cytopathy that results from defective oxidative phosphorylation owing to mitochondrial DNA deletions. Prognosis is severe and death occurs in infancy or early childhood. This article describes 2 cases with a severe neonatal onset of the disease. A review of the literature reveals the atypical presentation of the disease in the neonatal period, which is often overlooked and underdiagnosed.

  18. Health Service Quality Scale: Brazilian Portuguese translation, reliability and validity.

    PubMed

    Rocha, Luiz Roberto Martins; Veiga, Daniela Francescato; e Oliveira, Paulo Rocha; Song, Elaine Horibe; Ferreira, Lydia Masako

    2013-01-17

    The Health Service Quality Scale is a multidimensional hierarchical scale that is based on an interdisciplinary approach. This instrument was specifically created for measuring health service quality based on marketing and health care concepts. The aim of this study was to translate and culturally adapt the Health Service Quality Scale into Brazilian Portuguese and to assess the validity and reliability of the Brazilian Portuguese version of the instrument. We conducted a cross-sectional, observational study, with public health system patients in a Brazilian university hospital. Validity was assessed using Pearson's correlation coefficient to measure the strength of the association between the Brazilian Portuguese version of the instrument and the SERVQUAL scale. Internal consistency was evaluated using Cronbach's alpha coefficient; the intraclass (ICC) and Pearson's correlation coefficients were used for test-retest reliability. One hundred and sixteen consecutive postoperative patients completed the questionnaire. Pearson's correlation coefficient for validity was 0.20. Cronbach's alpha for the first and second administrations of the final version of the instrument were 0.982 and 0.986, respectively. For test-retest reliability, Pearson's correlation coefficient was 0.89 and ICC was 0.90. The culturally adapted, Brazilian Portuguese version of the Health Service Quality Scale is a valid and reliable instrument to measure health service quality.
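
    Two of the statistics named above, Pearson's correlation (used for validity) and Cronbach's alpha (internal consistency), can be computed from first principles. The formulas are standard; the data passed in are illustrative, not the study's questionnaire scores.

```python
# Standard formulas for Pearson's r and Cronbach's alpha (illustrative sketch).

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k = len(items)
    def variance(v):  # sample variance
        m = sum(v) / len(v)
        return sum((a - m) ** 2 for a in v) / (len(v) - 1)
    item_var = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - item_var / variance(totals))
```

    Perfectly consistent items give alpha = 1; the study's reported values (0.982, 0.986) indicate near-perfect internal consistency.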

  19. Biochemical abnormalities in Pearson syndrome.

    PubMed

    Crippa, Beatrice Letizia; Leon, Eyby; Calhoun, Amy; Lowichik, Amy; Pasquali, Marzia; Longo, Nicola

    2015-03-01

    Pearson marrow-pancreas syndrome is a multisystem mitochondrial disorder characterized by bone marrow failure and pancreatic insufficiency. Children who survive the severe bone marrow dysfunction in childhood develop Kearns-Sayre syndrome later in life. Here we report on four new cases with this condition and define their biochemical abnormalities. Three out of four patients presented with failure to thrive, with most of them having normal development and head size. All patients had evidence of bone marrow involvement that spontaneously improved in three out of four patients. Unique findings in our patients were acute pancreatitis (one out of four), renal Fanconi syndrome (present in all patients, but symptomatic only in one), and an unusual organic aciduria with 3-hydroxyisobutyric aciduria in one patient. Biochemical analysis indicated low levels of plasma citrulline and arginine, despite low-normal ammonia levels. Regression analysis indicated a significant correlation between each intermediate of the urea cycle and the next, except between ornithine and citrulline. This suggested that the reaction catalyzed by ornithine transcarbamylase (that converts ornithine to citrulline) might not be very efficient in patients with Pearson syndrome. In view of low-normal ammonia levels, we hypothesize that ammonia and carbamylphosphate could be diverted from the urea cycle to the synthesis of nucleotides in patients with Pearson syndrome and possibly other mitochondrial disorders. © 2015 Wiley Periodicals, Inc.

  20. Trellis coding techniques for mobile communications

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Simon, M. K.; Jedrey, T.

    1988-01-01

    A criterion for designing optimum trellis codes to be used over fading channels is given. A technique is shown for reducing certain multiple trellis codes, optimally designed for the fading channel, to conventional (i.e., multiplicity one) trellis codes. The computational cutoff rate R0 is evaluated for MPSK transmitted over fading channels. Examples of trellis codes optimally designed for the Rayleigh fading channel are given and compared with respect to R0. Two types of modulation/demodulation techniques are considered, namely coherent (using pilot tone-aided carrier recovery) and differentially coherent with Doppler frequency correction. Simulation results are given for end-to-end performance of two trellis-coded systems.
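
    Trellis codes are generated by finite-state encoders; a minimal textbook example is the rate-1/2, constraint-length-3 convolutional encoder with (7, 5) octal generators sketched below. It only illustrates the finite-state trellis structure; it is not one of the fading-channel-optimized codes designed in the paper.

```python
# Minimal rate-1/2 convolutional (trellis) encoder, constraint length 3,
# standard (7, 5) octal generators. Generic textbook example, not the
# paper's optimized codes.

G = (0b111, 0b101)  # generator polynomials (7, 5) in octal

def conv_encode(bits):
    """Encode a bit sequence; emits two output bits per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111      # 3-bit shift register
        for g in G:
            out.append(bin(state & g).count("1") % 2)  # parity of tapped bits
    return out

codeword = conv_encode([1, 0, 1, 1])
```

    In trellis-coded modulation the encoder output pairs would then be mapped onto MPSK symbols; the paper's design criterion selects such mappings for Rayleigh fading rather than AWGN distance.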

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buttler, D J

    The Java Metadata Facility is introduced by Java Specification Request (JSR) 175 [1], and incorporated into the Java language specification [2] in version 1.5 of the language. The specification allows annotations on Java program elements: classes, interfaces, methods, and fields. Annotations give programmers a uniform way to add metadata to program elements that can be used by code checkers, code generators, or other compile-time or runtime components. Annotations are defined by annotation types. These are defined the same way as interfaces, but with the symbol @ preceding the interface keyword. There are additional restrictions on defining annotation types: (1) They cannot be generic; (2) They cannot extend other annotation types or interfaces; (3) Methods cannot have any parameters; (4) Methods cannot have type parameters; (5) Methods cannot throw exceptions; and (6) The return type of methods of an annotation type must be a primitive, a String, a Class, an annotation type, or an array, where the type of the array is restricted to one of the four allowed types. See [2] for additional restrictions and syntax. The methods of an annotation type define the elements that may be used to parameterize the annotation in code. Annotation types may have default values for any of its elements. For example, an annotation that specifies a defect report could initialize an element defining the defect outcome to submitted. Annotations may also have zero elements. This could be used to indicate serializability for a class (as opposed to the current Serializable interface).

  2. When Homoplasy Is Not Homoplasy: Dissecting Trait Evolution by Contrasting Composite and Reductive Coding.

    PubMed

    Torres-Montúfar, Alejandro; Borsch, Thomas; Ochoterena, Helga

    2018-05-01

    The conceptualization and coding of characters is a difficult issue in phylogenetic systematics, no matter which inference method is used when reconstructing phylogenetic trees or if the characters are just mapped onto a specific tree. Complex characters are groups of features that can be divided into simpler hierarchical characters (reductive coding), although the implied hierarchical relational information may change depending on the type of coding (composite vs. reductive). To date, there is no common agreement on whether to code characters as complex or simple. Phylogeneticists have discussed which coding method is best but have not incorporated the heuristic process of reciprocal illumination to evaluate the coding. Composite coding allows to test whether 1) several characters were linked, resulting in a structure described as a complex character or trait, or 2) independently evolving characters resulted in the configuration incorrectly interpreted as a complex character. We propose that complex characters or character states should be decomposed iteratively into simpler characters when the original homology hypothesis is not corroborated by a phylogenetic analysis, and the character or character state is retrieved as homoplastic. We tested this approach using the case of fruit types within subfamily Cinchonoideae (Rubiaceae). The iterative reductive coding of characters associated with drupes allowed us to untangle fruit evolution within Cinchonoideae. Our results show that drupes and berries are not homologous. As a consequence, a more precise ontology for the Cinchonoideae drupes is required.

  3. Broadband transmission-type coding metamaterial for wavefront manipulation for airborne sound

    NASA Astrophysics Data System (ADS)

    Li, Kun; Liang, Bin; Yang, Jing; Yang, Jun; Cheng, Jian-chun

    2018-07-01

    The recent advent of coding metamaterials, as a new class of acoustic metamaterials, substantially reduces the complexity in the design and fabrication of acoustic functional devices capable of manipulating sound waves in exotic manners by arranging coding elements with discrete phase states in specific sequences. It is therefore intriguing, both physically and practically, to pursue a mechanism for realizing broadband acoustic coding metamaterials that control transmitted waves with a fine resolution of the phase profile. Here, we propose the design of a transmission-type acoustic coding device and demonstrate its metamaterial-based implementation. The mechanism is that, instead of relying on resonant coding elements that are necessarily narrow-band, we build weak-resonant coding elements with a helical-like metamaterial with a continuously varying pitch that effectively expands the working bandwidth while maintaining the sub-wavelength resolution of the phase profile that is vital for the production of complicated wave fields. The effectiveness of our proposed scheme is numerically verified via the demonstration of three distinctive examples of acoustic focusing, anomalous refraction, and vortex beam generation in the prescribed frequency band on the basis of 1- and 2-bit coding sequences. Simulation results agree well with theoretical predictions, showing that the designed coding devices with discrete phase profiles are efficient in engineering the wavefront of outgoing waves to form the desired spatial pattern. We anticipate the realization of coding metamaterials with broadband functionality and design flexibility to open up possibilities for novel acoustic functional devices for the special manipulation of transmitted waves and underpin diverse applications ranging from medical ultrasound imaging to acoustic detection.
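
    The 1-bit coding idea described above, discretizing a desired transmission phase profile into two states (0 and π), can be sketched for a simple focusing profile. The frequency, focal length, and element pitch below are illustrative assumptions, not the paper's design.

```python
# Quantizing an ideal focusing phase profile into 1-bit coding elements
# (phase 0 or pi), the basic coding-metasurface recipe. Geometry is assumed.
import math

SPEED_OF_SOUND = 343.0   # m/s, air
FREQ = 3430.0            # Hz -> wavelength 0.1 m (chosen for round numbers)
WAVELENGTH = SPEED_OF_SOUND / FREQ
K = 2 * math.pi / WAVELENGTH

def focusing_phase(x, focal_length):
    """Ideal hyperbolic phase profile for focusing at distance F, mod 2*pi."""
    return (K * (math.sqrt(x * x + focal_length ** 2) - focal_length)) % (2 * math.pi)

def one_bit_code(x, focal_length):
    """Quantize the ideal phase to the nearer of the two 1-bit states {0, pi}."""
    phi = focusing_phase(x, focal_length)
    return 1 if math.pi / 2 <= phi < 3 * math.pi / 2 else 0

pitch = WAVELENGTH / 2   # assumed sub-wavelength element spacing
sequence = [one_bit_code(i * pitch, focal_length=0.5) for i in range(-10, 11)]
```

    A 2-bit device would quantize the same profile into four states (0, π/2, π, 3π/2) for a finer phase resolution, as in the paper's examples.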

  4. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP based process.

  5. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principle Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  6. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  7. Proper coding of the Abbreviated Injury Scale: can clinical parameters help as surrogates in estimating blood loss?

    PubMed

    Burkhardt, M; Holstein, J H; Moersdorf, P; Kristen, A; Lefering, R; Pohlemann, T; Pizanis, A

    2014-08-01

    The Abbreviated Injury Scale (AIS) requires the estimation of the lost blood volume for some severity assignments. This study aimed to develop a rule of thumb for facilitating AIS coding by using objective clinical parameters as surrogate markers of blood loss. Using the example of pelvic ring fractures, a retrospective analysis of TraumaRegister DGU(®) data from 2002 to 2011 was performed. As potential surrogate markers of blood loss, we recorded the hemoglobin (Hb) level, systolic blood pressure (SBP), base excess (BE), Quick's value, units of packed red blood cells (PRBCs) transfused before intensive care unit (ICU) admission, and mortality within 24 h. We identified 11,574 patients with pelvic ring fractures (Tile/OTA classification: 39 % type A, 40 % type B, 21 % type C). Type C fractures were 73.1 % AISpelvis 4 and 26.9 % AISpelvis 5. Type B fractures were 47 % AISpelvis 3, 47 % AISpelvis 4, and 6 % AISpelvis 5. In type C fractures, cut-off values of <7 g/dL Hb, <90 mmHg SBP, <-9 mmol/L BE, <35 % Quick's value, >15 units PRBCs, and death within 24 h had a positive predictive value of 47 % and a sensitivity of 62 % for AISpelvis 5. In type B fractures, these cut-off values had poor sensitivity (48 %) and positive predictive value (11 %) for AISpelvis 5. We failed to develop a rule of thumb for facilitating a proper future AIS coding using the example of pelvic ring fractures. The estimation of blood loss for severity assignment still remains a noteworthy weakness in the AIS coding of traumatic injuries.

  8. Genome-wide identification and characterization of long non-coding RNAs in developmental skeletal muscle of fetal goat.

    PubMed

    Zhan, Siyuan; Dong, Yao; Zhao, Wei; Guo, Jiazhong; Zhong, Tao; Wang, Linjie; Li, Li; Zhang, Hongping

    2016-08-22

    Long non-coding RNAs (lncRNAs) have been studied extensively over the past few years. Large numbers of lncRNAs have been identified in mouse, rat, and human, and some of them have been shown to play important roles in muscle development and myogenesis. However, there are few reports on the characterization of lncRNAs covering all the development stages of skeletal muscle in livestock. RNA libraries constructed from developing longissimus dorsi muscle of fetal (45, 60, and 105 days of gestation) and postnatal (3 days after birth) goat (Capra hircus) were sequenced. A total of 1,034,049,894 clean reads were generated. Among them, 3981 lncRNA transcripts corresponding to 2739 lncRNA genes were identified, including 3515 intergenic lncRNAs and 466 anti-sense lncRNAs. Notably, in pairwise comparisons between the libraries of skeletal muscle at the different development stages, a total of 577 transcripts were differentially expressed (P < 0.05), which were validated by qPCR of six randomly selected lncRNA genes. The identified goat lncRNAs shared some characteristics, such as fewer exons and shorter length, with the lncRNAs in other mammals. We also found that 1153 lncRNA genes were located near 1455 protein-coding genes (<10 kb upstream or downstream) and were functionally enriched in transcriptional regulation and development-related processes, indicating they may be in cis-regulatory relationships. Additionally, Pearson's correlation coefficients of co-expression levels suggested 1737 lncRNAs and 19,422 mRNAs were possibly in trans-regulatory relationships (r > 0.95 or r < -0.95). These co-expressed mRNAs were enriched in development-related biological processes such as muscle system processes, regulation of cell growth, muscle cell development, regulation of transcription, and embryonic morphogenesis. This study provides a catalog of goat muscle-related lncRNAs, and will contribute to a fuller understanding of the molecular mechanism underpinning muscle development in mammals.
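    The trans-regulation screen described above reduces to a simple computation: correlate each lncRNA expression profile with each mRNA profile across the sampled stages and keep pairs passing the |r| > 0.95 cutoff quoted in the abstract. A minimal pure-Python sketch (gene names and expression values are invented, not the study's data):

```python
# Hypothetical illustration of the |r| > 0.95 co-expression filter.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def candidate_trans_pairs(lncs, mrnas, cutoff=0.95):
    """Return (lnc, mrna, r) triples whose profiles satisfy |r| > cutoff."""
    hits = []
    for ln, lx in lncs.items():
        for mn, mx in mrnas.items():
            r = pearson_r(lx, mx)
            if abs(r) > cutoff:
                hits.append((ln, mn, r))
    return hits

lncs = {"lnc1": [1.0, 2.1, 3.0, 4.2]}    # four developmental stages
mrnas = {"MYOD1": [2.0, 4.1, 6.2, 8.0],  # tracks lnc1 closely
         "GAPDH": [5.0, 5.1, 4.9, 5.0]}  # flat housekeeping profile
hits = candidate_trans_pairs(lncs, mrnas)
print([(a, b) for a, b, _ in hits])
```

    Only the tightly co-varying pair survives the cutoff; the flat housekeeping profile is filtered out.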

  9. General RMP Guidance - Appendix B: Selected NAICS Codes

    EPA Pesticide Factsheets

    This appendix contains a list of selected 2002 North American Industry Classification System (NAICS) codes used by Federal statistical agencies, in designating business types or functions in categories such as farming, manufacturing, and waste management.

  10. Effect of MultiSubstitution on the Thermoelectric Performance of the Ca11-xYbxSb10-yGez (0 ≤ x ≤ 9; 0 ≤ y ≤ 3; 0 ≤ z ≤ 3) System: Experimental and Theoretical Studies.

    PubMed

    Nam, Gnu; Choi, Woongjin; Lee, Junsu; Lim, Seong-Ji; Jo, Hongil; Ok, Kang Min; Ahn, Kyunghan; You, Tae-Soo

    2017-06-19

    The Zintl phase solid-solution Ca 11-x Yb x Sb 10-y Ge z (0 ≤ x ≤ 9; 0 ≤ y ≤ 3; 0 ≤ z ≤ 3) system with cationic/anionic multisubstitution has been synthesized by molten Sn metal flux and arc-melting methods. The crystal structures of the nine title compounds were characterized by both powder and single-crystal X-ray diffraction; all adopt the Ho 11 Ge 10 -type structure with the tetragonal space group I4/mmm (Z = 4, Pearson Code tI84). The overall isotypic structure of the nine title compounds can be illustrated as an assembly of three different types of cationic polyhedra sharing faces with their neighboring polyhedra and three-dimensional cage-shaped anionic frameworks consisting of dumbbell-shaped Sb 2 units and square-shaped Sb 4 or (Sb/Ge) 4 units. Interestingly, during the multisubstitution trials we observed a metal-to-semiconductor transition as the Ca and Ge contents increased in the title system from Yb 11 Sb 10 to Ca 9 Yb 2 Sb 7 Ge 3 (nominal compositions), on the basis of a series of thermoelectric property measurements. This phenomenon can be elucidated by the suppression of bipolar conduction of holes and electrons via extra hole-carrier doping. Tight-binding linear muffin-tin orbital calculations using four hypothetical structural models proved that the size of the pseudogap and the magnitude of the density of states at the Fermi level are significantly influenced by the substituting elements as well as their atomic sites in the unit cell. The observed cationic/anionic site preferences, the historically known abnormalities of atomic displacement parameters, and the occupation deficiencies of particular atomic sites are further rationalized by the QVAL value criterion on the basis of the theoretical calculations. The results of SEM, EDS, and TGA analyses are also provided.

  11. Practices in public health finance: an investigation of jurisdiction funding patterns and performance.

    PubMed

    Honoré, Peggy A; Simoes, Eduardo J; Jones, Walter J; Moonesinghe, Ramal

    2004-01-01

    A field of study for public health finance has never been adequately developed. Consequently, very little is known about the relationships, types, and amount of finances that fund the public health system in America. This research was undertaken to build on the sparse knowledge of public health finance by examining the value of performance measurement systems to financial analysis. A correlational study was conducted to examine the associations between public health system performance of the 10 essential public health services and funding patterns of 50 local health departments in a large state. The specific objectives were to investigate if different levels and types of revenues, expenditures, and other demographic variables in a jurisdiction are correlated to performance. Pearson correlation analysis did not conclusively show strong associations; however, statistically significant positive associations primarily between higher levels of performance and jurisdiction taxes per capita were found.

  12. External Validation of a Case-Mix Adjustment Model for the Standardized Reporting of 30-Day Stroke Mortality Rates in China.

    PubMed

    Yu, Ping; Pan, Yuesong; Wang, Yongjun; Wang, Xianwei; Liu, Liping; Ji, Ruijun; Meng, Xia; Jing, Jing; Tong, Xu; Guo, Li; Wang, Yilong

    2016-01-01

    A case-mix adjustment model has been developed and externally validated, demonstrating promise. However, the model has not been thoroughly tested among populations in China. In our study, we evaluated the performance of the model in Chinese patients with acute stroke. The case-mix adjustment model A includes items on age, presence of atrial fibrillation on admission, National Institutes of Health Stroke Severity Scale (NIHSS) score on admission, and stroke type. Model B is similar to Model A but includes only the consciousness component of the NIHSS score. Both model A and B were evaluated to predict 30-day mortality rates in 13,948 patients with acute stroke from the China National Stroke Registry. The discrimination of the models was quantified by c-statistic. Calibration was assessed using Pearson's correlation coefficient. The c-statistic of model A in our external validation cohort was 0.80 (95% confidence interval, 0.79-0.82), and the c-statistic of model B was 0.82 (95% confidence interval, 0.81-0.84). Excellent calibration was reported in the two models with Pearson's correlation coefficient (0.892 for model A, p<0.001; 0.927 for model B, p = 0.008). The case-mix adjustment model could be used to effectively predict 30-day mortality rates in Chinese patients with acute stroke.
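    The discrimination measure used above, the c-statistic, has a simple pairwise interpretation: the fraction of (event, non-event) patient pairs in which the model assigns the higher predicted risk to the patient who died, ties counted as half. A hedged sketch with invented data (not the registry's):

```python
# Illustrative c-statistic (concordance) computation; data are made up.
def c_statistic(risks, outcomes):
    """Fraction of (event, non-event) pairs ranked concordantly by risk."""
    events = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevents = [r for r, y in zip(risks, outcomes) if y == 0]
    wins = sum(1.0 if e > n else 0.5 if e == n else 0.0
               for e in events for n in nonevents)
    return wins / (len(events) * len(nonevents))

risks = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]  # predicted 30-day mortality risk
died = [1, 1, 0, 1, 0, 0]               # observed outcome (1 = died)
print(round(c_statistic(risks, died), 2))
```

    A value near 0.8, as reported for the validation cohort, means roughly four of five such pairs are ranked correctly.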

  13. An augmented reality C-arm for intraoperative assessment of the mechanical axis: a preclinical study.

    PubMed

    Fallavollita, Pascal; Brand, Alexander; Wang, Lejing; Euler, Ekkehard; Thaller, Peter; Navab, Nassir; Weidert, Simon

    2016-11-01

    Determination of lower limb alignment is a prerequisite for successful orthopedic surgical treatment. Traditional methods include the electrocautery cord, alignment rod, or axis board which rely solely on C-arm fluoroscopy navigation and are radiation intensive. To assess a new augmented reality technology in determining lower limb alignment. A camera-augmented mobile C-arm (CamC) technology was used to create a panorama image consisting of hip, knee, and ankle X-rays. Twenty-five human cadaver legs were used for validation with random varus or valgus deformations. Five clinicians performed experiments that consisted in achieving acceptable mechanical axis deviation. The applicability of the CamC technology was assessed with direct comparison to ground-truth CT. A t test, Pearson's correlation, and ANOVA were used to determine statistical significance. The value of Pearson's correlation coefficient R was 0.979 which demonstrates a strong positive correlation between the CamC and ground-truth CT data. The analysis of variance produced a p value equal to 0.911 signifying that clinician expertise differences were not significant with regard to the type of system used to assess mechanical axis deviation. All described measurements demonstrated valid measurement of lower limb alignment. With minimal effort, clinicians required only 3 X-ray image acquisitions using the augmented reality technology to achieve reliable mechanical axis deviation.

  14. A powerful nonparametric method for detecting differentially co-expressed genes: distance correlation screening and edge-count test.

    PubMed

    Zhang, Qingyang

    2018-05-16

    Differential co-expression analysis, as a complement of differential expression analysis, offers significant insights into the changes in molecular mechanism of different phenotypes. A prevailing approach to detecting differentially co-expressed genes is to compare Pearson's correlation coefficients in two phenotypes. However, due to the limitations of Pearson's correlation measure, this approach lacks the power to detect nonlinear changes in gene co-expression, which are common in gene regulatory networks. In this work, a new nonparametric procedure is proposed to search for differentially co-expressed gene pairs in different phenotypes in large-scale data. Our computational pipeline consists of two main steps: a screening step and a testing step. The screening step reduces the search space by filtering out all the independent gene pairs using the distance correlation measure. In the testing step, we compare the gene co-expression patterns in different phenotypes by a recently developed edge-count test. Both steps are distribution-free and target nonlinear relations. We illustrate the promise of the new approach by analyzing the Cancer Genome Atlas data and the METABRIC data for breast cancer subtypes. Compared with some existing methods, the new method is more powerful in detecting nonlinear types of differential co-expression. The distance correlation screening can greatly improve computational efficiency, facilitating its application to large data sets.
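    The screening step can be sketched with a bare-bones distance correlation, which vanishes only under independence and so also catches nonlinear dependence that Pearson's r misses. This is an illustrative pure-Python version of the standard (biased, V-statistic) estimator, not the authors' code, and the edge-count test is not reproduced:

```python
# Minimal distance correlation: double-center pairwise distance matrices,
# then normalize the distance covariance by the distance variances.
def dcor(x, y):
    n = len(x)
    a = [[abs(x[i] - x[j]) for j in range(n)] for i in range(n)]
    b = [[abs(y[i] - y[j]) for j in range(n)] for i in range(n)]

    def center(m):
        rm = [sum(row) / n for row in m]                            # row means
        cm = [sum(m[i][j] for i in range(n)) / n for j in range(n)] # col means
        gm = sum(rm) / n                                            # grand mean
        return [[m[i][j] - rm[i] - cm[j] + gm for j in range(n)] for i in range(n)]

    A, B = center(a), center(b)
    dcov2 = sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n ** 2
    dvarx = sum(A[i][j] ** 2 for i in range(n) for j in range(n)) / n ** 2
    dvary = sum(B[i][j] ** 2 for i in range(n) for j in range(n)) / n ** 2
    if dvarx == 0 or dvary == 0:
        return 0.0
    return (dcov2 / (dvarx * dvary) ** 0.5) ** 0.5

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
linear = dcor(x, x)                      # perfectly dependent
quadratic = dcor(x, [v * v for v in x])  # nonlinear but dependent
print(round(linear, 3), quadratic > 0.3)
```

    For the symmetric quadratic relation, Pearson's r would be exactly zero while the distance correlation remains clearly positive, which is why it is suitable for the screening role described above.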

  15. Isolation of an intertypic poliovirus capsid recombinant from a child with vaccine-associated paralytic poliomyelitis.

    PubMed

    Martín, Javier; Samoilovich, Elena; Dunn, Glynis; Lackenby, Angie; Feldman, Esphir; Heath, Alan; Svirchevskaya, Ekaterina; Cooper, Gill; Yermalovich, Marina; Minor, Philip D

    2002-11-01

    The isolation of a capsid intertypic poliovirus recombinant from a child with vaccine-associated paralytic poliomyelitis is described. Virus 31043 had a Sabin-derived type 3-type 2-type 1 recombinant genome with a 5'-end crossover point within the capsid coding region. The result was a poliovirus chimera containing the entire coding sequence for antigenic site 3a derived from the Sabin type 2 strain. The recombinant virus showed altered antigenic properties but did not acquire type 2 antigenic characteristics. The significance of the presence in nature of such poliovirus chimeras and the consequences for the current efforts to detect potentially dangerous vaccine-derived poliovirus strains are discussed in the context of the global polio eradication initiative.

  16. Gender and Heritage Spanish Bilingual Grammars: A Study of Code-Mixed Determiner Phrases and Copula Constructions

    ERIC Educational Resources Information Center

    Valenzuela, Elena; Faure, Ana; Ramirez-Trujillo, Alma P.; Barski, Ewelina; Pangtay, Yolanda; Diez, Adriana

    2012-01-01

    The study examined heritage speaker grammars and to what extent they diverge with respect to grammatical gender from adult L2 learners. Results from a preference task involving code-mixed Determiner Phrases (DPs) and code-mixed copula constructions show a difference between these two types of operations. Heritage speakers patterned with the…

  17. Higher Education in Further Education Colleges: Indirectly Funded Partnerships: Codes of Practice for Franchise and Consortia Arrangements. Report.

    ERIC Educational Resources Information Center

    Higher Education Funding Council for England, Bristol.

    This report provides codes of practice for two types of indirectly funded partnerships entered into by higher education institutions and further education sector colleges: franchises and consortia. The codes of practice set out guidance on the principles that should be reflected in the franchise and consortia agreements that underpin indirectly…

  18. Composite load spectra for select space propulsion structural components

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Kurth, R. E.; Ho, H.

    1986-01-01

    A multiyear program is performed with the objective to develop generic load models with multiple levels of progressive sophistication to simulate the composite (combined) load spectra that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. Progress of the first year's effort includes completion of a sufficient portion of each task -- probabilistic models, code development, validation, and an initial operational code. This code has from its inception an expert system philosophy that could be added to throughout the program and in the future. The initial operational code is only applicable to turbine blade type loadings. The probabilistic model included in the operational code has fitting routines for loads that utilize a modified Discrete Probabilistic Distribution termed RASCAL, a barrier crossing method and a Monte Carlo method. An initial load model was developed by Battelle that is currently used for the slowly varying duty cycle type loading. The intent is to use the model and related codes essentially in the current form for all loads that are based on measured or calculated data that have followed a slowly varying profile.

  19. GeneXplorer: an interactive web application for microarray data visualization and analysis.

    PubMed

    Rees, Christian A; Demeter, Janos; Matese, John C; Botstein, David; Sherlock, Gavin

    2004-10-01

    When publishing large-scale microarray datasets, it is of great value to create supplemental websites where either the full data, or selected subsets corresponding to figures within the paper, can be browsed. We set out to create a CGI application containing many of the features of some of the existing standalone software for the visualization of clustered microarray data. We present GeneXplorer, a web application for interactive microarray data visualization and analysis in a web environment. GeneXplorer allows users to browse a microarray dataset in an intuitive fashion. It provides simple access to microarray data over the Internet and uses only HTML and JavaScript to display graphic and annotation information. It provides radar and zoom views of the data, allows display of the nearest neighbors to a gene expression vector based on their Pearson correlations and provides the ability to search gene annotation fields. The software is released under the permissive MIT Open Source license, and the complete documentation and the entire source code are freely available for download from CPAN http://search.cpan.org/dist/Microarray-GeneXplorer/.

  1. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
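    The correlation-based sensitivity measures named above are straightforward to compute. As a hedged illustration with made-up samples (not VERA-CS output), Spearman's coefficient is simply Pearson's applied to ranks, which makes it robust to monotone nonlinearity:

```python
# Pearson vs. Spearman correlation between a sampled input and a figure
# of merit; values below are invented for illustration only.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def ranks(x):
    """1-based ranks (ties ignored for brevity)."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

inlet_T = [550, 560, 570, 580, 590]  # sampled coolant inlet temperature (K)
mdnbr = [2.9, 2.5, 2.2, 2.0, 1.9]    # hypothetical MDNBR response
print(round(pearson(inlet_T, mdnbr), 3), round(spearman(inlet_T, mdnbr), 3))
```

    A strongly negative coefficient of this kind is what would mark inlet temperature as the dominant driver of MDNBR; partial correlation would additionally control for the other thirteen inputs.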

  2. Digital Controller For Emergency Beacon

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1990-01-01

    Prototype digital controller intended for use in 406-MHz emergency beacon. Undergoing development according to international specifications, 406-MHz emergency beacon system includes satellites providing worldwide monitoring of beacons, with Doppler tracking to locate each beacon within 5 km. Controller turns beacon on and off and generates binary codes identifying source (e.g., ship, aircraft, person, or vehicle on land). Codes transmitted by phase modulation. Knowing code, monitor attempts to communicate with user and uses code information to dispatch rescue team appropriate to type and location of carrier.

  3. Coding for Efficient Image Transmission

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    NASA publication second in series on data-coding techniques for noiseless channels. Techniques used even in noisy channels, provided data further processed with Reed-Solomon or other error-correcting code. Techniques discussed in context of transmission of monochrome imagery from Voyager II spacecraft but applicable to other streams of data. Objective of this type of coding is to "compress" data; that is, to transmit using as few bits as possible by omitting as much as possible of the information repeated in subsequent samples (or picture elements).
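    The idea in the last sentence, omitting information repeated between neighboring samples, can be illustrated in simplified form (this is a generic sketch, not the Rice algorithm itself) by delta-encoding a scanline so that a variable-length code can then spend few bits on the many small differences:

```python
# Toy decorrelation step: replace each sample by its difference from the
# previous one; slowly varying imagery yields mostly small values.
def delta_encode(samples):
    prev, out = 0, []
    for s in samples:
        out.append(s - prev)
        prev = s
    return out

def delta_decode(deltas):
    """Invert delta_encode by running summation."""
    total, out = 0, []
    for d in deltas:
        total += d
        out.append(total)
    return out

scanline = [100, 101, 101, 102, 104, 104]  # hypothetical pixel values
deltas = delta_encode(scanline)
print(deltas)
assert delta_decode(deltas) == scanline    # lossless round trip
```

    In a full scheme, an entropy coder would then assign short codewords to the frequent small differences, achieving the compression the abstract describes.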

  4. Neural Decoder for Topological Codes

    NASA Astrophysics Data System (ADS)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.

  5. Finite-difference simulation of transonic separated flow using a full potential boundary layer interaction approach

    NASA Technical Reports Server (NTRS)

    Van Dalsem, W. R.; Steger, J. L.

    1983-01-01

    A new, fast, direct-inverse, finite-difference boundary-layer code has been developed and coupled with a full-potential transonic airfoil analysis code via new inviscid-viscous interaction algorithms. The resulting code has been used to calculate transonic separated flows. The results are in good agreement with Navier-Stokes calculations and experimental data. Solutions are obtained in considerably less computer time than Navier-Stokes solutions of equal resolution. Because efficient inviscid and viscous algorithms are used, it is expected this code will also compare favorably with other codes of its type as they become available.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Malley, Kathleen; Lopez, Hugo; Cairns, Julie

    An overview of the main North American codes and standards associated with hydrogen safety sensors is provided. The distinction between a code and a standard is defined, and the relationship between standards and codes is clarified, especially for those circumstances where a standard or a certification requirement is explicitly referenced within a code. The report identifies three main types of standards commonly applied to hydrogen sensors (interface and controls standards, shock and hazard standards, and performance-based standards). The certification process and a list and description of the main standards and model codes associated with the use of hydrogen safety sensors in hydrogen infrastructure are presented.

  7. Relationship between the TCAP and the Pearson Benchmark Assessment in Elementary Students' Reading and Math Performance in a Northeastern Tennessee School District

    ERIC Educational Resources Information Center

    Dugger-Roberts, Cherith A.

    2014-01-01

    The purpose of this quantitative study was to determine if there was a relationship between the TCAP test and Pearson Benchmark assessment in elementary students' reading and language arts and math performance in a northeastern Tennessee school district. This study involved 3rd, 4th, 5th, and 6th grade students. The study focused on the following…

  8. Test Review: Wagner, R. K., Torgesen, J. K., Rashotte, C. A., & Pearson, N. A., "Comprehensive Test of Phonological Processing-2nd Ed. (CTOPP-2)." Austin, Texas: Pro-Ed

    ERIC Educational Resources Information Center

    Dickens, Rachel H.; Meisinger, Elizabeth B.; Tarar, Jessica M.

    2015-01-01

    The Comprehensive Test of Phonological Processing-Second Edition (CTOPP-2; Wagner, Torgesen, Rashotte, & Pearson, 2013) is a norm-referenced test that measures phonological processing skills related to reading for individuals aged 4 to 24. According to its authors, the CTOPP-2 may be used to identify individuals who are markedly below their…

  9. Is the Pearson r[squared] Biased, and if So, What Is the Best Correction Formula?

    ERIC Educational Resources Information Center

    Wang, Zhongmiao; Thompson, Bruce

    2007-01-01

    In this study the authors investigated the use of 5 (i.e., Claudy, Ezekiel, Olkin-Pratt, Pratt, and Smith) R[squared] correction formulas with the Pearson r[squared]. The authors estimated adjustment bias and precision under 6 x 3 x 6 conditions (i.e., population [rho] values of 0.0, 0.1, 0.3, 0.5, 0.7, and 0.9; population shapes normal, skewness…
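    For context, one of the corrections under comparison, the Ezekiel formula, is simple to state and compute: adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1), where n is the sample size and k the number of predictors (k = 1 for a Pearson r²). The sketch below uses an invented r² and sample size:

```python
# Ezekiel correction for the positive bias of the sample R-squared.
def ezekiel_adjusted_r2(r2, n, k=1):
    """Shrink a sample R^2 toward the population value; n cases, k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# With r^2 = 0.25 from n = 30 cases, the corrected estimate is smaller:
print(round(ezekiel_adjusted_r2(0.25, 30), 4))
```

    The correction shrinks small-sample r² values toward zero (and can even go negative when the sample r² is near zero), which is exactly the bias-precision trade-off the study evaluates across formulas.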

  10. Uganda: Perfection of Post-Conflict Stability or Ticking Time Bomb

    DTIC Science & Technology

    2016-01-01

    UGANDA: Perfection of Post-Conflict Stability or Ticking Time Bomb? By Kristin M. Pearson and Alex S. Pedersen, United States Air Force Academy, 2015. INSS Research Paper, 2016. From the text: "The area is a ticking time bomb without ongoing efforts. There's an entire group of young men trained in military tactics that have said

  11. Six-year changes in mortality and crown condition of old-growth ponderosa pines in ecological restoration treatments at the G. A. Pearson Natural Area

    Treesearch

    Thomas E. Kolb; Peter Z. Fule; Michael R. Wagner; W. Wallace Covington

    2001-01-01

    Ecological restoration treatments using thinning and prescribed burning have been proposed to reverse the decline of old-growth ponderosa pines in the Southwest. However, long-term data on the effectiveness of such treatments are lacking. In 1993-1994, two ecological restoration treatments and a control were established at the G. A. Pearson Natural Area (GPNA) near...

  12. Optimization of the two-sample rank Neyman-Pearson detector

    NASA Astrophysics Data System (ADS)

    Akimov, P. S.; Barashkov, V. M.

    1984-10-01

    The development of optimal algorithms based on rank statistics in the case of finite sample sizes involves considerable mathematical difficulties. The present investigation provides results related to the design and analysis of an optimal rank detector based on the Neyman-Pearson criterion. The detection of a signal in the presence of background noise is considered, taking into account n observations (readings) x1, x2, ... xn in the experimental communications channel. The rank of an observation is computed on the basis of relations between x and the variable y, representing interference. Attention is given to conditions in the absence of a signal, the probability of detecting an arriving signal, details regarding the use of the Neyman-Pearson criterion, the scheme of an optimal rank, multichannel, incoherent detector, and an analysis of the detector.
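
    The detector idea can be sketched as follows; this is a generic two-sample rank scheme with a Monte Carlo Neyman-Pearson threshold, not the authors' exact algorithm, and the sample sizes and Gaussian noise model are invented.

```python
# Schematic two-sample rank detector: the statistic is the sum of ranks of
# the n channel readings x within the pooled sample {x, y}, and the
# Neyman-Pearson threshold is set by Monte Carlo under the noise-only
# hypothesis for a fixed false-alarm probability.
import random

def rank_sum(x, y):
    """Sum of (1-based) ranks of the x readings in the pooled sample."""
    pooled = sorted(x + y)
    return sum(pooled.index(v) + 1 for v in x)

def np_threshold(n, m, pfa=0.05, trials=20000, seed=1):
    """Threshold t with P(rank_sum > t | noise only) approximately <= pfa."""
    rng = random.Random(seed)
    stats = sorted(
        rank_sum([rng.gauss(0, 1) for _ in range(n)],
                 [rng.gauss(0, 1) for _ in range(m)])
        for _ in range(trials))
    return stats[int((1 - pfa) * trials) - 1]

def detect(x, y, threshold):
    """Declare a signal present when the rank statistic exceeds threshold."""
    return rank_sum(x, y) > threshold

t = np_threshold(n=16, m=16, pfa=0.05)
```

    The appeal of the rank statistic, as in the abstract, is that the threshold depends only on the sample sizes and the false-alarm probability, not on the noise distribution.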

  13. Parametric distribution approach for flow availability in small hydro potential analysis

    NASA Astrophysics Data System (ADS)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    Small hydro systems are an important source of renewable energy and have been recognized worldwide as a clean energy source. The viability of small hydropower generation, which uses the potential energy of flowing water to produce electricity, is often questionable due to the inconsistent and intermittent nature of the power generated. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in small hydro systems. By considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the defining characteristic of the Pearson system: a direct relation between the first four statistical moments and the distribution. The advantage of applying these statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.
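
    The moment-based selection step can be illustrated with the classical Pearson criterion; whether the paper's procedure matches this textbook rule exactly is an assumption, and boundary types (such as Type III) are glossed over in this sketch.

```python
# Classical Pearson-system selection: the third and fourth standardized
# moments give beta1 (squared skewness) and beta2 (kurtosis, not excess),
# and the criterion kappa picks the main distribution type.

def pearson_kappa(beta1, beta2):
    """Pearson's criterion kappa = b1(b2+3)^2 / (4(4b2-3b1)(2b2-3b1-6))."""
    if beta1 == 0:
        return 0.0                      # symmetric case: kappa is zero
    denom = 4.0 * (4.0 * beta2 - 3.0 * beta1) * (2.0 * beta2 - 3.0 * beta1 - 6.0)
    return float("inf") if denom == 0 else beta1 * (beta2 + 3.0) ** 2 / denom

def main_type(beta1, beta2, tol=1e-9):
    k = pearson_kappa(beta1, beta2)
    if k < -tol:
        return "Type I"
    if abs(k) <= tol:
        return "normal / Type II / Type VII boundary"
    if k < 1.0 - tol:
        return "Type IV"
    if abs(k - 1.0) <= tol:
        return "Type V"
    return "Type VI"

# A Gaussian stream-flow model (beta1 = 0, beta2 = 3) sits on the kappa = 0 line.
print(main_type(0.0, 3.0))   # → normal / Type II / Type VII boundary
```

    In a flow-availability study the moments would come from the observed stream-flow record; here they are passed in directly for clarity.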

  14. Entanglement-enhanced Neyman-Pearson target detection using quantum illumination

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.

    2017-08-01

    Quantum illumination (QI) provides entanglement-based target detection, in an entanglement-breaking environment, whose performance is significantly better than that of optimum classical-illumination target detection. QI's performance advantage was established in a Bayesian setting with the target presumed equally likely to be absent or present and error probability employed as the performance metric. Radar theory, however, eschews that Bayesian approach, preferring the Neyman-Pearson performance criterion to avoid the difficulties of accurately assigning prior probabilities to target absence and presence and appropriate costs to false-alarm and miss errors. We have recently reported an architecture, based on sum-frequency generation (SFG) and feedforward (FF) processing, for minimum error-probability QI target detection with arbitrary prior probabilities for target absence and presence. In this paper, we use our results for FF-SFG reception to determine the receiver operating characteristic (detection probability versus false-alarm probability) for optimum QI target detection under the Neyman-Pearson criterion.

  15. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building a cadastral database. After analyzing the course of cadastral change, especially parcel change, with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationships corresponding to the cadastral change is put forward, and a coding format composed of street code, block code, father parcel code, child parcel code and grandchild parcel code is worked out within the county administrative area. The coding rules have been applied in the development of an urban cadastral information system called "ReGIS", which is not only able to generate the cadastral code automatically according to both the type of parcel change and the coding rules, but is also capable of checking whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and has received a favorable response. This verifies the feasibility and effectiveness of the coding rules to some extent.
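
    The hierarchical code format described above can be sketched as follows; the field widths and the split_parcel helper are hypothetical, since the record does not fix them here.

```python
# Hypothetical sketch of the five-level code (street, block, father parcel,
# child parcel, grandchild parcel); field widths are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class ParcelCode:
    street: int
    block: int
    father: int
    child: int = 0        # 0 marks "not yet subdivided"
    grandchild: int = 0

    def encode(self) -> str:
        # fixed-width decimal fields, concatenated in hierarchy order
        return (f"{self.street:03d}{self.block:03d}"
                f"{self.father:04d}{self.child:03d}{self.grandchild:03d}")

def split_parcel(parent: ParcelCode, n: int) -> list:
    """On a parcel split (a common cadastral change), derive child codes
    that keep the kinship relation to the father parcel."""
    return [ParcelCode(parent.street, parent.block, parent.father, child=i)
            for i in range(1, n + 1)]

p = ParcelCode(street=12, block=7, father=42)
children = split_parcel(p, 2)
print([c.encode() for c in children])   # children share the father prefix
```

    Because child codes retain the full prefix of the father parcel, the kinship relation (and hence the parcel's change history) is recoverable from the code string alone, which is what makes the spatiotemporal-uniqueness check cheap.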

  16. Predictive modeling of structured electronic health records for adverse drug event detection.

    PubMed

    Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik

    2015-01-01

    The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. 
Feature selection leads to increased predictive performance for both data types, in isolation and combined. We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two.
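
    The combination of the two data types can be pictured schematically; this is not the paper's pipeline, and the vocabulary and record layout below are invented.

```python
# Schematic only: clinical codes become bag-of-codes counts, measurements
# become numeric features, and the two representations are concatenated
# into one feature vector per patient record.

def featurize(record, code_vocab, measurement_names):
    """record = {"codes": [...], "measurements": {name: value}}"""
    code_part = [record["codes"].count(c) for c in code_vocab]
    meas_part = [float(record["measurements"].get(m, 0.0))
                 for m in measurement_names]
    return code_part + meas_part

vocab = ["ICD:E11", "ATC:C07", "ICD:I10"]   # invented code vocabulary
meas = ["creatinine", "potassium"]          # invented measurement names
rec = {"codes": ["ICD:E11", "ATC:C07", "ATC:C07"],
       "measurements": {"creatinine": 1.2}}
print(featurize(rec, vocab, meas))          # → [1, 2, 0, 1.2, 0.0]
```

    Feature selection would then prune this concatenated vector, which is where the dimensionality and sparsity issues discussed in the abstract arise.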

  17. Predictive modeling of structured electronic health records for adverse drug event detection

    PubMed Central

    2015-01-01

    Background The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Methods Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Results Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. 
Feature selection leads to increased predictive performance for both data types, in isolation and combined. Conclusions We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two. PMID:26606038

  18. Transformation of Graphical ECA Policies into Executable PonderTalk Code

    NASA Astrophysics Data System (ADS)

    Romeikat, Raphael; Sinsel, Markus; Bauer, Bernhard

    Rules are becoming more and more important in business modeling and systems engineering and are recognized as a high-level programming paradigm. For the effective development of rules it is desirable to start at a high level, e.g. with graphical rules, and to refine them later into code of a particular rule language for implementation purposes. A model-driven approach is presented in this paper to transform graphical rules into executable code in a fully automated way. The focus is on event-condition-action policies as a special rule type. These are modeled graphically and translated into the PonderTalk language. The approach may be extended to integrate other rule types and languages as well.

  19. Ensemble coding of face identity is not independent of the coding of individual identity.

    PubMed

    Neumann, Markus F; Ng, Ryan; Rhodes, Gillian; Palermo, Romina

    2018-06-01

    Information about a group of similar objects can be summarized into a compressed code, known as ensemble coding. Ensemble coding of simple stimuli (e.g., groups of circles) can occur in the absence of detailed exemplar coding, suggesting dissociable processes. Here, we investigate whether a dissociation would still be apparent when coding facial identity, where individual exemplar information is much more important. We examined whether ensemble coding can occur when exemplar coding is difficult, as a result of large sets or short viewing times, or whether the two types of coding are positively associated. We found a positive association, whereby both ensemble and exemplar coding were reduced for larger groups and shorter viewing times. There was no evidence for ensemble coding in the absence of exemplar coding. At longer presentation times, there was an unexpected dissociation, where exemplar coding increased yet ensemble coding decreased, suggesting that robust information about face identity might suppress ensemble coding. Thus, for face identity, we did not find the classic dissociation (access to ensemble information in the absence of detailed exemplar information) that has been used to support claims of distinct mechanisms for ensemble and exemplar coding.

  20. Three-dimensional simulation of triode-type MIG for 1 MW, 120 GHz gyrotron for ECRH applications

    NASA Astrophysics Data System (ADS)

    Singh, Udaybir; Kumar, Nitin; Kumar, Narendra; Kumar, Anil; Sinha, A. K.

    2012-01-01

    In this paper, the three-dimensional simulation of triode-type magnetron injection gun (MIG) for 120 GHz, 1 MW gyrotron is presented. The operating voltages of the modulating anode and the accelerating anode are 57 kV and 80 kV respectively. The high order TE 22,6 mode is selected as the operating mode and the electron beam is launched at the first radial maxima for the fundamental beam-mode operation. The initial design is obtained by using the in-house developed code MIGSYN. The numerical simulation is performed by using the commercially available code CST-Particle Studio (PS). The simulated results of MIG obtained by using CST-PS are validated with other simulation codes EGUN and TRAK, respectively. The results on the design output parameters obtained by using these three codes are found to be in close agreement.

  1. The Development of Bimodal Bilingualism: Implications for Linguistic Theory.

    PubMed

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2016-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.

  2. The Development of Bimodal Bilingualism: Implications for Linguistic Theory

    PubMed Central

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2017-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and ‘transfer’ as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair. PMID:28603576

  3. Wind Farm Turbine Type and Placement Optimization

    NASA Astrophysics Data System (ADS)

    Graf, Peter; Dykes, Katherine; Scott, George; Fields, Jason; Lunacek, Monte; Quick, Julian; Rethore, Pierre-Elouan

    2016-09-01

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. This document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.
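
    Why the added type choice turns the layout problem mixed-integer can be shown with a toy model; this is not TTP_OPT, and the ratings, site positions, and wake penalty below are invented for the sketch.

```python
# Toy brute-force illustration: each site carries a discrete turbine-type
# choice on top of its (here fixed) position, so the search space gains an
# integer dimension that is enumerated exhaustively for this tiny case.
from itertools import product

TYPES = {"T1": 2.0, "T2": 3.0}     # rated power in MW (assumed values)
SITES = [0.0, 0.4, 1.1]            # fixed 1-D site positions in km (assumed)

def farm_energy(assignment):
    """Score a type assignment: rated power minus a crude wake penalty
    that grows with neighbor size and proximity."""
    total = 0.0
    for i, ti in enumerate(assignment):
        penalty = sum(0.1 * TYPES[tj] / (1.0 + abs(SITES[i] - SITES[j]))
                      for j, tj in enumerate(assignment) if j != i)
        total += TYPES[ti] * max(0.0, 1.0 - penalty)
    return total

# Exhaustive search over the integer (type) dimension.
best = max(product(TYPES, repeat=len(SITES)), key=farm_energy)
```

    A real solver must couple this discrete search with the continuous, nonconvex placement variables, which is what makes the combined problem so hard.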

  4. Wind farm turbine type and placement optimization

    DOE PAGES

    Graf, Peter; Dykes, Katherine; Scott, George; ...

    2016-10-03

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. This document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  5. Serum bilirubin concentration is associated with eGFR and urinary albumin excretion in patients with type 1 diabetes mellitus.

    PubMed

    Nishimura, Takeshi; Tanaka, Masami; Sekioka, Risa; Itoh, Hiroshi

    2015-01-01

    Although relationships of serum bilirubin concentration with estimated glomerular filtration rate (eGFR) and urinary albumin excretion (UAE) in patients with type 2 diabetes have been reported, whether such relationships exist in patients with type 1 diabetes is unknown. A total of 123 patients with type 1 diabetes were investigated in this cross-sectional study. The relationship between bilirubin (total and indirect) concentrations and log(UAE) as well as eGFR was examined by Pearson's correlation analyses. Multivariate regression analyses were used to assess the association of bilirubin (total and indirect) with eGFR as well as log(UAE). A positive correlation was found between serum bilirubin concentration and eGFR; total bilirubin (r=0.223, p=0.013), indirect bilirubin (r=0.244, p=0.007). A negative correlation was found between serum bilirubin concentration and log(UAE); total bilirubin (r=-0.258, p=0.005), indirect bilirubin (r=-0.271, p=0.003). Multivariate regression analyses showed that indirect bilirubin concentration was an independent determinant of eGFR and log(UAE). Bilirubin concentration is associated with both eGFR and log(UAE) in patients with type 1 diabetes. Bilirubin might have a protective role in the progression of type 1 diabetic nephropathy. Copyright © 2015 Elsevier Inc. All rights reserved.
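
    The analyses referred to above are plain Pearson product-moment correlations; a minimal pure-Python version, with invented illustrative values (not study data), looks like this.

```python
# Pearson product-moment correlation from first principles; the bilirubin
# and eGFR values below are invented for illustration only.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

bilirubin = [0.4, 0.6, 0.7, 0.9, 1.1]   # mg/dL, invented
egfr = [68, 75, 74, 88, 95]             # mL/min/1.73m^2, invented
r = pearson_r(bilirubin, egfr)          # positive, as in the study
```

    The reported r values (e.g., r=0.223 for total bilirubin versus eGFR) are exactly this statistic computed on the 123 patients.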

  6. Effective Universal Coverage of Diabetes Mellitus Type 2 in Chile

    PubMed Central

    Guerrero-Núñez, Sara; Valenzuela-Suazo, Sandra; Cid-Henríquez, Patricia

    2017-01-01

    ABSTRACT Objective: determine the prevalence of Effective Universal Coverage of Diabetes Mellitus Type 2 in Chile and its relation with the variables: Health Care Coverage of Diabetes Mellitus Type 2; Average of diabetics with metabolic control in 2011-2013; Mortality Rate for Diabetes Mellitus; and Percentage of nurses participating in the Cardiovascular Health Program. Method: cross-sectional descriptive study with ecological components that uses documentary sources of the Ministry of Health. The correlation between Effective Universal Coverage of Diabetes Mellitus Type 2 and the independent variables was assessed with the Pearson coefficient, significant at the 0.05 level. Results: in Chile, Effective Universal Coverage of Diabetes Mellitus Type 2 (HbA1c<7% of the estimated population) is less than 20%; this is related to the Mortality Rate for Diabetes Mellitus and the Percentage of nurses participating in the Cardiovascular Health Program, significant at the 0.01 level. Conclusion: the prevalence of Effective Universal Coverage of Diabetes Mellitus Type 2 is low, even though some regions stand out in this research and in the metabolic control of patients who participate in the health control program; its relation with the percentage of nurses participating in the Cardiovascular Health Program represents a challenge and an opportunity for the health system. PMID:28403339

  7. LncRNA-DANCR: A valuable cancer related long non-coding RNA for human cancers.

    PubMed

    Thin, Khaing Zar; Liu, Xuefang; Feng, Xiaobo; Raveendran, Sudheesh; Tu, Jian Cheng

    2018-06-01

    Long noncoding RNAs (lncRNAs) are a type of noncoding RNA comprising sequences longer than 200 nucleotides. They can regulate chromosome structure and gene expression and play an essential role in the pathophysiology of human diseases, especially in tumorigenesis and progression. Nowadays, they are being targeted as potential biomarkers for various cancer types, and many research studies have shown that lncRNAs might bring a new era to cancer diagnosis and support treatment management. The purpose of this review was to inspect the molecular mechanism and clinical significance of the long non-coding RNA differentiation antagonizing nonprotein coding RNA (DANCR) in various types of human cancers. In this review, we summarize recent research studies concerning the expression and biological mechanisms of lncRNA-DANCR in tumour development. The related studies were obtained through a systematic search of PubMed, Embase and the Cochrane Library. LncRNA-DANCR is a valuable cancer-related lncRNA whose dysregulated expression has been found in a variety of malignancies, including hepatocellular carcinoma, breast cancer, glioma, colorectal cancer, gastric cancer, and lung cancer. The aberrant expression of DANCR has been shown to contribute to proliferation, migration and invasion of cancer cells. LncRNA-DANCR likely serves as a useful disease biomarker or therapeutic cancer target. Copyright © 2018 Elsevier GmbH. All rights reserved.

  8. Multi-optimization Criteria-based Robot Behavioral Adaptability and Motion Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, Francois G.

    2002-06-01

    Robotic tasks are typically defined in Task Space (e.g., the 3-D world), whereas robots are controlled in Joint Space (motors). The transformation from Task Space to Joint Space must consider the task objectives (e.g., high precision, strength optimization, torque optimization), the task constraints (e.g., obstacles, joint limits, non-holonomic constraints, contact or tool task constraints), and the robot kinematics configuration (e.g., tools, type of joints, mobile platform, manipulator, modular additions, locked joints). Commercially available robots are optimized for a specific set of tasks, objectives and constraints and, therefore, their control codes are extremely specific to a particular set of conditions. Thus, there exists a multiplicity of codes, each handling a particular set of conditions, but none suitable for use on robots with widely varying tasks, objectives, constraints, or environments. On the other hand, most DOE missions and tasks are typically ''batches of one''. Attempting to use commercial codes for such work requires significant personnel and schedule costs for re-programming or adding code to the robots whenever a change in task objective, robot configuration, number and type of constraints, etc. occurs. The objective of our project is to develop a ''generic code'' to implement this Task-Space to Joint-Space transformation that would allow robot behavior adaptation, in real time (at loop rate), to changes in task objectives, number and type of constraints, modes of control, and kinematics configuration (e.g., new tools, added modules). Our specific goal is to develop a single code for the general solution of under-specified systems of algebraic equations that is suitable for solving the inverse kinematics of robots, is usable for all types of robots (mobile robots, manipulators, mobile manipulators, etc.) with no limitation on the number of joints and the number of controlled Task-Space variables, can adapt in real time to changes in the number and type of constraints and in task objectives, and can adapt to changes in kinematics configurations (change of module, change of tool, joint failure adaptation, etc.).
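
    The core computation, solving an under-specified inverse-kinematics system, can be sketched for a planar redundant arm; the link lengths and target are invented, and this is a textbook pseudoinverse step, not the code described above.

```python
# Under-specified inverse kinematics: a planar 3-joint arm has three joint
# variables but only two task-space variables, so a (minimum-norm) joint
# update is taken from the Jacobian right pseudoinverse
# dq = J^T (J J^T)^(-1) dx.
import math

L = [1.0, 0.8, 0.5]   # link lengths (assumed)

def fk(q):
    """Forward kinematics: end-effector (x, y) of the planar arm."""
    angles = [sum(q[:i + 1]) for i in range(len(q))]
    return (sum(l * math.cos(a) for l, a in zip(L, angles)),
            sum(l * math.sin(a) for l, a in zip(L, angles)))

def jacobian(q):
    """Analytic 2x3 Jacobian of fk."""
    angles = [sum(q[:i + 1]) for i in range(3)]
    J = [[0.0] * 3 for _ in range(2)]
    for k in range(3):
        J[0][k] = -sum(L[i] * math.sin(angles[i]) for i in range(k, 3))
        J[1][k] = sum(L[i] * math.cos(angles[i]) for i in range(k, 3))
    return J

def ik_step(q, target):
    """One Newton step using the right pseudoinverse (2x2 inverse of J J^T)."""
    x, y = fk(q)
    dx = (target[0] - x, target[1] - y)
    J = jacobian(q)
    a = sum(v * v for v in J[0])
    b = sum(u * v for u, v in zip(J[0], J[1]))
    d = sum(v * v for v in J[1])
    det = a * d - b * b
    lam = ((d * dx[0] - b * dx[1]) / det, (a * dx[1] - b * dx[0]) / det)
    return [q[k] + J[0][k] * lam[0] + J[1][k] * lam[1] for k in range(3)]

q = [0.3, 0.2, 0.1]
for _ in range(50):
    q = ik_step(q, (1.8, 0.9))   # iterate toward a reachable target
```

    A generic code in the project's sense would additionally reshape this update at loop rate as constraints and objectives change; the pseudoinverse above is only the unconstrained core.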

  9. Single neuron firing properties impact correlation-based population coding

    PubMed Central

    Hong, Sungho; Ratté, Stéphanie; Prescott, Steven A.; De Schutter, Erik

    2012-01-01

    Correlated spiking has been widely observed but its impact on neural coding remains controversial. Correlation arising from co-modulation of rates across neurons has been shown to vary with the firing rates of individual neurons. This translates into rate and correlation being equivalently tuned to the stimulus; under those conditions, correlated spiking does not provide information beyond that already available from individual neuron firing rates. Such correlations are irrelevant and can reduce coding efficiency by introducing redundancy. Using simulations and experiments in rat hippocampal neurons, we show here that pairs of neurons receiving correlated input also exhibit correlations arising from precise spike-time synchronization. Contrary to rate co-modulation, spike-time synchronization is unaffected by firing rate, thus enabling synchrony- and rate-based coding to operate independently. The type of output correlation depends on whether intrinsic neuron properties promote integration or coincidence detection: “ideal” integrators (with spike generation sensitive to stimulus mean) exhibit rate co-modulation whereas “ideal” coincidence detectors (with spike generation sensitive to stimulus variance) exhibit precise spike-time synchronization. Pyramidal neurons are sensitive to both stimulus mean and variance, and thus exhibit both types of output correlation proportioned according to which operating mode is dominant. Our results explain how different types of correlations arise based on how individual neurons generate spikes, and why spike-time synchronization and rate co-modulation can encode different stimulus properties. Our results also highlight the importance of neuronal properties for population-level coding insofar as neural networks can employ different coding schemes depending on the dominant operating mode of their constituent neurons. PMID:22279226

  10. Sfg

    NASA Astrophysics Data System (ADS)

    Fischer, R. X.; Baur, W. H.

    This document is part of Subvolume E `Zeolite-Type Crystal Structures and their Chemistry. Framework Type Codes RON to STI' of Volume 14 `Microporous and other Framework Materials with Zeolite-Type Structures' of Landolt-Börnstein Group IV `Physical Chemistry'.

  11. Vet

    NASA Astrophysics Data System (ADS)

    Fischer, R. X.; Baur, W. H.

    This document is part of Subvolume F 'Zeolite-Type Crystal Structures and their Chemistry. Framework Type Codes STO to ZON' of Volume 14 'Microporous and other Framework Materials with Zeolite-Type Structures' of Landolt-Börnstein Group IV 'Physical Chemistry'.

  12. Data integration from pathology slides for quantitative imaging of multiple cell types within the tumor immune cell infiltrate.

    PubMed

    Ma, Zhaoxuan; Shiao, Stephen L; Yoshida, Emi J; Swartwood, Steven; Huang, Fangjin; Doche, Michael E; Chung, Alice P; Knudsen, Beatrice S; Gertych, Arkadiusz

    2017-09-18

    Immune cell infiltrates (ICI) of tumors are scored by pathologists around tumor glands. To obtain a better understanding of the immune infiltrate, individual immune cell types, their activation states and location relative to tumor cells need to be determined. This process requires precise identification of the tumor area and enumeration of immune cell subtypes separately in the stroma and inside tumor nests. Such measurements can be accomplished by a multiplex format using immunohistochemistry (IHC). We developed a pipeline that combines IHC and digital image analysis. One slide was stained with pan-cytokeratin and CD45 and the other slide with CD8, CD4 and CD68. The tumor mask generated through pan-cytokeratin staining was transferred from one slide to the other using affine image co-registration. Bland-Altman plots and Pearson correlation were used to investigate differences between densities and counts of immune cells underneath the transferred versus manually annotated tumor masks. One-way ANOVA was used to compare the mask transfer error for tissues with solid and glandular tumor architecture. The overlap between manual and transferred tumor masks ranged from 20%-90% across all cases. The error of transferring the mask was 2- to 4-fold greater in tumor regions with glandular compared to solid growth pattern (p < 10^-6). Analyzing data from a single slide, the Pearson correlation coefficients of cell type densities outside and inside tumor regions were highest for CD4+ T-cells (r = 0.8), CD8+ T-cells (r = 0.68) or CD68+ macrophages (r = 0.79). The correlation coefficient for CD45+ T- and B-cells was only 0.45. The transfer of the mask generated an error in the measurement of intra- and extra-tumoral CD68+, CD8+ or CD4+ counts (p < 10^-10). In summary, we developed a general method to integrate data from IHC stained slides into a single dataset.
Because of the transfer error between slides, we recommend applying the antibody for demarcation of the tumor on the same slide as the ICI antibodies.
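
    The Bland-Altman comparison of counts under the two masks reduces to a bias and limits of agreement; a generic sketch with invented counts follows.

```python
# Bland-Altman summary: mean difference (bias) between paired measurements
# and the 1.96*SD limits of agreement. The cell counts are invented.
import statistics

def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

manual = [120, 340, 95, 410, 220]        # invented counts, manual mask
transferred = [115, 355, 90, 400, 240]   # invented counts, transferred mask
bias, limits = bland_altman(manual, transferred)
```

    A bias near zero with narrow limits would justify the transferred mask; the wide disagreement the authors observed is why they recommend staining the tumor marker on the same slide.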

  13. Prediction of Osteopathic Medical School Performance on the basis of MCAT score, GPA, sex, undergraduate major, and undergraduate institution.

    PubMed

    Dixon, Donna

    2012-04-01

    The relationships of students' preadmission academic variables, sex, undergraduate major, and undergraduate institution to academic performance in medical school have not been thoroughly examined. To determine the ability of students' preadmission academic variables to predict osteopathic medical school performance and whether students' sex, undergraduate major, or undergraduate institution influence osteopathic medical school performance. The study followed students who graduated from New York College of Osteopathic Medicine of New York Institute of Technology in Old Westbury between 2003 and 2006. Student preadmission data were Medical College Admission Test (MCAT) scores, undergraduate grade point averages (GPAs), sex, undergraduate major, and undergraduate institutional selectivity. Medical school performance variables were GPAs, clinical performance (ie, clinical subject examinations and clerkship evaluations), and scores on the Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) Level 1 and Level 2-Clinical Evaluation (CE). Data were analyzed with Pearson product moment correlation coefficients and multivariate linear regression analyses. Differences between student groups were compared with the independent-samples, 2-tailed t test. A total of 737 students were included. All preadmission academic variables, except nonscience undergraduate GPA, were statistically significant predictors of performance on COMLEX-USA Level 1, and all preadmission academic variables were statistically significant predictors of performance on COMLEX-USA Level 2-CE. The MCAT score for biological sciences had the highest correlation among all variables with COMLEX-USA Level 1 performance (Pearson r=0.304; P<.001) and Level 2-CE performance (Pearson r=0.272; P<.001). All preadmission variables were moderately correlated with the mean clinical subject examination scores. 
The mean clerkship evaluation score was moderately correlated with mean clinical examination results (Pearson r=0.267; P<.001) and COMLEX-USA Level 2-CE performance (Pearson r=0.301; P<.001). Clinical subject examination scores were highly correlated with COMLEX-USA Level 2-CE scores (Pearson r=0.817; P<.001). No statistically significant difference in medical school performance was found between students with science and nonscience undergraduate majors, nor was undergraduate institutional selectivity a factor influencing performance. Students' preadmission academic variables were predictive of osteopathic medical school performance, including GPAs, clinical performance, and COMLEX-USA Level 1 and Level 2-CE results. Clinical performance was predictive of COMLEX-USA Level 2-CE performance.

  14. More accurate, calibrated bootstrap confidence intervals for correlating two autocorrelated climate time series

    NASA Astrophysics Data System (ADS)

    Olafsdottir, Kristin B.; Mudelsee, Manfred

    2013-04-01

    Estimation of Pearson's correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most frequently used statistical methods in climate science. Various methods are used to estimate confidence intervals to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), whose main intention was to obtain an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique that performs a second bootstrap loop, resampling from the bootstrap resamples. Like the non-calibrated bootstrap confidence intervals, it offers robustness against the data distribution. A pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with the performance of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is clearly better for the calibrated confidence intervals, whose coverage error is acceptably small (i.e., within a few percentage points) even for data sizes as small as 20. 
One form of climate time series is output from numerical models that simulate the climate system. The method is applied to model data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show a significant correlation between the two variables at a 10-year lag, which is roughly the time it takes Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.

  15. Task representation in individual and joint settings

    PubMed Central

    Prinz, Wolfgang

    2015-01-01

    This paper outlines a framework for task representation and discusses applications to interference tasks in individual and joint settings. The framework is derived from the Theory of Event Coding (TEC). This theory regards task sets as transient assemblies of event codes in which stimulus and response codes interact and shape each other in particular ways. On the one hand, stimulus and response codes compete with each other within their respective subsets (horizontal interactions). On the other hand, stimulus and response codes cooperate with each other (vertical interactions). Code interactions instantiating competition and cooperation operate on two time scales: on-line performance (i.e., doing the task) and off-line implementation (i.e., setting the task). Interference arises when stimulus and response codes overlap in features that are irrelevant for stimulus identification, but relevant for response selection. To resolve this dilemma, the feature profiles of event codes may become restructured in various ways. The framework is applied to three kinds of interference paradigms. Special emphasis is given to joint settings where tasks are shared between two participants. Major conclusions derived from these applications include: (1) Response competition is the chief driver of interference, and different modes of response competition give rise to different patterns of interference; (2) The type of features in which stimulus and response codes overlap is also a crucial factor, and different types of such features likewise give rise to different patterns of interference; and (3) Task sets for joint settings conflate intraindividual conflicts between responses (what) with interindividual conflicts between responding agents (whom). Features of response codes may, therefore, not only address responses, but also responding agents (both physically and socially). PMID:26029085

  16. [INVITED] Luminescent QR codes for smart labelling and sensing

    NASA Astrophysics Data System (ADS)

    Ramalho, João F. C. B.; António, L. C. F.; Correia, S. F. H.; Fu, L. S.; Pinho, A. S.; Brites, C. D. S.; Carlos, L. D.; André, P. S.; Ferreira, R. A. S.

    2018-05-01

    QR (Quick Response) codes are two-dimensional barcodes composed of special geometric patterns of black modules on a white square background. They can encode different types of information with high density and robustness, and can correct errors and withstand physical damage, thus keeping the stored information protected. Recently, these codes have gained increased attention as they offer a simple physical tool for quick access to Web sites for advertising and social interaction. Remaining challenges include increasing the storage capacity limit, even though QR codes can already store approximately 350 times more information than common barcodes and can encode different types of characters (e.g., numeric, alphanumeric, kanji and kana). In this work, we fabricate luminescent QR codes based on a poly(methyl methacrylate) substrate coated with organic-inorganic hybrid materials doped with trivalent terbium (Tb3+) and europium (Eu3+) ions, demonstrating an increase in storage capacity per unit area by a factor of two through colour multiplexing, compared to conventional QR codes. A novel methodology to decode the multiplexed QR codes is developed, based on a colour separation threshold in which a decision level is calculated through a maximum-likelihood criterion to minimize the error probability of the demultiplexed modules, maximizing the foreseen total storage capacity. Moreover, the thermal dependence of the emission colour coordinates of the Eu3+/Tb3+-based hybrids enables simultaneous QR code colour multiplexing and temperature sensing (reproducibility higher than 93%), opening new fields of application for QR codes as smart labels for sensing.
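
    The colour-separation decision described above can be illustrated as nearest-centre classification of each module, which is the maximum-likelihood rule under equal-variance Gaussian noise. The class centres, noise level, and module count below are invented for illustration, not values from the paper:

```python
import random

# (bit in layer 1, bit in layer 2) -> measured colour-coordinate centre;
# centres and noise level are invented for illustration.
centres = {
    (0, 0): (0.1, 0.1),
    (0, 1): (0.1, 0.9),
    (1, 0): (0.9, 0.1),
    (1, 1): (0.9, 0.9),
}

def demux(sample):
    """Maximum-likelihood decision under equal-variance Gaussian noise:
    pick the bit pair whose class centre is nearest the measured colour."""
    return min(centres, key=lambda bits: sum((s - c) ** 2
               for s, c in zip(sample, centres[bits])))

rng = random.Random(42)
true_bits = [(rng.randint(0, 1), rng.randint(0, 1)) for _ in range(200)]
measured = [tuple(c + rng.gauss(0, 0.1) for c in centres[b])
            for b in true_bits]
decoded = [demux(m) for m in measured]
error_rate = sum(d != b for d, b in zip(decoded, true_bits)) / len(true_bits)
print("module error rate:", error_rate)
```

Each demultiplexed module recovers two bits instead of one, which is the factor-of-two capacity gain the abstract describes.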

  17. Recruitment of motor units in the medial gastrocnemius muscle during human quiet standing: is recruitment intermittent? What triggers recruitment?

    PubMed Central

    Loram, Ian D.; Muceli, Silvia; Merletti, Roberto; Farina, Dario

    2012-01-01

    The recruitment and the rate of discharge of motor units are determinants of muscle force. Within a motoneuron pool, recruitment and rate coding of individual motor units might be controlled independently, depending on the circumstances. In this study, we tested whether, during human quiet standing, the force of the medial gastrocnemius (MG) muscle is predominantly controlled by recruitment or rate coding. If MG control during standing was mainly due to recruitment, then we further asked what the trigger mechanism is. Is it determined internally, or is it related to body kinematics? While seven healthy subjects stood quietly, intramuscular electromyograms were recorded from the MG muscle with three pairs of wire electrodes. The number of active motor units and their mean discharge rate were compared for different sway velocities and positions. Motor unit discharges occurred more frequently when the body swayed faster and forward (Pearson R = 0.63; P < 0.0001). This higher likelihood of observing motor unit potentials was explained chiefly by the recruitment of additional units. During forward body shifts, the median number of units detected increased from 3 to 11 (P < 0.0001), whereas the discharge rate changed from 8 ± 1.1 (mean ± SD) to 10 ± 0.9 pulses/s (P = 0.001). Strikingly, motor units did not discharge continuously throughout standing. They were recruited within individual, forward sways and intermittently, with a modal rate of two recruitments per second. This modal rate is consistent with previous circumstantial evidence relating the control of standing to an intrinsic, higher level planning process. PMID:21994258

  18. Associations between county and municipality zoning ordinances and access to fruit and vegetable outlets in rural North Carolina, 2012.

    PubMed

    Mayo, Mariel Leah; Pitts, Stephanie B Jilcott; Chriqui, Jamie F

    2013-12-05

    Zoning ordinances and land-use plans may influence the community food environment by determining placement and access to food outlets, which subsequently support or hinder residents' attempts to eat healthfully. The objective of this study was to examine associations between healthful food zoning scores as derived from information on local zoning ordinances, county demographics, and residents' access to fruit and vegetable outlets in rural northeastern North Carolina. From November 2012 through March 2013, county and municipality zoning ordinances were identified and double-coded by using the Bridging the Gap food code/policy audit form. A healthful food zoning score was derived by assigning points for the allowed use of fruit and vegetable outlets. Pearson coefficients were calculated to examine correlations between the healthful food zoning score, county demographics, and the number of fruit and vegetable outlets. In March and April 2013, qualitative interviews were conducted among county and municipal staff members knowledgeable about local zoning and planning to ascertain implementation and enforcement of zoning to support fruit and vegetable outlets. We found a strong positive correlation between healthful food zoning scores and the number of fruit and vegetable outlets in 13 northeastern North Carolina counties (r = 0.66, P = .01). Major themes in implementation and enforcement of zoning to support fruit and vegetable outlets included strict enforcement versus lack of enforcement of zoning regulations. Increasing the range of permitted uses in zoning districts to include fruit and vegetable outlets may increase access to healthful fruit and vegetable outlets in rural communities.

  19. Associations Between County and Municipality Zoning Ordinances and Access to Fruit And Vegetable Outlets in Rural North Carolina, 2012

    PubMed Central

    Mayo, Mariel Leah; Chriqui, Jamie F.

    2013-01-01

    Introduction Zoning ordinances and land-use plans may influence the community food environment by determining placement and access to food outlets, which subsequently support or hinder residents’ attempts to eat healthfully. The objective of this study was to examine associations between healthful food zoning scores as derived from information on local zoning ordinances, county demographics, and residents’ access to fruit and vegetable outlets in rural northeastern North Carolina. Methods From November 2012 through March 2013, county and municipality zoning ordinances were identified and double-coded by using the Bridging the Gap food code/policy audit form. A healthful food zoning score was derived by assigning points for the allowed use of fruit and vegetable outlets. Pearson coefficients were calculated to examine correlations between the healthful food zoning score, county demographics, and the number of fruit and vegetable outlets. In March and April 2013, qualitative interviews were conducted among county and municipal staff members knowledgeable about local zoning and planning to ascertain implementation and enforcement of zoning to support fruit and vegetable outlets. Results We found a strong positive correlation between healthful food zoning scores and the number of fruit and vegetable outlets in 13 northeastern North Carolina counties (r = 0.66, P = .01). Major themes in implementation and enforcement of zoning to support fruit and vegetable outlets included strict enforcement versus lack of enforcement of zoning regulations. Conclusion Increasing the range of permitted uses in zoning districts to include fruit and vegetable outlets may increase access to healthful fruit and vegetable outlets in rural communities. PMID:24309091

  20. Pediatric Pulmonary Hemorrhage vs. Extrapulmonary Bleeding in the Differential Diagnosis of Hemoptysis.

    PubMed

    Vaiman, Michael; Klin, Baruch; Rosenfeld, Noa; Abu-Kishk, Ibrahim

    2017-01-01

    Hemoptysis is an important symptom that causes major concern and warrants immediate diagnostic attention. The authors compared a group of pediatric patients with pulmonary hemorrhage to pediatric patients diagnosed with extrapulmonary bleeding, focusing on differences in etiology, outcome, and the differential diagnosis of hemoptysis. We performed a retrospective analysis of the medical charts of 134 pediatric patients who were admitted to the Emergency Department because of pulmonary or extrapulmonary hemorrhage and who were diagnosed with suspected hemoptysis or developed hemoptysis (ICD-10-CM code R04.2). The cases with pulmonary hemorrhage (Group 1) were compared with cases of extrapulmonary bleeding (Group 2) using the Fisher exact test or Pearson's χ2 test for categorical variables. The t test was used to assess differences between continuous variables of the patients in the two groups. Bloody cough was the presenting symptom in 73.9% of cases. Thirty patients had pulmonary hemorrhage (Group 1), while 104 patients had extrapulmonary bleeding (Group 2). The underlying causes of bleeding in Group 2 included epistaxis, inflammatory diseases of the nasopharynx and larynx, foreign bodies, gingivitis, and hypertrophy of the adenoids. The mortality rate was 10% in Group 1, whereas Group 2 did not have any mortality outcomes during the observation period. Etiological factors differed significantly between hemoptysis and extrapulmonary bleeding in children. Our research suggests that pulmonary and extrapulmonary bleeding are two conditions that differ significantly and cannot be unified under one diagnostic code. It is important to differentiate between focal and diffuse cases, and between pulmonary and extrapulmonary hemorrhage, because of the diversity of clinical courses and outcomes.
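
    The Pearson chi-squared statistic used for the categorical comparisons can be written out in a few lines; the 2x2 counts below are hypothetical, not the study's tabulation:

```python
def chi_squared(table):
    """Pearson's chi-squared statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# rows: Group 1 (pulmonary), Group 2 (extrapulmonary)
# cols: feature present, feature absent (hypothetical counts)
table = [[24, 6], [52, 52]]
print(round(chi_squared(table), 2))
```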

  1. Recruitment of motor units in the medial gastrocnemius muscle during human quiet standing: is recruitment intermittent? What triggers recruitment?

    PubMed

    Vieira, Taian M M; Loram, Ian D; Muceli, Silvia; Merletti, Roberto; Farina, Dario

    2012-01-01

    The recruitment and the rate of discharge of motor units are determinants of muscle force. Within a motoneuron pool, recruitment and rate coding of individual motor units might be controlled independently, depending on the circumstances. In this study, we tested whether, during human quiet standing, the force of the medial gastrocnemius (MG) muscle is predominantly controlled by recruitment or rate coding. If MG control during standing was mainly due to recruitment, then we further asked what the trigger mechanism is. Is it determined internally, or is it related to body kinematics? While seven healthy subjects stood quietly, intramuscular electromyograms were recorded from the MG muscle with three pairs of wire electrodes. The number of active motor units and their mean discharge rate were compared for different sway velocities and positions. Motor unit discharges occurred more frequently when the body swayed faster and forward (Pearson R = 0.63; P < 0.0001). This higher likelihood of observing motor unit potentials was explained chiefly by the recruitment of additional units. During forward body shifts, the median number of units detected increased from 3 to 11 (P < 0.0001), whereas the discharge rate changed from 8 ± 1.1 (mean ± SD) to 10 ± 0.9 pulses/s (P = 0.001). Strikingly, motor units did not discharge continuously throughout standing. They were recruited within individual, forward sways and intermittently, with a modal rate of two recruitments per second. This modal rate is consistent with previous circumstantial evidence relating the control of standing to an intrinsic, higher level planning process.

  2. 14 CFR 217.10 - Instructions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and the other pertaining to on-flight markets. For example, the routing (A-B-C-D) consists of three..., Singapore A-3—Airport code Origin A-4—Airport code Destination A-5—Service class (mark an X) F G L P Q By aircraft type— B-1—Aircraft type code B-2—Revenue aircraft departures B-3—Revenue passengers transported B...

  3. 14 CFR 217.10 - Instructions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and the other pertaining to on-flight markets. For example, the routing (A-B-C-D) consists of three..., Singapore A-3—Airport code Origin A-4—Airport code Destination A-5—Service class (mark an X) F G L P Q By aircraft type— B-1—Aircraft type code B-2—Revenue aircraft departures B-3—Revenue passengers transported B...

  4. 14 CFR 217.10 - Instructions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and the other pertaining to on-flight markets. For example, the routing (A-B-C-D) consists of three..., Singapore A-3—Airport code Origin A-4—Airport code Destination A-5—Service class (mark an X) F G L P Q By aircraft type— B-1—Aircraft type code B-2—Revenue aircraft departures B-3—Revenue passengers transported B...

  5. An ecological analysis of food outlet density and prevalence of type II diabetes in South Carolina counties.

    PubMed

    AlHasan, Dana M; Eberth, Jan Marie

    2016-01-05

    Studies suggest that a built environment with high numbers of fast food restaurants and convenience stores and low numbers of super stores and grocery stores is related to obesity, type II diabetes mellitus, and other chronic diseases. Since few studies assess these relationships at the county level, we aimed to examine fast food restaurant density, convenience store density, super store density, and grocery store density in relation to the prevalence of type II diabetes among counties in South Carolina. Pearson's correlations between the four types of food outlet density (fast food restaurants, convenience stores, super stores, and grocery stores) and the prevalence of type II diabetes were computed. The relationship of each of these food outlet densities to the prevalence of type II diabetes was mapped, and OLS regression analysis was completed adjusting for county-level rates of obesity, physical inactivity, density of recreation facilities, unemployment, households with no car and limited access to stores, education, and race. We found a significant negative relationship between fast food restaurant density and prevalence of type II diabetes, and a significant positive relationship between convenience store density and prevalence of type II diabetes. In the adjusted analysis, however, food outlet density (of any type) was not associated with the prevalence of type II diabetes. This ecological analysis therefore showed no adjusted associations between fast food restaurant, convenience store, super store, or grocery store densities and the prevalence of type II diabetes. Consideration of environmental, social, and cultural determinants, as well as individual behaviors, is needed in future research.

  6. Traceability and Quality Control in Traditional Chinese Medicine: From Chemical Fingerprint to Two-Dimensional Barcode

    PubMed Central

    Cai, Yong; Li, Xiwen; Li, Mei; Chen, Xiaojia; Ni, Jingyun; Wang, Yitao

    2015-01-01

    Chemical fingerprinting is currently a widely used tool that enables rapid and accurate quality evaluation of Traditional Chinese Medicine (TCM). However, chemical fingerprints are not amenable to information storage, recognition, and retrieval, which limits their use in Chinese medicine traceability. In this study, samples of three kinds of Chinese medicines were randomly selected and chemical fingerprints were then constructed by using high performance liquid chromatography. Based on the chemical data, the process of converting a TCM chemical fingerprint into a two-dimensional code is presented; preprocessing and filtering algorithms are also proposed to standardize the large amount of original raw data. To determine which type of two-dimensional (2D) code is suitable for storing chemical fingerprint data, currently popular types of 2D codes are analyzed and compared. Results show that QR Code is suitable for recording the TCM chemical fingerprint. The fingerprint information of TCM can thus be converted into a data format that can be stored as a 2D code for traceability and quality control. PMID:26089936
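
    A sketch of the serialization step such a pipeline needs: filter trace peaks, encode (retention time, area) pairs as a compact string restricted to the QR alphanumeric charset, and check the result against a capacity figure. The peak list and threshold are invented, and the 311-character figure is the commonly tabulated capacity of a version 10, error-correction-level M, alphanumeric QR code (an assumption here, not a value from the paper):

```python
# Hypothetical HPLC peak list: (retention time in minutes, peak area).
peaks = [(3.2, 120.5), (5.8, 40.1), (7.1, 980.0), (9.4, 12.3),
         (12.6, 455.7), (15.0, 8.9)]

MIN_AREA = 10.0                                   # filtering: drop trace peaks
kept = [(t, a) for t, a in peaks if a >= MIN_AREA]

# Fixed-point text keeps the payload inside the QR alphanumeric charset
# (digits, A-Z, space, $ % * + - . / :); ";" is NOT in that charset,
# so "/" separates peaks and ":" separates time from area.
payload = "/".join(f"{t:.1f}:{a:.0f}" for t, a in kept)

CAPACITY = 311  # assumed: version 10, level M, alphanumeric QR capacity
assert len(payload) <= CAPACITY, "fingerprint too large for chosen QR version"
print(payload, len(payload))
```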

  7. Isolation of an Intertypic Poliovirus Capsid Recombinant from a Child with Vaccine-Associated Paralytic Poliomyelitis

    PubMed Central

    Martín, Javier; Samoilovich, Elena; Dunn, Glynis; Lackenby, Angie; Feldman, Esphir; Heath, Alan; Svirchevskaya, Ekaterina; Cooper, Gill; Yermalovich, Marina; Minor, Philip D.

    2002-01-01

    The isolation of a capsid intertypic poliovirus recombinant from a child with vaccine-associated paralytic poliomyelitis is described. Virus 31043 had a Sabin-derived type 3-type 2-type 1 recombinant genome with a 5′-end crossover point within the capsid coding region. The result was a poliovirus chimera containing the entire coding sequence for antigenic site 3a derived from the Sabin type 2 strain. The recombinant virus showed altered antigenic properties but did not acquire type 2 antigenic characteristics. The significance of the presence in nature of such poliovirus chimeras and the consequences for the current efforts to detect potentially dangerous vaccine-derived poliovirus strains are discussed in the context of the global polio eradication initiative. PMID:12368335

  8. What does music express? Basic emotions and beyond

    PubMed Central

    Juslin, Patrik N.

    2013-01-01

    Numerous studies have investigated whether music can reliably convey emotions to listeners, and—if so—what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of “multiple layers” of musical expression of emotions. The “core” layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this “core” layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions—though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions. PMID:24046758

  9. Enlisting Madison Avenue: The Marketing Approach to Earning Popular Support in Theaters of Operation

    DTIC Science & Technology

    2007-01-01

    [Only fragmentary abstract text survives in this record.] The recoverable citations are: Philip Kotler and Gary Armstrong, Principles of Marketing, 11th ed., Upper Saddle River, N.J.: Pearson Education, 2006; and the social marketing framework of researchers Philip Kotler, Ned Roberto, and Nancy Lee, whose steps the report applies in an operational theater.

  10. The development and validation of using inertial sensors to monitor postural change in resistance exercise.

    PubMed

    Gleadhill, Sam; Lee, James Bruce; James, Daniel

    2016-05-03

    This research presented and validated a method of assessing postural changes during resistance exercise using inertial sensors. A simple lifting task was broken down into a series of well-defined tasks, which could be examined and measured in a controlled environment. The purpose of this research was to determine whether timing measures obtained from inertial sensor accelerometer outputs can provide accurate, quantifiable information on resistance exercise movement patterns. The aim was to complete a timing-measure validation of inertial sensor outputs. Eleven participants completed five repetitions of 15 different deadlift variations. Participants were monitored with inertial sensors and an infrared three-dimensional motion capture system. Validation was undertaken using Hopkins' typical error of the estimate, with a Pearson's correlation and a Bland-Altman limits of agreement analysis. Statistical validation measured the timing agreement during deadlifts between inertial sensor outputs and the motion capture system. Timing validation results demonstrated a Pearson's correlation of 0.9997, with trivial standardised error (0.026) and standardised bias (0.002). Inertial sensors can now be used in practical settings with as much confidence as motion capture systems for accelerometer timing measurements of resistance exercise. This research provides foundations for inertial sensors to be applied for qualitative activity recognition of resistance exercise and safe lifting practices. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. PMD compensation in multilevel coded-modulation schemes with coherent detection using BLAST algorithm and iterative polarization cancellation.

    PubMed

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2008-09-15

    We present two PMD compensation schemes suitable for use in multilevel (M>=2) block-coded modulation schemes with coherent detection. The first scheme is based on a BLAST-type polarization-interference cancellation scheme, and the second is based on iterative polarization cancellation. Both schemes use LDPC codes as channel codes. The proposed PMD compensation schemes are evaluated by employing coded-OFDM and coherent detection. When used in combination with girth-10 LDPC codes, those schemes outperform polarization-time-coding-based OFDM by 1 dB at a BER of 10^-9, and provide two times higher spectral efficiency. The proposed schemes perform comparably and are able to compensate even 1200 ps of differential group delay with negligible penalty.

  12. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    NASA Astrophysics Data System (ADS)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript deals with the analysis of a Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) system. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. The number of users and the type of codes used in the optical system directly determine the performance of the system. MAI can be restricted by efficient design of optical codes and by implementing them with a unique architecture to accommodate a larger number of users. Hence, it is necessary to design a technique such as spectral direct detection (SDD) with a modified double weight (MDW) code, which can provide better cardinality and good correlation properties.
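
    The code-design property the review emphasises, a fixed in-phase cross-correlation of one so that MAI can be cancelled at the detector, can be checked directly. The two rows below are the basic double weight (DW) code pair of weight 2, of which MDW codes are a generalisation:

```python
codes = [
    [0, 1, 1],   # user 1 spectral chip pattern
    [1, 1, 0],   # user 2 spectral chip pattern
]

def cross_correlation(a, b):
    """In-phase cross-correlation: number of overlapping '1' chips."""
    return sum(p & q for p, q in zip(a, b))

assert cross_correlation(codes[0], codes[1]) == 1   # fixed at one
assert all(sum(c) == 2 for c in codes)              # equal code weight
print("cross-correlation:", cross_correlation(codes[0], codes[1]))
```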

  13. Effective record length for the T-year event

    USGS Publications Warehouse

    Tasker, Gary D.

    1983-01-01

    The effect of serial dependence on the reliability of an estimate of the T-yr. event is of importance in hydrology because design decisions are based upon the estimate. In this paper the reliability of estimates of the T-yr. event from two common distributions is given as a function of the number of observations and the lag-one serial correlation coefficient for T = 2, 10, 20, 50, and 100 yr. A lag-one autoregressive model is assumed with either a normal or Pearson Type-III disturbance term. Results indicate that, if observations are serially correlated, the effective record length should be used to estimate the discharge associated with the expected exceedance probability. © 1983.
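
    The adjustment idea can be sketched with the common AR(1) effective-sample-size approximation n_eff = n(1 - rho)/(1 + rho). Note this is only the simplest form; the paper's exact correction for T-year quantiles depends on T and on the assumed distribution:

```python
def effective_record_length(n, rho):
    """Effective number of independent observations for an AR(1) series,
    using the common approximation n * (1 - rho) / (1 + rho)."""
    return n * (1 - rho) / (1 + rho)

n = 50                            # nominal years of record
for rho in (0.0, 0.2, 0.4):       # lag-one serial correlation coefficient
    print(rho, round(effective_record_length(n, rho), 1))
```

Even modest serial correlation (rho = 0.4) shrinks a 50-year record to roughly 21 effectively independent observations.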

  14. Assessing the dependence of sensitivity and specificity on prevalence in meta-analysis

    PubMed Central

    Li, Jialiang; Fine, Jason P.

    2011-01-01

    We consider modeling the dependence of sensitivity and specificity on the disease prevalence in diagnostic accuracy studies. Many meta-analyses compare test accuracy across studies and fail to incorporate the possible connection between the accuracy measures and the prevalence. We propose a Pearson-type correlation coefficient and an estimating equation-based regression framework to help understand such a practical dependence. The results we derive may then be used to better interpret the results from meta-analyses. In the biomedical examples analyzed in this paper, the diagnostic accuracy of biomarkers is shown to be associated with prevalence, providing insights into the utility of these biomarkers in low- and high-prevalence populations. PMID:21525421

  15. Novel mutations in the CHST6 gene associated with macular corneal dystrophy in southern India.

    PubMed

    Warren, John F; Aldave, Anthony J; Srinivasan, M; Thonar, Eugene J; Kumar, Abha B; Cevallos, Vicky; Whitcher, John P; Margolis, Todd P

    2003-11-01

    To further characterize the role of the carbohydrate sulfotransferase (CHST6) gene in macular corneal dystrophy (MCD) through identification of causative mutations in a cohort of affected patients from southern India. Genomic DNA was extracted from buccal epithelium of 75 patients (51 families) with MCD, 33 unaffected relatives, and 48 healthy volunteers. The coding region of the CHST6 gene was evaluated by means of polymerase chain reaction amplification and direct sequencing. Subtyping of MCD into types I and II was performed by measuring serum levels of antigenic keratan sulfate. Seventy patients were classified as having type I MCD, and 5 patients as having type II MCD. Analysis of the CHST6 coding region in patients with type I MCD identified 11 homozygous missense mutations (Leu22Arg, His42Tyr, Arg50Cys, Arg50Leu, Ser53Leu, Arg97Pro, Cys102Tyr, Arg127Cys, Arg205Gln, His249Pro, and Glu274Lys), 2 compound heterozygous missense mutations (Arg93His and Ala206Thr), 5 homozygous deletion mutations (delCG707-708, delC890, delA1237, del1748-1770, and delORF), and 2 homozygous replacement mutations (ACCTAC 1273 GGT, and GCG 1304 AT). One patient with type II MCD was heterozygous for the C890 deletion mutation, whereas 4 possessed no CHST6 coding region mutations. A variety of previously unreported mutations in the coding region of the CHST6 gene are associated with type I MCD in a cohort of patients in southern India. An improved understanding of the genetic basis of MCD allows for earlier, more accurate diagnosis of affected individuals, and may provide the foundation for the development of novel disease treatments.

  16. Orbital cortex neuronal responses during an odor-based conditioned associative task in rats.

    PubMed

    Yonemori, M; Nishijo, H; Uwano, T; Tamura, R; Furuta, I; Kawasaki, M; Takashima, Y; Ono, T

    2000-01-01

    Neuronal activity in the rat orbital cortex during discrimination of various odors [five volatile organic compounds (acetophenone, isoamyl acetate, cyclohexanone, p-cymene and 1,8-cineole), and food- and cosmetic-related odorants (black pepper, cheese, rose and perfume)] and other conditioned sensory stimuli (tones, light and air puff) was recorded and compared with behavioral responses to the same odors (black pepper, cheese, rose and perfume). In a neurophysiological study, the rats were trained to lick a spout that protruded close to their mouths to obtain sucrose or intracranial self-stimulation reward after presentation of conditioned stimuli. Of 150 orbital cortex neurons recorded during the task, 65 responded to one or more types of sensory stimuli. Of these, 73.8% (48/65) responded during presentation of an odor. Although the mean breadth of responsiveness (entropy) of the olfactory neurons based on the responses to five volatile organic compounds and air (control) was rather high (0.795), these stimuli were well discriminated in an odor space resulting from multidimensional scaling using Pearson's correlation coefficients between the stimuli. In a behavioral study, a rat was housed in an equilateral octagonal cage, with free access to food and choice among eight levers, four of which elicited only water (no odor, controls), and four of which elicited both water and one of four odors (black pepper, cheese, rose or perfume). Lever presses for each odor and control were counted. Distributions of these five stimuli (four odors and air) in an odor space derived from the multidimensional scaling using Pearson's correlation coefficients based on behavioral responses were very similar to those based on neuronal responses to the same five stimuli. Furthermore, Pearson's correlation coefficients between the same five stimuli based on the neuronal responses and those based on behavioral responses were significantly correlated.
The results demonstrated a pivotal role of the rat orbital cortex in olfactory sensory processing and suggest that the orbital cortex is important in the manifestation of various motivated behaviors of the animals, including odor-guided motivational behaviors (odor preference).
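    The odor-space construction described above can be sketched as: compute Pearson correlations between per-stimulus response profiles, convert them to dissimilarities, and embed the stimuli with classical multidimensional scaling. This is a minimal illustration on randomly generated (hypothetical) response data, not the study's recordings; the conversion d² = 2(1 − r) is one standard choice:

```python
import numpy as np

def mds_from_correlations(responses, n_dims=2):
    """Classical MDS coordinates for stimuli, from Pearson correlations of their response profiles."""
    r = np.corrcoef(responses)               # Pearson r between stimulus rows
    d2 = 2.0 * (1.0 - r)                     # squared dissimilarity from r
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ d2 @ j                    # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:n_dims]  # keep the largest eigenvalues
    return vecs[:, order] * np.sqrt(np.clip(vals[order], 0, None))

rng = np.random.default_rng(0)
responses = rng.normal(size=(5, 40))         # 5 hypothetical stimuli, 40 "neurons"
coords = mds_from_correlations(responses)    # 5 points in a 2-D "odor space"
```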

  17. Event Generators for Simulating Heavy Ion Interactions of Interest in Evaluating Risks in Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan; hide

    2005-01-01

    Simulating the Space Radiation environment with Monte Carlo Codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew members' bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons and target nuclei in the spacecraft materials and crew members' bodies are well understood. However, the situation is substantially less satisfactory for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A) as part of a NASA Consortium that includes a parallel program of cross section measurements to guide and verify this code development.
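    The two-stage scheme described above can be sketched in a few lines: the total macroscopic cross section fixes where an interaction occurs (an exponential free path), and the partial cross sections fix which channel, and hence which event generator, handles it. All cross-section values here are hypothetical placeholders, not FLUKA data:

```python
import random
from math import log

def sample_interaction(partial_sigma, rng=random.random):
    """Return (free path, channel) for one transport step; partial_sigma maps channel -> macroscopic cross section."""
    sigma_total = sum(partial_sigma.values())    # e.g. 1/cm
    path = -log(rng()) / sigma_total             # exponential free-path sample
    u = rng() * sigma_total                      # pick a channel by its weight
    acc = 0.0
    for channel, sigma in partial_sigma.items():
        acc += sigma
        if u <= acc:
            return path, channel
    return path, channel                         # guard against float round-off

random.seed(42)
channels = {"elastic": 0.30, "inelastic": 0.55, "fragmentation": 0.15}
samples = [sample_interaction(channels) for _ in range(10000)]
```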

  18. Neutron-encoded Signatures Enable Product Ion Annotation From Tandem Mass Spectra*

    PubMed Central

    Richards, Alicia L.; Vincent, Catherine E.; Guthals, Adrian; Rose, Christopher M.; Westphall, Michael S.; Bandeira, Nuno; Coon, Joshua J.

    2013-01-01

    We report the use of neutron-encoded (NeuCode) stable isotope labeling of amino acids in cell culture for the purpose of C-terminal product ion annotation. Two NeuCode labeling isotopologues of lysine, 13C6/15N2 and 2H8, which differ by 36 mDa, were metabolically embedded in a sample proteome, and the resultant labeled proteins were combined, digested, and analyzed via liquid chromatography and mass spectrometry. With MS/MS scan resolving powers of ∼50,000 or higher, product ions containing the C terminus (i.e., lysine) appear as a doublet spaced by exactly 36 mDa, whereas N-terminal fragments exist as a single m/z peak. Through theory and experiment, we demonstrate that over 90% of all y-type product ions have detectable doublets. We report an algorithm that can extract these neutron signatures with high sensitivity and specificity: of 15,503 y-type product ion peaks, the y-type ion identification algorithm correctly identified 14,552 (93.2%) based on detection of the NeuCode doublet; 6.8% were misclassified (i.e., other ion types that were assigned as y-type products). Searching NeuCode-labeled yeast with PepNovo+ resulted in a 34% increase in correct de novo identifications relative to searching MS/MS spectra alone. We use this tool to simplify spectra prior to database searching, to sort unmatched tandem mass spectra for spectral richness, to correlate co-fragmented ions to their parent precursors, and for de novo sequence identification. PMID:24043425
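    The core of such a doublet search can be sketched as a scan over a sorted, centroided peak list for pairs split by the NeuCode spacing. The peak values below are hypothetical, and a real implementation would also divide the spacing by the fragment charge state and use ppm-level tolerances:

```python
NEUCODE_SPLIT = 0.036  # Da; 13C6/15N2 vs 2H8 lysine mass difference (1+ fragments)

def find_doublets(mz_peaks, split=NEUCODE_SPLIT, tol=0.005):
    """Return (low, high) m/z pairs whose spacing matches the NeuCode split."""
    peaks = sorted(mz_peaks)
    doublets = []
    for i, low in enumerate(peaks):
        for high in peaks[i + 1:]:
            gap = high - low
            if gap > split + tol:
                break                      # peaks are sorted; no later match possible
            if abs(gap - split) <= tol:
                doublets.append((low, high))
    return doublets

# Two hypothetical y-ion doublets plus two unpaired (N-terminal) peaks.
peaks = [300.170, 300.206, 451.250, 512.300, 512.336, 600.400]
hits = find_doublets(peaks)
```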

  19. EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing and unstructured grids. It is designed to be massively parallel and can easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.

  20. Billing and coding knowledge: a comparative survey of professional coders, practicing orthopedic surgeons, and orthopedic residents.

    PubMed

    Wiley, Kevin F; Yousuf, Tariq; Pasque, Charles B; Yousuf, Khalid

    2014-06-01

    Medical knowledge and surgical skills are necessary to become an effective orthopedic surgeon. To run an efficient practice, the surgeon must also possess a basic understanding of medical business practices, including billing and coding. In this study, we surveyed and compared the level of billing and coding knowledge among current orthopedic residents PGY3 and higher, academic and private practice attending orthopedic surgeons, and orthopedic coding professionals. According to the survey results, residents and fellows have a similar knowledge of coding and billing, regardless of their level of training or type of business education received in residency. Most residents would like formal training in coding, billing, and practice management didactics; this is consistent with data from previous studies.

  1. CSTEM User Manual

    NASA Technical Reports Server (NTRS)

    Hartle, M.; McKnight, R. L.

    2000-01-01

    This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the CSTEM (Coupled Structural Thermal Electromagnetic-Computer Code) user needs to have a basic understanding of what the code is actually doing in order to properly use the code. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, what routine regularly accesses the file, and what routine opens the file, as well as special features included in CSTEM.

  2. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  3. Predicting the Types of Ion Channel-Targeted Conotoxins Based on AVC-SVM Model.

    PubMed

    Xianfang, Wang; Junmei, Wang; Xiaolei, Wang; Yue, Zhang

    2017-01-01

    The conotoxin proteins are disulfide-rich small peptides. Predicting the types of ion channel-targeted conotoxins has great value in the treatment of chronic diseases, epilepsy, and cardiovascular diseases. To address the information redundancy that affects current methods, a new model is presented to predict the types of ion channel-targeted conotoxins based on AVC (Analysis of Variance and Correlation) and SVM (Support Vector Machine). First, the F value is used to measure the significance of each feature for the result, and attributes with small F values are filtered out in a rough selection step. Second, the redundancy degree is calculated using the Pearson correlation coefficient, and a threshold is set to filter out attributes with weak independence, yielding the refined feature set. Finally, SVM is used to predict the types of ion channel-targeted conotoxins. The experimental results show that the proposed AVC-SVM model reaches an overall accuracy of 91.98% and an average accuracy of 92.17% with a total of 68 parameters. The proposed model provides highly useful information for further experimental research. The prediction model can be accessed free of charge at our web server.
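    A minimal sketch of this two-step AVC-style filter (an ANOVA F value for relevance, then Pearson correlation for redundancy), on synthetic data. The thresholds and data are illustrative assumptions, not the paper's settings, and the final SVM step is omitted:

```python
import numpy as np

def anova_f(x, y):
    """One-way ANOVA F statistic of one feature column x against class labels y."""
    classes = np.unique(y)
    grand = x.mean()
    ssb = sum(len(x[y == c]) * (x[y == c].mean() - grand) ** 2 for c in classes)
    ssw = sum(((x[y == c] - x[y == c].mean()) ** 2).sum() for c in classes)
    dfb, dfw = len(classes) - 1, len(x) - len(classes)
    return (ssb / dfb) / (ssw / dfw)

def avc_filter(X, y, f_min=1.0, r_max=0.9):
    """Keep features with F >= f_min, dropping any feature highly correlated with one already kept."""
    scores = [anova_f(X[:, j], y) for j in range(X.shape[1])]
    candidates = sorted((j for j in range(X.shape[1]) if scores[j] >= f_min),
                        key=lambda j: -scores[j])
    kept = []
    for j in candidates:
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < r_max for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)
informative = y + rng.normal(0, 0.5, 100)
X = np.column_stack([informative,                            # informative feature
                     informative + rng.normal(0, 0.01, 100), # near-duplicate (redundant)
                     rng.normal(size=100)])                  # pure noise
kept = avc_filter(X, y)  # the redundant copy of the top feature is dropped
```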

  4. Predicting the Types of Ion Channel-Targeted Conotoxins Based on AVC-SVM Model

    PubMed Central

    Xiaolei, Wang

    2017-01-01

    The conotoxin proteins are disulfide-rich small peptides. Predicting the types of ion channel-targeted conotoxins has great value in the treatment of chronic diseases, epilepsy, and cardiovascular diseases. To address the information redundancy that affects current methods, a new model is presented to predict the types of ion channel-targeted conotoxins based on AVC (Analysis of Variance and Correlation) and SVM (Support Vector Machine). First, the F value is used to measure the significance of each feature for the result, and attributes with small F values are filtered out in a rough selection step. Second, the redundancy degree is calculated using the Pearson correlation coefficient, and a threshold is set to filter out attributes with weak independence, yielding the refined feature set. Finally, SVM is used to predict the types of ion channel-targeted conotoxins. The experimental results show that the proposed AVC-SVM model reaches an overall accuracy of 91.98% and an average accuracy of 92.17% with a total of 68 parameters. The proposed model provides highly useful information for further experimental research. The prediction model can be accessed free of charge at our web server. PMID:28497044

  5. [Factors Influencing Physical Activity among Community-dwelling Older Adults with Type 2 Diabetes: A Path Analysis].

    PubMed

    Jang, Sun Joo; Park, Hyunju; Kim, Hyunjung; Chang, Sun Ju

    2015-06-01

    The purpose of the study was to identify factors influencing physical activity among community-dwelling older adults with type 2 diabetes. The study design was based on the Theory of Triadic Influence. A total of 242 older adults with type 2 diabetes participated in this study. Six variables related to physical activity in older adults, including self-efficacy, social normative belief, attitudes, intention, experience, and level of physical activity, were measured using reliable instruments. Data were analyzed using descriptive statistics, Pearson's correlation analyses, and a path analysis. The mean physical activity score was 104.2 (range, 0 to 381.21). The path analysis showed that self-efficacy had the greatest total effect on physical activity. Also, experience had direct and total effects on physical activity as well as mediated the paths of social normative beliefs to attitudes and intention to physical activity. These factors accounted for 10% of the total variance, and the fit indices of the model satisfied the goodness-of-fit criteria. The findings of the study reveal the important role of self-efficacy and past experience in physical activity in older adults with type 2 diabetes.

  6. Clinical application of antenatal genetic diagnosis of osteogenesis imperfecta type IV.

    PubMed

    Yuan, Jing; Li, Song; Xu, YeYe; Cong, Lin

    2015-04-02

    Clinical analysis and genetic testing of a family with osteogenesis imperfecta type IV were conducted, with the aim of informing antenatal genetic diagnosis of osteogenesis imperfecta type IV. Preliminary genotyping was performed based on clinical characteristics of the family members, and high-throughput sequencing was then applied to rapidly and accurately detect the changes in candidate genes. Genetic testing of the III5 fetus and other family members revealed a missense mutation (c.2746G>A, p.Gly916Arg) in the COL1A2 coding region and missense and synonymous mutations in the COL1A1 coding region. Application of antenatal genetic diagnosis provides fast and accurate genetic counseling and reproductive guidance for patients with osteogenesis imperfecta type IV and their families.

  7. Examining the Role of Orthographic Coding Ability in Elementary Students with Previously Identified Reading Disability, Speech or Language Impairment, or Comorbid Language and Learning Disabilities

    ERIC Educational Resources Information Center

    Haugh, Erin Kathleen

    2017-01-01

    The purpose of this study was to examine the role orthographic coding might play in distinguishing between membership in groups of language-based disability types. The sample consisted of 36 second and third-grade subjects who were administered the PAL-II Receptive Coding and Word Choice Accuracy subtest as a measure of orthographic coding…

  8. Domestic Ice Breaking Simulation Model User Guide

    DTIC Science & Technology

    2012-04-01

    Notes on ice data sources (from the "Temperatures" sub-module): selected D9 historical ice data are recorded in SIGRID codes for the NBL waterways and main model waterways, with the SIGRID codes converted to feet of ice thickness; the selected years, ice types, and associated weather data feed the DOMICE simulation.

  9. Knowledge and Practices of Diabetes Foot Care and Risk of Developing Foot Ulcers in México May Have Implications for Patients of Méxican Heritage Living in the US.

    PubMed

    Bohorquez Robles, Rosa; Compeán Ortiz, Lidia G; González Quirarte, Nora H; Berry, Diane C; Aguilera Pérez, Paulina; Piñones Martínez, Socorro

    2017-06-01

    Purpose: The purpose of the study was to examine the relationship between knowledge and foot care practices among adults with type 2 diabetes. Methods: A descriptive correlational study examined 200 patients with type 2 diabetes in México. Data collected included the Knowledge and Practices Self-Care Questionnaire and a Podiatry Examination Questionnaire. Data analysis included Pearson's correlations and chi-square tests. Results: More than half of the participants had poor knowledge and poor foot care practices. A significant negative correlation was found between knowledge and practices of foot care and the risk of developing diabetic foot ulcers. There was no relationship between sociodemographic variables and the risk of developing diabetic foot ulcers. Conclusions: Patients with type 2 diabetes served in an outpatient clinic had poor knowledge and practices of foot care and therefore a greater risk of developing diabetic foot ulcers, which may predispose them to early complications.
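    As a worked aside on the analysis style mentioned above, Pearson's chi-square statistic for a 2×2 table can be computed directly. The counts below are hypothetical (e.g. knowledge level vs. adequate foot care practice), not the study's data:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table of observed counts."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]                      # row totals
    col = [a + c, b + d]                      # column totals
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n         # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical table: rows = good/poor knowledge, columns = good/poor practice.
stat = chi_square_2x2([(60, 40), (30, 70)])
```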

  10. Type A behavior and the thallium stress test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, J.P.; Kornfeld, D.S.; Blood, D.K.

    1982-11-01

    Several recent studies have examined the association between Type A personality and coronary artery disease (CAD) by coronary angiography. Most of these studies have reported a significant association. The present study is an attempt at further confirmation, using a new non-invasive technique for measuring CAD. Subjects were 53 patients undergoing routine exercise stress tests with concomitant thallium-201 myocardial perfusion studies. Five aspects of Type A behavior were assessed by the use of the Rosenman-Friedman Semistructured Interview, and each was rated on a three-point scale. Severity of CAD was independently estimated on a four-point scale. Pearson correlation coefficients were separately computed for patients with and without reported history of myocardial infarction (MI). For 37 patients without reported MI, CAD severity was significantly correlated with Overall Type A (r = -0.53), Vocal Characteristics (r = -0.53), Job Involvement (r = -0.36) and Aggressiveness (r = -0.48), but not Time Urgency (r = -0.25). For 16 patients with reported MI, CAD severity was significantly correlated with Job Involvement only (r = +0.49). The data are consistent with the association of Type A personality and coronary atherogenesis, but may also reflect Type A psychological and physiological characteristics. Future studies may be able to examine these and other aspects of Type A behavior using this noninvasive technique in more diverse patient populations.

  11. Appendix 1—California plant community types represented in Forest Service research natural areas

    Treesearch

    Sheauchi Cheng

    2004-01-01

    Community types and codes (Holland 1986) are in boldface; research natural area names (with ecological survey names in parentheses, if different from the research natural area names) are in plain type.

  12. Pearson syndrome in a Diamond-Blackfan anemia cohort.

    PubMed

    Alter, Blanche P

    2014-07-17

    In this issue of Blood, Gagne et al describe a cohort of 362 patients clinically classified as having Diamond-Blackfan anemia (DBA), in which 175 (48%) were found to have mutations and deletions in ribosomal protein genes or GATA1, and 8 of the remaining patients (2.2% overall) had mitochondrial gene deletions consistent with Pearson marrow-pancreas syndrome (PS). The authors propose that all patients with presumptive DBA should be tested for mitochondrial DNA (mtDNA) deletion during their initial genetic evaluation.

  13. Laser plasma x-ray line spectra fitted using the Pearson VII function

    NASA Astrophysics Data System (ADS)

    Michette, A. G.; Pfauntsch, S. J.

    2000-05-01

    The Pearson VII function, which is more general than the Gaussian, Lorentzian and other profiles, is used to fit the x-ray spectral lines produced in a laser-generated plasma, instead of the more usual, but computationally expensive, Voigt function. The mean full-width half-maximum of the fitted lines is 0.102+/-0.014 nm, entirely consistent with the value expected from geometrical considerations, and the fitted line profiles are generally inconsistent with being either Lorentzian or Gaussian.
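    For reference, the Pearson VII profile in a common parametrization, where w is the half width at half maximum (so FWHM = 2w for any shape exponent m): m = 1 recovers a Lorentzian, and m → ∞ tends to a Gaussian. This is a sketch of the profile only, not the authors' fitting code; with w = 0.051 nm it reproduces the 0.102 nm FWHM quoted above:

```python
def pearson_vii(x, x0, w, m, amplitude=1.0):
    """Pearson VII profile; w is the half width at half maximum, m the shape exponent."""
    return amplitude * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

def lorentzian(x, x0, w, amplitude=1.0):
    """Lorentzian profile with half width at half maximum w."""
    return amplitude / (1.0 + ((x - x0) / w) ** 2)
```

In a fit, x0, w, m, and the amplitude would be free parameters adjusted by a least-squares routine against the measured line.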

  14. Reliability in Cross-National Content Analysis.

    ERIC Educational Resources Information Center

    Peter, Jochen; Lauf, Edmund

    2002-01-01

    Investigates how coder characteristics such as language skills, political knowledge, coding experience, and coding certainty affected inter-coder and coder-training reliability. Shows that language skills influenced both reliability types. Suggests that cross-national researchers should pay more attention to cross-national assessments of…

  15. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry.

    PubMed

    Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen; Andersen, Nanna Skaarup; Nørgaard, Mette; Mogensen, Trine Hyrup

    2016-01-01

    Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR). The DNPR was used to identify all hospitalized patients, aged ≥15 years, with a first-time diagnosis of HSE according to the International Classification of Diseases, tenth revision (ICD-10), from 2004 to 2014. To validate the coding of HSE, we collected data from the Danish Microbiology Database, from departments of clinical microbiology, and from patient medical records. Cases were classified as confirmed, probable, or no evidence of HSE. We estimated the positive predictive value (PPV) of the HSE diagnosis coding stratified by diagnosis type, study period, and department type. Furthermore, we estimated the proportion of HSE cases coded with nonspecific ICD-10 codes of viral encephalitis and also the sensitivity of the HSE diagnosis coding. We were able to validate 398 (94.3%) of the 422 HSE diagnoses identified via the DNPR. Of these, 202 (50.8%) were classified as confirmed cases and 29 (7.3%) as probable cases, providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0-62.9). For "Encephalitis due to herpes simplex virus" (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1-62.0). Similarly, the PPV for "Meningoencephalitis due to herpes simplex virus" (ICD-10 code B00.4A) was 56.8% (95% CI: 39.5-72.9). "Herpes viral encephalitis" (ICD-10 code G05.1E) had a PPV of 75.9% (95% CI: 56.5-89.7), thereby representing the highest PPV. The estimated sensitivity was 95.5%. The PPVs of the ICD-10 diagnosis coding for adult HSE in the DNPR were relatively low. Hence, the DNPR should be used with caution when studying patients with encephalitis caused by herpes simplex virus.
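    The overall PPV quoted above can be reproduced from the reported counts (202 confirmed plus 29 probable of 398 validated records). A simple Wald (normal approximation) interval comes close to the published 53.0-62.9% CI, though the paper's exact interval method may differ slightly:

```python
from math import sqrt

def ppv_with_ci(true_positives, total_positives, z=1.96):
    """Positive predictive value with a Wald (normal approximation) confidence interval."""
    p = true_positives / total_positives
    se = sqrt(p * (1.0 - p) / total_positives)
    return p, (p - z * se, p + z * se)

# 202 confirmed + 29 probable HSE cases among 398 validated DNPR diagnoses.
ppv, (lo, hi) = ppv_with_ci(202 + 29, 398)
```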

  16. Impact of case type, length of stay, institution type, and comorbidities on Medicare diagnosis-related group reimbursement for adult spinal deformity surgery.

    PubMed

    Nunley, Pierce D; Mundis, Gregory M; Fessler, Richard G; Park, Paul; Zavatsky, Joseph M; Uribe, Juan S; Eastlack, Robert K; Chou, Dean; Wang, Michael Y; Anand, Neel; Frank, Kelly A; Stone, Marcus B; Kanter, Adam S; Shaffrey, Christopher I; Mummaneni, Praveen V

    2017-12-01

    OBJECTIVE The aim of this study was to educate medical professionals about potential financial impacts of improper diagnosis-related group (DRG) coding in adult spinal deformity (ASD) surgery. METHODS Medicare's Inpatient Prospective Payment System PC Pricer database was used to collect 2015 reimbursement data for ASD procedures from 12 hospitals. Case type, hospital type/location, number of operative levels, proper coding, length of stay, and complications/comorbidities (CCs) were analyzed for effects on reimbursement. DRGs were used to categorize cases into 3 types: 1) anterior or posterior only fusion, 2) anterior fusion with posterior percutaneous fixation with no dorsal fusion, and 3) combined anterior and posterior fixation and fusion. RESULTS Pooling institutions, cases were reimbursed the same for single-level and multilevel ASD surgery. Longer stay, from 3 to 8 days, resulted in an additional $1400 per stay. Posterior fusion was an additional $6588, while CCs increased reimbursement by approximately $13,000. Academic institutions received higher reimbursement than private institutions, i.e., approximately $14,000 (Case Types 1 and 2) and approximately $16,000 (Case Type 3). Urban institutions received higher reimbursement than suburban institutions, i.e., approximately $3000 (Case Types 1 and 2) and approximately $3500 (Case Type 3). Longer stay, from 3 to 8 days, increased reimbursement between $208 and $494 for private institutions and between $1397 and $1879 for academic institutions per stay. CONCLUSIONS Reimbursement is based on many factors not controlled by surgeons or hospitals, but proper DRG coding can significantly impact the financial health of hospitals and availability of quality patient care.

  17. Possible influence of the polarity reversal of the solar magnetic field on the various types of arrhythmias

    NASA Astrophysics Data System (ADS)

    Giannaropoulou, E.; Papailiou, M.; Mavromichalaki, H.; Gigolashvili, M.; Tvildiani, L.; Janashia, K.; Preka-Papadema, P.; Papadima, Th

    2013-02-01

    Over the last few years, various studies have reached the conclusion that cosmic ray variations and geomagnetic disturbances are related to the condition of the human physiological state. In this study, medical data on the number of incidents of different types of cardiac arrhythmias, referring to 1902 patients in Tbilisi, Georgia, over the time period 1983-1992, were used. The smoothing method and Pearson r-coefficients were used to examine the possible effect of different solar and geomagnetic activity parameters and cosmic ray intensity variations on the different types of arrhythmias. The time interval under examination was separated into two periods, divided by the polarity reversal of the solar magnetic field that occurred in 1989-1990; a different behavior of all the above-mentioned parameters, as well as of the different types of arrhythmias, was noticed during the two intervals. In addition, the change in the polarity sign of the solar magnetic field was found to affect the sign of the correlation between the incidence of arrhythmias and the aforementioned parameters. The primary and secondary maxima observed in the solar parameters during solar cycle 22 also appeared in several types of arrhythmias, with a time lag of about five months.

  18. Root Formation in Ethylene-Insensitive Plants

    PubMed Central

    Clark, David G.; Gubrium, Erika K.; Barrett, James E.; Nell, Terril A.; Klee, Harry J.

    1999-01-01

    Experiments with ethylene-insensitive tomato (Lycopersicon esculentum) and petunia (Petunia × hybrida) plants were conducted to determine if normal or adventitious root formation is affected by ethylene insensitivity. Ethylene-insensitive Never ripe (NR) tomato plants produced more belowground root mass but fewer aboveground adventitious roots than wild-type Pearson plants. Applied auxin (indole-3-butyric acid) increased adventitious root formation on vegetative stem cuttings of wild-type plants but had little or no effect on rooting of NR plants. Reduced adventitious root formation was also observed in ethylene-insensitive transgenic petunia plants. Applied 1-aminocyclopropane-1-carboxylic acid increased adventitious root formation on vegetative stem cuttings from NR and wild-type plants, but NR cuttings produced fewer adventitious roots than wild-type cuttings. These data suggest that the promotive effect of auxin on adventitious rooting is influenced by ethylene responsiveness. Seedling root growth of tomato in response to mechanical impedance was also influenced by ethylene sensitivity. Ninety-six percent of wild-type seedlings germinated and grown on sand for 7 d grew normal roots into the medium, whereas 47% of NR seedlings displayed elongated taproots, shortened hypocotyls, and did not penetrate the medium. These data indicate that ethylene has a critical role in various responses of roots to environmental stimuli. PMID:10482660

  19. Why Do Phylogenomic Data Sets Yield Conflicting Trees? Data Type Influences the Avian Tree of Life more than Taxon Sampling.

    PubMed

    Reddy, Sushma; Kimball, Rebecca T; Pandey, Akanksha; Hosner, Peter A; Braun, Michael J; Hackett, Shannon J; Han, Kin-Lan; Harshman, John; Huddleston, Christopher J; Kingston, Sarah; Marks, Ben D; Miglia, Kathleen J; Moore, William S; Sheldon, Frederick H; Witt, Christopher C; Yuri, Tamaki; Braun, Edward L

    2017-09-01

    Phylogenomics, the use of large-scale data matrices in phylogenetic analyses, has been viewed as the ultimate solution to the problem of resolving difficult nodes in the tree of life. However, it has become clear that analyses of these large genomic data sets can also result in conflicting estimates of phylogeny. Here, we use the early divergences in Neoaves, the largest clade of extant birds, as a "model system" to understand the basis for incongruence among phylogenomic trees. We were motivated by the observation that trees from two recent avian phylogenomic studies exhibit conflicts. Those studies used different strategies: 1) collecting many characters [~42 megabase pairs (Mbp) of sequence data] from 48 birds, sometimes including only one taxon for each major clade; and 2) collecting fewer characters (~0.4 Mbp) from 198 birds, selected to subdivide long branches. However, the studies also used different data types: the taxon-poor data matrix comprised 68% non-coding sequences whereas coding exons dominated the taxon-rich data matrix. This difference raises the question of whether the primary reason for incongruence is the number of sites, the number of taxa, or the data type. To test among these alternative hypotheses we assembled a novel, large-scale data matrix comprising 90% non-coding sequences from 235 bird species. Although increased taxon sampling appeared to have a positive impact on phylogenetic analyses, the most important variable was data type. Indeed, by analyzing different subsets of the taxa in our data matrix we found that increased taxon sampling actually resulted in increased congruence with the tree from the previous taxon-poor study (which had a majority of non-coding data) instead of the taxon-rich study (which largely used coding data).
We suggest that the observed differences in the estimates of topology for these studies reflect data-type effects due to violations of the models used in phylogenetic analyses, some of which may be difficult to detect. If incongruence among trees estimated using phylogenomic methods largely reflects problems with model fit, developing more "biologically realistic" models is likely to be critical for efforts to reconstruct the tree of life. [Birds; coding exons; GTR model; model fit; Neoaves; non-coding DNA; phylogenomics; taxon sampling.]

  20. A critical analysis of the accuracy of several numerical techniques for combustion kinetic rate equations

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan

    1993-01-01

    A detailed analysis of the accuracy of several techniques recently developed for integrating stiff ordinary differential equations is presented. The techniques include two general-purpose codes, EPISODE and LSODE, developed for an arbitrary system of ordinary differential equations, and three specialized codes, CHEMEQ, CREK1D, and GCKP4, developed specifically to solve chemical kinetic rate equations. The accuracy study is made by application of these codes to two practical combustion kinetics problems. Both problems describe adiabatic, homogeneous, gas-phase chemical reactions at constant pressure, and include all three combustion regimes: induction, heat release, and equilibration. To illustrate the error variation in the different combustion regimes, the species are divided into three types (reactants, intermediates, and products), and error versus time plots are presented for each species type and the temperature. These plots show that CHEMEQ is the most accurate code during induction and early heat release. During late heat release and equilibration, however, the other codes are more accurate. A single global quantity, a mean integrated root-mean-square error, that measures the average error incurred in solving the complete problem is used to compare the accuracy of the codes. Among the codes examined, LSODE is the most accurate for solving chemical kinetics problems. It is also the most efficient code, in the sense that it requires the least computational work to attain a specified accuracy level. An important finding is that use of the algebraic enthalpy conservation equation to compute the temperature can be more accurate and efficient than integrating the temperature differential equation.
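    None of the five Fortran-era codes compared above is reproduced here; as a loose modern analogue, the stiff-integration idea behind LSODE (a backward differentiation formula, or BDF, method) can be sketched with SciPy on the classic Robertson kinetics test problem, which is a stand-in and not one of the article's two combustion cases.

```python
# Sketch: integrating a stiff kinetic system with an implicit (BDF) solver.
# The Robertson problem is a standard 3-species stiff kinetics benchmark.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson's stiff chemical kinetics test problem (3 species)."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
             0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2**2,
             3.0e7 * y2**2]

# Rate constants span ~9 orders of magnitude, so an explicit method would need
# prohibitively small steps; a BDF method (the family behind LSODE) does not.
sol = solve_ivp(robertson, (0.0, 1.0e4), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-6, atol=[1e-8, 1e-12, 1e-8])

# Total moles are conserved analytically, giving a cheap global accuracy check.
mass_error = abs(sol.y[:, -1].sum() - 1.0)
```

Checking a conserved quantity, as done here with the species sum, mirrors the article's point that algebraic conservation relations can serve as accuracy checks alongside the integrated equations.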

  1. Techniques for estimating flood-peak discharges of rural, unregulated streams in Ohio

    USGS Publications Warehouse

    Koltun, G.F.

    2003-01-01

    Regional equations for estimating 2-, 5-, 10-, 25-, 50-, 100-, and 500-year flood-peak discharges at ungaged sites on rural, unregulated streams in Ohio were developed by means of ordinary and generalized least-squares (GLS) regression techniques. One-variable, simple equations and three-variable, full-model equations were developed on the basis of selected basin characteristics and flood-frequency estimates determined for 305 streamflow-gaging stations in Ohio and adjacent states. The average standard errors of prediction ranged from about 39 to 49 percent for the simple equations, and from about 34 to 41 percent for the full-model equations. Flood-frequency estimates determined by means of log-Pearson Type III analyses are reported along with weighted flood-frequency estimates, computed as a function of the log-Pearson Type III estimates and the regression estimates. Values of explanatory variables used in the regression models were determined from digital spatial data sets by means of a geographic information system (GIS), with the exception of drainage area, which was determined by digitizing the area within basin boundaries manually delineated on topographic maps. Use of GIS-based explanatory variables represents a major departure in methodology from that described in previous reports on estimating flood-frequency characteristics of Ohio streams. Examples are presented illustrating application of the regression equations to ungaged sites on ungaged and gaged streams. A method is provided to adjust regression estimates for ungaged sites by use of weighted and regression estimates for a gaged site on the same stream. A region-of-influence method, which employs a computer program to estimate flood-frequency characteristics for ungaged sites based on data from gaged sites with similar characteristics, was also tested and compared to the GLS full-model equations. 
For all recurrence intervals, the GLS full-model equations had superior prediction accuracy relative to the simple equations and therefore are recommended for use.
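    The log-Pearson Type III procedure mentioned above can be sketched as a plain method-of-moments fit to log-transformed annual peaks; this omits the weighted-skew and other refinements a Bulletin 17B-style analysis would apply, and the peak record below is synthetic, not Ohio gage data.

```python
# Sketch of a log-Pearson Type III flood-frequency estimate (method of
# moments, station skew only; the annual peaks are synthetic).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_peaks_cfs = rng.lognormal(mean=8.0, sigma=0.5, size=60)  # hypothetical record

logs = np.log10(annual_peaks_cfs)
mu, sigma = logs.mean(), logs.std(ddof=1)
g = stats.skew(logs, bias=False)  # station skew of the log-transformed peaks

def lp3_quantile(return_period_years):
    """Discharge whose annual exceedance probability is 1/T."""
    p_nonexceed = 1.0 - 1.0 / return_period_years
    return 10.0 ** stats.pearson3.ppf(p_nonexceed, g, loc=mu, scale=sigma)

q2, q100 = lp3_quantile(2), lp3_quantile(100)   # 2- and 100-year floods
```

The same `lp3_quantile` call covers any of the recurrence intervals in the report (2 through 500 years), since only the non-exceedance probability changes.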

  2. An Empirical Study on Raman Peak Fitting and Its Application to Raman Quantitative Research.

    PubMed

    Yuan, Xueyin; Mayanovic, Robert A

    2017-10-01

    Fitting experimentally measured Raman bands with theoretical model profiles is the basic operation for numerical determination of Raman peak parameters. In order to investigate the effects of peak modeling using various algorithms on peak fitting results, the representative Raman bands of mineral crystals, glass, and fluids, as well as the emission lines from a fluorescent lamp, some of which were measured under ambient light whereas others were measured under elevated pressure and temperature conditions, were fitted using Gaussian, Lorentzian, Gaussian-Lorentzian, Voigtian, Pearson type IV, and beta profiles. From the fitting results of the Raman bands investigated in this study, the fitted peak position, intensity, area, and full width at half-maximum (FWHM) values of the measured Raman bands can vary significantly depending upon which peak profile function is used in the fitting, and the most appropriate fitting profile should be selected depending upon the nature of the Raman bands. Specifically, the symmetric Raman bands of mineral crystals and non-aqueous fluids are best fit using Gaussian-Lorentzian or Voigtian profiles, whereas the asymmetric Raman bands are best fit using Pearson type IV profiles. The asymmetric O-H stretching vibrations of H2O and the Raman bands of soda-lime glass are best fit using several Gaussian profiles, whereas the emission lines from a fluorescent light are best fit using beta profiles. Multiple peaks that are not clearly separated can be fit simultaneously, provided the residuals in the fitting of one peak will not affect the fitting of the remaining peaks to a significant degree. Once the resolution of the Raman spectrometer has been properly accounted for, our findings show that the precision in peak position and intensity can be improved significantly by fitting the measured Raman peaks with appropriate profiles.
Nevertheless, significant errors in peak position and intensity were still observed in the results from fitting of weak and wide Raman bands having unnormalized intensity/FWHM ratios lower than 200 counts/cm -1 .
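    One of the profile fits discussed, the Gaussian-Lorentzian (pseudo-Voigt) profile, can be sketched as a nonlinear least-squares fit; the band position, width, and noise level below are illustrative values, not the paper's measurements.

```python
# Sketch: fitting a single Raman band with a Gaussian-Lorentzian
# (pseudo-Voigt) profile; the band below is synthetic, not measured data.
import numpy as np
from scipy.optimize import curve_fit

def pseudo_voigt(x, amp, center, fwhm, eta):
    """Linear mix of a Lorentzian (weight eta) and a Gaussian (1 - eta)."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    gauss = np.exp(-((x - center) ** 2) / (2.0 * sigma**2))
    lorentz = 1.0 / (1.0 + ((x - center) / (fwhm / 2.0)) ** 2)
    return amp * (eta * lorentz + (1.0 - eta) * gauss)

x = np.linspace(1000.0, 1100.0, 500)          # Raman shift axis, cm^-1
rng = np.random.default_rng(0)
y = pseudo_voigt(x, 250.0, 1050.0, 8.0, 0.4) + rng.normal(0.0, 2.0, x.size)

popt, pcov = curve_fit(pseudo_voigt, x, y,
                       p0=[200.0, 1045.0, 10.0, 0.5],
                       bounds=([0, 1000, 0.1, 0], [1e4, 1100, 50, 1]))
fitted_center, fitted_fwhm = popt[1], popt[2]   # peak position and FWHM
```

Swapping the model function for a Gaussian, Lorentzian, or Pearson IV profile changes only `pseudo_voigt` and its parameter vector, which is the comparison the study performs across profile types.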

  3. New Zealand adolescents' cellphone and cordless phone user-habits: are they at increased risk of brain tumours already? A cross-sectional study.

    PubMed

    Redmayne, Mary

    2013-01-10

    Cellphone and cordless phone use is very prevalent among early adolescents, but the extent and types of use are not well documented. This paper explores how, and to what extent, New Zealand adolescents are typically using and exposed to active cellphones and cordless phones, and considers implications of this in relation to brain tumour risk, with reference to current research findings. This cross-sectional study recruited 373 Year 7 and 8 school students with a mean age of 12.3 years (range 10.3-13.7 years) from the Wellington region of New Zealand. Participants completed a questionnaire and measured their normal body-to-phone texting distances. Main exposure-metrics included self-reported time spent with an active cellphone close to the body, estimated time and number of calls on both phone types, estimated and actual extent of SMS text-messaging, cellphone functions used and people texted. Statistical analyses used Pearson chi-square tests and Pearson's correlation coefficient (r). Analyses were undertaken using SPSS version 19.0. Both cellphones and cordless phones were used by approximately 90% of students. A third of participants had already used a cordless phone for ≥ 7 years. In 4 years from the survey to mid-2013, the cordless phone use of 6% of participants would equal that of the highest Interphone decile (≥ 1640 hours), at the surveyed rate of use. High cellphone use was related to cellphone location at night, being woken regularly, and being tired at school. More than a third of parents thought cellphones carried a moderate-to-high health risk for their child. While cellphones were very popular for entertainment and social interaction via texting, cordless phones were most popular for calls. If their use continued at the reported rate, many would be at increased risk of specific brain tumours by their mid-teens, based on findings of the Interphone and Hardell-group studies.

  4. Nursing intuition as an assessment tool in predicting severity of injury in trauma patients.

    PubMed

    Cork, Lora L

    2014-01-01

    Emergency nurses assess patients using objective and subjective data. When the charge nurse takes report from a paramedic, another form of assessment occurs. By eliciting apt data and using trauma-scoring criteria, a decision to enact a "trauma code" occurs. Considering the cost and staff utilization, it is important for the charge nurse to make sound decisions when activating a trauma code. The objective of this study is to explore the validity of nurses' use of intuition to predict the severity of patients' injuries, and whether it impacts their choice to institute a trauma code. The study design was a descriptive, quantitative, cross-sectional record review and cohort analysis. The setting was a rural Trauma Level III emergency department (ED) located 80 miles from the nearest Level I trauma center. Phase I was a convenience cluster sample of all charge nurses in an ED. Phase II was a collection of all trauma records from June 2010 to May 2012. The inclusion criterion for Phase I subjects was that all participants were currently working as ED charge nurses. Analysis for Phase I data consisted of evaluating demographic information provided in questions 1 through 6 in a questionnaire. For Phase II data, a power analysis using Cohen's d was performed to determine the sample size to be evaluated. On the basis of the 2012 trauma data, a total of 419 records needed to be assessed (confidence interval, 0.164; P < .286). Two groups were created: (1) gut instinct only, and (2) all other criteria. Injury severity scores were categorized by ascending severity: (1) 0 to 4, (2) 5 to 9, (3) 10 to 16, (4) 17 to 24, and (5) greater than 25. The data analysis consisted of a 2-tailed t test for probability and a linear regression analysis using Pearson's r for correlation. In Phase I, 6 of the 8 charge nurses responded. 
Results showed an average of greater than 10 years of experience as an ED registered nurse, certification was equally yes and no, and highest level of education was at the BSN level. Phase II consisted of a review of 393 eligible medical files during the specified period. Because of the lack of sufficient data, 33 records were excluded. A total of 360 files remained with 109 in the "gut instinct" and 251 in the "other" category. A t test was performed using a 2-tailed test with an α value of .05. Results were a t-score of 0.02, and the null hypothesis was rejected. To evaluate the linear relationship between the sets of data, a Pearson's r correlation coefficient was calculated to determine the relationship between the 2 variables. Results indicated a strong positive correlation (r = 0.992; P ≤ .001). Intuition is a well-known phenomenon within the nursing community, but it is an abstract concept that is difficult to substantiate. To enhance the development of properly utilizing intuition in practice, I suggest pairing experienced with novice nurses in their patient assignments. This would enable the less proficient nurse to observe and ask questions about the rationale surrounding decisions the expert nurse has made regarding patient assessment and care.
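    The two summary statistics the study reports, Pearson's r and a t statistic for its significance, can be sketched on synthetic stand-in scores; the variable names below are hypothetical, not fields from the chart review.

```python
# Sketch: Pearson's r between two scores and the t statistic used to test
# H0: rho = 0 (df = n - 2). The data are synthetic, strongly related scores.
import numpy as np

rng = np.random.default_rng(1)
n = 360                                              # same n as the record review
triage_score = rng.normal(10.0, 3.0, n)              # hypothetical predictor
injury_severity = 2.0 * triage_score + rng.normal(0.0, 1.0, n)

# Pearson product-moment correlation.
r = np.corrcoef(triage_score, injury_severity)[0, 1]

# Two-tailed t statistic for the correlation, df = n - 2.
t_stat = r * np.sqrt((n - 2) / (1.0 - r**2))
```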

  5. The effect of client ethnicity on clinical interpretation of the MMPI-2.

    PubMed

    Knaster, Cara A; Micucci, Joseph A

    2013-02-01

    Client ethnicity has been shown to affect clinicians' diagnostic impressions. However, it is not known whether interpretation of the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) clinical scales is affected by ethnic bias. In this study, clinicians (82 males, 60 females) provided severity ratings for six symptoms based on three MMPI-2 profiles (representing the 27/72, 49/94, and 68/86 code-types) with the ethnicity of the client randomly assigned as either African American or Caucasian. To determine whether symptom severity ratings based on MMPI-2 profiles were affected by ethnicity, a 3 (code-type) × 2 (ethnicity) MANOVA was performed. Neither the main effect for ethnicity nor the ethnicity × code-type interaction was significant. These results indicated that the symptom severity ratings based on the MMPI-2 clinical scales were not affected by the client's identification as African American or Caucasian. Future studies are needed to explore the interpretation of profiles from clients representing other ethnic groups and for female clients.

  6. Bidirectional automatic release of reserve for low voltage network made with low capacity PLCs

    NASA Astrophysics Data System (ADS)

    Popa, I.; Popa, G. N.; Diniş, C. M.; Deaconu, S. I.

    2018-01-01

    The article presents the design of a bidirectional automatic release of reserve built on two types of low-capacity programmable logic controllers (PLCs): the PS-3 from Klöckner-Moeller and the Zelio from Schneider. It analyses the electronic timing circuits that can be used to implement the bidirectional automatic release of reserve: a time-on delay circuit and two types of time-off delay circuit. The paper presents the code sequences for timing on the PS-3 PLC, the logical functions for the bidirectional automatic release of reserve, the classical control electrical diagram (with contacts, relays, and time relays), the electronic control diagram (with logic gates and timing circuits), the code (in IL language) written for the PS-3 PLC, and the code (in FBD language) written for the Zelio PLC. A comparative analysis of the use of the two types of PLC is carried out, and the advantages of using PLCs are presented.

  7. Pedestrian injury causation study (pedestrian accident typing)

    DOT National Transportation Integrated Search

    1982-08-01

    A new computerized pedestrian accident typing procedure was tested on 1,997 cases from the Pedestrian Injury Causation Study (PICS). Two coding procedures were used to determine the effects of quantity and quality of information on accident typing ac...

  8. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2015-10-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared with its major alternatives, including the Spearman rank-order correlation, the bootstrap estimate, the Box-Cox transformation family, and a general normalizing transformation (i.e., rankit), as well as to various bias adjustments. Nonnormality caused the correlation coefficient to be inflated by up to +.14, particularly when the nonnormality involved heavy-tailed distributions. Traditional bias adjustments worsened this problem, further inflating the estimate. The Spearman and rankit correlations eliminated this inflation and provided conservative estimates. Rankit also minimized random error for most sample sizes, except for the smallest samples ( n = 10), where bootstrapping was more effective. Overall, results justify the use of carefully chosen alternatives to the Pearson correlation when normality is violated.
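    The rankit (normal-scores) alternative described above can be sketched alongside the Pearson and Spearman estimates; the heavy-tailed data-generating recipe below is an assumption for illustration, not the paper's simulation design.

```python
# Sketch: Pearson vs. Spearman vs. rankit-based correlation on synthetic
# heavy-tailed data (a monotone cubic transform of correlated normals).
import numpy as np
from scipy import stats

def rankit(a):
    """Map values to normal quantiles of (rank - 0.5) / n."""
    ranks = stats.rankdata(a)
    return stats.norm.ppf((ranks - 0.5) / len(a))

rng = np.random.default_rng(42)
n = 5000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n)
x, y = z[:, 0] ** 3, z[:, 1] ** 3          # monotone transform -> heavy tails

r_pearson = stats.pearsonr(x, y)[0]        # sensitive to the heavy tails
r_spearman = stats.spearmanr(x, y)[0]      # rank-based, robust
r_rankit = stats.pearsonr(rankit(x), rankit(y))[0]  # Pearson after rankit
```

Because rankit replaces each value by its normal score, the final Pearson step operates on approximately normal margins, which is the mechanism behind the reduced bias the paper reports.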

  9. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality

    PubMed Central

    Hittner, James B.

    2014-01-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared with its major alternatives, including the Spearman rank-order correlation, the bootstrap estimate, the Box–Cox transformation family, and a general normalizing transformation (i.e., rankit), as well as to various bias adjustments. Nonnormality caused the correlation coefficient to be inflated by up to +.14, particularly when the nonnormality involved heavy-tailed distributions. Traditional bias adjustments worsened this problem, further inflating the estimate. The Spearman and rankit correlations eliminated this inflation and provided conservative estimates. Rankit also minimized random error for most sample sizes, except for the smallest samples (n = 10), where bootstrapping was more effective. Overall, results justify the use of carefully chosen alternatives to the Pearson correlation when normality is violated. PMID:29795841

  10. [Effects of bioclimatology on suicides].

    PubMed

    Gómez González, M J; Alonso García, C; Piñana López, A

    1997-03-15

    To verify the influence of the weather on suicides. A retrospective descriptive study. County of Cartagena. All the suicides recorded in the Anatomical Forensic Institute of Cartagena between 1986 and 1993. Creation of a data base with the essential features of each suicide and all the relevant bioclimatological variables of the exact moment they happen. A statistical description of each variable was made. The relationship between the different variables was defined by the Pearson chi-square test and residual analysis. The sample data were compared with the population data (Neyman-Pearson and Pearson chi-square tests), p < 0.05. 149 suicides were recorded. These suicides occur during the day, Monday and Saturday being the days with the highest number of them. Distribution throughout the month was homogeneous: July was the month with most suicides. There was an age band in the second and third decades of life and a peak in elderly people. Our sample had 77.9% men. Retired people and housewives predominated. Suicides are generally influenced by meteorological factors.

  11. Autofocus algorithm using one-dimensional Fourier transform and Pearson correlation

    NASA Astrophysics Data System (ADS)

    Bueno Mario, A.; Alvarez-Borrego, Josue; Acho, L.

    2004-10-01

    A new autofocus algorithm based on the one-dimensional Fourier transform and the Pearson correlation, intended for a Z-automated microscope, is proposed. Our goal is to determine the best-focused plane quickly and accurately through an algorithm. We capture, in bright and dark field, several sets of images at different Z distances from a biological sample. The algorithm uses the one-dimensional Fourier transform to obtain the frequency content of a previously defined vector pattern in each image, then compares the Pearson correlation of these frequency vectors against the frequency vector of a reference image (the most out-of-focus image) to find the best focus. Experimental results showed that the algorithm has a fast response time and accurately identifies the best focal plane among the captured images. In conclusion, the algorithm can be implemented in real-time systems owing to its fast response time, accuracy, and robustness. It can be used to obtain focused images in bright and dark field, and it can be extended with fusion techniques to construct multifocus final images, which is beyond the scope of this paper.
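    The ranking criterion can be sketched as follows, substituting a synthetic one-dimensional scene for the microscope stack; the Gaussian blur model and its parameters are assumptions for illustration, not the paper's optics.

```python
# Sketch: rank focus by Pearson-correlating each image's 1-D frequency-
# magnitude vector against that of the most defocused (reference) image;
# the best-focused image is the one least like the reference.
import numpy as np

rng = np.random.default_rng(0)
scene = np.zeros(256)
scene[rng.integers(0, 256, 20)] = 1.0            # sharp point features

def blur(signal, sigma):
    """Gaussian blur standing in for defocus at a given Z distance."""
    xs = np.arange(-25, 26)
    k = np.exp(-xs**2 / (2.0 * sigma**2))
    return np.convolve(signal, k / k.sum(), mode="same")

sigmas = [0.5, 1.0, 2.0, 4.0, 8.0]               # last = most out of focus
stack = [blur(scene, s) for s in sigmas]
freq = [np.abs(np.fft.rfft(img)) for img in stack]   # 1-D frequency content

reference = freq[-1]                             # most defocused image
corrs = [np.corrcoef(f, reference)[0, 1] for f in freq[:-1]]
best_index = int(np.argmin(corrs))               # least like the blurred reference
```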

  12. An invariance property of generalized Pearson random walks in bounded geometries

    NASA Astrophysics Data System (ADS)

    Mazzolo, Alain

    2009-03-01

    Invariance properties of random walks in bounded domains are a topic of growing interest since they contribute to improving our understanding of diffusion in confined geometries. Recently, limited to Pearson random walks with exponentially distributed straight paths, it has been shown that under isotropic uniform incidence, the average length of the trajectories through the domain is independent of the random walk characteristic and depends only on the ratio of the volume's domain over its surface. In this paper, thanks to arguments of integral geometry, we generalize this property to any isotropic bounded stochastic process and we give the conditions of its validity for isotropic unbounded stochastic processes. The analytical form for the traveled distance from the boundary to the first scattering event that ensures the validity of the Cauchy formula is also derived. The generalization of the Cauchy formula is an analytical constraint that thus concerns a very wide range of stochastic processes, from the original Pearson random walk to a Rayleigh distribution of the displacements, covering many situations of physical importance.
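    The Cauchy formula referenced above states that, under isotropic uniform incidence on a convex body, the mean chord length equals 4V/S in three dimensions, independent of the scattering details for a straight-chord traversal. A quick Monte Carlo check for the unit sphere:

```python
# Monte Carlo check of the Cauchy mean-chord formula for a unit sphere:
# under isotropic uniform incidence the mean chord length is 4V/S = 4R/3.
import numpy as np

rng = np.random.default_rng(3)
R = 1.0
n = 1_000_000

# Isotropic uniform incidence on a convex body is flux- (cosine-) weighted:
# cos(theta) = sqrt(U) with U uniform on [0, 1), theta measured from the
# inward surface normal.
cos_theta = np.sqrt(rng.random(n))

# For a sphere, the chord entering at angle theta has length 2 R cos(theta).
chords = 2.0 * R * cos_theta

mean_chord = chords.mean()
cauchy_value = 4.0 * ((4.0 / 3.0) * np.pi * R**3) / (4.0 * np.pi * R**2)  # 4R/3
```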

  13. Pearson-Readhead survey from space

    NASA Astrophysics Data System (ADS)

    Preston, R. A.; Lister, M. L.; Tingay, S. J.; Piner, B. G.; Murphy, D. W.; Jones, D. L.; Meier, D. L.; Pearson, T. J.; Readhead, A. C. S.; Hirabayashi, H.; Kobayashi, H.; Inoue, M.

    2001-01-01

    We are using the VSOP space VLBI mission to observe a complete sample of Pearson-Readhead survey sources at 4.8 GHz to determine core brightness temperatures and pc-scale jet properties. The Pearson-Readhead sample has been used for extensive ground-based VLBI survey studies, and is ideal for a VSOP survey because the sources are strong, the VSOP u-v coverages are especially good above +35° declination, and multi-epoch ground-based VLBI data and other existing supporting data exceed that of any other sample. To date we have imaged 27 of the 31 objects in our sample. Our preliminary results show that the majority of objects contain strong core components that remain unresolved on baselines of ~30,000 km. The brightness temperatures of several cores significantly exceed 10^12 K, which is indicative of highly relativistically beamed emission. We discuss correlations with several other beaming indicators, such as variability and spectral index, that support this scenario. This research was performed in part at the Jet Propulsion Laboratory, California Institute of Technology, under contract to NASA.

  14. Image stitching and image reconstruction of intestines captured using radial imaging capsule endoscope

    NASA Astrophysics Data System (ADS)

    Ou-Yang, Mang; Jeng, Wei-De; Wu, Yin-Yi; Dung, Lan-Rong; Wu, Hsien-Ming; Weng, Ping-Kuo; Huang, Ker-Jer; Chiu, Luan-Jiau

    2012-05-01

    This study investigates image processing using the radial imaging capsule endoscope (RICE) system. First, an experimental environment is established in which a simulated object has a shape that is similar to a cylinder, such that a triaxial platform can be used to push the RICE into the sample and capture radial images. Then four algorithms (mean absolute error, mean square error, Pearson correlation coefficient, and deformation processing) are used to stitch the images together. The Pearson correlation coefficient method is the most effective algorithm because it yields the highest peak signal-to-noise ratio, higher than 80.69 compared to the original image. Furthermore, a living animal experiment is carried out. Finally, the Pearson correlation coefficient method and vector deformation processing are used to stitch the images that were captured in the living animal experiment. This method is very attractive because unlike the other methods, in which two lenses are required to reconstruct the geometrical image, RICE uses only one lens and one mirror.
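    The overlap-matching step that a Pearson-correlation stitching method implies can be sketched as follows; the frames are random stand-ins rather than RICE endoscope images, and the exhaustive width search is an assumption for illustration.

```python
# Sketch: find the overlap between two image strips by maximizing the
# Pearson correlation coefficient over candidate overlap widths, then stitch.
import numpy as np

rng = np.random.default_rng(5)
panorama = rng.random((100, 100))        # stand-in for the unrolled radial view

left = panorama[:, :60]                  # frame 1
right = panorama[:, 40:]                 # frame 2; true overlap = 20 columns

def overlap_correlation(a, b, width):
    """Pearson r between the last `width` columns of a and first of b."""
    return np.corrcoef(a[:, -width:].ravel(), b[:, :width].ravel())[0, 1]

widths = range(5, 60)
scores = [overlap_correlation(left, right, w) for w in widths]
best_width = list(widths)[int(np.argmax(scores))]

# Stitch: keep all of `left`, append the non-overlapping part of `right`.
stitched = np.hstack([left, right[:, best_width:]])
```

At the true overlap the two regions are identical (r = 1), while misaligned candidates correlate only weakly, so the maximum picks out the correct seam.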

  15. COMPUTATION OF GLOBAL PHOTOCHEMISTRY WITH SMVGEAR II (R823186)

    EPA Science Inventory

    A computer model was developed to simulate global gas-phase photochemistry. The model solves chemical equations with SMVGEAR II, a sparse-matrix, vectorized Gear-type code. To obtain SMVGEAR II, the original SMVGEAR code was modified to allow computation of different sets of chem...

  16. 19 CFR 24.26 - Automated Clearinghouse credit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; payer identification number (importer number or Social Security number or Customs assigned number); and...; payer identifier (importer number or Social Security number or Customs assigned number or filer code if... or warehouse withdrawal number for a deferred tax payment, or bill number); payment type code...

  17. BBC users manual. [In LRLTRAN for CDC 7600 and STAR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ltterst, R. F.; Sutcliffe, W. G.; Warshaw, S. I.

    1977-11-01

    BBC is a two-dimensional, multifluid Eulerian hydro-radiation code based on KRAKEN and some subsequent ideas. It was developed in the explosion group in T-Division as a basic two-dimensional code to which various types of physics can be added. For this reason BBC is a FORTRAN (LRLTRAN) code. In order to gain the 2-to-1 to 4-to-1 speed advantage of the STACKLIB software on the 7600's and to be able to execute at high speed on the STAR, the vector extensions of LRLTRAN (STARTRAN) are used throughout the code. Either cylindrical- or slab-type problems can be run on BBC. The grid is bounded by a rectangular band of boundary zones. The interfaces between the regular and boundary zones can be selected to be either rigid or nonrigid. The setup for BBC problems is described in the KEG Manual and LEG Manual. The difference equations are described in BBC Hydrodynamics. Basic input and output for BBC are described.

  18. Extensions and Adjuncts to the BRL-COMGEOM Program

    DTIC Science & Technology

    1974-08-01

    Keywords: MAGIC Code, GIFT Code, Computer Simulation, Target Description, Geometric Modeling Techniques, Vulnerability Analysis. Contents include an arbitrary quadric surface body type and BRITL, a geometry preprocessor program for input to the GIFT system. The tasks completed under this contract and described in the report include the addition to the list of available body types in the BRL-GIFT code.

  19. Asymmetric Memory Circuit Would Resist Soft Errors

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G.; Perlman, Marvin

    1990-01-01

    Some nonlinear error-correcting codes more efficient in presence of asymmetry. Combination of circuit-design and coding concepts expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets" due to ionizing radiation). Integrated circuit of new type made deliberately more susceptible to one kind of bit error than to other, and associated error-correcting code adapted to exploit this asymmetry in error probabilities.
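    The article does not name its code here, but one classic scheme that exploits asymmetric error probabilities is the Berger code, which detects every unidirectional error pattern; the sketch below illustrates that general idea only, not the specific code in the record.

```python
# Sketch of the Berger code: the check field stores the count of 0s in the
# data bits, so any unidirectional (all 1->0 or all 0->1) error is detected.

def berger_encode(data_bits):
    """Append the number of 0s in `data_bits`, in binary, as check bits."""
    k = max(1, len(data_bits)).bit_length()       # check-field width
    zeros = data_bits.count(0)
    check = [(zeros >> i) & 1 for i in reversed(range(k))]
    return data_bits + check

def berger_valid(word, n_data):
    """Recompute the zero count and compare it with the stored check field."""
    data, check = word[:n_data], word[n_data:]
    stored = 0
    for bit in check:
        stored = (stored << 1) | bit
    return stored == data.count(0)

word = berger_encode([1, 0, 1, 1, 0, 1, 1, 1])    # 8 data bits, 2 zeros

# A 1 -> 0 upset in the data raises the zero count but can only lower the
# stored check value, so the mismatch is always caught.
corrupted = word.copy()
corrupted[0] = 0                                  # flip a data bit 1 -> 0
detected = not berger_valid(corrupted, 8)
```

This matches the record's theme: if the circuit is engineered so that upsets are overwhelmingly one-directional, a code that only guards that direction can be cheaper than a symmetric one.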

  20. Relational Database Design of a Shipboard Ammunition Inventory, Requisitioning, and Reporting System

    DTIC Science & Technology

    1990-06-01

    history of transactions affecting the status or quantity of that NIIN. Information on the current inventory balance is obtained from this section of... Number * Julian Date of Transaction * Activity Classification Code (ACC) * NALC * NIIN * Condition Code * Beginning Balance * Serial Number (if applicable... * Ending Balance * Remarks. As with the inventory information, ATR format varies with the type of control (Material Condition Code) applicable to that
