NASA Astrophysics Data System (ADS)
Veres, Teodor
This thesis is devoted to the study of the structural evolution of the magnetic and transport properties of Ni/Fe multilayers and of Co- and Ag-based nanostructures. In a first, essentially bibliographic part, we introduce some basic concepts related to the magnetic and transport properties of metallic multilayers, followed by a brief description of the methods used to analyze the results. The second part is devoted to the magnetic and transport properties of ferromagnetic/ferromagnetic Ni/Fe multilayers. We show that a coherent interpretation of these properties requires taking interface effects into account. We identify, evaluate, and study these interface effects and their evolution under thermal treatments such as deposition at elevated temperature and ion irradiation. Correlated analyses of the structure and of the magnetoresistance allow us to draw conclusions on the influence of buffer layers, between the interface and the substrate as well as between the layers themselves, on the magnetic behaviour of the F/F layers. The third part is devoted to Giant Magnetoresistance (GMR) systems based on Co and Ag. We study the evolution of the microstructure under irradiation with 1 MeV Si+ ions, as well as the effects of these changes on the magnetic behaviour. This part begins with the analysis of the properties of a hybrid multilayer, intermediate between multilayers and granular materials. Using diffraction, superparamagnetic relaxation, and magnetoresistance measurements, we analyze the structural changes produced by ion irradiation.
We then establish models that help us interpret the results for a series of multilayers covering a wide range of magnetic behaviours as a function of the thickness of the magnetic Co layer. We show that in these systems the effects of ion irradiation are strongly influenced by the surface energy as well as by the enthalpy of formation, which is strongly positive for the Co/Ag system.
Gravitational lensing in observational cosmology
NASA Astrophysics Data System (ADS)
Nottale, L.
This paper reviews previous theoretical and observational results concerning the various effects of gravitational lensing, and also presents still unpublished results in this field. The theoretical section deals with the Optical Scalar Equation (OSE) approach. We recall the form of these equations, which relate the deformations of the cross-sectional area of a light beam to the material and energetic distribution it encounters, via the two basic contributions to lensing, the matter (Ricci) term and the shear term. The introduction of a new distance, the optical distance, allows the OSE to be written in a simplified way from which new solutions are easily derived. We demonstrate here that a general form may be obtained for the amplification formula in the exact relativistic treatment, provided the Universe is assumed to be Friedmannian in the mean. New results are also presented concerning the probability distribution of amplifications, the relation between the matter and shear terms (the former giving the mean of the latter), and the problem of energy conservation. We recall how our method led to an analytical formula yielding the amplification by any number of lenses placed anywhere along the line of sight, and present new general solutions for lensing by large-scale density inhomogeneities. Gravitational redshift effects are also considered, whether due to the crossing of inhomogeneities by photons or intrinsic to them; generalized solutions to the latter problem are given. Observational evidence concerning various lensing effects, either statistical or applying to individual sources, is then considered. We first recall how the dependence of the amplification formula on the various physical parameters points towards the optimisation of lensing by very rich clusters of galaxies lying at redshifts around 0.7, which may give rise to very large amplifications for reasonable values of the density parameter.
Recent results concerning a statistical effect of amplification of Brightest Cluster Galaxies by foreground clusters are analysed, including the discussion of a selection effect due precisely to gravitational luminosity amplification. It results in an artificial increase of the deceleration parameter of the Universe measured from the Hubble diagram of these objects. We recall our proposal that the sample of distant 3C radiogalaxies of redshift > 1 is strongly perturbed by lensing effects, mostly by foreground clusters of galaxies (i.e. only the luminosity is changed, without image multiplication), but also, for some objects, by galaxies, producing gravitational mirages. The case of 3C324, for which definite evidence for multiple imaging has recently been obtained, is described, including detailed modelling of the lensing configuration. We present a highly significant statistical effect of lensing on absorption-line QSOs due to matter lying at the absorption redshifts. Microlensing is also considered, and we recall our recent proposal that the variability of some of the OVV QSOs turning into BL Lacs at maximum brightness, like the eruptive object 0846+51W1, is a consequence of microlensing by stars or compact objects constituting foreground galaxy halos. Finally, discrepant redshift associations are considered. We recall how the case of anomalous quintets of galaxies has been explained by the gravitational lensing effects of the quartets' halos on background galaxies. Then we present evidence that the Arp QSO-galaxy associations may be the result of the combined lensing effects of several superposed galaxies, groups and clusters.
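The two contributions to lensing named above (the Ricci, or matter, term and the shear term) enter through the Sachs focusing equation for the cross-sectional area A of the beam. As a reminder, its textbook form (not this paper's specific notation) reads:

```latex
\frac{d^{2}\sqrt{A}}{d\lambda^{2}}
  = -\left(\tfrac{1}{2}\,R_{\mu\nu}k^{\mu}k^{\nu} + |\sigma|^{2}\right)\sqrt{A}
```

where k^mu is the photon wave vector, lambda an affine parameter, the Ricci term accounts for matter inside the beam, and the shear sigma for the tidal field of matter outside it.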
Parametrisation of Non-Standard Effects in Electroweak Phenomenology
NASA Astrophysics Data System (ADS)
Maksymyk, Ivan
This thesis by articles deals with the parametrisation of non-standard effects in electroweak physics. In each analysis, we added several non-standard operators to the lagrangian of the standard electroweak model. The non-standard operators describe new effects arising from an unspecified underlying model. A priori, the number of non-standard operators that can be included in such an analysis is unlimited, but for a specific class of underlying models the non-standard effects can be described by a reasonable number of operators. In each analysis we derived expressions for electroweak observables as functions of the coefficients of the new operators. By performing a statistical fit to a set of precise experimental data, we obtained phenomenological constraints on these coefficients. In "Model-Independent Global Constraints on New Physics", we adopted very weak assumptions about the underlying models. We truncated the effective lagrangian at dimension five (inclusively). Aiming for the greatest possible generality, we allowed interactions that do not respect the discrete symmetries (C, P and CP) as well as interactions that do not conserve flavour. The effective lagrangian contains some forty new operators. We found that for most of the coefficients of the new operators the constraints are fairly tight (2 or 3%), but there are interesting exceptions. In "Bounding Anomalous Three-Gauge-Boson Couplings", we determined phenomenological constraints on the deviations of the three-gauge-boson couplings from the interactions prescribed by the standard model. To do so, we computed the indirect contributions of the non-standard three-gauge-boson couplings to low-energy observables.
Since the effective lagrangian is non-renormalisable, certain technical difficulties arise: to regularise the Feynman integrals, researchers have generally used a cutoff method, but this method can lead to incorrect results. We opted for an alternative technique: dimensional regularisation with "minimal subtraction with decoupling". In "Beyond S, T and U" we present the STUVWX formalism, an extension of the STU formalism of Peskin and Takeuchi. These formalisms are based on the hypothesis that the underlying theory manifests itself through gauge-boson self-energies; this type of effect is called 'oblique'. At the basis of the STU formalism lies the assumption that the scale of new physics, M, is much larger than q, the scale at which measurements are performed. As a result, oblique effects are parametrised by the three variables S, T and U. In the STUVWX formalism, by contrast, we allowed for the possibility that M ~ q. In "A Global Fit to Extended Oblique Parameters", we performed two statistical fits to a set of high-precision electroweak measurements. In the first fit, we set V=W=X=0, thereby obtaining constraints on the set {S,T,U}. In the second fit, we included all six parameters.
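As a concrete reminder of what an 'oblique' correction is, the Peskin-Takeuchi T parameter, for instance, is built from gauge-boson self-energies evaluated at zero momentum (standard definition, quoted for orientation only):

```latex
\alpha\,T = \frac{\Pi_{WW}(0)}{m_{W}^{2}} - \frac{\Pi_{ZZ}(0)}{m_{Z}^{2}}
```

When M >> q the self-energies Pi(q^2) can be truncated at linear order in q^2, which leaves exactly the three combinations S, T and U; the additional parameters V, W and X retain the higher-order terms in q^2/M^2 that can no longer be neglected when M ~ q.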
NASA Astrophysics Data System (ADS)
Laliberte, Francis
2010-06-01
This thesis presents thermoelectric transport measurements, the Seebeck and Nernst effects, in a series of high-critical-temperature superconductor samples. Results obtained recently at the Laboratoire National des Champs Magnetiques Intenses in Grenoble on La1.7Eu0.2Sr0.1CuO4, La1.675Eu0.2Sr0.125CuO4, La1.64Eu0.2Sr0.16CuO4, La1.74Eu0.1Sr0.16CuO4 and La1.4Nd0.4Sr0.2CuO4 are analyzed. Particular attention is paid to the equations of the semiclassical theory of transport, and their validity is verified. The experimental procedure and the materials used to build the measurement set-ups are explained in detail. Finally, a chapter is devoted to explaining and interpreting the thermoelectric transport results on YBa2Cu3O6+delta published during the winter of 2010 in Nature and Physical Review Letters. The Seebeck data in the La1.8-xEu0.2SrxCuO4 samples, where a sign change is observed, lead to the conclusion that an electron pocket in the Fermi surface dominates the low-temperature transport in the underdoped region of the phase diagram. This conclusion is similar to that obtained from Hall effect measurements in YBa2Cu3O6+delta, and it fits well within a Fermi-surface reconstruction scenario. The Nernst data collected indicate that the contribution of superconducting fluctuations is limited to a modest temperature interval above the critical temperature.
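For reference, the two coefficients measured in this work are defined, in a common textbook convention (sign conventions vary between authors), by:

```latex
S \equiv \frac{E_{x}}{\partial_{x}T}, \qquad
N \equiv \frac{E_{y}}{\partial_{x}T}, \qquad
\nu \equiv \frac{N}{B}
```

with a thermal gradient applied along x, the magnetic field B along z, and no charge current flowing; S is the Seebeck coefficient, N the Nernst signal and nu the Nernst coefficient.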
NASA Astrophysics Data System (ADS)
Sakhi, Said
This thesis consists of three distinct research topics. The first two articles deal with the phenomenon of superconductivity in a two-dimensional model; the third article studies the effective action of an electronic system subjected to a magnetic field (Hall system); and the last article examines the quantization of a system of identical particles in two space dimensions and the possibility of anyons. The model analyzed in the first two articles is a fermionic system whose charged, massless particles interact with each other through a strong attractive coupling. The analysis of the effective action describing the low-energy physics allows us to examine the structure of the phase space. At zero temperature, the order parameter of the system acquires a non-zero expectation value. Consequently, the continuous U(1) symmetry of the model is spontaneously broken, resulting in the appearance of a Goldstone mode. In the presence of an external electromagnetic field this mode disappears and the gauge field acquires a mass, hence the Meissner effect characteristic of a superconductor. Although the model is not renormalisable in the perturbative sense, we show that it is within the 1/N expansion, where N is the number of fermionic species. Moreover, we show that including thermal effects radically changes the mechanism of superconductivity. Indeed, spontaneous breaking of the U(1) symmetry is no longer possible at finite temperature because of very severe infrared divergences. Instead, the dynamics of the vortices existing in the plane becomes essential. We show that superconductivity results from the confinement of these topological objects and that the critical temperature is identified with the Kosterlitz-Thouless temperature.
This mechanism of superconductivity has the advantage of yielding a higher gap-to-critical-temperature ratio than the BCS model, and of not violating time-reversal and parity symmetry, unlike the anyonic model. In the third article, we developed a systematic method for computing fermionic determinants in the presence of a magnetic field perpendicular to the plane confining the particles. Both thermal effects and those due to impurities are taken into account in this method. The technique is illustrated in the case of the quantum Hall system. Finally, in the last article, we discuss the quantization of a system of identical particles in the plane. After properly defining the classical phase space of the system, the analysis brings out two fundamental parameters, denoted theta and alpha. The first parameter, theta, is associated with the intensity of the magnetic flux localized at the vertex of the cone and defines the statistics of the particles; for arbitrary values of theta one speaks of anyons. The other parameter, alpha, is associated with the self-adjoint extension of the hamiltonian and follows from the unitarity of the theory. We show by an explicit example that alpha can be seen as the vestige of a very short-range (high-energy) interaction between the particles of the system. Finally, we show that the symmetry group of the system with self-adjoint extension is no longer SO(2,1), because the dilation generator of this group is no longer compatible with an arbitrary parameter alpha. (Abstract shortened by UMI.)
Study of co-continuous blends of polylactic acid and thermoplastic starch (PLA/TPS)
NASA Astrophysics Data System (ADS)
Chavez Garcia, Maria Graciela
Co-continuous blends are polymer blends in which each component forms a continuous phase. For this reason, the characteristics of each component combine, resulting in a material with a distinctive morphology and properties. Polylactic acid (PLA) and thermoplastic starch (TPS) are biopolymers that come from renewable resources and are biodegradable. In this project, various PLA/TPS blends with a high TPS concentration were prepared in a twin-screw extruder in order to generate co-continuous structures. Using selective leaching, the TPS was removed to create a porous PLA structure that could be analyzed by X-ray microtomography and scanning electron microscopy (SEM). Analysis of the 2D and 3D images confirms the presence of a co-continuous structure in blends whose TPS concentration lies between 66% and 80%. The effect of two plasticizers, glycerol alone and a glycerol-sorbitol mixture, in the TPS formulation is studied in this work. In addition, we evaluated the effect of maleic-anhydride-grafted PLA (PLAg) as a compatibilizer. The TPS phase obtained with glycerol was found to be larger. The effect of annealing on phase size is also analyzed. Using the same analysis techniques, we studied the effect of the injection-molding process on the morphology. The injected parts exhibit a heterogeneous microstructure that differs between the surface and the core of the part. Near the surface, a PLA-rich skin is present and the TPS phases there are elongated into lamellae. Closer to the core, a more cellular morphology is observed for each continuous phase. The effect of the formulations on the mechanical properties was also studied.
Injected parts with a higher TPS concentration show lower tensile strength, and the presence of the compatibilizer in the co-continuous region adversely affects this strength. Considering that starch is an abundant biomaterial, cheaper and more rapidly biodegradable, adding it to PLA has the advantage of reducing cost while increasing the degradation rate of the PLA. Moreover, a porous continuous PLA structure produced by selective leaching has potential applications either as a rapidly degrading material or, once the TPS phase is removed, as an open-porosity substrate for the fabrication of membranes, cell scaffolds or filters. Keywords: immiscible blends, polylactic acid, thermoplastic starch, co-continuous morphology, selective leaching, X-ray microtomography, rigid porous biodegradable material.
Transient behaviour of multi-stage Peltier-effect thermoelements
NASA Astrophysics Data System (ADS)
Monchoux, F.; Zély, D.; Cordier, A.
1995-01-01
Non-equilibrium thermodynamics makes the analysis of thermoelectric phenomena possible. Integration of the thermal balance equation leads to an analytical solution for the non-stationary behaviour, giving the temperature at every point and the absorbed heat flux. The influence of the Thomson effect is discussed. The model, implemented in the general-purpose TRNSYS software, permits the modelling of complex systems that include such elements in their thermal regulation.
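The thermal balance equation referred to above has, in a generic one-dimensional form (a standard thermoelectricity result, not necessarily the exact notation of this paper):

```latex
\rho c\,\frac{\partial T}{\partial t}
  = \kappa\,\frac{\partial^{2} T}{\partial x^{2}}
  + \frac{j^{2}}{\sigma}
  - \tau\, j\,\frac{\partial T}{\partial x}
```

with Joule heating j^2/sigma, Thomson coefficient tau, and the Peltier heat (Pi j, with Pi = alpha T) entering through the boundary conditions at the junctions.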
NASA Astrophysics Data System (ADS)
Croteau, Etienne
The objective of this doctoral project is to develop quantitative tools, based on dynamic PET imaging, for monitoring breast-cancer chemotherapy and its cardiotoxic effects. Kinetic analysis in dynamic PET allows the evaluation of biological parameters in vivo. This analysis can be used to characterize the tumour response to chemotherapy and the harmful side effects that may result. The first article of this thesis describes the development of kinetic-analysis techniques that use an image-derived radiotracer input function. Corrections for external radioactive contamination (spillover) and for the partial-volume effect were necessary to standardize the kinetic analysis and make it quantitative. The second article concerns the evaluation of a new myocardial radiotracer. 11C-acetoacetate, a new radiotracer based on a ketone body, was compared with 11C-acetate, commonly used in cardiac PET imaging. The use of 3H-acetate and 14C-acetoacetate made it possible to elucidate the kinetics of these tracers, from the input function and the uptake by cardiac mitochondria, which reflects oxygen consumption, to the release of their respective principal metabolites (3H2O and 14CO2). The third and last article of this thesis presents the integration of a model that evaluates the cardiac perfusion and oxygen-consumption reserve. A cardiomyopathy model was established using doxorubicin, a breast-cancer chemotherapeutic agent known to be cardiotoxic. A rest/stress protocol made it possible to evaluate the heart's capacity to increase perfusion and oxygen consumption. The demonstration of a reduced cardiac reserve characterizes cardiotoxicity.
The last contribution of this thesis concerns the development of minimally invasive methods for measuring the input function in animal models, using the caudal artery and a microvolumetric counter, dynamic PET/MRI bi-modality with Gd-DTPA, and the establishment of a model for the simultaneous evaluation of cardiotoxicity and tumour response in mice. The development of PET analysis tools for evaluating cardiotoxicity during breast-cancer treatment provides a better understanding of the relationship between mitochondrial damage and the decrease in ejection fraction. Keywords: positron emission tomography (PET), kinetic analysis, 11C-acetate, 11C-acetoacetate, cardiotoxicity.
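The kinetic analyses described above rely on fitting a compartmental model to the image-derived input function. As a minimal sketch of the idea (a generic one-tissue-compartment model with illustrative rate constants and input function, not the specific models of the thesis):

```python
import numpy as np

# One-tissue-compartment model: dCt/dt = K1*Cp(t) - k2*Ct(t),
# where Cp is the (image-derived) plasma input function.
# K1, k2 and the input function below are illustrative, not from the thesis.

def tissue_curve(t, cp, K1, k2):
    """Euler integration of the compartment ODE on the sampling grid t."""
    ct = np.zeros_like(cp)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        ct[i] = ct[i - 1] + dt * (K1 * cp[i - 1] - k2 * ct[i - 1])
    return ct

t = np.linspace(0.0, 60.0, 601)   # minutes
cp = np.exp(-t / 10.0)            # toy input function
ct = tissue_curve(t, cp, K1=0.1, k2=0.05)
print(round(float(ct.max()), 4))  # peak tissue activity
```

In practice one would fit K1 and k2 (per voxel or region) by least squares against the measured time-activity curve, after the spillover and partial-volume corrections discussed above.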
1989-06-01
...panel methods) and on the full potential equation, with or without the incorporation of viscous effects, are commonly used in industry... but as far as drag prediction is concerned, the current situation is not satisfactory. It is in fact easier to obtain good... of a new civil or military aircraft therefore still make extensive use of wind-tunnel tests in... the minimum drag coefficient is then corrected for the effects
NASA Astrophysics Data System (ADS)
Fall, H.; Charon, W.; Kouta, R.
2002-12-01
In recent decades, significant activity worldwide has been directed at active control. The goal of this research has essentially been to improve the performance, reliability and safety of systems, notably in the case of structures subjected to random vibrations. Substantial work has been devoted to the use of "smart materials" as sensors and actuators. This article proposes a reliability analysis of mechanical systems through the study of actuator or sensor failures. The effect of these failures on the stability and performance of the system is demonstrated, and the design methodologies are reviewed. Numerical examples based on the control of a panel under dynamic loading are provided to illustrate the proposed method.
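A minimal numerical sketch of the kind of question studied here, under simplifying assumptions (full-state feedback on a toy two-mode model; the matrices and gains are illustrative, not taken from the article): an actuator failure is modeled by zeroing its column in the input matrix, and closed-loop stability is re-checked from the eigenvalues.

```python
import numpy as np

# Toy two-mode structural model, x' = A x + B u, with feedback u = -K x.
# Mode 2 is open-loop unstable (flutter-like) and relies on actuator 2.
# All numerical values are illustrative, not from the article.
A = np.array([[  0.0,  1.0,   0.0,  0.0],
              [ -4.0, -0.02,  0.0,  0.0],
              [  0.0,  0.0,   0.0,  1.0],
              [  0.0,  0.0, -25.0,  0.05]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0]])
K = np.array([[0.0, 2.0, 0.0, 0.0],   # actuator 1 damps mode 1
              [0.0, 0.0, 0.0, 5.0]])  # actuator 2 stabilizes mode 2

def is_stable(A, B, K):
    """Closed-loop stability: all eigenvalues of A - B K in the left half-plane."""
    return bool(np.all(np.linalg.eigvals(A - B @ K).real < 0))

B_failed = B.copy()
B_failed[:, 1] = 0.0  # failure of actuator 2: its column no longer acts

print(is_stable(A, B, K))         # nominal closed loop
print(is_stable(A, B_failed, K))  # degraded closed loop
```

Reliability analysis then amounts to enumerating such failure combinations and checking which of them leave the closed loop stable and within performance bounds.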
Thermoelectric and thermomagnetic effects in single-crystal yttrium barium copper oxide
NASA Astrophysics Data System (ADS)
Ghamlouche, Hassan
1998-09-01
Since the discovery of high-critical-temperature superconductors, research has intensified in order to understand the mechanisms at the origin of the properties of these materials. The mixed state, like the pure superconducting state and the normal state, has been the subject of numerous studies. In particular, the structure of the vortices in the mixed state, and their motion under the effect of an applied force, were and remain a central concern. The thermoelectric (Seebeck) and thermomagnetic (Nernst) effects are among the measurements that can provide information on vortex states in the mixed state. The essential advantage of these two effects is the absence of an applied electric current, which can introduce unwanted perturbations during the measurements. Furthermore, we used an AC (alternating current) method for our measurements, which offers better resolution than the conventional DC method. We studied both twinned and twin-free samples. We first tested our set-up at zero magnetic field and showed that the peak reported by some authors in the Seebeck effect at the superconducting transition does not correspond to a physical reality but to an experimental artefact; this peak had been attributed to fluctuations. We then observed and studied for the first time, using the Seebeck and Nernst effects, the phenomenon of vortex-lattice melting, through measurements on twin-free samples. This study was carried out for two different oxygen concentrations and for a temperature gradient parallel, in turn, to each of the two crystallographic axes in the ab plane. Finally, we studied the effect of twin planes on vortex motion.
This was done by applying the temperature gradient along three different directions (0, 45 and 90°) with respect to the twin planes. We observed, for the first angle, free vortex motion; for the second, a contribution of the Nernst effect to the Seebeck effect; and for the third direction, an activation phenomenon, the twin planes forming an obstacle that opposes the motion of the vortices. From the above, we conclude that the good resolution of our technique allows us to observe phenomena that the DC technique does not bring out. Moreover, the variety of samples studied and the phenomena observed add to the value of the present study.
General relativity and gravitation.
NASA Astrophysics Data System (ADS)
Elbaz, E.
Contents: 1. Description of the observable universe. 2. The micro-universe of elementary particles. 3. Tensor analysis. 4. Special relativity. 5. Effects of gravitation in general relativity. 6. Einstein's equations. 7. Gravitational waves. 8. Static, isotropic gravitational field. 9. Stellar structures. 10. Non-static, isotropic gravitational field. 11. Introduction to cosmological dynamics and to the standard model of the very early universe. 12. Current perspectives.
NASA Astrophysics Data System (ADS)
Lopez, L.; Gatti, A.; Maitre, A.; Treps, N.; Gigan, S.; Fabre, C.
2004-11-01
We study the spatial behaviour of the quantum fluctuations at the output of a transverse-mode-degenerate optical parametric oscillator below threshold. With future experiments in view, we examine the effects of diffraction in the parametric medium on the spatial quantum noise. We show that a coherence area of finite size appears for the transverse quantum effects.
2009-07-01
...the authors discuss the implications of the results for theories that postulate an effect of emotion on risk perception and for... effect of global negative emotion on perceived threat. The authors discuss implications of the findings for theories that postulate an effect of... the authors conducted an experimental study to examine the effects of specific (fear and anger) and global emotional states
2003-03-01
...less developed than the experimental vortex. On average, through compensation between the leading-edge region and the region close to... as well as the effects of turbulence anisotropy. A first element of an answer can be obtained by using a somewhat more elaborate turbulence model, such... as the EARSM (Explicit Algebraic Reynolds Stress Model), which takes rotation and anisotropy effects into account [Ref 6]. We will see in paragraph
Superconducting fluctuations in the praseodymium-cerium copper oxide compound
NASA Astrophysics Data System (ADS)
Renaud, Jacques
Ce travail etudie les fluctuations supraconductrices dans le compose supraconducteur a haute temperature critique dope aux electrons Pr2-xCe xCuO4+delta. La technique utilisee pour sonder ces fluctuations est le transport electrique DC dans le plan ab. Il s'agit, a notre connaissance, de la premiere etude de ce type dans la classe generale des supraconducteurs a haute temperature critique dopes aux electrons et, plus particulierement, dans Pr2-xCe xCuO4+delta. De plus, l'etude est effectuee pour trois regimes de dopage, soit sous-dope x = 0.135, dopage optimal x = 0.15 et surdope x = 0.17. Les echantillons etudies sont des couches minces d'epaisseur plus grande que 100 nm crues par ablation laser. Les mesures electriques DC effectuees dans ce travail sont la resistance en reponse lineaire et les courbes IV en reponse non lineaire en fonction de la temperature. La mise en oeuvre experimentale de ces mesures a necessite une grande attention au filtrage et aux effets de chauffage a haut courant. Nous montrons que, sans cette attention, les donnees experimentales sont toujours erronees dans le regime pertinent pour nos echantillons. Les resultats pour le dopage optimal x = 0.15 sont expliques de facon tres convaincante dans le cadre de fluctuations purement 2D. D'abord, le regime des fluctuations gaussiennes est tres bien decrit par le modele d'Aslamazov-Larkin en deux dimensions. Ensuite, le regime de fluctuations critiques, se trouvant a plus basse temperature que le regime gaussien, est tres bien decrit par la physique 2D de Kosterlitz-Thouless. Dans cette analyse, les deux regimes ont des temperatures critiques coherentes entre elles, ce qui semble confirmer ce scenario 2D. Une analyse des donnees dans le cadre de fluctuations 3D est exploree mais donne des conclusions incoherentes. Les resultats pour les autres dopages sont qualitativement equivalents avec le dopage optimal et permettent donc une explication purement 2D. 
However, in contrast to optimal doping, disorder effects appear to be very important. A detailed analysis of all these results suggests that the identified 2D signatures most likely arise from decoupled parallel slabs of roughly 4 coupled CuO2 planes. We discuss this partial ordering as a possible consequence of phase separation between an antiferromagnetic insulator and a superconductor. The width of the transition as a function of doping is also analysed in an attempt to reveal a possible pseudogap effect; we show that our measurements do not support such an interpretation.
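For the Gaussian regime, the 2D Aslamazov-Larkin paraconductivity invoked above has the closed form σ' = e²/(16ħ ε) with ε = ln(T/Tc); a minimal numerical sketch, with a hypothetical Tc that is not the thesis's measured value:

```python
import numpy as np

e = 1.602176634e-19      # elementary charge, C
hbar = 1.054571817e-34   # reduced Planck constant, J*s

def sigma_AL_2d(T, Tc):
    """2D Aslamazov-Larkin sheet paraconductivity (siemens per square).

    sigma' = e^2 / (16 * hbar * eps), with eps = ln(T/Tc).
    Valid only slightly above Tc, in the Gaussian-fluctuation regime.
    """
    eps = np.log(T / Tc)
    return e**2 / (16.0 * hbar * eps)

# Hypothetical Tc for an electron-doped cuprate film
Tc = 20.0
# The paraconductivity diverges as T approaches Tc from above
close, far = sigma_AL_2d(20.5, Tc), sigma_AL_2d(22.0, Tc)
```

Fitting measured excess conductivity against this form (and against Kosterlitz-Thouless behaviour below it) is the kind of consistency check the abstract describes.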
The Ionized Argon Laser: Therapeutic Applications
NASA Astrophysics Data System (ADS)
Brunetaud, J. M.; Mosquet, L.; Mordon, S.; Rotteleur, G.
1984-03-01
The ionized argon laser is a continuous-wave laser, generally operated multiline between 487 and 544 nm. Its radiation is well absorbed by living tissue, acting preferentially on red pigments (hemoglobin, myoglobin) and black pigments (melanin). The argon laser is mainly used therapeutically for its thermal effects: depending on the chosen parameters (optical power, exposed area, exposure time), one obtains either coagulation (optimal tissue temperature 60°-80°) or vaporization (temperature above 100°). If the vaporized zone is very narrow (less than 0.5 mm), a cutting effect is obtained. Compared with the two other lasers also used for their thermal effects (CO2 and Nd:YAG), argon is intermediate: coagulation is shallower than with the Nd:YAG, and vaporization deeper than with the CO2. When cutting, the necrosis along the edges is likewise greater than with the CO2.
Generalized Josephson effects between antiferromagnets and between antiferromagnetic superconductors
NASA Astrophysics Data System (ADS)
Chasse, Dominique
The Josephson effect is usually presented as the result of coherent tunnelling of Cooper pairs across a tunnel junction between two superconductors, but it can be explained in a more general framework. For example, Esposito et al. recently showed that the DC Josephson effect can be described in terms of the pseudo-Goldstone boson of two coupled systems that each break the abelian U(1) symmetry. Since this description generalizes naturally to broken non-abelian continuous symmetries, an analogue of the Josephson effect should exist for types of long-range order other than superconductivity. The case of two itinerant ferromagnets (broken O(3) symmetry) coupled through a tunnel junction has already been treated in the literature. To highlight the generality of the phenomenon, and in order to make predictions from a realistic model, we study a tunnel junction between two itinerant antiferromagnets. Following an approach similar to that of Ambegaokar & Baratoff for a Josephson junction, we find a staggered-magnetization current across the junction proportional to sG × sD, where sG and sD are the Neel vectors on either side of the junction. The sine characteristic of the standard Josephson current is thus replaced here by a cross product. We show that, microscopically, this phenomenon results from the coherent tunnelling of spin-1 particle-hole pairs with net wave vector equal to the antiferromagnetic wave vector Q. We also obtain the temperature dependence of the analogue of the critical current. In the presence of an external magnetic field we obtain the analogue of the AC Josephson effect, and the complete description we give of it also applies to a tunnel junction between ferromagnets (in that case, earlier treatments of this AC effect turn out to be incomplete).
We then consider a tunnel junction between two materials in which itinerant antiferromagnetism and d-wave superconductivity coexist homogeneously. We again obtain a staggered-magnetization current proportional to sG × sD, but the amplitude of the analogue of the critical current is modulated by the Josephson energy of the junction, E ∝ cos Δφ, where Δφ is the phase difference between the two superconducting order parameters. Symmetrically, the superconducting Josephson current is proportional to sin Δφ, but the critical current is modulated by the coupling energy between the staggered magnetic moments, ES ∝ sG · sD.
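The two reciprocal modulation relations stated above can be written out directly; a minimal numpy sketch in which the amplitudes Is_max and Ic_max and the unit Neel vectors are hypothetical placeholders, not quantities from the thesis:

```python
import numpy as np

def staggered_spin_current(sG, sD, Is_max, dphi):
    """Staggered-magnetization ('spin') current across the junction.

    The sine of the standard Josephson relation is replaced by the
    cross product of the two Neel vectors; in the coexisting AF/d-wave
    case the amplitude is modulated by cos(dphi).
    """
    return Is_max * np.cos(dphi) * np.cross(sG, sD)

def charge_current(Ic_max, dphi, sG, sD):
    """Charge supercurrent: sin(dphi), modulated by sG . sD."""
    return Ic_max * np.dot(sG, sD) * np.sin(dphi)

sG = np.array([1.0, 0.0, 0.0])
sD = np.array([0.0, 1.0, 0.0])
# Perpendicular Neel vectors maximize the staggered spin current;
# parallel ones (cross product zero) carry none.
spin_I = staggered_spin_current(sG, sD, 1.0, 0.0)
```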
Effect of carbon content on the fatigue crack propagation resistance of CA6NM
NASA Astrophysics Data System (ADS)
Akhiate, Aziz
Improving the performance of hydraulic turbine runners has been the subject of several studies. In this thesis we examined the effect of the carbon content of the CA6NM base material of turbine blades on the microstructure and mechanical properties in general, and on fatigue crack propagation behaviour in particular. To that end, three steel grades were chosen: low (0.018% C), medium (0.033% C) and high (0.067% C) carbon content. The 0.018% C and 0.067% C grades were produced by the same manufacturer but underwent different heat treatments; the 0.033% C grade was cast by another manufacturer and heat-treated differently from the other two. To erase the history of the prior heat treatments and obtain the same prior austenite grain (PAG) size, the three grades were austenitized at the same temperature, 1040°C, for durations depending on carbon content. Once the PAG size was homogenized, tempering was carried out. Carbon influences the tempered microstructure, notably the amount of reversed austenite and its thermal and mechanical stability. To isolate the effect of carbon content from effects tied to the tempering temperature, in particular the formation of reversed austenite, two tempering temperatures were chosen. The first, 550°C for 2 h, was applied to all three materials to obtain a microstructure free of reversed austenite. The second, 610°C, was chosen to obtain a maximal and identical amount of reversed austenite without fresh martensite.
In the light of these two microstructures, we can relate the mechanical properties of CA6NM to the carbon content, and assess the effect of the presence or absence of reversed austenite on those properties. The results show that the effect of carbon content is most significant on the monotonic tensile and impact properties of the alloys without reversed austenite: the yield strength, tensile strength and hardness increase with carbon content, whereas the elongation decreases. Moreover, increasing the tempering temperature and duration decreases the hardness.
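The fatigue crack propagation resistance named in the title is conventionally analysed with the Paris law, da/dN = C(ΔK)^m; a generic sketch with purely illustrative constants (the thesis's fitted values are not reproduced here):

```python
def paris_rate(delta_K, C, m):
    """Fatigue crack growth per cycle, da/dN = C * (delta_K)**m.

    delta_K: stress-intensity-factor range (MPa*sqrt(m))
    C, m:    material constants fitted from da/dN vs delta_K data.
    """
    return C * delta_K**m

# Hypothetical illustrative constants for a martensitic stainless steel
C, m = 1e-11, 3.0
rate_low = paris_rate(10.0, C, m)   # growth rate at delta_K = 10
rate_high = paris_rate(20.0, C, m)  # doubling delta_K scales da/dN by 2**m
```

Comparing fitted C and m across the three carbon contents is one standard way to quantify the kind of resistance difference the thesis investigates.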
2009-05-01
Whatever the context, decision support requires an in-depth analysis of three (3) important, interdependent aspects, namely the... ...this type of threat indeed requires adopting a collective approach to security, extended to cooperation with multiple civilian organizations
NASA Astrophysics Data System (ADS)
Nédellec, P.; Dumoulin, L.; Burger, J. P.; Bernas, H.; Köstler, H.; Traverse, A.
1993-11-01
Metastable MgHx hydride was prepared by H ion implantation into Mg films at 5 K. The temperature dependence of the resistivity and magnetoresistance reveals weak-localization effects due to atomic disorder. At low hydrogen concentrations, x ≤ 0.3, the conductivity varies as σ ∼ log(T), typical of two-dimensional weak-localization behaviour. The resistivity is also very sensitive to sample inhomogeneity, caused by H diffusion, which can be modelled by introducing a temperature-dependent geometrical percolation factor G. At higher H concentrations, 0.7 ≤ x ≤ 3, after annealing at 20 K, 50 K and 110 K, the samples also exhibit weak localization, but with three-dimensional behaviour, i.e. σ ∼ √T. Our analysis is consistent with the existence of an inhomogeneous system formed by a mixture of two phases with contrasting conduction properties, one of which is a well-behaved metal, while the other displays the localization properties. The results lead us to identify the former phase with a phase that is superconducting, but non-percolating, at low temperature.
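The two temperature dependences used above to assign dimensionality can be sketched as simple fitting forms; prefactors and the residual conductivity are arbitrary placeholders:

```python
import numpy as np

def sigma_wl_2d(T, sigma0, a):
    """2D weak-localization correction: conductivity grows as log(T)."""
    return sigma0 + a * np.log(T)

def sigma_wl_3d(T, sigma0, b):
    """3D weak-localization correction: conductivity grows as sqrt(T)."""
    return sigma0 + b * np.sqrt(T)

# Temperature span comparable to the annealing range quoted above
T = np.linspace(5.0, 110.0, 50)
# Both corrections increase with T, but with different curvature,
# which is what distinguishes the x <= 0.3 and 0.7 <= x <= 3 samples.
s2 = sigma_wl_2d(T, 100.0, 1.0)
s3 = sigma_wl_3d(T, 100.0, 1.0)
```

In practice one fits σ(T) against both forms and keeps the one with the smaller residual, which is essentially the discrimination described in the abstract.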
Pringsheim, Tamara; Panagiotopoulos, Constadina; Davidson, Jana; Ho, Josephine
2012-01-01
BACKGROUND: In Canada, the use of antipsychotics, particularly second-generation antipsychotics (SGAs), has increased markedly over the past five years among children with mental health disorders. Used chronically, these drugs can cause serious metabolic and neurological complications. OBJECTIVE: To synthesize the evidence on the specific metabolic and neurological side effects associated with SGA use in children, and to provide evidence-based recommendations on monitoring these side effects. METHODS: The authors carried out a systematic review of controlled clinical trials of SGAs in children. Recommendations on monitoring SGA safety were made according to a classification scheme based on the GRADE system (Grading of Recommendations Assessment, Development and Evaluation). Where the evidence was insufficient, recommendations were based on consensus and expert opinion. A multidisciplinary consensus group reviewed all the relevant evidence and reached consensus on the recommendations. RESULTS: Evidence-based recommendations on monitoring SGA safety are given in these guidelines. The authors grade the recommendations for specific physical examinations and laboratory tests for each SGA at set time points. CONCLUSION: Multiple randomized controlled trials have assessed the efficacy of many of the SGAs used to treat paediatric mental health disorders. Their benefits, however, are not without risks: both metabolic and neurological side effects are observed in children treated with SGAs.
The risk of weight gain, increased body mass index and abnormal lipid levels is highest with olanzapine, followed by clozapine and quetiapine. The risk of neurological side effects is highest with risperidone, olanzapine and aripiprazole. Appropriate monitoring of these side effects will improve the quality of care of children treated with these drugs. PMID:24082813
Millogo, Georges Rosario Christian; Zongo, Ragomzingba Frank Edgard; Benao, Anita; Youl, Estelle Noëla Hoho; Bassoleth, Blaise Alexandre Bazona; Ouédraogo, Moussa; Zabsonré, Patrice; Guissou, Innocent Pierre
2018-01-01
Drug treatment of arterial hypertension (HTN) causes adverse effects that can be bothersome and thereby affect patient adherence. We studied these adverse effects in the cardiology department of the Centre hospitalier universitaire Yalgado Ouédraogo in order to determine their frequency and characteristics. This was a cross-sectional study, from July to September 2015, of outpatients followed for HTN. Data were obtained from interviews, patient follow-up booklets and consultation records. In total, 278 patients were included; 69.1% were women. Mean age was 52.2 years, with extremes of 23 and 86 years. 87.8% lived in an urban area. Smoking, dyslipidaemia and a family history of HTN accounted for 9%, 35.6% and 57.2%, respectively. Therapeutically, 43.2% were on monotherapy and 35.6% on dual therapy at treatment initiation. Calcium channel blockers (59.7%) were the most widely used drug class. The overall prevalence of adverse effects was 60.1%. Calcium channel blockers were implicated in 53.6% of adverse events, followed by diuretics (48.6%). The prevalence by molecule was 28.1% for amlodipine and 24.5% for hydrochlorothiazide. Excessive diuresis (13.7%), cough (12.9%) and dizziness (11.5%) were the adverse effects most frequently reported by patients. The central and peripheral nervous system and the musculoskeletal system were the most affected. Adverse effects are a major determinant of adherence to antihypertensive treatment, since their impact on patients' daily lives can be significant. PMID:29875965
Observation of the atom-surface interaction in a submicrometric vapour cell
NASA Astrophysics Data System (ADS)
Dutier, G.; Saltiel, S.; Bloch, D.; Ducloy, M.; Papoyan, A.; Sarkisyan, D.
2002-06-01
In a vapour cell of submicrometric thickness (300 nm), the linear absorption spectra turn out to be almost insensitive to the Doppler effect (transient effects strongly favour slow atoms) and reveal the effects of the long-range van der Waals atom-surface interaction. The study, first carried out on the Cs D1 resonance line, is extended to a two-photon transition to the Cs 6D3/2 level, resonant with the YAG surface of the window. It opens various prospects, notably the detection of states bound by a surface-induced potential well.
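The van der Waals atom-surface interaction responsible for the observed line shifts scales as -C3/z³; a sketch with an arbitrary, purely illustrative C3 coefficient:

```python
def vdw_shift(z_nm, C3=2.0):
    """van der Waals atom-surface energy shift, -C3 / z**3.

    z_nm: atom-surface distance in nm. C3 is given here in arbitrary
    units; the value is illustrative, not a measured Cs-YAG coefficient.
    """
    return -C3 / z_nm**3

# Halving the distance makes the shift 8x stronger, which is why a
# submicrometric cell resolves the interaction while a macroscopic
# cell averages it away.
ratio = vdw_shift(150.0) / vdw_shift(300.0)
```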
2008-12-01
Prof. Karin Harms-Ringdahl, PhD, RPT, Karolinska Institutet, Department of Neurobiology, Care Sciences, and Society, Division of Physiotherapy, Alfred Nobels Allé 23100, SE-14183...report is in preparation. The RAF has an ongoing project (from August 2006 to September 2007) determining the need for physiotherapy for aircrew on the
NASA Astrophysics Data System (ADS)
Valentin, Olivier
According to the World Health Organization, the number of workers exposed daily to noise levels harmful to their hearing rose from 120 million in 1995 to 250 million in 2004. Even though noise reduction at the source should always be preferred, individual hearing protection remains the most widely used defence against workplace noise. Unfortunately, workers do not always wear their hearing protectors, because it is difficult to supply a protector whose effective attenuation level is appropriate to an individual's work environment. Moreover, occluding the ear canal alters speech perception, creating a discomfort that prompts workers to remove their protectors. Both problems exist because current methods for measuring the occlusion effect and the attenuation are limited. Objective methods based on in-ear microphone measurements do not account for direct sound transmission to the cochlea through bone conduction, while subjective threshold-of-hearing measurements are biased by the low-frequency masking effect induced by physiological noise. The main objective of this doctoral work is to improve the measurement of the attenuation and of the occlusion effect of in-ear hearing protectors. The general approach is to: (i) determine whether hearing protector attenuation can be measured by recording multiple auditory steady-state evoked potentials (PEASM) with and without the protector (protocol 1), (ii) adapt this methodology to measure the occlusion effect induced by wearing in-ear protectors (protocol 2), and (iii) validate each protocol through measurements on human subjects.
The results of protocol 1 show that PEASM can be used to measure hearing protector attenuation objectively: at 500 Hz and 1 kHz, the attenuation measured from PEASM is essentially equivalent to that computed by the REAT method, as expected, since the masking effect induced by physiological noise is largely negligible at these frequencies. The results of protocol 2 show that PEASM can also be used to measure the occlusion effect objectively: the occlusion effect measured from PEASM at 500 Hz is higher than that computed by the subjective threshold-of-hearing method, again as expected, since below 1 kHz the low-frequency masking induced by physiological noise biases the subjective results by overestimating the hearing thresholds when protectors are worn. The results obtained at 250 Hz, however, contradict expectations. Scientifically, this thesis delivered two innovative methods for objectively measuring the attenuation and the occlusion effect of in-ear hearing protectors by electroencephalography. From an occupational health and safety standpoint, the advances presented here could help design better-performing hearing protectors.
Indeed, if these two new objective methods were standardized for characterizing in-ear hearing protectors, they could (i) give a better grasp of the real effectiveness of hearing protection and (ii) provide a measure of the discomfort induced by occlusion of the ear canal while wearing protectors. Supplying a hearing protector whose real effectiveness is suited to the work environment, and whose comfort is optimized, would in the long run improve workers' conditions by minimizing the risk of damage to their hearing. The avenues proposed at the end of this thesis are mainly to: (i) exploit both methods over a wider frequency range, (ii) explore the intra-individual variability of each method, (iii) compare the results of the two methods with those obtained by the "Microphone in Real Ear" (MIRE) method, and (iv) verify the compatibility of each method with all types of hearing protectors. In addition, for the PEASM-based occlusion-effect measurement, a complementary study is needed to resolve the contradiction observed at 250 Hz.
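Both quantities at stake in these protocols reduce to level differences between two measurement conditions; a sketch with hypothetical dB values, not data from the thesis:

```python
def attenuation_dB(level_open_dB, level_protected_dB):
    """Attenuation: reduction in received level provided by the
    protector (REAT-style: open-ear threshold vs protected threshold)."""
    return level_open_dB - level_protected_dB

def occlusion_effect_dB(bc_level_occluded_dB, bc_level_open_dB):
    """Occlusion effect: boost of bone-conducted sound when the ear
    canal is occluded, per the usual level-difference definition."""
    return bc_level_occluded_dB - bc_level_open_dB

# Hypothetical 500 Hz numbers
att = attenuation_dB(70.0, 45.0)      # protector removes 25 dB
oe = occlusion_effect_dB(58.0, 50.0)  # occlusion adds 8 dB at low frequency
```

The thesis's point is that estimating these same differences from PEASM responses avoids the physiological-noise masking that biases the subjective threshold version below 1 kHz.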
Boundary Layer Simulation and Control in Wind Tunnels
1988-04-01
...state of the art in boundary-layer simulation, where the Reynolds number is not or cannot be matched, and examines the effects ...for the definition of certain wind-tunnel tests in which viscous effects are of particular importance. ...transition associated with cylindrical bodies at high incidence in subsonic flow. Other relevant references are given therein. Figures 11 a-b, from Ref
1985-11-01
...vortices with their axis perpendicular to the main flow) shed from an airfoil oscillating under dynamic-stall conditions ...aerodynamic ... From this experimental analysis, an attempt at theoretical modelling of the nonlinear effects observed at ...wall shear of an airfoil undergoing harmonic motion parallel or perpendicular to the undisturbed flow", EUROMECH
Bauschinger effect during cyclic plasticity of pure single-crystal aluminium
NASA Astrophysics Data System (ADS)
Alhamany, A.; Chicois, J.; Fougères, R.; Hamel, A.
1992-08-01
This paper is concerned with the study of the microscopic mechanisms which control the cyclic deformation of pure aluminium, and especially with the analysis of the Bauschinger effect which appears in aluminium single crystals deformed by cyclic straining. Fatigue tests are performed on Al single crystals with the crystal axis parallel to [1̄23] at room temperature, at plastic shear strain amplitudes in the range 10^{-4} to 3×10^{-3}. Mechanical saturation is not obtained at any strain level; instead, a hardening-softening-secondary-hardening sequence is found. The magnitude of the Bauschinger effect, taken as the difference between the yield stresses in traction and in compression, changes all along the fatigue loop and during the fatigue test. The Bauschinger effect disappears at two points of the fatigue loop, one in the traction part, the other in the compression part; on either side of these points, its sign is inverted. The evolution of the dislocation arrangement with fatigue conditions can explain the cyclic behaviour of Al single crystals. A heterogeneous dislocation distribution is observed in the cyclically strained metal: dislocation tangles, long dislocation walls and dislocation cell walls, separated by dislocation-poor channels, appear in the material as a function of cycle number. The long-range internal stress required to ensure compatibility of deformation between the hard and soft regions controls the observed Bauschinger effect; together with the evolution of the amount of dislocation cells during fatigue, it also accounts for the secondary hardening.
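The Bauschinger-effect magnitude used above is simply the mismatch between the forward and reverse yield stresses measured on the same loop; a one-line sketch with hypothetical stress values:

```python
def bauschinger_magnitude(sigma_y_tension, sigma_y_compression):
    """Bauschinger effect as defined in the abstract: difference between
    the yield stress in traction and in compression along the fatigue
    loop. Zero marks the two loop points where the effect vanishes and
    changes sign.
    """
    return sigma_y_tension - sigma_y_compression

# Hypothetical shear yield stresses in MPa
be = bauschinger_magnitude(12.0, 9.5)
```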
Are HCV-positive haemodialysis patients really difficult to treat?
Krati, Khadija; Cherquaoui, Hind; Oubaha, Sofia; Samlani, Zouhour
2015-01-01
Hepatitis C remains the main viral infection in haemodialysis patients, and both its treatment and the management of treatment side effects remain difficult. This retrospective study covered all patients with chronic hepatitis C and chronic renal failure on haemodialysis followed in the gastroenterology department of CHU Mohamed VI in Marrakech between January 2004 and December 2014. Of 355 cases of viral hepatitis C, 13 patients were on haemodialysis (3.66%). Ten patients were treated, i.e. 76.94% of cases. Treatment was not indicated in 2 patients with minimal fibrosis and no cytolysis, and was contraindicated in one patient with multiple comorbidities. Two patients achieved a rapid virological response and 5 an early virological response. The sustained virological response rate was 40%; 30% of patients were non-responders. Treatment was stopped in 2 patients because of severe side effects. One patient became a candidate for kidney transplantation. In multivariate analysis, sustained virological response was significantly associated with several predictors of good therapeutic response: young age ≤40 years (P=0.0057), minimal fibrosis F1-F2 (P=0.03), non-1 genotype (P=0.0064), pre-treatment viral load <800000 IU/ml (P=0.013), and absence of treatment interruption (P=0.028). Effective management of the side effects of HCV treatment yields a sustained virological response rate in haemodialysis patients close to that of the general population. PMID:27022433
Kabongo, Joe Katabwa; Kaputu-Kalala-Malu, Célestin; Luboya, Oscar; Mutombo, Valerien; Ntambwe, Abel; Mapatano, Mala Ali; Mukendi, Kavulu Mayamba
2015-01-01
Introduction: To improve the management of patients with neuropathy (NP) associated with HIV infection, we sought to determine the clinical profile of people with NP during the therapeutic follow-up of their HIV infection. Methods: This is a cross-sectional study (n=101) conducted at the centre of excellence over one year. Our analysis is essentially clinical: through careful clinical examination we looked for all symptoms and clinical signs of NP. Subjectively, pain dominates the picture. To refine the diagnosis we used the DN4 scale (neuropathic pain diagnosis) and the EVA visual analogue scale (pain severity assessment). We then analysed our data against other epidemiological factors such as CD4 count and anti-HIV treatment. Results: The 101 patients represent 3.12% of the overall cohort; 53.3% of patients show abolished deep tendon reflexes in the lower limbs; 77.89% show stocking-and-glove thermoalgesic hypoesthesia; 25% presented lower-limb amyotrophy; 76.5% had received an antiretroviral regimen containing stavudine; 11.7% had taken didanosine (DDI) and abacavir (ABC); 84% have a mean CD4 count of 292 cells/mm3. Conclusion: NP impairs our patients' quality of life and reduces adherence to antiretroviral treatment. Several factors are implicated in the occurrence of NP: the direct effect of antiretrovirals, dysimmune inflammatory effects, and infectious effects linked to opportunistic infections. Other factors will be investigated and analysed later. PMID:26185582
1983-04-01
Convention on International Civil Aviation, Second Edition, March 1966. 5. WORLD AIRLINE ACCIDENT SUMMARY. Civil Aviation Authority, (Great Britain)...people who either provided information, or who suggested other sources of information for the current edition of this survey. E.M.R. Alexander Civil...Waverley, New Zealand. F-28C tail rotor drive shaft. Fatigue strength reduced by softened condition & surface decarburisation. AISI 4130 steel. Ref: NZ
2006-02-01
...effects of different backpack frames (straight frames, curved frames and no frame) on the... ...above the tolerance threshold of the skin and underlying muscles, and could cause skin lesions and bruising. Assessment... the present evaluation examined this practice and aimed to determine the effects of different backpack frames (straight frames, curved frames
1983-04-01
Bureau of Standards. NTSB National Transportation Safety Board (USA). NTSB AAR NTSB Aircraft Accident Report. NZ AAR New Zealand Aircraft Accident Report. NZ AI New Zealand Accident Investigation Bureau. RAN Royal Australian Navy. RAAF Royal Australian Air Force. RAF Royal Air Force, UK. S Substantial... Iceland, Iraq, Ireland, Jamaica (1966-1981), Japan (1973 - Feb. 81), Kenya, Lesotho, Malawi, Malaysia, Malta, Mexico, Netherlands, New Zealand, Norway
Villeval, Mélanie; Bidault, Elsa; Lang, Thierry
2016-09-01
Health Impact Assessment (HIA) is developing internationally and is still at an emerging stage in France. It aims to assess the potential positive and negative health effects of a project, programme or policy, and to produce recommendations for decision-makers so as to maximize the positive effects and reduce the negative ones. HIA is a particularly interesting way of acting on the determinants of health beyond individual behaviour and the healthcare system: housing, transport, welfare and economic policies often have unanticipated health impacts. Beyond the health effects themselves, HIA must also assess how those effects are distributed across the population. While concern for health equity is central to HIA, it remains difficult to translate into practice. In response, impact-assessment approaches have been developed to strengthen the consideration of equity at each stage of HIA ("Equity-Focused Health Impact Assessment"), or to address impacts on health inequalities specifically. Health Equity Impact Assessment (HEIA), for example, appears particularly well suited to assessing the impact of healthcare-sector projects on inequalities. HIA and HEIA raise many research questions, notably around bringing together, within a single process, policy-makers, citizens and experts. Participation of the vulnerable populations potentially affected by the policy under assessment is a core value of HIA, but raises questions of social acceptability. Collaboration with policy-makers is also a major challenge, and methodological difficulties, notably in quantifying impacts, can hinder promotion of the approach to decision-makers. © The Author(s) 2015.
Analysis of electrical property changes of skin by oil-in-water emulsion components
Jeong, CB; Han, JY; Cho, JC; Suh, KD; Nam, GW
2013-01-01
Synopsis Objectives As the 'Dry Skin Cycle' produces continuous deterioration, cosmetic xerosis (flaky, dry skin) is one of the major concerns for most consumers. The purpose of this study was to investigate the moisturizing effect of oil-in-water (O/W) emulsion components. Numerous types of oils, waxes, polyols, and surfactants are used as ingredients in skincare products. However, the moisturizing effect of each ingredient, and how each should be used to make an effective moisturizing product, are still not well understood. Methods To answer these questions, we investigated the moisturizing effect of 41 widely used components (four different classes) in a simple O/W emulsion using capacitance methods. 106 different single oils, and combinations of oil with oil, wax, humectants, and surfactant, were formulated and tested. Results In this study, we found that most of the O/W emulsion components had hydration effects on the skin. (i) The average relative water content increase (RWCI) rate of a single oil-based emulsion was 11.8 ± 5.2% (SE) and 7.9 ± 6.0% (SE) at 3 and 6 h, respectively. (ii) An oil combination emulsion showed an average RWCI rate similar to that of a single oil-based emulsion, 12.6 ± 6.0% (SE) and 12.1 ± 6.4% (SE) at 3 and 6 h, respectively. (iii) A combination of waxes with oil showed an average RWCI rate of 16 ± 5.6% (SE) and 12.4 ± 4.5% (SE) at 3 and 6 h, respectively. (iv) Humectant combinations showed the highest average RWCI rates, 28 ± 7.3% (SE) and 22.2 ± 7.5% (SE) at 3 and 6 h, respectively. (v) Surfactant combinations had an average RWCI of 10.8 ± 4.5% (SE) and 6.0 ± 4.0% (SE) at 3 and 6 h, respectively. Conclusion Interestingly, it was difficult to find differences in moisturizing power among samples in the same group. Only the humectants group showed significant differences among samples. Glycerine and urea showed significant skin hydration effects compared with other humectants.
We also found a significant moisturizing effect when analysing chemical functional groups: in humectant combinations, the amide class had a higher hydration effect than betaines and disaccharides. PMID:23621673
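The RWCI figures quoted above are relative changes in measured skin water content. A minimal sketch of how such a percentage can be computed from before/after capacitance readings (the formula's exact definition and the readings below are illustrative assumptions, not values from the paper):

```python
def rwci_percent(baseline, after):
    """Relative water content increase (%) from capacitance readings.

    Assumed definition: 100 * (after - baseline) / baseline.
    """
    return 100.0 * (after - baseline) / baseline

# hypothetical corneometer readings before and 3 h after application
rwci = rwci_percent(40.0, 44.7)  # ≈ 11.75, near the ~11.8% single-oil average
```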
The beneficial effects of supporting cancer patients: particularities of Morocco
Lkhoyaali, Sihame; Aitelhaj, Meryem; Errihani, Hassan
2014-01-01
In Morocco, the majority of elderly cancer patients are cared for by their relatives. Supporting these patients gives rise to negative emotional, psychological, and financial consequences, but in return it produces several beneficial effects, namely a strengthening of family ties, increased self-esteem, and a sense of emotional and spiritual well-being that helps in coping with the disease. PMID:25722766
2005-08-01
Summary Studies of the effects of the contact pressure produced by a backpack load are essential, being... soldiers' mobility and diminish their combat capability. To better understand the effects of backpack pressure on comfort and the... pressure measurement that could be used to establish skin pressure-tolerance values during load carriage
NASA Astrophysics Data System (ADS)
Robichaud, Luc
Vacuum fluctuations, which consist of the momentary appearance of particles, as permitted by the Heisenberg uncertainty principle, play a fundamental role in photonic processes, in particular nonlinear processes. By manipulating these vacuum fluctuations with optical confinement, two distinctive phenomena arise: the enhancement of parametric fluorescence (Walker, 2008) and the inhibition of second-harmonic generation (Collette, 2013). In this work, we first present results for the classical case, that is, without vacuum fluctuations or confinement. We then present the effects of vacuum fluctuations and confinement, which lead to the two phenomena mentioned. For parametric fluorescence, the quantum noise on the internal and external fields is calculated, the role of phase mismatch in the model is laid out, and a three-dimensional generalization is studied in order to extend the model from a one-dimensional case to a planar three-dimensional one. For second-harmonic generation, the difficulties of a fully three-dimensional model are described, and the planar limiting case is then studied.
2005-05-01
Summary Soldiers' operational performance depends on a number of factors, including physiological workload, the effects ... particularly to physiological workload and biomechanical effects, and to refine the development of a dynamic biomechanical model (DBM... a skin layer with appropriate properties was created for the torso model, and all of the components were modelled
Comparison of the effects of γ, X, and UV irradiation in optical fibers
NASA Astrophysics Data System (ADS)
Girard, S.; Ouerdane, Y.; Baggio, J.; Boukenter, A.; Meunier, J.-P.; Leray, J.-L.
2005-06-01
Optical fibers offer numerous advantages that encourage their integration into applications that must withstand the radiative environments associated with the civil, space, and military domains. However, exposing them to radiation creates point defects in the pure or doped amorphous silica that makes up the various parts of the fiber. These defects cause, in particular, a transient increase in the linear attenuation of the fibers, responsible for the degradation, or even the loss, of the signal they carry. In this article, we compare the effects of two types of irradiation: an X-ray pulse and a cumulative γ dose. The effects of these irradiations are then compared with those induced by ultraviolet exposure (244 nm) on the absorption properties of optical fibers. We show that there are similarities between these different excitations and that it is possible, under certain conditions, to use them to assess the ability of certain optical fibers to operate in a given nuclear environment.
Antisuperconductors: Properties of Layered Compounds with Coupling
NASA Astrophysics Data System (ADS)
Carton, J.-P.; Lammert, P. E.; Prost, J.
1995-11-01
In this note, we consider properties of a hypothetical superconductor composed of Josephson-coupled microscopic layers with tunneling energy minimized at a phase difference of π. The non-zero phase offset in the ground state engenders an intriguing interplay between the superconductive ordering and structural lattice defects. Unusual magnetic properties are expected in the case of highly disordered crystals, which are consistent with observations of a “paramagnetic Meissner” or “Wohlleben” effect in high-T_c cuprate superconductors.
NASA Astrophysics Data System (ADS)
Hoang, Long Phan; Sacovy, Paulette; Delaplace, Jean
1983-02-01
Ribbons of as-quenched Metglas amorphous alloys of the Fe40Ni38Mo4B18 type were deformed in tension at room temperature, and the variation of the samples' electrical resistance was followed during deformation. These tests show that the plastic deformation, which is on the order of 0.5% before fracture, does not occur homogeneously in the sample. Electrical measurements taken during deformation reveal, in the elastic regime, a relatively large elastoresistance effect (K ≠ 1); they show that in the plastic regime the permanent deformation of the samples is accompanied by a decrease in the electrical resistivity of the material. Two hypotheses are discussed to explain this unexpected effect: one invokes the existence of free volumes, the other assumes localized crystallization of the material under stress.
Gold nanoparticles: from magnetic resonance imaging to radiosensitization
NASA Astrophysics Data System (ADS)
Hebert, Etienne M.
This thesis extends the study of 5 nm diameter gold nanoparticles coated with gadolinium diamide-ethanethiol-diethylenetriaminepentaacetate (DTDTPA:Gd), a contrast agent for magnetic resonance imaging (MRI). For passive targeting, the size of the nanoparticles was controlled so as to exploit the porous, permeable neovessel network of tumors. In addition, tumors have deficient lymphatic drainage, which allows nanoparticles to remain longer in the tumor's interstitial space. The experiments were performed on female Balb/c mice bearing MC7-L1 tumors. The nanoparticle concentration could be measured by MRI in vivo. The maximum concentration was reached at the end of the 10 min infusion, amounting to 0.3 mM in the tumor and 0.12 mM in the surrounding muscle. The nanoparticles were eliminated with a half-life of 22 min in the tumors and 20 min in the surrounding muscle. The nanoparticles were functionalized with the Tat peptide to give them active-targeting properties. The retention of these nanoparticles was thereby increased by 1600%, the elimination half-life rising from 22 min to 350 min. Mouse survival was measured using Kaplan-Meier curves, and a mathematical model evaluates treatment efficacy. The model allows us, from the tumor growth rate and treatment efficacy, to calculate the survival curve of the specimens. An antagonistic effect was observed instead of the expected synergy between an infusion of Au@DTDTPA:Gd and X-ray irradiation. The absence of synergy was attributed to the thickness of the DTDTPA:Gd coating, which screens the electrons produced by the gold. Moreover, the coating is anchored with thiols, which may act as radical scavengers.
Furthermore, contrary to expectations, a chemotherapeutic effect of these nanoparticles was observed in vitro and in vivo. The precise mechanism of this effect remains to be explained, but it is already known that gold nanoparticles affect macrophage function as well as angiogenesis. KEYWORDS: Radiosensitizer, Gold nanoparticles, MRI contrast agent, Low-energy electrons, Kaplan-Meier, Chemotherapeutic effect.
NASA Astrophysics Data System (ADS)
Maouche, B.; Feliachi, M.
1997-10-01
In this paper, a study of the interaction between the inductor and the load of an axisymmetric induction device is proposed. This interaction concerns the effect of the eddy currents on both the excitation current and the system impedance. The inductor has an axisymmetric geometry, of solenoidal or pancake form, intended for induction heating. A semi-analytical model, based on a numerical discretization of the electromagnetic solution domain, is used. In each cell of the numerical discretization, an analytical calculation using the Moment Method (MM; also termed the Coupled Circuit Method, MCC) is considered. In the case of strong skin effect (high frequency, HF), the formulation makes use of the Impedance Boundary Condition (IBC); otherwise (low frequency, LF), the interior domain is discretized.
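The strong/weak skin-effect regimes mentioned above are conventionally judged against the skin depth δ = √(2/(μσω)). A small sketch of that standard formula and the resulting HF/LF choice (the copper parameters and the thickness-ratio threshold are illustrative assumptions, not from the paper):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def skin_depth(conductivity, frequency, mu_r=1.0):
    """Skin depth delta = sqrt(2 / (mu * sigma * omega)), in metres."""
    omega = 2 * math.pi * frequency
    return math.sqrt(2.0 / (mu_r * MU0 * conductivity * omega))

def use_impedance_boundary(thickness, conductivity, frequency, ratio=5.0):
    """Heuristic: when the conductor is much thicker than the skin depth,
    the skin effect is strong (HF) and an Impedance Boundary Condition is
    reasonable; otherwise the interior must be discretized (LF)."""
    return thickness > ratio * skin_depth(conductivity, frequency)

# 5 mm copper plate: IBC reasonable at 100 kHz, not at 50 Hz
sigma_cu = 5.8e7  # S/m
hf = use_impedance_boundary(5e-3, sigma_cu, 1e5)   # True
lf = use_impedance_boundary(5e-3, sigma_cu, 50.0)  # False
```

The `ratio=5.0` cutoff is an arbitrary engineering margin; the physical quantity is the thickness-to-δ ratio itself.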
Brousselle, Astrid; Morales, Cristián
2013-01-01
Abstract New HIV/AIDS drugs have created treatment-access needs that governments have not always managed to cover. The result is the emergence of an informal market for ARVs. Through an analysis of the situation in Chile, we discuss the various supply channels, the consequences of the existence of such a market, and possible means of reducing its undesirable effects. Both microeconomic and macroeconomic aspects of the market and of drug accessibility are addressed. PMID:23997580
NASA Astrophysics Data System (ADS)
Samson, Thomas
We propose a method for obtaining an expression for the Hall conductivity of two-dimensional electronic structures, and we examine it in the zero-temperature limit in order to verify the quantum Hall effect. We are mainly interested in the integer quantum Hall effect and in fractional effects below one. The system considered consists of an electron gas in weak interaction with the impurities of the sample. The electron-gas model is a two-dimensional gas of spinless electrons exposed perpendicularly to a uniform magnetic field, described by the vector potential \vec{A} defined in the Dingle, or symmetric, gauge. Following the second-quantization formalism, the Hamiltonian of this gas is represented in the basis of the one-body Dingle states |n,m⟩ and thus expressed in terms of the corresponding creation and annihilation operators a†_{nm} and a_{nm}. We further assume that the electrons of the lowest Dingle level interact with one another via the Coulomb potential. The method relies on an N-body master equation, quantum and statistical in nature, satisfying the second law of thermodynamics. From it, we obtain a system of differential equations called the quantum hierarchy of equations, whose solution allows us to determine a one-body equation, the quantum Boltzmann equation, dictating the evolution of the statistical average of the off-diagonal operator a†_{nm} a_{n′m′} under the applied electric field \vec{E}(t). Its solution, Tr(ρ(t) a†_{nm} a_{n′m′}), defines the convolution relation between the Hall current density \vec{J}_H(t) and the electric field \vec{E}(t), and the Laplace-Fourier transform of the kernel provides the desired expression for the Hall conductivity.
For an occupation factor (number of electrons / degeneracy of the Dingle states) greater than one, that is, in the absence of electron-electron interaction, it is straightforward to evaluate this conductivity in the zero-temperature limit and to show that it tends toward one of the quantized values q e²/h, in accordance with the integer quantum Hall effect. However, for an occupation factor below one, that is, in the presence of electron-electron interaction, we cannot evaluate this limit and obtain the expected results, because one of the terms involved cannot be determined. Nevertheless, since that term is statistical in nature, it can easily be expressed in terms of the propagator of the electron gas, for which an expression in the fractional quantum Hall regime must now be determined. After showing that perturbation theory, based on Wick's theorem and Feynman-diagram techniques, cannot accomplish this task correctly, we propose a second method. It relies on the functional-integral formalism and on the use of a generalized Hubbard-Stratonovich transformation that replaces the two-body interaction with an effective one-body interaction. The final expression obtained, although not fully solved, should be amenable to a good analytical approximation or, at worst, to numerical evaluation.
The effect of yoga in patients with cancer
Côté, Andréanne; Daneault, Serge
2012-01-01
Abstract Objective To determine whether therapeutic yoga improves the quality of life of patients with cancer. Data sources A search was performed in the MEDLINE database (1950-2010) using the keywords yoga, cancer, and quality of life. Study selection Priority was given to randomized controlled clinical trials evaluating the effect of yoga on various symptoms likely to occur in cancer patients in North America. Synthesis Four randomized controlled clinical trials were analyzed first, then 2 studies without control groups. Three studies conducted in India and the Middle East also contributed interesting methodological elements. The proposed interventions included yoga sessions of varying duration and frequency. The measured outcomes also varied from one study to another. Several symptoms improved significantly with yoga (better sleep quality, reduced anxiety or depressive symptoms, improved spiritual well-being, etc.). Quality of life, overall or in some of its specific components, also appeared to improve. Conclusion The variety of beneficial effects produced, the absence of side effects, and the favorable cost-benefit ratio of therapeutic yoga make it a worthwhile intervention for family physicians to suggest to patients with cancer. Certain methodological shortcomings may have reduced the statistical power of the studies presented, starting with the small sample sizes and the variable adherence of patients to the intervention. It is also possible that the measurement scales used were not suited to this type of situation and population for a significant effect to emerge.
Nevertheless, the favorable comments collected from participants during the studies, and their degree of appreciation and well-being, suggest that research should continue in order to better understand the mechanisms involved.
NASA Astrophysics Data System (ADS)
Minard, Benoit
Nowadays, the problem of aircraft-generated noise has become an important development issue in aeronautics. Many studies are therefore being carried out in this field, and a first approach is to model this noise numerically so as to substantially reduce design costs. In this context, an engine manufacturer asked the Université de Sherbrooke, and more specifically the acoustics group of the Université de Sherbrooke (GAUS), to develop a tool for computing the propagation of acoustic waves in nacelles and for studying installation effects. This prediction tool allows them to carry out studies to optimize acoustic treatments ("liners") and nacelle geometry for studies of the nacelle interior, as well as engine-positioning and design studies for installation effects. The objective of this master's project was therefore to continue the work of [gousset, 2011] on the use of a ray-tracing method for studying the installation effects of aircraft engines. Improving the code, its speed, its reliability, and its generality were the main objectives. The code can be used with acoustic surface treatments ("liners"), can take into account edge diffraction, and can be used for studies in complex environments such as aircraft nacelles. The code works in 3D and proceeds in 3 steps: (1) Computation of the initial beams (subdivision of a sphere or half-sphere, meshing of the geometry's surfaces). (2) Propagation of the beams through the study environment: computation of all the characteristics of the convergent rays (amplitude, phase, number of reflections, ...).
(3) Reconstruction of the pressure field at one or more points in space from the convergent rays (summation of the contributions of each ray): coherent summation. The code (GA3DP) takes into account wall surface treatments, source directivity, atmospheric attenuation, and first-order diffraction. The code was validated using several methods, such as the image-source method, modal analysis, and the boundary element method. A Matlab module was created specifically for the study of installation effects and integrated into the existing code at Pratt & Whitney Canada.
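The coherent-summation step described above amounts to adding the complex contributions of all convergent rays at a receiver point, each with its amplitude and travel-distance phase. A minimal sketch of that idea (not the GA3DP implementation; the spreading law, reflection coefficient, and numbers below are illustrative assumptions):

```python
import cmath
import math

def coherent_sum(rays, frequency, c=343.0):
    """Coherently sum complex ray contributions at a receiver.

    Each ray is (amplitude, path_length_m, n_reflections); a fixed
    reflection coefficient stands in for wall treatments. Returns the
    magnitude of the total complex pressure (arbitrary units).
    """
    k = 2 * math.pi * frequency / c   # acoustic wavenumber
    reflection_coeff = 0.9            # assumed loss factor per wall bounce
    p = 0j
    for amplitude, path_length, n_reflections in rays:
        # 1/r geometric spreading, phase from travel distance, wall losses
        contrib = (amplitude / path_length) * (reflection_coeff ** n_reflections)
        p += contrib * cmath.exp(-1j * k * path_length)
    return abs(p)

# a direct ray plus one reflected ray reaching the same receiver
level = coherent_sum([(1.0, 2.0, 0), (1.0, 2.5, 1)], frequency=1000.0)
```

Because the summation is coherent, the result depends on the phase of each path and can fall anywhere between the difference and the sum of the individual ray magnitudes (interference), which is what distinguishes it from an energy (incoherent) summation.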
Felli, M. C.; Parent, S.; Zelazo, P. D.; Tremblay, R. E.; Séguin, J. R.
2017-01-01
Abstract In early childhood, a child's social adjustment depends in part on the risks to which the child is exposed in his or her environment. However, the mechanisms by which risk factors exert their influence on children's social adjustment are poorly documented. This study therefore first examines the main effect of family adversity, a cumulative set of risk factors, on internalizing and externalizing behavior problems and on attachment security in preschool children. Second, it evaluates the mediating role of family functioning in the link between family adversity and behavior problems, as well as between family adversity and attachment security in preschool children. The 572 study participants (n=572) were aged between five and 42 months when family adversity was measured and 42 months when behavior problems and family functioning were measured. Eighty of them (n=80) underwent a measure of attachment security at 48 months. The results indicate, first, a main effect of family adversity on internalizing and externalizing behavior problems. A significant mediating effect of family functioning is then reported in the link between family adversity and internalizing and externalizing behavior problems. No significant effect is observed for the attachment security of children at 48 months. PMID:28567062
Internal stresses, dislocation mobility and ductility
NASA Astrophysics Data System (ADS)
Saada, G.
1991-06-01
The description of plastic deformation must take into account individual mechanisms and the heterogeneity of plastic strain. The influence of interactions of mobile dislocations with forest dislocations, and of cross slip, is connected with the organization of dipole walls. The latter are described, and their development is explained as a consequence of edge effects. Applications are discussed.
NASA Astrophysics Data System (ADS)
LeBlanc, Luc R.
Composite materials are increasingly used in fields such as aerospace, high-performance cars, and sporting goods, to name a few. Studies have shown that exposure to moisture harms the strength of composites by promoting the initiation and propagation of delamination. Of these studies, very few address the effect of moisture on delamination initiation under mixed-mode I/II loading, and none addresses the effects of moisture on the delamination propagation rate under mixed-mode I/II loading in a composite. The first part of this thesis consists of determining the effects of moisture on delamination propagation under mixed-mode I/II loading. Specimens of a unidirectional carbon/epoxy composite (G40-800/5276-1) were immersed in a distilled-water bath at 70°C until saturation. Quasi-static experimental tests with loadings spanning a range of mode I/II mixities (0%, 25%, 50%, 75%, and 100%) were carried out to determine the effects of moisture on the composite's delamination resistance. Fatigue tests were performed, with the same range of mode I/II mixities, to determine the effect of moisture on delamination initiation and on the propagation rate. The quasi-static results showed that moisture reduces the delamination resistance of a carbon/epoxy composite across the whole range of mode I/II mixities, except for mode I, where delamination resistance increases after moisture exposure. Under fatigue loading, moisture accelerates delamination initiation and increases the propagation rate for all mode I/II mixities.
The experimental data collected were used to determine which of the static delamination criteria and mixed-mode I/II fatigue propagation-rate models proposed in the literature best represent the delamination of the composite studied. A regression curve was used to determine the best fit between the experimental data and the static delamination criteria studied. A regression surface was used to determine the best fit between the experimental data and the fatigue propagation-rate models studied. Based on these fits, the best static delamination criterion is the B-K criterion, and the best fatigue propagation model is the Kenane-Benzeggagh model. To predict delamination in the design of complex parts, numerical models can be used. Predicting the delamination length of a part under fatigue loading is very important to ensure that an interlaminar crack will not grow excessively and cause the part to fail before the end of its design life. Following the recent trend, these models are often based on the cohesive-zone approach with a finite-element formulation. In the work presented in this thesis, the fatigue delamination-growth model of Landry & LaPlante (2012) was improved by adding the treatment of mixed-mode I/II loadings and by modifying the algorithm for computing the maximum delamination driving force. The cohesive-zone parameters were calibrated from the experimental quasi-static mode I and II tests. Results of numerical simulations of the quasi-static mixed-mode I/II tests, with dry and wet specimens, were compared with the experiments.
Numerical fatigue simulations were also performed and compared with the experimental delamination propagation-rate results. The numerical results of the quasi-static and fatigue tests showed good correlation with the experimental results over the whole range of mode I/II mixities studied.
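The B-K (Benzeggagh-Kenane) criterion retained in the thesis above has a standard published form, G_c = G_Ic + (G_IIc − G_Ic)(G_II/G_T)^η, interpolating the critical energy release rate between pure mode I and pure mode II. A minimal sketch of evaluating it (the toughness values and exponent below are illustrative assumptions, not the thesis's fitted parameters):

```python
def bk_criterion(g_ic, g_iic, eta, mode_mix):
    """Mixed-mode critical energy release rate via the B-K criterion.

    mode_mix = G_II / G_T, i.e. 0.0 is pure mode I, 1.0 is pure mode II;
    eta is the material exponent fitted from mixed-mode test data.
    """
    return g_ic + (g_iic - g_ic) * mode_mix ** eta

# illustrative carbon/epoxy toughnesses (J/m^2) at 50% mode mixity
g_c_50 = bk_criterion(g_ic=250.0, g_iic=900.0, eta=2.0, mode_mix=0.5)
```

Fitting η is exactly the kind of regression over the mixity range (0% to 100%) that the experimental program above provides: at mode_mix = 0 the expression reduces to G_Ic, and at mode_mix = 1 to G_IIc.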
2011-04-01
Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 2011 DRDC Toronto TR... terror management, it is because human beings alone possess the capacity to understand the finitude of the life they have... including adherence to a culturally meaningful worldview and a sense of security based on self-esteem. To date,
NASA Astrophysics Data System (ADS)
Auban-Senzier, P.; Bourbonnais, C.; Jérome, D.; Lenoir, C.; Batail, P.; Canadell, E.; Buisson, J. P.; Lefrant, S.
1993-03-01
We have performed the simultaneous investigation of the isotope effect on the superconducting transition and on the Raman spectra in the organic superconductor β_H-(BEDT-TTF)2I3 (T_c = 8 K). For this purpose, we substitute ^{13}C for ^{12}C on the carbon sites of the central double bond of the BEDT-TTF molecule. The isotope shifts measured by Raman experiments can be fairly well explained by standard molecular dynamics. However, the critical temperature is lowered by 0.2 K in the ^{13}C enriched material. We analyse the possible sources of this remarkable downward shift, which leads to an isotope coefficient higher than the BCS value. The extended-Hückel calculations of the density of states for the two HOMO bands of β_H-(BEDT-TTF)2I3 do show that, within the framework of a weak-coupling theory, its sizeable variation on the scale of ω_D cannot account for the observed isotope effect. On the other hand, we discuss how inelastic electronic scattering observed in resistivity measurements just above T_c can lead, through a pair-breaking mechanism, to a sizeable increase of the isotope coefficient.
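The comparison with the BCS value rests on the isotope exponent α defined by T_c ∝ M^(−α), which BCS theory fixes at 0.5. A minimal sketch of how α is estimated from two measurements (the numbers in the check below are illustrative, not the paper's data):

```python
import math

def isotope_exponent(tc_light, tc_heavy, m_light, m_heavy):
    """Isotope exponent alpha in Tc ∝ M**(-alpha), estimated from two
    (Tc, isotope mass) measurements; BCS weak coupling gives alpha = 0.5."""
    return -math.log(tc_heavy / tc_light) / math.log(m_heavy / m_light)
```

An α extracted this way that exceeds 0.5 signals physics beyond the simplest weak-coupling picture, which is what the abstract attributes to pair breaking.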
Pringsheim, Tamara; Doja, Asif; Belanger, Stacey; Patten, Scott
2012-01-01
BACKGROUND AND OBJECTIVE: Antipsychotic use is increasing among children. This article aimed to guide clinicians in the clinical management of the extrapyramidal side effects of second-generation antipsychotics. METHODS: Publications, key-informant interviews, and exchanges with the members of a focus group and with stakeholders identified the main clinical areas for guidance and the preferred structure for these recommendations. The guideline committee members received the draft recommendations, evaluated the information gathered through a systematic literature review, and used a nominal group process to reach consensus on treatment recommendations. The guidelines contain a description of the neurological abnormalities often observed with antipsychotic use, together with recommendations on how to examine and quantify these abnormalities. A stepwise approach to the management of neurological abnormalities is presented. RESULTS: Several types of extrapyramidal symptoms attributable to antipsychotic use can be observed in children, including neuroleptic-induced acute dystonia, akathisia, parkinsonism and tardive dyskinesia, as well as tardive dystonia, tardive akathisia and withdrawal dyskinesias. The vast majority of the evidence on the treatment of antipsychotic-induced movement disorders comes from adult patients with schizophrenia. Given the scarcity of pediatric data, the recommendations are derived from publications on both adults and children.
Given the limits of generalizing data from adult subjects to children, these recommendations should be regarded as based on expert opinion rather than on firm evidence. CONCLUSION: Clinicians must be aware that second-generation antipsychotics can induce neurological side effects and should exercise great vigilance when prescribing them. PMID:24082814
NASA Astrophysics Data System (ADS)
Freuchet, Florian
In the marine environment, recruitment abundance depends on the processes affecting both the adults and the larval stock. Under the influence of reliable cues about habitat quality, the mother can increase (anticipatory maternal effects, AME) or reduce (selfish maternal effects, SME) the physiological condition of the offspring. In tropical zones, which are generally more oligotrophic, nutritional resources and temperature are two important components that can limit recruitment. This study tested the effects of nutritional supply and thermal stress on larval production and on the maternal strategy adopted. We targeted the barnacle Chthamalus bisinuatus (Pilsbry) as the biological model because it dominates the upper intertidal zones along the rocky shores of southeastern Brazil (a tropical region). The initial hypotheses were that nutritional supply allows adults to produce high-quality larvae, and that thermal stress triggers early spawning, producing low-quality larvae. To test these hypotheses, populations of C. bisinuatus were reared under four experimental groups combining levels of nutritional supply (high and low) and thermal stress (stressed and unstressed). Measurements of survival and of the physiological condition of adults and larvae made it possible to identify parental responses that may be advantageous in a hostile tropical environment. Fatty-acid profile analysis was the method used to assess the physiological quality of adults and larvae. The results of the feeding treatment (high or low nutritional supply) show no difference in neutral lipid accumulation, nauplius size, reproductive effort, or nauplius survival time under starvation.
Low nutritional resources appear to be compensated by mothers adopting an AME model, whereby the mothers anticipate the environment in order to produce larvae with the appropriate phenotype. With the addition of a thermal stress, larval production decreased by 47% and the larvae were 18 μm smaller; the mothers appear to use an SME model characterized by reduced larval performance. Following these results, we hypothesize that in a subtropical zone, such as the coast of the state of São Paulo, the temperature increase experienced by barnacles is not, a priori, harmful to their organism if it is combined with a sufficient nutritional supply.
Reactor neutron activation analysis
NASA Astrophysics Data System (ADS)
Meyer, G.
2003-02-01
When neutrons pass through matter, some are transmitted without interaction; the others interact with the traversed medium through scattering and absorption. This absorption phenomenon is used to shield against neutrons, but also to detect them; it can likewise be used to identify the "absorbing" nuclei and thus to analyze the traversed medium. Indeed, through various nuclear reactions (n,γ), (n,p), (n,α), (n,fission), one obtains residual nuclei that are often radioactive; the sample is said to be "activated". If the activation yield, and hence the fraction of nuclei thus "transmuted", is known, measurements of the induced radioactivity make it possible to determine the composition of the irradiated sample. This method, known as neutron activation analysis, has been practiced since the discovery of the neutron. Thanks to its selectivity and sensitivity, it has given access to trace and ultra-trace levels in very diverse fields of application such as metallurgy, archaeology, biology and geochemistry.
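The quantitative basis of the method is the standard activation equation: the induced activity builds up toward the saturation value NσΦ during irradiation and decays exponentially afterwards. A small sketch, with purely illustrative numbers:

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux_n_cm2_s, lam_s, t_irr_s, t_decay_s=0.0):
    """Induced activity (decays/s) of one activation product:
    builds toward the saturation value N*sigma*phi during irradiation,
    then decays exponentially once the sample leaves the flux."""
    saturation = n_atoms * sigma_cm2 * flux_n_cm2_s
    return saturation * (1.0 - math.exp(-lam_s * t_irr_s)) * math.exp(-lam_s * t_decay_s)
```

Inverting this relation (measured activity → number of target nuclei) is what lets the induced radioactivity be converted into an elemental concentration.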
Transverse nonlinear effects in planar waveguides
NASA Astrophysics Data System (ADS)
Dumais, Patrick
Transverse nonlinear effects due to the non-resonant optical Kerr effect are studied in two types of planar-geometry waveguides. First (in Chapter 2), the emission of spatial solitons from a channel-type waveguide is reviewed historically, analytically and numerically, with the aim of designing and fabricating such a device in AlGaAs, in the spectral region below half the bandgap of this material, i.e. around 1.5 μm. The component, as designed, includes a multiple-quantum-well structure. Local disordering of this structure allows a local variation of the Kerr coefficient in the guide, which leads to the emission of a spatial soliton above a threshold optical power. The experimental observation of an intensity-dependent change in the field profile at the output of the fabricated guide is presented. Second (in Chapter 3), a technique for measuring the Kerr coefficient in a planar waveguide is presented. It consists in measuring the change in transmission through a mask placed at the output of the guide as a function of the peak intensity at the input of the planar guide. A method for determining the conditions that optimize the sensitivity of the measurement is presented and illustrated with several examples. Finally, the realization of an optical parametric oscillator based on a periodically poled lithium niobate crystal is presented. The theory of optical parametric oscillators is laid out with an emphasis on the generation of intense pulses at wavelengths around 1.5 μm from a Ti:sapphire laser, with the aim of obtaining a source for the soliton-emission experiments.
NASA Astrophysics Data System (ADS)
Lirette-Pitre, Nicole T.
2009-07-01
The academic success of girls increasingly leads them to pursue postsecondary education and to enter professions demanding a high level of scientific knowledge and expertise. However, very few girls yet consider a career in the sciences (chemistry and physics), in engineering or in ICT (information and communication technology), that is, a career tied to the new economy. For many girls, science and ICT are not school subjects they find interesting, even when they succeed very well in them. These girls acknowledge that their learning experiences in science and ICT have allowed them neither to develop an interest nor to feel confident in their ability to succeed in these subjects. Consequently, few girls choose to pursue postsecondary studies in these disciplines. Social cognitive career theory was chosen as the theoretical model to better understand which variables come into play when girls choose their careers. Our study concerns the design and the evaluation of the effectiveness of instructional material designed specifically to improve the science and ICT learning experiences of Grade 9 girls in New Brunswick. The pedagogical approach favored in our material implemented teaching strategies drawn from the best practices we identified, aimed particularly at increasing girls' sense of self-efficacy and interest in these disciplines. This material, available on the Internet at http://www.umoncton.ca/lirettn/scientic, is directly linked to the Grade 9 natural science curriculum of New Brunswick.
The evaluation of the effectiveness of our instructional material proceeded in two main methodological stages: 1) evaluation of the usability and user-friendliness of the material, and 2) evaluation of the effect of the material on various variables related to girls' interest and sense of self-efficacy in science and ICT. This research falls within a pragmatic research paradigm; pragmatism guided our choices regarding the research model and the techniques used. The research combined qualitative and quantitative techniques, particularly for data collection and analysis. The data collected in the first stage, the evaluation of the usability and user-friendliness of the material by science teachers and by the girls, revealed that the material is highly usable and user-friendly; nevertheless, a few small improvements will be made in a subsequent version to further ease navigation. As for the evaluation of the effects of the material on the variables related to self-efficacy and interest during the quasi-experimental stage, our qualitative data indicated that the material had positive effects on the sense of self-efficacy and on the interests of the girls who used it. However, our quantitative data did not allow us to infer a direct causal link between use of the material and an increase in girls' sense of self-efficacy and interest in science and ICT. In light of the results obtained, we concluded that the material had the expected effects. We therefore recommend the creation and use of material of this kind in all science classes from Grade 6 to Grade 12 in New Brunswick.
Feasibility study of a wind-diesel system with compressed-air storage
NASA Astrophysics Data System (ADS)
Benchaabane, Youssef
The Wind-Diesel Hybrid System with Compressed Air Storage (SHEDAC, from the French "Système Hybride Éolien-Diesel avec Stockage d'Air Comprimé") uses pneumatic hybridization to replace fossil-fuel consumption with renewable energy, more specifically wind energy. Surplus wind energy is used to compress and store air, which is then used to supercharge the diesel engine. This master's thesis consists of two scientific articles. The first presents the development of software dedicated to the feasibility study of a wind-diesel system with compressed-air storage. The study is based on an analysis of costs and revenues and of equipment costs (wind turbine, diesel engine, air-storage system), and is completed by a sensitivity analysis over the various parameters, a risk analysis, and an analysis of greenhouse gas (GHG) emissions. The second article applies this software to the installation of a SHEDAC system at the Esker mining camp in Quebec, replacing the current energy-production sources. Using compressed-air storage by means of a SHEDAC system proves more profitable than wind energy alone, a diesel thermal plant alone, or the two combined. With a higher net present value and internal rate of return, this solution yields the lowest cost of energy for this remote region.
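The cost-and-revenue analysis described rests on standard discounted-cash-flow metrics, net present value and levelized cost of energy. A generic sketch of those two quantities, not the thesis software itself:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the cash flow at t=0
    (negative for the initial investment)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def lcoe(rate, annual_costs, annual_energy):
    """Levelized cost of energy: discounted lifetime costs divided by
    discounted lifetime energy production (same discount rate for both)."""
    dc = sum(c / (1.0 + rate) ** t for t, c in enumerate(annual_costs))
    de = sum(e / (1.0 + rate) ** t for t, e in enumerate(annual_energy))
    return dc / de
```

Ranking configurations (wind alone, diesel alone, SHEDAC) by NPV and LCOE computed this way is the usual basis for the kind of comparison the abstract reports.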
Effective methods in educating extension agents and farmers on conservation farming technology
USDA-ARS?s Scientific Manuscript database
Adoption of new technologies requires transfer of information from developers to end users. Efficiency of the transfer process influences the rate of adoption and ultimate impact of the technology. Various channels are used to transfer technology from researchers to farmers. Two commonly used ones ...
Future of Software Engineering Standards
NASA Technical Reports Server (NTRS)
Poon, Peter T.
1997-01-01
In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems that are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.
Pathogenic effects of low dose rates: the dose-effect relationship
NASA Astrophysics Data System (ADS)
Masse, Roland
2002-10-01
There is no evidence of pathogenic effects in human groups exposed to less than 100 mSv at low dose rate. The attributed effects are therefore the result of extrapolations from higher doses. The validity of such extrapolations is discussed from the point of view of epidemiology as well as cellular and molecular biology. The Chernobyl accident resulted in a large excess of thyroid cancers in children; it also raised the point that some actual sanitary effects among distressed populations might be a direct consequence of low doses. Studies under UN control have not confirmed this point, identifying no dose-effect relationship and "severe socio-economic and psychological pressures… poverty, poor diet and living conditions, and lifestyle factors" as the main cause of depressed health. Some hypotheses are considered to explain the dose dependence and high prevalence of non-cancer causes of death among human groups exposed to more than 300 mSv. To cite this article: R. Masse, C. R. Physique 3 (2002) 1049-1058.
NASA Astrophysics Data System (ADS)
Rahmani, A.; Benyaïch, F.; Bounakhla, M.; Bilal, E.; Moutte, J.; Gruffat, J. J.; Zahry, F.
2004-11-01
In this work, we present a comparative study of energy-dispersive (ED-XRF) and wavelength-dispersive (WD-XRF) X-ray fluorescence analysis techniques and of inductively coupled plasma atomic emission spectrometry (ICP-AES). The calibration results of the energy-dispersive spectrometers of the Centre National pour l'Energie, les Sciences et les Techniques Nucléaires (CNESTEN, Rabat, Morocco), excited by radioactive sources (55Fe, 109Cd and 241Am) and by secondary excitation (Mo and Cu secondary targets), obtained on certified reference samples from the International Atomic Energy Agency (IAEA) and the Community Bureau of Reference (BCR), were compared with the results of analyzing the same reference samples by wavelength-dispersive X-ray fluorescence (WD-XRF) and by ICP-AES at the GENERIC department of the SPIN centre at the Ecole des Mines de Saint-Etienne (France). The three analysis techniques give comparable results for the determination of major elements, whereas significant deviations are noted for trace elements because of matrix effects, which are difficult to correct in the case of X-ray fluorescence.
The craft of sociology, the inductive approach and the object of analysis: brief remarks starting from Bourdieu
Hamel, Jacques
2015-05-01
This article seeks to reveal the role played by the inductive approach in sociology. Grounded Theory assumes its full importance in formulating sociological explanations. However, the theory does pose a problem, in that the "method" is not based on clearly defined operations, which remain implicit. This article attempts to show that the object of analysis-what is being analyzed-makes perceptible the operations implicitly conceived by the analyst, based on Grounded Theory. With qualitative analysis software, such as Atlas.ti, it is possible to shed light on these operations. The article is illustrated by the theory of Pierre Bourdieu and the epistemological considerations he developed as a result of his qualitative inquiry, La Misère du monde. © 2015 Canadian Sociological Association/La Société canadienne de sociologie.
Detecting the end of anode compaction by sound
NASA Astrophysics Data System (ADS)
Sanogo, Bazoumana
The objective of this project was to develop a tool for real-time control of the compaction time using the sound generated by the vibrocompactor during the forming of green anodes. An application was therefore developed for analyzing the recorded sounds. Trials were carried out with different microphones to obtain better measurement quality, and one was selected for the remainder of the project. Likewise, various tests were carried out on laboratory anodes as well as on industrial-scale anodes in order to establish a method for detecting the optimal time needed to form the anodes. The work at the carbon laboratory of the Université du Québec à Chicoutimi (UQAC) consisted of recording the sound of anodes made on site in different configurations, and of characterizing certain plant anodes. The anodes made in the laboratory fall into two groups. The first comprises the anodes used to validate our method; these were produced with different compaction times. The carbon laboratory at UQAC is unique in that it can produce anodes with the same properties as industrial anodes; consequently, the validation initially planned at the plant was carried out with the laboratory anodes. The second group served to study the effects of the raw materials on the compaction time, the coke type and the pitch type constituting the variations within this group. As for the tests and measurements at the plant, they were carried out in three measurement campaigns. The first, in June 2014, served to standardize and find the best positioning of the instruments, to configure the software, and to take the first measurements. A second campaign, in May 2015, involved recording sound while classifying the anodes according to different compaction times.
The third and final campaign, in December 2015, comprised the final plant tests, making anodes under different criteria (variation of the compaction time, pitch content, manual stop of the compactor, variation of the pressure of the upper compactor airbags). These anodes were then analyzed in the laboratory at UQAC. In parallel with this work, the sound-analysis application was improved through the choice and standardization of the analysis parameters. The results of the first laboratory tests and of the June 2014 campaign showed that anode forming proceeds in three stages: rearrangement of the particles and pitch, compaction and consolidation, and finally finishing. This work further showed that the compaction time plays a very important role in defining the final properties of the anodes: in addition to the pitch type, pitch content and coke type, the over-compaction and under-compaction times must be taken into account, as demonstrated by the two validations that were carried out. The results of characterizing the samples (from the anodes of the December 2015 campaign) showed that an anode compacted for an optimal time acquires good compressive strength and lower electrical resistivity. We also note that the compaction time in our case decreased slightly with increasing pressure of the upper airbags of the vibrocompactor, which had the effect of increasing the green density of the anode. This observation should not, however, be generalized, as the number of anodes tested was small. Moreover, this study shows that the time needed to form an anode increases with increasing pitch content and decreases slightly with increasing airbag pressure. (Abstract shortened by ProQuest.)
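The abstract does not give the acoustic criterion actually used, but end-of-compaction detection from recorded sound can be illustrated with a generic frame-by-frame spectral feature. Everything below (frame length, band split, threshold) is an assumption for illustration, not the thesis application:

```python
import numpy as np

def frame_band_ratio(x, fs, frame_len, f_split):
    """Per-frame fraction of spectral energy above f_split (Hz)."""
    n_frames = len(x) // frame_len
    freqs = np.fft.rfftfreq(frame_len, 1.0 / fs)
    ratios = np.empty(n_frames)
    for i in range(n_frames):
        spec = np.abs(np.fft.rfft(x[i * frame_len:(i + 1) * frame_len])) ** 2
        total = spec.sum()
        ratios[i] = spec[freqs >= f_split].sum() / total if total > 0 else 0.0
    return ratios

def detect_transition(ratios, threshold):
    """Index of the first frame whose high-band ratio falls below threshold,
    taken here as a proxy for the end of effective compaction."""
    below = np.flatnonzero(ratios < threshold)
    return int(below[0]) if below.size else None
```

On a synthetic signal whose high-frequency content stops halfway through, the detector returns the frame index of the change, mimicking how a spectral signature of the vibrocompactor sound could mark the transition between forming stages.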
Caby, Isabelle; Olivier, N; Mendelek, F; Kheir, R Bou; Vanvelcenaher, J; Pelayo, P
2014-01-01
BACKGROUND: Chronic low back pain is persistent lumbar pain of multifactorial origin. The initial pain level remains little used to analyze and compare the responses of low back pain patients to reconditioning programs. OBJECTIVES: To assess and evaluate the responses of chronic low back pain subjects in severe pain to dynamic, intensive management. METHODS: 144 subjects with chronic low back pain were enrolled in a 5-week functional spine restoration program. The subjects were classified into two pain-level groups: a severe-pain group (n = 28) and a mild-to-moderate-pain group (n = 106). All subjects received identical management consisting mainly of physiotherapy, occupational therapy, muscular and cardiovascular reconditioning, and psychological follow-up. Physical parameters (flexibility, muscle strength) and psychological parameters (quality of life) were measured before (T0) and after the program (T5wk). RESULTS: At T0, all the physical and functional performances of the subjects in severe pain were poorer, and the impact of low back pain on their quality of life was greater. All the significant differences observed between the two groups at T0 had disappeared at T5wk. CONCLUSIONS: Chronic low back pain subjects in severe pain respond favorably to the dynamic, intensive program. The pain intensity of the low back pain appears to have no effect on the responses to the program. Functional spine restoration would give subjects the ability to better manage their pain whatever its level. PMID:25299476
NASA Astrophysics Data System (ADS)
Floquet, Jimmy
In aluminum electrolysis cells, the highly corrosive reaction medium attacks the cell walls, which shortens their service life and increases production costs. The ledge, which forms under the effect of the heat losses that maintain thermal equilibrium in the cell, serves as natural protection for the cell walls; its thickness must be controlled to maximize this effect. Should this ledge be unintentionally resorbed, the resulting damage can amount to several hundred thousand dollars per cell. The objective is therefore to develop an ultrasonic measurement of the ledge thickness, since such a measurement would be non-intrusive and non-destructive. The expected precision is on the order of a centimetre for thickness measurements involving two materials, over thicknesses ranging from 5 to 20 cm. This precision is the key factor allowing industry to control the ledge thickness effectively (maximizing wall protection while maximizing the energy efficiency of the process) through the addition of a heat flux. However, the effectiveness of an ultrasonic measurement in this hostile environment remains to be demonstrated. Preliminary work led to the selection of a contact ultrasonic transducer able to withstand the measurement conditions (high temperatures, uncharacterized materials, etc.). Various cold measurements (processed by time-frequency analysis) made it possible to evaluate the propagation speed of the waves in the graphite cell material and in cryolite, demonstrating the possibility of ultimately extracting the relevant ledge-thickness information. Building on this characterization of the acoustic response of the materials, the subsequent work was carried out on a reduced-scale model of the cell.
The experimental setup, a furnace operating at 1050 °C and instrumented with numerous thermal sensors, will allow a comparison between the intrusive LVDT measurement and the transducer measurement under conditions close to the industrial measurement. Keywords: ultrasound, NDT, high temperature, aluminum, electrolysis cell.
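The underlying pulse-echo relation is simple: the round-trip transit time of the echo gives the thickness via d = v·t/2, and the two-material case is handled by subtracting the known transit through the first layer. A sketch with hypothetical wave speeds (the actual velocities in graphite and frozen cryolite are what the cold measurements above characterize):

```python
def ledge_thickness(v, round_trip_time):
    """Single-layer pulse-echo: the echo travels out and back, so d = v*t/2."""
    return 0.5 * v * round_trip_time

def second_layer_thickness(t_total, d1, v1, v2):
    """Two-material case: subtract the known round trip through layer 1
    (the cell wall), then convert the remaining transit time in layer 2
    (the ledge) to a thickness."""
    return 0.5 * v2 * (t_total - 2.0 * d1 / v1)
```

Centimetre-level precision then translates directly into a requirement on how accurately the echo arrival times and the two wave speeds must be known.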
Effects of air pollution on health
Abelsohn, Alan; Stieb, Dave M.
2011-01-01
Summary Objective To inform family physicians of the health effects of air pollution and to indicate what advice to give vulnerable patients so that they reduce their exposure. Sources of information MEDLINE was searched using terms relating to air pollution and its adverse effects. English-language articles published between January 2008 and December 2009 were reviewed; most studies contained level II evidence. Main message In Canada, outdoor air pollution causes substantial morbidity and mortality. It can affect the respiratory system (exacerbation of asthma and chronic obstructive pulmonary disease) and the cardiovascular system (triggering arrhythmia, heart failure and stroke). The Air Quality Health Index (AQHI) is a new communication tool developed by Health Canada and Environment Canada that indicates, on a scale of 1 to 10, the health risk posed by air pollution. The AQHI is widely reported in the media, and this tool could be useful to family physicians in encouraging high-risk patients (such as those with asthma, chronic obstructive pulmonary disease or heart failure) to reduce their exposure to air pollution. Conclusion Family physicians can use the AQHI and its health messages to teach patients with asthma and other high-risk patients how to reduce the health risks caused by air pollution.
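The AQHI itself is computed from 3-hour community averages of NO2, O3 and PM2.5 through an exponential excess-risk formula; the sketch below follows the formulation published by Environment Canada as commonly cited (inputs: NO2 and O3 in ppb, PM2.5 in µg/m³), and should be checked against the official specification before any real use:

```python
import math

def aqhi(no2_ppb, o3_ppb, pm25_ugm3):
    """Air Quality Health Index on its 1-10+ scale, from 3-h average
    concentrations, per Environment Canada's published formulation
    (coefficients as commonly cited)."""
    raw = (1000.0 / 10.4) * (
        (math.exp(0.000871 * no2_ppb) - 1.0)
        + (math.exp(0.000537 * o3_ppb) - 1.0)
        + (math.exp(0.000487 * pm25_ugm3) - 1.0)
    )
    return max(1, round(raw))
```

The rounded 1-10 value is what patients see; the health messages attached to each band (low, moderate, high, very high risk) are defined separately by Health Canada.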
Bergeron, Félix-Antoine; Blais, Martin; Hébert, Martine
2016-01-01
Summary This article explores the moderating role of parental support in the relationships between homophobic victimization, internalized homophobia and psychological distress among sexual-minority youth (SMY), also described as lesbian, gay, bisexual or questioning. It aims 1) to document the prevalence of the various forms of homophobic victimization experienced by SMY, by gender and age, and 2) to explore the moderating effect of parental support in the relationship between homophobic victimization, internalized homophobia and psychological distress. A sample of 228 SMY aged 14 to 22, not exclusively heterosexual, recruited in community settings as part of the Quebec Youth Romantic Relationships survey (Parcours Amoureux des Jeunes, PAJ), was analyzed. The impact of homophobic victimization, parental support and internalized homophobia on psychological distress is explored through a linear regression model with moderated mediation effects. The moderating role of parental support is confirmed in the relationship between homophobic victimization and psychological distress. These variables may provide levers for preventing the negative effects of homophobic harm on the mental health of SMY. PMID:26966851
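At its core, the moderation part of such a model is an interaction term between victimization and parental support in a linear regression. A toy sketch on simulated data (every coefficient below is invented for illustration and is not an estimate from the PAJ survey), showing how a buffering interaction is recovered:

```python
import numpy as np

# Simulated, standardized predictors; NOT the PAJ survey data.
rng = np.random.default_rng(0)
n = 500
victim = rng.normal(size=n)    # homophobic victimization
support = rng.normal(size=n)   # parental support

# Hypothetical data-generating model: support buffers (moderates)
# the victimization -> distress slope via a negative interaction.
distress = (0.5 + 0.6 * victim - 0.3 * support
            - 0.4 * victim * support
            + rng.normal(scale=0.5, size=n))

# Ordinary least squares with an explicit interaction column.
X = np.column_stack([np.ones(n), victim, support, victim * support])
beta, *_ = np.linalg.lstsq(X, distress, rcond=None)
# beta[3] estimates the moderation (interaction) effect.
```

A significantly negative interaction coefficient is the statistical signature of the buffering role of parental support that the abstract reports.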
Thermal-hydraulic study of the moderator flow in the CANDU-6 reactor
NASA Astrophysics Data System (ADS)
Mehdi Zadeh, Foad
Etant donne la taille (6,0 m x 7,6 m) ainsi que le domaine multiplement connexe qui caracterisent la cuve des reacteurs CANDU-6 (380 canaux dans la cuve), la physique qui gouverne le comportement du fluide moderateur est encore mal connue de nos jours. L'echantillonnage de donnees dans un reacteur en fonction necessite d'apporter des changements a la configuration de la cuve du reacteur afin d'y inserer des sondes. De plus, la presence d'une zone intense de radiations empeche l'utilisation des capteurs courants d'echantillonnage. En consequence, l'ecoulement du moderateur doit necessairement etre etudie a l'aide d'un modele experimental ou d'un modele numerique. Pour ce qui est du modele experimental, la fabrication et la mise en fonction de telles installations coutent tres cher. De plus, les parametres de la mise a l'echelle du systeme pour fabriquer un modele experimental a l'echelle reduite sont en contradiction. En consequence, la modelisation numerique reste une alternative importante. Actuellement, l'industrie nucleaire utilise une approche numerique, dite de milieu poreux, qui approxime le domaine par un milieu continu ou le reseau des tubes est remplace par des resistances hydrauliques distribuees. Ce modele est capable de decrire les phenomenes macroscopiques de l'ecoulement, mais ne tient pas compte des effets locaux ayant un impact sur l'ecoulement global, tel que les distributions de temperatures et de vitesses a proximite des tubes ainsi que des instabilites hydrodynamiques. Dans le contexte de la surete nucleaire, on s'interesse aux effets locaux autour des tubes de calandre. En effet, des simulations faites par cette approche predisent que l'ecoulement peut prendre plusieurs configurations hydrodynamiques dont, pour certaines, l'ecoulement montre un comportement asymetrique au sein de la cuve. Ceci peut provoquer une ebullition du moderateur sur la paroi des canaux. 
Under such conditions, the reactivity coefficient can vary significantly, leading to an increase in reactor power, with potentially major consequences for nuclear safety. A detailed CFD (Computational Fluid Dynamics) model accounting for local effects is therefore needed. The goal of this research is to model the complex behaviour of the moderator flow in the vessel of a CANDU-6 nuclear reactor, particularly near the calandria tubes. These simulations serve to identify the possible flow configurations in the calandria. The study thus aims to formulate the theoretical basis of the macroscopic instabilities of the moderator, i.e., the asymmetric motions that can cause moderator boiling. The challenge of the project is to determine the impact of these flow configurations on the reactivity of the CANDU-6 reactor.
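The porous-medium approach described above replaces the tube bank with a distributed hydraulic resistance. A minimal sketch, assuming a Darcy-Forchheimer form for the momentum sink (a common choice in CFD codes, not necessarily the one used in this thesis; the permeability K and inertial coefficient C2 below are illustrative values, not CANDU-6 data):

```python
# Hedged sketch of a distributed hydraulic resistance, as used by
# porous-medium CFD models of tube banks. The momentum sink per unit
# volume for a velocity component u is
#   S = -(mu/K)*u - (C2*rho/2)*|u|*u
# mu: dynamic viscosity, rho: density (heavy-water-like values),
# K: permeability, C2: inertial coefficient -- all illustrative.

def momentum_sink(u, rho=1085.0, mu=9.0e-4, K=1.0e-7, C2=50.0):
    """Momentum sink (N/m^3) for one velocity component, assuming the
    flow is aligned with u so that |u| == abs(u)."""
    viscous = -(mu / K) * u                  # Darcy (viscous) term
    inertial = -0.5 * C2 * rho * abs(u) * u  # Forchheimer (inertial) term
    return viscous + inertial

# At 0.1 m/s with these coefficients, both terms contribute comparably.
s = momentum_sink(0.1)
```

The solver adds this sink to the momentum equation in the porous region; the coefficients are normally tuned so that the pressure drop matches that of the real tube bank.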
Ionochromic and photochromic properties of polythiophene derivatives
NASA Astrophysics Data System (ADS)
Levesque, Isabelle
Regioregular polythiophene derivatives were synthesized and characterized in solution and as thin films. UV-visible spectroscopy showed that these derivatives can display distinctive chromic properties depending on the stimulus applied. For instance, raising the temperature turns the polymers from violet to yellow, both in the solid state and in solution. These chromic properties appear to be governed by a conformational transition (planar to non-planar) of the main chain. The aim of this work was to better understand how the organization of the side chains influences the chromic transitions. Two of the synthesized derivatives, bearing side chains sensitive to alkali cations, proved to be ionochromic as well as thermochromic: one polymer carries oligo(oxyethylene) side chains, the other a crown-ether group specific to lithium ions. The observed chromic effects are explained by non-covalent interactions of the cations with the oxygen atoms of the side chains in the first polymer, and by insertion of the Li+ ion into the crown-ether cavity in the second. These interactions appear to reduce side-chain organization, thereby twisting the main chain. Both polymers seem specific to certain cations and could therefore serve as optical sensors. The Li+ specificity of the second polymer could also enable ionic conduction, in addition to the electronic conductivity characteristic of polythiophenes, which could prove useful for lightweight all-polymer batteries based on lithium salts. Other derivatives bearing azobenzene side chains proved to be photochromic as well as thermochromic.
The side group can switch from the trans to the cis configuration under ultraviolet irradiation, which evidently has a marked effect on side-chain organization. This in turn twists the thiophene main chain, sharply reducing conjugation. These effects can be exploited in optical writing, among other applications. The irradiated, weakly conjugated polymer was found to revert very quickly to its initial, highly conjugated state under a simple electrochemical treatment. In conclusion, it was shown that altering side-chain organization with an external stimulus considerably affects the conformation of the main chain, suggesting that the side chains stabilize a particular conformation of polythiophenes.
Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of the spatial pattern of radionuclides and other statistical analyses at NSs 201, 219, and 221 are reviewed as background for the new analyses presented in this paper. Suggested NAEG activities and the statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.
Taking Up Space: Museum Exploration in the Twenty-First Century
ERIC Educational Resources Information Center
Sutton, Tiffany
2007-01-01
Museums have become a crucible for questions of the role that traditional art and art history should play in contemporary art. Friedrich Nietzsche argued in the nineteenth century that museums can be no more than mausoleums for effete (fine) art. Over the course of the twentieth century, however, curators dispelled such blanket pessimism by…
Issues and Prospects of Effective Implementation of New Secondary School Curriculum in Nigeria
ERIC Educational Resources Information Center
Ahmadi, Ali A.; Lukman, Ajibola A.
2015-01-01
This paper digs into the issues surrounding effective implementation of the new secondary school curriculum in Nigeria, based on the view that 21st-century education is characterized by a dramatic technological revolution. The paper therefore portrays education in the 21st century as a total departure from the factory-model education of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krafft, J.
1960-01-01
A general study is made of the optical elements of a double-focusing magnetic sector with fringe-field effects, with a view to its application to the monochromatization of the proton, deuteron, or triton beam of the 1.4 MeV accelerator. (auth)
Investigations into the Power MOSFET SEGR Phenomenon and its Physical Mechanism
NASA Technical Reports Server (NTRS)
Swift, G. M.; Edmonds, L. E.; Miyahira, T.; Nichols, D. K.; Johnston, A. H.
1997-01-01
The state of understanding of the destructive SEGR event in power MOSFETs is relatively mature, with a large body of published work, both experimental and theoretical. However, gaps remain in the understanding of the phenomenon, including unexplained anomalies, empirical-only dependencies on some important device and incident-ion physical parameters, and limited insight into latent effects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehnert, B.E.; Oritz, J.B.; Steinkamp, J.A.
1991-01-01
In order to obtain information about the particle redistribution phenomenon following the deposition of inhaled particles, as well as about some of the mechanisms that may operate in that redistribution, analyses of lavaged lung free cells and transmission electron microscopic (TEM) analyses of lung tissue were performed using lungs from rats subchronically exposed to aerosolized titanium dioxide (TiO2). TEM analyses indicated that the in situ autolysis of particle-containing alveolar macrophages (AM) is one important mechanism involved in the redistribution of particles. Evidence was also obtained that the engulfment of one particle-containing phagocyte by another phagocyte occurs. Another prominent mechanism of the particle redistribution phenomenon may be the in situ proliferation of particle-laden AM. We used the macrophage cell line J774A.1 as a surrogate for AM to investigate how different particulate loads in macrophages may affect their ability to proliferate. These in vitro investigations indicated that the normal rate of proliferation of macrophages is essentially unaffected by relatively high particulate burdens. Overall, the results of our investigations suggest that the in situ autolysis of particle-containing AM with rephagocytosis of the freed particles by other phagocytes, the phagocytosis of effete and disintegrating particle-containing phagocytes by other AM, and the in situ division of particle-containing AM are likely mechanisms underlying the post-depositional redistribution of particles among the lung's AM during alveolar-phase clearance. 19 refs., 8 figs., 2 tabs.
"Metamagnetoelectric" effect in multiferroics
NASA Astrophysics Data System (ADS)
Fouokeng, G. C.; Fodouop, F. Kuate; Tchoffo, M.; Fai, L. C.; Randrianantoandro, N.
2018-05-01
We present a theoretical calculation of the magnetoelectric properties of a quasi-two-dimensional spin chain externally controlled by a static electric field in the y-direction and a magnetic field in the z-direction. Given the diversity of properties of functional materials and their applications in physics, a multiferroic model is investigated. Using the Fermi-Dirac statistics of quantum gases and the Landau theory, we assess the effects of the Dzyaloshinskii-Moriya interaction and of the electric polarization on the magnetoelectric coupling, which at low temperature induces the "metamagnetoelectric" effect and likewise affects the ferroelectricity induced through symmetry mechanisms and the magnetic properties of the multiferroic system. Indeed, the variation of the induced polarization due to the spin arrangement, through the Dzyaloshinskii-Moriya interaction, gives rise to multistep, interdependent metamagnetic and metaelectric transitions set by the corresponding Dzyaloshinskii-Moriya parameter, and the system then exhibits a spin gap resulting from an electric and a magnetic demagnetization field range. The metamagnetoelectric effect observed in this model of multiferroic materials appears to be highly tunable via the external electric and magnetic fields and could thus prove crucial in the design of new mechanisms for the processing and storage of data and for other spintronic applications.
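The Dzyaloshinskii-Moriya (DM) coupling and the spin-induced polarization invoked above are commonly written as follows (standard forms, assumed here; the paper's exact conventions may differ):

```latex
E_{\mathrm{DM}} \;=\; \sum_{\langle i,j\rangle} \mathbf{D}_{ij}\cdot\left(\mathbf{S}_i\times\mathbf{S}_j\right),
\qquad
\mathbf{P}_{ij} \;\propto\; \hat{\mathbf{e}}_{ij}\times\left(\mathbf{S}_i\times\mathbf{S}_j\right),
```

where \(\mathbf{D}_{ij}\) is the DM vector and \(\hat{\mathbf{e}}_{ij}\) the unit vector joining spins \(i\) and \(j\); the second relation (the inverse-DM, or spin-current, mechanism) is how a non-collinear spin arrangement induces an electric polarization.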
NASA Astrophysics Data System (ADS)
Nicolas, Joëlle
2000-12-01
The Ultra Mobile Laser Station is the smallest satellite laser ranging station in the world, weighing only 300 kg, dedicated to tracking satellites equipped with laser retroreflectors. It uses a small telescope of 13 cm diameter on a motorized mount derived from a precision theodolite, a very compact laser, and an avalanche photodiode allowing detection at the single-photoelectron level. The first experiments (Corsica, late 1996) revealed numerous instabilities in measurement quality. This work concerns the study and implementation of numerous technical modifications aimed at reaching centimetre-level measurement accuracy and at taking part in the orbit validation and altimeter calibration campaign of the oceanographic satellite JASON-1 (2001). The desired instrumental precision was successfully verified in the laboratory. Beyond this instrumental and metrological aspect, an analysis was developed to estimate the accuracy and stability of the mobile station's observations after the modifications were integrated. Based on a co-location experiment between the two fixed laser stations of the Calern plateau, the analysis rests on adjusting, for each station, the coordinates and a mean instrumental bias against a reference orbit of the LAGEOS satellites. Seasonal variations were revealed in the time series of the various components. Local comparison of crustal deformations, expressed as height variations derived from the laser data, showed remarkable consistency with measurements from the transportable FG5 absolute gravimeter. Signals of the same amplitude were also observed by GPS. These variations are also evident at global scale, and their geophysical interpretation is given (a combination of solid-Earth and pole tide effects and atmospheric loading effects).
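The per-station adjustment described above, coordinates plus a mean instrumental bias against a reference orbit, is at heart a linear least-squares fit. A minimal sketch, assuming a simplified observation model (a constant range bias b plus a station height offset dU projecting onto range as sin(elevation); the numbers are synthetic, not thesis data):

```python
import numpy as np

# Hedged sketch of the adjustment principle: fit a constant range bias
# and a height offset to range residuals against a fixed reference
# orbit. Observation model (illustrative):
#   r_i = b + sin(e_i) * dU
# where e_i is the satellite elevation for pass i.

e = np.radians([20.0, 35.0, 50.0, 65.0, 80.0])  # hypothetical elevations
true_b, true_dU = 0.012, -0.030                 # metres (synthetic truth)
r = true_b + np.sin(e) * true_dU                # noiseless residuals

# Design matrix: one column per estimated parameter.
A = np.column_stack([np.ones_like(e), np.sin(e)])
b_hat, dU_hat = np.linalg.lstsq(A, r, rcond=None)[0]
```

With noiseless synthetic residuals the fit recovers the bias and height offset exactly; with real data, the residual covariance would drive the formal errors of both parameters.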
The Cultivation of Ivy. A Saga of the College in America.
ERIC Educational Resources Information Center
Thelin, John R
The popular image of the Ivy League is one of a slightly awesome bastion of the well-born, well-bred, and soon-to-be-powerful or, less charitably, a haven for "the effete, unAmerican, and hopelessly bookish." This pervasive idea of collegiate personality is analyzed, tracing the evolution of the Ivy League from an incongruous array of…
Human Benchmarking of Expert Systems. Literature Review
1990-01-01
effectiveness of the development procedures used in order to predict whether the application of similar approaches will likely have effective and...they used in their learning and problem solving. We will describe these approaches later. Reasoning. Reasoning usually includes inference. Because to ... in the software engineering process. For example, existing approaches to software evaluation in the military are based on a model of conventional
NASA Astrophysics Data System (ADS)
Behmand, Behnaz
The mechanisms leading to superconductivity in high-critical-temperature superconductors are still poorly understood today, in contrast to those of conventional superconductors. In the high-Tc materials, certain modulations of the electronic density of states have been observed to coexist with the superconducting phase, raising questions about their role in superconductivity. In fact, several types of density-of-states modulation exist, such as the charge density wave and the pair density wave. These two modulations, of different origin and both measurable by tunneling spectroscopy, can be distinguished through a study of their symmetry. This thesis therefore presents a study of the symmetry of the charge density wave in 2H-NbSe2, which is present within the superconducting phase at 300 mK. However, certain difficulties tied to the measurement principle, namely the normalization effect, hinder the identification of this symmetry. The method devised to circumvent this problem is the key element of this work.
Electrical Injury and Cerebromeningeal Hemorrhage: A Case Report
Chaibdraa, A.; Medjellakh, M.S.; Saouli, A.; Bentakouk, M.C.
2008-01-01
Summary: Electrical injury is an accidental event that differs from other pathologies causing severe burns because of specific features reflecting not only the destruction of the skin but also the direct or indirect effects of the electric current on every tissue of the body encountered along its path, in particular nervous tissue. Central neurological manifestations are numerous, related either to the effects of electricity on the cerebral parenchyma or to a lesion associated with the injury. We report a case of cerebromeningeal hemorrhage occurring on the third day after a severe electrical injury. This complication is well documented in the literature on lightning-strike accidents. Having found no similar published case for accidents involving industrial current, we present this observation, which raises the difficult question of the underlying pathophysiological mechanism. PMID:21991125
NASA Astrophysics Data System (ADS)
Arsenault, Louis-Francois
Applications related to power generation motivate the search for materials with a large thermoelectric power (S). Moreover, S probes certain fundamental properties of materials, such as the crossover between the coherent and incoherent regimes of the quasiparticles as temperature increases. Empirically, strong electron-electron interactions can lead to a giant thermoelectric power. We therefore studied the simplest model that captures these strong interactions, the Hubbard model. Dynamical mean-field theory (DMFT) is well suited to this case. We focused on a three-dimensional (3d) face-centred cubic (fcc) lattice, for several reasons. (a) This crystal type is very common in nature. (b) DMFT performs very well in 3d, so this choice also serves as a proof of principle of the method. (c) Finally, because of the electronic frustration intrinsic to the fcc lattice, it has no particle-hole symmetry, which strongly favours the appearance of a large S. This work shows that when the material is a half-filled insulator because of the strong interactions (a Mott insulator), large thermoelectric powers can be obtained by doping it lightly, an important practical result. On the methodological side, we showed how the infinite-frequency limit of S and the so-called Kelvin approach, which takes the zero-frequency limit before the thermodynamic limit for S, give reliable estimates of the true DC limit in the appropriate temperature ranges. Both approaches greatly simplify the calculations by short-circuiting the need for problematic analytic continuations. We found that the infinite-frequency method works well when the energy scales are relatively small.
In other words, this approach represents S well once the system becomes coherent. The calculations also show that the Kelvin formula is accurate when the electron spectral function becomes incoherent, i.e., at higher temperature. In the Kelvin limit, S is essentially the entropy per particle, as proposed long ago. Our results thus show that the purely entropic view of S is correct in the incoherent regime, whereas the infinite-frequency approach is better in the coherent regime. We used a state-of-the-art method, continuous-time quantum Monte Carlo, to solve the DMFT equations. To allow rapid exploration of the phase diagram, we had to develop a new version of the iterated perturbation theory method so that it also applies at strong interaction, beyond the critical value of the Mott transition. Another topic was also addressed. The orbital effect of the magnetic field in strongly correlated electron systems is a very important and little-developed question, all the more essential since the discovery of quantum oscillations in high-temperature superconductors (high-Tc). Seeking the least biased method possible, we derived the DMFT equations in the presence of a field coupled to the kinetic-energy operator through the Peierls substitution. This type of approach is needed to understand, among other things, the effect of Mott physics on phenomena such as quantum oscillations. We obtained a very important result by rigorously demonstrating that the DMFT self-consistency relation and the intermediate quantum impurity problem remain unchanged. The effect of the field can be contained in the local Green function, which is the key difference from the usual case.
This makes it possible to keep using standard impurity solvers, which are becoming ever more powerful. We also developed the method for a stack of two-dimensional planes along z, which allows the orbital effect of the field to be studied in nanostructures and even in bulk materials, provided the number of planes is large enough to reach the three-dimensional limit. Keywords: thermoelectric power, dynamical mean-field theory, Hubbard model, orbital effect of the magnetic field, strongly correlated electrons, quantum materials, iterated perturbation theory.
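The Kelvin formula invoked above, in which S reduces to the entropy carried per added particle, is commonly written as follows (the standard Peterson-Shastry form, assumed here to match the thesis conventions):

```latex
S_{\mathrm{Kelvin}}
\;=\; \frac{1}{q}\left(\frac{\partial S}{\partial N}\right)_{T,V}
\;=\; -\,\frac{1}{q}\left(\frac{\partial \mu}{\partial T}\right)_{N,V},
```

where \(q\) is the carrier charge, \(S\) the total entropy, \(N\) the particle number, and \(\mu\) the chemical potential; the two expressions are related by a Maxwell relation, which makes the "entropy per particle" reading of the incoherent regime explicit.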
Atrial fibrillation and physical activity
Bosomworth, N. John
2015-01-01
Abstract. Objective: To examine the evidence on the effects of various levels of physical activity on the incidence of atrial fibrillation (AF) in the general population and in endurance athletes. Data sources: An initial PubMed search was conducted using the MeSH headings or text words (with the TIAB [title and abstract] field descriptor) atrial fibrillation and exercise or physical activity or athlet* or sport*, with no additional filters. The GRADE (grading of recommendations, assessment, development, and evaluation) system was used to draw conclusions about the quality and level of evidence. Study selection: The search returned no intervention studies. Observational studies were therefore deemed acceptable and, although larger long-term prospective cohort studies would have been preferable, case-control and cross-sectional studies were also included in this review. Synthesis: The available data suggest a dose-proportional relationship between more intense exercise and a reduced incidence of AF in women, and likewise in men whose activity level is low to moderate. In men only, intense activity is associated with an increased risk of AF in most, but not all, studies. This risk is moderate, with a hazard ratio of 1.29 in one of the higher-quality studies. The risk of AF in most regularly active people is lower than that observed in the matched sedentary population. Conclusion: Atrial fibrillation probably becomes less frequent as the level of physical activity increases, with a demonstrable dose-response relationship. At all intensities, exercise should be encouraged for its effects on physical well-being and mortality reduction.
In men who engage in vigorous activity, the benefit with respect to AF may disappear and the risk may exceed that observed in the sedentary population; however, the evidence to this effect is neither robust nor consistent. These men should be informed of this modest increase in risk if they choose to continue vigorous physical activity.
2000-08-01
...physical as well as psychological. This tends to show that under operational military conditions, the... melatonin. Ergonomia, 1987, 30, 1379-1393. 2 - BLOIS R., FEINBERG I., GAILLARD J.M., KUPFER D.J. and WEBB W.B. Sleep in normal and pathological aging.
Electromagnetic Noise Interference and Compatibility
1975-11-01
RADIO INTERFERENCE by C. Fengler (7); ELECTROSTATIC CHARGES AND THE DISTURBANCES THEY CAUSE IN RADIO LINKS by ...; ... BY TRANSMISSION IN AN INTEGRATED AIRBORNE SYSTEM by ... David and M. Vanneizel (31), not available at time of printing; DIGITAL DATA TRANSMISSION IN AIRCRAFT: EMC PROBLEMS AND POSSIBLE SOLUTIONS by R. Rode (32); GENERATION AND EFFECTS OF CONDUCTED AND RADIATED PARASITIC VOLTAGES BETWEEN
Impurities and correlated systems: from chains to superconducting cuprates
NASA Astrophysics Data System (ADS)
Bobroff, J.
2005-01-01
Impurities and correlated systems. The discovery of high-TC superconductors opened the new field of strongly correlated fermion physics. In these compounds, mostly transition-metal oxides, strong correlations between electrons sharply affect their electronic properties and induce new, original states: spin liquids, superconductivity, and so on. In order to determine these correlations accurately, along with their possible link with superconductivity, we study the effect of local defects such as non-magnetic impurities. Indeed, in the simpler case of insulating spin chains and ladders, these impurities induce a staggered magnetism in their immediate neighborhood which reveals the underlying electronic correlations. Similar effects are observed in high-TC superconductors using local probes such as nuclear magnetic resonance. These observations allow a better understanding of both the normal and the superconducting states of these oxides, both full of surprises.
Surface hardening of Ft25 grey cast iron induced by an in-mould surface treatment
NASA Astrophysics Data System (ADS)
Bouitna, Mohamed; Boutarek-Zaourar, Naïma; Mansour, Samir; Chentouf, Samir Mourad; Mossang, Eric
2018-02-01
The objective of this study is to strengthen the surface of Ft25 lamellar grey cast iron with a manganese-rich deposit, by developing a method that combines casting and surface treatment in the mould in a single operation. The effects of the ferromanganese particle size (80% Mn + 20% Fe), as well as of the casting thickness, on the layers formed were studied. Three ferromanganese particle sizes (0.18 mm, 0.25 mm, and 0.5 mm) were used to treat castings 25 mm, 100 mm, and 200 mm thick. Among the results obtained, the surface properties are strengthened by the formation of a continuous, homogeneous manganese-rich layer. The effect of the ferromanganese particle size on the thickness of the treated layer was demonstrated: the thickness of the layers formed decreases as the ferromanganese particle size increases. For a casting 100 mm thick, the layer formed is estimated at 350 μm for a 0.18 mm particle size, but only 180 μm for a 0.5 mm particle size. The casting thickness, by contrast, has little effect on the size of the layers formed. A clear improvement in the wear resistance of the treated cast iron, related to the surface transformations, was demonstrated.
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
NASA Astrophysics Data System (ADS)
Bretin, Remy
Fatigue damage of materials is a common problem in many fields, including aeronautics. To prevent fatigue failure, the fatigue life of materials must be determined. Unfortunately, because of the many heterogeneities present, fatigue life can vary greatly between two identical parts made of the same material and subjected to the same treatments. These heterogeneities must therefore be included in our models to obtain a better estimate of material fatigue life. As a first step toward better accounting for heterogeneities in our models, a linear-elastic finite-element study of the influence of crystallographic orientations on the strain and stress fields in a polycrystal was carried out. Correlations could be established from the results obtained, and an analytical linear-elastic model accounting for crystallographic orientation distributions and neighborhood effects was developed. This model builds on the foundations of classical homogenization models, such as the self-consistent scheme, and also borrows the neighborhood principles of cellular automata. Taking the finite-element results as the reference, the analytical model developed here proved twice as accurate as the self-consistent model, whatever the material studied.
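The combination of a mean-field estimate with cellular-automaton-style neighborhood effects can be sketched as follows. This is a minimal illustration of the idea, not the thesis model: each grain's field estimate is blended with the average over its immediate neighbors, with an illustrative weight w:

```python
# Hedged sketch of a neighborhood correction to a mean-field
# (self-consistent-type) per-grain estimate, in the spirit of cellular
# automata: each grain sees only its listed neighbors.

def neighborhood_estimate(values, neighbors, w=0.6):
    """values: per-grain mean-field estimates (e.g., a stress measure,
    orientation-dependent); neighbors: list of neighbor-index lists,
    one per grain; w: illustrative self-weight. Returns blended
    per-grain estimates."""
    out = []
    for i, v in enumerate(values):
        nb = neighbors[i]
        nb_mean = sum(values[j] for j in nb) / len(nb) if nb else v
        out.append(w * v + (1.0 - w) * nb_mean)  # self vs neighborhood
    return out

# 1D chain of three grains: the middle grain is pulled toward the
# mean of its two neighbors.
est = neighborhood_estimate([1.0, 2.0, 4.0], [[1], [0, 2], [1]])
```

In a real polycrystal the neighbor lists would come from the grain topology and the blend would act on tensorial quantities, but the local-averaging structure is the same.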
Surface impedance in quasi-two-dimensional superconductors
NASA Astrophysics Data System (ADS)
Achkir, Driss Brice
This experimental and theoretical work addresses the superconducting state of three families of compounds: conventional superconductors, organics, and the YBCO cuprates. To this end we used a microwave technique, namely the measurement of surface impedance as a function of temperature and magnetic field. In the conventional superconductors, we measured for the first time the "coherence" peak in the real part of the conductivity. Although predicted by BCS theory, this peak had not been clearly observed because of technical difficulties inherent to this type of experiment. Moreover, applying Eliashberg theory to the real part of the conductivity of niobium revealed the value of microwave measurements for better extracting the low-frequency part of the spectral density alpha^2F(omega). This possibility is attractive because this is precisely the frequency region of alpha^2F(omega) where tunneling data are imprecise. The results obtained on the penetration depth in the organics and the cuprates show that the gap has line nodes at the Fermi level or is, at the very least, strongly anisotropic. Indeed, the temperature dependence of the penetration depth in pure crystals is linear at low temperature and becomes quadratic in doped crystals. For the quasi-two-dimensional organic superconductors (Et)2X, we also observed a maximum in the real part of the conductivity that has nothing to do with a coherence peak. For these compounds, we carried out one of the very first studies of superconducting fluctuations as a function of temperature and magnetic field. We show that the paraconductivity sigma' due to fluctuations exhibits Aslamazov-Larkin behaviour of three-dimensional character.
These measurements are supported by the theoretical results of a dynamical Ginzburg-Landau model that we developed. Moreover, from the analysis of the fluctuations, we were able to identify the critical field of the resistive transition in field and thereby deduce the vortex-lattice melting transition in (Et)2Cu(SCN)2.
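The three-dimensional Aslamazov-Larkin behaviour referred to above corresponds to the standard fluctuation-conductivity result (quoted here in its textbook form, whose conventions may differ from those of the thesis):

```latex
\sigma'_{\mathrm{AL},3D} \;=\; \frac{e^2}{32\,\hbar\,\xi(0)}\,\varepsilon^{-1/2},
\qquad
\varepsilon \;=\; \frac{T-T_c}{T_c},
```

where \(\xi(0)\) is the zero-temperature coherence length. The \(\varepsilon^{-1/2}\) divergence is the three-dimensional signature; in the two-dimensional case the paraconductivity instead diverges as \(\varepsilon^{-1}\), which is what a temperature analysis of \(\sigma'\) can discriminate.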
Analysis of the energy interactions between an arena and its refrigeration system
NASA Astrophysics Data System (ADS)
Seghouani, Lotfi
This thesis is part of a strategic project on arenas funded by NSERC (the Natural Sciences and Engineering Research Council of Canada), whose main goal is to develop a numerical tool capable of estimating and optimizing energy consumption in arenas and curling rinks. Our work follows on from that of DAOUD et al. (2006, 2007), who developed a 3D transient model (AIM) of the Camilien Houde arena in Montreal that computes the heat fluxes through the building envelope as well as the temperature and humidity distributions over a typical meteorological year; in particular, it computes the heat fluxes through the ice sheet due to convection, radiation, and condensation. We first developed a model of the structure beneath the ice (BIM) that accounts for its 3D geometry, the different layers, transient effects, heat gains from the ground below and around the arena under study, and the brine inlet temperature in the concrete slab. The BIM was then coupled to the AIM. In the second stage, we developed a quasi-steady-state model of the refrigeration system (REFSYS) for the arena, based on a combination of thermodynamic relations, heat-transfer correlations, and relations derived from data available in the manufacturer's catalogue. Finally, the coupling of AIM+BIM with REFSYS was carried out within the TRNSYS software environment. Several parametric studies were undertaken to evaluate the effects of climate, brine temperature, ice thickness, etc. on the arena's energy consumption, and a few strategies for reducing this consumption were studied.
The considerable heat-recovery potential at the condensers, which can reduce the energy required by the arena's ventilation system, was highlighted. Keywords: arena, refrigeration system, energy consumption, energy efficiency, ground conduction, annual performance.
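The condenser heat-recovery potential mentioned above follows from a basic refrigeration energy balance: the heat rejected at the condenser equals the evaporator load plus the compressor work. A minimal sketch, with illustrative load and COP values that are assumptions, not figures from the REFSYS model:

```python
def condenser_heat_rejection(q_evap_kw: float, cop: float) -> float:
    """Heat rejected at the condenser (kW) from an energy balance:
    Q_cond = Q_evap + W_comp, with compressor work W_comp = Q_evap / COP."""
    w_comp = q_evap_kw / cop
    return q_evap_kw + w_comp

# Illustrative values: a 300 kW ice-sheet refrigeration load at COP 3
q_cond = condenser_heat_rejection(300.0, 3.0)
print(round(q_cond, 1))  # 400.0 kW potentially available for heat recovery
```

The gap between the 400 kW rejected and the 300 kW absorbed is exactly the compressor work, which is why condenser recovery can exceed the ice load itself.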
NASA Astrophysics Data System (ADS)
Amrani, Salah
Aluminum is produced in an electrolysis cell, and this operation uses carbon anodes. Evaluating the quality of these anodes remains indispensable before their use. The presence of cracks in the anodes disturbs the electrolysis process and reduces its performance. This project was undertaken to determine the impact of the various anode manufacturing process parameters on the cracking of dense anodes. These parameters include those of green anode fabrication, raw material properties, and baking. A literature review was carried out on all aspects of carbon anode cracking to compile previous work. A detailed methodology was developed to facilitate the progress of the work and achieve the stated objectives. Most of this document is devoted to the discussion of the results obtained in the UQAC laboratory and at the industrial level. Regarding the studies carried out at UQAC, part of the experimental work is devoted to investigating the different cracking mechanisms in the dense anodes used in the aluminum industry. The approach was first based on the qualitative characterization of the cracking mechanism at the surface and in depth. Then, a quantitative characterization was performed to determine the distribution of crack width along the crack's entire length, as well as the percentage of its area relative to the total area of the sample. This study was carried out using image analysis to characterize the cracking of a baked anode sample. Surface and depth analysis of this sample clearly revealed crack formation over a large part of the analyzed surface.
The other part of the work is based on the characterization of defects in industrially manufactured green anode samples. This technique consisted of determining the profile of various physical properties. Specifically, a method based on measuring the distribution of electrical resistivity over the entire sample was used to locate cracks and macro-pores. Optical microscopy and image analysis, for their part, made it possible to characterize the cracked zones while determining the structure of the analyzed samples at the microscopic scale. Other tests were conducted on cylindrical anode samples 50 mm in diameter and 130 mm long. These were baked in a furnace at UQAC at different heating rates in order to determine the influence of baking parameters on crack formation in such cores. The baked anode samples were characterized using scanning electron microscopy and ultrasound. The last part of the work carried out at UQAC is a study on the characterization of anodes fabricated in the laboratory under different operating conditions. The evolution of the quality of these anodes was followed using several techniques. The cooling temperature evolution of the green laboratory anodes was measured, and a mathematical model was developed and validated against the experimental data, with the objective of estimating the cooling rate as well as the thermal stress. All the fabricated anodes were characterized before baking by determining certain physical properties (electrical resistivity, apparent density, optical density, and defect percentage).
Tomography and electrical resistivity distribution, both non-destructive techniques, were used to evaluate the internal defects of the anodes. During the baking of the laboratory anodes, the evolution of the electrical resistivity was monitored and the devolatilization stage was identified. Some anodes were baked at different heating rates (low, medium, high, and a combined schedule) with the objective of finding the baking conditions that minimize cracking. Other anodes were baked to different baking levels in order to identify at which stage of the baking operation cracking begins to develop. After baking, the anodes were recovered and characterized again using the same techniques as before. The main objective of this part was to reveal the impact on cracking of the various parameters distributed along the entire anode production chain. Butt percentage, pitch content, and particle size distribution are important factors to consider when studying the effect of raw materials on cracking. Regarding the effect of manufacturing process parameters on the same problem, vibration time, compaction pressure, and the cooling process formed the basis of this study. Finally, the influence of the baking phase on the appearance of cracking was taken into account through the heating rate and the baking level. The work at the industrial level was done during a measurement campaign aimed at evaluating carbon anode quality in general and investigating the cracking problem in particular, and then at revealing the effects of different parameters on cracking. Twenty-four baked anodes were used.
They were fabricated with different raw materials (pitch, coke, butts) and under various conditions (pressure, vibration time). A crack density parameter was calculated based on visual inspection of the cracking of the cores. This makes it possible to classify the cracks into several categories based on criteria such as crack type (horizontal, vertical, and inclined), longitudinal location (bottom, middle, and top of the anode), and transverse location (left, center, and right). The effects on cracking of the raw materials, the green anode manufacturing parameters, and the baking conditions were studied. The cracking of dense carbon anodes is a serious problem for the primary aluminum industry. This project revealed different cracking mechanisms, classified cracking by several criteria (position, type, location), and evaluated the impact of different parameters on cracking. The studies carried out on baking made it possible to improve the operation and reduce anode cracking. The work also identified techniques capable of evaluating anode quality (ultrasound, tomography, and electrical resistivity distribution). Carbon anode cracking is considered a complex problem because its appearance depends on several parameters distributed along the entire production chain. In this project, several new studies were carried out, lending originality to the research done on carbon anode cracking for the primary aluminum industry.
The studies carried out in this project add scientific value toward a better understanding of the anode cracking problem on the one hand, and propose methods that can reduce this problem at the industrial scale on the other.
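The crack classification described in the abstract (type, longitudinal location, transverse location) can be sketched as a simple tally. The category names follow the abstract; the density definition (cracks per inspected core) is a hypothetical choice for illustration, not the thesis's exact parameter:

```python
from collections import Counter

# Categories taken from the abstract
TYPES = {"horizontal", "vertical", "inclined"}
LONGITUDINAL = {"bottom", "middle", "top"}
TRANSVERSE = {"left", "center", "right"}

def crack_density(observations, n_cores):
    """Tally visually inspected cracks by (type, longitudinal, transverse)
    and return the tally plus cracks per core (a hypothetical density)."""
    counts = Counter()
    for kind, longi, trans in observations:
        assert kind in TYPES and longi in LONGITUDINAL and trans in TRANSVERSE
        counts[(kind, longi, trans)] += 1
    return counts, len(observations) / n_cores

obs = [("horizontal", "top", "left"), ("vertical", "middle", "center"),
       ("horizontal", "top", "left")]
counts, density = crack_density(obs, n_cores=2)
print(counts[("horizontal", "top", "left")], density)  # 2 1.5
```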
NATO Human Resources (Manpower) Management (Gestion des ressources humaines (effectifs) de l’OTAN)
2012-02-01
performance, reward and salary management, and staff motivation. Consequently, human resources management must...importance over time. Human resources, originally responsible for hiring, dismissal, payroll, and the management of...the effects of performance appraisal systems;
Afterrise: Deep Body Temperature Following Exercise
1992-03-01
The effects of posture and clothing during recovery, and of the temperature of the recovery room, were also examined. Five men...(short-sleeved shirt). Rectal and skin temperatures were measured every minute during exercise and recovery...affected...overgarment. RESUME: This study was undertaken to document the continued rise in rectal temperature after exercise in the heat
ERIC Educational Resources Information Center
Dugas, Tim; Green, Lyndsay; Leckie, Norm
The use of learning technologies in the workplace and their impact on lifelong learning were examined. Data were collected from three sources: the literature on learning technologies and labor market trends affecting the adoption, implementation, and success of learning technologies in the workplace; case studies of 8 Canadian firms with 100 or…
ERIC Educational Resources Information Center
Vitaro, Frank; And Others
1992-01-01
This article, written in French, describes and evaluates the first phase of a program to prevent drug addiction among 110 fifth-grade girls with behavior problems in Montreal (Quebec, Canada). Evaluation of the instructional program showed positive results for student knowledge level, attitudes, and behaviors and supported program continuation…
Feasibility of a new radiometric processing technology for the Ouenza iron ores
NASA Astrophysics Data System (ADS)
Idres, A.; Bounouala, M.
2005-05-01
In the absence of a reliable technology for processing iron ore waste piles, the complex mineralogical and chemical characteristics and the harmful effects of mining residues pose a real environmental problem. To this end, a mineralogical and chemical study of the iron ore was carried out using multiple techniques (optical microscopy, XRD, XRF, SEM). Taking the nature of the residues into account, representative samples were tested by radiometric separation. Several parameters were characterized, such as conveyor belt speed, gamma-ray emission time, and the feed particle size of the process. The results obtained with this separation method are highly significant in terms of both iron recovery and grade. Thus, this new technology allows both better valorization of the iron ores and a reduction of the tonnage stockpiled at the mine site.
NASA Astrophysics Data System (ADS)
Joubert, C.; Jacquet, N.; Lambert, F.; Martin, S.; Martin, C.
1998-04-01
Whole-body irradiation leads to delayed cognitive dysfunction, which could result from perturbations of neurotransmission, especially dopaminergic and serotoninergic transmission. The aim of this study was to determine the concentrations of dopamine (DA), serotonin (5-HT), and their metabolites in three cerebral areas of rats one month after (neutron-gamma) irradiation at 3.38 Gy. The preliminary results show an increase in DA, 5-HT, and their catabolites; these effects are weak but similar to those observed in older animals.
Effect of a uniform magnetic field on Rayleigh-Bénard instabilities with the Soret effect
NASA Astrophysics Data System (ADS)
Ben Sassi, Mokhtar; Kaddeche, Slim; Abdennadher, Ali; Henry, Daniel; Hadid, Hamda Ben; Mojtabi, Abdelkader
2016-01-01
The effect of both the magnitude and the orientation of a uniform magnetic field on the critical transition occurring within an electrically conducting binary fluid layer, stratified in temperature and concentration and taking into account the Soret effect, is investigated numerically. For such a configuration, the results show that the critical thresholds corresponding to an arbitrarily oriented magnetic field can be derived from those obtained for a vertical magnetic field, and that the axes of the marginal cells are aligned with the horizontal component of the magnetic field. Moreover, an analytical study is conducted to investigate the impact of the magnetic field on long-wavelength instabilities. The effect of the magnetic field on such instabilities reveals a new phenomenon consisting of major changes in the unstable modes, which lose their unicellular nature to regain their multi-roll character, as is the case without a magnetic field for ψ < ψℓ0 = 131 Le / (34 - 131 Le). For a binary fluid characterized by a Lewis number Le and a separation factor ψ > ψℓ0, the value of the Hartmann number Haℓ(ψ, Le) corresponding to that transition, which is responsible for a significant change in mass and heat transfer, can be determined from the analytical relations derived in this work.
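The long-wavelength threshold quoted in the abstract, ψℓ0 = 131 Le / (34 - 131 Le), can be evaluated directly for a given Lewis number. A minimal sketch; the sample Lewis number is an illustrative assumption, not a value from the paper:

```python
def psi_threshold(lewis: float) -> float:
    """Long-wavelength threshold psi_l0 = 131*Le / (34 - 131*Le),
    as given in the abstract; meaningful while the denominator is positive."""
    denom = 34.0 - 131.0 * lewis
    if denom <= 0.0:
        raise ValueError("formula requires Le < 34/131")
    return 131.0 * lewis / denom

# Illustrative small Lewis number Le = 0.01:
print(round(psi_threshold(0.01), 4))  # 0.0401
```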
Vibroacoustic study of a shell-floor-cavity system with application to a simplified fuselage
NASA Astrophysics Data System (ADS)
Missaoui, Jemai
The objective of this work is to develop semi-analytical models to study the structural, acoustic, and vibro-acoustic behavior of a shell-floor-cavity system. The connection between the shell and the floor is handled using the concept of artificial stiffness. This flexible modeling concept facilitates the choice of the expansion functions for the motion of each substructure. The results of this study will allow an understanding of the basic physical phenomena encountered in an aircraft structure. An integro-modal approach is developed to compute the acoustic modal characteristics. It uses a discretization of the irregular cavity into acoustic sub-cavities whose expansion bases are known a priori. This physically motivated approach has the advantage of being efficient and accurate. Its validity was demonstrated using results available in the literature. A vibro-acoustic model is developed in order to analyze and understand the structural and acoustic effects of the floor in this configuration. The validity of the results, in terms of resonances and transfer functions, is verified against experimental measurements performed in the laboratory.
Study of the magnetic field in molecular clouds
NASA Astrophysics Data System (ADS)
Houde, Martin
2001-12-01
This work is a study of the magnetic field in the circumstellar environment of young stars. It originated in the author's conviction that it must be possible to detect the presence of a magnetic field, and possibly to characterize it, through observations of the spectral line profiles of ionic molecular species. One of the main goals was therefore to prove that this is indeed possible. The thesis thus contains theoretical and experimental elements that are both complementary and intimately linked. The theoretical aspect is based on the mutual interaction that neutral and charged particles exert on one another in a weakly ionized plasma such as those existing in the molecular clouds that are sites of star formation. It turns out that the presence of a magnetic field has a direct effect on the behavior of the ions (via the Lorentz force) and an indirect one on the neutral molecules (via the numerous collisions between the two types of particles). Such an interaction is, as is now well known, present in the early stages of star formation: this is of course ambipolar diffusion. We will show, however, that another type of diffusion exists, hitherto unknown, which manifests itself later in the evolution of molecular clouds. It can have a dramatic effect on the appearance of the (molecular rotational) spectral line profiles of ionic species when compared with those exhibited by coexisting neutral species. But for this to happen, there must be organized motions (flows or jets) of matter, or the presence of turbulence, in the regions considered. A Maxwellian velocity distribution will not reveal the presence of the magnetic field. The observations, intended to confirm the theory, lie in the millimeter and submillimeter wavelength domains.
Several molecular species were detected in a significant sample of molecular clouds. The predicted effect was confirmed, regardless of whether the observed lines are opaque or transparent. In the last chapter, we consider an interesting application in which we use the manifestation of this effect (or its absence) to test the preferential alignment with the local magnetic field of the bipolar outflows that often accompany protostars.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Harrison, Megan E.; Norris, Mark L.; Obeid, Nicole; Fu, Maeghan; Weinstangel, Hannah; Sampson, Margaret
2015-01-01
Abstract. Objective: To conduct a systematic review of the effects of frequent family meals on psychosocial outcomes in children and adolescents, and to examine whether outcomes differ by sex. Data sources: Studies were identified by searching MEDLINE (1948 to the last week of June 2011) and PsycINFO (1806 to the first week of July 2011) using the Ovid interface. The MeSH terms and keywords used alone or in combination were: family, meal, food intake, nutrition, diets, body weight, adolescent attitudes, eating behaviour, feeding behaviour, and eating disorders. The bibliographies of relevant articles were also reviewed. Study selection: The initial search yielded 1783 articles. To be included in the analysis, studies had to meet the following criteria: be published in English in a peer-reviewed journal; involve children or adolescents; address the influence of family meals on psychosocial parameters (e.g., drug and other substance use, eating disorders, depression) in children or adolescents; and have an appropriate study design, including acceptable statistical methods for the analysis of the outcomes. Fourteen articles met the inclusion criteria. Two independent reviewers studied and analyzed the articles. Synthesis: Overall, the results suggest that the frequency of family meals is inversely associated with eating disorders, alcohol and drug use, violent behavior, and feelings of depression or thoughts of suicide in adolescents. There is a positive relationship between frequent family meals, good self-esteem, and academic success.
The studies reveal considerable differences in outcomes between male and female children and adolescents, with female subjects showing more positive outcomes. Conclusion: This systematic review provides further confirmation that frequent family meals should be encouraged. All health professionals should inform families of the benefits of regularly eating meals together.
Motion Cues in Flight Simulation and Simulator Induced Sickness
1988-06-01
assessed in a driving simulator by means of a response surface methodology central-composite design. The most salient finding of the study was that visual...across treatment conditions. For an orthogonal response surface methodology (RSM) design with only two independent variables, it can be readily shown that...J.E. Fowlkes 8 SESSION III - ETIOLOGICAL FACTORS IN SIMULATOR-INDUCED AFTEREFFECTS THE USE OF VESTIBULAR MODELS FOR DESIGN AND EVALUATION OF FLIGHT
1986-12-01
effective Reynolds Number to include the effect of turbulence, which was supported in a convincing manner by the same ratio of 2.4 between the Reynolds...LIFT DEVICES 143 methods incorporating various forms of flap are shown on Figure 59. The other two methods, Boundary Layer Control and the Magnus...Class Airship Hull with Varying Lengths of Cylindric Midships," N.A.C.A. Technical Report No. 138 (1922). 276 ENGINEERING AERODYNAMICS [Ch. 9
Environmental Exposure and Design Criteria for Offshore Oil and Gas Structures
1980-05-01
reliability analysis. Because there are no clear lines of demarcation between them, these methods are often used in varying combinations. Sound...cludes that OCSEAP not now effectively contribute...to the accrual of sound scientific information adequate for OCS management." One reason for such a...procedures for resolving differences need to be developed. Sound and timely assessments of environmental exposure risks will require: 1) adequate levels of
ERIC Educational Resources Information Center
Theilheimer, Ish, Ed.; Eisner, Kathy, Ed.
1996-01-01
This issue of the Canadian quarterly "Transitions," in French and English language versions, examines the prevention of youth crime, with a specific focus on activities, trends, and research dealing with Canadian families. Major articles in this issue are: (1) "A Snowball's Chance? Communities and Families Working to Prevent Youth…
2011-01-01
larger, compared with a larger scale. The results indicate that an effect of this nature is attributable to factors...by a smaller advancing contact angle and a larger receding contact angle, compared with a larger scale. The...Methods ... 10 3.1 Experimental Design
ERIC Educational Resources Information Center
Dalpe, Robert; Gingras, Yves
1990-01-01
The role of two main sources of university research financing in solar energy is examined to assess whether they oriented research in the direction of government programs. The strongest relationship appears to be in journal publication patterns. This scientific community has acquired the capacity to tap varying sources. (Author/MSE)
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or...indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...Coding ... 7 Statistical
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows point-and-click access to statistical functions frequently used in clinical studies, such as survival analyses, including competing-risk analyses and the use of time-dependent covariates. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved over the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at the individual level because of the nature of the observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed an overall summary of the ocular finding per individual, and three (3%) studies used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at the ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data.
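The intereye correlation issue raised above can be made concrete with a standard design-effect calculation: when both eyes of each patient are analysed as if independent, the effective sample size shrinks by a factor 1 + (m - 1)ρ for clusters of size m = 2, where ρ is the intereye correlation. A minimal sketch; the correlation value used is illustrative, not a figure from the paper:

```python
def effective_sample_size(n_eyes: int, intereye_rho: float) -> float:
    """Effective number of independent observations when two eyes per
    patient are analysed together: n / design_effect, where
    design_effect = 1 + (m - 1) * rho for clusters of size m = 2."""
    design_effect = 1.0 + (2 - 1) * intereye_rho
    return n_eyes / design_effect

# 200 eyes from 100 patients, with an illustrative intereye correlation of 0.6:
print(round(effective_sample_size(200, 0.6), 1))  # 125.0
```

Treating the 200 eyes as 200 independent observations would therefore overstate the information content by 60%, which is the kind of error the review flags.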
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
NASA Astrophysics Data System (ADS)
Charlebois, Serge
Many theoretical and experimental studies have been published on the topological excitations of two-dimensional electron gases (2DEGs), called skyrmions, in the quantum Hall regime at unit filling. Similar excitations, called bimerons, are expected in systems formed of two coupled 2DEGs. Unlike the case of single 2DEGs, no experiment has, to our knowledge, reported the measurement of a property specific to bimerons. In this thesis we present experimental work aimed at studying topological excitations in double-quantum-well heterostructures. One expected manifestation of bimerons is an anisotropy in the conductivity through a constriction. We designed an original point-contact device with three non-coplanar gates. The particular feature of this three-gate device is that it allows a narrow constriction to be created in the double 2DEG while the electron density between the two wells is balanced in the narrow conduction channel. We fabricated this submicron-scale device by electron lithography on double-well heterostructures. The fabricated devices were studied at low temperature (0.3 K) and performed as expected. The work did not reveal a transport anisotropy indicative of the existence of bimerons. This thesis is, to our knowledge, the first experimental study aiming to realize the transport-anisotropy experiment and is thus a significant contribution to the advancement of knowledge in this field. The theoretical work we present has shown the effect of topological excitations on the gate-2DEG capacitance of the system.
This work opens the way to detecting bimerons through measurement of the gate-2DEG capacitance, or of the electric susceptibility of the 2DEG. Pursuing this objective, we designed, built, and tested a device for in situ measurement of the gate-2DEG capacitance of a heterostructure. We also suggested other experimental methods for revealing bimerons through the coupling of the pseudospin texture to the capacitance of the 2DEG.
Characterization of the cohesion of the SMA/polymer interface in an adaptive deformable structure
NASA Astrophysics Data System (ADS)
Fischer-Rousseau, Charles
Adaptive deformable structures are expected to play an important role in aeronautics, among other fields. Shape memory alloys (SMAs) are among the most promising candidates. Much work remains, however, before these structures meet the demanding requirements of integration in an aeronautical context. Research has shown that the debonding resistance of the SMA/polymer interface can be a limiting factor in the performance of adaptive deformable structures. In this work, the effect on the debonding resistance of the SMA/polymer interface of various surface treatments, wire geometries, and polymer types is evaluated. The wire geometry is modified by a specific combination of cold rolling and post-deformation annealing that maintains the shape memory properties while reducing the cross-sectional area of the wire. The most promising thermomechanical treatment is proposed. A new method for evaluating debonding resistance is developed. Rather than testing the wires in pull-out and measuring the maximum force, the contraction tests are based on the ability of SMA wires to contract when they have been embedded in a stretched state and are heated by the Joule effect. Our hypothesis is that these tests better approximate the conditions encountered in an adaptive deformable structure, where the wires contract rather than being pulled out by a force external to the structure. Although partial debonding was observed for all samples, the debonded surface area was larger for samples with greater pre-strain. The debonding front appeared to stop progressing after the initial heating cycles when the heating rate was low.
A numerical model simulating the transient thermal response of the polymer and the SMA wire during Joule heating is implemented in ANSYS. The model's behaviour is validated against experimental results in which thermocouples embedded in the sample provide local temperature measurements. The computed results agree with the experimental results qualitatively but show significant quantitative differences. Measuring the strain field at the wire/polymer interface in an SDA would make it possible to develop and validate a numerical model that accounts for the shape-memory effect of a wire embedded in a polymer matrix. To that end, a miniature tensile machine allowing in situ Raman microspectrometry analysis is presented. It has a capacity of 1 kN and a maximum displacement of 20 mm within a total design envelope 160 mm in diameter. The machine is designed so that the middle of the sample remains stationary, the motion of the two ends being symmetric. The results show that further work is needed before SMA wires can be embedded in an adaptive deformable structure. Keywords: debonding, active structure, shape memory alloy, SMA, pull-out tests.
Improving anode quality by modifying pitch properties (Étude de l'amélioration de la qualité des anodes par la modification des propriétés du brai)
NASA Astrophysics Data System (ADS)
Bureau, Julie
The anodes produced must be of good quality in order to obtain primary aluminum while reducing the metal's production cost, energy consumption, and environmental emissions. Achieving the final properties of the anode requires a satisfactory bond between the coke and the pitch, yet current raw materials do not necessarily ensure coke/pitch compatibility. One of the most promising ways to improve the cohesion between these two materials is to modify the properties of the pitch. The objective of this work is to modify the pitch properties through chemical additives so as to improve the wettability of the coke by the modified pitch and thereby produce better-quality anodes. The chemical composition of the pitch is modified using surfactants or surface-modification agents chosen to enrich the functional groups likely to improve wettability. Economics, environmental footprint, and impact on production are considered in selecting the chemical additives. The methodology first characterizes the unmodified pitches, the chemical additives, and the cokes by Fourier-transform infrared spectroscopy (FTIR) to identify the chemical groups present. The pitches are then modified by adding a chemical additive in order to alter their properties, with different additive quantities used to examine the effect of concentration on the properties of the modified pitch. FTIR is used to evaluate the chemical composition of the modified pitches and to determine whether increasing the additive concentration enriches the functional groups that promote coke/pitch adhesion. The wettability of the coke by the pitch is then observed by the sessile-drop method.
An improvement in wettability after modification with a chemical additive indicates a possible improvement in the interaction between the coke and the modified pitch. To complete the evaluation of the data collected, the FTIR and wettability results are analyzed with an artificial neural network to better understand the underlying mechanisms. In light of the results obtained, the most promising chemical additives are selected in order to verify the effect of their use on anode quality. Laboratory anodes are produced using unmodified pitches and pitches modified with the selected additives. The anodes are then cored and characterized by determining some of their physical and chemical properties. The results for anode samples made from the same pitch, unmodified and modified, are compared to evaluate the improvement in anode quality. Finally, the possible impact of using a chemical pitch additive on energy and carbon consumption and on the quantity of aluminum produced is examined. Three different chemical additives were selected to modify the pitch: one surfactant and two surface-modification agents. FTIR analysis of the experiments on the modified pitches shows that two of the additives changed the chemical composition of the pitches tested. Analysis of the sessile-drop results suggests that a pitch modified with these two additives may improve the interaction with the cokes used in this study. Artificial-neural-network analysis of the data collected provides a better understanding of the link between a pitch's chemical composition and its wettability with a coke.
Characterization of the anode samples produced shows that these two additives can improve some anode properties compared with the standard samples, and analysis of the results indicates that one of the two additives gives more promising results. Overall, the work carried out in this project demonstrates that anode quality can be improved by modifying the pitch properties. Moreover, analysis of the results obtained provides a better understanding of the mechanisms between a pitch and a chemical additive.
Report of the Working Group on Aerodynamics of Aircraft Afterbody.
1986-06-01
2.1.71 Addy, A.L. Experimental-Theoretical Correlation of Supersonic Jet-on Base Pressure for Cylindrical Afterbodies. J. Aircraft, Vol. 7, No... Addy, A.L.; Agrell, J. An Improved Experimental-Theoretical Base Pressure Correlation for Conical and Cylindrical Afterbodies with Centered Propulsive... 2.1.92 Carrière, P. Effect of a Fluid Injection into the Dead-Water Region on the Reattachment Conditions of a Plane Supersonic Flow (Effet d'une Injection de Fluide dans l'Eau-Morte sur les Conditions de Recollement d'un Écoulement Plan Supersonique). Comptes
Wind Tunnel Wall Corrections (la Correction des effets de paroi en soufflerie)
1998-10-01
round holes drilled either normal to the wall surface or at a fixed angle to the normal. Variable porosity features have been implemented in several...walls (holes drilled at 60 deg from the normal), including variable porosity configurations and the effects of screens and splitter plates for edge-tone...Figure 5.68 Schematic of slender wing and the indicated gauge functions in anticipation of matching. As detailed in Malmuth and Cole [122], the problems
Joseph Buongiorno
2015-01-01
The objective of this study was to determine the effect of the monetary union on the trade of forest products between euro-using countries. A differential gravity model of bilateral trade flows was developed and estimated with data for the bilateral trade between 12 euro countries from 1988 to 2013, for commodity groups HS44 (wood and articles of wood), HS47 (pulp of...
Africa Civic Action Planning and Implementation Guide
1988-03-17
transfer of title of ownership, or that defects arising from the design are... the transfer of title will normally take effect at the point of loading at the manufacturer's; and for defense articles supplied from stock... before the transfer of title of ownership. If the designated "place of delivery" is other than the initial point of shipment, the
Pharmacogenetics: what is the situation in Morocco? (Pharmacogénétique: qu'en est-il au Maroc?)
Idrissi, Meryem Janati; Ouldim, Karim; Amarti, Afaf; El Hassouni, Mohamed; Khabbal, Youssef
2013-01-01
Pharmacogenetics is the study of the influence of interindividual genetic variation on drug response, with the goal of improving patient care through personalized medicine. The genomes of two individuals differ by only 0.1% of the 3.2 billion base pairs, yet this variation underlies adverse drug reactions, which have a major impact both clinically and economically. Over the past decade, such adverse reactions have been avoided thanks to pharmacogenetic testing. In Morocco, pharmacogenetic research is beginning to attract researchers' interest, with a few studies: a first study in 1986 on isoniazid acetylation in the Moroccan population, followed by two others in 2011 focusing on the metabolism of tacrolimus and of vitamin K antagonists. The hope now is to identify the major genetic polymorphisms affecting Moroccan patients in order to provide them with appropriate care. PMID:23785548
Kerr effect in the isotropic phase of a side-chain polymeric liquid crystal
NASA Astrophysics Data System (ADS)
Reys, V.; Dormoy, Y.; Collin, D.; Keller, P.; Martinoty, P.
1992-02-01
The birefringence induced by a pulsed electrical field was used to study the pretransitional effects associated with the isotropic phase of a side-chain polysiloxane. The results obtained show that these effects are characterised by a conventional value of the static exponent and an abnormal value of the dynamic exponent, which shows that the dynamic theory of low molecular weight liquid crystals does not apply. The results also reveal competition between the dipolar moments induced by the electrical field and the permanent moments of the mesogenic molecules.
NASA Astrophysics Data System (ADS)
Saint-James, Par D.
The excitation spectrum of a layer of normal metal (N) deposited on a superconducting substrate (S) is discussed. It is shown that if the electron-electron attractive interaction is negligibly small in (N) there is no energy gap in the excitation spectrum even if the thickness of the layer (N) is small. A similar study, with equivalent conclusions, has been carried out for two superconductors and for normal metal spheres embedded in a superconductor. The effect may possibly explain some peculiar results of tunnelling experiments on hard superconductors.
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics, so the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
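The auto-correlation problem the abstract describes can be shown numerically. The sketch below uses hypothetical data (not from the paper): three simulated hosts with different baseline parasitaemia are each measured at five time points, a naive pooled regression is fitted while ignoring host identity, and the lag-1 autocorrelation of the residuals is computed. Residuals from the same host cluster together, violating the independence assumption of simple models.

```python
from statistics import mean

# Hypothetical repeated-measures data: three hosts, five time points each.
# Parasitaemia rises identically with time within each host, but hosts
# differ in baseline level.
hosts = {"A": 0.0, "B": 5.0, "C": 10.0}
times = [0, 1, 2, 3, 4]
t, y = [], []
for baseline in hosts.values():
    for ti in times:
        t.append(ti)
        y.append(baseline + 1.0 * ti)  # within-host slope of 1.0

# Naive pooled least-squares fit ignoring which host each point came from.
tbar, ybar = mean(t), mean(y)
slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / sum(
    (ti - tbar) ** 2 for ti in t)
intercept = ybar - slope * tbar
resid = [yi - (intercept + slope * ti) for ti, yi in zip(t, y)]

# Lag-1 autocorrelation of the residuals (data ordered host by host):
# residuals from the same host are nearly identical, not independent.
r1 = sum(a * b for a, b in zip(resid, resid[1:])) / sum(e * e for e in resid)
print(round(r1, 3))  # 0.8: strongly correlated residuals
```

A mixed effects model with a random intercept per host absorbs exactly this between-host variation, which is why the authors recommend it for repeated-measures infection data.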
Modality effect in false recognition: evidence from Chinese characters.
Mao, Wei Bin; Yang, Zhi Liang; Wang, Lin Song
2010-02-01
Using the Deese/Roediger-McDermott (DRM) false memory method, Smith and Hunt (1998) first reported the modality effect on false memory and showed that false recall from DRM lists was lower following visual study than following auditory study, which led to numerous studies on the mechanism of the modality effect on false memory and provided many competing explanations. In the present experiment, the authors tested the modality effect in false recognition by using a blocked presentation condition and a random presentation condition. The experiment found a modality effect different from the results of previous research; namely, false recognition was greater following visual study than following auditory study, especially in the blocked presentation condition rather than in the random presentation condition. The authors argue that this reversed modality effect may be due to different encoding and processing characteristics of Chinese characters and English words. Compared with English words, visual graphemes of critical lures in Chinese lists are likely to be activated and encoded in participants' minds, so it is more difficult for participants to later discriminate these internally generated graphemes from items presented in the visual modality. Hence visual presentation could lead to more false recognition than auditory presentation with Chinese lists. The results demonstrate that semantic activation occurring during the encoding and retrieval phases played an important role in the modality effect in false recognition, and the findings can be explained by the activation-monitoring account.
Pharmacotherapy of depression in the elderly (Pharmacothérapie de la dépression chez les aînés)
Frank, Christopher
2014-01-01
Summary. Objective: To discuss the pharmacological treatment of depression in the elderly, including the choice of antidepressants, dose titration, monitoring of response and side effects, and management of refractory cases. Sources of information: The 2006 Canadian Coalition for Seniors' Mental Health guidelines on the assessment and treatment of depression served as the main source. To identify articles published after the guidelines, a MEDLINE literature search from 2007 to 2012 was conducted using the English terms depression, treatment, drug therapy, and elderly. Main message: The goal of treatment should be remission of symptoms. Improvement can be monitored against the patient's identified goals or with clinical tools such as the Patient Health Questionnaire-9. Treatment should be considered in 3 phases: an acute phase to achieve remission of symptoms, a continuation phase to prevent recurrence of the same episode of illness (relapse), and a maintenance (prophylaxis) phase to prevent future episodes (recurrence). The initial dose should be half the usual adult starting dose, titrated regularly until the patient responds, the maximum dose is reached, or side effects limit further increases. Common side effects include falls, nausea, dizziness, and headache and, less commonly, hyponatremia and QT-interval changes. Strategies for switching or augmenting antidepressants are presented.
Older patients should be treated for at least a year from the time clinical improvement is observed, and those with recurrent depression or severe symptoms should continue treatment indefinitely. Management of special situations such as severe depression or depression with psychosis is discussed, including the use of electroconvulsive therapy. Criteria for referral to geriatric psychiatry are given, although many family physicians do not have ready access to this resource or to other nonpharmacological clinical strategies. Conclusion: The effectiveness of pharmacotherapy for depression is not substantially affected by age. Identifying depression, choosing the appropriate treatment, titrating medications, monitoring side effects, and treating for an adequate duration will improve outcomes for older patients.
1992-10-01
(N=8) and Results of Statistical Analyses for Impact Test Performed on Forefoot of Unworn Footwear. A-2. Summary Statistics (N=8) and Results of... on Forefoot of Worn Footwear... B-2. Summary Statistics (N=4) and Results of Statistical Analyses for Impact... used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was
2011-01-01
Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I² statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I² statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
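The standard Q and I² statistics mentioned above have simple closed forms: Cochran's Q is the weighted squared deviation of study effects from the fixed-effect pooled estimate, with inverse-variance weights, and I² = max(0, (Q − df)/Q) × 100%. A minimal sketch with illustrative numbers (not data from the 18 reviewed meta-analyses):

```python
def cochran_q_i2(effects, variances):
    """Cochran's Q and the I-squared statistic for a set of study
    effect estimates and their within-study variances."""
    w = [1.0 / v for v in variances]  # inverse-variance weights
    # Fixed-effect pooled estimate.
    theta = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - theta) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I-squared: share of total variation attributable to heterogeneity.
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return q, i2

# Hypothetical three-study meta-analysis with visibly dispersed effects.
q, i2 = cochran_q_i2([0.0, 0.5, 1.0], [0.04, 0.04, 0.04])
print(round(q, 2), round(i2, 1))  # 12.5 84.0
```

Under the null of homogeneity Q follows approximately a chi-squared distribution with df = k − 1, which is what the standard heterogeneity test uses; the 'generalised' Q methods discussed in the abstract modify the weights rather than this basic form.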
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. 
The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
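The experiment-wise Type I error rates discussed above follow from a standard relation: for k independent tests each run at level α, the probability of at least one false positive is 1 − (1 − α)^k. The sketch below illustrates this (the independence assumption is a simplification) and shows how a Bonferroni correction restores control:

```python
def experimentwise_alpha(k, alpha=0.05):
    """Probability of at least one false positive across k independent
    hypothesis tests, each run at significance level alpha."""
    return 1.0 - (1.0 - alpha) ** k

# Ten uncorrected tests at alpha = .05 already push the family-wise
# error rate toward the median of .54 reported in the review.
print(round(experimentwise_alpha(10), 3))            # 0.401
# Bonferroni: divide alpha by the number of tests to cap the rate.
print(round(experimentwise_alpha(10, 0.05 / 10), 3))  # 0.049
```

With correlated tests the true inflation is smaller than this bound, but the qualitative point stands: without correction, the chance of a spurious finding grows quickly with the number of tests.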
1991-11-01
AGARD-AR-306, Advisory Group for Aerospace Research and Development, 7 Rue Ancelle, 92200 Neuilly-sur-Seine, France... Effects of Adverse Meteorological Conditions on Aerodynamics (Les Effets des Conditions Météorologiques Adverses sur l'Aérodynamique) - J.J. Reinmann, National Aeronautics and Space Administration, Lewis Research Center
ERIC Educational Resources Information Center
Reinwein, Joachim
A study investigated the degree to which the page layout of a book affects the young reader's association of text with appropriate illustrations. Four hundred native French-speaking third-graders in eight Montreal (Canada) schools participated. In a third-grade text about animals, the names of the animals illustrated in pictures and other words…
CFD Simulations of a Ferry in Head Seas
2011-03-01
the overall fatigue life of the ship. It is important, for the design, operation, and life-cycle management of the... vibration that stresses the primary hull structure and adversely affects the overall fatigue life of the... which cannot directly simulate the physics of slamming. This report contains the results of a study that...
Real-Time Adaptive Control of Mixing in a Plane Shear Layer
1994-02-02
the flow of an incompressible viscous fluid around a fixed or rotating cylinder. Magnus effect (Effet Magnus). J. Méc. 14, 109-134. Taneda, S. 1977 Visual study...Mokhtarian & Yokomizo 1990), and in lift enhancement schemes employing the Magnus effect (Swanson 1961). Rotation of all or part of a body may also have...coordinate system. In this work, the body-fitted grid is simply one of cylindrical polar coordinates and is time-independent, except for a = 3.25 where
1999-04-01
project, there was a requirement to place a camera behind each photogrammetric target in the image and for each cylindrically curved window...testing... by the wind tunnel's captive trajectory sting... Navy engineers observed significant differences... Flying Qualities, Symposium on Aeroballistics, May 1981. Aerodynamics, and Structures disciplines benefit directly. 6. Magnus, A. E., and Epton, M. A
1991-01-01
fixed or rotating cylinder. Magnus effect (Effet Magnus). J. Méc. 14, 109-134. Taneda, S. 1977 Visual study of unsteady separated flows around bodies. Prog. Aero...enhancement schemes employing the Magnus effect (Swanson 1961). Rotating all or part of a body may also have applications in active or feedback control of...into the governing equations in the generalized coordinate system. In this study, the body-fitted grid is simply one of cylindrical polar
AFTERRISE: Deep Body Temperature Following Exercise
1992-04-01
Also, the effects of posture and clothing during recovery and of the recovery-room temperature were examined. Five men... a short-sleeved shirt). Rectal and skin temperatures were measured every minute during exercise and recovery... AFTERRISE: Deep Body Temperature Following Exercise (U), by S. Tuck and A.A
Why Cold-Wet Makes One Feel Chilled: A Literature Review
1988-06-01
cold and wet. The effect of solar radiation and the interactions between the skin and humidity and between the skin and temperature are also examined...directions, including back out into space. Aerosols of water in clouds reflect incident solar energy. The upper surface of a stratus cloud cover can reflect...earth than under clear conditions. Albedo, the fraction of the incident energy which is reflected by a surface, varies considerably with the terrain
1983-09-01
reduction of stress intensity at a crack tip due to creep was responsible for increasing the fatigue life during the "slow-fast" tests. As creep is clearly...Aeronautical Establishment Structures and Materials Laboratory
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p<0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, the share of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature: 49.3% (658/1,335) in 1998 and 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), from 92.7% (945/1,019) to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p<0.001), from 9.7% (99/1,019) to 4.3% (56/1,309), although some serious defects persisted.
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
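The year-over-year comparisons of defect proportions above are standard two-sample comparisons of proportions. As an illustration only (the counts below are hypothetical, not the study's data), a Pearson chi-square test on a 2×2 table can be computed with the standard library alone:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for a
    2x2 table [[a, b], [c, d]], plus its p-value under chi2 with 1 df."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p = math.erfc(math.sqrt(chi2 / 2))  # survival function of chi2(1 df)
    return chi2, p

# Hypothetical counts: defects present/absent in two publication years.
chi2, p = chi2_2x2(60, 40, 45, 55)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```

For a chi-square distribution with one degree of freedom, the p-value reduces to `erfc(sqrt(chi2/2))`, which avoids any dependency beyond `math`.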
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
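The abstract does not publish the algorithm's internals, but one plausible sketch of an iterative, nonparametric outlier screen of the kind described uses the median and median absolute deviation (MAD), statistics that are themselves insensitive to the outliers being hunted. The function name and threshold below are illustrative assumptions, not the NASA implementation:

```python
import statistics

def flag_outliers(readings, threshold=3.5):
    """Iteratively flag readings whose modified z-score, built from the
    median and the median absolute deviation (both nonparametric),
    exceeds the threshold; repeat until no new outliers are found."""
    flagged = set()
    while True:
        kept = [(i, r) for i, r in enumerate(readings) if i not in flagged]
        values = [r for _, r in kept]
        med = statistics.median(values)
        mad = statistics.median(abs(v - med) for v in values)
        if mad == 0:  # degenerate case: no spread left to judge against
            return flagged
        new = {i for i, r in kept if 0.6745 * abs(r - med) / mad > threshold}
        if not new:
            return flagged
        flagged |= new

gauges = [10.2, 11.0, 9.4, 10.5, 12.1, 97.0]  # one implausible reading
print(flag_outliers(gauges))                   # -> {5}
```

The 0.6745 factor rescales the MAD to be comparable to a standard deviation under normality; the iteration re-screens after each removal so a gross outlier cannot mask a smaller one.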
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
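The review does not reproduce the incorrect analyses it found, but a widely accepted approach for pooling study results is inverse-variance random-effects meta-analysis. A minimal sketch of the DerSimonian-Laird estimator, with hypothetical inputs:

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes with the DerSimonian-Laird
    random-effects model (inverse-variance weighting)."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    var_pooled = 1.0 / sum(w_star)
    return pooled, var_pooled, tau2

# Three hypothetical studies with identical effects: tau2 collapses to 0
# and the pooled estimate equals the common effect.
print(dersimonian_laird([0.2, 0.2, 0.2], [0.04, 0.04, 0.04]))
```

When the studies disagree more than their within-study variances explain, `tau2` becomes positive and widens the pooled confidence interval, which is exactly the behavior a fixed-effect analysis (one common incorrect shortcut) fails to provide.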
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting.
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
architecture, there are four tiers: Client (Web Application Clients ), Presentation (Web-Server), Processing (Application-Server), Data (Database...organization in each period. This data will be collected to analyze. i) Analyses and Validation: We will do a statistics test in this data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will do a statistics test in this data, Pareto analyses and confirmation
Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary
2003-02-01
Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The used three...2.2. Vibration Diagnostics (VD)) Parallel to the NDT measurements the Statistical Energy Analysis (SEA) as a vibration diagnostic tool were...noises were analysed with a dual-channel real time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
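One remedy for the unadjusted sub-dimension comparisons criticized above is a step-down multiple-comparison correction. A minimal sketch of the Holm adjustment (the function name is an assumption, not from the source):

```python
def holm_adjust(p_values):
    """Holm step-down adjustment of p-values for multiple comparisons,
    e.g. across HRQoL questionnaire sub-dimensions."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        adj = min(1.0, (m - rank) * p_values[idx])
        running_max = max(running_max, adj)  # enforce monotone adjusted p-values
        adjusted[idx] = running_max
    return adjusted

print(holm_adjust([0.01, 0.04, 0.20]))  # approximately [0.03, 0.08, 0.20]
```

Holm's procedure controls the family-wise type I error rate at least as well as Bonferroni while being uniformly more powerful, which makes it a reasonable default when many sub-dimensions are tested at once.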
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were also made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences, that are unfortunately not noted often enough in review.
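The un-binned, robustness-checked approach advocated here can be illustrated with a permutation test, which draws the null distribution from the data itself rather than from arbitrary binning choices. A self-contained sketch on toy data (not the sunspot or pandemic series):

```python
import random

def permutation_pvalue(x, y, n_perm=2000, seed=1):
    """Two-sided permutation test for association between x and y,
    using the absolute Pearson correlation as the test statistic."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        va = sum((ai - ma) ** 2 for ai in a)
        vb = sum((bi - mb) ** 2 for bi in b)
        return cov / (va * vb) ** 0.5

    rng = random.Random(seed)
    observed = abs(corr(x, y))
    y_perm = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)  # break any real pairing between x and y
        if abs(corr(x, y_perm)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one rule avoids p = 0

# Strongly associated toy data: the permutation p-value should be small.
x = list(range(20))
y = [2 * v + 1 for v in x]
print(permutation_pvalue(x, y))
```

Because the permutations preserve each series' marginal distribution while destroying their pairing, the resulting p-value does not depend on binning or distributional assumptions, and re-running with different seeds is a cheap robustness check of the kind the paper argues was missing.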
Lubala, Toni Kasole; Shongo, Mick Yapongombo; Munkana, Arthur Ndundula; Mutombo, Augustin Mulangu; Mbuyi, Sébastien Musanzayi; wa Momat, Félix Kitenge
2012-01-01
In the Democratic Republic of the Congo, congenital malformations constitute a genuine public health problem. Indeed, they revive the debate on the effects of intensified mining activity on reproductive health. From 2009 to 2010, we calculated a prevalence of 5.84 per 1,000 births. Malformations of the central nervous system were the most frequent (2.029 per 1,000), followed by limb malformations (1.055 per 1,000) and orofacial clefts (0.811 per 1,000). These figures are certainly largely underestimated, and the related causes in the Democratic Republic of the Congo are neither monitored nor prevented under any government policy. The establishment of a national registry and a national reference centre for human genetics could provide a rigorous, organized, and structured framework for the surveillance and prevention of congenital malformations. PMID:23396951
Cold pearl surfactant-based blends.
Crombie, R L
1997-10-01
Pearlizing agents have been used for many years in cosmetic formulations to add a pearlescent effect. Cold pearl surfactant-based blends are mixtures of glycol stearates and surfactants which can be blended in the cold into a wide range of personal-care formulations to create a pearlescent lustre effect. Under controlled manufacturing conditions constant viscosities and crystalline characteristics can be obtained. The development of these blends has been driven by efforts to improve the economics of adding solid pearlizing agents directly into a hot mix formulation. This paper summarizes the history of pearlizers, describes their advantages and physical chemistry of the manufacturing process. Finally some suggestions for applications are given.
NASA Astrophysics Data System (ADS)
Nicolas, M.; Guyen van Huong, C. N.; Burger, J. P.; Dubon, A.; Fitoussi, J. P.; Laval, J. Y.
1991-11-01
a.c. susceptibility measurements and 3D electronic microscopy observation were carried out on three Bi-compounds (2212 and 2223). Depending on the annealing conditions, one observes lamellar-type materials organized either in a more or less dense structure of fine particles or in very large slabs. The a.c. susceptibility signals allow to separate the intergranular from the intragranular effects. The combined results obtained by both techniques lead us to conclude that the intergranular and the intragranular decoupling, induced by magnetic fields are of same nature, i.e. occur at the (BiO)2 slab.
Magnetochirality and stochastic resonances in lasers
NASA Astrophysics Data System (ADS)
Bonnet, C.; Bretenaker, F.; Brunel, M.; Chauvat, D.; Emile, O.; Lai, N. D.; Le Floch, A.; Ropars, G.; Ruchon, T.; Singh, K.; Thépot, J.-Y.; Vallet, M.
2002-06-01
The eigenstates of a laser are a tool of choice for studying the different roles played by noise in a system. On the one hand, if one wants to isolate a small effect that is difficult to access by conventional methods, these eigenstates make it possible to perform high-precision differential measurements, provided that mechanical, optical, and electronic noise can be eliminated. As an example, we used the co-propagating and counter-propagating eigenstates of a ring ion laser to measure a weak fundamental interaction: magnetochiral birefringence. This "birefringence" manifests itself as a small index variation depending on the direction of travel around the ring, of the order of Δn ≈ 10^{-11}, independent of polarization. Conversely, the two eigenstates of a Fabry-Perot-type laser constitute an ideal system for exploring two-dimensional stochastic resonances. Stochastic resonances by inhibition and by rotation are isolated in the presence of Gaussian white noise, for both optical and magnetic noise. The possible use of spontaneous emission as an active noise source is demonstrated.
Youth Supervision While Mothers Work: A Daily Diary Study of Maternal Worry.
Blocklin, Michelle K; Crouter, Ann C; McHale, Susan M
2012-01-01
Using data from a daily diary study of hourly hotel employees in the U.S. and their children, this study examined links between youth supervision arrangements and maternal worry while at work, examining both differences between individuals and day-to-day variation within individuals. Multilevel model analyses revealed both between- and within-person effects linking youth supervision to maternal worry. Mothers' partner status functioned as moderator, and maternal knowledge also emerged as a protective factor when youth were in self-care, highlighting a potential target for future work-family interventions, particularly those for hourly employees with limited access to family-friendly workplace policies.
Experimental characterization of the acoustic transmission of aeronautical structures
NASA Astrophysics Data System (ADS)
Pointel, Vincent
Passenger comfort inside aircraft during flight is an area of constant improvement. The growing proportion of composite materials in the manufacture of aeronautical structures brings new problems to solve. The low damping of these structures, the counterpart of their low weight-to-stiffness ratio, is unfavourable acoustically, which forces designers to find means of improvement. Moreover, the mechanisms of sound transmission through an aeronautical double-wall system are not completely understood, which is the motivation for this study. The main objective of this project is to build a database for its industrial partner, Bombardier Aéronautique. Indeed, experimental data on the sound-insulation performance of complete systems representative of an aircraft fuselage are very rare in the scientific literature. An experimental methodology is therefore used in this project. Two different fuselage designs are compared. The first has a stiffened metallic skin (the outer part of the fuselage), while the second consists of a composite sandwich panel. In both cases, a trim panel of sandwich construction is used. A glass-wool acoustic treatment is placed inside each fuselage. Vibration isolators are used to connect the two panels of the fuselage. Laboratory simulation of the turbulent boundary layer, which is the dominant excitation source during flight, is not yet possible outside a wind tunnel. Two excitation cases are therefore considered to approximate this loading: a mechanical excitation (shaker) and an acoustic one (diffuse field).
Validation and analysis of the results are carried out with the NOVA and VAONE software packages used by the project's industrial partner. A secondary objective is to validate the double-wall model implemented in NOVA. Investigation of the effect of local compression of the acoustic treatment on the transmission loss of a single wall shows that this action has no notable benefit. On the other hand, the stiffness of the vibration isolators appears directly linked to the insulation performance of the double-wall system. The double-wall system with a composite skin seems less sensitive to this parameter. The NOVA double-wall model gives good results for the double-wall system with a metallic skin. Larger discrepancies are observed at mid and high frequencies for the system with a composite skin. However, given the complexity of the structure, the good trend of the prediction is rather promising.
Follow-up after breast cancer treatment
Sisler, Jeffrey; Chaput, Geneviève; Sussman, Jonathan; Ozokwelu, Emmanuel
2016-01-01
Abstract Objective To provide family physicians with a summary of evidence-based recommendations to guide the care of survivors treated for breast cancer. Quality of evidence A literature search was conducted in MEDLINE from 2000 to 2016 using the following English keywords: breast cancer, survivorship, follow-up care, aftercare, guidelines, and survivorship care plans, focusing on a review of the guidelines recently published by national cancer organizations. Evidence was level I to III. Main message Survivorship care comprises 4 facets: surveillance and screening, management of long-term effects, health promotion, and care coordination. Surveillance for recurrence entails only annual mammography, and screening for other cancers should follow population-based guidelines. Management of the long-term effects of cancer and its treatment addresses common problems such as pain, fatigue, lymphedema, distress, and adverse drug effects, as well as long-term concerns such as heart and bone health. Health promotion highlights the benefits of activity in cancer survivors, with an emphasis on physical activity. Survivorship care is of higher quality when various health services and professionals are involved, and the family physician plays an important role in care coordination. Conclusion Family physicians are increasingly the main providers of follow-up care after breast cancer treatment. Breast cancer should be considered a chronic medical condition, even in women in remission, and patients benefit from the same approach used for other chronic conditions in primary care. PMID:27737992
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background: High quality clinical research requires not only advanced professional knowledge but also sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Methodology/Principal Findings: Ten leading Chinese medical journals were selected, and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, the proportion of errors/defects in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, randomized clinical trial designs remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1,023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted.
Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology, and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally, we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature.
The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
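The cumulative-comprehension figures above can be reproduced in principle by checking, for each article, whether every method it uses falls within a reader's repertoire. A toy sketch (hypothetical corpus, not the study's data):

```python
def comprehension_curve(articles, method_ranking):
    """Fraction of articles fully interpretable as a reader learns
    statistical methods one at a time, in the given ranking."""
    known = set()
    curve = []
    for method in method_ranking:
        known.add(method)
        # An article is interpretable only if all its methods are known.
        readable = sum(1 for used in articles if used <= known)
        curve.append(readable / len(articles))
    return curve

# Hypothetical corpus: each article is the set of methods it uses
# (an empty set means no statistical methods beyond descriptives).
articles = [{"t-test"}, {"t-test", "anova"}, {"regression"}, set()]
print(comprehension_curve(articles, ["t-test", "anova", "regression"]))
# -> [0.5, 0.75, 1.0]
```

Ranking methods by corpus frequency, as the study does, makes the curve rise as steeply as a greedy ordering allows, which is what gives figures like "15 methods to reach 51.4%" their meaning.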
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
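The "balance horizontal winds" mentioned are, to leading order, geostrophic winds diagnosed from horizontal geopotential gradients. A minimal sketch on an idealized Cartesian grid (all values illustrative, not from the atlas):

```python
import numpy as np

# Geostrophic balance: u_g = -(1/f) dPhi/dy, v_g = (1/f) dPhi/dx,
# where Phi is geopotential and f the Coriolis parameter.
f = 1.0e-4                       # Coriolis parameter (1/s), mid-latitude value
x = np.linspace(0.0, 1.0e6, 51)  # eastward distance (m)
y = np.linspace(0.0, 1.0e6, 51)  # northward distance (m)
X, Y = np.meshgrid(x, y)         # rows vary in y, columns in x

phi = 1.0e-3 * Y                 # geopotential increasing northward (m^2/s^2)

dphi_dy, dphi_dx = np.gradient(phi, y, x)  # gradients along axis 0 then axis 1
u_g = -dphi_dy / f               # expect a uniform -10 m/s (easterly flow)
v_g = dphi_dx / f                # expect 0

print(float(u_g[0, 0]), float(v_g[0, 0]))
```

Because the sample field is linear, the finite-difference gradient is exact everywhere; on a real latitude-longitude grid, f varies with latitude and metric factors enter the derivatives.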
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
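The one-parameter Rasch model used in these analyses expresses the probability of a correct response as a logistic function of the gap between student ability and item difficulty; a minimal sketch (the function name is an illustrative assumption):

```python
import math

def rasch_probability(theta, b):
    """One-parameter (Rasch) IRT model: probability that a student of
    ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student whose ability equals the item difficulty has a 50% chance;
# the probability rises toward 1 as ability exceeds difficulty.
print(rasch_probability(0.7, 0.7))             # -> 0.5
print(round(rasch_probability(2.0, 0.0), 3))
```

Fitting the model estimates one difficulty per item and one ability per student on a common logit scale, which is what lets an inventory such as SRBCI contain "items that vary widely in difficulty" yet measure a single construct.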
Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non normal distributions violate the assumptions of analyses in which test statistics are...
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
2014-11-01
...of the exchange rate, and those responsible for internal management therefore find themselves pressed to find ... to measure the negative effects that currency fluctuations can have on the MDN budget and planning, one must know the weight of the ... of comparable quality, and that they allow a better assessment of risk than the current method. Estimates are now obtained of
Swell Across the Continental Shelf
2001-09-01
Arlington, VA The views expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense ... the source term, while the effects of refraction and shoaling caused by depth variations at sub-grid scales are ... precisely taken into account thanks to the precomputed rays. This model can thus be applied to vast coastal areas with meshes
2018-01-01
promote cooperative research and information exchange, and secondly an in-house delivery business model where S&T activities are conducted in a NATO...Panel These Panels and Group are the power-house of the collaborative model and are made up of national representatives as well as recognised world...of Radiation Injury and Recovery 1-31 1.3.3.2 Mini-Pig Model of Acute Radiation Syndrome ( ARS ) 1-31 1.3.3.3 Medical Countermeasures (MedCM) 1-32
1983-01-01
considered important, complete, and a lasting contribution to existing knowledge. Mechanical Engineering Reports (MS): Scientific and technical information... pertaining to investigations outside aeronautics considered important, complete, and a lasting contribution to existing knowledge. AERONAUTICAL... NOTES (AN): Information less broad in scope but nevertheless of importance as a contribution to existing knowledge. LABORATORY TECHNICAL REPORTS (LTR
2011-06-01
...performance were observed between the different groups of subjects with respect to the effect of the passive mode of use compared with listening without... work is staged over three fiscal/reporting periods as follows: Stage I (ending March 31st 2009): Design a Study; Stage II (ending March 31st... under PWGSC Contract No. W7711-088145/001/TOR. It presents the study design and methods devised in Stage I, subjective and objective data collected
1993-08-01
subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted... used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure-of-merit data... merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas
2015-12-01
The study evaluated whether the renal function decline rate per year with age in adults varies based on two primary statistical analyses: cross-section (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2364 days (mean: 793 days). A simple linear regression and random coefficient models were selected for CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that rates are different based on statistical analyses, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rates per year with age in adults. In conclusion, our findings indicated that one should be cautious in interpreting the renal function decline rate with aging information because its estimation was highly dependent on the statistical analyses. From our analyses, a population longitudinal analysis (e.g. random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
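The cross-sectional versus longitudinal contrast in this abstract can be shown with a toy calculation: pooling all observations into one regression confounds between-subject differences with within-subject change, whereas averaging within-subject slopes (a deliberately crude stand-in for the random coefficient model the study actually used) recovers the true decline. The data below are invented for illustration:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Hypothetical data: two subjects, each declining by exactly 1 unit/year,
# but the older subject starts from a much lower baseline.
subjects = {
    "A": [(40, 120.0), (42, 118.0), (44, 116.0)],  # (age, clearance)
    "B": [(60, 90.0), (62, 88.0), (64, 86.0)],
}

# Cross-sectional (CS): pool every observation into one regression.
all_pts = [pt for obs in subjects.values() for pt in obs]
cs_slope = ols_slope([a for a, _ in all_pts], [v for _, v in all_pts])

# Longitudinal (LT), simplified: slope within each subject, then averaged.
lt_slope = sum(
    ols_slope([a for a, _ in obs], [v for _, v in obs])
    for obs in subjects.values()
) / len(subjects)
```

Here `lt_slope` is exactly -1.0/year, while `cs_slope` is steeper, mirroring the study's finding that the CS estimate (1.33 ml/min/year) exceeded the LT estimate (0.95 ml/min/year).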
Inferential Statistics in "Language Teaching Research": A Review and Ways Forward
ERIC Educational Resources Information Center
Lindstromberg, Seth
2016-01-01
This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test, and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
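The between-groups d that the authors take as their target is the familiar pooled-standard-deviation standardized mean difference; their single-case estimator adds corrections not shown here. A minimal sketch of the basic between-groups version, with made-up scores:

```python
import math

def cohens_d(group1, group2):
    """Standardised mean difference using a pooled standard deviation,
    i.e. the usual between-groups d."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical outcome scores for two phases/groups.
treatment = [8.0, 9.0, 7.5, 8.5]
control = [5.0, 6.0, 5.5, 4.5]
d = cohens_d(treatment, control)
```

Because d is expressed in pooled-standard-deviation units, effects measured on different outcome scales become comparable, which is what makes the statistic integrable across studies in a meta-analysis.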
The effect of severe plastic deformation on the hydriding properties of magnesium
NASA Astrophysics Data System (ADS)
Lang, Julien
The research work carried out during my master's project in physics at the Université du Québec à Trois-Rivières, in the laboratories of the Institut de Recherche sur l'Hydrogène, was to compare the effect of cold rolling MgH2 powder with that of ball milling. We studied this new technique using a vertical rolling mill designed specifically for rolling powder. We rolled the MgH2 powder 5, 25, 50 and 100 times. The morphology of MgH2 powder as received from the manufacturer, and after 30 minutes of ball milling, was compared with that of the rolled powder using a scanning electron microscope. We then measured the hydrogen sorption properties with a Sievert-type PCT apparatus. We also determined the crystal structure by X-ray diffraction. From these results, we found that the optimal number of rolling passes is five, which yields hydrogen absorption/desorption characteristics similar to 30 minutes of ball milling. We also used the hydrogen absorption and desorption kinetics curves to determine the rate-limiting step in the sorption reactions of the rolled samples. Since five rolling passes take about 10 seconds, cold rolling is industrially more attractive than ball milling because of the substantial savings in time and energy.
NASA Astrophysics Data System (ADS)
Roy, G.; Buy, F.; Llorca, F.
2002-12-01
The study presented here is part of an effort to construct an analytical or semi-analytical model of damageable elasto-visco-plastic behaviour, applicable to the loadings encountered in violent impact configurations that generate ductile spallation. Taking compressibility and micro-inertia effects into account is essential for modelling the growth phase. Global numerical simulations of the structure, and local simulations at the scale of the heterogeneities, make it possible to evaluate the stress levels in zones liable to damage, to assess analytical criteria for damage nucleation, and to understand the mechanisms of interaction between defects. Micro-inertial and compressibility effects are thus brought out in the nucleation and coalescence phases of the micro-defects. This is a non-exhaustive illustration of work undertaken at CEA Valduc on tantalum in the context of a thesis [10]. A materials programme in CEA-CNRS partnership on multiscale modelling of the behaviour of structures has also been initiated in this context.
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
Comments on `A Cautionary Note on the Interpretation of EOFs'.
NASA Astrophysics Data System (ADS)
Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio
2003-04-01
The misleading aspect of the statistical analyses used in Dommenget and Latif, which raises concerns on some of the reported climate modes, is demonstrated. Adopting simple statistical techniques, the physical existence of the Indian Ocean dipole mode is shown and then the limitations of varimax and regression analyses in capturing the climate mode are discussed.
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed - 19 periods of streamflow multiplied by the 3 streamflow statistics plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One general region studied, which seemingly has relatively more upward trends for many of the streamflow statistics analyzed, includes the rivers and associated creeks and bayous to Galveston Bay in the Houston metropolitan area. 
Lastly, the most western river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
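Kendall's Tau, the trend test named above, counts concordant versus discordant pairs of observations. A sign-based sketch (this is tau-a, ignoring ties; the flow values are invented, and the USGS analysis also involves a significance test not shown here):

```python
def kendall_tau(xs, ys):
    """Kendall's tau-a: (concordant - discordant) / total pairs, no tie handling."""
    n = len(xs)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[j] - xs[i]) * (ys[j] - ys[i])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

# Hypothetical annual mean daily discharge series with a monotonic decline.
years = [2000, 2001, 2002, 2003, 2004]
annual_mean_flow = [110.0, 95.0, 90.0, 70.0, 65.0]
tau = kendall_tau(years, annual_mean_flow)
```

Because only the signs of pairwise differences matter, the test detects monotonic upward or downward trends without assuming linearity or a particular distribution, which is why it suits streamflow records.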
Hydrodynamic Coating of a Fiber
NASA Astrophysics Data System (ADS)
Quéré, D.; de Ryck, A.
We discuss how a solid (especially a fiber) is coated when drawn out of a bath of liquid. 1. For slow withdrawals out of pure viscous liquids, the data are found to be fitted by the famous Landau law: then, the coating results from a balance between viscosity and capillarity. For quicker withdrawals, the thickness of the entrained film suddenly diverges, at a velocity of order 1 m/s. Inertia is shown to be responsible for this effect. At still higher velocities, the thickness decreases with the velocity because the solid can only entrain the viscous boundary layer. 2. For complex fluids, surface effects are found in the low velocity regime: out of a surfactant solution, films are thicker than predicted by Landau, by a factor of order 2. The thickening factor is shown to be fixed by the Marangoni flow due to the presence of surfactants; out of an emulsion, the film can be enriched with oil, which can be understood by a simple model of capture; out of a polymer solution, a strong swelling of the film is observed if normal stresses are present. Hence, the problem has two families of solution: (i) at low velocity, the thickness of the layer is fixed by a balance between viscous and surface forces and thus is sensitive to the presence of surfactants, or other heterogeneities; (ii) at high velocity, inertia must be considered and the film thickness is fixed by the bulk properties of the liquid (density and viscosity). In these regimes, it is not affected by the presence of surfactants in the bath.
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
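Of the statistics compared above, the core of Tau-U is a simple cross-phase nonoverlap count. The sketch below shows only the basic A-versus-B component (no baseline-trend correction, which full Tau-U adds); the data points are invented:

```python
def tau_nonoverlap(baseline, treatment):
    """Basic A-vs-B nonoverlap: (pairs where the treatment point exceeds the
    baseline point, minus pairs where it falls below) over all cross-phase
    pairs. This is the uncorrected core of Tau-U."""
    pos = sum(t > b for b in baseline for t in treatment)
    neg = sum(t < b for b in baseline for t in treatment)
    return (pos - neg) / (len(baseline) * len(treatment))

# Hypothetical single-case data: baseline phase (A) then treatment phase (B).
baseline = [2.0, 3.0, 2.5]
treatment = [5.0, 6.0, 7.0, 6.5]
tau_ab = tau_nonoverlap(baseline, treatment)
```

A value of 1.0 means every treatment observation exceeds every baseline observation, which is also the situation where visual analysis most clearly shows a level change; disagreements between the methods arise in messier, partially overlapping data.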
USDA-ARS?s Scientific Manuscript database
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
ERIC Educational Resources Information Center
Steyvers, Mark; Tenenbaum, Joshua B.
2005-01-01
We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…
Specialists Meeting on Wing-with-Stores Flutter
1975-04-01
[Figure residue: FIG. 13, AIRCRAFT BENDING MODE FREQUENCIES FOR STORES ON...; pylon component mode (pitch), generalised inertia.] ...of slight importance, whereas the opposite effect is observed for the sections beyond the engine. Calculation and experiment are in good agreement and show that the interaction of the engine with the wing is very local and of sufficiently low intensity to be neglected in the calculations of
2009-09-01
In this report, we analyse the security of the energy sector and its infrastructure in Yemen and examine the repercussions... conflicts. However, the fundamental problem of Yemen's energy sector is the depletion of oil reserves, leading to a rapid reduction in... oil production levels. The decline of the country's energy sector will have a very negligible effect on energy reserves
1991-12-01
...and a refrigeration system and heat exchanger that cools the air to temperatures as low as... A schematic of the facility... and of a large, free-jet wind tunnel... rotor testing would involve the use of simulated ice applied to the airfoil... turers, but correlations for heat and mass transfer over wet surfaces are not... A transient heat conduction analysis applied around the rotor azimuth... and also has the ability to introduce a correction for viscous effects at incidence
2001-03-01
...possibilities to treat many groups in a repeated analysis, the parametric methods were preferred. Student's t-test has been used to identify significant... activation of the adrenal cortical glands and could therefore use the central POMC-serotonin pathway. This effect is observed in moderately trained... their unusual hours of arrival and departure deserves more study. It would be useful to know what selections they have and what they are making
Validation of an Acoustic Head Simulator for the Evaluation of Personal Hearing Protection Devices
2004-11-01
...and covered with artificial skin. The cavities on each side allow the insertion of ear modules that reproduce the mechanisms of the... to the published specifications. These differences did not affect the insertion loss. After correction to account for the effects of the... an aluminium-loaded epoxy head simulator covered with artificial skin. The head is supported by a flexible neck module attached to a
1991-01-01
...diverse and shifting realities under the combined effect of technical and economic developments. The result is a... role is growing. While such a movement can be observed in a number of countries, this does not mean that it is common to them. Indeed, the confrontation of national, and even regional, conceptions is already under way owing to the internationalization of technology-transfer activities and of
1993-11-01
In this section, we recall definitions of dual linear incoherent radar measurables, rainfall rate and the specific attenuation (γ) due to... reflectivity data. Two different path lengths (dL), 10 and 20 km, from C-band dual linear polarization radar measurements have been considered... "model for simulation of dual linear polarization radar measurement fields", to be published in IEE. 7. REFERENCES 1. Leitao, M. J. and P. A. Watson
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties characterized, using a commercially available statistical package, are the ultimate strength, modulus, and Poisson's ratio. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.
Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S
2016-01-01
Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are in small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined if rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
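The fixed-effect pooling model mentioned in the recommendations is inverse-variance weighting; a random-effects model additionally adds a between-study variance component to each study's variance before weighting. A minimal fixed-effect sketch with invented effect sizes:

```python
def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate.
    Returns the pooled effect and its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Hypothetical per-study effect sizes and their sampling variances.
effects = [0.30, 0.10, 0.20]
variances = [0.01, 0.04, 0.02]
pooled, pooled_var = fixed_effect_pool(effects, variances)
```

Precise studies (small variance) dominate the pooled estimate, and the pooled variance is smaller than that of any single study, which is the statistical payoff of pooling.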
Youngstrom, Eric A
2014-03-01
Objective: To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method: Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results: Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. Conclusions: This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
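The diagnostic efficiency statistics above can be sketched in a few lines. This is a generic illustration of the AUC (Mann-Whitney formulation) and a range-based diagnostic likelihood ratio, not the paper's SPSS/R scripts; all names are ours:

```python
def auc(scores_pos, scores_neg):
    # Mann-Whitney formulation of the area under the ROC curve:
    # the probability a random case outscores a random non-case,
    # with ties counted as half.
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def diagnostic_likelihood_ratio(scores_pos, scores_neg, lo, hi):
    # P(score in [lo, hi] | case) / P(score in [lo, hi] | non-case).
    p_case = sum(lo <= s <= hi for s in scores_pos) / len(scores_pos)
    p_non = sum(lo <= s <= hi for s in scores_neg) / len(scores_neg)
    return p_case / p_non
```

A diagnostic likelihood ratio above 1 for a score range shifts the post-test odds of disorder upward; the 7.4 reported above for scores >30 is a substantial shift.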
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
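The product-of-coefficients logic behind a basic cross-sectional mediation analysis can be sketched as follows. This is a generic illustration, not the authors' procedure, and it estimates exactly the kind of atemporal association the abstract cautions about:

```python
import numpy as np

def indirect_effect(x, m, y):
    # Product-of-coefficients estimate of the indirect (mediated) effect:
    # a = slope of M on X; b = slope of Y on M, adjusting for X.
    x = np.asarray(x, dtype=float)
    m = np.asarray(m, dtype=float)
    y = np.asarray(y, dtype=float)
    Xa = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(Xa, m, rcond=None)[0][1]
    Xb = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(Xb, y, rcond=None)[0][2]
    return a * b
```

A nonzero a*b from cross-sectional data shows only that the statistical decomposition holds, not that M temporally transmits the effect of X on Y.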
This tool allows users to animate cancer trends over time by cancer site, cause of death, race, and sex. It provides access to incidence, mortality, and survival statistics. Select the type of statistic, variables, and format, and then extract the statistics in a delimited format for further analyses.
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model as well as general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are added, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994
Strategies Facilitating Pre-Certification Testing for Radiation Robustness
NASA Astrophysics Data System (ADS)
Souari, Anis
The effects of cosmic radiation on embedded electronics have concerned researchers interested in the robustness of integrated circuits for several decades. Considerable research has been conducted in this direction, mainly for space applications, where the deployment environment is hostile. These environments are dense in particles which, when they interact with integrated circuits, can lead to malfunction or even destruction. Moreover, radiation effects are amplified in new generations of integrated circuits, where shrinking transistor sizes and growing circuit complexity increase the probability of faults and, consequently, the need for testing. The expansion of commercial off-the-shelf (COTS) electronics and the adoption of these components for critical applications, such as avionics and space applications, also push researchers to redouble their efforts to verify the reliability of these circuits. COTS components, despite their better characteristics compared with radiation-hardened circuits, which are expensive and lag in terms of the technology used, are vulnerable to radiation. To improve the reliability of these circuits, an evaluation of their vulnerability at the different abstraction levels of the design flow is recommended. This helps designers take the necessary mitigation measures on the design at the abstraction level in question. Finally, to satisfy fault-tolerance requirements, very costly certification tests, performed by bombardment with particles (protons, neutrons, etc.), are necessary. In this thesis, we focus mainly on defining a pre-certification strategy
allowing a realistic evaluation of the sensitivity of integrated circuits to radiation effects, in order to avoid sending non-robust circuits to the very costly certification phase. The circuits targeted by our work are SRAM-based field-programmable gate arrays (FPGAs), and the targeted radiation-induced fault type is the single event upset (SEU), which flips the logic state of a memory element to its complement. SRAM-based FPGAs are increasingly in demand in the aerospace community thanks to their rapid prototyping and in-field reconfiguration capabilities, but they are vulnerable to radiation, and SEUs are the most frequent faults in SRAM-type memory elements. We propose a new emulation-based fault-injection approach that mimics radiation effects on the FPGA configuration memory and generates results as faithful as possible to certification test results. This approach incorporates, into the test-sequence generation procedure, the difference in sensitivity of configuration memory elements in state '1' versus state '0', observed under accelerated proton-beam tests at the renowned TRIUMF laboratory, in order to mimic the fault distribution in the configuration memory. Validation experiments show that the proposed strategy is effective and generates realistic results. These results reveal that not considering the sensitivity difference can lead to an underestimation of circuit sensitivity to radiation. With the same aim of optimizing the emulation-based fault-injection procedure, namely pre-certification testing, we propose a methodology to maximize
the detection of critical bits (bits that cause a functional failure if they change state) for a given number of SEUs (the adopted fault model), or to maximize the precision of the estimate of the number of critical bits. To do so, the configuration bits are first classified into different sets according to their content, the resources they configure, and their criticality. The sensitivity of each set is then evaluated. Finally, fault injection is prioritized in the most sensitive sets. Several fault-injection optimization scenarios are proposed, and the results are compared with those given by the conventional random fault-injection method. The proposed optimization methodology yields an improvement of more than two orders of magnitude. A final approach is presented that facilitates evaluating the sensitivity of the bits configuring the FPGA look-up tables (LUTs) used by a design, the smallest configurable entities of the FPGA, which implement combinational functions. It allows easy identification of the LUT bits with no cost in terms of hardware usage or external tools. The proposed approach is simple and effective, offers 100% fault coverage, and is applicable to the new generations of Xilinx FPGAs. The proposed approaches help meet the requirements and objectives of this thesis. The realism and the maximized estimation of the vulnerability of circuits under test offered by the new approaches enable the development of an effective pre-certification test strategy. Indeed, the first fault-injection approach, which considers the relative sensitivity difference of memory elements according to their content, generates results with a relative error of up to
3.1% when compared with the TRIUMF results, whereas the relative error of a conventional random fault injection compared with TRIUMF can reach 75%. Moreover, applying this approach to more conventional circuits shows that 2.3 times more errors are detected compared with random fault injection. This suggests that not considering the relative sensitivity difference in the emulation procedure can lead to an underestimation of the design's sensitivity to radiation. The results of the second proposed approach were also compared with those of random fault injection. The proposed approach, which maximizes the number of flipped critical bits, achieves a speed-up factor of 108 in the fault-injection procedure compared with the random approach. It also minimizes the estimation error of the number of critical bits, reaching ±1.1% for a 95% confidence interval, whereas the critical-bit estimation error generated by the random fault-injection approach for the same confidence interval can reach ±8.6%. Finally, the last proposed approach, fault injection in the LUTs, stands out from other approaches available in the literature by its simplicity while ensuring maximum fault coverage of 100%. Indeed, the proposed approach is independent of external tools for identifying the bits configuring the LUTs, which are obsolete or do not support the new FPGA generations. It acts directly on the files generated by the adopted synthesis tool.
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the need to adopt innovative statistical methods has emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and jackknife methods, the implementation of appropriate multiple-reader multiple-case study designs, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
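The bootstrap methods mentioned above can be illustrated with a minimal percentile-bootstrap confidence interval for a diagnostic accuracy statistic. This is a generic sketch, not the FDA's implementation; names are ours:

```python
import random

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, seed=0):
    # Percentile bootstrap confidence interval for an arbitrary statistic:
    # resample with replacement, recompute, and take empirical quantiles.
    rng = random.Random(seed)
    stats = sorted(
        statistic([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

For example, with 90 true positives out of 100 diseased cases, bootstrapping the mean of the 0/1 outcomes gives an interval for sensitivity around the point estimate of 0.9.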
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith
2017-01-01
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, together with advances in computing, geographical information systems, and statistical methodology, has opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
Measurement of the Carbon Isotopic Composition of Methane Using Helicoidal Laser Eigenstates
NASA Astrophysics Data System (ADS)
Jacob, D.; Le Floch, A.; Bretenaker, F.; Guenot, P.
1996-06-01
The spatially generalized Jones matrix formalism is used to design a laser cavity for intracavity measurements of the carbon isotopic composition of methane. The method is based on a double optical lever effect for helicoidally polarized eigenstates, permitting successive measurement of the ^{12}CH_4 and ^{13}CH_4 concentrations. To select the probed isotope, one simply tunes the frequency of the laser by the Zeeman effect. The experiment shows good agreement with the predictions and permits measurement of the ^{13}CH_4/^{12}CH_4 composition ratio of methane with an uncertainty of the order of ±0.07% for a sample containing only 6×10^{-9} mole of methane.
NASA Astrophysics Data System (ADS)
Aguir, K.; Fennouh, A.; Carchano, H.; Lollman, D.
1995-10-01
Heterojunctions were fabricated by depositing amorphous GaAs and GaAsN thin films on crystalline GaAs substrates by reactive RF sputtering. I(V) and C(V) measurements were performed to determine the electrical properties of these structures. The a-GaAs/c-GaAs(n) heterojunctions exhibit p-n-junction-like behaviour. The a-GaAsN/c-GaAs(n) heterojunctions behave like a MIS structure with some imperfections: the rectifying effect gives way to a symmetric characteristic with strongly attenuated current. A fixed positive charge in the a-GaAsN was detected, and a mid-gap interface-state density of about 10^{11} eV^{-1}cm^{-2} was evaluated.
NASA Astrophysics Data System (ADS)
Durand, S.; Tellier, C. R.
1996-02-01
This paper constitutes the first part of a work devoted to applications of piezoresistance effects in germanium and silicon semiconductors. In this part, emphasis is placed on a formal explanation of non-linear effects. We propose a brief phenomenological description based on the multi-valley model of semiconductors before adopting a macroscopic tensorial model from which general analytical expressions for primed non-linear piezoresistance coefficients are derived. Graphical representations of the linear and non-linear piezoresistance coefficients allow us to characterize the influence of the two angles of cut and of the directions of alignment. The second part will deal primarily with specific applications to piezoresistive sensors.
Black, Amanda; Guilbert, Edith; Costescu, Dustin; Dunn, Sheila; Fisher, William; Kives, Sari; Mirosh, Melissa; Norman, Wendy V; Pymar, Helen; Reid, Robert; Roy, Geneviève; Varto, Hannah; Waddington, Ashley; Wagner, Marie-Soleil; Whelan, Anne Marie
2017-04-01
To provide health-care providers with guidelines on the use of contraceptive methods to prevent pregnancy and on the promotion of healthy sexuality. Outcomes considered: overall efficacy of the cited contraceptive methods, including evaluation of safety, adverse effects, and reduction in pregnancy rates; the effect of the cited contraceptive methods on sexual health and general well-being; and the availability of the cited contraceptive methods in Canada. RESULTS: MEDLINE and the Cochrane Database were searched for English-language articles published between January 1994 and December 2015 on topics related to contraception, sexuality, and sexual health, with the aim of updating the Canadian contraception consensus published from February to April 2004. Relevant Canadian government publications and position statements from competent health and family-planning organizations were also reviewed. The quality of evidence was assessed using the criteria described by the Canadian Task Force on Preventive Health Care. Practice recommendations are ranked according to the method described in the Task Force report. SUMMARY STATEMENTS: RECOMMENDATIONS. Copyright © 2017. Published by Elsevier Inc.
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
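One simple sensitivity analysis of the kind advocated above is leave-one-out re-pooling, which flags influential (and possibly nonindependent) studies. A minimal fixed-effect sketch with illustrative names:

```python
def pooled(effects, variances):
    # Inverse-variance (fixed-effect) pooled estimate.
    w = [1.0 / v for v in variances]
    return sum(wi * e for wi, e in zip(w, effects)) / sum(w)

def leave_one_out(effects, variances):
    # Recompute the pooled estimate omitting each study in turn;
    # large shifts flag influential studies worth inspecting for
    # shared populations, shared controls, or other dependence.
    out = []
    for i in range(len(effects)):
        e = effects[:i] + effects[i + 1:]
        v = variances[:i] + variances[i + 1:]
        out.append(pooled(e, v))
    return out
```

If dropping one study moves the pooled estimate far more than dropping any other, the conclusions rest heavily on that study and its independence assumptions deserve scrutiny.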
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
Harris, Michael; Radtke, Arthur S.
1976-01-01
Linear regression and discriminant analyses techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments employed were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.
ERIC Educational Resources Information Center
Neumann, David L.; Hood, Michelle
2009-01-01
A wiki was used as part of a blended learning approach to promote collaborative learning among students in a first year university statistics class. One group of students analysed a data set and communicated the results by jointly writing a practice report using a wiki. A second group analysed the same data but communicated the results in a…
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
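The Monte Carlo test for extreme homogeneity described above can be sketched as follows, assuming normally distributed study effects under a common true effect. This is a simplified illustration of the approach (on a mean-difference scale rather than risk ratios), with names of our choosing:

```python
import math
import random

def cochran_q(effects, variances):
    # Cochran's Q heterogeneity statistic around the fixed-effect pooled mean.
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))

def left_sided_p(effects, variances, n_sim=2000, seed=0):
    # Monte Carlo left-sided p-value: the probability of a Q at least as
    # SMALL as observed when all studies share a single true effect
    # (Q is invariant to the value of that common effect).
    rng = random.Random(seed)
    q_obs = cochran_q(effects, variances)
    count = 0
    for _ in range(n_sim):
        sim = [rng.gauss(0.0, math.sqrt(v)) for v in variances]
        if cochran_q(sim, variances) <= q_obs:
            count += 1
    return count / n_sim
```

A tiny left-sided p-value means the studies agree more closely than sampling error alone should allow, which is the signal the authors probe for correlated data, metric artifacts, or fraud.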
Cavalcante, Y L; Hauser-Davis, R A; Saraiva, A C F; Brandão, I L S; Oliveira, T F; Silveira, A M
2013-01-01
This paper compared and evaluated seasonal variations in physico-chemical parameters and metals at a hydroelectric power station reservoir by applying Multivariate Analyses and Artificial Neural Networks (ANN) statistical techniques. A Factor Analysis was used to reduce the number of variables: the first factor was composed of elements Ca, K, Mg and Na, and the second by Chemical Oxygen Demand. The ANN showed 100% correct classifications in training and validation samples. Physico-chemical analyses showed that water pH values were not statistically different between the dry and rainy seasons, while temperature, conductivity, alkalinity, ammonia and DO were higher in the dry period. TSS, hardness and COD, on the other hand, were higher during the rainy season. The statistical analyses showed that Ca, K, Mg and Na are directly connected to the Chemical Oxygen Demand, which indicates a possibility of their input into the reservoir system by domestic sewage and agricultural run-offs. These statistical applications, thus, are also relevant in cases of environmental management and policy decision-making processes, to identify which factors should be further studied and/or modified to recover degraded or contaminated water bodies. Copyright © 2012 Elsevier B.V. All rights reserved.
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932 .
Conceptual and statistical problems associated with the use of diversity indices in ecology.
Barrantes, Gilbert; Sandoval, Luis
2009-09-01
Diversity indices, particularly the Shannon-Wiener index, have been used extensively in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all the information needed to answer even a simple question. However, multivariate analyses, such as cluster analyses or multiple regressions, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on the environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or changes in the abundance of one or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved robust for standardizing all samples to a common size. Even the simplest approach, reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
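The two quantities contrasted above, the Shannon-Wiener index and the rarefied species richness at a common sample size (Hurlbert's formula), can be computed directly; the community counts below are illustrative:

```python
from math import comb, log

def shannon_index(counts):
    """Shannon-Wiener diversity H' = -sum p_i * ln(p_i)."""
    total = sum(counts)
    return -sum((c / total) * log(c / total) for c in counts if c > 0)

def rarefied_richness(counts, n):
    """Expected number of species in a random subsample of n individuals
    (Hurlbert's rarefaction formula)."""
    total = sum(counts)
    return sum(1 - comb(total - c, n) / comb(total, n) for c in counts)

community = [50, 30, 15, 4, 1]          # individuals per species
h = shannon_index(community)
s_20 = rarefied_richness(community, 20)  # expected richness at n = 20
```

Rarefaction lets two communities sampled with different effort be compared at the same subsample size, which is the standardization property the abstract highlights.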
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.
1984-01-01
An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
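Of the tests listed, the chi-square statistic for a contingency table is simple to sketch from scratch; the vegetation-class-by-rock-type counts below are invented for illustration and are not the study's data:

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for a contingency table
    (table is a list of rows of observed counts)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / grand
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: vegetation class (rows) vs. rock type (columns)
table = [[30, 10],
         [10, 30]]
stat = chi_square_stat(table)   # compare against a chi-square distribution
```

A large statistic relative to the chi-square distribution with (rows-1)(cols-1) degrees of freedom indicates that vegetation class and rock type are not independent.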
[Clinical research = design * measurements * statistical analyses].
Furukawa, Toshiaki
2012-06-01
A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have a knowledge of study design, measurements and statistical analyses: The first is taught by epidemiology, the second by psychometrics and the third by biostatistics.
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
Nimptsch, Ulrike; Wengler, Annelene; Mansky, Thomas
2016-11-01
In Germany, nationwide hospital discharge data (DRG statistics provided by the research data centers of the Federal Statistical Office and the Statistical Offices of the 'Länder') are increasingly used as a data source for health services research. Within these data, hospitals can be separated via their hospital identifier ([Institutionskennzeichen] IK). However, this hospital identifier primarily designates the invoicing unit and is not necessarily equivalent to one hospital location. Aiming to investigate the direction and extent of possible bias in hospital-level analyses, this study examines the continuity of the hospital identifier in a cross-sectional and a longitudinal approach and compares the results to official hospital census statistics. Within the DRG statistics from 2005 to 2013, the annual number of hospitals as classified by hospital identifiers was counted for each year of observation. The annual number of hospitals derived from DRG statistics was compared to the number of hospitals in the official census statistics 'Grunddaten der Krankenhäuser'. Subsequently, the temporal continuity of hospital identifiers in the DRG statistics was analyzed within cohorts of hospitals. Until 2013, the annual number of hospital identifiers in the DRG statistics fell by 175 (from 1,725 to 1,550). This decline affected only providers with small or medium case volume. The number of hospitals identified in the DRG statistics was lower than the number given in the census statistics (e.g., in 2013, 1,550 IK vs. 1,668 hospitals in the census statistics). The longitudinal analyses revealed that the majority of hospital identifiers persisted over the years of observation, while one fifth of hospital identifiers changed. In cross-sectional studies of German hospital discharge data, separating hospitals via the hospital identifier might lead to underestimating the number of hospitals and a consequent overestimation of caseload per hospital.
Discontinuities of hospital identifiers over time might impair the follow-up of hospital cohorts. These limitations must be taken into account in analyses of German hospital discharge data focusing on the hospital level. Copyright © 2016. Published by Elsevier GmbH.
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses; SPSS was predominantly used. We found that the use of statistical methods in APS has increased over the last 6 years, and it appears that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
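The Mann-Whitney U test, the second most frequently used method in the reviewed articles, reduces at its core to counting favorable pairs between the two samples; a minimal sketch (the significance test against the U distribution is omitted):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y:
    the number of pairs (x_i, y_j) with x_i > y_j, ties counted 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

u = mann_whitney_u([4, 5, 6], [1, 2, 3])   # every x exceeds every y
```

In practice the statistic is compared to its null distribution (or a normal approximation for larger samples) to obtain the p-value.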
2007-04-01
Chapter 1 – Introduction: 1.1 Background and Problem Definition (1.1.1 Background; 1.1.2 Problem Definition); 1.2 The Objective and Approach of the HFM-090/TG-25 (1.2.1 Objective; 1.2.2 Approach); 1.3 Organization of this Report; 1.4 References. Chapter 2 – The Mine Detonation Process and Occupant Loading: 2.1 Introduction to Mines; 2.2 [...]
1991-06-01
resolution are essential. The resulting frequency pattern would be nonuniform and would change [...] given by the empirical relation [...] of the drag as well as the motion of neutral particles in the upper atmosphere [...]. Paul, A. K., Anharmonic Frequency Analysis, Math. Comp. [...] 1515, 1973b. Bahar, E., Depolarization in nonuniform multi-layered structures--Full wave solutions, J. Math. Phys., 15(2), 202-208, 1974 [...]
Stability and Control of Tactical Missile Systems Held in Ankara (Turkey) on 9-12 May 1988
1989-03-01
The basic body for the systematic tests consisted of a cylindrical body with an ogive nose 3.0 calibers long, for a total body length of 12.5 calibers. [...] Lift and stability of a fuselage (ogive + cylinder) at M = 2.8: comparison of an "Euler" computation with [...] not always representative of reality. Given the accuracy sought, the effects of the mounting, of a propulsive jet, of the Reynolds number, or of [...]
1986-06-01
[...] applied to "anomalous" propagation, by D. Dion, Defence Research Establishment Valcartier (Centre de recherches pour la défense Valcartier) [...] how they are related to atmospheric conditions. The most important phenomena to note are ducts and "radio holes". As these are very frequent at sea, it is of interest to the navy to find simple methods for characterizing them. Equations of [...]
[Adverse effects of oxcarbazepine].
Fang, Shu; Gong, Zhi-Cheng
2015-04-01
Oxcarbazepine is a new antiepileptic drug. The results of clinical trials suggest that oxcarbazepine is well tolerated and has fewer drug interactions. It is being used more and more widely in clinical practice, but its adverse effects should not be ignored. The most common adverse effects of oxcarbazepine involve the central nervous system and digestive system, including fatigue, drowsiness, diplopia, dizziness, nausea and vomiting. The common skin adverse reaction is rash. Long-term use of oxcarbazepine may also cause hyponatremia. This article reviews the literature from China and overseas on the adverse effects of oxcarbazepine over the last 10 years, in order to inform rational clinical use of oxcarbazepine.
2005-08-01
[...] load and to understand the effects of the design characteristics of load-carriage systems on human health and mobility [...] the upper and lower torso and the total contact force. Work is under way to integrate a skin layer, which [...] would allow the interaction at the equipment-skin interface to be examined in detail. The aim is to assess the risks of [...]
2008-09-01
at various temperatures: a) pure HTPB; b) HTPB-DOA (polymer and plasticizer); c) pure GAP; d) pure Gpl; e) GAP-Gpl. List of tables: Table 1, composition of the amorphous cells constructed; Table 2, properties of the polymers and plasticizers used; Table 3, comparisons between the Tg values obtained experimentally, those published in the scientific literature, and those predicted from the Tg of the pure compounds; Table 4, comparison [...]
Towards the organic laser diode?
NASA Astrophysics Data System (ADS)
Horowitz, G.
1998-06-01
The recent report of a laser effect in polyparaphenylenevinylene (PPV) has attracted great interest. Here we review work in this field. Stimulated emission (SE) in conjugated polymers manifests itself as a spectral narrowing and a shortening of the emission lifetime. By placing the emitting medium in a microcavity, an increase in the directionality of the emission has also been reported. We also give results on SE in sexithiophene (6T) single crystals. An analysis of the emitting levels shows that we are dealing with a four-level system, which could account for the low excitation threshold.
Application of the weakest-link concept to a multiaxial endurance criterion
NASA Astrophysics Data System (ADS)
Flacelière, L.; Morel, F.; Palin-Luc, T.
2002-12-01
In high-cycle fatigue, it is now accepted that the stress distribution, as well as component size, is responsible for variations in the fatigue limit. Under uniaxial or multiaxial loading, it can be shown that a statistical "weakest-link" approach, combined with a multiaxial endurance criterion based on a microplasticity analysis, predicts the fatigue limit of several metallic materials. Four types of loading are analyzed (tension-compression, torsion, rotating bending and plane bending) and compared with experimental results for a cast iron and two high-strength steels. The proposed statistical approach captures several aspects: the scatter of the data for all loading types, the gradient effect, and the influence of material defects. Finally, the model also accounts for the decrease of the fatigue limit as the stressed volume increases. The predicted failure probabilities are reasonable, even though only fatigue limits corresponding to a 50% failure probability were used to identify the model parameters.
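The weakest-link idea invoked above can be sketched with a two-parameter Weibull form for a uniformly stressed volume; sigma0, the Weibull modulus m and the reference volume v0 below are illustrative values, not parameters identified in the paper:

```python
from math import exp

def weakest_link_pf(stress, volume, sigma0=400.0, m=20.0, v0=1.0):
    """Weibull weakest-link failure probability for a uniformly stressed
    volume: P_f = 1 - exp(-(V/V0) * (sigma/sigma0)**m).
    sigma0 (MPa), m and v0 are illustrative material parameters."""
    return 1.0 - exp(-(volume / v0) * (stress / sigma0) ** m)

# Size effect: at the same stress amplitude, a larger stressed volume
# contains more potential weak links and thus fails more often
pf_small = weakest_link_pf(380.0, 1.0)
pf_large = weakest_link_pf(380.0, 10.0)
```

Because the stressed volume multiplies the exponent, the stress giving a fixed failure probability decreases as the volume grows, which is exactly the volume effect on the fatigue limit the abstract describes.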
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) the traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses attributed predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. The traditional approach found statistically significant sex differences for the first GRF peak but similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform carry different signal strengths, so the traditional approach can lead to arbitrary conclusions depending on which metrics are chosen. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
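The continuum idea behind Statistical Parametric Mapping can be caricatured as a two-sample t statistic computed at every point of the stance-phase waveform. The waveforms below are simulated with a sex difference confined to early stance; real SPM additionally controls the family-wise error rate over the continuum via random field theory:

```python
import numpy as np

def pointwise_t(wave_a, wave_b):
    """Two-sample t statistic at every time point of a waveform
    (simplified, uncorrected version of the SPM idea)."""
    mean_a, mean_b = wave_a.mean(axis=0), wave_b.mean(axis=0)
    var_a = wave_a.var(axis=0, ddof=1)
    var_b = wave_b.var(axis=0, ddof=1)
    n_a, n_b = len(wave_a), len(wave_b)
    se = np.sqrt(var_a / n_a + var_b / n_b)
    return (mean_a - mean_b) / se

rng = np.random.default_rng(2)
t_axis = np.linspace(0, 1, 101)          # normalized stance phase
# Simulated difference only in early stance (first 20% of stance)
women = rng.normal(size=(20, 101)) + np.where(t_axis < 0.2, 1.5, 0.0)
men = rng.normal(size=(20, 101))
t_stats = pointwise_t(women, men)        # large only in early stance
```

A discrete-peak comparison taken outside the early-stance window would miss this difference entirely, which is the pitfall of the traditional approach the abstract highlights.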
ERIC Educational Resources Information Center
Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.
2005-01-01
This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
The Empirical Nature and Statistical Treatment of Missing Data
ERIC Educational Resources Information Center
Tannenbaum, Christyn E.
2009-01-01
Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
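The one-parameter Rasch model used in the analyses above places each item's difficulty and each student's ability on the same logit scale; a minimal sketch of the response probability:

```python
from math import exp

def rasch_prob(theta, difficulty):
    """One-parameter Rasch model: probability that a respondent with
    ability theta answers an item of the given difficulty correctly,
    P = 1 / (1 + exp(-(theta - difficulty)))."""
    return 1.0 / (1.0 + exp(-(theta - difficulty)))

# Ability equal to item difficulty gives a 50% chance of success
p_easy = rasch_prob(theta=1.0, difficulty=-1.0)   # easy item
p_hard = rasch_prob(theta=1.0, difficulty=2.0)    # hard item
```

Fitting the model estimates both parameter sets jointly from the response matrix, which is how an inventory like SRBCI can contain items that "vary widely in difficulty" yet still measure a single underlying ability.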
Evaluation and application of summary statistic imputation to discover new height-associated loci.
Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán
2018-05-01
As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome, as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and its practical utility have not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01 and 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. 
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
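The core conditional-expectation step of summary statistics imputation can be sketched as z_hat = c' C^{-1} z, where C is the LD (correlation) matrix among typed variants, c the LD between the untyped variant and the typed ones, and z the typed variants' z-scores. The LD values and z-scores below are toy numbers; the published method additionally regularizes C and accounts for variable per-SNV sample size:

```python
import numpy as np

def impute_z(z_typed, ld_tt, ld_ut):
    """Impute the z-score of an untyped variant from typed variants:
    z_hat = ld_ut @ inv(ld_tt) @ z_typed (minimal sketch of the
    conditional-expectation step of summary statistics imputation)."""
    return ld_ut @ np.linalg.solve(ld_tt, z_typed)

# Toy example: the untyped SNP is in perfect LD (r = 1) with typed SNP 1
C = np.array([[1.0, 0.2],
              [0.2, 1.0]])          # LD among the two typed SNPs
c = np.array([1.0, 0.2])           # LD of the untyped SNP with each
z = np.array([4.0, 1.0])           # observed z-scores of typed SNPs
z_hat = impute_z(z, C, c)          # recovers z = 4.0 of its proxy
```

When the untyped variant's LD profile matches that of a typed variant exactly, the imputed z-score reproduces that variant's observed z-score, as the toy example shows.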
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the programming skills needed to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous, however. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP) or trait, such as SNP characterization statistics or association test statistics; the input data of this group are the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example the summary statistics of genotype quality for each sample; the input data of this group are the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; the input data of this group are pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation; the input data of this group are pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. ParallABEL is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. 
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
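The "first group" of computations, one independent statistic per SNP, is embarrassingly parallel. The sketch below farms per-SNP jobs out to a worker pool in Python; a thread pool stands in for the Rmpi-based process distribution ParallABEL actually uses, and the genotype data are simulated:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def snp_statistic(genotypes, trait):
    """Per-SNP statistic (here, correlation with the trait): one
    independent job per SNP, the 'first group' of GWA computations."""
    return float(np.corrcoef(genotypes, trait)[0, 1])

def parallel_scan(geno_matrix, trait, workers=4):
    """Distribute the per-SNP jobs over a worker pool and collect the
    results in SNP order (a plain-Python stand-in for Rmpi workers)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(snp_statistic, geno_matrix[:, j], trait)
                   for j in range(geno_matrix.shape[1])]
        return [f.result() for f in futures]

rng = np.random.default_rng(3)
geno = rng.integers(0, 3, size=(500, 50)).astype(float)  # 500 samples, 50 SNPs
trait = geno[:, 0] + rng.normal(size=500)                # trait driven by SNP 0
stats = parallel_scan(geno, trait)
```

Because each job touches only its own SNP column, the partition-distribute-merge pattern is exactly the kind of bookkeeping ParallABEL hides from the user.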
Takemura, Kosuke; Yuki, Masaki
2007-02-01
The interindividual-intergroup discontinuity effect is the tendency for relationships between groups to be more competitive than the relationships between individuals. It has been observed robustly in studies conducted in the United States, which is a society characterized as "individualistic." In this study, it was explored whether the effect was replicable in a "collectivistic" society such as Japan. From the traditional view in cross-cultural psychology, which emphasizes the collectivistic nature of East Asian peoples, it was expected that the discontinuity effect would be greater in Japan than in the United States. On the other hand, based on recent empirical findings suggesting that North Americans are no less group-oriented than East Asians, it was expected that the discontinuity effect would be no greater in Japan than in the United States. One hundred and sixty Japanese university students played a 10-trial repeated prisoner's dilemma game: 26 sessions of interindividual and 18 sessions of intergroup. Following exactly the procedure of prior experiments in the US, individuals and groups were allowed face-to-face communication with their opponents before making their decisions, and participants in the intergroup condition were further allowed to converse freely with their in-group members. Results replicated previous findings in the United States; groups made more competitive choices than did individuals. In addition, neither the magnitude of the discontinuity effect, nor the frequency of competitive choices made by the groups, were larger in Japan than they were in the majority of prior studies conducted in the United States. These findings suggest cross-cultural robustness of the interindividual-intergroup discontinuity effect. Also, interestingly, they contradict the simple distinction between individualism and collectivism. Implications for studies of culture and group processes are discussed. 
This research was supported by grants from the Center for the Study of Cultural and Ecological Foundations of the Mind, a 21st Century Center of Excellence Program at Hokkaido University. The authors would like to thank Dr. Laura Hernández-Guzmán, three anonymous reviewers, and Robin Cooper, Mark H. B. Radford, and Paul A. Wehr for their helpful comments on earlier versions of this article. They would also like to thank Dr. Chester A. Insko for his kind and valuable advice during the planning of this experiment as well as the interpretation of its results, Kaori Akaishi for her help with data collection, and, finally, colleagues at Hokkaido University who helped to recruit potential participants from their classes.
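The game structure used in such experiments can be sketched in a few lines. The payoff values and the example strategies below are illustrative assumptions, not those used in the study, which involved free-form decisions by human participants.

```python
# Minimal sketch of a 10-trial repeated prisoner's dilemma.
# Payoffs are the textbook values (illustrative, not the study's).
PAYOFFS = {  # (my move, opponent move) -> my payoff; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, trials=10):
    """Run a repeated game; each strategy sees only the opponent's last move."""
    score_a = score_b = 0
    last_a = last_b = None
    for _ in range(trials):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda opp_last: "C" if opp_last in (None, "C") else "D"
always_defect = lambda opp_last: "D"

print(play(tit_for_tat, tit_for_tat))    # (30, 30): mutual cooperation
print(play(always_defect, tit_for_tat))  # (14, 9): defection pays once, then stalls
```

The discontinuity effect amounts to intergroup sessions choosing the defect-like option more often than interindividual ones under the same payoff structure.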
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background: The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective: To describe the data management process and statistical analysis plan. Methods: The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol, and effect on clinical outcomes. Conclusion: According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration: ClinicalTrials.gov number NCT01374022. PMID:28977255
Formalizing the definition of meta-analysis in Molecular Ecology.
ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E
2015-08-01
Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
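The weighted synthesis at the core of a traditional meta-analysis can be sketched as a fixed-effect inverse-variance model. The effect sizes and variances below are invented placeholders (e.g., z-transformed correlations or F_ST analogues), not values from any reviewed study, and a real analysis would also consider random-effects and heterogeneity statistics.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted summary effect, its standard error, and z."""
    weights = [1.0 / v for v in variances]
    summary = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return summary, se, summary / se

# invented per-study effect sizes and sampling variances
summary, se, z = fixed_effect_meta([0.20, 0.35, 0.10], [0.010, 0.020, 0.005])
```

The distinguishing feature of a true meta-analysis, as opposed to a secondary analysis, is exactly this weighting of each study's effect size by the precision with which it was estimated.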
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
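The power of a test that a correlation is zero, one of the cases G*Power covers, can be approximated with Fisher's z transformation. This is the standard textbook approximation, not G*Power's exact routine.

```python
import math
from statistics import NormalDist

def correlation_power(rho, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 via Fisher's z."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    z_effect = math.atanh(rho) * math.sqrt(n - 3)  # SE of z is 1/sqrt(n - 3)
    return nd.cdf(z_effect - z_crit) + nd.cdf(-z_effect - z_crit)
```

For rho = 0.3 and n = 84 at alpha = .05 this gives roughly .80, in line with standard power tables.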
SEER Cancer Query Systems (CanQues)
These applications provide access to cancer statistics including incidence, mortality, survival, prevalence, and probability of developing or dying from cancer. Users can display reports of the statistics or extract them for additional analyses.
Facilitating the Transition from Bright to Dim Environments
2016-03-04
For the parametric data, a multivariate ANOVA was used in determining the systematic presence of any statistically significant performance differences...performed. All significance levels were p < 0.05, and statistical analyses were performed with the Statistical Package for Social Sciences (SPSS...1950. Age changes in rate and level of visual dark adaptation. Journal of Applied Physiology, 2, 407–411. Field, A. 2009. Discovering statistics
Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V
2017-07-21
DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
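A minimal per-locus DML test can be sketched as a one-sided Fisher's exact test on methylated versus unmethylated read counts at a single cytosine. The counts below are invented, and real DML tools add multiple-testing correction and dispersion modeling across replicates that this sketch omits.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    the hypergeometric probability of a table at least as extreme as observed."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k)
               for k in range(a, min(row1, col1) + 1)) / denom

# e.g. 18/20 reads methylated in condition A vs. 5/22 in condition B
p = fisher_one_sided(18, 2, 5, 17)
```

For sparsely covered invertebrate or plant data, exact tests like this are often preferred over normal-approximation tests, which is one of the data-property issues the article discusses.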
Imaging Depression in Adults with ASD
2017-10-01
collected temporally close enough to imaging data in Phase 2 to be confidently incorporated in the planned statistical analyses, and (b) not unduly risk attrition between Phase 1 and 2, we chose to hold...supervision is ongoing (since 9/2014). • Co-I Dr. Lerner's 2nd year Clinical Psychology PhD students have participated in ADOS-2 Introductory Clinical
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
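A rank-correlation trend test of the kind used for such monitoring series can be sketched with Spearman's rho on annual counts. The litter counts below are invented, the implementation ignores ties, and the study's actual evaluation system combines several tests and a cluster-based beach classification.

```python
def spearman_rho(x, y):
    """Spearman rank correlation (no tie handling) as a simple trend measure."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n**2 - 1))

years = [2001, 2002, 2003, 2004, 2005, 2006]
litter = [120, 135, 150, 149, 170, 180]   # illustrative annual item counts
rho = spearman_rho(years, litter)
```

A significant positive rank correlation of counts with time is what the evaluation system reads as a worsening pollution trend for that beach.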
Lithium niobate at high temperature for ultrasonic applications
NASA Astrophysics Data System (ADS)
De Castilla, Hector
The objective of this master's work in applied sciences is to find and then study a piezoelectric material that is potentially usable in high-temperature ultrasonic transducers. These transducers are currently limited to operating temperatures below 300°C because of the piezoelectric element they contain. Overcoming this limitation would enable non-destructive ultrasonic testing at high temperature. With good electromechanical properties and a high Curie temperature (1200°C), lithium niobate (LiNbO3) is a good candidate. However, some studies claim that chemical processes, such as the onset of ionic conductivity or the emergence of a new phase, preclude its use in ultrasonic transducers above 600°C. More recent studies, on the other hand, have shown that it can generate ultrasound up to 1000°C and that no conductivity was visible. A hypothesis therefore emerged: ionic conductivity is present in lithium niobate at high temperature (>500°C), but it only weakly affects the material's properties at high frequencies (>100 kHz). A characterization of lithium niobate at high temperature is therefore needed to verify this hypothesis. For this purpose, the resonance method was employed. It allows most electromechanical coefficients to be characterized with a simple electrochemical impedance spectroscopy and a model that explicitly relates the properties to the impedance spectrum: the task is to find the model coefficients that best superimpose the model on the experimental measurements. An experimental bench was built to control the temperature of the samples and to measure their electrochemical impedance. Unfortunately, the models currently used for the resonance method are imprecise in the presence of coupling between vibration modes. This requires several samples of different shapes in order to isolate each principal vibration mode. Moreover, these models do not properly account for harmonics or shear modes. A new analytical model covering the whole frequency spectrum was therefore developed to predict shear resonances, harmonics, and couplings between modes; nevertheless, some resonance modes and some couplings remain unmodeled. The characterization of square samples was carried out up to 750°C. The results confirm the promise of lithium niobate: the piezoelectric coefficients are stable with temperature, and the elasticity and permittivity behave as expected. A thermoelectric effect with a signature similar to ionic conductivity was observed, which prevents quantifying the impact of the latter. Although complementary studies are needed, the intensity of the resonances at 750°C suggests that lithium niobate can be used for high-frequency ultrasonic applications (>100 kHz).
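The resonance method fits an explicit impedance model to measured spectra. A common minimal model of this kind is the Butterworth-Van Dyke equivalent circuit, a motional R-L-C branch in parallel with a shunt capacitance; the component values below are illustrative assumptions, not measured LiNbO3 parameters, and the thesis's analytical model is considerably richer (shear modes, harmonics, couplings).

```python
import math

def bvd_impedance(f, C0, R1, L1, C1):
    """Impedance of a motional R1-L1-C1 branch in parallel with a shunt C0."""
    w = 2 * math.pi * f
    z_motional = R1 + 1j * w * L1 + 1 / (1j * w * C1)
    z_shunt = 1 / (1j * w * C0)
    return 1 / (1 / z_motional + 1 / z_shunt)

# Illustrative values; series resonance is f_s = 1 / (2*pi*sqrt(L1*C1))
C0, R1, L1, C1 = 5e-12, 10.0, 10e-3, 0.05e-12
f_s = 1 / (2 * math.pi * math.sqrt(L1 * C1))
z_at_resonance = abs(bvd_impedance(f_s, C0, R1, L1, C1))  # ~R1 at f_s
```

Fitting (C0, R1, L1, C1) so that this |Z(f)| curve superimposes on the measured spectrum is the "superposition" step the abstract describes; the fitted values then map back to electromechanical coefficients.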
A new statistical method for design and analyses of component tolerance
NASA Astrophysics Data System (ADS)
Movahedi, Mohammad Mehdi; Khounsiavash, Mohsen; Otadi, Mahmood; Mosleh, Maryam
2017-03-01
Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often assume known distributions, including the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design tolerances. In this paper, we use the generalized lambda distribution for the design and analysis of component tolerances. We use the percentile method (PM) to estimate the distribution parameters. The findings indicated that, when the distribution of the component data is unknown, the proposed method can be used to expedite the design of component tolerances. Moreover, in the case of assembled sets, a more extensive tolerance for each component with the same target performance can be utilized.
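Once the generalized lambda distribution's parameters have been estimated, tolerance limits fall out of its closed-form quantile function. The sketch below assumes the Ramberg-Schmeiser parameterization; the lambda values are the well-known normal-shape approximation, not parameters fitted to any real component data, and the percentile-method estimation step itself is omitted.

```python
def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Quantile function of the generalized lambda distribution
    (Ramberg-Schmeiser parameterization): Q(u) = l1 + (u^l3 - (1-u)^l4)/l2."""
    return lam1 + (u**lam3 - (1 - u)**lam4) / lam2

def tolerance_limits(lams, coverage=0.9973):
    """Two-sided limits covering `coverage` of the fitted GLD (3-sigma-like)."""
    tail = (1 - coverage) / 2
    return gld_quantile(tail, *lams), gld_quantile(1 - tail, *lams)

# illustrative parameters approximating a standard normal shape
lams = (0.0, 0.1975, 0.1349, 0.1349)
lo, hi = tolerance_limits(lams)  # close to the normal +/-3 sigma limits
```

The appeal of the approach is that the same two lines of quantile arithmetic work for skewed or heavy-tailed component data, where normal-theory tolerance limits would be wrong.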
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
2015-08-01
the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types...constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to...Business Research Methods and Statistics using SPSS. P432. 11 IBM SPSS Statistics. (2012) 12 Burns, R.B., Burns, R.A. (2008) 'Business Research
Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.
2015-01-01
Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, error/defects proportion in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: The proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)].
Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in using correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of methodological problems. PMID:25856194
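The two-proportion chi-square comparisons reported in this abstract can be reproduced with a few lines of arithmetic. The sketch below recomputes the first comparison (erroneous analyses, 2003 vs. 2013) from the counts given above; without a continuity correction it yields a value consistent with the reported χ2=0.592.

```python
def chi2_two_proportions(x1, n1, x2, n2):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for comparing two independent proportions."""
    p_pooled = (x1 + x2) / (n1 + n2)
    diff = x1 / n1 - x2 / n2
    return diff**2 / (p_pooled * (1 - p_pooled) * (1 / n1 + 1 / n2))

# erroneous statistical analyses: 80/320 (2003) vs. 111/490 (2013)
chi2 = chi2_two_proportions(80, 320, 111, 490)
```

With 1 degree of freedom, the 5% critical value is 3.84, so a statistic near 0.59 corresponds to the non-significant p=0.4418 reported above.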
Statistical Literacy in the Data Science Workplace
ERIC Educational Resources Information Center
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
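Reporting an effect size with a confidence interval, the practice the article finds underused, takes only a few lines. The summary statistics below are invented, and the interval uses a common large-sample approximation for the standard error of Cohen's d rather than the exact noncentral-t method.

```python
import math

def cohens_d_ci(mean1, mean2, sd1, sd2, n1, n2, z=1.96):
    """Cohen's d (pooled SD) with an approximate large-sample 95% CI."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# invented group summaries: two groups of 40, half-SD mean difference
d, (lo, hi) = cohens_d_ci(10.5, 9.0, 3.0, 3.0, 40, 40)
```

An interval that excludes zero conveys the same decision as the significance test while also showing how precisely the effect is estimated.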
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN-coded computer program and method to predict the reaction control fuel consumption statistics for a three-axis stabilized rocket vehicle upper stage is described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties, and control system characteristics are included. This routine can be applied to many types of on-off reaction controlled vehicles. The pseudorandom number generation and statistical analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.
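The Monte Carlo structure described, a closed-form impulse estimate per sampled disturbance accumulated over many runs, can be sketched as follows. Every sensitivity and dispersion value below is an invented placeholder, not taken from the actual FORTRAN routine.

```python
import random
import statistics

random.seed(1)  # reproducible draws

def fuel_for_one_flight():
    """Closed-form fuel estimate (kg) for one sampled set of disturbances.
    All dispersions and sensitivities are illustrative assumptions."""
    thrust_misalign = random.gauss(0.0, 0.1)    # deg
    static_unbalance = random.gauss(0.0, 0.05)  # m
    aero_torque = random.gauss(0.0, 2.0)        # N*m
    # assumed baseline plus linear sensitivities per unit disturbance
    return (5.0 + 8.0 * abs(thrust_misalign)
                + 20.0 * abs(static_unbalance)
                + 0.4 * abs(aero_torque))

samples = [fuel_for_one_flight() for _ in range(10_000)]
mean = statistics.mean(samples)
p99 = sorted(samples)[9899]  # empirical 99th percentile, for tank sizing
```

Replacing a full attitude-dynamics integration with a closed-form estimate per draw is what makes a 10,000-sample study cheap, which is the efficiency point the abstract makes.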
Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H
2017-04-20
Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. 
Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
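The Getis-Ord Gi* statistic at the heart of the hotspot step can be sketched for a single location on a toy one-dimensional set of tract counts. All numbers below are invented; real analyses (e.g., with PySAL) build full spatial weight matrices and use simulation-based inference, which this sketch omits.

```python
import math

def getis_ord_gi_star(values, weights_row):
    """Getis-Ord Gi* z-score for one location; weights_row includes the
    self-weight, and the mean/SD are taken over all n observations."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar * xbar)
    w_sum = sum(weights_row)
    w_sq = sum(w * w for w in weights_row)
    num = sum(w * v for w, v in zip(weights_row, values)) - xbar * w_sum
    den = s * math.sqrt((n * w_sq - w_sum**2) / (n - 1))
    return num / den

# toy example: tracts 2-4 carry elevated counts; test tract index 2
counts = [2, 3, 40, 45, 38, 4, 1, 2, 3, 2]
w = [0, 1, 1, 1, 0, 0, 0, 0, 0, 0]  # binary neighbourhood around tract 2
z = getis_ord_gi_star(counts, w)
```

A large positive z-score flags a tract whose neighbourhood sum is high relative to the study-area mean, i.e., a candidate hotspot cluster.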
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data is pre-existing and was given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.
Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew
2012-08-08
Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty-three variables that could impact the performance of the motors during the ignition transient and thirty-eight variables that could impact the performance of the motors during steady-state operation were identified and treated as statistical variables for the analyses. The effects of motor-to-motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five-segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
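The Monte Carlo procedure described above can be sketched in miniature. Everything below is illustrative: the toy thrust model, the three sampled variables, and their distributions are hypothetical stand-ins for the legacy internal-ballistics codes and the 71 statistical variables used in the study.

```python
import random
import statistics

random.seed(0)

def motor_thrust(burn_rate, throat_area, grain_temp):
    """Toy thrust model (hypothetical); NOT the legacy ballistics codes."""
    return 3.6e6 * burn_rate * throat_area * (1 + 0.002 * (grain_temp - 21))

def sample_motor():
    # Each statistical variable is drawn from an assumed distribution.
    return motor_thrust(
        burn_rate=random.gauss(1.0, 0.01),
        throat_area=random.gauss(1.0, 0.005),
        grain_temp=random.gauss(21.0, 2.0),
    )

# Monte Carlo over 1000 motor pairs, mirroring the study's sample size.
imbalances = [abs(sample_motor() - sample_motor()) for _ in range(1000)]
envelope = max(imbalances)
print(f"mean imbalance: {statistics.mean(imbalances):,.0f} N")
print(f"envelope (max): {envelope:,.0f} N")
```

The resulting maximum over all sampled pairs plays the role of the statistical envelope that is compared against the design-specification limits.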
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
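The first two detection steps, linear relationships via correlation coefficients and monotonic relationships via rank correlation coefficients, can be illustrated with a small sketch. The data-generating function below is invented for the example and is not from the two-phase fluid flow model.

```python
import math
import random

random.seed(1)

def pearson(x, y):
    # Step (1): detect a linear relationship.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(v):
    # Rank transform (no ties expected for continuous samples).
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r

def spearman(x, y):
    # Step (2): detect a monotonic relationship via rank correlation.
    return pearson(ranks(x), ranks(y))

# Invented scatterplot: strongly monotonic but markedly nonlinear.
x = [random.uniform(0, 1) for _ in range(200)]
y = [math.exp(5 * xi) + random.gauss(0, 0.01) for xi in x]

r = pearson(x, y)
rho = spearman(x, y)
print(f"Pearson r = {r:.3f}, Spearman rho = {rho:.3f}")
```

For this monotonic-but-nonlinear relationship, the rank correlation (step 2) flags the dependence more strongly than the linear correlation (step 1), which is exactly why the procedures escalate through increasingly complex pattern tests.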
NASA Astrophysics Data System (ADS)
Tazlauanu, Mihai
The research work reported in this thesis details a new fabrication technology for high-speed integrated circuits in the broadest sense, including original contributions to device modeling, circuit simulation, integrated circuit design, wafer fabrication, micro-physical and electrical characterization, process flow and final device testing as part of an electrical system. The primary building block of this technology is the heterostructure insulated gate field effect transistor, HIGFET. We used an InP/InGaAs epitaxial heterostructure to ensure a high charge carrier mobility and hence obtain a higher operating frequency than that currently possible for silicon devices. We designed and built integrated circuits with two system architectures. The first architecture integrates the clock signal generator with the sample and hold circuitry on the InP die, while the second is a hybrid architecture of an InP sample and hold assembled with an external clock signal generator made with ECL circuits on GaAs. To generate the clock signals on the same die as the sample and hold circuits, we developed a digital circuit family based on an original inverter, appropriate for depletion mode NMOS technology. We used this circuit to design buffer amplifiers and ring oscillators. Four mask sets, produced in a Cadence environment, permitted the fabrication of test and working devices. Each new mask generation has reflected the previous achievements and has implemented new structures and circuit techniques. The fabrication technology has undergone successive modifications and refinements to optimize device manufacturing. Particular attention has been paid to technological robustness. The plasma-enhanced etching (reactive ion etching, RIE) process was the subject of an exhaustive study based on statistical simulation of the technological steps.
Electrical measurements performed on the experimental samples permitted the devices to be modeled, the technological processing to be adjusted, and the circuit design to be improved. Electrical measurements performed on dedicated test structures during the fabrication cycle allowed the identification and correction of some technological problems (ohmic contacts, current leakage, interconnection integrity, and thermal instabilities). Feedback corrections were validated by dedicated experiments, with the experimental effort optimized by statistical techniques (fractional factorial design). (Abstract shortened by UMI.)
A new approach to the study of therapeutic work in the transference.
Pessier, J; Stuart, J
2000-02-01
This article proposes a new method for evaluating the effects of therapist and patient work in the transference. Work in the transference is often difficult for the patient, and may show a characteristic pattern of lag between a transference interpretation and its therapeutic effect. To account for this lag, we assessed patient responses to interpretations over the course of entire sessions. The narratives patients told about others, or Relationship Episodes (REs), were used as units of study. In a sample of three consecutive sessions taken from each of three psychodynamic cases, we identified several instances when transference work appeared to have an initial inhibitory effect, but facilitated progress over the course of the entire session. We recommend that, to examine the effects of interpretations, future studies use longer, more clinically meaningful segments of patient speech than have been used in the past.
Coordinate based random effect size meta-analysis of neuroimaging studies.
Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J
2017-06-01
Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random effects meta-analyses of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
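A cluster-wise random-effects meta-analysis of the kind ClusterZ performs can be sketched with the standard DerSimonian-Laird estimator. The effect sizes and variances below are invented for illustration, and this sketch omits the censoring adjustment and FCDR control described in the abstract.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird), a standard method
    of the kind applied cluster-wise; inputs here are illustrative."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se, tau2

# Hypothetical standardised effects reported by studies in one cluster.
effects = [0.42, 0.55, 0.30, 0.61]
variances = [0.010, 0.015, 0.012, 0.020]
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled = {pooled:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.4f}")
```

Running this per cluster, rather than relying on coordinate density, is what lets significance reflect the reported effect magnitudes themselves.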
Dwan, Kerry; Altman, Douglas G.; Clarke, Mike; Gamble, Carrol; Higgins, Julian P. T.; Sterne, Jonathan A. C.; Williamson, Paula R.; Kirkham, Jamie J.
2014-01-01
Background Most publications about selective reporting in clinical trials have focussed on outcomes. However, selective reporting of analyses for a given outcome may also affect the validity of findings. If analyses are selected on the basis of the results, reporting bias may occur. The aims of this study were to review and summarise the evidence from empirical cohort studies that assessed discrepant or selective reporting of analyses in randomised controlled trials (RCTs). Methods and Findings A systematic review was conducted and included cohort studies that assessed any aspect of the reporting of analyses of RCTs by comparing different trial documents, e.g., protocol compared to trial report, or different sections within a trial publication. The Cochrane Methodology Register, Medline (Ovid), PsycInfo (Ovid), and PubMed were searched on 5 February 2014. Two authors independently selected studies, performed data extraction, and assessed the methodological quality of the eligible studies. Twenty-two studies (containing 3,140 RCTs) published between 2000 and 2013 were included. Twenty-two studies reported on discrepancies between information given in different sources. Discrepancies were found in statistical analyses (eight studies), composite outcomes (one study), the handling of missing data (three studies), unadjusted versus adjusted analyses (three studies), handling of continuous data (three studies), and subgroup analyses (12 studies). Discrepancy rates varied, ranging from 7% (3/42) to 88% (7/8) in statistical analyses, 46% (36/79) to 82% (23/28) in adjusted versus unadjusted analyses, and 61% (11/18) to 100% (25/25) in subgroup analyses. This review is limited in that none of the included studies investigated the evidence for bias resulting from selective reporting of analyses. It was not possible to combine studies to provide overall summary estimates, and so the results of studies are discussed narratively. 
Conclusions Discrepancies in analyses between publications and other study documentation were common, but reasons for these discrepancies were not discussed in the trial reports. To ensure transparency, protocols and statistical analysis plans need to be published, and investigators should adhere to these or explain discrepancies. Please see later in the article for the Editors' Summary PMID:24959719
ERIC Educational Resources Information Center
Tractenberg, Rochelle E.
2017-01-01
Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation--and possibly more importantly, the replication--of results are…
Using DEWIS and R for Multi-Staged Statistics e-Assessments
ERIC Educational Resources Information Center
Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.
2016-01-01
We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…
Jeffrey P. Prestemon
2009-01-01
Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T; Pereira, Carol; Rosenkranz, Susan L; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu Jeanne; Wang, Rui; Lok, Judith; Evans, Scott R
2017-03-15
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research (AACR).
Glass-Kaastra, Shiona K.; Pearl, David L.; Reid-Smith, Richard J.; McEwen, Beverly; Slavic, Durda; McEwen, Scott A.; Fairles, Jim
2014-01-01
Antimicrobial susceptibility data on Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from Ontario swine (January 1998 to October 2010) were acquired from a comprehensive diagnostic veterinary laboratory in Ontario, Canada. In relation to the possible development of a surveillance system for antimicrobial resistance, data were assessed for ease of management, completeness, consistency, and applicability for temporal and spatial statistical analyses. Limited farm location data precluded spatial analyses and missing demographic data limited their use as predictors within multivariable statistical models. Changes in the standard panel of antimicrobials used for susceptibility testing reduced the number of antimicrobials available for temporal analyses. Data consistency and quality could improve over time in this and similar diagnostic laboratory settings by encouraging complete reporting with sample submission and by modifying database systems to limit free-text data entry. These changes could make more statistical methods available for disease surveillance and cluster detection. PMID:24688133
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
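Fisher's pitfall is easy to reproduce: dividing two unrelated variables by a common third variable manufactures correlation where none exists. The simulation below uses arbitrary distributions chosen only for illustration.

```python
import random

random.seed(2)

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# x, y, z are mutually independent: no true x-y relationship exists.
n = 5000
x = [random.gauss(10, 2) for _ in range(n)]
y = [random.gauss(10, 2) for _ in range(n)]
z = [random.gauss(10, 2) for _ in range(n)]

r_raw = pearson(x, y)                              # near zero, as it should be
r_ratio = pearson([a / c for a, c in zip(x, z)],
                  [b / c for b, c in zip(y, z)])   # spurious correlation
print(f"raw r = {r_raw:.3f}, ratio r = {r_ratio:.3f}")
```

The shared denominator alone drives the second correlation, which is the core of the argument against uncritical use of ratio index variables when an ANCOVA-style adjustment on the raw variables is available.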
ERIC Educational Resources Information Center
Kadhi, T.; Holley, D.; Rudley, D.; Garrison, P.; Green, T.
2010-01-01
The following report gives the statistical findings of the 2010 Thurgood Marshall School of Law (TMSL) Texas Bar results. These data are pre-existing and were given to the Evaluator by email from the Dean. Then, in-depth statistical analyses were run using SPSS 17 to address the following questions: 1. What are the statistical descriptors of the…
A statistical package for computing time and frequency domain analysis
NASA Technical Reports Server (NTRS)
Brownlow, J.
1978-01-01
The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
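Two of the SPA capabilities, linear trend removal and frequency-domain characterization, can be sketched as follows. This is a from-scratch illustration of the techniques, not SPA's actual implementation, and the synthetic record is invented.

```python
import cmath
import math

def detrend(x):
    """Remove the least-squares linear trend (a preanalysis step)."""
    n = len(x)
    t = list(range(n))
    tbar, xbar = (n - 1) / 2, sum(x) / n
    slope = sum((ti - tbar) * (xi - xbar) for ti, xi in zip(t, x)) \
            / sum((ti - tbar) ** 2 for ti in t)
    return [xi - (xbar + slope * (ti - tbar)) for ti, xi in zip(t, x)]

def power_spectrum(x):
    """Naive DFT periodogram: frequency-domain characterization."""
    n = len(x)
    return [abs(sum(x[k] * cmath.exp(-2j * math.pi * f * k / n)
                    for k in range(n))) ** 2 / n
            for f in range(n // 2 + 1)]

# Synthetic record: linear drift plus a sinusoid at 8 cycles per record.
n = 128
data = [0.05 * k + math.sin(2 * math.pi * 8 * k / n) for k in range(n)]

clean = detrend(data)
spec = power_spectrum(clean)
peak = max(range(len(spec)), key=lambda f: spec[f])
print(f"dominant frequency bin: {peak}")
```

After the drift is removed, the periodogram peaks at bin 8, recovering the injected frequency; without detrending, low-frequency leakage from the drift would obscure it.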
NASA Astrophysics Data System (ADS)
Gaffet, Eric
2011-09-01
Nanomaterials are an active area of research but also an economic sector in full expansion that addresses many application domains. For instance, French production for the most common nanomaterials (such as silica, titanium dioxide, carbon black) is in the hundreds of thousands of tons. As for any innovation, one must consider the risks and, if necessary, establish rules to protect the health of consumers and workers. This article addresses, in particular, the difficulties in defining these materials, the state of knowledge on human and environmental toxicity, and the regulatory requirements and agencies in charge of safety.
NASA Astrophysics Data System (ADS)
Daoudi, Lahcen; Pot de Vin, Jean-Luc
Thermal and hydrothermal effects of the Triassic-Liassic basalt flow emplacement on the sedimentary series of the Argana Basin are responsible for major modifications in detrital clays down to a depth of 20 m. These are expressed by the transformation of detrital smectite to corrensite and further to chlorite, and by an increase in illite crystallinity. In the 2 m of sediments located immediately under the flow, magnesium-rich hydrothermal fluids caused the precipitation of new mineral phases. To cite this article: L. Daoudi, J.-L. Pot de Vin, C. R. Geoscience 334 (2002) 463-468.
2010-08-01
…consultation groups to determine the effect of joint operations on the local population. They established "performance measures"… based on relationships with the inhabitants and the results of the consultation groups, as well as "production measures" based on the… combat ethnography, and foreign-insider research can help map the social microcosms within a
1987-09-01
response. An estimate of the buffeting response for the two cases is presented in Figure 4, using the theory of Irwin (Reference 7). Data acquisition was...values were obtained using the log decrement method by exciting the bridge in one mode and observing the decay of the response. Classical theory would...added mass or structural damping level. The addition of inertia to the deck would tend to lower the response according to classical vibration theory
2009-07-01
National Defence, 2009 © Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 2009 DRDC Toronto TR… In parallel, user acceptance was assessed by means of a questionnaire. Method: Five men with a hearing threshold… hearing protection and usability. Abstract….. The intelligibility of speech from a personal role radio (PRR) fitted with headsets
Joseph Buongiorno; Ronald Raunikar; Shushuai Zhu
2011-01-01
The article presents an exploration, conducted by means of a global forest-sector model, of the effect on the French forest-products sector of current and foreseeable changes in world demand for biomass energy. Two contrasting scenarios are tested. The results are put into perspective and highlight the potential conflict between uses of wood: wood…
2010-08-01
…high-level performance indicators, which suggests that the fleet reacts inelastically to spending shocks. These results reveal… correlations between the data. DRDC CORA TM 2010-168 … shocks such as those observed in the data have little effect on performance. We can conclude that maintenance is very robust; shocks to
Lewis, Gregory F.; Furman, Senta A.; McCool, Martha F.; Porges, Stephen W.
2011-01-01
Three frequently used RSA metrics are investigated to document violations of assumptions for parametric analyses, moderation by respiration, influences of nonstationarity, and sensitivity to vagal blockade. Although all metrics are highly correlated, new findings illustrate that the metrics are noticeably different on the above dimensions. Only one method conforms to the assumptions for parametric analyses, is not moderated by respiration, is not influenced by nonstationarity, and reliably generates stronger effect sizes. Moreover, this method is also the most sensitive to vagal blockade. Specific features of this method may provide insights into improving the statistical characteristics of other commonly used RSA metrics. These data provide the evidence to question, based on statistical grounds, published reports using particular metrics of RSA. PMID:22138367
Confidence crisis of results in biomechanics research.
Knudson, Duane
2017-11-01
Many biomechanics studies have small sample sizes and incorrect statistical analyses, so reporting of inaccurate inferences and inflated magnitude of effects are common in the field. This review examines these issues in biomechanics research and summarises potential solutions from research in other fields to increase the confidence in the experimental effects reported in biomechanics. Authors, reviewers and editors of biomechanics research reports are encouraged to improve sample sizes and the resulting statistical power, improve reporting transparency, improve the rigour of statistical analyses used, and increase the acceptance of replication studies to improve the validity of inferences from data in biomechanics research. The application of sports biomechanics research results would also improve if a larger percentage of unbiased effects and their uncertainty were reported in the literature.
Bennett, Bradley C; Husby, Chad E
2008-03-28
Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
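The binomial model for testing whether a single family is over-represented can be sketched with an exact two-sided binomial test. The counts below are hypothetical, not the Shuar data.

```python
from math import comb

def binom_two_sided_p(k, n, p):
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    pmf = [comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(n + 1)]
    return min(1.0, sum(q for q in pmf if q <= pmf[k] + 1e-12))

# Hypothetical family: 30 of its 60 species are used medicinally,
# tested against an assumed flora-wide baseline of 25% medicinal species.
p_value = binom_two_sided_p(k=30, n=60, p=0.25)
print(f"two-sided p = {p_value:.3g}")
```

Unlike the regression/residual approach, this test makes no linearity assumptions: each family is compared directly against the flora-wide proportion under a null of random selection.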
Validating Future Force Performance Measures (Army Class): Concluding Analyses
2016-06-01
Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores… Table 4.7. Descriptive Statistics for Analysis Criteria… Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness
FHWA statistical program : a customer's guide to using highway statistics
DOT National Transportation Integrated Search
1995-08-01
The appropriate level of spatial and temporal data aggregation for highway vehicle emissions analyses is one of several important analytical questions that have received considerable interest following passage of the Clean Air Act Amendments (CAAA) of…
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
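The four groups above share one parallelization pattern: partition the input units, compute per-unit statistics on each worker, and merge the results in order. ParallABEL implements this in R on top of Rmpi; the following standalone Python sketch is only an illustration of the pattern for a "group 1" per-SNP computation, with toy data, a thread pool standing in for MPI workers, and minor-allele frequency as an invented per-SNP statistic.

```python
# Partition / compute / merge pattern for per-SNP ("group 1") statistics.
# Toy-sized data; a thread pool stands in for ParallABEL's MPI processes.
from concurrent.futures import ThreadPoolExecutor
import random

random.seed(0)
N_IND, N_SNP = 200, 1000          # individuals x SNPs (toy sizes)
# genotypes coded as 0/1/2 copies of the minor allele
geno = [[random.choice((0, 1, 2)) for _ in range(N_IND)] for _ in range(N_SNP)]

def snp_stat(rows):
    """Per-SNP statistic for one chunk of SNPs: minor-allele frequency."""
    return [sum(r) / (2 * len(r)) for r in rows]

def chunks(seq, k):
    """Partition `seq` into k roughly equal contiguous chunks."""
    step = -(-len(seq) // k)
    return [seq[i:i + step] for i in range(0, len(seq), step)]

with ThreadPoolExecutor(max_workers=4) as pool:
    parts = pool.map(snp_stat, chunks(geno, 4))   # compute chunks in parallel
freqs = [f for part in parts for f in part]       # merge in SNP order
```

Because the chunks are merged in submission order, the parallel result is identical to a sequential pass over all SNPs.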
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
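Two of the simplest methods identified can be stated in one line each. The versions below are the forms most often used in practice (an assumption on my part; the review catalogues many variants): the SD approximated from the range as (max − min)/4, and the mean approximated from the three quartiles as (q1 + median + q3)/3.

```python
# Simple summary-statistic imputations for meta-analysis, in commonly
# used forms (assumed here; the review covers many variants).

def sd_from_range(minimum, maximum):
    """Crude SD estimate when only the range is reported."""
    return (maximum - minimum) / 4

def mean_from_quartiles(q1, median, q3):
    """Mean estimate when only the three quartiles are reported."""
    return (q1 + median + q3) / 3
```

For a trial reporting a range of 10 to 50, the SD estimate is 10; for quartiles 20, 28, and 36, the mean estimate is 28.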
Mathematical background and attitudes toward statistics in a sample of Spanish college students.
Carmona, José; Martínez, Rafael J; Sánchez, Manuel
2005-08-01
To examine the relation between mathematical background and initial attitudes toward statistics among Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. Analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for future research are discussed.
Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.
Groppe, David M; Urbach, Thomas P; Kutas, Marta
2011-12-01
Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
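Strong FWER control via the permutation max-statistic, the first of the four corrections reviewed, can be sketched compactly. The toy example below (invented paired ERP difference data, sign-flipping permutations) builds the null distribution of the maximum absolute t value over all time points and thresholds the observed t statistics against its 95th percentile.

```python
# Max-statistic permutation test for strong FWER control (toy ERP data).
import random

random.seed(1)
n_sub, n_time = 12, 50
# paired difference waves: a true effect only at time points 20-29
diff = [[random.gauss(1.5 if 20 <= t < 30 else 0.0, 1.0)
         for t in range(n_time)] for _ in range(n_sub)]

def t_stats(d):
    """One-sample t statistic across subjects at every time point."""
    n = len(d)
    out = []
    for t in range(len(d[0])):
        x = [row[t] for row in d]
        m = sum(x) / n
        var = sum((v - m) ** 2 for v in x) / (n - 1)
        out.append(m / (var / n) ** 0.5)
    return out

obs = t_stats(diff)

# Null distribution of max |t|: flip the sign of each subject's whole
# difference wave (exchangeable under H0 of no condition difference).
max_null = []
for _ in range(500):
    flipped = []
    for row in diff:
        s = random.choice((-1, 1))
        flipped.append([s * v for v in row])
    max_null.append(max(abs(t) for t in t_stats(flipped)))
max_null.sort()
crit = max_null[int(0.95 * len(max_null))]   # FWER-controlling threshold
sig = [t for t, v in enumerate(obs) if abs(v) > crit]
```

Any time point whose observed |t| exceeds the permutation threshold is significant with the familywise error rate held at 5%, with no parametric assumptions about the spatial or temporal correlation of the data.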
Impact of ontology evolution on functional analyses.
Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard
2012-10-15
Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.
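A deliberately simplified stand-in for such a stability assessment is to score the overlap of the significant term sets returned under two ontology releases; the paper's measures additionally weight terms by semantic relatedness, which this sketch omits. The GO identifiers below are illustrative only.

```python
# Naive result-set stability across two ontology releases: Jaccard
# overlap of the significant GO term sets (term IDs invented).

def jaccard(a, b):
    """Jaccard coefficient of two term sets; 1.0 for two empty sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

sig_release_1 = {"GO:0006915", "GO:0008283", "GO:0007049"}
sig_release_2 = {"GO:0006915", "GO:0007049", "GO:0016049"}
stability = jaccard(sig_release_1, sig_release_2)  # 2 shared / 4 total
```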
Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.
1985-12-27
Refraction Measurement ... 4.0 RESULTS ... 4.1 Descriptive Statistics ... 4.2 Predictive Statistics ... mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91 ... relatively equal numbers of participants from all classes will become obvious within the results. 4.1 Descriptive Statistics: In the original plan
Statistical reporting of clinical pharmacology research.
Ring, Arne; Schall, Robert; Loke, Yoon K; Day, Simon
2017-06-01
Research in clinical pharmacology covers a wide range of experiments, trials and investigations: clinical trials, systematic reviews and meta-analyses of drug usage after market approval, the investigation of pharmacokinetic-pharmacodynamic relationships, the search for mechanisms of action or for potential signals for efficacy and safety using biomarkers. Often these investigations are exploratory in nature, which has implications for the way the data should be analysed and presented. Here we summarize some of the statistical issues that are of particular importance in clinical pharmacology research. © 2017 The British Pharmacological Society.
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.
Update on the new 9-valent vaccine for the prevention of human papillomavirus
Yang, David Yi; Bracken, Keyna
2016-01-01
Objective To inform family physicians about the efficacy, safety, public health effects, and cost-effectiveness of the 9-valent human papillomavirus (HPV) vaccine. Quality of evidence Relevant articles published in PubMed up to May 2015 were reviewed and analyzed. Most of the cited evidence is level I (randomized controlled trials and meta-analyses) or level II (cross-sectional, case-control, and epidemiologic studies). Government reports and recommendations are also cited. Main message The 9-valent HPV vaccine, which protects against HPV types 6, 11, 16, 18, 31, 33, 45, 52, and 58, is safe and effective and will further reduce the incidence of HPV infections, as well as cases of HPV-related cancer. It can also indirectly protect unvaccinated individuals through herd immunity. An effective immunization program can prevent most cervical cancers. Analyses show that the cost-effectiveness of the 9-valent vaccine in women is comparable to that of the original quadrivalent HPV vaccine (which protects against HPV types 6, 11, 16, and 18) currently in use. However, the value of immunizing boys with the 9-valent HPV vaccine requires further investigation. Conclusion Besides being safe, the 9-valent vaccine protects better against HPV than the quadrivalent vaccine. A cost-effectiveness analysis favours its use, at least in adolescent girls. Physicians should therefore recommend the 9-valent vaccine to their patients rather than the quadrivalent HPV vaccine.
Lefrançois, Mélanie; Saint-Charles, Johanne; Riel, Jessica
2017-11-01
Whether or not official work/family balance measures exist within an organization, scheduling accommodations often go through informal channels involving colleagues and superiors and are negotiated within interpersonal relationships. This study examines the relationship dimensions of the scheduling strategies of cleaners working atypical hours in the transport sector through the lenses of ergonomic activity, network, and gender analyses. Using semi-directed interviews, observation, and network analysis, we revealed the effect of gender on relationship dynamics and the influence of these dynamics on work/family balance strategies deployed by cleaners. One of the main contributions of this study is to demonstrate the decisive effect of relationships by revealing inequalities in access to organizational social networks. Creating spaces to discuss work/family balancing and a more equitable circulation of information could contribute to reducing inequalities associated with gender, social status, and family responsibilities and support the work/family strategies developed by workers dealing with restrictive work schedules. Résumé Work-schedule accommodations for work/family balance (WFB) often go through informal arrangements embedded in relationships with colleagues or managers. Our study, integrating ergonomics and communication from a gender perspective, examines the relational dimensions of the schedule-choice strategies of cleaners coping with atypical hours in the transport sector. Drawing on semi-structured interviews, observations, and network analysis, we observed the influence of relational dynamics, particularly gender dynamics, on WFB strategies. 
A central contribution of this study is to show the structuring effect of relationships by revealing inequalities in access to the resources that facilitate schedule choice, as well as in inclusion within the network of relationships. The article concludes by proposing concrete avenues for reducing these inequalities.
NASA Astrophysics Data System (ADS)
Emoto, K.; Saito, T.; Shiomi, K.
2017-12-01
Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed with statistical methods that consider the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, small-scale random heterogeneity is generally not taken into account in the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, a decomposition into body or surface waves, or the scattering properties assumed in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. 
Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
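With the reported parameter values, the estimated spectrum can be evaluated directly; the sketch below (plain Python, units of km and 1/km) shows the flat portion below the corner wavenumber m ≈ 1/a and the steep m^-4 roll-off above it.

```python
# Evaluating the estimated spectrum P(m) = 8*pi*eps^2*a^3 / (1 + a^2 m^2)^2
# with the paper's values eps = 0.05 and a = 3.1 km (the power spectrum of
# an exponential-autocorrelation random medium).
import math

EPS, A = 0.05, 3.1   # rms fractional fluctuation, correlation length (km)

def P(m):
    """Power spectral density of the fluctuation at wavenumber m (1/km)."""
    return 8 * math.pi * EPS ** 2 * A ** 3 / (1 + (A * m) ** 2) ** 2

m_corner = 1 / A                        # corner wavenumber, ~0.32 km^-1
flat = P(0.01) / P(0.0)                 # ~1: spectrum is flat well below it
roll = P(10 * m_corner) / P(m_corner)   # steep ~m^-4 decay above it
```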
Proliferative changes in the bronchial epithelium of former smokers treated with retinoids.
Hittelman, Walter N; Liu, Diane D; Kurie, Jonathan M; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C; Walsh, Garrett; Roth, Jack A; Minna, John; Ro, Jae Y; Broxson, Anita; Hong, Waun Ki; Lee, J Jack
2007-11-07
Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and alpha-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and alpha-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67-positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per-biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index > or = 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with a high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and alpha-tocopherol treatment (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17). 
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = -0.72, 95% CI = -1.24 to -0.20; P = .007) compared with placebo, and after 13-cis-RA and alpha-tocopherol treatment (coefficient estimate = -0.66, 95% CI = -1.15 to -0.17; P = .008). In per-subject analyses, treatment with 13-cis-RA and alpha-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments.
Proliferative Changes in the Bronchial Epithelium of Former Smokers Treated With Retinoids
Hittelman, Walter N.; Liu, Diane D.; Kurie, Jonathan M.; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C.; Walsh, Garrett; Roth, Jack A.; Minna, John; Ro, Jae Y.; Broxson, Anita; Hong, Waun Ki; Lee, J. Jack
2012-01-01
Background Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and α-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Methods Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and α-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67–positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per–biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index ≥ 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. Results In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with a high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and α-tocopherol treatment (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17). 
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = −0.72, 95% CI = −1.24 to −0.20; P = .007) compared with placebo, and after 13-cis-RA and α-tocopherol treatment (coefficient estimate = −0.66, 95% CI = −1.15 to −0.17; P = .008). Conclusions In per-subject analyses, treatment with 13-cis-RA and α-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments. PMID:17971525
Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P
2008-05-20
Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that provide reliable comparison of cocaine seizures analysed on two different gas chromatographs with flame ionisation detectors (GC-FID) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficient was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. Centralising the analyses in a single laboratory is therefore no longer a prerequisite for comparing samples seized in different countries. This permits collaboration while each laboratory retains jurisdictional control over its data.
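The selected combination is easy to reproduce in outline: divide each peak area by that peak's standard deviation over the whole data set (the "N+S" pre-treatment), then compare profiles with the cosine correlation. The peak areas below are invented toy values, not real seizure data.

```python
# N+S pre-treatment followed by cosine comparison of peak-area profiles.
import math

profiles = [
    [120.0, 40.0, 15.0, 300.0],   # seizure A
    [118.0, 42.0, 14.0, 290.0],   # seizure B (linked to A)
    [300.0, 10.0, 80.0, 50.0],    # seizure C (unrelated)
]

def pretreat(data):
    """N+S: divide each peak (column) by its SD across all samples."""
    n = len(data)
    sds = []
    for col in zip(*data):
        m = sum(col) / n
        sds.append((sum((v - m) ** 2 for v in col) / (n - 1)) ** 0.5)
    return [[v / s for v, s in zip(row, sds)] for row in data]

def cosine(u, v):
    """Cosine correlation between two pre-treated profiles."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

t = pretreat(profiles)
linked, unlinked = cosine(t[0], t[1]), cosine(t[0], t[2])
```

In this toy setting the linked pair scores near 1 while the unrelated pair scores much lower, which is the discrimination behaviour the study optimised for.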
Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping
2016-10-21
Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major remaining challenge in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. To improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. The statistics to appear in the corresponding tables, listings, and graphs were outlined. The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Any post hoc analyses can be identified by referring to this SAP, reducing possible selection bias and performance bias in the trial. This study is registered at ClinicalTrials.gov (NCT01138930; registered on 7 June 2010).
Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter
2017-09-01
To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.
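As a minimal illustration of such a meta-regression, the sketch below fits effect sizes with inverse-variance weights on a single binary moderator coding whether a QE implemented design/statistical controls; with one binary predictor, weighted least squares reproduces the weighted group means. All numbers are invented, and a real synthesis would model many more study features.

```python
# Inverse-variance weighted meta-regression with one binary moderator.
effects = [0.42, 0.35, 0.60, 0.55, 0.20]    # study effect sizes (invented)
variances = [0.02, 0.03, 0.05, 0.04, 0.02]  # their sampling variances
adjusted = [1, 1, 0, 0, 1]                  # 1 = QE adjusted for confounders

w = [1 / v for v in variances]              # inverse-variance weights

def wls(y, x, w):
    """Weighted least squares for y = b0 + b1*x via 2x2 normal equations."""
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swy = sum(wi * yi for wi, yi in zip(w, y))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    det = sw * swxx - swx * swx
    b1 = (sw * swxy - swx * swy) / det
    b0 = (swxx * swy - swx * swxy) / det
    return b0, b1

b0, b1 = wls(effects, adjusted, w)   # b0: unadjusted mean; b1: adjustment gap
```

Here b0 is the weighted mean effect among unadjusted QEs and b1 the shift associated with confounder adjustment, the kind of design covariate the authors argue must be modeled.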
Detection of semi-volatile organic compounds in permeable ...
Abstract The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis: analysis of semivolatile organic compounds (SVOCs) to determine whether hydrocarbons were present in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected on 11 dates over a 3-year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a rate of less than 10%; and 22 chemicals were detected at a frequency of 10% or greater (ranging from 10% to 66.5% of samples). Fundamental and exploratory statistical analyses were performed on this last group of results, grouped by surface type. The statistical analyses were limited by the low frequency of detections and by sample dilutions, which affected detection limits. The infiltrate data for the three permeable surfaces were treated as non-parametric data, using the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware comparison hypothesis test. Additionally, Spearman rank-order non-parametric ...
Accounting for Multiple Births in Neonatal and Perinatal Trials: Systematic Review and Case Study
Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A
2010-01-01
Objectives To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births. To explore the sensitivity of an actual trial to several analytic approaches to multiples. Methods A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The NO CLD trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using non-clustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. Results In the systematic review, most studies did not describe the randomization of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (p<0.01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. Conclusions The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. PMID:19969305
Accounting for multiple births in neonatal and perinatal trials: systematic review and case study.
Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A
2010-02-01
To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births and to explore the sensitivity of an actual trial to several analytic approaches to multiples. A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The Nitric Oxide to Prevent Chronic Lung Disease (NO CLD) trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using nonclustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. In the systematic review, most studies did not describe the random assignment of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (P < .01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. Copyright 2010 Mosby, Inc. All rights reserved.
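Of the clustered strategies named above, multiple outputation is the simplest to sketch: repeatedly keep one randomly chosen infant per family, compute the statistic on the resulting independent subsample, and average across replicates. The data below are invented, with a simple proportion as the outcome.

```python
# Multiple outputation for clustered (multiple-birth) data, in miniature.
import random

random.seed(3)
# (family_id, outcome) pairs; families 0-3 are twin pairs (invented data)
infants = [(0, 1), (0, 1), (1, 0), (1, 1), (2, 1), (2, 1),
           (3, 0), (3, 0), (4, 1), (5, 0), (6, 1)]

def outputation(data, reps=2000):
    """Average a statistic over subsamples with one infant per family."""
    by_fam = {}
    for fam, y in data:
        by_fam.setdefault(fam, []).append(y)
    estimates = []
    for _ in range(reps):
        sample = [random.choice(ys) for ys in by_fam.values()]
        estimates.append(sum(sample) / len(sample))   # toy proportion
    return sum(estimates) / reps

p_hat = outputation(infants)
```

Each subsample contains independent observations, so standard methods apply within replicates, and averaging over replicates recovers the information discarded by dropping siblings.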
Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling
Wood, John
2017-01-01
Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
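The core of the reanalysis, fitting a Gaussian mixture to study-level power estimates, can be sketched with a bare-bones two-component EM in one dimension. This is not the paper's code; the data are simulated with a low-power and a high-power cluster to show how a single summary statistic would mask the two subcomponents.

```python
# Two-component 1-D Gaussian mixture fit by EM on simulated power values.
import math
import random

random.seed(4)
data = ([random.gauss(0.15, 0.05) for _ in range(300)] +   # low-power studies
        [random.gauss(0.80, 0.08) for _ in range(200)])    # high-power studies

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_gaussians(x, iters=100):
    mu = [min(x), max(x)]          # anchor components at the extremes
    sd = [0.2, 0.2]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E step: responsibility of each component for each point
        resp = []
        for v in x:
            p = [w[k] * norm_pdf(v, mu[k], sd[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M step: re-estimate weights, means and SDs
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * v for r, v in zip(resp, x)) / nk
            sd[k] = max(1e-3, (sum(r[k] * (v - mu[k]) ** 2
                                   for r, v in zip(resp, x)) / nk) ** 0.5)
    return w, mu, sd

w, mu, sd = em_two_gaussians(data)
```

The fitted means land near the two generating clusters, whereas a single median of the pooled sample would sit between them and describe neither.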
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
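The contrast the paper draws can be made concrete by comparing a Bonferroni z threshold over all pixels with an RFT-style threshold obtained from the expected Euler characteristic of a smooth 2-D Gaussian field. The sketch below keeps only the 2-D EC-density term and uses invented image dimensions and smoothness, so it is a caricature of a full RFT analysis rather than any published procedure.

```python
# Bonferroni vs. a simplified RFT (expected Euler characteristic) threshold
# for a smooth 2-D Gaussian field; dimensions and smoothness are invented.
import math
from statistics import NormalDist

ALPHA, N_PIX = 0.05, 64 * 32    # pixels in the pressure image (assumed)
FWHM = 8.0                      # field smoothness in pixels (assumed)
RESELS = N_PIX / FWHM ** 2      # resolution elements

z_bonf = NormalDist().inv_cdf(1 - ALPHA / N_PIX)

def expected_ec(z):
    """Expected Euler characteristic above z (2-D EC-density term only)."""
    return (RESELS * (4 * math.log(2)) * (2 * math.pi) ** -1.5
            * z * math.exp(-z * z / 2))

# solve expected_ec(z) = ALPHA by bisection (expected_ec decreases on [1, 6])
lo, hi = 1.0, 6.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if expected_ec(mid) > ALPHA else (lo, mid)
z_rft = (lo + hi) / 2
```

Because neighbouring pixels are correlated, the RFT threshold comes out lower than the Bonferroni one, which is exactly the conservatism the paper attributes to pixel-wise Bonferroni correction.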
Thompson, Ronald E.; Hoffman, Scott A.
2006-01-01
A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations, indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The methodology for developing the regression equations used to estimate the statistics was originally developed for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict the statistics. Caution is indicated in using the predicted statistics for small drainage areas.
Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
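The core estimation step described above, regressing a flow statistic on basin characteristics and judging fit by the coefficient of determination, can be sketched in a few lines. The data here are invented for illustration; the two predictor columns merely stand in for GIS-derived basin characteristics, not the study's actual variables.

```python
import numpy as np

rng = np.random.default_rng(1)
n_gages = 17

# Invented basin characteristics for 17 gaged basins: an intercept column,
# plus standardized stand-ins for log drainage area and mean basin elevation.
X = np.column_stack([np.ones(n_gages),
                     rng.normal(size=n_gages),
                     rng.normal(size=n_gages)])
# Invented log-transformed 7-day low flows at the same gages.
y = X @ np.array([1.0, 0.8, -0.3]) + rng.normal(scale=0.1, size=n_gages)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # fit the regression equation
pred = X @ beta
r2 = 1 - ((y - pred)**2).sum() / ((y - y.mean())**2).sum()

print(0 < r2 <= 1)  # coefficient of determination for the fitted equation
```

The fitted `beta` plays the role of the study's prediction equation: applying it to the basin characteristics of an ungaged site yields the predicted statistic.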
NASA Astrophysics Data System (ADS)
Reid, Jean-Philippe
Summary: The structure of the superconducting gap and its modulation are intimately linked to the interaction potential responsible for electron pairing in a superconductor. Studying the SC-gap structure and its modulation therefore sheds light on the nature of the pairing mechanism. In this respect, the experimental results on iron-based superconductors do not fit a single pattern, in contrast to the universal SC gap of the cuprates. In what follows, we present a systematic study of the SC gap for several pnictides. Using thermal conductivity, a directional probe of the SC gap, we were able to reveal the SC-gap structure of the following compounds: Ba1-xKxFe2As2, Ba(Fe1-xCox)2As2, LiFeAs and Fe1-deltaTe1-xSex. The study of these four compounds, from three different structural families, establishes a partial but very thorough picture of the SC-gap structure of the pnictides. As illustrated in this thesis, none of these four compounds has nodes in its SC-gap structure at optimal doping. However, at concentrations away from optimal doping in the K-Ba122 and Co-Ba122 compounds, nodes appear on the Fermi surface at the extremities of the superconducting dome. This strongly suggests that, for these compounds, the presence of nodes on the Fermi surface is detrimental to the superconducting phase. Keywords: iron-based superconductors, pnictides, superconducting gap structure, thermal conductivity
The effect of p53 on the radiosensitivity of normal and cancerous human cells
NASA Astrophysics Data System (ADS)
Little, J. B.; Li, C. Y.; Nagasawa, H.; Huang, H.
1998-04-01
The radiosensitivity of normal human fibroblasts is p53 dependent and associated with the loss of cells from the cycling population as the result of an irreversible G1 arrest; cells lacking normal p53 function show no arrest and are more radioresistant. Under conditions in which the repair of potentially lethal radiation damage is facilitated, the fraction of cells arrested in G1 is reduced and survival is enhanced. The response of human tumor cells differs significantly. The radiation-induced G1 arrest is minimal or absent in p53+ tumor cells, and loss of normal p53 function has no consistent effect on their radiosensitivity. These results suggest that p53 status may not be a useful predictive marker for the response of human solid tumors to radiation therapy.
ERIC Educational Resources Information Center
Dahabreh, Issa J.; Chung, Mei; Kitsios, Georgios D.; Terasawa, Teruhiko; Raman, Gowri; Tatsioni, Athina; Tobar, Annette; Lau, Joseph; Trikalinos, Thomas A.; Schmid, Christopher H.
2013-01-01
We performed a survey of meta-analyses of test performance to describe the evolution in their methods and reporting. Studies were identified through MEDLINE (1966-2009), reference lists, and relevant reviews. We extracted information on clinical topics, literature review methods, quality assessment, and statistical analyses. We reviewed 760…
NASA Technical Reports Server (NTRS)
Davis, B. J.; Feiveson, A. H.
1975-01-01
Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.
Quasi-Static Probabilistic Structural Analyses Process and Criteria
NASA Technical Reports Server (NTRS)
Goldberg, B.; Verderaime, V.
1999-01-01
Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insight and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws, which can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout prevailing deterministic stress processes leads to a safety factor in statistical format which, integrated into the safety index, provides a safety factor and first-order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures, consistent with traditional safety factor methods and NASA Std. 5001 criteria.
One-dimensional statistical parametric mapping in Python.
Pataky, Todd C
2012-01-01
Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
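The central idea of 1D SPM, computing a test statistic at every node of a set of registered curves, can be sketched without the package itself. This numpy-only example uses synthetic curves with an invented group difference; it builds the t-statistic field that SPM1D would then assess topologically, and is not the package's own API.

```python
import numpy as np

rng = np.random.default_rng(2)
n_curves, n_nodes = 10, 101

# Two invented groups of registered 1D curves (e.g. a stance-phase variable),
# with a genuine group difference injected over nodes 40-59.
base = np.sin(np.linspace(0, np.pi, n_nodes))
A = base + rng.normal(scale=0.2, size=(n_curves, n_nodes))
B = base + rng.normal(scale=0.2, size=(n_curves, n_nodes))
B[:, 40:60] += 0.5

# Two-sample t statistic at every node -> a 1D statistical parametric map.
va, vb = A.var(axis=0, ddof=1), B.var(axis=0, ddof=1)
t = (B.mean(axis=0) - A.mean(axis=0)) / np.sqrt(va / n_curves + vb / n_curves)

in_region = np.abs(t[40:60]).mean()
elsewhere = np.abs(np.concatenate([t[:40], t[60:]])).mean()
print(in_region > elsewhere)  # the injected difference dominates the map
```

In SPM1D the resulting statistic field would then be thresholded using random field theory rather than inspected node by node as here.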
Football goal distributions and extremal statistics
NASA Astrophysics Data System (ADS)
Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.
2002-12-01
We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions, which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores, and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 with Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
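The comparison being drawn, whether an empirical goal count is heavier-tailed than a Poisson fit to the same mean, can be illustrated with simulated counts. This sketch uses a negative binomial purely as a convenient heavy-tailed stand-in for match scores; it is invented data, not the paper's dataset.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)

# Invented goals-per-match counts drawn from a negative binomial, a
# heavier-tailed count model than a Poisson with the same mean.
goals = rng.negative_binomial(2, 2 / 3, size=20000)   # mean 1, variance 1.5
lam = goals.mean()

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Probability of a 5+ goal tally: observed versus the fitted Poisson.
obs_tail = (goals >= 5).mean()
poisson_tail = 1 - sum(poisson_pmf(k, lam) for k in range(5))

print(obs_tail > poisson_tail)  # the empirical tail is heavier than Poisson
```

Fitting only the mean and then checking the extreme tail is exactly where a Poisson model breaks down for correlated or over-dispersed scoring processes.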
Logistic regression applied to natural hazards: rare event logistic regression with replications
NASA Astrophysics Data System (ADS)
Guns, M.; Vanacker, V.
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. This technique was here developed for the analyses of landslide controlling factors, but the concept is widely applicable for statistical analyses of natural hazards.
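The replication idea can be sketched as follows: refit the logistic regression on many bootstrap resamples and retain only predictors whose coefficients are stable across replications. This is a simplified numpy illustration with invented data and a plain gradient-ascent fitter, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

def fit_logistic(X, y, lr=0.5, iters=800):
    """Plain gradient-ascent logistic regression (no regularisation)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# Invented landslide data: x1 is a genuine controlling factor, x2 is noise,
# and events (landslides) are rare.
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
p_true = 1 / (1 + np.exp(-(-3.5 + 1.5 * X[:, 1])))
y = (rng.random(n) < p_true).astype(float)

# Replications: refit on bootstrap resamples and track coefficient signs.
signs = []
for _ in range(30):
    idx = rng.integers(0, n, n)
    signs.append(np.sign(fit_logistic(X[idx], y[idx])[1:]))
stability = np.abs(np.array(signs).mean(axis=0))  # 1.0 = sign never flips

print(stability[0] > 0.9)  # the genuine factor is selected robustly
```

A noise predictor whose estimated sign flips between resamples would be flagged as sample-dependent, which is precisely the failure mode the replicated approach guards against.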
Gadbury, Gary L.; Allison, David B.
2012-01-01
Much has been written regarding p-values below certain thresholds (most notably 0.05) denoting statistical significance and the tendency of such p-values to be more readily publishable in peer-reviewed journals. Intuition suggests that there may be a tendency to manipulate statistical analyses to push a “near significant p-value” to a level that is considered significant. This article presents a method for detecting the presence of such manipulation (herein called “fiddling”) in a distribution of p-values from independent studies. Simulations are used to illustrate the properties of the method. The results suggest that the method has low type I error and that power approaches acceptable levels as the number of p-values being studied approaches 1000. PMID:23056287
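The signature such a method looks for can be simulated directly: under the null, p-values are uniform, so the counts just below and just above 0.05 should be comparable, and "fiddling" creates an excess just below. This is a minimal sketch with invented numbers, not the article's actual test statistic.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000

# Invented p-values from independent null studies: uniform on [0, 1].
p = rng.random(n)

# Simulated "fiddling": most p-values just above 0.05 get nudged below it.
mask = (p > 0.05) & (p < 0.07) & (rng.random(n) < 0.8)
p[mask] = rng.uniform(0.030, 0.049, mask.sum())

below = ((p >= 0.03) & (p < 0.05)).sum()   # counts just below the threshold
above = ((p > 0.05) & (p <= 0.07)).sum()   # counts just above it

print(below > above)  # an excess below 0.05 is the signature of fiddling
```

With roughly 1000 p-values the asymmetry between adjacent bins becomes detectable, which matches the article's observation about the power of the method at that scale.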
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
distributing and collecting survey data, and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION To effectively access a... Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and
Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q
2015-11-01
To screen and identify safe micro-organisms used during Douchi fermentation, and verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and automatic amino-acid analyser were used to investigate the microbial diversity and free amino acids (FAAs) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlation were identified. Then an experiment on Douchi fermentation by identified strains was carried out, and the nutritional composition in Douchi was analysed. Results showed that FAAs and relative content of isoflavone aglycones in verification Douchi samples were generally higher than those in commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and with identified probiotic micro-organisms participating in fermentation, a higher quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.
Unconscious analyses of visual scenes based on feature conjunctions.
Tachibana, Ryosuke; Noguchi, Yasuki
2015-06-01
To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.
Study Designs and Statistical Analyses for Biomarker Research
Gosho, Masahiko; Nagashima, Kengo; Sato, Yasunori
2012-01-01
Biomarkers are becoming increasingly important for streamlining drug discovery and development. In addition, biomarkers are widely expected to be used as a tool for disease diagnosis, personalized medication, and surrogate endpoints in clinical research. In this paper, we highlight several important aspects related to study design and statistical analysis for clinical research incorporating biomarkers. We describe the typical and current study designs for exploring, detecting, and utilizing biomarkers. Furthermore, we introduce statistical issues such as confounding and multiplicity for statistical tests in biomarker research. PMID:23012528
NASA Technical Reports Server (NTRS)
Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.
2017-01-01
Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.
Accuracy of medical subject heading indexing of dental survival analyses.
Layton, Danielle M; Clarke, Michael
2014-01-01
To assess the Medical Subject Headings (MeSH) indexing of articles that employed time-to-event analyses to report outcomes of dental treatment in patients. Articles published in 2008 in 50 dental journals with the highest impact factors were hand searched to identify articles reporting dental treatment outcomes over time in human subjects with time-to-event statistics (included, n = 95), without time-to-event statistics (active controls, n = 91), and all other articles (passive controls, n = 6,769). The search was systematic (kappa 0.92 for screening, 0.86 for eligibility). Outcome-, statistic- and time-related MeSH were identified, and differences in allocation between groups were analyzed with chi-square and Fisher exact statistics. The most frequently allocated MeSH for included and active control articles were "dental restoration failure" (77% and 52%, respectively) and "treatment outcome" (54% and 48%, respectively). Outcome MeSH was similar between these groups (86% and 77%, respectively) and significantly greater than passive controls (10%, P < .001). Significantly more statistical MeSH were allocated to the included articles than to the active or passive controls (67%, 15%, and 1%, respectively, P < .001). Sixty-nine included articles specifically used Kaplan-Meier or life table analyses, but only 42% (n = 29) were indexed as such. Significantly more time-related MeSH were allocated to the included than the active controls (92% and 79%, respectively, P = .02), or to the passive controls (22%, P < .001). MeSH allocation within MEDLINE to time-to-event dental articles was inaccurate and inconsistent. Statistical MeSH were omitted from 30% of the included articles and incorrectly allocated to 15% of active controls. Such errors adversely impact search accuracy.
Statistical analysis of iron geochemical data suggests limited late Proterozoic oxygenation
NASA Astrophysics Data System (ADS)
Sperling, Erik A.; Wolock, Charles J.; Morgan, Alex S.; Gill, Benjamin C.; Kunzmann, Marcus; Halverson, Galen P.; MacDonald, Francis A.; Knoll, Andrew H.; Johnston, David T.
2015-07-01
Sedimentary rocks deposited across the Proterozoic-Phanerozoic transition record extreme climate fluctuations, a potential rise in atmospheric oxygen or re-organization of the seafloor redox landscape, and the initial diversification of animals. It is widely assumed that the inferred redox change facilitated the observed trends in biodiversity. Establishing this palaeoenvironmental context, however, requires that changes in marine redox structure be tracked by means of geochemical proxies and translated into estimates of atmospheric oxygen. Iron-based proxies are among the most effective tools for tracking the redox chemistry of ancient oceans. These proxies are inherently local, but have global implications when analysed collectively and statistically. Here we analyse about 4,700 iron-speciation measurements from shales 2,300 to 360 million years old. Our statistical analyses suggest that subsurface water masses in mid-Proterozoic oceans were predominantly anoxic and ferruginous (depleted in dissolved oxygen and iron-bearing), but with a tendency towards euxinia (sulfide-bearing) that is not observed in the Neoproterozoic era. Analyses further indicate that early animals did not experience appreciable benthic sulfide stress. Finally, unlike proxies based on redox-sensitive trace-metal abundances, iron geochemical data do not show a statistically significant change in oxygen content through the Ediacaran and Cambrian periods, sharply constraining the magnitude of the end-Proterozoic oxygen increase. Indeed, this re-analysis of trace-metal data is consistent with oxygenation continuing well into the Palaeozoic era. Therefore, if changing redox conditions facilitated animal diversification, it did so through a limited rise in oxygen past critical functional and ecological thresholds, as is seen in modern oxygen minimum zone benthic animal communities.
Ensor, Joie; Riley, Richard D.
2016-01-01
Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915
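The two-stage approach is easy to sketch: stage one reduces each study's IPD to an effect estimate and its variance, and stage two pools those aggregates by inverse-variance weighting. Below is a minimal fixed-effect illustration with simulated trials; the data and the common 0.4 treatment effect are invented, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented IPD: five trials with a binary treatment, a continuous outcome,
# and a common true treatment effect of 0.4.
studies = []
for _ in range(5):
    n = int(rng.integers(60, 200))
    treat = rng.integers(0, 2, n)
    y = 0.4 * treat + rng.normal(size=n)
    studies.append((treat, y))

# Stage 1: reduce each study's IPD to an effect estimate and its variance.
est, var = [], []
for treat, y in studies:
    d = y[treat == 1].mean() - y[treat == 0].mean()
    v = (y[treat == 1].var(ddof=1) / (treat == 1).sum()
         + y[treat == 0].var(ddof=1) / (treat == 0).sum())
    est.append(d)
    var.append(v)
est, var = np.array(est), np.array(var)

# Stage 2: inverse-variance weighted (fixed-effect) pooling.
w = 1 / var
pooled = (w * est).sum() / w.sum()
se = np.sqrt(1 / w.sum())

print(abs(pooled - 0.4) < 0.5)  # pooled estimate sits close to the true effect
```

A one-stage analysis would instead fit a single (e.g. hierarchical) regression to all participant rows at once; as the tutorial stresses, differences between the two usually trace back to modelling assumptions rather than the staging itself.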
SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2002-01-01
This atlas presents detailed intercomparisons of several climatological wind and temperature data sets which cover the middle atmosphere (over altitudes approx. 10-80 km). A number of middle atmosphere climatologies have been developed in the research community based on a variety of meteorological analyses and satellite data sets. Here we present comparisons between these climatological data sets for a number of basic circulation statistics, such as zonal mean temperature, winds and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. We also include comparisons between the global climatologies and historical rocketsonde wind and temperature measurements, and also with more recent lidar temperature data. These comparisons highlight differences and uncertainties in contemporary middle atmosphere data sets, and allow biases in particular analyses to be isolated. In addition, a brief atlas of zonal mean temperature and wind statistics is provided to highlight data availability and as a quick-look reference. This technical report is intended as a companion to the climatological data sets held in archive at the SPARC Data Center (http://www.sparc.sunysb.edu).
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
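The core computation that permGPU accelerates, permutation resampling of a test statistic, looks like this on a CPU. This is a numpy sketch with invented data for a single probe; the actual package runs many probes and several test statistics in parallel on the GPU.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented two-group expression values for a single probe.
x = rng.normal(0.0, 1.0, 25)
y = rng.normal(1.5, 1.0, 25)

obs = y.mean() - x.mean()            # observed test statistic
pooled = np.concatenate([x, y])

# Permutation resampling: reshuffle group labels and rebuild the statistic.
n_perm = 10000
null = np.empty(n_perm)
for i in range(n_perm):
    perm = rng.permutation(pooled)
    null[i] = perm[25:].mean() - perm[:25].mean()

pval = (np.abs(null) >= abs(obs)).mean()
print(pval)  # permutation p-value for the group difference
```

Each permutation is independent of the others, which is what makes the problem "embarrassingly parallel" and a natural fit for GPU execution.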
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused in a particular type of primary data. Most available softwares have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
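The spreadsheet computations described, fixed-effect pooling and a DerSimonian-Laird random-effects model, can be written out explicitly. The study-level prevalence estimates and standard errors below are invented for illustration, not taken from the guide.

```python
import numpy as np

# Invented study-level prevalence estimates and their standard errors,
# as one might enter into the spreadsheet.
est = np.array([0.12, 0.18, 0.09, 0.22, 0.15])
se = np.array([0.02, 0.03, 0.02, 0.04, 0.03])

w = 1 / se**2                                  # fixed-effect weights
fixed = (w * est).sum() / w.sum()

# DerSimonian-Laird estimate of the between-study variance tau^2.
q = (w * (est - fixed)**2).sum()
df = len(est) - 1
tau2 = max(0.0, (q - df) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1 / (se**2 + tau2)                      # random-effects weights
pooled_re = (w_re * est).sum() / w_re.sum()

# Heterogeneity (tau^2 > 0) pulls the weights toward equality, so here the
# pooled estimate moves toward the smaller, higher-prevalence studies.
print(tau2 > 0 and pooled_re > fixed)
```

These are exactly the cell formulas a fixed-effect or random-effects spreadsheet implementation would encode; the forest plot then displays `est` with confidence intervals alongside the pooled summary.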
Statistical Prediction in Proprietary Rehabilitation.
ERIC Educational Resources Information Center
Johnson, Kurt L.; And Others
1987-01-01
Applied statistical methods to predict case expenditures for low back pain rehabilitation cases in proprietary rehabilitation. Extracted predictor variables from case records of 175 workers compensation claimants with some degree of permanent disability due to back injury. Performed several multiple regression analyses resulting in a formula that…
Injuries and Illnesses of Vietnam War POWs Revisited: IV. Air Force Risk Factors
2017-03-22
predominantly aviators imprisoned in North Vietnam. Statistical analyses were performed using SPSS version 19. Pearson correlations were obtained... Repatriated Prisoner of War Initial Medical Evaluation Forms. Department of Defense. Washington, D.C. 5. IBM Corporation (2010). IBM SPSS Statistics for
Statistics for Learning Genetics
ERIC Educational Resources Information Center
Charles, Abigail Sheena
2012-01-01
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…
USDA-ARS?s Scientific Manuscript database
Characterizing population genetic structure across geographic space is a fundamental challenge in population genetics. Multivariate statistical analyses are powerful tools for summarizing genetic variability, but geographic information and accompanying metadata is not always easily integrated into t...
P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.
P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web-service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/) and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).
The disagreeable behaviour of the kappa statistic.
Flight, Laura; Julious, Steven A
2015-01-01
It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
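The quantities recommended alongside kappa are straightforward to compute. This sketch uses an invented 2x2 agreement table with skewed marginals to show why PABAK (prevalence- and bias-adjusted kappa) can diverge from the raw kappa statistic.

```python
import numpy as np

# Invented 2x2 agreement table with skewed marginal totals:
# rows = rater A (yes/no), columns = rater B (yes/no).
table = np.array([[80.0, 5.0],
                  [7.0, 8.0]])
n = table.sum()

po = np.trace(table) / n                 # observed proportion of concordance
marg_a = table.sum(axis=1) / n
marg_b = table.sum(axis=0) / n
pe = (marg_a * marg_b).sum()             # agreement expected by chance
kappa = (po - pe) / (1 - pe)

# Prevalence- and bias-adjusted kappa for a 2x2 table: PABAK = 2*po - 1.
pabak = 2 * po - 1

print(kappa < pabak)  # skewed marginals depress kappa relative to PABAK
```

Here observed agreement is high (88%) yet kappa comes out much lower than PABAK, illustrating the marginal-total sensitivity the paper warns about.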
Analysis and meta-analysis of single-case designs: an introduction.
Shadish, William R
2014-04-01
The last 10 years have seen great progress in the analysis and meta-analysis of single-case designs (SCDs). This special issue includes five articles that provide an overview of current work on that topic, including standardized mean difference statistics, multilevel models, Bayesian statistics, and generalized additive models. Each article analyzes a common example across articles and presents syntax or macros for how to do them. These articles are followed by commentaries from single-case design researchers and journal editors. This introduction briefly describes each article and then discusses several issues that must be addressed before we can know what analyses will eventually be best to use in SCD research. These issues include modeling trend, modeling error covariances, computing standardized effect size estimates, assessing statistical power, incorporating more accurate models of outcome distributions, exploring whether Bayesian statistics can improve estimation given the small samples common in SCDs, and the need for annotated syntax and graphical user interfaces that make complex statistics accessible to SCD researchers. The article then discusses reasons why SCD researchers are likely to incorporate statistical analyses into their research more often in the future, including changing expectations and contingencies regarding SCD research from outside SCD communities, changes and diversity within SCD communities, corrections of erroneous beliefs about the relationship between SCD research and statistics, and demonstrations of how statistics can help SCD researchers better meet their goals. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
1991-09-01
In subsequent discussions, we shall classify a clutter process as predominantly Rayleigh if the value of f is less than 0.8, and the Pfa ...classified as "others", the Pfa vs. threshold curve is closer to the Ricean model than to the Rayleigh model, and the value of the parameter 0 was usually...better, and for precipitation clutter the lattice was 4 to approximation of "...equally likely". PD and PFA are 6 dB better, here specified as 0.5 and 0.01
NASA Astrophysics Data System (ADS)
Ghosn, Rania; Villégier, Anne-Sophie; Selmaoui, Brahim; Thuróczy, Georges; de Sèze, René
2013-05-01
Most clinical studies of radiofrequency electromagnetic fields (RF) have been directed at mobile phone-related exposures, usually at the level of the head, and at their effects on physiological functions including sleep, brain electrical activity (EEG), cognitive processes, brain vascularisation and, more generally, the cardiovascular and endocrine systems. They were frequently carried out on healthy adults. Effects on the amplitude of EEG alpha waves, mainly during sleep, appear reproducible. It would nevertheless be important to define more precisely whether and how the absence of electromagnetic disturbance between the RF exposure and the recording systems is checked. No consensus has emerged about cognitive effects. Some effects on cerebral vascularisation require complementary work.
Traumatic stress: effects on the brain
Bremner, J. Douglas
2006-01-01
Brain areas implicated in the stress response include the amygdala, hippocampus, and prefrontal cortex. Traumatic stress can be associated with lasting changes in these brain areas. Traumatic stress is associated with increased cortisol and norepinephrine responses to subsequent stressors. Antidepressants have effects on the hippocampus that counteract the effects of stress. Findings from animal studies have been extended to patients with post-traumatic stress disorder (PTSD) showing smaller hippocampal and anterior cingulate volumes, increased amygdala function, and decreased medial prefrontal/anterior cingulate function. In addition, patients with PTSD show increased cortisol and norepinephrine responses to stress. Treatments that are efficacious for PTSD show a promotion of neurogenesis in animal studies, as well as promotion of memory and increased hippocampal volume in PTSD. PMID:17290802
Somerville, Jane
2012-12-01
The World Congress of Paediatric Cardiology and Cardiac Surgery has survived with minimal assets and simple organisation. Each congress is special, taking on the humour, flavour, and culture of the organising country. It is hard work for a few organisers and money is hard to raise. The steering committee works closely, fairly, and successfully, and even though accused of being secretive and effete that does not matter. It is efficient and produces successful, happy world congresses, where all involved with the speciality are welcome. With so many "grown-ups" with congenital heart disease, it is no longer just a paediatric problem - maybe the name of this congress must change again. Regardless, the flag must fly on.
An Exploratory Data Analysis System for Support in Medical Decision-Making
Copeland, J. A.; Hamel, B.; Bourne, J. R.
1979-01-01
An experimental system was developed to allow retrieval and analysis of data collected during a study of neurobehavioral correlates of renal disease. After retrieving data organized in a relational data base, simple bivariate statistics of parametric and nonparametric nature could be conducted. An “exploratory” mode in which the system provided guidance in selection of appropriate statistical analyses was also available to the user. The system traversed a decision tree using the inherent qualities of the data (e.g., the identity and number of patients, tests, and time epochs) to search for the appropriate analyses to employ.
Statistical study of air pollutant concentrations via generalized gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marani, A.; Lavagnini, I.; Buttazzoni, C.
1986-11-01
This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd) which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice underlines the efficiency of ggd models in portraying experimental data.
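As a hedged sketch of the modelling approach the abstract describes, `scipy.stats.gengamma` can fit the generalized gamma and two of the nested models it embodies. The simulated values below are invented stand-ins for concentration data; the Venice SO2 measurements are not reproduced here:

```python
import numpy as np
from scipy import stats

# Simulated positive, right-skewed "concentration" data; the parameters
# are invented, not the paper's fitted values.
rng = np.random.default_rng(0)
data = stats.gengamma.rvs(a=2.0, c=1.5, scale=20.0, size=1000,
                          random_state=rng)

# Fit the generalized gamma and two nested alternatives, with the
# location parameter fixed at zero (concentrations are non-negative).
gg = stats.gengamma.fit(data, floc=0)
ga = stats.gamma.fit(data, floc=0)
wb = stats.weibull_min.fit(data, floc=0)

# Compare goodness of fit via the Kolmogorov-Smirnov statistic
# (smaller = closer to the empirical distribution).
ks = {name: stats.kstest(data, name, args=params).statistic
      for name, params in [("gengamma", gg), ("gamma", ga),
                           ("weibull_min", wb)]}
```

Because the standard gamma and Weibull are special cases of the generalized gamma, the ggd fit can never be meaningfully worse than either, which is the simplification for comparative analyses the abstract points to.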
2016-11-15
participants who were followed for the development of back pain for an average of 3.9 years. Methods: Descriptive statistics and longitudinal...health, military personnel, occupational health, outcome assessment, statistics, survey methodology. Level of Evidence: 3. Spine 2016;41:1754-1763 ...based on the National Health and Nutrition Examination Survey. Statistical Analysis: Descriptive and univariate analyses compared characteristics
ERIC Educational Resources Information Center
Kadhi, T.; Palasota, A.; Holley, D.; Rudley, D.
2010-01-01
The following report gives the statistical findings of the 2009-2010 Watson-Glaser test. The data are pre-existing and were provided to the Evaluator by email from the Director, Center for Legal Pedagogy. Statistical analyses were run using SPSS 17 to address the following questions: 1. What are the statistical descriptors of the Watson-Glaser results of…
Human Deception Detection from Whole Body Motion Analysis
2015-12-01
9.3.2. Prediction Probability The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for... statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or...and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for Social Sciences (SPSS, v.19.0, Chicago, IL
de Sá, Joceline Cássia Ferezini; Marini, Gabriela; Gelaleti, Rafael Bottaro; da Silva, João Batista; de Azevedo, George Gantas; Rudge, Marilza Vieira Cunha
2013-11-01
To evaluate the methodological and statistical design evolution of the publications in the Brazilian Journal of Gynecology and Obstetrics (RBGO) since resolution 196/96. A review of 133 articles published in 1999 (65) and 2009 (68) was performed by two independent reviewers with training in clinical epidemiology and methodology of scientific research. We included all original clinical articles, case and series reports, and excluded editorials, letters to the editor, systematic reviews, experimental studies, opinion articles, and abstracts of theses and dissertations. Characteristics related to the methodological quality of the studies were analyzed in each article using a checklist that evaluated two criteria: methodological aspects and statistical procedures. We used descriptive statistics and the χ2 test for comparison of the two years. There was a difference between 1999 and 2009 regarding study and statistical design, with greater accuracy in the procedures and the use of more robust tests. In RBGO, we observed an evolution in the methods of published articles and a more in-depth use of statistical analyses, with more sophisticated tests such as regression and multilevel analyses, which are essential techniques for the knowledge and planning of health interventions, leading to fewer interpretation errors.
Methodological difficulties of conducting agroecological studies from a statistical perspective
USDA-ARS?s Scientific Manuscript database
Statistical methods for analysing agroecological data might not be able to help agroecologists to solve all of the current problems concerning crop and animal husbandry, but such methods could well help agroecologists to assess, tackle, and resolve several agroecological issues in a more reliable an...
How to Create Automatically Graded Spreadsheets for Statistics Courses
ERIC Educational Resources Information Center
LoSchiavo, Frank M.
2016-01-01
Instructors often use spreadsheet software (e.g., Microsoft Excel) in their statistics courses so that students can gain experience conducting computerized analyses. Unfortunately, students tend to make several predictable errors when programming spreadsheets. Without immediate feedback, programming errors are likely to go undetected, and as a…
Secondary Analysis of Qualitative Data.
ERIC Educational Resources Information Center
Turner, Paul D.
The reanalysis of data to answer the original research question with better statistical techniques or to answer new questions with old data is not uncommon in quantitative studies. Meta analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…
Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function
ERIC Educational Resources Information Center
Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.
2011-01-01
In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.
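The concept such a graphic teaches, that a density's area rather than its height carries probability, can also be checked numerically. A minimal sketch using the standard normal density (an assumed classroom example, not taken from the article):

```python
import numpy as np

def trapezoid(y, x):
    # Trapezoidal rule, written out so it works on any NumPy version.
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

# Standard normal probability density function on a fine grid.
x = np.linspace(-8.0, 8.0, 100001)
pdf = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)

total_area = trapezoid(pdf, x)                # total probability, ~1.0
inside = (x >= -1.0) & (x <= 1.0)
p_within_1sd = trapezoid(pdf[inside], x[inside])  # P(-1 <= X <= 1), ~0.6827
```

The total area under the curve is 1, and the area between -1 and +1 recovers the familiar 68.27% rule, which is the kind of relationship the interactive graphic lets learners see directly.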
Method and data evaluation at NASA endocrine laboratory. [Skylab 3 experiments
NASA Technical Reports Server (NTRS)
Johnston, D. A.
1974-01-01
The biomedical data of the astronauts on Skylab 3 were analyzed to evaluate the univariate statistical methods for comparing endocrine series experiments in relation to other medical experiments. It was found that an information storage and retrieval system was needed to facilitate statistical analyses.
A decade of individual participant data meta-analyses: A review of current practice.
Simmonds, Mark; Stewart, Gavin; Stewart, Lesley
2015-11-01
Individual participant data (IPD) systematic reviews and meta-analyses are often considered to be the gold standard for meta-analysis. In the ten years since the first review into the methodology and reporting practice of IPD reviews was published much has changed in the field. This paper investigates current reporting and statistical practice in IPD systematic reviews. A systematic review was performed to identify systematic reviews that collected and analysed IPD. Data were extracted from each included publication on a variety of issues related to the reporting of IPD review process, and the statistical methods used. There has been considerable growth in the use of "one-stage" methods to perform IPD meta-analyses. The majority of reviews consider at least one covariate other than the primary intervention, either using subgroup analysis or including covariates in one-stage regression models. Random-effects analyses, however, are not often used. Reporting of review methods was often limited, with few reviews presenting a risk-of-bias assessment. Details on issues specific to the use of IPD were little reported, including how IPD were obtained; how data was managed and checked for consistency and errors; and for how many studies and participants IPD were sought and obtained. While the last ten years have seen substantial changes in how IPD meta-analyses are performed there remains considerable scope for improving the quality of reporting for both the process of IPD systematic reviews, and the statistical methods employed in them. It is to be hoped that the publication of the PRISMA-IPD guidelines specific to IPD reviews will improve reporting in this area. Copyright © 2015 Elsevier Inc. All rights reserved.
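The "two-stage" alternative that the review contrasts with one-stage models reduces each study to an estimate and a variance, then pools them. A hedged sketch of DerSimonian-Laird random-effects pooling, with invented study results:

```python
import numpy as np

# Per-study effect estimates and variances; the numbers are invented
# for illustration, not drawn from any review in the paper.
yi = np.array([0.30, 0.10, 0.45, 0.25, 0.05])
vi = np.array([0.04, 0.02, 0.09, 0.05, 0.03])

wi = 1.0 / vi                                   # fixed-effect weights
y_fe = np.sum(wi * yi) / np.sum(wi)             # fixed-effect pooled estimate
q = np.sum(wi * (yi - y_fe) ** 2)               # Cochran's Q
df = len(yi) - 1
c = np.sum(wi) - np.sum(wi ** 2) / np.sum(wi)
tau2 = max(0.0, (q - df) / c)                   # between-study variance

wi_re = 1.0 / (vi + tau2)                       # random-effects weights
y_re = np.sum(wi_re * yi) / np.sum(wi_re)       # random-effects estimate
se_re = np.sqrt(1.0 / np.sum(wi_re))            # its standard error
```

When tau2 is zero this collapses to the fixed-effect analysis, which is one reason the review's observation that random-effects models are rarely reported matters for interpretation.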
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S
2016-01-01
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
Tonelli, Adriano R.; Zein, Joe; Adams, Jacob; Ioannidis, John P.A.
2014-01-01
Purpose: Multiple interventions have been tested in acute respiratory distress syndrome (ARDS). We examined the entire agenda of published randomized controlled trials (RCTs) in ARDS that reported on mortality and of respective meta-analyses. Methods: We searched PubMed, the Cochrane Library and Web of Knowledge until July 2013. We included RCTs in ARDS published in English. We excluded trials of newborns and children, and those on short-term interventions, ARDS prevention or post-traumatic lung injury. We also reviewed all meta-analyses of RCTs in this field that addressed mortality. Treatment modalities were grouped in five categories: mechanical ventilation strategies and respiratory care, enteral or parenteral therapies, inhaled/intratracheal medications, nutritional support and hemodynamic monitoring. Results: We identified 159 published RCTs of which 93 had overall mortality reported (n = 20,671 patients); 44 trials (14,426 patients) reported mortality as a primary outcome. A statistically significant survival benefit was observed in 8 trials (7 interventions) and two trials reported an adverse effect on survival. Among RCTs with >50 deaths in at least 1 treatment arm (n = 21), 2 showed a statistically significant mortality benefit of the intervention (lower tidal volumes and prone positioning), 1 showed a statistically significant mortality benefit only in adjusted analyses (cisatracurium) and 1 (high-frequency oscillatory ventilation) showed a significant detrimental effect. Across 29 meta-analyses, the most consistent evidence was seen for low tidal volumes and prone positioning in severe ARDS. Conclusions: There is limited supportive evidence that specific interventions can decrease mortality in ARDS. While low tidal volumes and prone positioning in severe ARDS seem effective, most sporadic findings of interventions suggesting reduced mortality are not corroborated consistently in large-scale evidence including meta-analyses. PMID:24667919
Predation and fragmentation portrayed in the statistical structure of prey time series
Hendrichsen, Ditte K; Topping, Chris J; Forchhammer, Mads C
2009-01-01
Background: Statistical autoregressive analyses of direct and delayed density dependence are widespread in ecological research. The models suggest that changes in ecological factors affecting density dependence, like predation and landscape heterogeneity, are directly portrayed in the first- and second-order autoregressive parameters, and the models are therefore used to decipher complex biological patterns. However, independent tests of model predictions are complicated by the inherent variability of natural populations, where differences in landscape structure, climate or species composition prevent controlled repeated analyses. To circumvent this problem, we applied second-order autoregressive time series analyses to data generated by a realistic agent-based computer model. The model simulated life history decisions of individual field voles under controlled variations in predator pressure and landscape fragmentation. Analyses were made on three levels: comparisons between predated and non-predated populations, between populations exposed to different types of predators and between populations experiencing different degrees of habitat fragmentation. Results: The results are unambiguous: changes in landscape fragmentation and the numerical response of predators are clearly portrayed in the statistical time series structure as predicted by the autoregressive model. Populations without predators displayed significantly stronger negative direct density dependence than did those exposed to predators, where direct density dependence was only moderately negative. The effects of predation versus no predation had an even stronger effect on the delayed density dependence of the simulated prey populations. In non-predated prey populations, the coefficients of delayed density dependence were distinctly positive, whereas they were negative in predated populations.
Similarly, increasing the degree of fragmentation of optimal habitat available to the prey was accompanied by a shift in the delayed density dependence, from strongly negative to gradually becoming less negative. Conclusion: We conclude that statistical second-order autoregressive time series analyses are capable of deciphering interactions within and across trophic levels and their effect on direct and delayed density dependence. PMID:19419539
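The statistical machinery this study relies on can be sketched in a few lines: a second-order autoregressive model fitted by ordinary least squares, whose two lagged coefficients carry the direct and delayed density dependence. The simulated series and its coefficients below are invented, not output from the vole model:

```python
import numpy as np

# Simulate an AR(2) series x_t = b1*x_{t-1} + b2*x_{t-2} + e_t,
# standing in for (log) population abundance. b1, b2 are invented.
rng = np.random.default_rng(42)
n = 500
b1, b2 = 0.6, -0.3
x = np.zeros(n)
for t in range(2, n):
    x[t] = b1 * x[t - 1] + b2 * x[t - 2] + rng.normal(0.0, 0.1)

# Recover the coefficients by OLS: regress x_t on [1, x_{t-1}, x_{t-2}].
X = np.column_stack([np.ones(n - 2), x[1:-1], x[:-2]])
coef, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
intercept, a1_hat, a2_hat = coef   # a1_hat ~ direct, a2_hat ~ delayed
```

In the ecological reading, a1_hat reflects direct density dependence and a2_hat delayed density dependence; the study's predator and fragmentation treatments shift exactly these two estimates.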
Mediation analysis in nursing research: a methodological review.
Liu, Jianghong; Ulrich, Connie
2016-12-01
Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
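A minimal sketch of the product-of-coefficients approach to simple mediation (X -> M -> Y), one common way to run the analysis the tutorial introduces. The simulated data and coefficients are invented for illustration:

```python
import numpy as np

# Simulated mediation structure: X affects M (a path), and Y depends on
# both M (b path) and X directly. True values: a = 0.5, b = 0.4.
rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(scale=0.8, size=n)
y = 0.4 * m + 0.2 * x + rng.normal(scale=0.8, size=n)

def ols(design, target):
    # Ordinary least squares via lstsq; returns coefficient vector.
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    return beta

a_hat = ols(np.column_stack([np.ones(n), x]), m)[1]        # a path
b_hat, direct = ols(np.column_stack([np.ones(n), m, x]), y)[1:]  # b, c'
indirect = a_hat * b_hat   # mediated (indirect) effect, ~0.5 * 0.4 = 0.2
```

In practice the indirect effect's uncertainty is usually assessed by bootstrapping rather than read off directly, but the decomposition into a*b plus a direct path is the core idea.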
Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D
2011-10-15
This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit-of-detection levels prevent the application of standard statistical methods. A working example is taken from an environmental leaching study that was set up to determine whether there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative-treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data were somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data were re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
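Rubin-style multiple imputation, of which SMI is a variant, can be sketched as follows. The tiny data vector and the normal imputation model are invented for illustration and are simpler than the paper's actual procedure:

```python
import numpy as np

# A small "concentration" vector with missing values (all numbers invented).
rng = np.random.default_rng(1)
x = np.array([2.1, np.nan, 3.4, 2.8, np.nan, 3.0, 2.5, 3.2])
obs = x[~np.isnan(x)]

m = 20                                     # number of imputed datasets
means, variances = [], []
for _ in range(m):
    xi = x.copy()
    # Draw imputations from a normal fitted to the observed values.
    xi[np.isnan(xi)] = rng.normal(obs.mean(), obs.std(ddof=1),
                                  size=int(np.isnan(x).sum()))
    means.append(xi.mean())
    variances.append(xi.var(ddof=1) / len(xi))   # within-imputation variance

# Pool with Rubin's rules.
q_bar = float(np.mean(means))              # pooled point estimate
u_bar = float(np.mean(variances))          # average within-imputation variance
b = float(np.var(means, ddof=1))           # between-imputation variance
total_var = u_bar + (1 + 1 / m) * b        # Rubin's total variance
```

The between-imputation term b is what keeps the pooled variance honest about the uncertainty added by the missing values, matching the paper's point that imputation need not artificially shrink the variance.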
Spurious correlations and inference in landscape genetics
Samuel A. Cushman; Erin L. Landguth
2010-01-01
Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causalmodelling with partial...
Recommendations for Improved Performance Appraisal in the Federal Sector
1986-01-01
camera-ready copy of a Participant's Coursebook to be used in conducting sessions of the course, and (d) an evaluation instrument for use in obtaining... Timeliness and Availability of Departmental Statistics and Analyses. Develop complete plans for conducting the 1990 census • Improve statistics on
Learning from Friends: Measuring Influence in a Dyadic Computer Instructional Setting
ERIC Educational Resources Information Center
DeLay, Dawn; Hartl, Amy C.; Laursen, Brett; Denner, Jill; Werner, Linda; Campe, Shannon; Ortiz, Eloy
2014-01-01
Data collected from partners in a dyadic instructional setting are, by definition, not statistically independent. As a consequence, conventional parametric statistical analyses of change and influence carry considerable risk of bias. In this article, we illustrate a strategy to overcome this obstacle: the longitudinal actor-partner interdependence…
Most analyses of daily time series epidemiology data relate mortality or morbidity counts to PM and other air pollutants by means of single-outcome regression models using multiple predictors, without taking into account the complex statistical structure of the predictor variable...
The Surprisingly Modest Relationship between SES and Educational Achievement
ERIC Educational Resources Information Center
Harwell, Michael; Maeda, Yukiko; Bishop, Kyoungwon; Xie, Aolin
2017-01-01
Measures of socioeconomic status (SES) are routinely used in analyses of achievement data to increase statistical power, statistically control for the effects of SES, and enhance causality arguments under the premise that the SES-achievement relationship is moderate to strong. Empirical evidence characterizing the strength of the SES-achievement…
ERIC Educational Resources Information Center
Everson, Howard T.; And Others
This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIMs) for use in educational measurement. ANN and AIM methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…
The Poor in 1970: A Chartbook.
ERIC Educational Resources Information Center
Ryscavage, Paul M.
The analyses in this presentation book, prepared by the Policy Research Division of the Office of Planning, Research, and Evaluation, Office of Economic Opportunity, reflect poverty statistics based on data from the Current Population Survey of the Bureau of the Census. These statistics reflect incomes of families and individuals which fall below…
ERIC Educational Resources Information Center
Wiggins, Lyna; Nower, Lia; Mayers, Raymond Sanchez; Peterson, N. Andrew
2010-01-01
This study examines the density of lottery outlets within ethnically concentrated neighborhoods in Middlesex County, New Jersey, using geospatial statistical analyses. No prior studies have empirically examined the relationship between lottery outlet density and population demographics. Results indicate that lottery outlets were not randomly…
Statistical Treatment of Looking-Time Data
ERIC Educational Resources Information Center
Csibra, Gergely; Hernik, Mikolaj; Mascaro, Olivier; Tatone, Denis; Lengyel, Máté
2016-01-01
Looking times (LTs) are frequently measured in empirical research on infant cognition. We analyzed the statistical distribution of LTs across participants to develop recommendations for their treatment in infancy research. Our analyses focused on a common within-subject experimental design, in which longer looking to novel or unexpected stimuli is…
Conducting Multilevel Analyses in Medical Education
ERIC Educational Resources Information Center
Zyphur, Michael J.; Kaplan, Seth A.; Islam, Gazi; Barsky, Adam P.; Franklin, Michael S.
2008-01-01
A significant body of education literature has begun using multilevel statistical models to examine data that reside at multiple levels of analysis. In order to provide a primer for medical education researchers, the current work gives a brief overview of some issues associated with multilevel statistical modeling. To provide an example of this…
Torres-Carvajal, Omar; Schulte, James A; Cadle, John E
2006-04-01
The South American iguanian lizard genus Stenocercus includes 54 species occurring mostly in the Andes and adjacent lowland areas from northern Venezuela and Colombia to central Argentina at elevations of 0-4000m. Small taxon or character sampling has characterized all phylogenetic analyses of Stenocercus, which has long been recognized as sister taxon to the Tropidurus Group. In this study, we use mtDNA sequence data to perform phylogenetic analyses that include 32 species of Stenocercus and 12 outgroup taxa. Monophyly of this genus is strongly supported by maximum parsimony and Bayesian analyses. Evolutionary relationships within Stenocercus are further analyzed with a Bayesian implementation of a general mixture model, which accommodates variability in the pattern of evolution across sites. These analyses indicate a basal split of Stenocercus into two clades, one of which receives very strong statistical support. In addition, we test previous hypotheses using non-parametric and parametric statistical methods, and provide a phylogenetic classification for Stenocercus.
Kent, David M; Dahabreh, Issa J; Ruthazer, Robin; Furlan, Anthony J; Weimar, Christian; Serena, Joaquín; Meier, Bernhard; Mattle, Heinrich P; Di Angelantonio, Emanuele; Paciaroni, Maurizio; Schuchlenz, Herwig; Homma, Shunichi; Lutz, Jennifer S; Thaler, David E
2015-09-14
The preferred antithrombotic strategy for secondary prevention in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO) is unknown. We pooled multiple observational studies and used propensity score-based methods to estimate the comparative effectiveness of oral anticoagulation (OAC) compared with antiplatelet therapy (APT). Individual participant data from 12 databases of medically treated patients with CS and PFO were analysed with Cox regression models, to estimate database-specific hazard ratios (HRs) comparing OAC with APT, for both the primary composite outcome [recurrent stroke, transient ischaemic attack (TIA), or death] and stroke alone. Propensity scores were applied via inverse probability of treatment weighting to control for confounding. We synthesized database-specific HRs using random-effects meta-analysis models. This analysis included 2385 (OAC = 804 and APT = 1581) patients with 227 composite endpoints (stroke/TIA/death). The difference between OAC and APT was not statistically significant for the primary composite outcome [adjusted HR = 0.76, 95% confidence interval (CI) 0.52-1.12] or for the secondary outcome of stroke alone (adjusted HR = 0.75, 95% CI 0.44-1.27). Results were consistent in analyses applying alternative weighting schemes, with the exception that OAC had a statistically significant beneficial effect on the composite outcome in analyses standardized to the patient population who actually received APT (adjusted HR = 0.64, 95% CI 0.42-0.99). Subgroup analyses did not detect statistically significant heterogeneity of treatment effects across clinically important patient groups. We did not find a statistically significant difference comparing OAC with APT; our results justify randomized trials comparing different antithrombotic approaches in these patients. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
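The propensity-score weighting step the pooled analysis used can be sketched with simulated data (all numbers invented): obtain each patient's probability of treatment, weight by the inverse probability of the treatment actually received, and check covariate balance. Here the true propensity is assumed known; in practice it comes from a logistic model of treatment on confounders:

```python
import numpy as np

# Simulated cohort: one confounder influences treatment assignment.
rng = np.random.default_rng(3)
n = 1000
confounder = rng.normal(size=n)
p_treat = 1.0 / (1.0 + np.exp(-0.5 * confounder))   # assumed propensity
treated = rng.binomial(1, p_treat)

# IPTW weights: 1/e for treated patients, 1/(1-e) for controls.
w = np.where(treated == 1, 1.0 / p_treat, 1.0 / (1.0 - p_treat))

# After weighting, the confounder's mean should be near its population
# value (zero) in BOTH arms, i.e., the measured confounding is balanced.
bal_t = np.average(confounder[treated == 1], weights=w[treated == 1])
bal_c = np.average(confounder[treated == 0], weights=w[treated == 0])
```

Outcome models (such as the Cox regressions in the paper) are then fitted with these weights, so the treated and untreated groups resemble the same underlying population.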
Chaisinanunkul, Napasri; Adeoye, Opeolu; Lewis, Roger J.; Grotta, James C.; Broderick, Joseph; Jovin, Tudor G.; Nogueira, Raul G.; Elm, Jordan; Graves, Todd; Berry, Scott; Lees, Kennedy R.; Barreto, Andrew D.; Saver, Jeffrey L.
2015-01-01
Background and Purpose Although the modified Rankin Scale (mRS) is the most commonly employed primary endpoint in acute stroke trials, its power is limited when analyzed in dichotomized fashion, and its indication of effect size is challenging to interpret when analyzed ordinally. Weighting the seven Rankin levels by utilities may improve scale interpretability while preserving statistical power. Methods A utility-weighted mRS (UW-mRS) was derived by averaging values from time-tradeoff (patient-centered) and person-tradeoff (clinician-centered) studies. The UW-mRS, standard ordinal mRS, and dichotomized mRS were applied to 11 trials or meta-analyses of acute stroke treatments, including lytic, endovascular reperfusion, blood pressure moderation, and hemicraniectomy interventions. Results Utility values were: mRS 0 = 1.0; mRS 1 = 0.91; mRS 2 = 0.76; mRS 3 = 0.65; mRS 4 = 0.33; mRS 5 and 6 = 0. For trials with unidirectional treatment effects, the UW-mRS paralleled the ordinal mRS and outperformed dichotomous mRS analyses. Both the UW-mRS and the ordinal mRS were statistically significant in six of eight unidirectional effect trials, while dichotomous analyses were statistically significant in two to four of eight. In bidirectional effect trials, both the UW-mRS and ordinal tests captured the divergent treatment effects by showing neutral results, whereas some dichotomized analyses showed positive results. Mean utility differences in trials with statistically significant positive results ranged from 0.026 to 0.249. Conclusion A utility-weighted mRS performs similarly to the standard ordinal mRS in detecting treatment effects in actual stroke trials and ensures the quantitative outcome is a valid reflection of patient-centered benefits. PMID:26138130
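A minimal sketch of applying the utility weights reported above to two trial arms; the mRS distributions are hypothetical.

```python
# Utility weights from the abstract: mRS levels 0-6 mapped to utilities
UTILITIES = {0: 1.0, 1: 0.91, 2: 0.76, 3: 0.65, 4: 0.33, 5: 0.0, 6: 0.0}

def uw_mrs_mean(scores):
    """Mean utility-weighted mRS for a trial arm (list of mRS levels 0-6)."""
    return sum(UTILITIES[s] for s in scores) / len(scores)

# hypothetical 90-day mRS outcomes for a treated and a control arm
treated = [0, 1, 1, 2, 3, 4, 6]
control = [1, 2, 2, 3, 4, 5, 6]
diff = uw_mrs_mean(treated) - uw_mrs_mean(control)   # mean utility difference
```

The mean utility difference plays the role of the trial's quantitative treatment effect, analogous to the 0.026-0.249 range quoted above.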
Kelley, George A.; Kelley, Kristi S.
2013-01-01
Purpose. Conduct a systematic review of previous meta-analyses addressing the effects of exercise in the treatment of overweight and obese children and adolescents. Methods. Previous meta-analyses of randomized controlled exercise trials that assessed adiposity in overweight and obese children and adolescents were included by searching nine electronic databases and cross-referencing from retrieved studies. Methodological quality was assessed using the Assessment of Multiple Systematic Reviews (AMSTAR) Instrument. The alpha level for statistical significance was set at P ≤ 0.05. Results. Of the 308 studies reviewed, two aggregate data meta-analyses, representing 14 and 17 studies and 481 and 701 boys and girls, met all eligibility criteria. Methodological quality was 64% and 73%. For both studies, statistically significant reductions in percent body fat were observed (P = 0.006 and P < 0.00001). The number-needed-to-treat (NNT) was 4 and 3, with an estimated 24.5 and 31.5 million overweight and obese children in the world potentially benefitting, 2.8 and 3.6 million in the US. No other measures of adiposity (BMI-related measures, body weight, and central obesity) were statistically significant. Conclusions. Exercise is efficacious for reducing percent body fat in overweight and obese children and adolescents. Insufficient evidence exists to suggest that exercise reduces other measures of adiposity. PMID:24455215
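The number-needed-to-treat figures above follow from the absolute risk reduction. A sketch with hypothetical response rates chosen so the NNT comes out to 4:

```python
import math

def nnt(control_rate, treated_rate):
    """Number needed to treat = 1 / absolute risk reduction, rounded up."""
    arr = control_rate - treated_rate
    if arr <= 0:
        raise ValueError("treatment shows no benefit")
    return math.ceil(1 / arr)

# hypothetical proportions of children failing to reduce percent body fat
print(nnt(0.60, 0.35))  # ARR = 0.25 -> 4
```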
Emotional and cognitive effects of peer tutoring among secondary school mathematics students
NASA Astrophysics Data System (ADS)
Alegre Ansuategui, Francisco José; Moliner Miravet, Lidón
2017-11-01
This paper describes an experience of same-age peer tutoring conducted with 19 eighth-grade mathematics students in a secondary school in Castellon de la Plana (Spain). Three constructs were analysed before and after launching the program: academic performance, mathematics self-concept and attitude of solidarity. Students' perceptions of the method were also analysed. The quantitative data was gathered by means of a mathematics self-concept questionnaire, an attitude of solidarity questionnaire and the students' numerical ratings. A statistical analysis was performed using Student's t-test. The qualitative information was gathered by means of discussion groups and a field diary. This information was analysed using descriptive analysis and by categorizing the information. Results show statistically significant improvements in all the variables and the positive assessment of the experience and the interactions that took place between the students.
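The pre/post comparison with Student's t-test used above can be sketched as follows. The scores are hypothetical, and only the paired t statistic is computed; the p-value would come from the t distribution with n - 1 degrees of freedom.

```python
import math

def paired_t(before, after):
    """Paired Student's t statistic for pre/post scores."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean)**2 for d in diffs) / (n - 1)   # sample variance of differences
    return mean / math.sqrt(var / n)

# hypothetical pre/post mathematics self-concept scores for a small group
before = [3.1, 2.8, 3.5, 2.9, 3.2, 3.0]
after  = [3.6, 3.1, 3.8, 3.4, 3.5, 3.3]
t = paired_t(before, after)
```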
NASA Astrophysics Data System (ADS)
Hentabli, Kamel
This research is part of the Active Control Technology project between the Ecole de Technologie Superieure and the aircraft manufacturer Bombardier Aeronautique. The goal is to design robust multivariable control strategies for aircraft dynamic models. These control strategies should give the aircraft high performance and satisfy desired handling qualities, namely good manoeuvrability, good stability margins, and damping of the aircraft's phugoid and short-period motions. We first focused on LTI synthesis methods, specifically the H-infinity approach and mu-synthesis, and subsequently paid particular attention to LPV control techniques. To carry out this work, we adopted a frequency-domain approach, typically H-infinity. This approach is particularly attractive in that the synthesis model is built directly from the various design specifications: these specifications are translated into frequency-domain templates, corresponding to the input and output weightings found in classical H-infinity synthesis. We also used a linear fractional transformation (LFT) representation, considered better suited to accounting for the various types of uncertainty that can affect the system; this representation also proves very appropriate for robustness analysis via mu-analysis tools. Furthermore, to optimize the trade-off between robustness and performance specifications, we opted for a two-degree-of-freedom control structure with a reference model. Finally, these techniques are illustrated on realistic applications, demonstrating the relevance and applicability of each of them.
Keywords: flight control, handling qualities and manoeuvrability, robust control, H-infinity approach, mu-synthesis, linear parameter-varying systems, gain scheduling, linear fractional transformation, linear matrix inequality.
Guillaumet, Jean-Louis; Betsch, Jean-Marie; Callmander, Martin W.
2008-01-01
The programme entitled "Study of mountain ecosystems in the Malagasy region" (RCP 225/CNRS; lead: Rector Renaud Paulian) aimed to identify their general characteristics and the origin of their constituent elements, and to test the validity of the Malagasy High Mountain Domain proposed by Humbert as early as 1951. From 1970 to 1973, three campaigns (Andringitra; the Anosy chains and Ankaratra; Itremo, Ibity and Marojejy) allowed an ecological characterization of these distinctive environments as well as systematic analyses of certain taxa known for their biogeographic interest. The altitudinal succession of plant formations, defined by physiognomic and structural criteria, is specified for each massif. The uppermost belt, characterized by ericoid thicket and its associated communities, does not correspond to the High Mountain Belt of East Africa. Certain faunal groups (hexapod invertebrates: Collembola and Dermaptera) indicate a disjunction between the northern massifs (Tsaratanana, Marojejy) and those of the Centre and South; floral elements (Pandanaceae, Araliaceae, Asteraceae) are being analysed along the same lines. The High Mountain Domain in Madagascar is an ecological reality but cannot be defined floristically; each mountain massif is a phytogeographic entity of interdependent vegetation belts included in the various Sub-Domains of the Centre. The less mobile faunal groups generally show a trophic and bioclimatic dependence (buffering effect of the within-forest climate) on the vegetation belts, but can respond to local microclimates by shifts at their boundaries.
Renaud Paulian and the CNRS programme on the high mountains of Madagascar: belt vs domain
Guillaumet, Jean-Louis; Betsch, Jean-Marie; Callmander, Martin W.
2011-01-01
Abstract The programme entitled "Study of mountain ecosystems in the Malagasy region" (RCP 225/CNRS; lead: Rector Renaud Paulian) aimed to identify their general characteristics and the origin of their constituent elements, and to test the validity of the Malagasy High Mountain Domain proposed by Humbert as early as 1951. From 1970 to 1973, three campaigns (Andringitra; the Anosy chains and Ankaratra; Itremo, Ibity and Marojejy) allowed an ecological characterization of these distinctive environments as well as systematic analyses of certain taxa known for their biogeographic interest. The altitudinal succession of plant formations, defined by physiognomic and structural criteria, is specified for each massif. The uppermost belt, characterized by ericoid thicket and its associated communities, does not correspond to the High Mountain Belt of East Africa. Certain faunal groups (hexapod invertebrates: Collembola and Dermaptera) indicate a disjunction between the northern massifs (Tsaratanana, Marojejy) and those of the Centre and South; floral elements (Pandanaceae, Araliaceae, Asteraceae) are being analysed along the same lines. The High Mountain Domain in Madagascar is an ecological reality but cannot be defined floristically; each mountain massif is a phytogeographic entity of interdependent vegetation belts included in the various Sub-Domains of the Centre. The less mobile faunal groups generally show a trophic and bioclimatic dependence (buffering effect of the within-forest climate) on the vegetation belts, but can respond to local microclimates by shifts at their boundaries. PMID:21731422
Violence on Canadian television networks
Paquette, Guy
2003-01-01
The question of the effects of television violence has loomed large in public opinion over the past twenty years, and hundreds of studies have been devoted to it. Several researchers conclude that this violence has a negative influence on behaviour. The public, broadcasters and political authorities have all endorsed the idea of reducing the total amount of violence shown on television, particularly in programmes accessible to children. We analysed about a thousand fiction programmes broadcast between 1993 and 2001 on the main general-interest television networks in Canada: TVA and TQS as well as CTV and Global for the private French- and English-language networks, and the French and English services of Radio-Canada for the public networks. The methodology used is classical content analysis, in which the act of violence is the unit of analysis. The data collected show that the amount of violence has increased steadily since 1993, despite broadcasters' stated intention to offer less violent programming. This holds both for the raw number of acts and for the number of acts per hour, which is also growing strongly. The private networks carry three times as much violence as the public networks. A very large proportion of violent acts also appears in programmes that begin before 9 p.m., to which many children are probably exposed. Finally, we note the increasingly important place occupied by psychological violence. PMID:20020031
NASA Astrophysics Data System (ADS)
Ostiguy, Pierre-Claude
Composite materials are increasingly used in aeronautics. Their excellent mechanical properties and low weight give them a clear advantage over metallic materials. Because they are subjected to various loading and environmental conditions, they are susceptible to several types of damage that compromise their integrity. Reliable inspection methods are therefore needed to assess their integrity, yet few non-destructive, embedded and effective approaches are currently in use. This research examines the effect of composite material composition on detection and characterization using guided waves. The objective of the project is to develop an embedded mechanical characterization approach that improves the performance of a piezoelectric-array imaging approach on composite and metallic structures. The contribution of this project is to propose an embedded, non-destructive ultrasonic approach to mechanical characterization that does not require measurements on a multitude of samples. This thesis by articles is divided into four parts, parts two to four presenting the published and submitted articles. The first part presents the state of knowledge required for this master's project; the main topics covered are composite materials, wave propagation, guided-wave modelling, guided-wave characterization, and embedded structural health monitoring. The second part presents a study of the effect of mechanical properties on the performance of the Excitelet imaging algorithm. The study is conducted on an isotropic structure, and the results showed that the algorithm is sensitive to the accuracy of the mechanical properties used in the model.
This sensitivity was then exploited to develop an embedded method for estimating the mechanical properties of a structure. The third part is a more rigorous study of the performance of the embedded mechanical characterization method. The accuracy, repeatability and robustness of the method are validated using an FEM simulator. The properties estimated with the characterization approach are within 1% of the properties used in the model, which rivals the uncertainty of ASTM methods. The experimental analysis proved accurate and repeatable for frequencies below 200 kHz, allowing the mechanical properties to be estimated to within 1% of the supplier's values. The fourth part demonstrated the ability of the characterization approach to identify the mechanical properties of an orthotropic composite plate. The experimentally estimated results fall within the uncertainty bars of the properties estimated using ASTM tests. Finally, an FEM simulation demonstrated the accuracy of the approach, with mechanical properties within 4% of those of the simulated model.
Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.
Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P
2017-08-23
Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered-some very seriously so-but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. Copyright © 2017 Nord, Valton et al.
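In outline, the mixture-model reanalysis looks like this: fit a multi-component Gaussian mixture to per-study power estimates and inspect the components. The sketch below hand-rolls EM for a two-component 1-D mixture on simulated data (one low-power and one well-powered cluster); it is not the 730-study dataset analysed above.

```python
import math, random

def em_two_gaussians(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by expectation-maximisation."""
    mu = [min(xs), max(xs)]                 # deterministic initialisation
    sd = [0.1, 0.1]
    pi = [0.5, 0.5]
    def pdf(x, m, s):
        return math.exp(-(x - m)**2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        r = []
        for x in xs:
            p = [pi[k] * pdf(x, mu[k], sd[k]) for k in range(2)]
            tot = sum(p)
            r.append([pk / tot for pk in p])
        # M-step: update mixing weights, means, standard deviations
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            pi[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            sd[k] = math.sqrt(sum(ri[k] * (x - mu[k])**2 for ri, x in zip(r, xs)) / nk) or 1e-6
    return pi, mu, sd

# simulated statistical power values: a low-power and a well-powered subgroup
random.seed(0)
xs = [random.gauss(0.15, 0.04) for _ in range(50)] + \
     [random.gauss(0.80, 0.05) for _ in range(30)]
pi, mu, sd = em_two_gaussians(xs)
```

The fitted component means recover the two subgroups, which is the sense in which a single summary statistic (the overall median) hides the structure of the distribution.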
Statistical Design Model (SDM) of satellite thermal control subsystem
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi
2016-07-01
Satellite thermal control is the subsystem whose main task is to keep satellite components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying a satellite's operational requirements, and designing it is an integral part of satellite design. Yet, owing to the scarcity of information released by companies and designers, this fundamental subsystem still lacks a well-defined design process. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. Implementing the SDM method requires a complete database, so we first collected spacecraft data to build one, then extracted statistical graphs using Microsoft Excel, from which we derived mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. The thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Different statistical models are then presented and briefly compared. Finally, the paper's statistical model is extracted from the collected data, and its accuracy is tested and verified through a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results shows the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
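A sketch of the kind of statistical design model such a procedure can extract: a log-log regression of thermal control subsystem mass against total satellite mass. The database values here are invented for illustration; the paper's actual model parameters and full input set (mass, mission, lifetime) differ.

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**b via linear regression on logarithms."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx)**2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# hypothetical database: total satellite mass (kg) vs thermal subsystem mass (kg)
sat_mass = [50, 120, 300, 700, 1500, 3000]
tcs_mass = [2.1, 4.8, 10.5, 22.0, 43.0, 80.0]
a, b = fit_power_law(sat_mass, tcs_mass)
predicted = a * 1000**b   # estimated thermal subsystem mass for a 1000 kg satellite
```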
Electric Field Magnitude and Radar Reflectivity as a Function of Distance from Cloud Edge
NASA Technical Reports Server (NTRS)
Ward, Jennifer G.; Merceret, Francis J.
2004-01-01
The results of analyses of data collected during a field investigation of thunderstorm anvil and debris clouds are reported. Statistics of the magnitude of the electric field are determined as a function of distance from cloud edge. Statistics of radar reflectivity near cloud edge are also determined. Both analyses use in-situ airborne field mill and cloud physics data coupled with ground-based radar measurements obtained in east-central Florida during the summer convective season. Electric fields outside of anvil and debris clouds averaged less than 3 kV/m. The average radar reflectivity at the cloud edge ranged between 0 and 5 dBZ.
The extent and consequences of p-hacking in science.
Head, Megan L; Holman, Luke; Lanfear, Rob; Kahn, Andrew T; Jennions, Michael D
2015-03-01
A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as "p-hacking," occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses.
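One way to test a collection of reported p-values for p-hacking, in the spirit of the analysis above: compare how many significant p-values fall just below .05 against an adjacent lower bin, using an exact binomial test. The p-values listed are hypothetical, and the two-bin scheme is a simplification of the paper's methods.

```python
from math import comb

def binomial_tail(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): exact upper-tail probability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def p_hacking_test(p_values):
    """Compare counts in (0.04, 0.05) vs (0.03, 0.04]: an excess just
    below .05 is consistent with p-hacking."""
    upper = sum(1 for p in p_values if 0.04 < p < 0.05)
    lower = sum(1 for p in p_values if 0.03 < p <= 0.04)
    n = upper + lower
    return upper, lower, binomial_tail(upper, n)

# hypothetical reported p-values mined from a literature sample
ps = [0.012, 0.031, 0.041, 0.042, 0.043, 0.044, 0.045, 0.046, 0.047,
      0.048, 0.049, 0.035, 0.038, 0.021, 0.049, 0.044]
upper, lower, pval = p_hacking_test(ps)
```

Here the upper bin holds 11 of 14 binned values, and the binomial tail probability falls below .05, the signature of a left-skewed bump just under the significance threshold.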
Histometric analyses of cancellous and cortical interface in autogenous bone grafting
Netto, Henrique Duque; Olate, Sergio; Klüppel, Leandro; do Carmo, Antonio Marcio Resende; Vásquez, Bélgica; Albergaria-Barbosa, Jose
2013-01-01
Surgical procedures for rehabilitation of the maxillofacial region frequently require bone grafts; the aim of this research was to evaluate the interface between recipient site and graft with cortical or cancellous contact. Six adult beagle dogs weighing 15 kg were included in the study. Under general anesthesia, an 8 mm diameter block was obtained from the parietal bone of each animal and fixed to the frontal bone with a 12 mm, 1.5 mm screw, using the lag screw technique for better contact between recipient and graft. Euthanasia periods of 3 and 6 weeks were chosen for histometric evaluation. Hematoxylin-eosin was used in a routine histologic technique, and histomorphometry was performed with IMAGEJ software. The t-test was used for data analyses, with p < 0.05 for statistical significance. The results show some differences in descriptive histology but no statistically significant differences at the interface between cortical and cancellous bone at 3 or 6 weeks; as expected, bone integration at 6 weeks after surgery was better and statistically superior to that at 3 weeks. We conclude that integration of either cortical or cancellous bone can be useful, without differences between them. PMID:23923071
Kanda, Junya
2016-01-01
The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required.
Gait patterns for crime fighting: statistical evaluation
NASA Astrophysics Data System (ADS)
Sulovská, Kateřina; Bělašková, Silvie; Adámek, Milan
2013-10-01
Criminality has been omnipresent throughout human history, and modern technology brings novel opportunities for identifying a perpetrator. One of these is the analysis of video recordings, which may be taken during the crime itself or before/after it. Video analysis can be classed as an identification analysis, i.e. identification of a person via external characteristics. Bipedal locomotion studies focus on human movement on the basis of anatomical and physiological features, and many laboratories are now testing human gait to learn whether identification via bipedal locomotion is possible. The aim of our study is to use 2D components of 3D data from the VICON motion-capture system for deep statistical analyses. This paper introduces recent results of a fundamental study of various gait patterns under different conditions, with data from 12 participants. Curves obtained from these measurements were sorted, averaged, and statistically tested to estimate the stability and distinctiveness of this biometric. Results show satisfactory distinctness for some of the chosen points, while others do not exhibit significant differences. These results, however, represent the initial phase of deeper and more exacting analyses of gait patterns under different conditions.
Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei
2011-08-01
We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or lack of cytopenia. Our analysis was also stratified by statistical method (any statistical method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment coupled with monitoring of neutropenia or leukopenia on survival are warranted.
Hutton, Brian; Wolfe, Dianna; Moher, David; Shamseer, Larissa
2017-05-01
Research waste has received considerable attention from the biomedical community. One noteworthy contributor is incomplete reporting in research publications. When detailing statistical methods and results, ensuring that analytic methods and findings are completely documented improves transparency. For publications describing randomised trials and systematic reviews, guidelines have been developed to facilitate complete reporting. This overview summarises aspects of statistical reporting in trials and systematic reviews of health interventions. A narrative approach was taken to summarise features of statistical methods and findings from reporting guidelines for trials and reviews. We aim to enhance familiarity with the statistical details that should be reported in biomedical research among statisticians and their collaborators. We summarise statistical reporting considerations for trials and systematic reviews from guidance documents including the Consolidated Standards of Reporting Trials (CONSORT) Statement for reporting of trials, the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Statement for trial protocols, the Statistical Analyses and Methods in the Published Literature (SAMPL) Guidelines for statistical reporting principles, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement for systematic reviews, and PRISMA for Protocols (PRISMA-P). Considerations regarding sharing of study data and statistical code are also addressed. Reporting guidelines provide researchers with minimum criteria for reporting. If followed, they can enhance research transparency and contribute to improved quality of biomedical publications. Authors should employ these tools for planning and reporting of their research. Published by the BMJ Publishing Group Limited.
Bayesian statistics in medicine: a 25 year review.
Ashby, Deborah
2006-11-15
This review examines the state of Bayesian thinking as Statistics in Medicine was launched in 1982, reflecting particularly on its applicability and uses in medical research. It then looks at each subsequent five-year epoch, with a focus on papers appearing in Statistics in Medicine, putting these in the context of major developments in Bayesian thinking and computation with reference to important books, landmark meetings and seminal papers. It charts the growth of Bayesian statistics as it is applied to medicine and makes predictions for the future. From sparse beginnings, where Bayesian statistics was barely mentioned, Bayesian statistics has now permeated all the major areas of medical statistics, including clinical trials, epidemiology, meta-analyses and evidence synthesis, spatial modelling, longitudinal modelling, survival modelling, molecular genetics and decision-making in respect of new technologies.
Parent and Friend Social Support and Adolescent Hope.
Mahon, Noreen E; Yarcheski, Adela
2017-04-01
The purpose of this study was to conduct two meta-analyses. The first examined social support from parents in relation to adolescent hope, and the second examined social support from friends in relation to adolescent hope. Using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for the literature reviewed, nine published studies or doctoral dissertations completed between 1990 and 2014 met the inclusion criteria. Using meta-analytic techniques and the mean weighted r statistic, the results indicated that social support from friends had a stronger mean effect size (ES = .31) than social support from parents (ES = .21); there was a statistically significant difference between the two ESs. Two of the four moderators for the parent social support-adolescent hope relationship were statistically significant. They were quality score and health status. Implications for school nurses and nurses in all settings are addressed, and conclusions are drawn based on the findings.
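The mean weighted r statistic used above can be computed by Fisher z-transforming the study correlations, weighting by n - 3, and back-transforming the pooled value. The study-level correlations and sample sizes here are hypothetical.

```python
import math

def pooled_r(rs, ns):
    """Fixed-effect pooled correlation: Fisher z transform, weight by n - 3."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # Fisher z
    ws = [n - 3 for n in ns]
    z = sum(w * zi for w, zi in zip(ws, zs)) / sum(ws)
    return (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)   # back-transform (tanh)

# hypothetical study-level correlations between friend support and hope
rs = [0.25, 0.35, 0.30, 0.34]
ns = [120, 85, 200, 150]
es = pooled_r(rs, ns)   # mean weighted effect size
```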
Tomlinson, Alan; Hair, Mario; McFadyen, Angus
2013-10-01
Dry eye is a multifactorial disease which would require a broad spectrum of test measures in the monitoring of its treatment and diagnosis. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes being assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in diagnosis of dry eye was also considered. Copyright © 2013. Published by Elsevier Inc.
Performance of Between-Study Heterogeneity Measures in the Cochrane Library.
Ma, Xiaoyue; Lin, Lifeng; Qu, Zhiyong; Zhu, Motao; Chu, Haitao
2018-05-29
The growth in comparative effectiveness research and evidence-based medicine has increased attention to systematic reviews and meta-analyses. Meta-analysis synthesizes and contrasts evidence from multiple independent studies to improve statistical efficiency and reduce bias. Assessing heterogeneity is critical for performing a meta-analysis and interpreting results. As a widely used heterogeneity measure, the I² statistic quantifies the proportion of total variation across studies that is due to real differences in effect size. The presence of outlying studies can seriously exaggerate the I² statistic. Two alternative heterogeneity measures, the Ir² and Im², have been recently proposed to reduce the impact of outlying studies. To evaluate these measures' performance empirically, we applied them to 20,599 meta-analyses in the Cochrane Library. We found that the Ir² and Im² have strong agreement with the I², while they are more robust than the I² when outlying studies appear.
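The standard I² statistic discussed above is derived from Cochran's Q; the robust variants (Ir², Im²) are not reproduced here. A minimal sketch of the conventional computation, with hypothetical effect estimates in which one outlying study drives almost all of the measured heterogeneity:

```python
def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic.

    Q is the weighted sum of squared deviations from the
    fixed-effect mean (weights = inverse variances);
    I^2 = max(0, (Q - df) / Q) expresses the share of total
    variation attributable to between-study heterogeneity
    rather than chance.
    """
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - mean) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i2

# Two similar studies plus one outlier: I^2 comes out near 0.91
q, i2 = i_squared([0.2, 0.25, 0.8], [0.01, 0.01, 0.01])
print(round(q, 2), round(i2, 2))
```

Dropping the outlying study from this toy example collapses I² toward zero, which is exactly the sensitivity the robust alternatives aim to reduce.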
Mediation analysis in nursing research: a methodological review
Liu, Jianghong; Ulrich, Connie
2017-01-01
Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask – and answer – more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science. PMID:26176804
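The mediation logic described above is often quantified with the classic product-of-coefficients decomposition: the effect of X on Y splits into an indirect path through the mediator M (a·b) and a direct path. The following is a sketch using closed-form OLS slopes on a tiny hypothetical data set constructed so that Y = X + M exactly; it is an illustration of the technique, not the tutorial's own example:

```python
def s(x, y):
    """Sum of products of deviations (unscaled covariance)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y))

def mediation(x, m, y):
    """Product-of-coefficients mediation decomposition.

    a: slope of M on X; b: partial slope of Y on M controlling
    for X. The indirect effect is a*b; the direct effect is the
    partial slope of Y on X controlling for M.
    """
    a = s(x, m) / s(x, x)
    det = s(m, m) * s(x, x) - s(x, m) ** 2
    b = (s(m, y) * s(x, x) - s(x, y) * s(x, m)) / det
    direct = (s(x, y) * s(m, m) - s(m, y) * s(x, m)) / det
    return a * b, direct

# Toy data: M tracks 2X plus noise, and Y = X + M by construction,
# so the indirect effect is 2 and the direct effect is 1
x = [1, 2, 3, 4, 5]
m = [2, 5, 6, 9, 10]
y = [3, 7, 9, 13, 15]
indirect, direct = mediation(x, m, y)
print(indirect, direct)
```

The total effect (slope of Y on X alone) equals direct + indirect here, which is the bookkeeping identity mediation analyses exploit.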
ERIC Educational Resources Information Center
Ho, Andrew D.; Yu, Carol C.
2015-01-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is rarely met in educational and psychological…
In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...
ERIC Educational Resources Information Center
Karazsia, Bryan T.; Wong, Kendal
2016-01-01
Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents results of a quasi-experimental study with longitudinal follow-up that tested the…
Finding Balance at the Elusive Mean
ERIC Educational Resources Information Center
Hudson, Rick A.
2012-01-01
Data analysis plays an important role in people's lives. Citizens need to be able to conduct critical analyses of statistical information in the work place, in their personal lives, and when portrayed by the media. However, becoming a conscientious consumer of statistics is a gradual process. The experiences that students have with data in the…
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
Measuring the Impacts of ICT Using Official Statistics. OECD Digital Economy Papers, No. 136
ERIC Educational Resources Information Center
Roberts, Sheridan
2008-01-01
This paper describes the findings of an OECD project examining ICT impact measurement and analyses based on official statistics. Both economic and social impacts are covered and some results are presented. It attempts to place ICT impacts measurement into an Information Society conceptual framework, provides some suggestions for standardising…
NASA Technical Reports Server (NTRS)
Thomas-Keprta, Kathie L.; Clemett, Simon J.; Bazylinski, Dennis A.; Kirschvink, Joseph L.; McKay, David S.; Wentworth, Susan J.; Vali, H.; Gibson, Everett K.
2000-01-01
Here we use rigorous mathematical modeling to compare ALH84001 prismatic magnetites with those produced by terrestrial magnetotactic bacteria, MV-1. We find that this subset of the Martian magnetites appears to be statistically indistinguishable from those of MV-1.
Using Data Mining to Teach Applied Statistics and Correlation
ERIC Educational Resources Information Center
Hartnett, Jessica L.
2016-01-01
This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…
Image and multifactorial statistical analyses were used to evaluate the intensity of fluorescence signal from cells of three strains of A. pullulans and one strain of Rhodosporidium toruloides, as an outgroup, hybridized with either a universal o...
Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points
ERIC Educational Resources Information Center
Ekol, George
2015-01-01
This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…
Statistical methods for convergence detection of multi-objective evolutionary algorithms.
Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J
2009-01-01
In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, requiring fewer function evaluations while preserving good approximation quality.
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments with conventional statistical techniques (linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques to determine RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit the statistical equations well), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shirota, Yukari; Hashimoto, Takako; Fitri Sari, Riri
2018-03-01
Visualizing time-series big data has become very important. In this paper we discuss an analysis method called “statistical shape analysis” or “geometry-driven statistics” applied to time-series statistical data in economics. We analyse the changes in agriculture, value added and industry, value added (percentage of GDP) in Asia from 2000 to 2010. We handle the data as a set of landmarks on a two-dimensional image and study their deformation using principal components. The key point of the method is that the principal components of a given formation are the eigenvectors of its bending energy matrix. The local deformation can be expressed as a set of non-affine transformations, which give information about the local differences between 2000 and 2010. Because a non-affine transformation can be decomposed into a set of partial warps, we present the partial warps visually. Statistical shape analysis is widely used in biology, but applications in economics are scarce. In this paper, we investigate its potential for analysing economic data.
The Economic Cost of Homosexuality: Multilevel Analyses
ERIC Educational Resources Information Center
Baumle, Amanda K.; Poston, Dudley, Jr.
2011-01-01
This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…
Detecting Test Tampering Using Item Response Theory
ERIC Educational Resources Information Center
Wollack, James A.; Cohen, Allan S.; Eckerly, Carol A.
2015-01-01
Test tampering, especially on tests for educational accountability, is an unfortunate reality, necessitating that the state (or its testing vendor) perform data forensic analyses, such as erasure analyses, to look for signs of possible malfeasance. Few statistical approaches exist for detecting fraudulent erasures, and those that do largely do not…
Developing educational resources for population genetics in R: An open and collaborative approach
USDA-ARS?s Scientific Manuscript database
The R computing and statistical language community has developed a myriad of resources for conducting populations genetic analyses. However, resources for learning how to carry out population genetic analyses in R are scattered and often incomplete, which can make acquiring this skill unnecessarily ...
WAIS-IV Subtest Covariance Structure: Conceptual and Statistical Considerations
ERIC Educational Resources Information Center
Ward, L. Charles; Bergman, Maria A.; Hebert, Katina R.
2012-01-01
D. Wechsler (2008b) reported confirmatory factor analyses (CFAs) with standardization data (ages 16-69 years) for 10 core and 5 supplemental subtests from the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV). Analyses of the 15 subtests supported 4 hypothesized oblique factors (Verbal Comprehension, Working Memory, Perceptual Reasoning,…
BRepertoire: a user-friendly web server for analysing antibody repertoire data.
Margreitter, Christian; Lu, Hui-Chun; Townsend, Catherine; Stewart, Alexander; Dunn-Walters, Deborah K; Fraternali, Franca
2018-04-14
Antibody repertoire analysis by high throughput sequencing is now widely used, but a persisting challenge is enabling immunologists to explore their data to discover discriminating repertoire features for their own particular investigations. Computational methods are necessary for large-scale evaluation of antibody properties. We have developed BRepertoire, a suite of user-friendly web-based software tools for large-scale statistical analyses of repertoire data. The software is able to use data preprocessed by IMGT, and performs statistical and comparative analyses with versatile plotting options. BRepertoire has been designed to operate in various modes, for example analysing sequence-specific V(D)J gene usage, discerning physico-chemical properties of the CDR regions and clustering of clonotypes. Those analyses are performed on the fly by a number of R packages and are deployed via a Shiny web platform. The user can download the analysed data in different table formats and save the generated plots as image files ready for publication. We believe BRepertoire to be a versatile analytical tool that complements experimental studies of immune repertoires. To illustrate the server's functionality, we show use cases including differential gene usage in a vaccination dataset and analysis of CDR3H properties in old and young individuals. The server is accessible under http://mabra.biomed.kcl.ac.uk/BRepertoire.
Niclasen, Janni; Keilow, Maria; Obel, Carsten
2018-05-01
Well-being is considered a prerequisite for learning. The Danish Ministry of Education initiated the development of a new 40-item student well-being questionnaire in 2014 to monitor well-being among all Danish public school students on a yearly basis. The aim of this study was to investigate the basic psychometric properties of this questionnaire. We used the data from the 2015 Danish student well-being survey for 268,357 students in grades 4-9 (about 85% of the study population). Descriptive statistics, exploratory factor analyses, confirmatory factor analyses and Cronbach's α reliability measures were used in the analyses. The factor analyses did not unambiguously support one particular factor structure. However, based on the basic descriptive statistics, exploratory factor analyses, confirmatory factor analyses, the semantics of the individual items and Cronbach's α, we propose a four-factor structure including 27 of the 40 items originally proposed. The four scales measure school connectedness, learning self-efficacy, learning environment and classroom management. Two bullying items and two psychosomatic items should be considered separately, leaving 31 items in the questionnaire. The proposed four-factor structure addresses central aspects of well-being, which, if used constructively, may support public schools' work to increase levels of student well-being.
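The Cronbach's α reliability measure used above has a simple closed form: α = k/(k-1) · (1 - Σ item variances / variance of the total score). A minimal sketch with a hypothetical item-by-respondent matrix (not the Danish survey data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items[j] holds the scores of all respondents on item j.
    alpha = k/(k-1) * (1 - sum(item variances) / var(total)),
    using sample variances (n - 1 denominator).
    """
    k = len(items)
    n = len(items[0])

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three hypothetical items scored by four respondents
scores = [[1, 2, 3, 4], [2, 2, 3, 5], [1, 3, 3, 4]]
print(round(cronbach_alpha(scores), 3))
```

When items move together (high inter-item covariance) the total-score variance dominates the item variances and α approaches 1, which is why α is read as internal-consistency reliability of a scale.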
Progressive statistics for studies in sports medicine and exercise science.
Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri
2009-01-01
Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.
Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve
2013-09-01
The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. 
This will minimise analytical bias and conforms to current best practice in conducting clinical trials.
Dynamic systems approaches and levels of analysis in the nervous system
Parker, David; Srivastava, Vipin
2013-01-01
Various analyses are applied to physiological signals. While epistemological diversity is necessary to address effects at different levels, there is often a sense of competition between analyses rather than integration. This is evidenced by the differences in the criteria needed to claim understanding in different approaches. In the nervous system, neuronal analyses that attempt to explain network outputs in cellular and synaptic terms are rightly criticized as being insufficient to explain global effects, emergent or otherwise, while higher-level statistical and mathematical analyses can provide quantitative descriptions of outputs but can only hypothesize on their underlying mechanisms. The major gap in neuroscience is arguably our inability to translate what should be seen as complementary effects between levels. We thus ultimately need approaches that allow us to bridge between different spatial and temporal levels. Analytical approaches derived from critical phenomena in the physical sciences are increasingly being applied to physiological systems, including the nervous system, and claim to provide novel insight into physiological mechanisms and opportunities for their control. Analyses of criticality have suggested several important insights that should be considered in cellular analyses. However, there is a mismatch between lower-level neurophysiological approaches and statistical phenomenological analyses that assume that lower-level effects can be abstracted away, which means that these effects are unknown or inaccessible to experimentalists. As a result experimental designs often generate data that is insufficient for analyses of criticality. 
This review considers the relevance of insights from analyses of criticality to neuronal network analyses, and highlights that to move the analyses forward and close the gap between the theoretical and neurobiological levels, it is necessary to consider that effects at each level are complementary rather than in competition. PMID:23386835
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Low-flow statistics of selected streams in Chester County, Pennsylvania
Schreffler, Curtis L.
1998-01-01
Low-flow statistics for many streams in Chester County, Pa., were determined on the basis of data from 14 continuous-record streamflow stations in Chester County and data from 1 station in Maryland and 1 station in Delaware. The stations in Maryland and Delaware are on streams that drain large areas within Chester County. Streamflow data through the 1994 water year were used in the analyses. The low-flow statistics summarized are the 1Q10, 7Q10, 30Q10, and harmonic mean. Low-flow statistics were estimated at 34 partial-record stream sites throughout Chester County.
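The building blocks of the statistics named above are the annual minimum n-day mean flow (the "7" in 7Q10) and the harmonic mean of daily flows. The "10" (10-year recurrence interval) additionally requires fitting a frequency distribution across years, which is omitted here; this is a sketch of the deterministic parts on a hypothetical daily-flow series:

```python
def harmonic_mean_flow(flows):
    """Harmonic mean of daily flows (all flows assumed positive;
    real analyses handle zero-flow days separately)."""
    return len(flows) / sum(1.0 / q for q in flows)

def annual_min_n_day(daily, n):
    """Minimum n-day moving-average flow for one year of daily
    values -- the annual series underlying nQ10 statistics such
    as 1Q10, 7Q10, and 30Q10."""
    means = [sum(daily[i:i + n]) / n for i in range(len(daily) - n + 1)]
    return min(means)

# Hypothetical week-and-a-day of daily flows (cfs)
daily = [4, 2, 1, 2, 4, 6, 8, 6]
print(annual_min_n_day(daily, 3))
print(round(harmonic_mean_flow(daily), 2))
```

The harmonic mean weights low-flow days heavily (a single near-zero day drags it far down), which is why it is favored for dilution-based water-quality assessments.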
Multiple statistical tests: Lessons from a d20.
Madan, Christopher R
2016-01-01
Statistical analyses are often conducted with α = .05. When multiple statistical tests are conducted, this procedure needs to be adjusted to compensate for the otherwise inflated Type I error. In tabletop gaming, it is sometimes desired to roll a 20-sided die (or 'd20') twice and take the greater outcome. Here I draw from probability theory and the case of a d20, where the probability of obtaining any specific outcome is 1/20, to determine the probability of obtaining a specific outcome (Type I error) at least once across repeated, independent statistical tests.
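The arithmetic behind the analogy is the familywise error formula 1 - (1 - α)^k, and the d20 connection is exact: rolling twice and taking the greater outcome means P(max = t) = (2t - 1)/400, so the chance the max is 20 equals the familywise error of two tests at α = 1/20. A short sketch:

```python
def familywise_error(alpha, k):
    """P(at least one Type I error) across k independent tests,
    each run at significance level alpha."""
    return 1 - (1 - alpha) ** k

def p_max_roll(target, sides=20):
    """P(the greater of two fair rolls equals `target`):
    P(max <= t) = (t/sides)^2, so P(max = t) = (2t - 1)/sides^2."""
    return (2 * target - 1) / sides ** 2

print(round(familywise_error(0.05, 10), 3))  # error inflation across 10 tests
print(p_max_roll(20))                         # chance the max of 2d20 is 20
```

With ten tests at α = .05 the familywise error is already about 0.40, which is the inflation that corrections such as Bonferroni are designed to rein in.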
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Analysis of the thermal degradation of poly(ether imide)
NASA Astrophysics Data System (ADS)
Courvoisier, Emilie; Bicaba, Yoann; Colin, Xavier
2018-03-01
The thermal degradation of PEI was studied over wide ranges of temperature (180 to 250 °C) and oxygen partial pressure (0.21 to 50 bar). First, the thermal ageing mechanisms were analysed and elucidated by FTIR spectroscopy and differential scanning calorimetry (DSC) on PEI films thin enough (10 to 60 μm) to be entirely free of oxygen-diffusion effects. As expected, and by analogy with other aromatic polymers of similar chemical structure, oxidation occurs preferentially on the methyl groups of the isopropylidene unit of the bisphenol A moiety, causing the disappearance of their characteristic IR absorption band at 2970 cm-1 and the growth of a new IR absorption band centred at 3350 cm-1, attributed to alcohol groups. In addition, oxidation leads successively to a relative predominance of chain scission (decrease in Tg) and of crosslinking (increase in Tg). Finally, the consequences of oxidation for the elastic properties were analysed and elucidated by micro-indentation on previously polished cross-sections of 3 mm thick PEI plates. However, the increase in Young's modulus in the oxidized surface layer is mainly due to physical ageing.
NASA Astrophysics Data System (ADS)
Tutashkonko, Sergii
This thesis concerns the development of a new nanomaterial, mesoporous Ge, by bipolar electrochemical etching (BEE), and the analysis of its physico-chemical properties with a view to its use in photovoltaic applications. The formation of mesoporous Ge by electrochemical etching had previously been reported in the literature. However, a major technological obstacle of existing fabrication processes was obtaining thick layers (above 500 nm) of mesoporous Ge with perfectly controlled morphology. Indeed, the physico-chemical characterization of thin films is much more complicated, and the number of their possible applications is severely limited. We developed an electrochemical model describing the main pore-formation mechanisms, which allowed us to produce thick mesoporous Ge structures (up to 10 μm) with porosity adjustable over a wide range, from 15% to 60%. Moreover, the formation of porous nanostructures with variable, well-controlled morphologies has now become possible. Finally, mastery of all these parameters has opened an extremely promising route towards porous Ge-based multilayer structures for numerous innovative and multidisciplinary applications, thanks to the technological flexibility now achieved. In particular, within this thesis, the mesoporous Ge layers were optimized in order to carry out a thin-film transfer process for a triple-junction solar cell via a sacrificial porous Ge layer. Keywords: mesoporous germanium, bipolar electrochemical etching, semiconductor electrochemistry, thin-film transfer, photovoltaic cell
Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures
Çelebi, Mehmet
1998-01-01
Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the performance of a particular structure to vibrational levels experienced during an earthquake. The levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.
Customer perceived service quality, satisfaction and loyalty in Indian private healthcare.
Kondasani, Rama Koteswara Rao; Panda, Rajeev Kumar
2015-01-01
The purpose of this paper is to analyse how perceived service quality and customer satisfaction lead to loyalty towards healthcare service providers. In total, 475 hospital patients participated in a questionnaire survey in five Indian private hospitals. Descriptive statistics, factor analysis, regression and correlation statistics were employed to analyse customer perceived service quality and how it leads to loyalty towards service providers. Results indicate that the service seeker-service provider relationship, quality of facilities and the interaction with supporting staff have a positive effect on customer perception. Findings help healthcare managers to formulate effective strategies to ensure a better quality of services to the customers. This study helps healthcare managers to build customer loyalty towards healthcare services, thereby attracting and gaining more customers. This paper will help healthcare managers and service providers to analyse customer perceptions and their loyalty towards Indian private healthcare services.
Statistical analyses of Higgs- and Z-portal dark matter models
NASA Astrophysics Data System (ADS)
Ellis, John; Fowlie, Andrew; Marzola, Luca; Raidal, Martti
2018-06-01
We perform frequentist and Bayesian statistical analyses of Higgs- and Z-portal models of dark matter particles with spin 0, 1/2, and 1. Our analyses incorporate data from direct detection and indirect detection experiments, as well as LHC searches for monojet and monophoton events, and we also analyze the potential impacts of future direct detection experiments. We find acceptable regions of the parameter spaces for Higgs-portal models with real scalar, neutral vector, Majorana, or Dirac fermion dark matter particles, and for Z-portal models with Majorana or Dirac fermion dark matter particles. In many of these cases, there are interesting prospects for discovering dark matter particles in Higgs or Z decays, as well as dark matter particles weighing ≳100 GeV. Negative results from planned direct detection experiments would still allow acceptable regions for Higgs- and Z-portal models with Majorana or Dirac fermion dark matter particles.
Wallach, Joshua D; Sullivan, Patrick G; Trepanowski, John F; Sainani, Kristin L; Steyerberg, Ewout W; Ioannidis, John P A
2017-04-01
Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. 
In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
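The interaction test this survey checks for can be approximated directly from published subgroup summaries. A minimal sketch (the function name and inputs are illustrative, not from the survey): given effect estimates and standard errors in two subgroups, their difference is compared against its combined standard error under a normal approximation.

```python
import math

def interaction_p(beta1, se1, beta2, se2):
    """Two-sided P value for the difference between two subgroup effect
    estimates, assuming independent subgroups and approximately normal
    estimators (a common summary-level interaction test)."""
    z = (beta1 - beta2) / math.sqrt(se1**2 + se2**2)
    # two-sided P from the standard normal CDF via erf
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# Identical subgroup effects: no evidence of interaction (P = 1)
print(round(interaction_p(0.5, 0.1, 0.5, 0.1), 3))
# Strongly discordant effects: small interaction P value
print(interaction_p(0.8, 0.1, 0.2, 0.1) < 0.05)
```

A subgroup claim would count as statistically supported in the survey's sense only when this P value falls below .05.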
Drift at the ocean surface under the effect of waves
NASA Astrophysics Data System (ADS)
Ardhuin, Fabrice; Martin-Lauzer, François-Régis; Chapron, Bertrand; Craneguy, Philippe; Girard-Ardhuin, Fanny; Elfouhaily, Tanos
2004-09-01
We model the drift velocity near the ocean surface separating the motion induced by the local current, itself influenced by winds and waves, and the motion induced by the waves, which are generated by local and remote winds. Application to the drift of 'tar balls', following the sinking of the oil tanker Prestige-Nassau in November 2002, shows that waves contribute at least one third of the drift for pollutants floating 1 m below the surface, with a mean direction about 30° to the right of the wind-sea direction. Although not new, this result was previously obtained with specific models, whereas the formalism used here combines classical wave and circulation forecasting models. To cite this article: F. Ardhuin et al., C. R. Geoscience 336 (2004).
Noiseless amplification of optical images
NASA Astrophysics Data System (ADS)
Gigan, S.; Delaubert, V.; Lopez, L.; Treps, N.; Maitre, A.; Fabre, C.
2004-11-01
We use an Optical Parametric Oscillator (OPO) pumped below threshold in order to amplify a transverse multimode image without degrading the signal-to-noise ratio. The experimental setup uses a triply resonant type-II OPO that is semi-confocal for the amplified beam. The existence of quantum effects during multimode amplification in such a device has been demonstrated experimentally. More generally, this led us to study the transverse quantum properties of amplified light beams. Such a study may find applications not only in imaging but also in quantum information processing.
2003-12-01
the hazard resulting from a given level of exposure to an agent, as well as the estimation of the effects on populations after an exposure...greater than what society can accept. 5) The risk-control phase focuses on the various measures to be taken in order either to... the preceding sections. On the basis of this scheme, an overall health-risk management strategy within the Armed Forces can be envisaged
Statistical Selection of Biological Models for Genome-Wide Association Analyses.
Bi, Wenjian; Kang, Guolian; Pounds, Stanley B
2018-05-24
Genome-wide association studies have discovered many biologically important associations of genes with phenotypes. Typically, genome-wide association analyses formally test the association of each genetic feature (SNP, CNV, etc.) with the phenotype of interest and summarize the results with multiplicity-adjusted p-values. However, very small p-values only provide evidence against the null hypothesis of no association without indicating which biological model best explains the observed data. Correctly identifying a specific biological model may improve the scientific interpretation and can be used to more effectively select and design a follow-up validation study. Thus, statistical methodology to identify the correct biological model for a particular genotype-phenotype association can be very useful to investigators. Here, we propose a general statistical method to summarize how accurately each of five biological models (null, additive, dominant, recessive, co-dominant) represents the data observed for each variant in a GWAS. We show that the new method stringently controls the false discovery rate and asymptotically selects the correct biological model. Simulations of two-stage discovery-validation studies show that the new method has these properties and that its validation power is similar to or exceeds that of simple methods that use the same statistical model for all SNPs. Example analyses of three data sets also highlight these advantages of the new method. An R package is freely available at www.stjuderesearch.org/site/depts/biostats/maew.
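As an illustration of choosing among biological models for a single variant, the sketch below scores three genotype codings by their squared correlation with a phenotype and returns the best-fitting one. The function, data, and selection rule here are illustrative only; the paper's method additionally covers the null and co-dominant models and controls the false discovery rate.

```python
def best_coding(genotypes, phenotypes):
    """Pick the genotype coding (additive / dominant / recessive) whose
    recoded genotype best correlates (R^2) with the phenotype."""
    codings = {
        "additive":  {0: 0, 1: 1, 2: 2},   # per-allele dosage effect
        "dominant":  {0: 0, 1: 1, 2: 1},   # one risk allele suffices
        "recessive": {0: 0, 1: 0, 2: 1},   # two risk alleles required
    }
    def r2(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return sxy * sxy / (sxx * syy) if sxx and syy else 0.0
    scores = {name: r2([c[g] for g in genotypes], phenotypes)
              for name, c in codings.items()}
    return max(scores, key=scores.get)

# A phenotype driven purely by carrying at least one risk allele
g = [0, 0, 1, 1, 2, 2, 0, 1, 2, 1]
y = [0.0, 0.1, 1.0, 1.1, 1.0, 0.9, 0.1, 1.0, 1.1, 0.9]
print(best_coding(g, y))  # dominant fits this carrier pattern best
```

In a full analysis this choice would be made per variant, with multiplicity control across the genome.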
Flow and ordering of macroscopic suspensions
NASA Astrophysics Data System (ADS)
Petit, L.
In this review, we report the various experimental studies performed on suspensions of solid particles in liquids, concerning both the rheological behaviour of such systems and the ordering of the particles under the applied velocity fields. The large number of materials that flow as suspensions (reagents in chemical engineering, paints during spreading, blood flow, drilling muds in oil reservoirs) has motivated many experimental studies, especially over the last twenty years. The results show a wide variety of behaviours, even for systems of intermediate concentration. Moreover, even for identical systems, the results depend on the type of flow. It is therefore clear that, in addition to the standard parameters that must be taken into account (solid-phase concentration, particle size, interparticle interactions, diffusion effects), it is necessary to consider the type of flow the suspension is submitted to. The flow modifies the spatial distribution of the particles, either directly or through mechanisms such as secondary flows and diffusion, leading to their ordering or migration. These structures and motions in turn act on the flow, and hence on the rheological behaviour of the whole system. There is thus a feedback mechanism from the ordering to the flow, which explains the experimental observations.
NASA Astrophysics Data System (ADS)
Benard, Pierre
We present a study of the magnetic fluctuations in the normal phase of the copper-oxide superconductor La_{2-x}Sr_{x}CuO_4. The compound is modelled by the two-dimensional Hubbard Hamiltonian with a next-nearest-neighbour hopping term (the tt'U model). The model is studied within the GRPA (Generalized Random Phase Approximation), including the renormalization of the Hubbard interaction by Brueckner-Kanamori diagrams. In the approach presented in this work, the maxima of the magnetic structure factor observed in neutron-scattering experiments are associated with the lattice 2k_F anomalies of the structure factor of the non-interacting two-dimensional electron gas. These anomalies arise from scattering between particles located at points of the Fermi surface where the Fermi velocities are tangent, and they lead to divergences whose nature depends on the geometry of the Fermi surface in the vicinity of these points. These results are then applied to the tt'U model, of which the usual tU Hubbard model is a special case. In most cases, the interactions do not determine the position of the maxima of the structure factor. The role of the interaction is to enhance the intensity of those structures of the magnetic structure factor associated with the magnetic instability of the system. These structures are often already present in the imaginary part of the non-interacting susceptibility. The intensity ratio between the absolute maxima and the other structures of the magnetic structure factor makes it possible to determine the ratio U_{rn}/U_{c}, which measures the proximity of a magnetic instability. The phase diagram is then studied in order to delimit the range of validity of the approximation. 
After discussing the collective modes and the effect of a non-zero imaginary part of the self-energy, the origin of the energy scale of the magnetic fluctuations is examined. It is then shown that the three-band model predicts the same results for the position of the structures of the magnetic structure factor as the one-band model, in the limit where the hybridization of the oxygen orbitals in the Cu-O_2 planes and the next-nearest-neighbour hopping amplitude vanish. It is further found that the effect of the hybridization of the oxygen orbitals is well modelled by the next-nearest-neighbour hopping term. Even though they correctly describe the qualitative behaviour of the maxima of the magnetic structure factor, the three-band and one-band models do not yield positions of these structures consistent with the experimental measurements if the band is assumed rigid, that is, if the parameters of the Hamiltonian are independent of the strontium concentration. This may be caused by a dependence of the Hamiltonian parameters on the strontium concentration. Finally, the results are compared with neutron-scattering experiments and with other theories, in particular those of Littlewood et al. (1993) and Q. Si et al. (1993). The comparison with the experimental results for the lanthanum compound suggests that the Fermi liquid possesses a disconnected Fermi surface and is located near an incommensurate magnetic instability.
Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin
2017-09-01
N-of-1 studies are based on repeated observations within an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome while accounting for its distribution, time-related trend and error structures (e.g., autocorrelation) as well as reporting readily usable contextualised effect sizes for decision-making. A number of statistical approaches have been documented but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the different challenges inherent to n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event history methodologies which explicitly incorporate the role of time and the influence of past on future. We also present an illustrative example of the use of dynamic regression on monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard
2017-11-01
Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for unconstrained data such as absolute values. However, the closed character of waste composition data is often ignored when the data are analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12%, with a confidence interval of (-4.03; 8.45), which highlights the problem of biased negative proportions. A Pearson's correlation test applied to waste fraction generation (kg mass) indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing their closed character, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing means, standard deviations and correlation coefficients.
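The transformation recommended for closed data is typically a log-ratio transform. Below is a minimal sketch of the centered log-ratio (clr) transform, an illustrative choice rather than the exact procedure of this study: clr coordinates are unconstrained (they sum to zero instead of 100%), so ordinary statistics can be applied to them.

```python
import math

def clr(composition):
    """Centered log-ratio transform of one compositional sample.
    Parts must be strictly positive; they are first closed to sum to 1,
    then each part is expressed as a log ratio to the geometric mean."""
    total = sum(composition)
    closed = [x / total for x in composition]
    gmean = math.exp(sum(math.log(x) for x in closed) / len(closed))
    return [math.log(x / gmean) for x in closed]

sample = [30.0, 50.0, 20.0]        # e.g. three waste fractions in percent
transformed = clr(sample)
print([round(v, 3) for v in transformed])
print(round(sum(transformed), 10))  # clr coordinates sum to zero
```

Means, standard deviations, and correlations computed on clr coordinates avoid the spurious negative associations that the raw percentages induce.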
Statistical methods for the beta-binomial model in teratology.
Yamamoto, E; Yanagimoto, T
1994-01-01
The beta-binomial model is widely used for analyzing teratological data involving littermates. Recent developments in statistical analyses of teratological data are briefly reviewed, with emphasis on this model. For statistical inference of the parameters in the beta-binomial distribution, separation of the likelihood leads to a likelihood-based inference, reducing the biases of estimators and improving the accuracy of the empirical significance levels of tests. Separate inference of the parameters can be conducted in a unified way.
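For concreteness, the beta-binomial likelihood for littermate data can be written down directly. A minimal sketch (the function name and the data are illustrative, not from the paper): each litter contributes a binomial count whose success probability is itself beta-distributed, which captures the within-litter correlation.

```python
from math import lgamma

def betabinom_loglik(alpha, beta, litters):
    """Log-likelihood of beta-binomial data; `litters` is a list of
    (affected, litter_size) pairs, as in teratology experiments."""
    def lbeta(a, b):
        return lgamma(a) + lgamma(b) - lgamma(a + b)
    ll = 0.0
    for x, n in litters:
        # log C(n, x) + log B(x+alpha, n-x+beta) - log B(alpha, beta)
        ll += (lgamma(n + 1) - lgamma(x + 1) - lgamma(n - x + 1)
               + lbeta(x + alpha, n - x + beta) - lbeta(alpha, beta))
    return ll

data = [(1, 10), (0, 12), (3, 9), (2, 11)]  # hypothetical littermate counts
print(betabinom_loglik(1.0, 5.0, data))
```

With alpha = beta = 1 the distribution is uniform over 0..n, so the log-likelihood of a single litter of size n is -log(n+1), a convenient sanity check.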
The Deployment Life Study: Longitudinal Analysis of Military Families Across the Deployment Cycle
2016-01-01
psychological and physical aggression than they reported prior to the deployment. 1 H. Fischer, A Guide to U.S. Military Casualty Statistics ...analyses include a large number of statistical tests and thus the results presented in this report should be viewed in terms of patterns, rather...Military Children and Families," The Future of Children, Vol. 23, No. 2, 2013, pp. 13-39. Fischer, H., A Guide to U.S. Military Casualty Statistics
Group Influences on Young Adult Warfighters’ Risk Taking
2016-12-01
Statistical Analysis: Latent linear growth models were fitted using the maximum likelihood estimation method in Mplus (version 7.0; Muthen & Muthen)...condition had a higher net score than those in the alone condition (b = 20.53, SE = 6.29, p < .001). Results of the relevant statistical analyses are...
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.
Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N
2016-01-01
Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, the effects of censoring on coefficient estimates, and the statistical properties should be taken into account when reporting results.
Orphan therapies: making best use of postmarket data.
Maro, Judith C; Brown, Jeffrey S; Dal Pan, Gerald J; Li, Lingling
2014-08-01
Postmarket surveillance of the comparative safety and efficacy of orphan therapeutics is challenging, particularly when multiple therapeutics are licensed for the same orphan indication. To make best use of product-specific registry data collected to fulfill regulatory requirements, we propose the creation of a distributed electronic health data network among registries. Such a network could support sequential statistical analyses designed to detect early warnings of excess risks. We use a simulated example to explore the circumstances under which a distributed network may prove advantageous. We perform sample size calculations for sequential and non-sequential statistical studies aimed at comparing the incidence of hepatotoxicity following initiation of two newly licensed therapies for homozygous familial hypercholesterolemia. We calculate the sample size savings ratio, or the proportion of sample size saved if one conducted a sequential study as compared to a non-sequential study. Then, using models to describe the adoption and utilization of these therapies, we simulate when these sample sizes are attainable in calendar years. We then calculate the analytic calendar time savings ratio, analogous to the sample size savings ratio. We repeat these analyses for numerous scenarios. Sequential analyses detect effect sizes earlier or at the same time as non-sequential analyses. The most substantial potential savings occur when the market share is more imbalanced (i.e., 90% for therapy A) and the effect size is closest to the null hypothesis. However, due to low exposure prevalence, these savings are difficult to realize within the 30-year time frame of this simulation for scenarios in which the outcome of interest occurs at or more frequently than one event/100 person-years. We illustrate a process to assess whether sequential statistical analyses of registry data performed via distributed networks may prove a worthwhile infrastructure investment for pharmacovigilance.
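The sample size savings ratio defined in this abstract is simple arithmetic; a minimal sketch follows (the function name and the example numbers are hypothetical, not taken from the study):

```python
def savings_ratio(n_sequential, n_nonsequential):
    """Proportion of sample size saved by running a sequential study
    instead of a non-sequential one for the same comparison."""
    return 1.0 - n_sequential / n_nonsequential

# e.g. a sequential design needing 600 exposures vs 1000 non-sequentially
print(savings_ratio(600, 1000))  # 0.4, i.e. 40% of the sample size saved
```

The analytic calendar time savings ratio in the abstract is the analogous quantity computed on calendar time to reach the required sample sizes rather than on the sample sizes themselves.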
Data from the Television Game Show "Friend or Foe?"
ERIC Educational Resources Information Center
Kalist, David E.
2004-01-01
The data discussed in this paper are from the television game show "Friend or Foe", and can be used to examine whether age, gender, race, and the amount of prize money affect contestants' strategies. The data are suitable for a variety of statistical analyses, such as descriptive statistics, testing for differences in means or proportions, and…
Targeted On-Demand Team Performance App Development
2016-10-01
from three sites; 6) Preliminary analysis indicates a larger-than-estimated effect size and the study is sufficiently powered for generalizable outcomes...statistical analyses, and examine any resulting qualitative data for trends or connections to statistical outcomes. On Schedule
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…
ERIC Educational Resources Information Center
Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.
2012-01-01
Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…
Statistical issues in searches for new phenomena in High Energy Physics
NASA Astrophysics Data System (ADS)
Lyons, Louis; Wardle, Nicholas
2018-03-01
Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.
Synthetic Indicators of Quality of Life in Europe
ERIC Educational Resources Information Center
Somarriba, Noelia; Pena, Bernardo
2009-01-01
For more than three decades now, sociologists, politicians and economists have used a wide range of statistical and econometric techniques to analyse and measure the quality of life of individuals with the aim of obtaining useful instruments for social, political and economic decision making. The aim of this paper is to analyse the advantages and…
Factor Scores, Structure and Communality Coefficients: A Primer
ERIC Educational Resources Information Center
Odum, Mary
2011-01-01
(Purpose) The purpose of this paper is to present an easy-to-understand primer on three important concepts of factor analysis: Factor scores, structure coefficients, and communality coefficients. Given that statistical analyses are a part of a global general linear model (GLM), and utilize weights as an integral part of analyses (Thompson, 2006;…
Comparability of a Paper-Based Language Test and a Computer-Based Language Test.
ERIC Educational Resources Information Center
Choi, Inn-Chull; Kim, Kyoung Sung; Boo, Jaeyool
2003-01-01
Utilizing the Test of English Proficiency, developed by Seoul National University (TEPS), examined comparability between the paper-based language test and the computer-based language test based on content and construct validation employing content analyses based on corpus linguistic techniques in addition to such statistical analyses as…
A data storage, retrieval and analysis system for endocrine research. [for Skylab
NASA Technical Reports Server (NTRS)
Newton, L. E.; Johnston, D. A.
1975-01-01
This retrieval system builds, updates, retrieves, and performs basic statistical analyses on blood, urine, and diet parameters for the M071 and M073 Skylab and Apollo experiments. This system permits data entry from cards to build an indexed sequential file. Programs are easily modified for specialized analyses.
Analyses of the 1981-82 Illinois Public Library Statistics.
ERIC Educational Resources Information Center
Wallace, Danny P.
Using data provided by the annual reports of Illinois public libraries and by the Illinois state library, this publication is a companion to the November 1982 issue of "Illinois Libraries," which enumerated the 16 data elements upon which the analyses are based. Three additional types of information are provided for each of six…
ERIC Educational Resources Information Center
Camerer, Rudi
2014-01-01
The testing of intercultural competence has long been regarded as the field of psychometric test procedures, which claim to analyse an individual's personality by specifying and quantifying personality traits with the help of self-answer questionnaires and the statistical evaluation of these. The underlying assumption is that what is analysed and…
Applying Beliefs and Resources Frameworks to the Psychometric Analyses of an Epistemology Survey
ERIC Educational Resources Information Center
Yerdelen-Damar, Sevda; Elby, Andrew; Eryilmaz, Ali
2012-01-01
This study explored how researchers' views about the form of students' epistemologies influence how the researchers develop and refine surveys and how they interpret survey results. After running standard statistical analyses on 505 physics students' responses to the Turkish version of the Maryland Physics Expectations-II survey, probing students'…
Tom, Jennifer A; Sinsheimer, Janet S; Suchard, Marc A
2015-01-01
Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework.
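The reweighting step described above can be illustrated in miniature. The sketch below is not the authors' dynamic iterative reweighting MCMC algorithm, only a hedged toy: draws from one distribution (standing in for realizations of a stratified analysis) are recycled into estimates under another distribution via self-normalized importance weights.

```python
import math
import random

def importance_reweight(draws, log_target, log_proposal):
    """Recycle draws from a proposal distribution into estimates under a
    new target using self-normalized importance weights."""
    logw = [log_target(x) - log_proposal(x) for x in draws]
    m = max(logw)                           # stabilize the exponentials
    w = [math.exp(lw - m) for lw in logw]
    total = sum(w)
    w = [wi / total for wi in w]            # weights now sum to one
    mean = sum(wi * x for wi, x in zip(w, draws))
    return mean, w

# Reweight standard-normal draws to approximate the mean of N(1, 1)
random.seed(0)
draws = [random.gauss(0.0, 1.0) for _ in range(200_000)]
log_p = lambda x: -0.5 * x * x              # proposal: N(0,1), up to a constant
log_q = lambda x: -0.5 * (x - 1.0) ** 2     # target:   N(1,1), up to a constant
est, weights = importance_reweight(draws, log_q, log_p)
print(round(est, 2))  # close to 1.0
```

In the paper's setting the "target" is the joint hierarchical model and the "proposals" are the stratified posteriors whose realizations are already available.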
Topographic ERP analyses: a step-by-step tutorial review.
Murray, Micah M; Brunet, Denis; Michel, Christoph M
2008-06-01
In this tutorial review, we detail both the rationale for as well as the implementation of a set of analyses of surface-recorded event-related potentials (ERPs) that uses the reference-free spatial (i.e. topographic) information available from high-density electrode montages to render statistical information concerning modulations in response strength, latency, and topography both between and within experimental conditions. In these and other ways these topographic analysis methods allow the experimenter to glean additional information and neurophysiologic interpretability beyond what is available from canonical waveform analyses. In this tutorial we present the example of somatosensory evoked potentials (SEPs) in response to stimulation of each hand to illustrate these points. For each step of these analyses, we provide the reader with both a conceptual and mathematical description of how the analysis is carried out, what it yields, and how to interpret its statistical outcome. We show that these topographic analysis methods are intuitive and easy-to-use approaches that can remove much of the guesswork often confronting ERP researchers and also assist in identifying the information contained within high-density ERP datasets.
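Two of the reference-free topographic measures underlying such analyses, global field power (GFP) and global map dissimilarity, can be sketched as follows. The four-electrode maps are toy data assumed to be average-referenced; this is a simplification of the tutorial's full pipeline:

```python
import math

def global_field_power(voltages):
    """Global field power (GFP): the spatial standard deviation of the scalp
    map across electrodes at a single time point (average-referenced data)."""
    mean = sum(voltages) / len(voltages)
    return math.sqrt(sum((v - mean) ** 2 for v in voltages) / len(voltages))

def map_dissimilarity(map1, map2):
    """Global map dissimilarity: GFP of the difference between the two
    GFP-normalized maps; 0 for identical topographies regardless of strength."""
    n1 = [v / global_field_power(map1) for v in map1]
    n2 = [v / global_field_power(map2) for v in map2]
    return global_field_power([a - b for a, b in zip(n1, n2)])

weak = [1.0, -1.0, 2.0, -2.0]       # toy 4-electrode map, already zero-mean
strong = [3.0, -3.0, 6.0, -6.0]     # same topography, 3x the response strength
gfp_weak = global_field_power(weak)
diss = map_dissimilarity(weak, strong)  # ~0: strength differs, topography not
```

Separating strength (GFP) from topography (dissimilarity) is what lets these analyses attribute condition differences to distinct neurophysiologic causes.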
Shadish, William R; Hedges, Larry V; Pustejovsky, James E
2014-04-01
This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
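A plain pooled-SD standardized mean difference conveys the metric the abstract refers to. The phase data below are hypothetical, and this sketch omits the autocorrelation and between-case variance corrections that define the article's SCD-specific d:

```python
import statistics

def pooled_sd_d(treatment, control):
    """Plain pooled-SD standardized mean difference. The SCD d discussed in
    the article additionally adjusts for autocorrelation and the ratio of
    between-case to total variance, which this sketch omits."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.variance(treatment), statistics.variance(control)
    pooled = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled

baseline = [3.0, 4.0, 3.5, 4.5, 4.0]        # hypothetical baseline phase
intervention = [6.0, 7.0, 6.5, 7.5, 7.0]    # hypothetical treatment phase
d = pooled_sd_d(intervention, baseline)
```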
Willis, Brian H; Riley, Richard D
2017-09-20
An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice? Does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity, where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
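A minimal leave-one-out cross-validation of a fixed-effect meta-analysis conveys the idea. The effect sizes are invented, and the standardized discrepancy score below is a simplification, not the paper's Vn statistic or its derived distribution:

```python
def pooled(estimates, variances):
    """Inverse-variance (fixed-effect) pooled estimate and its variance."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    return sum(wi * y for wi, y in zip(w, estimates)) / sw, 1.0 / sw

def loo_z_scores(estimates, variances):
    """Leave-one-out cross-validation: for each study i, pool the remaining
    studies and standardize the discrepancy between study i and the
    pooled prediction."""
    zs = []
    for i in range(len(estimates)):
        rest_y = estimates[:i] + estimates[i + 1:]
        rest_v = variances[:i] + variances[i + 1:]
        mu, var_mu = pooled(rest_y, rest_v)
        zs.append((estimates[i] - mu) / (variances[i] + var_mu) ** 0.5)
    return zs

ys = [0.42, 0.38, 0.45, 0.40, 1.30]   # hypothetical effects; last is discordant
vs = [0.01, 0.02, 0.015, 0.01, 0.02]  # their within-study variances
zs = loo_z_scores(ys, vs)             # the outlier study gets an extreme score
```

A study whose left-out score is extreme is one for which the pooled summary would not have been statistically valid, which is the link to heterogeneity the paper formalizes.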
Classical Statistics and Statistical Learning in Imaging Neuroscience
Bzdok, Danilo
2017-01-01
Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-test and ANOVA. Throughout recent years, statistical learning methods enjoy increasing popularity especially for applications in rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It is retraced how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intraspot correlation. A new separate-channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. 
The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
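The M-value and A-value transformation at the heart of this reformulation can be sketched directly (hypothetical two-channel intensities):

```python
import math

def ma_values(red, green):
    """Per-spot M-values (log-ratios) and A-values (average log-intensities)
    computed from the two channel intensities of a two-colour array."""
    m = [math.log2(r / g) for r, g in zip(red, green)]
    a = [0.5 * (math.log2(r) + math.log2(g)) for r, g in zip(red, green)]
    return m, a

red = [200.0, 400.0, 800.0]     # hypothetical Cy5 intensities
green = [100.0, 400.0, 200.0]   # hypothetical Cy3 intensities
M, A = ma_values(red, green)    # M = [1.0, 0.0, 2.0]
```

The traditional log-ratio analysis keeps only M; the separate-channel approach described above recovers the information carried by A as well.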
The analysis of morphometric data on Rocky Mountain wolves and arctic wolves using statistical methods
NASA Astrophysics Data System (ADS)
Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Hamzah, Nor Shamsidah Amir; Nor, Maria Elena; Ahmad, Noor’ani; Azia Hazida Mohamad Azmi, Nur; Latip, Muhammad Faez Ab; Hilmi Azman, Ahmad
2018-04-01
Morphometrics is the quantitative analysis of the shape and size of specimens. Morphometric quantitative analyses are commonly used to analyse the fossil record, the shape and size of specimens, and related features. The aim of the study is to find the differences between Rocky Mountain wolves and arctic wolves based on gender. The sample comprised secondary data with seven independent variables and two dependent variables. Statistical modelling was used in the analysis, such as the analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). The results showed differences between arctic wolves and Rocky Mountain wolves based on the independent factors and gender.
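A one-way ANOVA of the kind used in such comparisons can be sketched from first principles. The skull measurements and group labels below are invented for illustration, not the study's data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group mean square divided
    by within-group mean square."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical skull-length measurements (cm) for the two wolf groups
rocky = [23.1, 24.0, 23.5, 24.2, 23.8]
arctic = [25.0, 25.6, 24.9, 25.4, 25.2]
F = one_way_anova_f([rocky, arctic])  # large F: means differ far more
                                      # than the within-group scatter
```

MANOVA extends the same between- versus within-group decomposition to several dependent variables at once.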
The Extent and Consequences of P-Hacking in Science
Head, Megan L.; Holman, Luke; Lanfear, Rob; Kahn, Andrew T.; Jennions, Michael D.
2015-01-01
A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as “p-hacking,” occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses. PMID:25768323
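One simple check in the spirit of such p-curve analyses compares how many reported p-values fall just below .05 against the adjacent lower bin; under a real effect the upper bin should not be the fuller one. The p-values below are illustrative, and this sign test is a simplification of the authors' text-mining approach:

```python
from math import comb

def binomial_right_tail(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def p_hacking_sign_test(p_values):
    """Compare counts in (0.04, 0.05] vs (0.03, 0.04]: a significant excess
    just below .05 is consistent with p-hacking."""
    upper = sum(0.04 < p <= 0.05 for p in p_values)
    lower = sum(0.03 < p <= 0.04 for p in p_values)
    return binomial_right_tail(upper, upper + lower)

ps = [0.041, 0.043, 0.044, 0.046, 0.047, 0.048, 0.049, 0.031, 0.038]
tail = p_hacking_sign_test(ps)   # 46/512: a mild excess just below .05
```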
Across-cohort QC analyses of GWAS summary statistics from complex traits.
Chen, Guo-Bo; Lee, Sang Hong; Robinson, Matthew R; Trzaskowski, Maciej; Zhu, Zhi-Xiang; Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Kutalik, Zoltán; Loos, Ruth J F; Frayling, Timothy M; Hirschhorn, Joel N; Yang, Jian; Wray, Naomi R; Visscher, Peter M
2016-01-01
Genome-wide association studies (GWASs) have been successful in discovering SNP trait associations for many quantitative traits and common diseases. Typically, the effect sizes of SNP alleles are very small and this requires large genome-wide association meta-analyses (GWAMAs) to maximize statistical power. A trend towards ever-larger GWAMA is likely to continue, yet dealing with summary statistics from hundreds of cohorts increases logistical and quality control problems, including unknown sample overlap, and these can lead to both false positive and false negative findings. In this study, we propose four metrics and visualization tools for GWAMA, using summary statistics from cohort-level GWASs. We propose methods to examine the concordance between demographic information, and summary statistics and methods to investigate sample overlap. (I) We use the population genetics Fst statistic to verify the genetic origin of each cohort and their geographic location, and demonstrate using GWAMA data from the GIANT Consortium that geographic locations of cohorts can be recovered and outlier cohorts can be detected. (II) We conduct principal component analysis based on reported allele frequencies, and are able to recover the ancestral information for each cohort. (III) We propose a new statistic that uses the reported allelic effect sizes and their standard errors to identify significant sample overlap or heterogeneity between pairs of cohorts. (IV) To quantify unknown sample overlap across all pairs of cohorts, we propose a method that uses randomly generated genetic predictors that does not require the sharing of individual-level genotype data and does not breach individual privacy.
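Metric (I) can be illustrated with a per-SNP Fst computed from reported cohort allele frequencies. The frequencies below are hypothetical and the moment estimator is a textbook simplification, not the consortium's exact implementation:

```python
def fst_per_snp(freqs):
    """Wright's Fst for one SNP across cohorts: the between-cohort variance
    of the allele frequency scaled by p_bar * (1 - p_bar)."""
    k = len(freqs)
    pbar = sum(freqs) / k
    var = sum((p - pbar) ** 2 for p in freqs) / k
    return var / (pbar * (1 - pbar)) if 0 < pbar < 1 else 0.0

# Hypothetical reported frequencies of one allele in four cohorts
same_ancestry = [0.30, 0.31, 0.29, 0.30]   # concordant cohorts -> tiny Fst
outlier_mixed = [0.30, 0.31, 0.29, 0.55]   # one outlier cohort -> larger Fst
low = fst_per_snp(same_ancestry)
high = fst_per_snp(outlier_mixed)
```

Averaged over many SNPs, cohorts whose Fst against the rest is unexpectedly large are the outliers this QC metric is designed to flag.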
Analysis of the dependence of extreme rainfalls
NASA Astrophysics Data System (ADS)
Padoan, Simone; Ancey, Christophe; Parlange, Marc
2010-05-01
The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate Generalized Extreme Value (GEV) distribution, but provide a general approach to modeling extreme processes incorporating temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use by means of a real extremal data analysis of Swiss precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.
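For the univariate GEV building block mentioned above, a method-of-moments Gumbel fit (the GEV with shape parameter zero) to simulated annual maxima gives the flavour. The exponential daily totals are stand-in data, not Swiss rainfall:

```python
import math
import random

def gumbel_fit_moments(maxima):
    """Method-of-moments fit of the Gumbel distribution (GEV with shape 0),
    the classical model for block maxima such as annual rainfall peaks."""
    n = len(maxima)
    mean = sum(maxima) / n
    sd = (sum((x - mean) ** 2 for x in maxima) / (n - 1)) ** 0.5
    scale = sd * math.sqrt(6) / math.pi
    loc = mean - 0.5772156649 * scale   # Euler-Mascheroni correction
    return loc, scale

def return_level(loc, scale, T):
    """Level exceeded on average once every T blocks (e.g. the T-year event)."""
    return loc - scale * math.log(-math.log(1 - 1 / T))

random.seed(42)
# Annual maxima of simulated daily totals over 50 stand-in years
annual_max = [max(random.expovariate(0.1) for _ in range(365))
              for _ in range(50)]
loc, scale = gumbel_fit_moments(annual_max)
level_100 = return_level(loc, scale, 100)   # estimated 100-year event
```

Max-stable processes generalize exactly this marginal model by adding spatial dependence between sites.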
The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.
Christensen, G B; Knight, S; Camp, N J
2009-11-01
We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.
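The two statistics themselves are straightforward to compute from per-pedigree LOD scores. The scores below are hypothetical, and the 0.588 cutoff is assumed here as the LOD value corresponding to nominal p = 0.05:

```python
def sum_link_and_lod(lod_scores, threshold=0.588):
    """sumLINK: sum of LOD scores over pedigrees with nominally significant
    linkage (LOD >= threshold); sumLOD: sum of all positive LOD scores.
    Negative evidence from unlinked pedigrees is ignored by both."""
    sum_link = sum(l for l in lod_scores if l >= threshold)
    sum_lod = sum(l for l in lod_scores if l > 0)
    return sum_link, sum_lod

# Hypothetical per-pedigree multipoint LOD scores at one locus
lods = [1.2, 0.9, -0.4, 0.3, 2.1, -1.0, 0.6]
sum_link, sum_lod = sum_link_and_lod(lods)   # 4.8 and 5.1
```

Significance would then come from the genome-shuffling permutation procedure described above, which rebuilds the null distribution by randomizing linkage information across pedigrees.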
Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko
2014-07-01
Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, such as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radius). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by the colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus, while the grid-based statistics (covariance function) also showed repulsion although the spatial scale indicated from the statistics was not directly interpretable in terms of ecological meaning. 
The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
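The minimum (edge-to-edge) distance for disk-approximated colonies, as opposed to the centre-to-centre distance of point process statistics, reduces to a one-line formula (hypothetical colony coordinates in cm):

```python
def min_distance(colony1, colony2):
    """Minimum edge-to-edge distance between two disk-shaped colonies,
    each given as (x, y, radius); 0 if they touch or overlap."""
    (x1, y1, r1), (x2, y2, r2) = colony1, colony2
    centre_dist = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return max(0.0, centre_dist - r1 - r2)

# Two colonies whose centres are 10 cm apart but whose edges nearly meet
a = (0.0, 0.0, 4.0)
b = (10.0, 0.0, 5.0)
gap = min_distance(a, b)                      # 1.0 cm edge-to-edge
centre = ((0.0 - 10.0) ** 2) ** 0.5           # 10.0 cm centre-to-centre
```

The contrast between `gap` and `centre` is exactly why centre-to-centre statistics inflate apparent aggregation scales for large, non-overlapping organisms.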
Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses
Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy
2015-01-01
Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. 
Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent under the multi-species coalescent model. New data used in this study are available at DOI: http://dx.doi.org/10.6084/m9.figshare.1411146, and the software is available at https://github.com/smirarab/binning. PMID:26086579
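The weighting step the authors add can be sketched as replicating each bin's re-estimated tree in proportion to its bin size before the trees are passed to a summary method. The Newick strings are toy examples, not the authors' pipeline code:

```python
def weighted_binning_input(bin_trees, bin_sizes):
    """Weighted statistical binning: the tree re-estimated from each bin is
    replicated in proportion to the number of genes in the bin before being
    handed to a coalescent summary method (unweighted binning uses each
    bin's tree exactly once, which breaks statistical consistency)."""
    expanded = []
    for tree, size in zip(bin_trees, bin_sizes):
        expanded.extend([tree] * size)
    return expanded

bins = ["((A,B),C);", "(A,(B,C));"]   # toy re-estimated bin trees (Newick)
sizes = [3, 1]                        # genes per bin
gene_tree_input = weighted_binning_input(bins, sizes)
```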
Laberge, Marie; Tondoux, Aurélie; Camiré Tremblay, Fanny; MacEachen, Ellen
2017-11-01
In Quebec (Canada), the Work-Oriented Training Path, a work-study program, prepares students who are having difficulty at school for the job market. Occupational health and safety is an important part of their training. This article aims to analyze the impact of gender on the interpersonal dynamics among teachers, trainees, and key actors from the businesses involved. This article also looks at the influence of gender on teachers' strategies and capacity to act regarding occupational health and safety. Using a work activity analysis lens, a multiple case-study analysis of teachers' work activity was carried out. The findings show that gendered social relationships create a specific supervisory context that influences occupational health and safety training. Solutions aimed at reducing the negative impact of gender-associated prejudice on work injury prevention include training for teachers, attention to work organization at the schools, and the creation of cohesive teachers' work teams.
Kitronza, Panda Lukongo
2014-01-01
Introduction The aim of this study is to highlight the occupational risk factors linked to working conditions. Methods This qualitative study, based on group interviews, was carried out by a multidisciplinary team in the textile industry of the eastern region of the DRC, comprising an occupational physician, a public health physician, a toxicologist, two nurses from the factory's hospital centre, a representative of the hygiene committee, and a prevention technician. The methodological approach consisted of group interviews, observations, and guided visits of the company's workplaces. Results In cotton growing, heavy exposure to pesticides can lead to acute or chronic poisoning and even death. Other risks include work accidents, occupational diseases, and psychological disorders. In the factory, workers are exposed to risks linked to cotton-fibre dust, to traumatic, physical (noise, vibration) and chemical risk factors (strong acids, strong bases, solvents, and mineral dyes), as well as to psychosocial risks. The environmental pollution and ecotoxicity inherent in these activities result from the use of large quantities of agricultural inputs, fertilizers, and plant-protection products. Conclusion This study highlighted the various risk factors to which textile workers are exposed, as well as the environmental risks linked to this activity. This should make it possible to establish an effective strategy for the prevention and protection of workers. PMID:25977736
Superconductivity is 100 years old!
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebrun, Philippe
2011-04-14
100 years ago, on 8 April 1911, a major discovery was made: that of superconductivity. Superconductivity is the property of certain metals and alloys of losing all electrical resistance below a given temperature. This astonishing discovery, made almost by chance by Kamerlingh Onnes of the University of Leiden (Netherlands) and his student Gilles Holst, opened a new field of research in physics and fabulous prospects for technological applications. From a scientific point of view, superconductivity is one of the rare manifestations of quantum physics at the macroscopic scale. In terms of technical spin-offs, it carries major applications in health, communications, and energy. 100 years on, physicists have still not finished exploring this phenomenon and its applications. CERN hosts applications of superconductivity at unprecedented scales. The LHC particle accelerator, with its thousands of superconducting magnets distributed over a circumference of 27 kilometres, is the world's largest application of superconductivity; it could not exist without it. CERN is therefore celebrating the discovery of superconductivity with an exceptional lecture given by Philippe Lebrun, during which Kamerlingh Onnes's historic experiment will be reproduced. Philippe Lebrun will recount the story of this astonishing discovery, placing it in the scientific context of its time, and will describe the scientific developments and applications of superconductivity's first century. Lecture in French. Please register at: +41 22 767 76 76 or cern.reception@cern.ch
Arung, Willy; Tshilombo, François; Odimba, Etienne
2015-01-01
Introduction Many studies have examined intraperitoneal adhesions, but no consensus has yet been reached on their prevention. The aim of our study was to evaluate the potential effect of an anti-inflammatory drug, parecoxib, on the prevention of adhesions and on healing in rats. Methods In an experimental model of postoperative adhesions secondary to peritoneal burn lesions, 30 rats were randomized into three groups according to the mode of administration of parecoxib (control group; intraperitoneal; intramuscular). Results Parecoxib significantly reduced the quantity (p < .05) and severity (p < .01) of postoperative adhesions in the two experimental models. In total, 21 rats developed adhesions: 9 (100%) in group A, 5 (50%) in group B, and 7 (70%) in group C (p = 0.05). Regarding adhesion formation at the trauma site, nineteen rats developed adhesions: 9 (100%) in group A and 5 (50%) in each of the other two groups, B and C. A significant difference was found when comparing these groups pairwise: A vs B (p < 0.05); A vs C (p < 0.05). Parecoxib compromised neither intestinal nor cutaneous healing. Conclusion This study showed that parecoxib can reduce the formation of postoperative adhesions. Confirmation of the safety of parecoxib on intestinal anastomoses must be investigated in further experiments. PMID:26966478
Metz, Anneke M
2008-01-01
There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.
Challenges and solutions to pre- and post-randomization subgroup analyses.
Desai, Manisha; Pieper, Karen S; Mahaffey, Ken
2014-01-01
Subgroup analyses are commonly performed in the clinical trial setting with the purpose of illustrating that the treatment effect was consistent across different patient characteristics or identifying characteristics that should be targeted for treatment. There are statistical issues involved in performing subgroup analyses, however. These have been given considerable attention in the literature for analyses where subgroups are defined by a pre-randomization feature. Although subgroup analyses are often performed with subgroups defined by a post-randomization feature--including analyses that estimate the treatment effect among compliers--discussion of these analyses has been neglected in the clinical literature. Such analyses pose a high risk of presenting biased descriptions of treatment effects. We summarize the challenges of doing all types of subgroup analyses described in the literature. In particular, we emphasize issues with post-randomization subgroup analyses. Finally, we provide guidelines on how to proceed across the spectrum of subgroup analyses.
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
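The alpha inflation discussed above follows directly from running many tests at a fixed per-test level. A short sketch (the study counts and levels are arbitrary choices for illustration) compares the analytic family-wise error rate with a Monte Carlo check, and shows a Bonferroni correction restoring the nominal level:

```python
import random

random.seed(1)

alpha, k = 0.05, 20          # per-test level and number of true-null tests

# Analytic family-wise error rate for k independent tests at level alpha
fwer = 1 - (1 - alpha) ** k  # roughly 0.64 for k = 20

# Monte Carlo check: a p-value under a true null is uniform on [0, 1],
# so each simulated test "rejects" with probability alpha.
studies = 20000
false_pos = sum(
    any(random.random() < alpha for _ in range(k)) for _ in range(studies)
)
fwer_mc = false_pos / studies

# Bonferroni correction: testing each hypothesis at alpha / k keeps the
# family-wise rate at or below alpha.
bonf = sum(
    any(random.random() < alpha / k for _ in range(k)) for _ in range(studies)
)
fwer_bonf = bonf / studies
```

With 20 correlational tests and no correction, a "significant" finding is more likely than not even when every null is true, which is the core of the alpha-inflation caution.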
Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P
1999-01-01
Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available, it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149
An evaluation of GTAW-P versus GTA welding of alloy 718
NASA Technical Reports Server (NTRS)
Gamwell, W. R.; Kurgan, C.; Malone, T. W.
1991-01-01
Mechanical properties were evaluated to determine statistically whether the pulsed current gas tungsten arc welding (GTAW-P) process produces welds in alloy 718 with room temperature structural performance equivalent to current Space Shuttle Main Engine (SSME) welds manufactured by the constant current GTAW process. Evaluations were conducted on two base metal lots, two filler metal lots, two heat input levels, and two welding processes. The material form was 0.125-inch (3.175-mm) alloy 718 sheet. Prior to welding, sheets were treated to either the ST or STA-1 condition. After welding, panels were left as welded or heat treated to the STA-1 condition, and weld beads were left intact or machined flush. Statistical analyses were performed on yield strength, ultimate tensile strength (UTS), and high cycle fatigue (HCF) properties for all the post welded material conditions. Analyses of variance were performed on the data to determine if there were any significant effects on UTS or HCF life due to variations in base metal, filler metal, heat input level, or welding process. Statistical analyses showed that the GTAW-P process does produce welds with room temperature structural performance equivalent to current SSME welds manufactured by the GTAW process, regardless of prior material condition or post welding condition.
Living systematic reviews: 3. Statistical methods for updating meta-analyses.
Simmonds, Mark; Salanti, Georgia; McKenzie, Joanne; Elliott, Julian
2017-11-01
A living systematic review (LSR) should keep the review current as new research evidence emerges. Any meta-analyses included in the review will also need updating as new material is identified. If the aim of the review is solely to present the best current evidence, standard meta-analysis may be sufficient, provided reviewers are aware that results may change at later updates. If the review is used in a decision-making context, more caution may be needed. When using standard meta-analysis methods, the chance of incorrectly concluding that any updated meta-analysis is statistically significant when there is no effect (the type I error) increases rapidly as more updates are performed. Inaccurate estimation of any heterogeneity across studies may also lead to inappropriate conclusions. This paper considers four methods to avoid some of these statistical problems when updating meta-analyses: two (the law of the iterated logarithm and the Shuster method) control primarily for inflation of type I error, while the other two (trial sequential analysis and sequential meta-analysis) control for both type I and type II errors (failing to detect a genuine effect) and take account of heterogeneity. This paper compares the methods and considers how they could be applied to LSRs. Copyright © 2017 Elsevier Inc. All rights reserved.
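The type I inflation from repeated updating is easy to demonstrate. The sketch below is a toy simulation, not any of the four published correction methods the abstract names: under a true null effect, a naive cumulative z-test is redone at the fixed 5% level after each of ten simulated study additions, and the chance that at least one update "finds" an effect far exceeds 5%.

```python
import math
import random

random.seed(2)

def ever_significant(n_updates=10, per_study=50):
    """One living review under the null: after each new study arrives,
    redo a naive cumulative z-test at the fixed 5% level. Return True
    if any update ever looks 'statistically significant'."""
    total, n = 0.0, 0
    for _ in range(n_updates):
        for _ in range(per_study):
            total += random.gauss(0.0, 1.0)   # null data: true effect = 0
            n += 1
        z = total / math.sqrt(n)
        if abs(z) > 1.96:
            return True
    return False

reviews = 4000
rate = sum(ever_significant() for _ in range(reviews)) / reviews
# With ten updates, the cumulative false-positive rate is roughly three
# to four times the nominal 5%, which is the problem the sequential
# methods in the paper are designed to control.
```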
Guerrero, Luis; Guàrdia, Maria Dolors; Xicola, Joan; Verbeke, Wim; Vanhonacker, Filiep; Zakowska-Biemans, Sylwia; Sajdakowska, Marta; Sulmont-Rossé, Claire; Issanchou, Sylvie; Contel, Michele; Scalvedi, M Luisa; Granli, Britt Signe; Hersleth, Margrethe
2009-04-01
Traditional food products (TFP) are an important part of European culture, identity, and heritage. In order to maintain and expand the market share of TFP, further improvement in safety, health, or convenience is needed by means of different innovations. The aim of this study was to obtain a consumer-driven definition for the concept of TFP and innovation and to compare these across six European countries (Belgium, France, Italy, Norway, Poland and Spain) by means of semantic and textual statistical analyses. Twelve focus groups were performed, two per country, under similar conditions. The transcriptions obtained were submitted to an ordinary semantic analysis and to a textual statistical analysis using the software ALCESTE. Four main dimensions were identified for the concept of TFP: habit-natural, origin-locality, processing-elaboration and sensory properties. Five dimensions emerged around the concept of innovation: novelty-change, variety, processing-technology, origin-ethnicity and convenience. TFP were similarly perceived in the countries analysed, while some differences were detected for the concept of innovation. Semantic and statistical analyses of the focus groups led to similar results for both concepts. In some cases and according to the consumers' point of view the application of innovations may damage the traditional character of TFP.
Ramón, M; Martínez-Pastor, F
2018-04-23
Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the significance of the internal heterogeneity of sperm samples and its relevance. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and the most advanced supervised methods, based on machine learning. The former approach has allowed exploration of subpopulation patterns in many species, whereas the latter offers further possibilities, especially considering functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although applying clustering analyses to the data provided by CASA systems yields valuable information on sperm samples, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.
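The non-hierarchical clustering route mentioned above can be sketched with a minimal two-cluster Lloyd's (k-means) algorithm on synthetic CASA-style kinematic data. The two latent subpopulations, the parameter values (VCL, LIN), and the cluster count are illustrative assumptions, not results from the review:

```python
import math
import random

random.seed(3)

# Synthetic tracks: (VCL in um/s, LIN in %) drawn from two latent
# subpopulations; values are illustrative only.
fast = [(random.gauss(120, 10), random.gauss(80, 5)) for _ in range(150)]
slow = [(random.gauss(40, 8), random.gauss(45, 6)) for _ in range(150)]
cells = fast + slow

def kmeans2(points, iters=25):
    """Minimal two-cluster Lloyd's algorithm (non-hierarchical clustering).
    Real analyses should standardise variables first so that no single
    kinematic parameter dominates the Euclidean distance."""
    centres = [points[0], points[-1]]        # seeds from opposite ends
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            d = [math.dist(p, c) for c in centres]
            groups[d.index(min(d))].append(p)   # assign to nearest centre
        centres = [tuple(sum(v) / len(g) for v in zip(*g)) for g in groups]
    return centres, groups

centres, groups = kmeans2(cells)
# The two recovered centres sit near the fast and slow subpopulation means.
```

In practice the number of subpopulations is itself estimated (e.g. by model-selection criteria), which is one of the standardisation issues the review raises.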
Meta-analysis of magnitudes, differences and variation in evolutionary parameters.
Morrissey, M B
2016-10-01
Meta-analysis is increasingly used to synthesize major patterns in the large literatures within ecology and evolution. Meta-analytic methods that do not account for the process of observing data, which we may refer to as 'informal meta-analyses', may have undesirable properties. In some cases, informal meta-analyses may produce results that are unbiased, but do not necessarily make the best possible use of available data. In other cases, unbiased statistical noise in individual reports in the literature can potentially be converted into severe systematic biases in informal meta-analyses. I first present a general description of how failure to account for noise in individual inferences should be expected to lead to biases in some kinds of meta-analysis. In particular, informal meta-analyses of quantities that reflect the dispersion of parameters in nature, for example, the mean absolute value of a quantity, are likely to be generally highly misleading. I then re-analyse three previously published informal meta-analyses, where key inferences were of aspects of the dispersion of values in nature, for example, the mean absolute value of selection gradients. Major biological conclusions in each original informal meta-analysis closely match those that could arise as artefacts due to statistical noise. I present alternative mixed-model-based analyses that are specifically tailored to each situation, but where all analyses may be implemented with widely available open-source software. In each example meta-re-analysis, major conclusions change substantially. © 2016 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2016 European Society For Evolutionary Biology.
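The core artefact described above, that unbiased noise in individual estimates becomes systematic bias in a mean of absolute values, takes only a few lines to reproduce. The effect spread and standard error below are arbitrary illustrative values:

```python
import random

random.seed(4)

tau, se = 0.1, 0.3        # spread of true effects vs per-study sampling noise
n = 20000                 # number of reported estimates (illustrative)

true_effects = [random.gauss(0.0, tau) for _ in range(n)]
estimates = [t + random.gauss(0.0, se) for t in true_effects]  # unbiased per study

mean_abs_true = sum(map(abs, true_effects)) / n
mean_abs_est = sum(map(abs, estimates)) / n
# Each estimate is unbiased for its own true effect, yet the informal
# meta-analytic summary mean(|estimate|) is several times larger than
# mean(|true effect|): the absolute value converts noise into signal.
```

This is why the paper argues for mixed-model analyses that model the observation process rather than averaging reported magnitudes directly.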
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
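The response-surface idea surveyed here can be sketched end to end: sample an "expensive" code at a few design points, fit a quadratic surface by least squares (via the normal equations), then evaluate the cheap surrogate instead of the original code. The stand-in function and design points are assumptions for illustration, not examples from the survey:

```python
def expensive_analysis(x):
    """Stand-in for a costly simulation code (illustrative only)."""
    return (x - 1.5) ** 2 + 2.0

# Design of experiments: a handful of runs of the expensive code
xs = [i * 0.5 for i in range(9)]                 # 0.0, 0.5, ..., 4.0
ys = [expensive_analysis(x) for x in xs]

def solve3(a, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

# Quadratic response surface y ~ b0 + b1*x + b2*x^2 via the normal equations
feats = [(1.0, x, x * x) for x in xs]
ata = [[sum(f[i] * f[j] for f in feats) for j in range(3)] for i in range(3)]
aty = [sum(f[i] * y for f, y in zip(feats, ys)) for i in range(3)]
b0, b1, b2 = solve3(ata, aty)

def metamodel(x):
    """Cheap surrogate: evaluate this in place of expensive_analysis()."""
    return b0 + b1 * x + b2 * x * x
```

Because the stand-in happens to be quadratic, the surface here reproduces it exactly at untried points; for a real deterministic code the fit is approximate, which is precisely the danger the survey discusses when classical statistical error assumptions are applied to deterministic outputs.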
Dymova, Natalya; Hanumara, R Choudary; Enander, Richard T; Gagnon, Ronald N
2009-10-01
Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies. PMID:19696393
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data.
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
Regression methods for spatially correlated data: an example using beetle attacks in a seed orchard
Preisler Haiganoush; Nancy G. Rappaport; David L. Wood
1997-01-01
We present a statistical procedure for studying the simultaneous effects of observed covariates and unmeasured spatial variables on responses of interest. The procedure uses regression-type analyses that can be used with existing statistical software packages. An example using the rate of twig beetle attacks on Douglas-fir trees in a seed orchard illustrates the...
ERIC Educational Resources Information Center
Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi
2006-01-01
This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…
Differences among Myopes, Emmetropes, and Hyperopes.
1980-04-01
parasympathetic activity (see Wenger & Cullen, 1965). The second index, Cw, is a coherence statistic (a normalized function with values ranging between zero...farther distances. ACKNOWLEDGMENT The literature search and the statistical analyses presented in this report were conducted at New Mexico State...in infant and child. New York: Paul B. Hoeber, 1949. Gould, G. M. Diagnosis, diseases and therapeutics of ametropia. British Journal of Ophthalmology